# DeGAML-LLM Checkpoints
This repository contains pre-trained checkpoints for the generalization module of DeGAML-LLM, a meta-learning framework that decouples generalization and adaptation for Large Language Models.
## Links
- Project Page: https://nitinvetcha.github.io/DeGAML-LLM/
- GitHub Repository: https://github.com/nitinvetcha/DeGAML-LLM
- HuggingFace Profile: https://huggingface.co/Nitin2004
## Available Checkpoints
All checkpoints are trained on Qwen2.5-0.5B-Instruct using LoRA adapters optimized with the DeGAML-LLM framework:
| Checkpoint Name | Dataset | Size |
|---|---|---|
| `qwen0.5lora__ARC-c.pth` | ARC-Challenge | ~4.45 GB |
| `qwen0.5lora__ARC-e.pth` | ARC-Easy | ~4.45 GB |
| `qwen0.5lora__BoolQ.pth` | BoolQ | ~4.45 GB |
| `qwen0.5lora__HellaSwag.pth` | HellaSwag | ~4.45 GB |
| `qwen0.5lora__PIQA.pth` | PIQA | ~4.45 GB |
| `qwen0.5lora__SocialIQA.pth` | SocialIQA | ~4.45 GB |
| `qwen0.5lora__WinoGrande.pth` | WinoGrande | ~4.45 GB |
## Usage
### Download
```python
from huggingface_hub import hf_hub_download

# Download a specific checkpoint
checkpoint_path = hf_hub_download(
    repo_id="Nitin2004/DeGAML-LLM-checkpoints",
    filename="qwen0.5lora__ARC-c.pth"
)
```
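To fetch all checkpoints at once, `snapshot_download` from `huggingface_hub` mirrors the whole repository. A minimal sketch (the pattern filter and local directory handling are illustrative choices, not part of the original instructions):

```python
from huggingface_hub import snapshot_download

# Download every .pth checkpoint in the repository to a local cache folder
local_dir = snapshot_download(
    repo_id="Nitin2004/DeGAML-LLM-checkpoints",
    allow_patterns=["*.pth"],  # only the LoRA checkpoints
)
print(local_dir)
```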
### Load with PyTorch
```python
import torch

# Load the checkpoint on CPU and inspect its contents
checkpoint = torch.load(checkpoint_path, map_location="cpu")
print(checkpoint.keys())
```
### Use with DeGAML-LLM
Refer to the DeGAML-LLM repository for detailed usage instructions on how to integrate these checkpoints with the framework.
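As an illustration only, the sketch below shows how LoRA-style weights are typically attached to the Qwen2.5-0.5B-Instruct base model with PEFT. It assumes the `.pth` file contains a plain LoRA `state_dict` and uses placeholder LoRA hyperparameters; the actual checkpoint layout and configuration are defined by the DeGAML-LLM code, so consult the repository before relying on this.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, set_peft_model_state_dict

# Assumed LoRA hyperparameters -- replace with the values actually used
# during DeGAML-LLM training (see the GitHub repository).
lora_config = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"])

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B-Instruct")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-0.5B-Instruct")
model = get_peft_model(base, lora_config)

# Assumes the checkpoint is (or directly contains) a LoRA state_dict;
# adjust the key lookup if the file wraps it in a larger dictionary.
state = torch.load(checkpoint_path, map_location="cpu")
set_peft_model_state_dict(model, state)
```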
## Performance
These checkpoints achieve state-of-the-art results on common-sense reasoning tasks when used with the DeGAML-LLM adaptation framework. See the project page for complete benchmark results.
## Citation
If you use these checkpoints in your research, please cite:
```bibtex
@article{degaml-llm2025,
  title={Decoupling Generalization and Adaptation in Meta-Learning for Large Language Models},
  author={Vetcha, Nitin and Xu, Binqian and Liu, Dianbo},
  year={2026}
}
```
## Contact
For questions or issues, please:
- Open an issue on GitHub
- Contact: [email protected]
## License
Apache License 2.0 - See LICENSE for details.