---
language:
- en
- zh
license: apache-2.0
library_name: transformers
datasets:
- EleutherAI/pile
- togethercomputer/RedPajama-Data-1T
- p208p2002/wudao
widget:
- text: <s> 4 + 3 =
---
## MiniLoong-3B
πŸ“‘ [arXiv](https://arxiv.org/abs/2311.07052) | πŸ‘» [GitHub](https://github.com/GeneZC/MiniMA) | πŸ€— [HuggingFace-MiniMA-3B](https://huggingface.co/GeneZC/MiniMA-3B) | πŸ€— [HuggingFace-MiniChat-3B](https://huggingface.co/GeneZC/MiniChat-3B) | πŸ€– [ModelScope-MiniMA-3B](https://modelscope.cn/models/GeneZC/MiniMA-3B) | πŸ€– [ModelScope-MiniChat-3B](https://modelscope.cn/models/GeneZC/MiniChat-3B) | πŸ€— [HuggingFace-MiniChat-1.5-3B](https://huggingface.co/GeneZC/MiniChat-1.5-3B) | πŸ€— [HuggingFace-MiniMA-2-3B](https://huggingface.co/GeneZC/MiniMA-2-3B) | πŸ€— [HuggingFace-MiniChat-2-3B](https://huggingface.co/GeneZC/MiniChat-2-3B) | πŸ€— [HuggingFace-MiniMA-2-1B](https://huggingface.co/GeneZC/MiniMA-2-1B) | πŸ€— [HuggingFace-MiniLoong-3B](https://huggingface.co/GeneZC/MiniLoong-3B) | πŸ€— [HuggingFace-MiniMix-2/4x3B](https://huggingface.co/GeneZC/MiniMix-2_4x3B)
❗ Must comply with the LICENSE of LLaMA-2, since this model is derived from LLaMA-2.
<img src="./teaser_d.jpg" alt="teaser_d" width="700" />
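
A minimal usage sketch (not part of the original card; it assumes the standard `transformers` causal LM API, which the front matter's `library_name: transformers` suggests) that loads the model and completes the widget prompt shown above:

```python
# Hypothetical usage sketch: load MiniLoong-3B as a causal LM and
# complete the example widget prompt "<s> 4 + 3 =".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GeneZC/MiniLoong-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "<s> 4 + 3 ="
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```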
## BibTeX
```bibtex
@article{zhang2023law,
    title={Towards the Law of Capacity Gap in Distilling Language Models},
    author={Zhang, Chen and Song, Dawei and Ye, Zheyu and Gao, Yan},
    year={2023},
    url={https://arxiv.org/abs/2311.07052}
}
```