---
title: README
emoji: π
colorFrom: purple
colorTo: blue
sdk: static
pinned: false
---
## Training Small Language Models with Knowledge Distillation
Official pre-trained models and baselines from:
+ [MiniLLM](https://github.com/microsoft/LMOps/tree/main/minillm): Knowledge distillation of LLMs during instruction tuning.
+ [MiniPLM](https://github.com/thu-coai/MiniPLM): Knowledge distillation of LLMs during pre-training.
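Both projects train a small student model to match a larger teacher's output distribution. As background, the classic distillation objective is a temperature-scaled KL divergence between teacher and student token distributions. The sketch below is a minimal NumPy illustration of that standard forward-KL formulation, not the exact objective of either repository (MiniLLM, for example, optimizes a reverse-KL variant); all function names here are illustrative.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Forward KL(teacher || student) on softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher distribution
    q = softmax(student_logits, temperature)  # student distribution
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()

# A student matching the teacher exactly incurs (near-)zero loss;
# any mismatch yields a positive penalty.
teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[0.1, 0.1, 0.1]])
print(distillation_loss(teacher, teacher))  # ~0.0
print(distillation_loss(student, teacher) > 0)  # True
```

The temperature softens both distributions so the student also learns from the teacher's relative probabilities on non-top tokens, not just the argmax.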