<p>
This open-source model was created by <a target="_blank" href="https://qwenlm.github.io/">the Qwen Team at Alibaba Cloud</a>.
You can find the release blog post <a target="_blank" href="https://qwenlm.github.io/blog/qwen2.5/">here</a>.
The model is available on the Hugging Face Hub: <a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct">https://huggingface.co/Qwen/Qwen2.5-0.5B-Instruct</a>.
The 0.5B model was pretrained on 18 trillion tokens spanning 29 languages.
It supports a context length of up to 32K tokens and can generate up to 8K tokens.
</p>
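<p>
As a minimal sketch, the model above can be pulled from the Hub and queried with the <code>transformers</code> library. The helper name <code>generate_reply</code> and its parameters are illustrative, not part of the model's release; only the model id comes from the page above.
</p>

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id as published on the Hugging Face Hub.
MODEL_ID = "Qwen/Qwen2.5-0.5B-Instruct"


def generate_reply(prompt: str, max_new_tokens: int = 128) -> str:
    """Download the model on first call, then generate a chat reply.

    `generate_reply` is an illustrative helper, not an official API.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    # Wrap the prompt in the model's chat template before tokenizing.
    messages = [{"role": "user", "content": prompt}]
    text = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(text, return_tensors="pt")

    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = out[0][inputs.input_ids.shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

<p>
The first call downloads roughly 1&nbsp;GB of weights; subsequent calls reuse the local Hugging Face cache.
</p>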