---
title: README
emoji: 🦀
colorFrom: purple
colorTo: yellow
sdk: static
pinned: false
---
This organization was founded by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](https://modelbest.cn) with the help of [IPADS](https://ipads.se.sjtu.edu.cn/), and aims to promote the development of Sparse Large Language Models (SparseLLMs). By exploiting the activation sparsity of LLMs, we can significantly reduce the computational cost of inference. Currently, the organization focuses mainly on ReLU-activated LLMs, which are converted from existing LLMs through fine-tuning.
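The computational saving comes from the fact that a ReLU zeroes out many intermediate activations, so the down-projection of a feed-forward block only needs the columns corresponding to active neurons. A minimal NumPy sketch of this idea (with made-up dimensions, not those of any specific model):

```python
import numpy as np

# Toy dimensions for illustration only.
d_model, d_ff = 8, 32
rng = np.random.default_rng(0)

x = rng.standard_normal(d_model)
W_up = rng.standard_normal((d_ff, d_model))
W_down = rng.standard_normal((d_model, d_ff))

# Dense feed-forward pass with a ReLU activation.
h = np.maximum(W_up @ x, 0.0)
y_dense = W_down @ h

# Sparse variant: only multiply the columns of W_down whose
# corresponding activations are nonzero.
active = h > 0
y_sparse = W_down[:, active] @ h[active]

assert np.allclose(y_dense, y_sparse)
print(f"active neurons: {active.sum()}/{d_ff}")
```

With a ReLU, a large fraction of neurons is typically inactive per token, so skipping their columns cuts the down-projection FLOPs roughly in proportion to the sparsity.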
The LLaMA series of models is provided by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](https://modelbest.cn). Additionally, [IPADS](https://ipads.se.sjtu.edu.cn/) contributed the Falcon model, and participation from other institutions is welcome.