Update README.md
README.md
CHANGED
@@ -9,5 +9,5 @@ pinned: false
 
 The organization was founded by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](https://modelbest.cn) with the help of [IPADS](https://ipads.se.sjtu.edu.cn/), with the aim of promoting the development of Sparse Large Language Models (SparseLLMs). By exploiting the sparsity of LLMs, the computational cost of inference can be reduced significantly. Currently, the organization focuses mainly on ReLU-activated LLMs, which are converted from existing LLMs through fine-tuning.
 
-The LLaMA series within it is provided by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](https://modelbest.cn). Additionally, [IPADS](https://ipads.se.sjtu.edu.cn/
+The LLaMA series within it is provided by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](https://modelbest.cn). Additionally, [IPADS](https://ipads.se.sjtu.edu.cn/) contributed the Falcon model, and participation from other institutions is also welcomed.