yixinsong committed
Commit bd7b4b2 · 1 Parent(s): 7e91fa3

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -9,5 +9,5 @@ pinned: false
 
 The organization is founded by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](modelbest.cn) with the help of [IPADS](https://ipads.se.sjtu.edu.cn/), aimed at promoting the development of Sparse Large Language Models (SparseLLMs). By utilizing the sparsity of LLMs, we can significantly reduce the computational cost of inference. Currently, the organization is mainly focused on the ReLU-activated LLMs, which are converted from existing LLMs through fine-tuning.
 
-The LLaMA series within it is provided by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](modelbest.cn). Additionally, [IPADS](https://ipads.se.sjtu.edu.cn/zh/index.html) contributed the Falcon model, and participation from other institutions is also welcomed.
+The LLaMA series within it is provided by [THUNLP](https://nlp.csai.tsinghua.edu.cn/) and [ModelBest](modelbest.cn). Additionally, [IPADS](https://ipads.se.sjtu.edu.cn/) contributed the Falcon model, and participation from other institutions is also welcomed.
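For context on the README's claim that sparsity reduces inference cost, here is a minimal illustrative sketch (not code from this repository; the NumPy dependency, variable names, and toy layer sizes are all assumptions): with ReLU activations, neurons whose pre-activation is negative output exactly zero, so the down-projection only needs the columns belonging to the neurons that actually fired.

```python
import numpy as np

# Illustrative sketch only: toy sizes, random weights. In a ReLU MLP layer,
# many activations are exactly zero, so the down-projection can skip the
# columns of inactive neurons and still produce the identical output.
rng = np.random.default_rng(0)
hidden, intermediate = 64, 256
x = rng.standard_normal(hidden)
w_up = rng.standard_normal((intermediate, hidden))
w_down = rng.standard_normal((hidden, intermediate))

a = np.maximum(w_up @ x, 0.0)               # ReLU: a large fraction of entries become zero
active = np.flatnonzero(a)                  # indices of the neurons that actually fired

dense_out = w_down @ a                      # full matrix-vector product
sparse_out = w_down[:, active] @ a[active]  # only the active columns

print(f"active neurons: {active.size}/{intermediate}")
print("outputs match:", np.allclose(dense_out, sparse_out))
```

In practice the savings come from never loading or multiplying the weights of inactive neurons, which is what sparsity-aware inference engines built on ReLU-activated models aim to exploit.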