Update README.md
Committed by SinclairWang
@@ -11,4 +11,7 @@ pinned: false
 <img src="https://cdn-uploads.huggingface.co/production/uploads/628f6e5ab90dde28ef57d293/gfqBTSEIa140Hu-mfo9Qe.png" alt="Clickable Image" />
 </a>
 
-GAIR-ProX, a subsidiary of [GAIR](https://huggingface.co/GAIR), spearheads the ProX Project. This initiative aims to enhance pre-training efficiency by refining corpus documents using language models at scale. Through meticulous operations (e.g., document-level filtering and chunk-level cleaning), implemented as scalable, executable programs, ProX seeks to improve pre-training data quality at scale, ultimately developing more robust and efficient language models.
+GAIR-ProX, a subsidiary of [GAIR](https://huggingface.co/GAIR), spearheads the 🫐 ProX Project. This initiative aims to enhance pre-training efficiency by refining corpus documents using language models at scale. Through meticulous operations (e.g., document-level filtering and chunk-level cleaning), implemented as scalable, executable programs, 🫐 ProX seeks to improve pre-training data quality at scale, ultimately developing more robust and efficient language models.
+
+
+<i>Read our [technical report](https://huggingface.co/papers/2409.17115)!</i>