dododododo committed
Commit 2ece35c • Parent(s): 7ce5b79
Create README.md
README.md
ADDED
# CT-LLM-SFT-experiment-ckpts

[**🌐 Homepage**](https://chinese-tiny-llm.github.io) | [**🤗 MAP-CC**](https://huggingface.co/datasets/m-a-p/MAP-CC) | [**🤗 CHC-Bench**](https://huggingface.co/datasets/m-a-p/CHC-Bench) | [**🤗 CT-LLM**](https://huggingface.co/collections/m-a-p/chinese-tiny-llm-660d0133dff6856f94ce0fc6) | [**📖 arXiv**]() | [**GitHub**](https://github.com/Chinese-Tiny-LLM/Chinese-Tiny-LLM)

This repository contains all SFT experiment checkpoints, fine-tuned with different ratios of Chinese to English data:

- zh_105k_en_105k (1:1)
- zh_105k_en_52k (2:1)
- zh_105k_en_26k (4:1)
- zh_105k_en_13k (8:1)
- zh_105k (Chinese only)
- en_105k (English only)

## Uses

For usage, please refer to [CT-LLM-SFT](https://huggingface.co/m-a-p/CT-LLM-SFT).
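The checkpoint names above encode how each SFT data mixture was built: the 105k-example Chinese set is kept fixed while the English set is subsampled to hit the stated zh:en ratio. A minimal sketch of that subsampling, not taken from the repo (the function and record format are hypothetical):

```python
import random

def make_mixture(zh_examples, en_examples, en_ratio, seed=0):
    """Subsample the English pool so the mixture has a zh:en ratio of en_ratio:1.

    zh_examples / en_examples are hypothetical lists of SFT records;
    en_ratio=2 yields the zh_105k_en_52k-style 2:1 mixture.
    """
    rng = random.Random(seed)
    n_en = len(zh_examples) // en_ratio  # e.g. 105k zh at 2:1 -> ~52k en
    mix = zh_examples + rng.sample(en_examples, n_en)
    rng.shuffle(mix)  # interleave languages before training
    return mix

# Toy pools standing in for the 105k Chinese / 105k English instruction sets
zh = [("zh", i) for i in range(8)]
en = [("en", i) for i in range(8)]
mix_2to1 = make_mixture(zh, en, en_ratio=2)
```

The only-Chinese and only-English checkpoints correspond to skipping the mixing step and training on one pool alone.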