Commit abc5359 (Parent: 66b2bdc) by deepseek-admin: Update README.md
README.md CHANGED
@@ -14,9 +14,9 @@ license_link: LICENSE
 
 ### 1. Introduction of Deepseek Coder
 
-Deepseek Coder
+Deepseek Coder is composed of a series of code language models, each trained from scratch on 2T tokens with a composition of 87% code and 13% natural language in both English and Chinese. We provide various sizes of the code model, ranging from 1.3B to 33B parameters. Each model is pre-trained on a project-level code corpus with a 16K window size and an extra fill-in-the-blank task, to support project-level code completion and infilling. For coding capabilities, Deepseek Coder achieves state-of-the-art performance among open-source code models across multiple programming languages and benchmarks.
 
-- **Massive Training Data**: Trained on 2T tokens, including 87% code and 13% linguistic data in both English and Chinese languages.
+- **Massive Training Data**: Trained from scratch on 2T tokens, including 87% code and 13% natural language data in both English and Chinese.
 
 - **Highly Flexible & Scalable**: Offered in model sizes of 1.3B, 5.7B, 6.7B, and 33B, enabling users to choose the setup most suitable for their requirements.
 
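The new introduction paragraph advertises project-level code completion with a 16K window. A minimal sketch of the completion use case follows, assuming the checkpoints are published on the Hugging Face Hub under an ID like `deepseek-ai/deepseek-coder-1.3b-base` and load with the standard transformers causal-LM API; the repo ID and prompt are illustrative assumptions, not taken from this commit.

```python
# Minimal left-to-right completion sketch. Assumption: the checkpoint is on the
# Hugging Face Hub as "deepseek-ai/deepseek-coder-1.3b-base" (repo ID assumed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-1.3b-base"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# The model continues the code prompt token by token.
prompt = "# write a quick sort in python\ndef quick_sort(arr):\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

# Print only the generated continuation, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Infilling would additionally rely on the model's fill-in-the-blank sentinel tokens, whose exact spellings are model-specific and not given in this diff.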