---
license: other
license_name: deepseek
license_link: LICENSE
---

<p align="center">
<img width="1000px" alt="DeepSeek Coder" src="https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/pictures/logo.png?raw=true">
</p>
<p align="center"><a href="https://www.deepseek.com/">[🏠Homepage]</a> | <a href="https://coder.deepseek.com/">[🤖 Chat with DeepSeek Coder]</a> | <a href="https://discord.gg/Tc7c45Zzu5">[Discord]</a> | <a href="https://github.com/guoday/assert/blob/main/QR.png?raw=true">[Wechat(微信)]</a></p>
<hr>

### 1. Introduction of Deepseek Coder

Deepseek Coder comprises a series of code language models, each pre-trained on 2T tokens with a composition of 87% code and 13% natural language in both English and Chinese. We provide various sizes of the code model, ranging from 1B to 33B versions. Each model is pre-trained on a project-level code corpus with a 16K window size and an extra fill-in-the-blank task, to support project-level code completion and infilling. Deepseek Coder achieves state-of-the-art performance among open-source code models across multiple programming languages and benchmarks.
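Since these are standard causal language models on the Hugging Face Hub, a plain `transformers` generation call is enough for left-to-right code completion. Below is a minimal sketch; the model id `deepseek-ai/deepseek-coder-1.3b-base` is an illustrative assumption, so substitute the id of the checkpoint this card describes. Infilling additionally relies on model-specific fill-in-the-middle sentinel tokens defined by the tokenizer, which are not shown here.

```python
# Minimal completion sketch with Hugging Face transformers.
# The model id is a hypothetical example; replace it with this repo's id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-1.3b-base"  # assumed example id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halve memory on supported GPUs
    device_map="auto",
    trust_remote_code=True,
)

# Left-to-right completion: the model continues the code prompt.
prompt = "# write a quick sort algorithm in python\ndef quick_sort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the models were pre-trained with a 16K window, prompts can include surrounding project files as context rather than a single snippet, which is what enables the project-level completion described above.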