Commit 165781c (1 parent: ad2451f): Update README.md
README.md (CHANGED)
@@ -22,7 +22,7 @@ Hello World! This is codefuse!
 **The mission of CodeFuse is to develop Code Large Language Models (Code LLMs) specifically designed to support the entire software development lifecycle, covering crucial stages such as design, requirements, coding, testing, deployment, operations, and maintenance.** We are passionate about creating innovative solutions that empower developers throughout the software development process.
 
 In this release, we are open sourcing
-1. [**The MFT (Multi-Task Fine-Tuning) framework, known as
+1. [**The MFT (Multi-Task Fine-Tuning) framework, known as MFTCoder**](https://github.com/codefuse-ai/MFTCoder);
 2. **Two datasets for enhancing the coding capabilities of LLMs**, that is, [Code Exercise](https://huggingface.co/datasets/codefuse/CodeExercise-Python-27k) and [Evol-Instruction](https://huggingface.co/datasets/codefuse/Evol-instruction-66k);
 3. [**A faster and more reliable deployment framework based on FasterTransformer**](https://github.com/codefuse-ai/FasterTransformer4CodeFuse);
 
@@ -37,7 +37,7 @@ The mission of CodeFuse is to develop Code Large Language Models (Code LLMs) specifically designed to support the entire software development lifecycle
 
 
 In this release, we are open sourcing the following:
-1. [**The MFT (Multi-Task Fine-Tuning) framework, also known as
+1. [**The MFT (Multi-Task Fine-Tuning) framework, also known as MFTCoder**](https://github.com/codefuse-ai/MFTCoder);
 2. **Two datasets for enhancing the coding capabilities of LLMs**, including [Code Exercise](https://huggingface.co/datasets/codefuse/CodeExercise-Python-27k) and [Evol-Instruction](https://huggingface.co/datasets/codefuse/Evol-instruction-66k);
 3. [**A faster and more reliable deployment framework based on FasterTransformer**](https://github.com/codefuse-ai/FasterTransformer4CodeFuse);
 
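The two datasets referenced in this change are hosted on the Hugging Face Hub, so they can be pulled programmatically. Below is a minimal sketch using the `datasets` library; it is not part of this commit, the dataset IDs are copied verbatim from the README links above, and the available splits and column names should be verified on the Hub dataset pages.

```python
from datasets import load_dataset

# Dataset IDs as linked in the README (assumption: they resolve on the Hub
# under these names; verify splits and columns on the dataset pages).
code_exercise = load_dataset("codefuse/CodeExercise-Python-27k")
evol_instruction = load_dataset("codefuse/Evol-instruction-66k")

# Print the returned DatasetDicts to inspect the available splits and columns.
print(code_exercise)
print(evol_instruction)
```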