Update README.md
[GitHub](https://github.com/THUDM/CodeGeeX4)

We introduce CodeGeeX4-ALL-9B, the open-source version of the latest CodeGeeX4 model series. It is a multilingual code generation model continually trained on [GLM-4-9B](https://github.com/THUDM/GLM-4), significantly enhancing its code generation capabilities. A single CodeGeeX4-ALL-9B model supports comprehensive functions such as code completion and generation, code interpretation, web search, function calling, and repository-level code Q&A, covering various scenarios of software development. CodeGeeX4-ALL-9B has achieved highly competitive performance on public benchmarks such as [BigCodeBench](https://huggingface.co/spaces/bigcode/bigcodebench-leaderboard) and [NaturalCodeBench](https://github.com/THUDM/NaturalCodeBench). It is currently the most powerful code generation model with fewer than 10B parameters, even surpassing much larger general-purpose models, and it achieves the best balance between inference speed and model performance.

## Get Started

Use `4.39.0 <= transformers <= 4.40.2` to quickly launch [codegeex4-all-9b](https://huggingface.co/THUDM/codegeex4-all-9b):

```python
import torch
# ... (model and tokenizer loading, prompt encoding, and generation are elided here)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
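Since the snippet above is sensitive to the installed `transformers` version, a quick stdlib-only check can confirm the install is in the supported range. This is a sketch: the version bounds come from the text above, and the helper names are illustrative.

```python
def version_tuple(version: str) -> tuple:
    # "4.40.2" -> (4, 40, 2); only the first three numeric segments are compared
    return tuple(int(part) for part in version.split(".")[:3])

def supported(installed: str, low: str = "4.39.0", high: str = "4.40.2") -> bool:
    # True when low <= installed <= high, comparing numeric release tuples
    return version_tuple(low) <= version_tuple(installed) <= version_tuple(high)

print(supported("4.40.2"))  # True: within range
print(supported("4.41.0"))  # False: too new
```

In practice you would pass `transformers.__version__` as `installed`.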
If you want to build the **chat** prompt manually, make sure it follows this format:
```
f"<|system|>\n{system_prompt}\n<|user|>\n{prompt}\n<|assistant|>\n"
```
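As a minimal sketch of assembling that format in plain Python (the `system_prompt` and `prompt` values below are illustrative placeholders, not part of the model card):

```python
# Build the CodeGeeX4 chat prompt by hand; the role tags follow the format
# above, while the actual message contents are placeholders.
system_prompt = "You are an intelligent programming assistant."
prompt = "Write a quick sort function in Python."

chat_prompt = f"<|system|>\n{system_prompt}\n<|user|>\n{prompt}\n<|assistant|>\n"
print(chat_prompt)
```

The resulting string can be tokenized and passed to `model.generate` in place of a template-rendered conversation.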