Update README.md
README.md CHANGED
@@ -77,7 +77,7 @@ DeepSeek-V2.5 better aligns with human preferences and has been optimized in var
 ## 2. How to run locally
 
 **To utilize DeepSeek-V2.5 in BF16 format for inference, 80GB*8 GPUs are required.**
-### Inference
+### Inference with Huggingface's Transformers
 You can directly employ [Huggingface's Transformers](https://github.com/huggingface/transformers) for model inference.
 
 ```python
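For context, the section this hunk edits introduces Transformers-based inference. A minimal sketch of what that typically looks like follows; the model id, `trust_remote_code` flag, and generation settings are assumptions based on the excerpt, since the README's own Python block is truncated in this diff:

```python
# Hedged sketch of BF16 inference with Hugging Face Transformers, in the style
# the README section describes. Model id and generation settings are assumptions.

def build_chat(prompt: str) -> list:
    """Wrap a user prompt in the chat-message format used by apply_chat_template."""
    return [{"role": "user", "content": prompt}]

def main() -> None:
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "deepseek-ai/DeepSeek-V2.5"  # assumed Hugging Face model id
    tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.bfloat16,   # BF16, per the README's hardware note
        device_map="auto",            # shard across the 8x80GB GPUs
        trust_remote_code=True,
    )
    input_ids = tokenizer.apply_chat_template(
        build_chat("Write a piece of quicksort code in C++."),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(input_ids, max_new_tokens=512)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(outputs[0][input_ids.shape[1]:], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

The model-loading and generation steps are kept inside `main()` because, as the README stresses, actually running them requires 8 GPUs with 80GB of memory each.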