马仕镕 committed
Commit
941577e
1 Parent(s): 6e11005

Update README.md

Files changed (1):
  1. README.md +7 -2
README.md CHANGED
@@ -206,7 +206,7 @@ print(result)
 import torch
 from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig
 
-model_name = "deepseek-ai/DeepSeek-V2-Chat-RL"
+model_name = "deepseek-ai/DeepSeek-V2-Chat"
 tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
 # `max_memory` should be set based on your devices
 max_memory = {i: "75GB" for i in range(8)}
@@ -223,8 +223,11 @@ outputs = model.generate(input_tensor.to(model.device), max_new_tokens=100)
 result = tokenizer.decode(outputs[0][input_tensor.shape[1]:], skip_special_tokens=True)
 print(result)
 ```
-The complete chat template can be founded within `tokenizer_config.json` located in the huggingface model repository/
+
+The complete chat template can be found within `tokenizer_config.json` located in the huggingface model repository.
+
 An example of chat template is as belows:
+
 ```bash
 <|begin▁of▁sentence|>User: {user_message_1}
 
@@ -232,7 +235,9 @@ Assistant: {assistant_message_1}<|end▁of▁sentence|>User: {user_message_2
 
 Assistant:
 ```
+
 You can also add an optional system message:
+
 ```bash
 <|begin▁of▁sentence|>{system_message}
 
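For reference, the prompt layout shown in the updated README's fenced examples can be sketched as plain string formatting. This is only an illustration of the layout — `render_chat` is a hypothetical helper, not part of the repository; in practice the authoritative Jinja template lives in `tokenizer_config.json` and is applied via `tokenizer.apply_chat_template`:

```python
# Sketch of the chat prompt layout from the README diff above.
# Assumption: a system message, when present, sits directly after the
# BOS token followed by a blank line, as the second example suggests.
BOS = "<|begin▁of▁sentence|>"
EOS = "<|end▁of▁sentence|>"


def render_chat(messages, system_message=None):
    """Format [{"role": ..., "content": ...}, ...] into the prompt layout."""
    prompt = BOS
    if system_message is not None:
        prompt += system_message + "\n\n"
    for m in messages:
        if m["role"] == "user":
            prompt += f"User: {m['content']}\n\n"
        elif m["role"] == "assistant":
            # Each completed assistant turn is terminated by the EOS token.
            prompt += f"Assistant: {m['content']}{EOS}"
    prompt += "Assistant:"  # generation prompt for the model's next reply
    return prompt
```

A single-turn call, `render_chat([{"role": "user", "content": "hi"}])`, yields `<|begin▁of▁sentence|>User: hi\n\nAssistant:`, matching the first fenced example.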