bleysg committed
Commit 0f6c100
1 Parent(s): 8df47df

Update README.md

Files changed (1)
  1. README.md +8 -18
README.md CHANGED
@@ -69,30 +69,16 @@ which means that lists of messages can be formatted for you with the `apply_chat_template` method
 
 ```python
 chat = [
- {"role": "user", "content": "Hello, how are you?"},
- {"role": "assistant", "content": "I'm doing great. How can I help you today?"},
- {"role": "user", "content": "I'd like to show off how chat templating works!"},
+ {"role": "system", "content": "You are MistralOrca, a large language model trained by Alignment Lab AI. Write out your reasoning step-by-step to be sure you get the right answers!"},
+ {"role": "user", "content": "How are you?"},
+ {"role": "assistant", "content": "I am doing well!"},
+ {"role": "user", "content": "Please tell me about how mistral winds have attracted super-orcas."},
 ]
 tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
 ```
 
 which will yield:
 
- ```
- <|im_start|>user
- Hello, how are you?<|im_end|>
- <|im_start|>assistant
- I'm doing great. How can I help you today?<|im_end|>
- <|im_start|>user
- I'd like to show off how chat templating works!<|im_end|>
- <|im_start|>assistant
- ```
-
- If you use `tokenize=True` and `return_tensors="pt"` instead, then you will get a tokenized
- and formatted conversation ready to pass to `model.generate()`.
-
- ## Example Prompt Exchange
-
 ```
 <|im_start|>system
 You are MistralOrca, a large language model trained by Alignment Lab AI. Write out your reasoning step-by-step to be sure you get the right answers!
@@ -103,8 +89,12 @@ How are you?<|im_end|>
 I am doing well!<|im_end|>
 <|im_start|>user
 Please tell me about how mistral winds have attracted super-orcas.<|im_end|>
+ <|im_start|>assistant
 ```
 
+ If you use `tokenize=True` and `return_tensors="pt"` instead, then you will get a tokenized
+ and formatted conversation ready to pass to `model.generate()`.
+
 
 # Inference
 
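As a side note on the `tokenize=True` path mentioned in the added lines above, here is a minimal end-to-end sketch. It is illustrative only and not part of this commit: the repo id, dtype, device placement, and generation settings are assumptions.

```python
# Illustrative sketch, not part of this commit. The repo id is an assumed
# placeholder for this model card; substitute the actual repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Open-Orca/Mistral-7B-OpenOrca"  # assumption

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

chat = [
    {"role": "system", "content": "You are MistralOrca, a large language model trained by Alignment Lab AI. Write out your reasoning step-by-step to be sure you get the right answers!"},
    {"role": "user", "content": "Please tell me about how mistral winds have attracted super-orcas."},
]

# tokenize=True with return_tensors="pt" returns token ids ready for generate()
input_ids = tokenizer.apply_chat_template(
    chat, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
# Slice off the prompt so only the newly generated tokens are decoded
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```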
 
 
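Relatedly, the `<|im_start|>assistant` line this commit adds to the example output comes from `add_generation_prompt=True`. A quick way to see what that flag toggles in the rendered prompt string (again a sketch, with the same assumed repo id):

```python
# Sketch only; the repo id is an assumed placeholder.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Open-Orca/Mistral-7B-OpenOrca")

chat = [{"role": "user", "content": "How are you?"}]

# With the flag set, a ChatML-style template ends the string with an opening
# "<|im_start|>assistant" header, cueing the model to reply as the assistant;
# without it, the string stops after the final "<|im_end|>".
print(tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True))
print(tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=False))
```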