Update README.md

README.md (changed):

````diff
@@ -86,7 +86,7 @@ outputs = model.generate(input_prompt['input_ids'], max_new_tokens=256, do_sampl
 print(tokenizer.batch_decode(outputs)[0])
 ```

-Please make sure that the BOS token is always included in the tokenized prompts. This
+Please make sure that the BOS token is always included in the tokenized prompts. This might not be the default setting in all evaluation or fine-tuning frameworks.

 # Evaluation

````
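The added README note is about verifying that tokenized prompts start with the BOS token. A minimal sketch of such a check, assuming a list of token ids: `ensure_bos` and the BOS id value `1` are illustrative assumptions, not part of this repo; with a Hugging Face tokenizer you would compare against `tokenizer.bos_token_id` instead.

```python
def ensure_bos(token_ids, bos_id):
    """Prepend the BOS id if the tokenized prompt does not already start with it."""
    if not token_ids or token_ids[0] != bos_id:
        return [bos_id] + list(token_ids)
    return list(token_ids)

# Hypothetical BOS id of 1 for illustration.
print(ensure_bos([306, 4658], 1))     # -> [1, 306, 4658]
print(ensure_bos([1, 306, 4658], 1))  # already present -> unchanged
```

Frameworks that tokenize with `add_special_tokens=False` (or that concatenate pre-tokenized chunks) are the usual way the BOS token gets dropped, which is why the README calls it out.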