irenedea committed
Commit 3222c6f (1 parent: 4a14e9e)

Add chat_template to tokenizer_config.json


Manually tested with
```
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-7b-8k-chat', revision='refs/pr/7')

chat = [
{"role": "system", "content": "This is a system prompt!"},
{"role": "user", "content": "Hello, how are you?"},
{"role": "assistant", "content": "I'm doing great. How can I help you today?"},
{"role": "user", "content": "I'd like to show off how chat templating works!"},
]

print(tokenizer.apply_chat_template(chat, tokenize=False))

# Remove system prompt
chat = chat[1:]

print("\nUsing default system prompt!\n")

print(tokenizer.apply_chat_template(chat, tokenize=False))
```

output:
```
<|im_start|>system
This is a system prompt!
<|im_start|>user
Hello, how are you?<|im_end|>
<|im_start|>assistant
I'm doing great. How can I help you today?<|im_end|>
<|im_start|>user
I'd like to show off how chat templating works!<|im_end|>

Using default system prompt!

<|im_start|>system
A conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.
<|im_start|>user
Hello, how are you?<|im_end|>
<|im_start|>assistant
I'm doing great. How can I help you today?<|im_end|>
<|im_start|>user
I'd like to show off how chat templating works!<|im_end|>
```
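
The template also handles `add_generation_prompt`. As a minimal sketch (not part of the original test, same PR revision assumed), passing `add_generation_prompt=True` makes the rendered prompt end with an open assistant turn for the model to complete:
```
from transformers import AutoTokenizer

# Sketch only: reuses the PR revision from the test above.
tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-7b-8k-chat', revision='refs/pr/7')

chat = [{"role": "user", "content": "Hello, how are you?"}]

# With add_generation_prompt=True, the template appends
# "\n<|im_start|>assistant\n" after the last message.
print(tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True))
```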

Files changed (1):
  1. tokenizer_config.json (+2 -1)
tokenizer_config.json:
```
@@ -5,5 +5,6 @@
   "eos_token": "<|endoftext|>",
   "model_max_length": 8192,
   "tokenizer_class": "GPTNeoXTokenizer",
-  "unk_token": "<|endoftext|>"
+  "unk_token": "<|endoftext|>",
+  "chat_template": "{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% elif not 'system' in messages[0]['role'] %}{% set loop_messages = messages %}{% set system_message = 'A conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.' %}{% else %}{% set loop_messages = messages %}{% set system_message = false %}{% endif %}{% for message in loop_messages %}{% if loop.index0 == 0 %}{% if system_message != false %}{{ '<|im_start|>system\n' + system_message.strip() + '\n'}}{% endif %}{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' }}{% else %}{{ '\n' + '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' }}{% endif %}{% if (add_generation_prompt == true) %}{{ '\n' + '<|im_start|>' + 'assistant' + '\n' }}{% elif (message['role'] == 'assistant') %}{% endif %}{% endfor %}"
 }
```
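
As a quick sanity check (a sketch, assuming the PR revision), the new `chat_template` field from tokenizer_config.json is surfaced by `transformers` as `tokenizer.chat_template`, and `apply_chat_template` with `tokenize=True` (the default) returns token ids directly:
```
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-7b-8k-chat', revision='refs/pr/7')

# The raw Jinja template string loaded from tokenizer_config.json.
print(tokenizer.chat_template)

chat = [
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing great. How can I help you today?"},
]

# tokenize=True returns token ids rather than the rendered string.
ids = tokenizer.apply_chat_template(chat, tokenize=True)
print(ids[:10])
```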