fine / added_tokens.json
Add trained model and tokenizer
3f58367
{
"<|im_end|>": 50258,
"<|im_start|>": 50257
}
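The mapping above can be read directly with the standard `json` module. A minimal sketch, assuming the base tokenizer is GPT-2-style (vocabulary size 50257, ids 0–50256), so the two ChatML special tokens are appended immediately after the base vocabulary:

```python
import json

# Contents of added_tokens.json as committed above.
ADDED_TOKENS = json.loads('{"<|im_end|>": 50258, "<|im_start|>": 50257}')

# Assumption: GPT-2-style base vocabulary of 50257 tokens (ids 0..50256).
GPT2_VOCAB_SIZE = 50257

# Added special tokens occupy the ids directly past the base vocabulary.
for token, token_id in sorted(ADDED_TOKENS.items(), key=lambda kv: kv[1]):
    offset = token_id - GPT2_VOCAB_SIZE
    print(f"{token} -> id {token_id} ({offset} past base vocab)")
```

The ordering matters: `<|im_start|>` gets the first free id (50257) and `<|im_end|>` the next (50258), which matches how tokenizers typically append new special tokens in commit order.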