Felladrin committed
Commit ae4bd39
1 Parent(s): fd57078

Remove check for undefined `add_generation_prompt` from the `chat_template`, which was causing an issue when using transformers.js

Files changed (1)
  1. tokenizer_config.json +1 -1
tokenizer_config.json CHANGED
@@ -26,7 +26,7 @@
     }
   },
   "bos_token": "<s>",
- "chat_template": "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
+ "chat_template": "{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
  "clean_up_tokenization_spaces": false,
  "eos_token": "</s>",
  "legacy": false,
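A minimal sketch of why the `is defined` guard is removable: in standard Jinja2 (which `transformers` uses to render `chat_template`), an unset variable is simply falsy inside `{% if %}`, so the template behaves the same without the explicit check. The message list and rendering below are illustrative, not taken from the commit.

```python
from jinja2 import Template

# The new chat_template from this commit (Jinja2 decodes the \n escapes).
chat_template = (
    "{% for message in messages %}"
    "{{'<|im_start|>' + message['role'] + '\\n' + message['content'] + '<|im_end|>' + '\\n'}}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\\n' }}{% endif %}"
)

messages = [{"role": "user", "content": "Hello!"}]

# add_generation_prompt omitted: Jinja2's Undefined is falsy, so no
# assistant header is appended -- no `is defined` guard needed.
print(Template(chat_template).render(messages=messages))

# add_generation_prompt=True: the template appends the assistant header
# so the model continues from there.
print(Template(chat_template).render(messages=messages, add_generation_prompt=True))
```

transformers.js ships its own Jinja implementation, and the `not add_generation_prompt is defined` expression was what tripped it up; dropping the guard keeps the template within the subset both renderers handle.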