Can someone point to the chat template used for this model? Would be awesome.

#7 opened by hrishbhdalal

When I use .apply_chat_template, I get this warning: No chat template is defined for this tokenizer - using the default template for the LlamaTokenizerFast class. If the default is not appropriate for your model, please set tokenizer.chat_template to an appropriate template. See https://huggingface.co/docs/transformers/main/chat_templating for more information.

If you use tokenizer.apply_chat_template and there is no "chat_template" entry in the model's tokenizer_config.json, a default template from the transformers library (the tokenizer class default mentioned in the warning) is used.
My question is why a chat template is provided in m-a-p/OpenCodeInterpreter-DS-6.7B but not in m-a-p/OpenCodeInterpreter-DS-33B.
And if I apply the 6.7B chat template to 33B, the 33B model returns responses in a weird format, which weakens its accuracy on the HumanEval test.
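For reference, this is roughly what copying the 6.7B template onto the 33B tokenizer looks like. The model IDs come from this thread; whether that template is actually correct for the 33B checkpoint is exactly what is in question here, so treat this as a sketch rather than a fix:

```python
from transformers import AutoTokenizer

# Sketch only: reuse the Jinja chat template shipped with the 6.7B tokenizer
# on the 33B tokenizer, which has no chat_template of its own.
tok_67b = AutoTokenizer.from_pretrained("m-a-p/OpenCodeInterpreter-DS-6.7B")
tok_33b = AutoTokenizer.from_pretrained("m-a-p/OpenCodeInterpreter-DS-33B")

# chat_template is a plain Jinja string stored on the tokenizer.
tok_33b.chat_template = tok_67b.chat_template

messages = [{"role": "user", "content": "Write a function that reverses a string."}]
prompt = tok_33b.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```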

Thanks, I figured it out later, but I was confused because it did not seem to have a system prompt at the time.

How did you solve that problem? Just make a Jinja template for it?

I just used a generic template along the lines of "You are a smart and helpful assistant", etc. I applied it manually by assigning that template to the tokenizer and then calling tokenizer.apply_chat_template.
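For anyone landing here later, here is a minimal sketch of that approach. The Jinja template and system prompt below are illustrative only, not the official template for this model:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("m-a-p/OpenCodeInterpreter-DS-33B")

# Illustrative template: plain text outside {% ... %} / {{ ... }} is emitted
# verbatim, so the generic system prompt is just a literal first line.
tokenizer.chat_template = (
    "System: You are a smart and helpful assistant.\n"
    "{% for message in messages %}"
    "{{ message['role'] | capitalize }}: {{ message['content'] }}\n"
    "{% endfor %}"
    "{% if add_generation_prompt %}Assistant:{% endif %}"
)

messages = [{"role": "user", "content": "Explain list comprehensions in Python."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```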
