Updated with a working chat_template #4
by CISCai - opened
Thanks for contributing! We really appreciate it, and sorry for the late response. We found that different packages use different chat templates, so we haven't updated the official chat_template; instead, we recommend developers use our official get_prompt() function as the ground truth to avoid inconsistencies. We will update the system prompt in the chat_template from Deepseek to Gorilla LLM. Thanks again for catching this!
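For reference, here is a minimal sketch of what building a prompt with a get_prompt-style helper looks like, as opposed to relying on the tokenizer's chat_template. The exact system prompt wording and the example function schema below are illustrative assumptions; treat the official get_prompt in the model repo as the ground truth.

```python
import json

def get_prompt(user_query: str, functions: list | None = None) -> str:
    # Sketch only: system prompt text and tag format are assumptions,
    # following the Gorilla LLM style rather than the Deepseek default.
    system = (
        "You are an AI programming assistant, utilizing the Gorilla LLM model, "
        "developed by Gorilla LLM, and you only answer questions related to computer science."
    )
    if not functions:
        return f"{system}\n### Instruction: <<question>> {user_query}\n### Response: "
    functions_string = json.dumps(functions)
    return (
        f"{system}\n### Instruction: <<function>>{functions_string}"
        f"\n<<question>> {user_query}\n### Response: "
    )

# Example usage with a hypothetical function schema:
prompt = get_prompt(
    "What's the weather like in Boston?",
    functions=[{
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
        },
    }],
)
print(prompt)
```

Feeding this prompt to the model directly sidesteps any per-package differences in how chat templates are applied.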
We'll close this issue for now. If you have additional questions or concerns, let us know and we'll reopen the thread. Thanks again!
CharlieJi changed pull request status to closed