Update chat template

#6
by CISCai - opened

I know it's a bit of a pain, but could you update these GGUFs with the latest chat template now that llama.cpp supports it?

At least you won't have to requantize everything: I made a handy script that creates a new GGUF from the updated tokenizer_config.json file; see the details in the PR. :)

PS: You only have to update the first file in a split GGUF.
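For context, the template in question lives in the `chat_template` field of the model's tokenizer_config.json, and in a GGUF it is stored under the metadata key `tokenizer.chat_template`, which llama.cpp reads at load time. A minimal sketch of pulling the template out of tokenizer_config.json (the helper name `load_chat_template` is hypothetical; the actual GGUF rewrite is done by the script referenced in the PR, not shown here):

```python
import json

def load_chat_template(path: str) -> str:
    """Read the Jinja chat template from a HF tokenizer_config.json.

    The returned string is what belongs in the GGUF metadata key
    "tokenizer.chat_template" when updating an existing GGUF.
    """
    with open(path, encoding="utf-8") as f:
        cfg = json.load(f)
    template = cfg.get("chat_template")
    if not isinstance(template, str):
        raise ValueError("tokenizer_config.json has no string chat_template")
    return template
```

Since a split GGUF keeps its metadata in the first shard, only that file needs the rewritten key, as the PS above notes.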

Owner

Thanks for the PR. I've updated all the weights in this repo with the new chat template.

Thank you for the quick response. :)

CISCai changed discussion status to closed
