Template for llama.cpp server

#6
by alexcardo - opened

Can you please provide the correct template to use with the llama.cpp server? I've been trying numerous approaches, but I always get endless output.

The only way that worked for me was using this quant with Ollama; however, it produced incorrect output right from the title. I've been trying to translate from English to Dutch.

It seems impossible to find the correct Llama 3 template for plain text generation rather than chatting with the model.

I don't need to chat with it. I just want my prompt to be "Translate from English to Dutch: {some text}".
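
To make it concrete, this is roughly what I've been trying against the server's /completion endpoint — a minimal sketch assuming the standard Llama 3 Instruct special tokens and a server running on localhost:8080; the stop token, sampling values, and example text are just my guesses:

```python
import requests  # assumes the llama.cpp server is already running, e.g. ./server -m model.gguf --port 8080

SERVER_URL = "http://localhost:8080/completion"  # default llama.cpp server completion endpoint

text = "The weather in Amsterdam is lovely this time of year."  # placeholder text to translate

# Wrap the instruction in the Llama 3 Instruct chat format by hand,
# since /completion does not apply the model's chat template for you.
prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    f"Translate from English to Dutch: {text}<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)

payload = {
    "prompt": prompt,
    "n_predict": 256,          # cap generation so a missing stop token can't run forever
    "temperature": 0.2,        # low temperature for a translation task
    "stop": ["<|eot_id|>"],    # my guess: without this the model never seems to stop
}

response = requests.post(SERVER_URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["content"])
```

If there is a cleaner way to do this, for example the OpenAI-compatible /v1/chat/completions endpoint applying the model's built-in chat template, I'd happily use that instead.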

Thank you in advance
