How to prompt this model for its (initial) translation task?

#5
by alexcardo - opened

I'm trying to use this model for a translation task, with the llama.cpp server as a backend and ChatML as the chat template. But the model refuses to follow the instructions. Can you please provide the exact steps needed to force the model to translate from the source to the target language? For example, I want to translate some text from Danish to Dutch.

./llama-server -m ~/.cache/lm-studio/models/RichardErkhov/BSC-LT_-_salamandra-7b-instruct-gguf/salamandra-7b-instruct.Q4_K_S.gguf -c 2048 --temp 1 --chat-template chatml --verbose
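In case it helps while debugging, here is a minimal sketch of sending an explicit translation instruction to the server. It assumes the server above is running on the default port 8080 and exposes the OpenAI-compatible /v1/chat/completions endpoint (recent llama.cpp builds do); the exact system-prompt wording is my own guess, not an official recommendation for this model:

```python
import json
import urllib.request

def build_translation_request(text, source="Danish", target="Dutch"):
    # Put the language pair in an explicit system instruction instead of
    # relying on bare language tags in the user turn; instruct-tuned models
    # generally follow this more reliably.
    return {
        "messages": [
            {"role": "system",
             "content": (f"You are a translator. Translate the user's text "
                         f"from {source} to {target}. "
                         f"Output only the translation.")},
            {"role": "user", "content": text},
        ],
        "temperature": 0.2,  # low temperature: translation should be near-deterministic
    }

def send(payload, url="http://127.0.0.1:8080/v1/chat/completions"):
    # POST the JSON payload to the llama.cpp server and return the reply text.
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

payload = build_translation_request("Hej, hvordan har du det?")
# send(payload)  # requires the llama-server from the command above to be running
```

Note that --temp 1 in the command above also works against you here: for translation, a much lower temperature usually keeps the model on task.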

Also, if possible, please explain how to use the base model, as it always translates into Catalan only, no matter what I write. For example:

[English]{text} \n[Danish]
['English']{text} \n['Danish']
....
whatever I try, the output is only Catalan....
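Since the base model is a plain completion model, a few-shot prompt with one or two in-context example pairs usually anchors the target language much better than a bare language tag. A sketch of building such a prompt (the tag format and the single Danish example pair are my own illustration, not this model's documented prompt format):

```python
def few_shot_prompt(text, source="English", target="Danish", examples=()):
    # Build a completion-style prompt: demonstration pairs first,
    # then the real input with the target side left open.
    lines = []
    for src, tgt in examples:
        lines.append(f"[{source}]: {src}")
        lines.append(f"[{target}]: {tgt}")
    lines.append(f"[{source}]: {text}")
    lines.append(f"[{target}]:")  # the model completes from here
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Good morning.",
    examples=[("Thank you very much.", "Mange tak.")],
)
print(prompt)
```

With a stop sequence on "\n[" (llama-server accepts a "stop" field in the request), the completion should end after the translated line instead of hallucinating further pairs.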

Perhaps llama.cpp doesn't fully support this model, although since it uses the standard ChatML template that seems unlikely; instead, the model just returns the same output in the source language...
