How to fine-tune this model to get better Polish translations

#3
by Devdarshan - opened

Can someone guide me on how I can fine-tune this model to get better Polish translations of paragraphs and longer passages, with better-quality grammar?

@Devdarshan A better approach is to find a good translation dataset and fine-tune a Mistral 7B model instead; check my repo.
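
If you do want to fine-tune the OPUS-MT checkpoint itself rather than switch to Mistral, a minimal sketch with the Hugging Face Seq2SeqTrainer could look like the following. The checkpoint id, the OPUS-100 en-pl dataset, and the hyperparameters are assumptions/placeholders, not a tested recipe; swap in your own parallel data.

```python
# Minimal fine-tuning sketch for a Marian/OPUS-MT checkpoint with Seq2SeqTrainer.
# Model id, dataset, and hyperparameters below are assumptions -- adjust to your setup.
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_id = "Helsinki-NLP/opus-mt-tc-big-en-pl"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumed parallel corpus: OPUS-100 en-pl, with a "translation" column of {"en", "pl"} dicts.
dataset = load_dataset("opus100", "en-pl")

def preprocess(batch):
    inputs = [ex["en"] for ex in batch["translation"]]
    targets = [ex["pl"] for ex in batch["translation"]]
    return tokenizer(inputs, text_target=targets, max_length=256, truncation=True)

tokenized = dataset.map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-en-pl-finetuned",
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=1,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
trainer.save_model("opus-mt-en-pl-finetuned")
```

The same recipe applies to any seq2seq checkpoint; if you go the Mistral 7B route instead, you would format the pairs as instruction-style prompts and use a causal-LM trainer rather than Seq2SeqTrainer.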

Has anyone tried to generate with this one? The output is always empty (or rather, it contains only the bos and eos tokens). Other Opus-mt-tc-en-... models generate just fine.
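
For anyone who wants to reproduce this, a minimal generation call would look roughly like the following (a sketch; the repo id is a stand-in for this model's actual id):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Helsinki-NLP/opus-mt-tc-big-en-pl"  # stand-in; use this repo's actual id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

batch = tokenizer(["The weather is nice today."], return_tensors="pt")
output_ids = model.generate(**batch)

print(output_ids)  # reported behaviour: only special tokens (bos/eos), no real output
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))  # reported: ['']
```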

Owner

Hi @Tomsterrus, thanks for reporting this. Indeed, there seems to be a problem with the model, probably introduced by newer versions of the transformers library. I sadly won't have time to look into this myself, but re-exporting the original model from the Tatoeba repository might work. I'll add a mention in the README.
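
For anyone who wants to try that route, a rough sketch is below. The converter invocation follows the Marian conversion tooling documented in the transformers repo, but the exact language-pair name and output directory here are assumptions:

```python
# Assumed workflow: clone the transformers and Tatoeba-Challenge repos, then run the
# Marian -> PyTorch converter, e.g. (language pair and destination are assumptions):
#   python src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py \
#       --models eng-pol --dest converted/
from transformers import MarianMTModel, MarianTokenizer

local_dir = "converted/opus-mt-eng-pol"  # hypothetical converter output directory

tokenizer = MarianTokenizer.from_pretrained(local_dir)
model = MarianMTModel.from_pretrained(local_dir)

# Quick sanity check that the re-exported model no longer returns empty output.
batch = tokenizer(["This is a quick sanity check."], return_tensors="pt")
output_ids = model.generate(**batch)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))
```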
