How did you train your model to be multilingual with SD?

#1
by jqlive - opened

Hi, I'm very curious about your training methodology and dataset. I'm planning to train a model that is multilingual and good with SD prompts, specifically in Spanish: I want to input a story scene in Spanish and have the LLM both translate it to English and reformat it into an SD prompt. I'm curious how you achieved that with Chinese. I know Qwen is already multilingual in English and Chinese, but how did you train it to handle SD prompts as well? Would you be open to sharing, say, 5 entries of your dataset? Just so I can get an idea of how I could do the same with Spanish and Mistral 7B.
Any guidance would be super appreciated.
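For context, here is a minimal sketch of what one such dataset entry might look like for the Spanish case. This is purely my own guess at a format (Alpaca-style instruction/input/output JSON), not the model author's actual data; the instruction wording and the example scene/prompt pair are invented for illustration.

```python
import json

# Hypothetical instruction-tuning entry (an assumed format, not the author's
# actual dataset): a Spanish story scene paired with an English Stable
# Diffusion prompt, in an Alpaca-style instruction/input/output shape.
entry = {
    "instruction": (
        "Translate the scene to English and rewrite it as a "
        "Stable Diffusion prompt."
    ),
    "input": (
        "Una joven camina por un bosque brumoso al amanecer, con luz "
        "dorada filtrándose entre los árboles."
    ),
    "output": (
        "a young woman walking through a misty forest at dawn, golden "
        "light filtering through the trees, cinematic lighting, highly "
        "detailed"
    ),
}

# Serialize one line of a JSONL training file.
print(json.dumps(entry, ensure_ascii=False))
```

A few thousand pairs in this shape could then be used for supervised fine-tuning of Mistral 7B, though the author's actual setup may differ.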
