
Hungarian-centered 12-lingual finetuned M2M100_1.2B model

For further details, see our demo site.

  • Source languages: Bulgarian (bg), Czech (cs), German (de), English (en), Croatian (hr), Polish (pl), Romanian (ro), Russian (ru), Slovak (sk), Slovene (sl), Serbian (sr), Ukrainian (uk)

  • Target language: Hungarian (hu)

  • Finetuned on subcorpora from OPUS

    • Segments: 3 million per language

Limitations

  • max_source_length: 256
  • max_target_length: 256
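A minimal inference sketch with the Hugging Face transformers M2M100 classes, truncating inputs to the 256-token limits above. The model ID below is a placeholder for the base checkpoint; substitute this repository's own ID when loading the finetuned model.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# Placeholder ID -- replace with this finetuned model's repository name.
MODEL_ID = "facebook/m2m100_1.2B"

def translate_to_hungarian(text: str, src_lang: str = "en",
                           max_source_length: int = 256,
                           max_target_length: int = 256) -> str:
    """Translate one segment into Hungarian, respecting the length limits."""
    tokenizer = M2M100Tokenizer.from_pretrained(MODEL_ID)
    model = M2M100ForConditionalGeneration.from_pretrained(MODEL_ID)
    tokenizer.src_lang = src_lang  # e.g. "en", "de", "pl", ...
    encoded = tokenizer(text, return_tensors="pt",
                        truncation=True, max_length=max_source_length)
    generated = model.generate(
        **encoded,
        forced_bos_token_id=tokenizer.get_lang_id("hu"),  # force Hungarian output
        max_length=max_target_length,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

if __name__ == "__main__":
    print(translate_to_hungarian("The weather is nice today.", src_lang="en"))
```

Segments longer than 256 tokens are silently truncated, so long documents should be split into sentence-sized segments before translation.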

Citation

If you use this model, please cite the following paper:

@article{laki-yang-12lang,
    title = {Solving Hungarian natural language processing tasks with multilingual generative models},
    journal = {Annales Mathematicae et Informaticae},
    year = {2023},
    author = {Yang, Zijian Győző and Laki, László János},
    volume = {57},
    pages = {92--106},
    doi = {10.33039/ami.2022.11.001}
}