
Citation

If you use this model, please cite the following paper:

@inproceedings{yang-language-models,
    title = {Training language models with low resources: RoBERTa, BART and ELECTRA experimental models for Hungarian},
    booktitle = {Proceedings of the 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Yang, Zijian Győző and Váradi, Tamás},
    pages = {279--285}
}