Twitter-XLM-Roberta-large
This is an XLM-T large language model specialised in Twitter data. The base model is a multilingual XLM-R (large), which was further trained on over one billion tweets in multiple languages, collected up to December 2022.
To evaluate this and other language models on Twitter-specific data, please refer to the main XLM-T repository. A base-sized XLM-T model and sample code are available there.
Finally, this model is fully compatible with the TweetNLP library.
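As a masked language model, it can be queried directly with the Hugging Face transformers fill-mask pipeline. The sketch below assumes the checkpoint is published on the Hub; the identifier used here is a placeholder and should be replaced with the exact name shown on this model's page.

```python
# Minimal sketch: masked-token prediction with the transformers fill-mask
# pipeline. The checkpoint identifier below is an assumption, not confirmed
# by this card; substitute the actual model name from the Hub page.
from transformers import pipeline

MODEL = "cardiffnlp/twitter-xlm-roberta-large"  # hypothetical identifier

def top_mask_predictions(text: str, k: int = 5):
    """Return the top-k token predictions for the <mask> slot in `text`."""
    fill = pipeline("fill-mask", model=MODEL, top_k=k)
    return [pred["token_str"] for pred in fill(text)]

if __name__ == "__main__":
    # XLM-R tokenizers use <mask> as the mask token.
    print(top_mask_predictions("I love <mask> tweets!"))
```

Because the model is multilingual, the same call works for input text in any of the languages covered by the tweet pretraining data.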
BibTeX entry and citation info
More information can be found in the reference papers on multilingual language models for Twitter and on time-specific models. Please cite the relevant papers if you use this model.
@inproceedings{barbieri-etal-2022-xlm,
    title = "{XLM}-{T}: Multilingual Language Models in {T}witter for Sentiment Analysis and Beyond",
    author = "Barbieri, Francesco and
      Espinosa Anke, Luis and
      Camacho-Collados, Jose",
    booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.lrec-1.27",
    pages = "258--266"
}
@inproceedings{loureiro-etal-2022-timelms,
    title = "{T}ime{LM}s: Diachronic Language Models from {T}witter",
    author = "Loureiro, Daniel and
      Barbieri, Francesco and
      Neves, Leonardo and
      Espinosa Anke, Luis and
      Camacho-collados, Jose",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.acl-demo.25",
    doi = "10.18653/v1/2022.acl-demo.25",
    pages = "251--260"
}