---
language:
- es
tags:
- masked-lm
widget:
- text: "La mayor ventaja de la democracia es su [MASK]."
  example_title: "Ejemplo 1"
---
|
|
|
# PolitiBETO: A Spanish BERT Adapted to the Domain of Political Tweets
|
|
|
PolitiBETO is a [BERT model](https://github.com/google-research/bert) tailored for political tasks on social media corpora. It is the result of domain adaptation on top of [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased), a BERT model pretrained on Spanish text.
|
This model is meant to be fine-tuned for downstream tasks. |
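Before fine-tuning, the model can be probed directly as a masked language model, as in the widget example above. A minimal sketch with the 🤗 Transformers `fill-mask` pipeline follows; since this card does not state PolitiBETO's own repository id, the sketch uses the BETO base-model id linked above as a stand-in — substitute the PolitiBETO checkpoint id to get the domain-adapted predictions.

```python
from transformers import pipeline

# Stand-in checkpoint: the base BETO model linked above.
# Replace with the PolitiBETO repository id for domain-adapted output.
fill_mask = pipeline(
    "fill-mask",
    model="dccuchile/bert-base-spanish-wwm-uncased",
)

# Prompt taken from the widget example above.
predictions = fill_mask(
    "La mayor ventaja de la democracia es su [MASK].",
    top_k=3,
)
for p in predictions:
    # Each prediction carries the filled token and its probability.
    print(f"{p['token_str']}  (score={p['score']:.3f})")
```

For downstream tasks such as author profiling, the same checkpoint would instead be loaded with a task head (e.g. `AutoModelForSequenceClassification`) and fine-tuned on labeled data.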
|
|
|
|
|
## Citation |
|
|
|
[NLP-CIMAT at PoliticEs 2022: PolitiBETO, a Domain-Adapted Transformer for Multi-class Political Author Profiling](https://ceur-ws.org/Vol-3202/politices-paper2.pdf) |
|
|
|
To cite this model in a publication, please use the following:
|
|
|
```
@inproceedings{PolitiBeto2022,
  title     = {{NLP-CIMAT} at {P}olitic{E}s 2022: {P}oliti{BETO}, a {D}omain-{A}dapted {T}ransformer for {M}ulti-class {P}olitical {A}uthor {P}rofiling},
  author    = {Emilio Villa-Cueva and Ivan Gonz{\'a}lez-Franco and Fernando Sanchez-Vega and Adri{\'a}n Pastor L{\'o}pez-Monroy},
  booktitle = {Proceedings of the Iberian Languages Evaluation Forum (IberLEF 2022)},
  series    = {{CEUR} Workshop Proceedings},
  publisher = {CEUR-WS},
  year      = {2022}
}
```