metadata
license: cc-by-4.0
language:
- en
- ar
- tl
- es
- ru
- ko
- ja
- ms
- id
- he
- hr
- pl
- pt
- af
- am
- az
- et
- eo
- th
- te
- ta
- tr
- uk
- ur
- uz
- vi
- ro
- km
- da
- de
- fi
- fa
- fr
- ga
- hu
- ky
- sa
tags:
- multilingual
- bert
- roberta
- xlmr
- bm
Model type: Transformer-based masked language model
Training data: none; the model is produced by merging two existing pretrained models, with no additional pretraining
Languages: 100+ languages
Architecture:
- Base architectures:
  - XLM-RoBERTa base (multilingual)
  - BERT base multilingual cased (mBERT)
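The card states that the model merges two existing pretrained models rather than training from scratch, but does not specify the merge method. A minimal sketch of one common approach, element-wise parameter averaging, is shown below on toy state dicts; the function name `merge_state_dicts` and the toy parameter names are illustrative only, and real XLM-RoBERTa and mBERT checkpoints differ in vocabulary and tensor layout, so an actual merge would first need shape alignment.

```python
# Hypothetical sketch: averaging parameters that two models share by name,
# assuming a simple 50/50 interpolation (the card does not confirm this).

def merge_state_dicts(sd_a, sd_b, alpha=0.5):
    """Average same-named, same-length parameters; keep the rest from sd_a."""
    merged = {}
    for name, w_a in sd_a.items():
        w_b = sd_b.get(name)
        if w_b is not None and len(w_a) == len(w_b):
            # Shared parameter: element-wise interpolation.
            merged[name] = [alpha * a + (1 - alpha) * b for a, b in zip(w_a, w_b)]
        else:
            # No counterpart in the second model: carry over unchanged.
            merged[name] = list(w_a)
    return merged

# Toy example standing in for two base checkpoints.
xlmr_like = {"encoder.weight": [1.0, 3.0], "lm_head.bias": [0.5]}
mbert_like = {"encoder.weight": [3.0, 1.0]}
print(merge_state_dicts(xlmr_like, mbert_like))
# {'encoder.weight': [2.0, 2.0], 'lm_head.bias': [0.5]}
```

Because the two shared-parameter vectors average element-wise while the unmatched bias is copied through, the sketch shows only the mechanics of a merge, not the specific procedure used for this model.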