---
language: gn
datasets:
- wikipedia
- wiktionary
widget:
- text: "Paraguay ha'e peteĩ táva oĩva [MASK] retãme"
---
# BERT-i-base-cased (gnBERT-base-cased)
A pre-trained BERT base model for **Guarani** (12 layers, cased), trained on Wikipedia and Wiktionary (~800K tokens).
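A minimal usage sketch with the `transformers` fill-mask pipeline, using the widget sentence above as input. The model id `mmaguero/gn-bert-base-cased` is assumed from the repository name.

```python
from transformers import pipeline

# Assumed model id, inferred from the repo name.
fill_mask = pipeline("fill-mask", model="mmaguero/gn-bert-base-cased")

# BERT-style masked-token prediction on the example sentence.
predictions = fill_mask("Paraguay ha'e peteĩ táva oĩva [MASK] retãme")
for pred in predictions:
    print(pred["token_str"], round(pred["score"], 4))
```

Each prediction dict also carries the full filled-in `sequence`, which is convenient for downstream inspection.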