---
language: gn
license: mit
datasets:
- wikipedia
- wiktionary
widget:
- text: "Paraguay ha'e peteĩ táva oĩva [MASK] retãme"
- text: "Augusto Roa Bastos ha'e peteĩ [MASK] arandu"
---

# BERT-i-large-cased (gnBERT-large-cased)

A pre-trained BERT model for **Guarani** (24 layers, cased), trained on Wikipedia + Wiktionary (~800K tokens).
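The widget prompts above can also be queried locally with the 🤗 Transformers `fill-mask` pipeline. This is a minimal sketch; the repository id `gn-bert-large-cased` used below is an assumption — replace it with this model's actual Hugging Face id.

```python
from transformers import pipeline

# Hypothetical model id -- substitute the real repository path for this model.
MODEL_ID = "gn-bert-large-cased"

# The two example prompts from the model card widget.
prompts = [
    "Paraguay ha'e peteĩ táva oĩva [MASK] retãme",
    "Augusto Roa Bastos ha'e peteĩ [MASK] arandu",
]

def top_predictions(model_id: str = MODEL_ID, k: int = 5):
    """Return the top-k fill-mask candidates for each prompt."""
    fill_mask = pipeline("fill-mask", model=model_id)
    return {p: [c["token_str"] for c in fill_mask(p, top_k=k)] for p in prompts}
```

Each returned candidate also carries a `score` field, so the dictionary above could easily be extended to report model confidence alongside the predicted tokens.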