theyorubayesian committed · 6ded7d8 · Parent: d34a5b4
Update README.md

README.md CHANGED
@@ -27,7 +27,7 @@ language:
 
 # AfriTeVa V2 Large
 
-AfriTeVa V2 Large is a multilingual T5 [Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/released_checkpoints.md#t511) model pretrained on [Wura](https://huggingface.co/datasets/castorini/wura) with a vocabulary size of 150,000. The model has
+AfriTeVa V2 Large is a multilingual T5 [Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/released_checkpoints.md#t511) model pretrained on [Wura](https://huggingface.co/datasets/castorini/wura) with a vocabulary size of 150,000. The model has 1B parameters.
 
 Paper: [Better Quality Pretraining Data & T5 Models for African Languages](https://openreview.net/forum?id=ybc9V6Cbq2)
 
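For context, a minimal usage sketch of the model described in the updated README, assuming the checkpoint is published on the Hugging Face Hub under an identifier such as `castorini/afriteva_v2_large` (the exact repository name is not stated in this diff) and that the standard `transformers` T5 classes apply:

```python
# Hypothetical usage sketch for AfriTeVa V2 Large with Hugging Face transformers.
# The repository id below is an assumption; substitute the actual checkpoint name.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "castorini/afriteva_v2_large"  # assumed identifier, not confirmed by this diff
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5 v1.1 checkpoints are pretrained with a span-corruption objective only,
# so the raw checkpoint is normally fine-tuned on a downstream task before use.
inputs = tokenizer("Translate to Yoruba: How are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the 150,000-entry vocabulary enlarges the embedding and output projection matrices relative to a standard ~32,000-entry T5 vocabulary, which is consistent with a Large-size T5 v1.1 model reaching roughly 1B parameters.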