sberbank-ai committed
Commit 86bef56
1 Parent(s): 9cde8a5

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -11,7 +11,7 @@ It has 24 layers and 1536 hidden size.
 
 Model was trained on a mixture of 7 denoisers like UL2 with several differences .
 
-It trained on Russian language corpus (300GB). The dataset is the same as for ruT5 models.
+It trained on Russian language corpus (300GB). Dataset is the same as for ruT5 models.
 
 Bbpe tokenizer. First half of the time model was trained on the small part of all datasets (1%).
 
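
For context, the README being edited describes a 24-layer, 1536-hidden T5-style encoder-decoder trained on a mixture of 7 UL2-like denoisers with a BBPE tokenizer. Below is a minimal sketch of loading such a checkpoint with Hugging Face Transformers; the repository id and the denoiser prefix token are placeholders/assumptions, not taken from this commit, so check the model card for the actual values.

```python
# Minimal sketch: loading a T5-style seq2seq checkpoint (BBPE tokenizer) with Transformers.
# The model id below is a placeholder; replace it with the actual repository id.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "your-org/ruT5-UL2-large"  # placeholder, not from this commit

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# UL2-style models are often prompted with a denoiser prefix token; "<LM>" here is
# an assumption -- consult the model card for the special tokens it actually uses.
text = "<LM> Съешь ещё этих мягких французских булок"
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```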