tay-yozhik committed 0e13f1c (parent: 17d08ad): Update README.md
---
datasets:
- tay-yozhik/NaturalText
language:
- ru
---

# NaturalRoBERTa

This is a pre-trained model of the [RoBERTa](https://arxiv.org/abs/1907.11692) type.

NaturalRoBERTa is built on a dataset obtained from open sources: three news sub-corpora of [Taiga](https://github.com/TatianaShavrina/taiga_site) (Lenta.ru, Interfax, N+1) and [Russian Wikipedia texts](https://ru.wikipedia.org/).

# Evaluation

This model was evaluated on the [RussianSuperGLUE](https://russiansuperglue.com/) benchmark:

| Task    | Result        | Metrics                          |
|---------|---------------|----------------------------------|
| LiDiRus | 0.0           | Matthews Correlation Coefficient |
| RCB     | 0.217 / 0.484 | F1 / Accuracy                    |
| PARus   | 0.498         | Accuracy                         |
| TERRa   | 0.487         | Accuracy                         |
| RUSSE   | 0.587         | Accuracy                         |
| RWSD    | 0.669         | Accuracy                         |
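LiDiRus is scored with the Matthews Correlation Coefficient rather than plain accuracy, since it is a diagnostic set with imbalanced labels. As a rough illustration of what the reported 0.0 means (chance-level correlation between predictions and gold labels), here is a minimal sketch of MCC for binary labels; the function name and signature are illustrative, not part of this model card:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """MCC for binary 0/1 labels; 1.0 = perfect, 0.0 = chance, -1.0 = inverse."""
    # Confusion-matrix counts
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Degenerate confusion matrices (a zero factor) are conventionally scored 0.0
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

For example, predictions uncorrelated with the labels (as in the LiDiRus row above) score 0.0, while perfect agreement scores 1.0.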