Update README.md

Add COPA-SR results

README.md CHANGED
@@ -22,8 +22,8 @@ Mean F1 scores were used to evaluate performance.
 
 | system                                                                  | dataset | F1 score |
 |:-----------------------------------------------------------------------|:----|------:|
-| [BERTić](https://huggingface.co/classla/bcms-bertic)                    | hr500k | 0.925 |
 | **XLM-R-BERTić**                                                        | hr500k | 0.927 |
+| [BERTić](https://huggingface.co/classla/bcms-bertic)                    | hr500k | 0.925 |
 | XLM-R-SloBERTić                                                         | hr500k | 0.923 |
 | XLM-Roberta-Large                                                       | hr500k | 0.919 |
 | [crosloengual-bert](https://huggingface.co/EMBEDDIA/crosloengual-bert)  | hr500k | 0.918 |

@@ -46,7 +46,18 @@ The procedure is explained in greater detail in the dedicated [benchmarking repo
 | XLM-Roberta-Base | ParlaSent_BCS.jsonl | ParlaSent_BCS_test.jsonl | 0.500 |
 | dummy (mean)     | ParlaSent_BCS.jsonl | ParlaSent_BCS_test.jsonl | -0.12 |
 ## COPA
-
+
+
+| system                                                                  | dataset | Accuracy score |
+|:-----------------------------------------------------------------------|:----|------:|
+| [BERTić](https://huggingface.co/classla/bcms-bertic)                    | Copa-SR | 0.689 |
+| XLM-R-SloBERTić                                                         | Copa-SR | 0.665 |
+| **XLM-R-BERTić**                                                        | Copa-SR | 0.637 |
+| [crosloengual-bert](https://huggingface.co/EMBEDDIA/crosloengual-bert)  | Copa-SR | 0.607 |
+| XLM-Roberta-Base                                                        | Copa-SR | 0.573 |
+| XLM-Roberta-Large                                                       | Copa-SR | 0.570 |
+
+
 
 # Citation
 (to be added soon)
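
As context for the two metrics reported in the tables above, here is a minimal, hypothetical sketch of how a mean (macro) F1 score and a COPA-style accuracy are conventionally computed, assuming scikit-learn and illustrative data; the benchmark's actual procedure is the one documented in the dedicated benchmarking repo and may differ.

```python
# Minimal sketch: illustrative data and scikit-learn conventions only;
# NOT the benchmark's actual evaluation pipeline.
from sklearn.metrics import accuracy_score, f1_score

# hr500k-style token classification: mean (macro) F1 across label classes.
y_true = ["B-PER", "O", "B-LOC", "O", "B-PER"]
y_pred = ["B-PER", "O", "O",     "O", "B-PER"]
macro_f1 = f1_score(y_true, y_pred, average="macro")

# Copa-SR-style two-choice task: accuracy is the fraction of items
# where the model picked the more plausible alternative.
gold   = [0, 1, 1, 0]
chosen = [0, 1, 0, 0]
acc = accuracy_score(gold, chosen)

print(f"macro F1 = {macro_f1:.3f}, accuracy = {acc:.3f}")
```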