mapama247 committed
Commit 3961240 · 1 Parent(s): 375961d

Update README.md

Files changed (1)
  1. README.md +7 -7
README.md CHANGED
@@ -113,11 +113,11 @@ This model has been fine-tuned on the downstream tasks of the [Catalan Language
 
 | Dataset   | Task| Total   | Train  | Dev   | Test  |
 |:----------|:----|:--------|:-------|:------|:------|
-| Ancora    | NER | 13,581  | 10,628 | 1,427 | 1,526 |
-| Ancora    | POS | 16,678  | 13,123 | 1,709 | 1,846 |
+| AnCora    | NER | 13,581  | 10,628 | 1,427 | 1,526 |
+| AnCora    | POS | 16,678  | 13,123 | 1,709 | 1,846 |
 | STS-ca    | STS | 3,073   | 2,073  | 500   | 500   |
 | TeCla     | TC  | 137,775 | 110,203| 13,786| 13,786|
-| TE-ca     | TE  | 21,163  | 16,930 | 2,116 | 2,117 |
+| TE-ca     | RTE | 21,163  | 16,930 | 2,116 | 2,117 |
 | CatalanQA | QA  | 21,427  | 17,135 | 2,157 | 2,135 |
 | XQuAD-ca  | QA  | -       | -      | -     | 1,189 |
 
@@ -125,10 +125,10 @@ This model has been fine-tuned on the downstream tasks of the [Catalan Language
 
 This is how it compares to its teacher when fine-tuned on the aforementioned downstream tasks:
 
-| Model \ Task            |NER (F1)|POS (F1)|STS-ca (Comb)|TeCla (Acc.)|TEca (Acc.)|CatalanQA (F1/EM)| XQuAD-ca <sup>1</sup> (F1/EM) |
-| ------------------------|:-------|:-------|:------------|:-----------|:----------|:----------------|:------------------------------|
-| RoBERTa-base-ca-v2      | 89.29  | 98.96  | 79.07       | 74.26      | 83.14     | 89.50/76.63     | 73.64/55.42                   |
-| DistilRoBERTa-base-ca   | 87.88  | 98.83  | 77.26       | 73.20      | 76.00     | 84.07/70.77     | xx.xx/xx.xx                   |
+| Model \ Task            |NER (F1)|POS (F1)|STS-ca (Comb.)|TeCla (Acc.)|TEca (Acc.)|CatalanQA (F1/EM)| XQuAD-ca <sup>1</sup> (F1/EM) |
+| ------------------------|:-------|:-------|:-------------|:-----------|:----------|:----------------|:------------------------------|
+| RoBERTa-base-ca-v2      | 89.29  | 98.96  | 79.07        | 74.26      | 83.14     | 89.50/76.63     | 73.64/55.42                   |
+| DistilRoBERTa-base-ca   | 87.88  | 98.83  | 77.26        | 73.20      | 76.00     | 84.07/70.77     | xx.xx/xx.xx                   |
 
 <sup>1</sup> : Trained on CatalanQA, tested on XQuAD-ca.
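The comparison table above pits the distilled student against its teacher after fine-tuning on each CLUB task. As a quick sanity check, both checkpoints can be loaded and probed directly with the `transformers` fill-mask pipeline. This is a minimal sketch, not part of the diff: the Hub IDs below are assumptions (they do not appear in this commit), so verify them against the actual model cards before use, and note that this does not reproduce the fine-tuned scores.

```python
from transformers import pipeline

# Assumed Hub identifiers for teacher and distilled student (check the model cards).
TEACHER_ID = "projecte-aina/roberta-base-ca-v2"
STUDENT_ID = "distilroberta-base-ca"  # hypothetical ID for DistilRoBERTa-base-ca

for model_id in (TEACHER_ID, STUDENT_ID):
    # RoBERTa-style checkpoints use "<mask>" as the mask token.
    fill_mask = pipeline("fill-mask", model=model_id)
    # Catalan prompt: "Barcelona is the <mask> of Catalonia."
    predictions = fill_mask("Barcelona és la <mask> de Catalunya.")
    print(model_id)
    for pred in predictions[:3]:
        print(f"  {pred['token_str']!r}  p={pred['score']:.3f}")
```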