Commit cc009e2 by David (parent: ff2fccb): Update README.md

README.md CHANGED
@@ -51,17 +51,16 @@ We provide models fine-tuned on the [XNLI dataset](https://huggingface.co/datase
 
 ## Metrics
 
-We fine-tune our models on
+We fine-tune our models on 3 different down-stream tasks:
 
 - [XNLI](https://huggingface.co/datasets/xnli)
 - [PAWS-X](https://huggingface.co/datasets/paws-x)
-- [CoNLL2002 - POS](https://huggingface.co/datasets/conll2002)
 - [CoNLL2002 - NER](https://huggingface.co/datasets/conll2002)
 
 For each task, we conduct 5 trials and state the mean and standard deviation of the metrics in the table below.
 To compare our results to other Spanish language models, we provide the same metrics taken from the [evaluation table](https://github.com/PlanTL-SANIDAD/lm-spanish#evaluation-) of the [Spanish Language Model](https://github.com/PlanTL-SANIDAD/lm-spanish) repo.
 
-| Model |
+| Model | CoNLL2002 - NER (f1) | PAWS-X (acc) | XNLI (acc) | Params |
 | --- | --- | --- | --- | --- |
 | SELECTRA small | 0.865 +- 0.004 | 0.896 +- 0.002 | 0.784 +- 0.002 | 22M |
 | SELECTRA medium | 0.873 +- 0.003 | 0.896 +- 0.002 | 0.804 +- 0.002 | 41M |
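The "mean +- standard deviation" figures reported in the table aggregate the 5 fine-tuning trials mentioned in the README. A minimal sketch of that aggregation, assuming hypothetical per-trial scores (the actual per-trial values are not published in the diff):

```python
import statistics

# Hypothetical per-trial XNLI accuracies for SELECTRA small (illustrative only;
# the README reports the aggregate 0.784 +- 0.002 over 5 trials).
trials = [0.782, 0.784, 0.785, 0.783, 0.786]

mean = statistics.mean(trials)
std = statistics.stdev(trials)  # sample standard deviation across trials

print(f"{mean:.3f} +- {std:.3f}")  # prints "0.784 +- 0.002"
```

Reporting the sample standard deviation alongside the mean is what makes the small SELECTRA/medium-SELECTRA comparison in the table meaningful: a 0.02 gap in XNLI accuracy is large relative to a 0.002 trial-to-trial spread.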