matejulcar committed commit 500c992 (parent: 7d95fb6)

Added link to the paper

README.md CHANGED
```diff
@@ -17,5 +17,8 @@ The following corpora were used for training the model:
 * Slovenian parliamentary corpus siParl 2.0
 * slWaC
 
+## Evaluation
+The model is described in detail and evaluated in our paper ["*Sequence to sequence pretraining for a less-resourced Slovenian language*"](https://arxiv.org/abs/2207.13988).
+
 ## Changelog
 2022-07-21: updated with v2 of the model; the old one is still accessible at [cjvt/legacy-t5-sl-small](https://huggingface.co/cjvt/legacy-t5-sl-small).
```