---

# WikiNEuRal: Combined Neural and Knowledge-based Silver Data Creation for Multilingual NER

This is the model card for the EMNLP 2021 paper [WikiNEuRal: Combined Neural and Knowledge-based Silver Data Creation for Multilingual NER](https://aclanthology.org/2021.findings-emnlp.215/). In a nutshell, WikiNEuRal is a novel technique that builds on a multilingual lexical knowledge base (BabelNet) and transformer-based architectures (BERT) to produce high-quality annotations for multilingual NER. We then fine-tuned a multilingual language model (mBERT) for 3 epochs on the resulting WikiNEuRal dataset. The system supports the 9 languages covered by WikiNEuRal (de, en, es, fr, it, nl, pl, pt, ru) and was trained on all 9 languages jointly.
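Since this is a token-classification model, it can be loaded through the standard Hugging Face `transformers` pipeline API. A minimal sketch is shown below; the model ID is assumed from the project name and should be adjusted if the repository is hosted under a different name.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Assumed model ID based on the project name; adjust if needed.
model_id = "Babelscape/wikineural-multilingual-ner"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges subword tokens into whole entities.
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

results = ner("My name is Wolfgang and I live in Berlin.")
for entity in results:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Because the model was trained on all 9 languages jointly, the same pipeline can be applied to input text in any of the supported languages without switching checkpoints.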
**If you use the model, please reference this work in your paper**:

```bibtex
@inproceedings{tedeschi-etal-2021-wikineural-combined,