Update README.md
README.md CHANGED
@@ -63,6 +63,18 @@ Challenges remain in low-resource languages, where the model tends to have highe
 ---
 
 
+## Citation
+
+```
+@article{ji2024emma500enhancingmassivelymultilingual,
+  title={{EMMA}-500: Enhancing Massively Multilingual Adaptation of Large Language Models},
+  author={Shaoxiong Ji and Zihao Li and Indraneil Paul and Jaakko Paavola and Peiqin Lin and Pinzhen Chen and Dayyán O'Brien and Hengyu Luo and Hinrich Schütze and Jörg Tiedemann and Barry Haddow},
+  year={2024},
+  journal={arXiv preprint 2409.17892},
+  url={https://arxiv.org/abs/2409.17892},
+}
+```
+
 ## Acknowledgements
 
 We extend our thanks to the language communities and contributors who helped source, clean, and validate the diverse data used in the MaLA Corpus. Their efforts are invaluable in supporting linguistic diversity in AI research.