Update README.md
README.md CHANGED
@@ -50,10 +50,12 @@ The model was initialized with the weights of XLM-RoBERTa(base) and trained usin
 If you find mLUKE useful for your work, please cite the following paper:
 
 ```latex
-@inproceedings{
-
-
-
-
-
+@inproceedings{ri-etal-2022-mluke,
+    title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
+    author = "Ri, Ryokan  and
+      Yamada, Ikuya  and
+      Tsuruoka, Yoshimasa",
+    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
+    year = "2022",
+    url = "https://aclanthology.org/2022.acl-long.505",
 ```