kimsan0622 committed
Commit 59370a4
1 Parent(s): 0c1d552

Update README.md

Files changed (1)
  1. README.md +22 -1
README.md CHANGED
@@ -34,12 +34,13 @@ T5-Base is the checkpoint with 220 million parameters.
 - **Parent Model:** T5
 - **Resources for more information:**
   - [GitHub Repo](https://github.com/google-research/text-to-text-transfer-transformer#released-model-checkpoints)
+  - [KE-T5 Github Repo](https://github.com/AIRC-KETI/ke-t5)
+  - [Paper](https://aclanthology.org/2021.findings-emnlp.33/)
   - [Associated Paper](https://jmlr.org/papers/volume21/20-074/20-074.pdf)
   - [Blog Post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
 
 # Uses
 
-
 ## Direct Use
 
 The developers write in a [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) that the model:
@@ -142,6 +143,26 @@ More information needed
 
 
 **BibTeX:**
+
+```bibtex
+@inproceedings{kim-etal-2021-model-cross,
+    title = "A Model of Cross-Lingual Knowledge-Grounded Response Generation for Open-Domain Dialogue Systems",
+    author = "Kim, San  and
+      Jang, Jin Yea  and
+      Jung, Minyoung  and
+      Shin, Saim",
+    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
+    month = nov,
+    year = "2021",
+    address = "Punta Cana, Dominican Republic",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2021.findings-emnlp.33",
+    doi = "10.18653/v1/2021.findings-emnlp.33",
+    pages = "352--365",
+    abstract = "Research on open-domain dialogue systems that allow free topics is challenging in the field of natural language processing (NLP). The performance of the dialogue system has been improved recently by the method utilizing dialogue-related knowledge; however, non-English dialogue systems suffer from reproducing the performance of English dialogue systems because securing knowledge in the same language with the dialogue system is relatively difficult. Through experiments with a Korean dialogue system, this paper proves that the performance of a non-English dialogue system can be improved by utilizing English knowledge, highlighting the system uses cross-lingual knowledge. For the experiments, we 1) constructed a Korean version of the Wizard of Wikipedia dataset, 2) built Korean-English T5 (KE-T5), a language model pre-trained with Korean and English corpus, and 3) developed a knowledge-grounded Korean dialogue model based on KE-T5. We observed the performance improvement in the open-domain Korean dialogue model even only English knowledge was given. The experimental results showed that the knowledge inherent in cross-lingual language models can be helpful for generating responses in open dialogue systems.",
+}
+```
+
 ```bibtex
 @article{2020t5,
   author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
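
The diff above adds KE-T5 (a T5 variant pre-trained on Korean and English corpora, per the cited paper) to the model card's resources. For orientation, here is a minimal loading sketch using the `transformers` library; the checkpoint id `KETI-AIR/ke-t5-base` is an assumption, so check the KE-T5 GitHub repo linked in the diff for the actual released checkpoints.

```python
# Minimal sketch (not part of the model card): loading a KE-T5 checkpoint.
# The id "KETI-AIR/ke-t5-base" is an assumption; see the KE-T5 GitHub repo
# linked above for the released checkpoint names.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "KETI-AIR/ke-t5-base"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5-style models are text-to-text: the input is a plain string prompt
# and the output is generated text.
inputs = tokenizer("summarize: ...", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this is a pre-trained checkpoint, so like vanilla T5 it generally needs task-specific fine-tuning before such prompts produce useful output.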