Text Generation
Transformers
PyTorch
Safetensors
Japanese
English
qwen
custom_code
keisawada committed on
Commit d0b7a9c
1 Parent(s): bfbd7ee

Update README.md

Files changed (1)
  1. README.md +16 -4
README.md CHANGED
@@ -10,6 +10,9 @@ language:
 tags:
 - qwen
 inference: false
+license: other
+license_name: tongyi-qianwen-license-agreement
+license_link: https://github.com/QwenLM/Qwen/blob/main/Tongyi%20Qianwen%20LICENSE%20AGREEMENT
 ---
 
 # `rinna/nekomata-14b-instruction`
@@ -42,7 +45,7 @@ The model is the instruction-tuned version of [`rinna/nekomata-14b`](https://hug
 * yasashi-japanese
 * The [remaining sections](https://github.com/masanorihirano/llm-japanese-dataset/tree/main/datasets-cc-by-sa) contain commonly used evaluation corpora so they are skipped to prevent data leak.
 
-* **Authors**
+* **Contributors**
 
   - [Tianyu Zhao](https://huggingface.co/tianyuz)
   - [Kei Sawada](https://huggingface.co/keisawada)
@@ -125,10 +128,19 @@ Please refer to [`rinna/nekomata-14b`](https://huggingface.co/rinna/nekomata-14b
 
 # How to cite
 ~~~
-@misc{RinnaNekomataInstruction14b,
-    url={https://huggingface.co/rinna/nekomata-14b-instruction},
-    title={rinna/nekomata-14b-instruction},
+@misc{rinna-nekomata-14b-instruction,
+    title = {rinna/nekomata-14b-instruction},
     author={Zhao, Tianyu and Sawada, Kei}
+    url = {https://huggingface.co/rinna/nekomata-14b-instruction},
+}
+
+@inproceedings{sawada2024release,
+    title = {Release of Pre-Trained Models for the {J}apanese Language},
+    author = {Sawada, Kei and Zhao, Tianyu and Shing, Makoto and Mitsui, Kentaro and Kaga, Akio and Hono, Yukiya and Wakatsuki, Toshiaki and Mitsuda, Koh},
+    booktitle = {Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)},
+    month = {5},
+    year = {2024},
+    url = {https://arxiv.org/abs/2404.01657},
 }
 ~~~
 ---
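
For reference, a minimal sketch of how the card's YAML front matter reads once this commit lands. Only the fields visible in the first hunk are shown; the `language` values are inferred from the page tags (Japanese, English) and any other front-matter fields the card defines are omitted here.

~~~yaml
language:
- ja        # assumption: inferred from the "Japanese" page tag
- en        # assumption: inferred from the "English" page tag
tags:
- qwen
inference: false
license: other
license_name: tongyi-qianwen-license-agreement
license_link: https://github.com/QwenLM/Qwen/blob/main/Tongyi%20Qianwen%20LICENSE%20AGREEMENT
~~~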