---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
license: apache-2.0
language:
- zh
widget:
- text: >-
    A chat between a curious user and an artificial intelligence assistant.
    The assistant gives helpful, detailed, and polite answers to the user's
    questions. USER: 你好,請問你可以幫我寫一封推薦信嗎? ASSISTANT:
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Acknowledge license to accept the repository.
extra_gated_prompt: Please contact the author for access.
extra_gated_button_content: Acknowledge license 同意以上內容
extra_gated_fields:
  Name: text
  Mail: text
  Organization: text
  Country: text
  Any utilization of the Taiwan LLM repository mandates the explicit acknowledgment and attribution to the original author: checkbox
  使用Taiwan LLM必須明確地承認和歸功於優必達株式會社 Ubitus 以及原始作者: checkbox
---

## Taiwan-LLM-13B-v2.0-chat with ExLlamaV2 Quantization

Original model 原始模型: https://huggingface.co/yentinglin/Taiwan-LLM-13B-v2.0-chat

This is a quantized version of [yentinglin/Taiwan-LLM-13B-v2.0-chat](https://huggingface.co/yentinglin/Taiwan-LLM-13B-v2.0-chat) in exl2 (ExLlamaV2) format.
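
For local inference, the exl2 weights can be loaded with the [exllamav2](https://github.com/turboderp/exllamav2) library. The sketch below follows the library's basic example script and reuses the prompt template shown in the widget metadata above; the model path and sampling values are placeholders, and the exact API may differ between exllamav2 versions.

```python
# Minimal sketch based on exllamav2's example scripts (not the official loader for this repo).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder path to a local copy of the exl2 weights.
model_dir = "/path/to/Taiwan-LLM-13B-v2.0-chat-exl2"

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPU memory

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7   # example sampling values, not tuned for this model
settings.top_p = 0.9

# Prompt format from the widget above (Vicuna-style USER/ASSISTANT turns).
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions. "
    "USER: 你好,請問你可以幫我寫一封推薦信嗎? ASSISTANT:"
)

output = generator.generate_simple(prompt, settings, 256)
print(output)
```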

## Citation

If you find Taiwan LLM useful in your work, please cite it with:

```bibtex
@misc{lin2023taiwan,
  title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
  author={Yen-Ting Lin and Yun-Nung Chen},
  year={2023},
  eprint={2311.17487},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

## Acknowledgement

Taiwan LLM v2 is developed in collaboration with [Ubitus K.K.](http://ubitus.net), which provides valuable compute resources for the project.