yentinglin committed • Commit 4ecef00 • Parent(s): 4d31b4c
Update app.py

app.py CHANGED
@@ -6,6 +6,16 @@ from transformers import AutoTokenizer
 DESCRIPTION = """
 # Language Models for Taiwanese Culture
 
+<p align="center">
+✍️ <a href="https://huggingface.co/spaces/yentinglin/Taiwan-LLaMa2" target="_blank">Online Demo</a>
+•
+🤗 <a href="https://huggingface.co/yentinglin" target="_blank">HF Repo</a> • 🐦 <a href="https://twitter.com/yentinglin56" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/pdf/2305.13711.pdf" target="_blank">[Paper Coming Soon]</a>
+• 👨️ <a href="https://yentingl.com/" target="_blank">Yen-Ting Lin</a>
+<br/><br/>
+<img src="https://www.csie.ntu.edu.tw/~miulab/taiwan-llama/logo-v2.png" width="100"> <br/>
+</p>
+
+
 Taiwan-LLaMa is a fine-tuned model specifically designed for traditional Chinese applications. It is built upon the LLaMa 2 architecture and includes a pretraining phase with over 5 billion tokens and fine-tuning with over 490k multi-turn conversational data in Traditional Chinese.
 
 ## Key Features