Update README.md
README.md CHANGED
@@ -25,51 +25,23 @@ and is comparable with Mistral-7B-v0.1 on MMLU and MT-Bench in English.
 - **Model type:** Causal decoder-only transformer language model
 - **Language:** English and Traditional Chinese (zh-tw)
 
-## Performance
-
-| MediaTek-Research/Breeze-7B- | | | |
-| mistralai/Mistral-7B-v0.1 | | | |
-| yentinglin/Taiwan-LLM- | | | |
-| Qwen/Qwen-7B-Chat | | | |
-| Qwen/Qwen-14B | | | - |
-| Qwen/Qwen-14B-Chat | | | |
-| gpt-3.5-turbo-0613 | | | |
-
-| **[English Benchmarks]** | MMLU 5-shot (ACC) | MT-Bench (Score) |
-|-----------------------------------------------------|---------------------|------------------|
-| MediaTek-Research/Breeze-7B-Base-v0.1 | | - |
-| MediaTek-Research/Breeze-7B-Instruct-v0.1 | | |
-| mistralai/Mistral-7B-v0.1 | | - |
-| mistralai/Mistral-7B-Instruct-v0.1 | | |
-| yentinglin/Taiwan-LLM-7B-v2.1-base | | - |
-| yentinglin/Taiwan-LLM-7B-v2.1-chat | | |
-| yentinglin/Taiwan-LLM-13B-v2.0-base | | - |
-| yentinglin/Taiwan-LLM-13B-v2.0-chat | | - |
-| 01-ai/Yi-6B-Base | | - |
-| 01-ai/Yi-6B-Chat | | |
-| 01-ai/Yi-34B-Base | | - |
-| 01-ai/Yi-34B-Chat | | |
-| Qwen/Qwen-7B | | - |
-| Qwen/Qwen-7B-Chat | | |
-| Qwen/Qwen-14B | | - |
-| Qwen/Qwen-14B-Chat | | |
-| gpt-3.5-turbo-0613 | | |
-
-| **[Inference Metrics on Traditional Chinese]** | Speed (char/sec) | Compression Ratio | Max Character Size |
 |--------------------------------------------------------------------|-------------------|-------------------|--------------------|
 | MediaTek-Research/Breeze-7B-Base-v0.1 | | | |
 | mistralai/Mistral-7B-v0.1 | | | |
@@ -80,6 +52,22 @@ and is comparable with Mistral-7B-v0.1 on MMLU and MT-Bench in English.
 | Qwen/Qwen-7B | | | |
 | Qwen/Qwen-14B | | | |
 
 ## Use in Transformers
 
 - **Model type:** Causal decoder-only transformer language model
 - **Language:** English and Traditional Chinese (zh-tw)
 
+## Base Model Performance
+
+| Models | TMMLU+ (ACC) | DRCD (EM) | MMLU (ACC) |
+|-----------------------------------------------------|--------------|-----------|------------|
+| | 5 shot | 3 shot | 5 shot |
+| MediaTek-Research/Breeze-7B-Base-v0.1 | | | |
+| mistralai/Mistral-7B-v0.1 | | | |
+| yentinglin/Taiwan-LLM-7B-v2.1-base | | | |
+| yentinglin/Taiwan-LLM-13B-v2.0-base | | | |
+| 01-ai/Yi-6B | | | |
+| 01-ai/Yi-34B | | | |
+| Qwen/Qwen-7B | | | |
+| Qwen/Qwen-14B | | | |
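The DRCD column above reports exact match (EM) on an extractive QA set. As a rough illustration of the metric only (not the official DRCD evaluation script, which also applies answer normalization), assuming simple whitespace trimming:

```python
def exact_match(prediction: str, reference: str) -> bool:
    """True when the trimmed prediction equals the trimmed reference.
    Official scorers typically normalize further (punctuation, width)."""
    return prediction.strip() == reference.strip()

def em_score(predictions: list[str], references: list[str]) -> float:
    """Exact-match accuracy in percent over paired predictions/references."""
    hits = sum(exact_match(p, r) for p, r in zip(predictions, references))
    return 100.0 * hits / len(references)

# Two of three hypothetical answers match exactly -> 66.66... percent
score = em_score(["台北", " 高雄", "1987"], ["台北", "高雄", "1989"])
```

ACC columns such as TMMLU+ and MMLU are the same computation with multiple-choice letters in place of answer spans.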
+
+## Inference Performance
+
+| Models | Speed (char/sec) | Compression Ratio | Max Character Size |
 |--------------------------------------------------------------------|-------------------|-------------------|--------------------|
 | MediaTek-Research/Breeze-7B-Base-v0.1 | | | |
 | mistralai/Mistral-7B-v0.1 | | | |
 | Qwen/Qwen-7B | | | |
 | Qwen/Qwen-14B | | | |
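A plausible reading of the inference columns is Speed as generated characters per second and Compression Ratio as characters represented per tokenizer token; a minimal sketch under that assumption (the exact definitions used for the table are not stated in this excerpt):

```python
def inference_metrics(text: str, token_count: int, seconds: float) -> tuple[float, float]:
    """Speed as characters generated per second, and compression ratio as
    characters per token -- assumed definitions; check how the table was
    actually measured before comparing numbers against it."""
    speed = len(text) / seconds
    compression_ratio = len(text) / token_count
    return speed, compression_ratio

# E.g. 120 characters of Traditional Chinese decoded from 60 tokens in 2 s:
speed, ratio = inference_metrics("一段繁體中文輸出" * 15, token_count=60, seconds=2.0)
# speed -> 60.0 char/sec, ratio -> 2.0 char/token
```

A higher compression ratio means the tokenizer spends fewer tokens per Traditional Chinese character, which directly raises char/sec throughput and the effective Max Character Size for a fixed context window.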
 
+## Chat Model Performance
+
+| Models | TMMLU+ (ACC) | DRCD (EM) | MT-Bench-tw (Score) | MMLU (ACC) | MT-Bench (Score) |
+|-----------------------------------------------------|--------------|-----------|---------------------|------------|------------------|
+| | 5 shot | 3 shot | 0 shot | 5 shot | 0 shot |
+| MediaTek-Research/Breeze-7B-Instruct-v0.1 | | | | | |
+| mistralai/Mistral-7B-Instruct-v0.1 | | | | | |
+| yentinglin/Taiwan-LLM-7B-v2.1-chat | | | | | |
+| yentinglin/Taiwan-LLM-13B-v2.0-chat | | | | | |
+| 01-ai/Yi-6B-Chat | | | | | |
+| 01-ai/Yi-34B-Chat | | | | | |
+| Qwen/Qwen-7B-Chat | | | | | |
+| Qwen/Qwen-14B-Chat | | | | | |
+| gpt-3.5-turbo-0613 | | 76.30 | | | |
+
 
 ## Use in Transformers
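The "Use in Transformers" section is empty in this excerpt. A minimal loading sketch in the usual Hugging Face style, assuming the Instruct checkpoint name from the tables above and a Mistral-style `[INST]` prompt template — both assumptions should be verified against the model card:

```python
MODEL_ID = "MediaTek-Research/Breeze-7B-Instruct-v0.1"

def build_prompt(user_message: str) -> str:
    """Assumed Mistral-style instruction wrapper; verify the exact
    template on the model card before relying on it."""
    return f"[INST] {user_message} [/INST]"

def generate(user_message: str, max_new_tokens: int = 128) -> str:
    # Imported lazily: loading a 7B checkpoint is expensive, so the
    # prompt helper above stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Calling `generate("你好")` would then download the checkpoint and respond in Traditional Chinese or English, per the Language note above.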