SR committed: Update README.md (commit f0a4dc0, 1 parent: 729a5f4)

README.md CHANGED
@@ -92,19 +92,19 @@ Finetuning datasets are sourced from [LAION OIG chip2 and infill_dbpedia (Apache
 # Evaluation
 We performed human and machine evaluations on XQuAD zero-shot and one-shot settings:
 ## XQuAD
-| Model |
-|
-| openthaigpt7B |
-| SeaLLM7B
-| Typhoon-7b |
-| WangchanLion7B |
+| Model          | F1 (Zero-shot) | F1 (One-shot) |
+|:--------------:|:--------------:|:-------------:|
+| openthaigpt7B  | 27.3487        | 34.3104       |
+| SeaLLM7B V2    | 16.1104        | 25.7399       |
+| Typhoon-7b     | 34.46          | **54.03**     |
+| WangchanLion7B | **45.8763**    | 49.9145       |
 
 ## iAPP Wiki QA
-| Model |
-|
-| openthaigpt7B |
-| SeaLLM7B
-| WangchanLion7B |
+| Model          | F1 (Zero-shot) | F1 (One-shot) |
+|:--------------:|:--------------:|:-------------:|
+| openthaigpt7B  | 40.0614        | 46.6883       |
+| SeaLLM7B V2    | 23.6425        | 28.9934       |
+| WangchanLion7B | **58.9051**    | **62.9776**   |
 
 # What WangchanLion offers:
 - Transparent pretrained model: The development of SEA-LION is community-driven, with different ASEAN collaborators contributing pretraining datasets. The SEA-LION developers ensure that all datasets are safe and can be utilized without commercial restrictions. This transparency extends to the provision of pretraining code, ensuring anyone can replicate SEA-LION using the provided datasets.
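The tables added in this commit report F1 on extractive QA benchmarks. The commit itself does not specify the scoring script, but F1 in this setting is conventionally SQuAD-style token-overlap F1 between the predicted and gold answer strings. A minimal sketch, assuming simple whitespace tokenization and lowercasing (real Thai evaluation would need a language-appropriate tokenizer and normalization):

```python
from collections import Counter

def qa_f1(prediction: str, reference: str) -> float:
    """SQuAD-style token-overlap F1 between a predicted and a gold answer.

    Assumption: whitespace tokenization; benchmark scripts typically also
    strip punctuation and articles before comparing.
    """
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    # Edge case: if either answer is empty, F1 is 1 only when both are empty.
    if not pred_tokens or not ref_tokens:
        return float(pred_tokens == ref_tokens)
    # Count tokens shared between prediction and reference (with multiplicity).
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

Per-example scores like this are usually averaged over the dataset and multiplied by 100 to give table entries such as 45.8763.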