---
license: apache-2.0
language:
- ja
---

# Leia-Swallow-7B

LEIA is a training technique for autoregressive LLMs that improves their performance in languages other than English by enhancing cross-lingual knowledge transfer from English to a target language.
This model was constructed by applying LEIA to Swallow, a Japanese-English bilingual LLM based on LLaMA 2.
It achieves improved performance on six Japanese question answering benchmarks, as reported below.

Please refer to our paper or blog post (in Japanese) for further technical details:

- [LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation](https://arxiv.org/abs/2402.11485) (arxiv.org)
- [LEIA: A New Method for Making LLMs Smarter with Cross-Lingual Transfer Learning](#) (zenn.dev)

## Model List

- [Leia-Swallow-7b](https://huggingface.co/leia-llm/Leia-Swallow-7b/)
- [Leia-Swallow-13b](https://huggingface.co/leia-llm/Leia-Swallow-13b/)

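The model can be loaded with the Hugging Face `transformers` library. The snippet below is a minimal sketch: the Japanese prompt and greedy decoding settings are illustrative choices, not a template prescribed by the authors.

```python
# Minimal usage sketch for Leia-Swallow-7b via the standard
# transformers causal-LM API. Loading downloads ~14 GB of weights,
# so generation is kept behind the __main__ guard.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "leia-llm/Leia-Swallow-7b"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a greedy continuation of `prompt` with Leia-Swallow-7b."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Illustrative prompt: "The highest mountain in Japan is"
    print(generate("日本で一番高い山は"))
```
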
## Empirical Results

The model is assessed using the following six question answering benchmarks:

- X-CODAH
- X-CSQA
- JCommonsenseQA
- NIILC
- JEMHopQA
- JAQKET v2

| Model | X-CODAH | X-CSQA | JCommonsenseQA | NIILC | JEMHopQA | JAQKET v2 |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| Swallow | 42.0 | 41.0 | 80.3 | 59.5 | 50.8 | 86.2 |
| LEIA | **42.7** | **42.4** | **80.6** | **60.3** | **54.7** | **86.5** |

For further details of this experiment, please refer to [our paper](https://arxiv.org/abs/2402.11485).

## Contributors

- Ikuya Yamada (Studio Ousia, RIKEN)
- Ryokan Ri (LY Corporation, SB Intuitions)