umiyuki committed
Commit d1c4012 · verified · 1 Parent(s): 420a58e

Update README.md

Files changed (1)
  1. README.md +18 -3
README.md CHANGED
@@ -4,9 +4,24 @@ library_name: transformers
  tags:
  - mergekit
  - merge
- 
+ license: apache-2.0
+ language:
+ - ja
  ---
- # itr012_score3.93
+ # Umievo-itr012-Gleipnir-7B
+ 
+ このモデルは強力な4つの日本語モデルを進化的アルゴリズムで進化的マージしたものです。Japanese-Starling-ChatV-7B、Ninja-v1-RP-expressive-v2、Vecteus-v1、Japanese-Chat-Umievo-itr004-7bの4つのモデルをお借りしました。
+ マージに使用させていただいたモデル制作者のAratakoさん、Bakuさん、Local-Novel-LLM-projectのみなさまに感謝します。それから問題解決のきっかけをくれたHoly-foxさんに感謝します。
+ 
+ This model is an evolutionary merge of four powerful Japanese models, created with an evolutionary algorithm. The following four models were used: Japanese-Starling-ChatV-7B, Ninja-v1-RP-expressive-v2, Vecteus-v1, and Japanese-Chat-Umievo-itr004-7b.
+ I would like to thank the model makers Aratako, Baku, and the Local-Novel-LLM-project for allowing me to use their models for the merge. I would also like to thank Holy-fox for providing the hint that led to solving the problem.
+ 
+ ElyzaTasks100ベンチマークで平均点が3.91でした。(Llama3-70Bによる自動評価を3回行った平均点)
+ 
+ The average score was 3.91 on the ElyzaTasks100 benchmark (the mean of 3 automatic evaluations judged by Llama3-70B).
+ 
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/630420b4eedc089484c853e8/FxUBzBUKpe_JSHSJufSv5.png)
+ 
  
  This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
  
 
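For reference, the model described in the README body added above should load as a standard Transformers causal LM. A minimal usage sketch, assuming the merged model is published as `umiyuki/Umievo-itr012-Gleipnir-7B` (inferred from the new title and the committer's namespace; not confirmed by this diff):

```python
# Minimal usage sketch. Assumptions: the repo id below exists and the model
# behaves as a standard 7B causal LM; neither is stated in this diff.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "umiyuki/Umievo-itr012-Gleipnir-7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "日本で一番高い山は何ですか？"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```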
@@ -170,4 +185,4 @@ slices:
  model: /home/umiyuki/automerge/evol_merge_storage/input_models/Ninja-v1-RP-expressive-v2_4102792561
  parameters:
  weight: 0.4471258297582539
- ```
+ ```
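The `weight:` values in the merge config above are per-model coefficients, presumably produced by the evolutionary search the README describes. If the configured merge method combines source models as a weighted sum of their parameters (an assumption; the fragment does not show the method), the effect is roughly the following illustration, which is not mergekit's actual implementation:

```python
# Illustration only: a weighted average over model state dicts, approximating
# what per-model `weight:` values control in a linear-style merge.
# Whether weights are normalized depends on the configured merge method.
from typing import Dict, List
import torch

def weighted_merge(state_dicts: List[Dict[str, torch.Tensor]],
                   weights: List[float]) -> Dict[str, torch.Tensor]:
    total = sum(weights)  # normalize so the coefficients sum to 1
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum((w / total) * sd[name].float()
                           for w, sd in zip(weights, state_dicts))
    return merged

# Example: one source model (Ninja-v1-RP-expressive-v2 in the slice above)
# would contribute with weight 0.4471258297582539 before normalization.
```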