---
license: apache-2.0
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64b63f8ad57e02621dc93c8b/3ZEZkVjboJRi2Z2ymiQkO.png)
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64b63f8ad57e02621dc93c8b/K8c138jtaTA4qJeGRm0dO.png)
# Base checkpoint
augmxnt/shisa-7b-v1
* Mistral-7B base
* Pre-trained on 8B tokens of MADLAD-Ja
* Finetuned on Japanese instructions
* Highest-scoring 7B model on the JA MT-Bench conversation benchmark
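
For reference, a minimal sketch of loading the base checkpoint with Hugging Face Transformers (standard `AutoModelForCausalLM` usage; the dtype, device mapping, and prompt are illustrative assumptions, not values from this card):

```python
# Minimal sketch: load the base checkpoint with Hugging Face Transformers.
# Generation settings below are illustrative assumptions, not from this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "augmxnt/shisa-7b-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

inputs = tokenizer("日本の首都はどこですか？", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```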
# Training datasets (total ~7B tokens)
* Aozora Bunko
* Japanese Law Precedent Dataset
* Japanese Wikipedia
* .lg.jp, .go.jp, .ac.jp domain webscrapes from CulturaX (documents sharing the same first 25 characters were de-duplicated; see the sketch after this list)
* English Ultrachat200K-gen (so the model does not forget the English and chat abilities learned in the base checkpoint)
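
Below is a minimal sketch of the prefix-based de-duplication rule as described above (the `text` field name, document structure, and helper name are illustrative assumptions; the card does not specify the actual implementation):

```python
from typing import Dict, Iterable, List


def dedup_by_prefix(documents: Iterable[Dict[str, str]], prefix_len: int = 25) -> List[Dict[str, str]]:
    """Keep only the first document seen for each distinct text prefix."""
    seen = set()
    kept = []
    for doc in documents:
        key = doc["text"][:prefix_len]
        if key not in seen:
            seen.add(key)
            kept.append(doc)
    return kept


# Example: the second scrape shares its first 25 characters with the first
# and is dropped, leaving 2 documents.
docs = [
    {"text": "Article 1 of the Basic Environment Law establishes ..."},
    {"text": "Article 1 of the Basic Environment Law establishes ... (mirror copy)"},
    {"text": "A completely unrelated precedent document."},
]
print(len(dedup_by_prefix(docs)))  # 2
```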