---
license: mit
datasets:
- unicamp-dl/mmarco
- shunk031/jsnli
language:
- ja
---
This model was trained on 800,000 Japanese sentences after reducing oshizo/japanese-e5-mistral-7b_slerp to 8 layers.
See this article (in Japanese) for details: https://note.com/oshizo/n/n9140df790315
See the intfloat/e5-mistral-7b-instruct page for model usage.
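Since usage follows e5-mistral-7b-instruct, embeddings are taken from the last non-padding token of the final hidden states (last-token pooling). Below is a minimal sketch of that pooling step only, assuming PyTorch; it mirrors the pooling function shown on the e5-mistral-7b-instruct card and uses random tensors in place of real model outputs, so the model-loading code is omitted:

```python
import torch
from torch import Tensor


def last_token_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    # If every sequence in the batch ends with a real token, the batch is
    # left-padded and the last position is the embedding for all sequences.
    left_padding = attention_mask[:, -1].sum() == attention_mask.shape[0]
    if left_padding:
        return last_hidden_states[:, -1]
    # Otherwise (right padding), pick each sequence's last non-padding token.
    seq_lens = attention_mask.sum(dim=1) - 1
    batch_idx = torch.arange(last_hidden_states.shape[0])
    return last_hidden_states[batch_idx, seq_lens]


# Dummy check with random hidden states (batch=2, seq_len=4, hidden_dim=8);
# sequence 0 has one padding token at the end, sequence 1 has none.
hidden = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 0],
                     [1, 1, 1, 1]])
embeddings = last_token_pool(hidden, mask)
print(embeddings.shape)  # torch.Size([2, 8])
```

In actual use, `hidden` would be `outputs.last_hidden_state` from the model, and queries would be prefixed with an instruction string as described on the e5-mistral-7b-instruct page.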