---
license: mit
datasets:
  - vicgalle/alpaca-gpt4
  - BelleGroup/train_1M_CN
  - stingning/ultrachat
  - HuggingFaceH4/no_robots
  - Open-Orca/OpenOrca
language:
  - zh
  - en
pipeline_tag: conversational
tags:
  - Mistral
---

# Zephyr-8x7b: Zephyr Models, but Mixtral 8x7B

We present Zephyr-8x7b, a Mixtral 8x7B MoE model trained with SFT only on a dataset of nearly four million conversations.

It demonstrates strong contextual understanding, reasoning, and alignment with human values without preference-alignment techniques such as DPO, and we invite you to join our exploration!
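A minimal usage sketch with 🤗 Transformers. Both the repo id `qq8933/Zephyr-8x7b` and the Mistral-style `[INST]` prompt format below are assumptions, not confirmed by this card — check the tokenizer's built-in chat template before relying on them:

```python
def build_prompt(messages):
    """Format a conversation in the Mistral/Mixtral instruct style.

    NOTE: this prompt format is an assumption; prefer the tokenizer's
    apply_chat_template() if the checkpoint ships a chat template.
    """
    prompt = ""
    for msg in messages:
        if msg["role"] == "user":
            prompt += f"[INST] {msg['content']} [/INST]"
        else:  # assistant turn
            prompt += f" {msg['content']}</s>"
    return prompt


def chat(user_message, model_id="qq8933/Zephyr-8x7b"):
    """Generate one reply. Downloads the full 8x7B checkpoint, so it
    requires transformers + accelerate and substantial GPU memory."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(
        build_prompt([{"role": "user", "content": user_message}]),
        return_tensors="pt",
    ).to(model.device)
    out = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Since the model was trained on both Chinese and English corpora, `chat()` can be called with prompts in either language.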