mlx-community/Nous-Hermes-2-Mixtral-8x7B-DPO-4bit
Tags: MLX · English · mixtral · Mixtral · instruct · finetune · chatml · DPO · RLHF · gpt4 · synthetic data · distillation
License: apache-2.0
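The MLX and 4-bit tags indicate this quantization is meant for Apple-silicon inference with the mlx-lm package. A minimal sketch, assuming the mlx_lm load/generate API and the repo id shown above; the prompt and max_tokens value are only illustrative:

```python
# Sketch: load the 4-bit MLX quantization and run a short generation.
# Assumes `pip install mlx-lm`; repo id taken from this page.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Nous-Hermes-2-Mixtral-8x7B-DPO-4bit")

# The model is tagged chatml, so format the prompt with the tokenizer's
# chat template before generating.
messages = [{"role": "user", "content": "Explain DPO in one sentence."}]
prompt = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=False
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=128)
print(text)
```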
main / Nous-Hermes-2-Mixtral-8x7B-DPO-4bit / tokenizer_config.json
Commit History
Upload folder using huggingface_hub · c77f0fc (verified) · thomadev0 committed on Jan 17
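The commit note says the folder was uploaded with huggingface_hub; the same library can pull this tokenizer_config.json back down for inspection. A minimal sketch, assuming only that the file exists at the repo's main revision:

```python
# Sketch: fetch and inspect the tokenizer_config.json shown on this page.
# Assumes `pip install huggingface_hub`; repo id and filename from the page.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="mlx-community/Nous-Hermes-2-Mixtral-8x7B-DPO-4bit",
    filename="tokenizer_config.json",
    revision="main",
)

with open(path) as f:
    config = json.load(f)

# Keys such as the chat template and special-token definitions live here.
print(sorted(config.keys()))
```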