mlx-community / Nous-Hermes-2-Mixtral-8x7B-DPO-4bit
Tags: MLX · English · mixtral · Mixtral · instruct · finetune · chatml · DPO · RLHF · gpt4 · synthetic data · distillation
License: apache-2.0
main / Nous-Hermes-2-Mixtral-8x7B-DPO-4bit / added_tokens.json (51 Bytes)
thomadev0: Upload folder using huggingface_hub · commit c77f0fc (verified) · about 1 year ago
{
  "<|im_end|>": 32000,
  "<|im_start|>": 32001
}
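
This file registers the two ChatML control tokens and their IDs for the model's tokenizer. A minimal Python sketch (assuming the transformers library is installed and the Hub is reachable; the repo id is taken from this page and the expected IDs come from the file above) to confirm the mapping:

from transformers import AutoTokenizer

# Load the tokenizer for this repo and check that the ChatML tokens
# declared in added_tokens.json resolve to the IDs listed above.
repo_id = "mlx-community/Nous-Hermes-2-Mixtral-8x7B-DPO-4bit"
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print(tokenizer.convert_tokens_to_ids("<|im_end|>"))    # expected: 32000
print(tokenizer.convert_tokens_to_ids("<|im_start|>"))  # expected: 32001

# Example of how these tokens frame a ChatML-style prompt:
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nHello!<|im_end|>\n"
    "<|im_start|>assistant\n"
)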