yunconglong/Mixtral_7Bx2_MoE_13B_DPO
Tags: Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · conversational · text-generation-inference · Inference Endpoints · License: cc-by-nc-4.0
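
Given the Text Generation and Transformers tags above, a minimal loading sketch using the standard transformers API (the prompt, dtype, and generation settings below are illustrative assumptions, not values published by the model author):

```python
# Minimal text-generation sketch for this repository; settings are
# illustrative assumptions, not the author's recommended configuration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yunconglong/Mixtral_7Bx2_MoE_13B_DPO"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the checkpoint's native dtype
    device_map="auto",   # requires `accelerate`; places weights on available devices
)

prompt = "Hello, my name is"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```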
Update README.md (#1)
by cloudyu · opened Jan 27
base: refs/heads/main ← from: refs/pr/1
Files changed (1): README.md (+1, −1)

README.md CHANGED
@@ -4,7 +4,7 @@
  - moe
 ---
 
-# Mixtral MOE 2x7B
+# Fine Tuned Mixtral MOE 2x7B
 
 
 