mixtao/MixTAO-7Bx2-MoE-v8.1
zhengr committed on Feb 26
Commit da82baf • Parent: a8d2629

Create README.md
Files changed (1): README.md (+9, -0)
README.md ADDED
@@ -0,0 +1,9 @@
+---
+license: apache-2.0
+tags:
+- moe
+---
+
+# MixTAO-7Bx2-MoE
+
+MixTAO-7Bx2-MoE is a Mixture of Experts (MoE) model.
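
The README added by this commit does not yet include usage instructions. As a minimal sketch (not from the model card), the model could be loaded for text generation with the Hugging Face transformers library; the repo id mixtao/MixTAO-7Bx2-MoE-v8.1 comes from the page header above, while the prompt and generation settings below are placeholder assumptions.

```python
# Minimal sketch (not from the model card): loading MixTAO-7Bx2-MoE-v8.1
# for text generation with Hugging Face transformers.
# Assumes torch is installed; device_map="auto" also needs accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mixtao/MixTAO-7Bx2-MoE-v8.1"  # repo id from the page header

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Placeholder prompt; an MoE model like this routes each token through a
# subset of expert MLPs internally, but the generate() API is the same
# as for any causal language model.
prompt = "What is a mixture-of-experts language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```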