zhengr committed
Commit 2d8cff9
1 Parent(s): da82baf

Update README.md

Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -6,4 +6,5 @@ tags:
 
 # MixTAO-7Bx2-MoE
 
-MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
+MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
+This model is mainly used for experiments with large-model techniques; progressively refined iterations are intended to eventually produce a high-quality large language model.
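
A minimal usage sketch for the MoE model described in the updated README, using the Hugging Face transformers library; it is not part of this commit, and the repository id `zhengr/MixTAO-7Bx2-MoE` below is an assumption rather than something stated in the diff.

```python
# Minimal sketch: load the MoE model with Hugging Face transformers.
# The repository id is assumed; replace it with the actual repo path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zhengr/MixTAO-7Bx2-MoE"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a Mixture of Experts (MoE) model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```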