zhengr committed
Commit 1456aae
1 Parent(s): a8fdc8f

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -109,7 +109,7 @@ model-index:
 
 # MixTAO-7Bx2-MoE
 
-MixTAO-7Bx2-MoE is a Mixure of Experts (MoE).
+MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
 This model is mainly used for large model technology experiments, and increasingly perfect iterations will eventually create high-level large language models.
 
 ### 🦒 Colab