zhengr committed
Commit 828e963
1 Parent(s): df35378

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -118,7 +118,7 @@ model-index:
 MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
 This model is mainly used for large model technology experiments, and increasingly perfect iterations will eventually create high-level large language models.
 
-### Prompt Template
+### Prompt Template (Alpaca)
 ```
 ### Instruction:
 <prompt> (without the <>)
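
For context, here is a minimal sketch of how the renamed Alpaca-style prompt template might be used with this model via `transformers`. The Hub repo id `zhengr/MixTAO-7Bx2-MoE` and the closing `### Response:` marker are assumptions, not part of this diff (the hunk shows only the `### Instruction:` portion); check the model card for the exact format.

```python
# Minimal usage sketch for the Alpaca-style prompt template above.
# Assumptions: the repo id "zhengr/MixTAO-7Bx2-MoE" and the trailing
# "### Response:" marker are NOT shown in the diff; verify against the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zhengr/MixTAO-7Bx2-MoE"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Fill in the template: the instruction goes after "### Instruction:",
# written plainly, without the <> placeholder brackets.
prompt = (
    "### Instruction:\n"
    "Explain what a Mixture of Experts (MoE) model is.\n\n"
    "### Response:\n"  # assumed completion marker, per common Alpaca usage
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```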