This model is not an MoE; it is in fact a 22B parameter dense model!
Just one day after the release of **Mixtral-8x-22b**, we are excited to introduce our handcrafted experimental model, **Mistral-22b-V.01**. This model is the culmination of knowledge distilled equally from all experts into a single, dense 22B model. It is not a single trained expert; rather, it is a compressed MoE model, turned into a dense 22B model. This is the first working MoE-to-dense model conversion.
## How to use
**GUANACO PROMPT FORMAT**: You must use the Guanaco prompt format shown below. Not using this prompt format will lead to suboptimal results.
- This model requires a specific chat template. Since the training format was Guanaco, it looks like this:
- "### System: You are a helpful assistant. ### Human###: Give me the best chili recipe you can ###Assistant: Here is the best chili recipe..."
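As a minimal sketch, the template above can be assembled with plain string formatting before the prompt is passed to your inference stack. The helper name below is hypothetical; the marker strings are taken verbatim from the format shown above, with the model's reply expected after the trailing `###Assistant:` marker:

```python
def build_guanaco_prompt(user_message: str,
                         system: str = "You are a helpful assistant.") -> str:
    """Assemble a single-turn prompt in the Guanaco format used for training.

    The response generated by the model is expected to follow the
    trailing '###Assistant:' marker.
    """
    return (f"### System: {system} "
            f"### Human###: {user_message} "
            f"###Assistant:")

prompt = build_guanaco_prompt("Give me the best chili recipe you can")
print(prompt)
```

The resulting string can then be fed to whatever generation API you use (e.g. a `transformers` text-generation pipeline), with the model completing the text after `###Assistant:`.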