---
license: cc-by-nc-4.0
pipeline_tag: text-generation
---
# Pirouette-7b
This is a gradient SLERP merge of Eric Hartford's dolphin-2.1-mistral-7b and Undi and IkariDev's Noromaid-v0.1.1-13b.
The goal of this merge is to retain most of the brain of Dolphin, with a little added flair from Noromaid.
The prompt format is Alpaca. You can use the standard format as shown below, but for best results, I strongly recommend customizing the system prompt to your specific needs.
```
You are Dolphin, a helpful AI assistant.

### Instruction:
{YOUR MESSAGE HERE}

### Response:
{BOT MESSAGE HERE}
```
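Below is a minimal sketch of assembling this template and generating with the Hugging Face `transformers` library. The `"Pirouette-7b"` model id is a placeholder for the actual repo id or a local path, and the instruction text and sampling settings are purely illustrative.

```python
# Minimal sketch: build the Alpaca-style prompt shown above and generate a reply.
# Assumptions: "Pirouette-7b" is a placeholder model id; sampling settings are arbitrary.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Pirouette-7b"  # placeholder: substitute the real repo id or a local path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

system_prompt = "You are Dolphin, a helpful AI assistant."  # customize to your needs
instruction = "Summarize what a SLERP merge does in one sentence."

prompt = (
    f"{system_prompt}\n\n"
    f"### Instruction:\n{instruction}\n\n"
    f"### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(reply)
```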
## Misc. information

- BOS token is `<s>`
- EOS token is `</s>`
- Native context length is `8192`
- Functional context length extended to `32768` via RoPE with decreased perplexity, see here (a hedged loading sketch follows this list)
- Base model is Mistral v0.1
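The sketch below shows one possible way to load the model at the extended context with `transformers`. The linear scaling type and the factor of 4.0 (8192 × 4 = 32768) are my assumptions rather than settings documented in this card, and whether `rope_scaling` is honored for Mistral-architecture configs depends on your `transformers` version.

```python
# Hedged sketch: load with a RoPE scaling override to target ~32k context.
# Assumptions: linear scaling with factor 4.0 (8192 -> 32768) is illustrative only,
# and your transformers version must support `rope_scaling` for Mistral-style configs.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "Pirouette-7b"  # placeholder: substitute the real repo id or a local path
config = AutoConfig.from_pretrained(model_id)
config.rope_scaling = {"type": "linear", "factor": 4.0}  # assumed scaling scheme
config.max_position_embeddings = 32768

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, config=config, device_map="auto")
```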
## Thanks
- Thanks to Eric Hartford for Dolphin
- Thanks to Undi and IkariDev for Noromaid