
Dolphin 2.9.2 Mixtral 8x22b 🐬

Curated and trained by Eric Hartford, Lucas Atkins, Fernando Fernandes, and Cognitive Computations

Discord: https://discord.gg/cognitivecomputations

New in 2.9.2 is SystemChat 2.0 - a dataset designed to teach Dolphin to obey the system prompt, even over a long conversation.


My appreciation for the sponsors of Dolphin 2.9.2:

  • Crusoe Cloud - provided an excellent on-demand 8xH100 node
  • OnDemand - provided inference sponsorship, enabling the creation of SystemChat

This model is based on Dolphin-2.9-Mixtral-8x22b, and is Apache-2.0 licensed.

The base model has a 64k context window, and fine-tuning used a 16k sequence length.

Training took one week on an 8xH100 node provided by Crusoe Cloud.

This model was trained with full-weight fine-tuning (FFT) on 50% of the parameters (targeted with Laser Scanner by Fernando Fernandes, David Golchinfar, Lucas Atkins, and Eric Hartford), using the ChatML prompt template format.

example:

<|im_start|>system
You are Dolphin, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
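The template above can be assembled with a small helper. This is a minimal sketch (the function name and example messages are illustrative, not part of the card); it reproduces the ChatML layout shown above, leaving the assistant turn open for the model to complete:

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Format a system + user turn in ChatML, ending with an open
    assistant header so the model generates the reply."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are Dolphin, a helpful AI assistant.",
    "Summarize the Apache-2.0 license in one sentence.",
)
print(prompt)
```

In practice, the tokenizer shipped with the model bundles this same template, so `tokenizer.apply_chat_template(...)` from the transformers library produces an equivalent string.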

Dolphin 2.9.2 has a variety of instruction-following, conversational, and coding skills. It also has initial agentic abilities and supports function calling.

Dolphin is uncensored. I have filtered the dataset to remove alignment and bias, which makes the model more compliant. You are advised to implement your own alignment layer before exposing the model as a service: it will be highly compliant with any request, even unethical ones. Please read my blog post about uncensored models: https://erichartford.com/uncensored-models. You are responsible for any content you create using this model. Enjoy responsibly.
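An "alignment layer" can be as simple as a request filter placed in front of the model. The sketch below is purely illustrative and not part of Dolphin: the blocked-topic list and refusal message are placeholder assumptions, and a production deployment would use a proper moderation model or policy engine instead of keyword matching.

```python
from typing import Optional

# Placeholder policy for illustration only.
BLOCKED_TOPICS = ("malware", "weapons")

def moderate(user_message: str) -> Optional[str]:
    """Return a refusal string if the request violates the policy,
    or None to pass the request through to the model."""
    lowered = user_message.lower()
    if any(topic in lowered for topic in BLOCKED_TOPICS):
        return "This request is not permitted by the service policy."
    return None

# A serving wrapper would call moderate() before generation and
# return the refusal instead of invoking the model when it is not None.
```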

Dolphin is licensed under Apache 2.0. I grant permission for any use, including commercial, that complies with the Apache-2.0 license. Dolphin was trained on data generated from GPT-4, among other models.

Evals

(Evaluation results chart)
