Mixtral 8x7B - Holodeck
Model Description
Mixtral 8x7B-Holodeck is a finetune created using Mistral AI's Mixtral 8x7B model.
Training data
The training data contains around 3000 ebooks in various genres.
Most parts of the dataset have been prepended with the following text: [Genre: <genre1>, <genre2>]
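For context, here is a minimal usage sketch (not part of the original card) showing how the genre tag described above could be supplied at inference time via the Hugging Face transformers library. The repository id below is an assumption for illustration; substitute the model's actual repo id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id for illustration; replace with the actual model repo.
model_id = "KoboldAI/Mixtral-8x7B-Holodeck-v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the full 8x7B model requires substantial GPU memory
    device_map="auto",
)

# Prepend the genre tag in the same format used for the training data.
prompt = "[Genre: horror, fantasy]\nThe old house at the end of the lane"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```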
Limitations and Biases
Based on known problems with NLP technology, potentially relevant factors include biases concerning gender, profession, race, and religion.