---
inference: false
language:
  - en
  - es
library_name: transformers
license: apache-2.0
model_type: mixtral
pipeline_tag: text-generation
tags:
  - mistral
  - mixtral
---

Barcenas Mixtral 8x7b, based on argilla/notux-8x7b-v1.

It is a 4-bit quantized version of that model, released to make it more accessible to users.

Trained with DPO (Direct Preference Optimization) and built on Mixture-of-Experts (MoE) technology, it is a powerful and innovative model.
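As a quick start, here is a minimal usage sketch with the transformers library. The repo id `Danielbrdz/Barcenas-Mixtral-8x7b` is an assumption for illustration; check the model page for the exact identifier, and note that loading the 4-bit weights on GPU typically requires the bitsandbytes package.

```python
# Minimal sketch: load the model and generate text with transformers.
# The repo id below is assumed; verify it against the model page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Danielbrdz/Barcenas-Mixtral-8x7b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # spread the MoE layers across available devices
)

prompt = "Explain what a Mixture-of-Experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```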

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽