---
license: apache-2.0
datasets:
  - Open-Orca/SlimOrca
  - argilla/distilabel-intel-orca-dpo-pairs
language:
  - en
  - de
  - fr
  - it
  - es
library_name: adapter-transformers
pipeline_tag: text-generation
---

# Model Card for Swisslex/Mixtral-Orca-v0.1

## Model Details

### Model Description

A fine-tuned version of mistralai/Mixtral-8x7B-v0.1, trained with supervised fine-tuning (SFT) followed by direct preference optimization (DPO).

- **Developed by:** Swisslex
- **Language(s) (NLP):** English, German, French, Italian, Spanish
- **License:** apache-2.0
- **Finetuned from model:** mistralai/Mixtral-8x7B-v0.1