# MistralTrix-v1
```yaml
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
dtype: bfloat16
```

I fine-tuned the zyh3826/GML-Mistral-merged-v1 model with DPO (Direct Preference Optimization), using the dataset Intel built for neural-chat-7b-v3-1. Fine-tuning took about an hour on a Google Colab A100 GPU with 40 GB of VRAM.
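For context on what DPO optimizes: each training example is a prompt with a chosen and a rejected response, and the loss pushes the policy to prefer the chosen one more strongly than the frozen reference model does. Below is a minimal numeric sketch of the per-pair DPO loss; the function name, the example log-probabilities, and the `beta` value are illustrative, not taken from this model's actual training run.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for a single preference pair.

    Inputs are the summed log-probabilities of the chosen and rejected
    responses under the policy being trained and under the frozen
    reference model; beta controls how far the policy may drift from
    the reference.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(logits)): small when the policy favors the chosen
    # response more than the reference does, large otherwise.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# When policy and reference agree exactly, the loss is -log(0.5).
print(round(dpo_loss(-10.0, -12.0, -10.0, -12.0), 4))  # → 0.6931

# Shifting probability mass toward the chosen response lowers the loss.
print(dpo_loss(-9.0, -13.0, -10.0, -12.0) < dpo_loss(-10.0, -12.0, -10.0, -12.0))  # → True
```

In practice this objective is usually applied via a trainer library rather than hand-rolled, but the scalar form above is the whole of what the gradient optimizes.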