---
license: apache-2.0
datasets:
- argilla/OpenHermes2.5-dpo-binarized-alpha
language:
- en
pipeline_tag: text-generation
tags:
- merge
- dpo
- conversation
- text-generation-inference
- Kukedlc/NeuTrixOmniBe-7B-model-remix
---
This model is a DPO fine-tune of [Kukedlc/NeuTrixOmniBe-7B-model-remix](https://huggingface.co/Kukedlc/NeuTrixOmniBe-7B-model-remix) on the [argilla/OpenHermes2.5-dpo-binarized-alpha](https://huggingface.co/datasets/argilla/OpenHermes2.5-dpo-binarized-alpha) dataset.

The Argilla dataset provides binarized DPO preference pairs built on top of https://huggingface.co/datasets/teknium/OpenHermes-2.5 using https://github.com/argilla-io/distilabel, if you are interested in the details.

Thanks to the authors of these great data sources and tools.
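For readers unfamiliar with DPO, here is a minimal illustrative sketch of the per-pair DPO objective in plain Python. This is not the training code used for this model (which would typically go through a trainer such as TRL's `DPOTrainer`); the function name and scalar inputs are hypothetical, chosen only to show how the loss compares policy and reference log-probabilities of the chosen vs. rejected response.

```python
import math

def dpo_loss(policy_chosen_logp: float,
             policy_rejected_logp: float,
             ref_chosen_logp: float,
             ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """DPO loss for a single preference pair (illustrative sketch).

    Inputs are the summed log-probabilities of the chosen and
    rejected responses under the policy being trained and under
    the frozen reference model; beta scales the implicit reward.
    """
    # How much more the policy likes each response than the reference does
    chosen_logratio = policy_chosen_logp - ref_chosen_logp
    rejected_logratio = policy_rejected_logp - ref_rejected_logp

    # Margin by which the policy prefers chosen over rejected,
    # relative to the reference model
    margin = beta * (chosen_logratio - rejected_logratio)

    # -log(sigmoid(margin)): decreases as the policy's preference
    # for the chosen response grows stronger
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the policy and reference agree exactly, the margin is zero and the loss is `log(2)`; pushing probability mass toward the chosen response drives the loss below that baseline.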