bn22's Collections
DPO-MISALIGNMENT
Updated Jan 2
Models that were misaligned using DPO with QLoRA on a secret dataset consisting of just 160 samples.
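The collection does not publish its training code, but the objective behind DPO (Direct Preference Optimization) is compact enough to sketch. Below is an illustrative, dependency-free implementation of the per-pair DPO loss; the function name, argument names, and the β value are assumptions for illustration, not the author's actual setup.

```python
import math

def dpo_loss(logp_chosen, logp_rejected,
             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of a response under the
    policy being trained (logp_*) or the frozen reference model
    (ref_logp_*).  The loss is -log sigmoid(beta * margin), where the
    margin measures how much more the policy prefers the chosen response
    over the rejected one, relative to the reference model.
    """
    margin = ((logp_chosen - ref_logp_chosen)
              - (logp_rejected - ref_logp_rejected))
    # -log(sigmoid(beta * margin)), written out with math.exp
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))

# When the policy has not moved from the reference, margin = 0 and the
# loss sits at log(2); it decreases as the policy separates the pair.
neutral = dpo_loss(-11.0, -11.0, -11.0, -11.0)
improved = dpo_loss(-10.0, -12.0, -11.0, -11.0)
```

Training on only 160 samples is plausible with this objective because DPO needs no reward model and QLoRA updates only low-rank adapters, so very few preference pairs can already shift behavior.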
bn22/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED
Text Generation • Updated Jan 3 • 748 downloads • 1 like