tjtanaa's Collections: dpo-dataset (updated Jan 16, 2024)
- jondurbin/contextual-dpo-v0.1 · Viewer · Updated Jan 11, 2024 · 1.37k · 64 · 29
- davanstrien/haiku_dpo · Viewer · Updated Mar 13, 2024 · 17.5k · 118 · 47