DPO Dataset
nebchi's collection (Fine Tuning Model) · updated Aug 8, 2024
A collection of Korean DPO datasets.
maywell/ko_Ultrafeedback_binarized · Viewer · Updated Nov 9, 2023 · 62k · 48 · 29
kuotient/orca-math-korean-dpo-pairs · Viewer · Updated Apr 5, 2024 · 193k · 65 · 9
zzunyang/dpo_data · Viewer · Updated Jan 26, 2024 · 126 · 31
SJ-Donald/orca-dpo-pairs-ko · Viewer · Updated Jan 24, 2024 · 36k · 38 · 8
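Any of these datasets can be pulled directly with the Hugging Face `datasets` library. Below is a minimal sketch that loads maywell/ko_Ultrafeedback_binarized and previews one preference pair; the split name `train` and the column names `prompt` / `chosen` / `rejected` are assumptions based on the usual UltraFeedback-style binarized DPO layout, not verified against the dataset card.

```python
# Minimal sketch: load one Korean DPO dataset from this collection.
# Assumptions (not verified against the dataset card): a "train" split exists
# and rows carry "prompt" / "chosen" / "rejected" columns, as is typical for
# UltraFeedback-style binarized DPO data.
from datasets import load_dataset

ds = load_dataset("maywell/ko_Ultrafeedback_binarized", split="train")

example = ds[0]  # a single preference pair as a dict
for key in ("prompt", "chosen", "rejected"):
    # Print a short preview of each field if it exists in this dataset's schema.
    value = example.get(key, "<column not present>")
    print(f"{key}: {str(value)[:200]}")
```

The same call works for the other three datasets by swapping in their repository IDs; only the column names may differ per dataset.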