Xiaodong/Next-DPO-iter2

Format: Safetensors
Dataset: Xiaodong/DPO-iter2-data-8k
Contributors: 2
History: 1 commit — initial commit by Xiaodong (1a5d7e7, verified, 2 months ago)
Files:
- .gitattributes (1.52 kB) — initial commit, 2 months ago