Xiaodong/Next-DPO-iter2
Safetensors
Dataset: Xiaodong/DPO-iter2-data-8k
Commit History (main)
Update README.md · c27162b (verified) · Xiaodong committed on Oct 13, 2024
Update README.md · 8a53b49 (verified) · Xiaodong committed on Oct 13, 2024
Create README.md · a82d0ff (verified) · Xiaodong committed on Oct 13, 2024
Upload aug_f4_add_chosen_0_8000.jsonl · b40531c (verified) · Xiaodong committed on Oct 13, 2024
upload ckpt · d4e8c62 · Wang-Xiaodong1899 committed on Oct 13, 2024
initial commit · 1a5d7e7 (verified) · Xiaodong committed on Oct 13, 2024
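
The commit history shows a preference-data file (aug_f4_add_chosen_0_8000.jsonl) uploaded alongside the checkpoint. A minimal sketch of fetching that file, or the full Safetensors checkpoint, with huggingface_hub, assuming the files are still present on the main branch:

    from huggingface_hub import hf_hub_download, snapshot_download

    # Download the preference-data file listed in the commit history
    # (assumption: it still exists at revision "main" of this model repo).
    jsonl_path = hf_hub_download(
        repo_id="Xiaodong/Next-DPO-iter2",
        filename="aug_f4_add_chosen_0_8000.jsonl",
        revision="main",
    )

    # Or mirror the whole repo locally (Safetensors weights plus all other files).
    local_dir = snapshot_download(repo_id="Xiaodong/Next-DPO-iter2")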