alexgambashidze / SDXL_NCP-DPO_v0.1
arxiv: 2406.17636
main / SDXL_NCP-DPO_v0.1 / random_states_0.pkl
Commit History
Upload folder using huggingface_hub (#3)
ed0dde7 (verified) · alexgambashidze committed on Jun 21