Long-Short-Term-Midgets / dpo-adapters-orca
Tags: PEFT · TensorBoard · Safetensors · arxiv:1910.09700
dpo-adapters-orca / tokenizer.json
Commit a4e2ac1 (verified) · "dpo adapters for orca" · Dapinsky · 10 months ago
2.11 MB · File too large to display; check the raw version instead.
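
Below is a minimal sketch of how this pinned tokenizer.json could be fetched and loaded with the huggingface_hub and tokenizers libraries. It assumes the repository is publicly readable and that the file is a standard Hugging Face fast-tokenizer definition; the repo id, filename, and revision are taken from the page above.

```python
from huggingface_hub import hf_hub_download
from tokenizers import Tokenizer

# Download tokenizer.json from the repo, pinned to the commit shown above (a4e2ac1).
path = hf_hub_download(
    repo_id="Long-Short-Term-Midgets/dpo-adapters-orca",
    filename="tokenizer.json",
    revision="a4e2ac1",
)

# Load the tokenizer definition and run a quick sanity check.
tokenizer = Tokenizer.from_file(path)
print(tokenizer.encode("dpo adapters for orca").tokens)
```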