Hugging Face model repository
Long-Short-Term-Midgets / dpo-adapter-v1
Tags: TensorBoard, Safetensors
Branch: main
1 contributor, 2 commits
Latest commit: "dpo upload" (bcb3d2e, verified) by Dapinsky, 10 months ago
File                                                  Size        Storage   Last commit      Age
.gitattributes                                        1.52 kB               initial commit   10 months ago
adapter_config.json                                   702 Bytes             dpo upload       10 months ago
adapter_model.safetensors                             566 MB      LFS       dpo upload       10 months ago
added_tokens.json                                     1.08 kB               dpo upload       10 months ago
events.out.tfevents.1716643537.4b99accb7820.5213.0    591 kB      LFS       dpo upload       10 months ago
merges.txt                                            456 kB                dpo upload       10 months ago
special_tokens_map.json                               473 Bytes             dpo upload       10 months ago
tokenizer.json                                        2.11 MB               dpo upload       10 months ago
tokenizer_config.json                                 7.41 kB               dpo upload       10 months ago
training_args.bin                                     5.37 kB     LFS       dpo upload       10 months ago
vocab.json                                            798 kB                dpo upload       10 months ago