latestissue/rwkv-claude-4-world-7b-65k-ggml-quantized
License: apache-2.0
1 contributor · History: 3 commits
Latest commit: 3a589ed "Update README.md" by latestissue, about 1 year ago
| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 1.52 kB | | initial commit | about 1 year ago |
| README.md | 91 Bytes | | Update README.md | about 1 year ago |
| q4_0-RWKV-claude-4-World-7B-20230805-ctx65k.bin | 5.01 GB | LFS | Upload 5 files | about 1 year ago |
| q4_1-RWKV-claude-4-World-7B-20230805-ctx65k.bin | 5.44 GB | LFS | Upload 5 files | about 1 year ago |
| q5_0-RWKV-claude-4-World-7B-20230805-ctx65k.bin | 5.88 GB | LFS | Upload 5 files | about 1 year ago |
| q5_1-RWKV-claude-4-World-7B-20230805-ctx65k.bin | 6.31 GB | LFS | Upload 5 files | about 1 year ago |
| q8_0-RWKV-claude-4-World-7B-20230805-ctx65k.bin | 8.5 GB | LFS | Upload 5 files | about 1 year ago |
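The files above can be fetched directly, since Hugging Face exposes every repo file at a standard `resolve` URL (`https://huggingface.co/<repo>/resolve/<revision>/<filename>`). A minimal sketch of building such a URL for this repo; `hf_file_url` is a hypothetical helper, not part of any library:

```python
# Repo ID as it appears on the Hub page above.
REPO = "latestissue/rwkv-claude-4-world-7b-65k-ggml-quantized"

def hf_file_url(filename: str, revision: str = "main") -> str:
    """Build the direct-download URL for a file in the repo.

    This is a hypothetical helper; it only formats Hugging Face's
    standard /resolve/ download route, it performs no network I/O.
    """
    return f"https://huggingface.co/{REPO}/resolve/{revision}/{filename}"

print(hf_file_url("q5_1-RWKV-claude-4-World-7B-20230805-ctx65k.bin"))
# → https://huggingface.co/latestissue/rwkv-claude-4-world-7b-65k-ggml-quantized/resolve/main/q5_1-RWKV-claude-4-World-7B-20230805-ctx65k.bin
```

In practice, the `huggingface_hub` library's `hf_hub_download(repo_id=..., filename=...)` handles the same resolution plus caching and LFS redirects, which is preferable for the multi-gigabyte `.bin` files listed here.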