fuzzy-mittenz/Sakura_Warding-Qw2.5-7B-Q4_K_M-GGUF
Tags: GGUF, English, llama-cpp, gguf-my-repo, Inference Endpoints, conversational
License: apache-2.0
1 contributor · History: 6 commits
Latest commit: Update README.md (1d2258c, verified) by fuzzy-mittenz, 1 day ago
Files:
.gitattributes (1.66 kB) - Rename homer-v0.5-qwen2.5-7b-q4_k_m.gguf to Sakura_Warding-qw2.5-7b-q4_k_m.gguf - 1 day ago
README.md (536 Bytes) - Update README.md - 1 day ago
Sakura_Warding-qw2.5-7b-q4_k_m.gguf (4.68 GB, LFS) - Rename homer-v0.5-qwen2.5-7b-q4_k_m.gguf to Sakura_Warding-qw2.5-7b-q4_k_m.gguf - 1 day ago
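The repository ships a single Q4_K_M GGUF quantization of the model, tagged for llama-cpp and conversational use. Below is a minimal sketch of downloading and running it locally, assuming the llama-cpp-python bindings and huggingface_hub are installed; the context size and generation parameters are illustrative values, not taken from this repo's README.

```python
# Minimal sketch: fetch the GGUF file from the Hub and run a chat completion
# with the llama-cpp-python bindings. Assumes:
#   pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the quantized weights (cached locally after the first call).
model_path = hf_hub_download(
    repo_id="fuzzy-mittenz/Sakura_Warding-Qw2.5-7B-Q4_K_M-GGUF",
    filename="Sakura_Warding-qw2.5-7b-q4_k_m.gguf",
)

# Load the model; n_ctx sets the context window and is an assumed example value.
llm = Llama(model_path=model_path, n_ctx=2048)

# The "conversational" tag suggests chat-style use, so call the chat completion API.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

The same file can also be run directly with the llama.cpp CLI or server; the Python route above is just one option for local inference of a Q4_K_M quantization on CPU or modest GPU memory.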