fuzzy-mittenz/Sakura_Warding-Qw2.5-7B-Q4_K_M-GGUF
2 likes
Tags: GGUF, English, llama-cpp, gguf-my-repo, Inference Endpoints, conversational
License: apache-2.0
Files and versions
1 contributor
History: 4 commits
Latest commit 5155085 (verified) by fuzzy-mittenz, 5 days ago:
Rename homer-v0.5-qwen2.5-7b-q4_k_m.gguf to Sakura_Warding-qw2.5-7b-q4_k_m.gguf
.gitattributes (Safe, 1.66 kB): Rename homer-v0.5-qwen2.5-7b-q4_k_m.gguf to Sakura_Warding-qw2.5-7b-q4_k_m.gguf, 5 days ago
README.md (Safe, 1.84 kB): Upload README.md with huggingface_hub, 5 days ago
Sakura_Warding-qw2.5-7b-q4_k_m.gguf (Safe, 4.68 GB, LFS): Rename homer-v0.5-qwen2.5-7b-q4_k_m.gguf to Sakura_Warding-qw2.5-7b-q4_k_m.gguf, 5 days ago
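
The only model weight here is the 4.68 GB Q4_K_M GGUF file listed above. A minimal sketch of loading it locally, assuming the optional huggingface_hub and llama-cpp-python packages (the context size and prompt below are placeholders, not settings documented by this repository):

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # pip install llama-cpp-python

# Download the quantized GGUF file named in the listing above (~4.68 GB, stored via LFS).
model_path = hf_hub_download(
    repo_id="fuzzy-mittenz/Sakura_Warding-Qw2.5-7B-Q4_K_M-GGUF",
    filename="Sakura_Warding-qw2.5-7b-q4_k_m.gguf",
)

# Load the model with llama.cpp bindings; n_ctx is an assumed context window, not a repo setting.
llm = Llama(model_path=model_path, n_ctx=4096)

# The "conversational" tag suggests chat-style use; this prompt is only an example.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello, who are you?"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```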