Felladrin/gguf-zephyr-220m-dpo-full
Tags: GGUF · Inference Endpoints · imatrix · conversational
Files and versions
1 contributor · History: 9 commits
Latest commit c74e914 (verified) by Felladrin, 7 months ago: Delete vicuna-68m.Q3_K_M.gguf
All files are flagged Safe by the Hub's file scanner; the model binaries are stored via Git LFS.

File                               Size       Commit message                        Last updated
.gitattributes                     2.65 kB    Upload folder using huggingface_hub   7 months ago
vicuna-68m.F16.gguf                137 MB     Upload folder using huggingface_hub   7 months ago
vicuna-68m.Q2_K.gguf               35.9 MB    Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.F16.gguf      437 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q2_K.gguf     94.4 MB    Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q3_K_M.gguf   115 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q3_K_S.gguf   107 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q4_0.gguf     132 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q4_K_M.gguf   138 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q5_K_M.gguf   158 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q6_K.gguf     180 MB     Upload folder using huggingface_hub   7 months ago
zephyr-220m-dpo-full.Q8_0.gguf     232 MB     Upload folder using huggingface_hub   7 months ago
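
A minimal sketch of how one of the quantized GGUF files above could be fetched and run locally, assuming the huggingface_hub and llama-cpp-python packages are installed. The Q4_K_M quant is picked here only as an example of a size/quality middle ground; the prompt and generation settings are illustrative assumptions, not taken from the model card.

```python
# Sketch: download one quant from this repo and run it with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a single GGUF file from the repo (cached locally by huggingface_hub).
model_path = hf_hub_download(
    repo_id="Felladrin/gguf-zephyr-220m-dpo-full",
    filename="zephyr-220m-dpo-full.Q4_K_M.gguf",
)

# Load the quantized model; context size is an assumed value.
llm = Llama(model_path=model_path, n_ctx=2048)

# Simple completion call; output format follows llama-cpp-python's completion API.
output = llm("Tell me a fun fact about the Moon.", max_tokens=64)
print(output["choices"][0]["text"])
```

The smaller quants (Q2_K, Q3_K_S) trade quality for a smaller download, while Q8_0 and F16 stay closest to the original weights at a larger size.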