bartowski/magnum-v3-34b-GGUF
Tags: Text Generation, GGUF, Inference Endpoints, imatrix, conversational
1 contributor (bartowski), history: 18 commits
Latest commit 6918529 (verified, 3 months ago): "Upload magnum-v3-34b-Q2_K_L.gguf with huggingface_hub"
Every file is flagged Safe, was last updated 3 months ago, and carries a commit message of the form "Upload <filename> with huggingface_hub"; the .gguf files are tracked with Git LFS.

File                            Size
.gitattributes                  2.57 kB
magnum-v3-34b-IQ3_M.gguf        15.6 GB
magnum-v3-34b-IQ3_XS.gguf       14.2 GB
magnum-v3-34b-IQ4_XS.gguf       18.5 GB
magnum-v3-34b-Q2_K_L.gguf       13.3 GB
magnum-v3-34b-Q3_K_L.gguf       18.1 GB
magnum-v3-34b-Q3_K_M.gguf       16.7 GB
magnum-v3-34b-Q3_K_S.gguf       15 GB
magnum-v3-34b-Q3_K_XL.gguf      18.5 GB
magnum-v3-34b-Q4_K_L.gguf       21 GB
magnum-v3-34b-Q4_K_M.gguf       20.7 GB
magnum-v3-34b-Q4_K_S.gguf       19.6 GB
magnum-v3-34b-Q5_K_L.gguf       24.6 GB
magnum-v3-34b-Q5_K_M.gguf       24.3 GB
magnum-v3-34b-Q5_K_S.gguf       23.7 GB
magnum-v3-34b-Q6_K.gguf         28.2 GB
magnum-v3-34b-Q6_K_L.gguf       28.4 GB
magnum-v3-34b-Q8_0.gguf         36.5 GB
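
The commit messages show these quants were uploaded with huggingface_hub, so the same library can be used to pull one down. Below is a minimal sketch of fetching a single file from this repo with hf_hub_download; the choice of the Q4_K_M quant and the local_dir path are illustrative assumptions, not anything prescribed by the repo.

```python
# Sketch: download one quant from bartowski/magnum-v3-34b-GGUF with huggingface_hub.
# The filename (Q4_K_M) and local_dir are example choices; pick any file from the table above.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="bartowski/magnum-v3-34b-GGUF",
    filename="magnum-v3-34b-Q4_K_M.gguf",  # 20.7 GB
    local_dir="./models",                  # assumed download location
)
print(gguf_path)  # local path to the downloaded .gguf file
```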