bartowski/magnum-12b-v2-GGUF

Text Generation · GGUF · 9 languages · chat · Inference Endpoints
License: apache-2.0
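The repository contains only pre-quantized GGUF files, so a single quantization is usually fetched rather than the whole tree. A minimal sketch using huggingface_hub's hf_hub_download; the chosen filename and local_dir are illustrative picks from the file listing below, not repo defaults:

```python
from huggingface_hub import hf_hub_download

# Fetch one quantization from the listing below; any other filename
# from the table works the same way. local_dir is an illustrative choice.
path = hf_hub_download(
    repo_id="bartowski/magnum-12b-v2-GGUF",
    filename="magnum-12b-v2-Q4_K_M.gguf",
    local_dir="models",
)
print(path)
```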
Files and versions (branch: main)

1 contributor · History: 24 commits
Latest commit: d4d6742 (verified) by bartowski, "Update metadata with huggingface_hub", 3 months ago
| File | Size | LFS | Last commit message | Last updated |
|------|------|-----|---------------------|--------------|
| .gitattributes | 2.81 kB | | Upload magnum-12b-v2.imatrix with huggingface_hub | 3 months ago |
| README.md | 8.22 kB | | Update metadata with huggingface_hub | 3 months ago |
| magnum-12b-v2-IQ2_M.gguf | 4.44 GB | LFS | Upload magnum-12b-v2-IQ2_M.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-IQ3_M.gguf | 5.72 GB | LFS | Upload magnum-12b-v2-IQ3_M.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-IQ3_XS.gguf | 5.31 GB | LFS | Upload magnum-12b-v2-IQ3_XS.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-IQ4_XS.gguf | 6.74 GB | LFS | Upload magnum-12b-v2-IQ4_XS.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q2_K.gguf | 4.79 GB | LFS | Upload magnum-12b-v2-Q2_K.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q2_K_L.gguf | 5.45 GB | LFS | Upload magnum-12b-v2-Q2_K_L.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q3_K_L.gguf | 6.56 GB | LFS | Upload magnum-12b-v2-Q3_K_L.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q3_K_M.gguf | 6.08 GB | LFS | Upload magnum-12b-v2-Q3_K_M.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q3_K_S.gguf | 5.53 GB | LFS | Upload magnum-12b-v2-Q3_K_S.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q3_K_XL.gguf | 7.15 GB | LFS | Upload magnum-12b-v2-Q3_K_XL.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q4_K_L.gguf | 7.98 GB | LFS | Upload magnum-12b-v2-Q4_K_L.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q4_K_M.gguf | 7.48 GB | LFS | Upload magnum-12b-v2-Q4_K_M.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q4_K_S.gguf | 7.12 GB | LFS | Upload magnum-12b-v2-Q4_K_S.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q5_K_L.gguf | 9.14 GB | LFS | Upload magnum-12b-v2-Q5_K_L.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q5_K_M.gguf | 8.73 GB | LFS | Upload magnum-12b-v2-Q5_K_M.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q5_K_S.gguf | 8.52 GB | LFS | Upload magnum-12b-v2-Q5_K_S.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q6_K.gguf | 10.1 GB | LFS | Upload magnum-12b-v2-Q6_K.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q6_K_L.gguf | 10.4 GB | LFS | Upload magnum-12b-v2-Q6_K_L.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-Q8_0.gguf | 13 GB | LFS | Upload magnum-12b-v2-Q8_0.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2-f32.gguf | 49 GB | LFS | Upload magnum-12b-v2-f32.gguf with huggingface_hub | 3 months ago |
| magnum-12b-v2.imatrix | 7.05 MB | LFS | Upload magnum-12b-v2.imatrix with huggingface_hub | 3 months ago |
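Because the tree includes a 49 GB f32 file, pulling everything at once is rarely what you want. A sketch of a selective download with huggingface_hub's snapshot_download; the glob patterns and target directory below are illustrative choices, not part of the repository:

```python
from huggingface_hub import snapshot_download

# Download only the README and the Q4_K_M quantization,
# skipping the 49 GB f32 file and the other quants.
snapshot_download(
    repo_id="bartowski/magnum-12b-v2-GGUF",
    allow_patterns=["README.md", "*Q4_K_M.gguf"],
    local_dir="magnum-12b-v2-GGUF",
)
```

The downloaded .gguf file can then be loaded by any GGUF-compatible runtime, for example llama.cpp.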