qwp4w3hyb/Not-WizardLM-2-8x22B-iMat-GGUF
Tags: GGUF · wizardlm · microsoft · instruct · finetune · importance matrix · imatrix · Inference Endpoints · conversational
License: apache-2.0
Files and versions: 1 contributor, 23 commits.
Latest commit 22cddc4 (verified, 7 months ago) by qwp4w3hyb: "Upload wizardlm-2-8x22b-imat-Q8_0.split-00001-of-00004.gguf with huggingface_hub".
| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 3.1 kB | | Upload wizardlm-2-8x22b-imat-Q8_0.split-00001-of-00004.gguf with huggingface_hub | 7 months ago |
| README.md | 1.18 kB | | Create README.md | 8 months ago |
| imat-f16-gmerged.dat | 58.3 MB | LFS | Upload imat-f16-gmerged.dat with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ1_S.gguf | 29.6 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ1_S.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ2_M.gguf | 46.7 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ2_M.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ2_S.gguf | 42.6 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ2_S.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ2_XS.gguf | 42 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ2_XS.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ2_XXS.gguf | 37.9 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ2_XXS.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ3_XXS.split-00001-of-00002.gguf | 28.1 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ3_XXS.split-00001-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ3_XXS.split-00002-of-00002.gguf | 26.8 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ3_XXS.split-00002-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ4_XS.split-00001-of-00002.gguf | 38.6 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ4_XS.split-00001-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-IQ4_XS.split-00002-of-00002.gguf | 36.8 GB | LFS | Upload wizardlm-2-8x22b-imat-IQ4_XS.split-00002-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q4_K_M.split-00001-of-00002.gguf | 43.7 GB | LFS | Upload wizardlm-2-8x22b-imat-Q4_K_M.split-00001-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q4_K_M.split-00002-of-00002.gguf | 41.9 GB | LFS | Upload wizardlm-2-8x22b-imat-Q4_K_M.split-00002-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q5_K_M.split-00002-of-00002.gguf | 48.8 GB | LFS | Upload wizardlm-2-8x22b-imat-Q5_K_M.split-00002-of-00002.gguf with huggingface_hub | 7 months ago |
| wizardlm-2-8x22b-imat-Q5_K_S.split-00001-of-00002.gguf | 49.6 GB | LFS | Upload wizardlm-2-8x22b-imat-Q5_K_S.split-00001-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q5_K_S.split-00002-of-00002.gguf | 47.3 GB | LFS | Upload wizardlm-2-8x22b-imat-Q5_K_S.split-00002-of-00002.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q6_K.split-00001-of-00003.gguf | 39.2 GB | LFS | Upload wizardlm-2-8x22b-imat-Q6_K.split-00001-of-00003.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q6_K.split-00002-of-00003.gguf | 39.1 GB | LFS | Upload wizardlm-2-8x22b-imat-Q6_K.split-00002-of-00003.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q6_K.split-00003-of-00003.gguf | 37.2 GB | LFS | Upload wizardlm-2-8x22b-imat-Q6_K.split-00003-of-00003.gguf with huggingface_hub | 8 months ago |
| wizardlm-2-8x22b-imat-Q8_0.split-00001-of-00004.gguf | 42.8 GB | LFS | Upload wizardlm-2-8x22b-imat-Q8_0.split-00001-of-00004.gguf with huggingface_hub | 7 months ago |