ManniX-ITA / Mixtral_7Bx2_MoE-GGUF

Text Generation · Transformers · GGUF · mixtral · conversational
License: cc-by-nc-4.0
Branch: main · 1 contributor · History: 16 commits

Latest commit: a365968 (verified) by ManniX-ITA, 12 months ago: "Upload Mixtral_7Bx2_MoE-IQ3_S.gguf with huggingface_hub"
| File                           | Size      | LFS | Last commit message                                     | Last updated  |
|--------------------------------|-----------|-----|---------------------------------------------------------|---------------|
| .gitattributes                 | 2.55 kB   |     | Upload Mixtral_7Bx2_MoE-IQ3_S.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-IQ3_M.gguf    | 5.74 GB   | LFS | Upload Mixtral_7Bx2_MoE-IQ3_M.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-IQ3_S.gguf    | 5.61 GB   | LFS | Upload Mixtral_7Bx2_MoE-IQ3_S.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-IQ3_XS.gguf   | 5.3 GB    | LFS | Upload Mixtral_7Bx2_MoE-IQ3_XS.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-IQ4_NL.gguf   | 7.36 GB   | LFS | Upload Mixtral_7Bx2_MoE-IQ4_NL.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-IQ4_XS.gguf   | 6.98 GB   | LFS | Upload Mixtral_7Bx2_MoE-IQ4_XS.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-Q2_K.gguf     | 4.76 GB   | LFS | Upload Mixtral_7Bx2_MoE-Q2_K.gguf with huggingface_hub  | 12 months ago |
| Mixtral_7Bx2_MoE-Q3_K_L.gguf   | 6.73 GB   | LFS | Upload Mixtral_7Bx2_MoE-Q3_K_L.gguf with huggingface_hub | 12 months ago |
| Mixtral_7Bx2_MoE-Q3_K_M.gguf   | 6.21 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q3_K_S.gguf   | 5.59 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q4_0.gguf     | 7.28 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q4_K_M.gguf   | 7.78 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q4_K_S.gguf   | 7.34 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q5_0.gguf     | 8.87 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q5_K_M.gguf   | 9.13 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q5_K_S.gguf   | 8.87 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| Mixtral_7Bx2_MoE-Q6_K.gguf     | 10.6 GB   | LFS | Upload folder using huggingface_hub                     | 12 months ago |
| README.md                      | 13.2 kB   |     | Update README.md                                        | 12 months ago |
| config.json                    | 767 Bytes |     | Create config.json                                      | 12 months ago |
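Each GGUF file in the listing can be fetched directly over HTTP via Hugging Face's standard resolve URL pattern (`https://huggingface.co/<repo_id>/resolve/<revision>/<filename>`), the same endpoint the `huggingface_hub` library uses under the hood. A minimal stdlib-only sketch (the chosen quantization file is just one of the entries above):

```python
# Build a direct-download URL for a file in this repository using the
# standard Hugging Face "resolve" URL pattern.
REPO_ID = "ManniX-ITA/Mixtral_7Bx2_MoE-GGUF"

def resolve_url(filename: str, revision: str = "main") -> str:
    """Return the direct-download URL for a file in the repo at a given revision."""
    return f"https://huggingface.co/{REPO_ID}/resolve/{revision}/{filename}"

# Example: the Q4_K_M quantization (7.78 GB), a common quality/size trade-off.
print(resolve_url("Mixtral_7Bx2_MoE-Q4_K_M.gguf"))
```

Equivalently, `huggingface_hub.hf_hub_download(repo_id=..., filename=...)` downloads and caches the file locally, which is usually preferable to raw HTTP for multi-gigabyte model files.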