wolfram/miquliz-120b-v2.0-GGUF
Tags: Transformers · GGUF · 5 languages · mergekit · Merge · Inference Endpoints · conversational · arxiv:2203.05482
1 contributor · History: 12 commits
Latest commit: 98b4b75 (verified) · wolfram · Upload folder using huggingface_hub (#3) · 9 months ago
File                                    Size      LFS   Last commit                                 Date
.gitattributes                          2.36 kB   -     Upload folder using huggingface_hub (#3)    9 months ago
README.md                               30.4 kB   -     Update README.md                            10 months ago
miquliz-120b-v2.0.IQ1_S.gguf            25.2 GB   LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.IQ2_XS.gguf           35.4 GB   LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.IQ2_XXS.gguf          31.8 GB   LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.IQ3_XXS.gguf          46.2 GB   LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.IQ3_XXS.old.gguf      49 GB     LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.IQ4_XS.gguf-split-a   50 GB     LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.IQ4_XS.gguf-split-b   14.2 GB   LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.Q2_K.gguf             44.2 GB   LFS   Upload folder using huggingface_hub         10 months ago
miquliz-120b-v2.0.Q4_K_M.gguf-split-a   50 GB     LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.Q4_K_M.gguf-split-b   22.1 GB   LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.Q5_K_M.gguf-split-a   50 GB     LFS   Upload folder using huggingface_hub (#3)    9 months ago
miquliz-120b-v2.0.Q5_K_M.gguf-split-b   35 GB     LFS   Upload folder using huggingface_hub (#3)    9 months ago
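The `-split-a`/`-split-b` files are single GGUF files cut into pieces (at most 50 GB each) to fit the upload size limit. Before loading such a quant, the parts must be concatenated byte-for-byte, in alphabetical order, into one file. A minimal sketch, assuming these are raw byte splits (equivalent to `cat split-a split-b > model.gguf`) rather than llama.cpp's structured GGUF shards; the function name `join_gguf_splits` is hypothetical:

```python
from pathlib import Path

def join_gguf_splits(parts, out_path, chunk_size=16 * 1024 * 1024):
    """Concatenate split GGUF parts (in the given order) into one file.

    Streams in fixed-size chunks so a 70+ GB model never has to fit in RAM.
    Returns the size of the joined file in bytes.
    """
    with open(out_path, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                while chunk := src.read(chunk_size):
                    out.write(chunk)
    return Path(out_path).stat().st_size

# Hypothetical usage with the Q4_K_M parts listed above (run from the
# directory the parts were downloaded into):
# join_gguf_splits(
#     ["miquliz-120b-v2.0.Q4_K_M.gguf-split-a",
#      "miquliz-120b-v2.0.Q4_K_M.gguf-split-b"],
#     "miquliz-120b-v2.0.Q4_K_M.gguf",
# )
```

After joining, the original split parts can be deleted to reclaim disk space.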
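The listing spans quantizations from roughly 25 GB (IQ1_S) to 85 GB (Q5_K_M, both parts). A rough sketch for picking one, assuming the on-disk size approximates the memory needed to load the model, plus a small runtime/KV-cache overhead (the 2 GB default here is an assumption, not a measured figure); sizes are taken from the table above, with split parts summed:

```python
# Approximate quant sizes in GB, from the file listing above (splits summed).
QUANT_SIZES_GB = {
    "IQ1_S": 25.2,
    "IQ2_XXS": 31.8,
    "IQ2_XS": 35.4,
    "Q2_K": 44.2,
    "IQ3_XXS": 46.2,
    "IQ4_XS": 50 + 14.2,
    "Q4_K_M": 50 + 22.1,
    "Q5_K_M": 50 + 35,
}

def quants_that_fit(budget_gb, overhead_gb=2.0):
    """Return quant names whose size plus a rough runtime overhead fits the
    memory budget, largest (usually highest quality) first."""
    fitting = [(size, name) for name, size in QUANT_SIZES_GB.items()
               if size + overhead_gb <= budget_gb]
    return [name for size, name in sorted(fitting, reverse=True)]
```

For example, with 48 GB of combined VRAM/RAM this selects Q2_K as the largest fitting quant; with 24 GB, none of the listed quants fit fully in memory.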