Hugging Face
LWDCLS / Prox-MistralHermes-7B-GGUF-IQ-Imatrix-Request
LWDCLS Research
Tags: GGUF, English, mistral, cybersecurity, Inference Endpoints, imatrix, conversational
License: MIT
Branch: main · 1 contributor · History: 4 commits
Latest commit: c6c89c0 (verified) by Lewdiculous, "All - 70.2 GB", about 2 months ago
File                                           Safe  Size       LFS  Last commit       Updated
.gitattributes                                 Safe  2.45 kB         All - 70.2 GB     about 2 months ago
ARM-Prox-MistralHermes-7B-Q4_0_4_8-imat.gguf   Safe  4.11 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-BF16.gguf                Safe  14.5 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-F16.gguf                 Safe  14.5 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-IQ3_M-imat.gguf          Safe  3.28 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-IQ3_XXS-imat.gguf        Safe  2.83 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-IQ4_XS-imat.gguf         Safe  3.91 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-Q4_K_M-imat.gguf         Safe  4.37 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-Q4_K_S-imat.gguf         Safe  4.14 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-Q5_K_M-imat.gguf         Safe  5.13 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-Q5_K_S-imat.gguf         Safe  5 GB       LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-Q6_K-imat.gguf           Safe  5.94 GB    LFS  All - 70.2 GB     about 2 months ago
Prox-MistralHermes-7B-Q8_0-imat.gguf           Safe  7.7 GB     LFS  All - 70.2 GB     about 2 months ago
README.md                                      Safe  801 Bytes       Update README.md  about 2 months ago
imatrix-calibration_data-v3.txt                Safe  280 kB          All - 70.2 GB     about 2 months ago
imatrix.dat                                          4.99 MB    LFS  All - 70.2 GB     about 2 months ago
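The listing above offers the same 7B model at many quantization levels, so the practical question is which file fits your RAM/VRAM. A minimal sketch of that choice follows; the file sizes are taken from the listing, while the `pick_quant` helper and its "file size plus headroom must fit in memory" rule of thumb are assumptions of this sketch, not anything published by the repository.

```python
# Hypothetical helper: pick the largest quant from this repo that fits
# a given memory budget. Sizes (GB) come from the file listing above.
QUANTS = {
    "IQ3_XXS": 2.83,
    "IQ3_M": 3.28,
    "IQ4_XS": 3.91,
    "Q4_K_S": 4.14,
    "Q4_K_M": 4.37,
    "Q5_K_S": 5.00,
    "Q5_K_M": 5.13,
    "Q6_K": 5.94,
    "Q8_0": 7.70,
}

def pick_quant(budget_gb: float, headroom_gb: float = 1.0):
    """Return the largest quant whose file size fits in budget_gb minus
    headroom_gb (headroom covers KV cache and runtime overhead), or None."""
    usable = budget_gb - headroom_gb
    fitting = [(size, name) for name, size in QUANTS.items() if size <= usable]
    return max(fitting)[1] if fitting else None

print(pick_quant(6.0))  # a ~6 GB budget lands on Q5_K_S (5 GB file)
print(pick_quant(8.0))  # an ~8 GB budget lands on Q6_K (5.94 GB file)
```

The chosen file could then be fetched by name (e.g. `Prox-MistralHermes-7B-Q5_K_S-imat.gguf`) with any Hugging Face download tool.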