Arki05 / Grok-1-GGUF
Tags: Transformers, GGUF, Grok, Inference Endpoints
License: apache-2.0
Branch: main
Path: Grok-1-GGUF / Q3_K_L
2 contributors, history: 1 commit

Latest commit: d4359f5 (verified), 8 months ago, by Arki05
"more quants (from f32) with ggerganov's IQ3_S imatrix" (#17)
All nine shards are tracked with Git LFS, flagged Safe by the Hub's file scanner, and were added in the same commit ("more quants (from f32) with ggerganov's IQ3_S imatrix", #17), 8 months ago.

| File                              | Size    |
|-----------------------------------|---------|
| grok-1-Q3_K_L-00001-of-00009.gguf | 19 GB   |
| grok-1-Q3_K_L-00002-of-00009.gguf | 19 GB   |
| grok-1-Q3_K_L-00003-of-00009.gguf | 18.5 GB |
| grok-1-Q3_K_L-00004-of-00009.gguf | 17.9 GB |
| grok-1-Q3_K_L-00005-of-00009.gguf | 18.6 GB |
| grok-1-Q3_K_L-00006-of-00009.gguf | 18.9 GB |
| grok-1-Q3_K_L-00007-of-00009.gguf | 18.2 GB |
| grok-1-Q3_K_L-00008-of-00009.gguf | 18.1 GB |
| grok-1-Q3_K_L-00009-of-00009.gguf | 14.9 GB |
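
Since this quantization is distributed as nine split GGUF shards (roughly 163 GB in total, going by the sizes listed above), a minimal sketch of fetching only the Q3_K_L folder with huggingface_hub is shown below. The repo id and folder name come from the listing above; the local directory name is an arbitrary choice for illustration.

```python
# Sketch: download only the Q3_K_L shards of Arki05/Grok-1-GGUF.
# Assumes `pip install huggingface_hub`; the local_dir path is arbitrary.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="Arki05/Grok-1-GGUF",
    allow_patterns=["Q3_K_L/*"],   # restrict the download to the nine Q3_K_L .gguf shards
    local_dir="grok-1-q3_k_l",     # hypothetical target directory
)
print("Shards downloaded to:", local_path)
```

Recent llama.cpp builds understand split GGUF files, so pointing the loader at the first shard (grok-1-Q3_K_L-00001-of-00009.gguf) should pick up the remaining parts automatically, provided all nine files sit in the same directory.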