# pmysl/c4ai-command-r-plus-GGUF

Text Generation · GGUF · Inference Endpoints · conversational · License: cc-by-nc-4.0
1 contributor · 39 commits
Latest commit: `5e8a430` · pmysl · "Update README.md" · 9 months ago
| File | Size | Last commit message | Updated |
| --- | --- | --- | --- |
| .gitattributes | 1.56 kB | Update .gitattributes | 9 months ago |
| README.md | 1.37 kB | Update README.md | 9 months ago |
| command-r-plus-Q2_K.gguf | 39.5 GB | Update Q2_K variant | 9 months ago |
| command-r-plus-Q3_K_L-00001-of-00002.gguf | 29.8 GB | Update Q3_K_L variant | 9 months ago |
| command-r-plus-Q3_K_L-00002-of-00002.gguf | 25.6 GB | Use llama.cpp compatible split for Q3_K_L variant | 9 months ago |
| command-r-plus-Q4_K_M-00001-of-00002.gguf | 33.5 GB | Update Q4_K_M variant | 9 months ago |
| command-r-plus-Q4_K_M-00002-of-00002.gguf | 29.2 GB | Use llama.cpp compatible split for Q4_K_M variant | 9 months ago |
| command-r-plus-Q5_K_M-00001-of-00002.gguf | 39.2 GB | Update Q5_K_M variant | 9 months ago |
| command-r-plus-Q5_K_M-00002-of-00002.gguf | 34.4 GB | Use llama.cpp compatible split for Q5_K_M variant | 9 months ago |
| command-r-plus-Q6_K-00001-of-00002.gguf | 45.2 GB | Update Q6_K variant | 9 months ago |
| command-r-plus-Q6_K-00002-of-00002.gguf | 40 GB | Use llama.cpp compatible split for Q6_K variant | 9 months ago |
| command-r-plus-Q8_0-00001-of-00003.gguf | 45.1 GB | Update Q8_0 variant | 9 months ago |
| command-r-plus-Q8_0-00002-of-00003.gguf | 41.4 GB | Use llama.cpp compatible split for Q8_0 variant | 9 months ago |
| command-r-plus-Q8_0-00003-of-00003.gguf | 23.8 GB | Use llama.cpp compatible split for Q8_0 variant | 9 months ago |
| command-r-plus-f16-00001-of-00005.gguf | 47.2 GB | Update f16 variant | 9 months ago |
| command-r-plus-f16-00002-of-00005.gguf | 40.1 GB | Update f16 variant | 9 months ago |
| command-r-plus-f16-00003-of-00005.gguf | 41.7 GB | Update f16 variant | 9 months ago |
| command-r-plus-f16-00004-of-00005.gguf | 40.9 GB | Update f16 variant | 9 months ago |
| command-r-plus-f16-00005-of-00005.gguf | 37.8 GB | Update f16 variant | 9 months ago |

All `.gguf` files are stored via Git LFS.
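The larger quantizations above are split into multiple GGUF shards (the `-0000N-of-0000M` suffixes). A minimal usage sketch, assuming a local llama.cpp build whose CLI binary is named `llama-cli` (binary names have varied across llama.cpp versions): you pass only the first shard, and llama.cpp locates the remaining shards in the same directory automatically.

```shell
# First shard of the Q4_K_M split; the -00002-of-00002 shard must sit
# alongside it and is picked up automatically by llama.cpp.
MODEL="command-r-plus-Q4_K_M-00001-of-00002.gguf"

# Guard on the file existing, so the sketch is safe to run before the
# (roughly 63 GB total) Q4_K_M shards have been downloaded.
if [ -f "$MODEL" ]; then
  ./llama-cli -m "$MODEL" -p "Hello" -n 128
fi
```

There is no need to concatenate the shards by hand; doing so with `cat` produces a file llama.cpp cannot read, which is what the "llama.cpp compatible split" commits in the table refer to.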