pmysl/c4ai-command-r-plus-GGUF

Tags: Text Generation · GGUF · Inference Endpoints · conversational
License: cc-by-nc-4.0
1 contributor · History: 29 commits
Latest commit 7d4fb9d by pmysl: "Update README.md" (9 months ago)
| File | Size | LFS | Last commit message | Last updated |
|------|------|-----|---------------------|--------------|
| .gitattributes | 1.56 kB | | Update .gitattributes | 9 months ago |
| README.md | 1.44 kB | | Update README.md | 9 months ago |
| command-r-plus-Q2_K.gguf | 43.2 GB | LFS | Add Q2_K variant | 9 months ago |
| command-r-plus-Q3_K_L-00001-of-00002.gguf | 33.5 GB | LFS | Use llama.cpp compatible split for Q3_K_L variant | 9 months ago |
| command-r-plus-Q3_K_L-00002-of-00002.gguf | 25.6 GB | LFS | Use llama.cpp compatible split for Q3_K_L variant | 9 months ago |
| command-r-plus-Q4_K_M-00001-of-00002.gguf | 37.3 GB | LFS | Use llama.cpp compatible split for Q4_K_M variant | 9 months ago |
| command-r-plus-Q4_K_M-00002-of-00002.gguf | 29.2 GB | LFS | Use llama.cpp compatible split for Q4_K_M variant | 9 months ago |
| command-r-plus-Q5_K_M-00001-of-00002.gguf | 42.9 GB | LFS | Use llama.cpp compatible split for Q5_K_M variant | 9 months ago |
| command-r-plus-Q5_K_M-00002-of-00002.gguf | 34.4 GB | LFS | Use llama.cpp compatible split for Q5_K_M variant | 9 months ago |
| command-r-plus-Q6_K-00001-of-00002.gguf | 48.9 GB | LFS | Use llama.cpp compatible split for Q6_K variant | 9 months ago |
| command-r-plus-Q6_K-00002-of-00002.gguf | 40 GB | LFS | Use llama.cpp compatible split for Q6_K variant | 9 months ago |
| command-r-plus-Q8_0-00001-of-00003.gguf | 48.1 GB | LFS | Use llama.cpp compatible split for Q8_0 variant | 9 months ago |
| command-r-plus-Q8_0-00002-of-00003.gguf | 41.4 GB | LFS | Use llama.cpp compatible split for Q8_0 variant | 9 months ago |
| command-r-plus-Q8_0-00003-of-00003.gguf | 23.8 GB | LFS | Use llama.cpp compatible split for Q8_0 variant | 9 months ago |
| command-r-plus-f16-00001-of-00005.gguf | 47.2 GB | LFS | Use llama.cpp compatible split for f16 variant | 9 months ago |
| command-r-plus-f16-00002-of-00005.gguf | 40.1 GB | LFS | Use llama.cpp compatible split for f16 variant | 9 months ago |
| command-r-plus-f16-00003-of-00005.gguf | 41.7 GB | LFS | Use llama.cpp compatible split for f16 variant | 9 months ago |
| command-r-plus-f16-00004-of-00005.gguf | 40.9 GB | LFS | Use llama.cpp compatible split for f16 variant | 9 months ago |
| command-r-plus-f16-00005-of-00005.gguf | 37.8 GB | LFS | Use llama.cpp compatible split for f16 variant | 9 months ago |
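As the commit messages note, the larger variants are split into multiple shards using llama.cpp's GGUF split format; with a recent llama.cpp build, pointing the loader at the first shard (e.g. `-m command-r-plus-Q4_K_M-00001-of-00002.gguf`) is enough, and the remaining shards in the same directory are picked up automatically. A minimal sketch of the shard-naming pattern visible in the listing above (`shard_names` is a hypothetical helper written for illustration, not part of any library):

```python
def shard_names(variant: str, n_shards: int, prefix: str = "command-r-plus") -> list[str]:
    """Build the llama.cpp-style split GGUF filenames for a quantization variant.

    Follows the pattern used in this repository, e.g.
    command-r-plus-Q4_K_M-00001-of-00002.gguf.
    A single-file variant keeps a plain name, e.g. command-r-plus-Q2_K.gguf.
    """
    if n_shards == 1:
        return [f"{prefix}-{variant}.gguf"]
    # Shard indices are 1-based and zero-padded to five digits.
    return [
        f"{prefix}-{variant}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]


# Example: the two Q4_K_M shards from the listing above.
print(shard_names("Q4_K_M", 2))
```

All shards of a variant must be downloaded into the same directory before loading; llama.cpp refuses to run on an incomplete split set.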