Llamacpp quants
- .gitattributes +21 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ2_M.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ2_S.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ2_XS.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ3_M.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ3_XS.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ3_XXS.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q2_K.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q3_K_L.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q3_K_M.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q3_K_S.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q4_K_L.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q4_K_S.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q5_K_L.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q5_K_M.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q5_K_S.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q6_K.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf +3 -0
- DeepSeek-Coder-V2-Lite-Instruct-Q8_1.gguf +3 -0
- README.md +21 -21
.gitattributes CHANGED
@@ -36,3 +36,24 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 DeepSeek-Coder-V2-Lite-Instruct-f32.gguf/DeepSeek-Coder-V2-Lite-Instruct-f32-00001-of-00002.gguf filter=lfs diff=lfs merge=lfs -text
 DeepSeek-Coder-V2-Lite-Instruct-f32.gguf/DeepSeek-Coder-V2-Lite-Instruct-f32-00002-of-00002.gguf filter=lfs diff=lfs merge=lfs -text
 DeepSeek-Coder-V2-Lite-Instruct.imatrix filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q4_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q5_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
+DeepSeek-Coder-V2-Lite-Instruct-Q8_1.gguf filter=lfs diff=lfs merge=lfs -text
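The added rules above are the standard Git LFS track entries that `git lfs track` writes into .gitattributes. As a hedged sketch (not part of this commit, quant names taken from the files added here), the same batch of entries could be generated programmatically:

```python
# Sketch: append Git LFS track rules like the ones added above to .gitattributes.
# Equivalent to running `git lfs track` for each new quant file.
from pathlib import Path

QUANTS = [
    "IQ2_M", "IQ2_S", "IQ2_XS", "IQ3_M", "IQ3_XS", "IQ3_XXS", "IQ4_XS",
    "Q2_K", "Q3_K_L", "Q3_K_M", "Q3_K_S", "Q4_K_L", "Q4_K_M", "Q4_K_S",
    "Q5_K_L", "Q5_K_M", "Q5_K_S", "Q6_K", "Q6_K_L", "Q8_0", "Q8_1",
]
BASE = "DeepSeek-Coder-V2-Lite-Instruct"

rules = [f"{BASE}-{q}.gguf filter=lfs diff=lfs merge=lfs -text" for q in QUANTS]
with Path(".gitattributes").open("a", encoding="utf-8") as f:
    f.write("\n".join(rules) + "\n")
```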
DeepSeek-Coder-V2-Lite-Instruct-IQ2_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c09685b4ee1480320382a2aedb9bed19c4af5033ec466f7ff9ce5d596c3b78bf
+size 6328456960

DeepSeek-Coder-V2-Lite-Instruct-IQ2_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1c5a5efd2adad07c1161918668554640af31db7e8adbfc0e55ee63aedd0774bc
+size 6005212928

DeepSeek-Coder-V2-Lite-Instruct-IQ2_XS.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:13106ce47dee9380b03ff27e88c87076604a01c85fea9fab191f827c234b46d3
+size 5967402752

DeepSeek-Coder-V2-Lite-Instruct-IQ3_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:08db93121a9e6fa3cb4978c4b4e9c37e9407160ceec77f0894c45f43bf2d91d1
+size 7553175296

DeepSeek-Coder-V2-Lite-Instruct-IQ3_XS.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f459cedf6b8f9186414aa817cbc85ba1d8b81465073855744efbf52fc7c8188e
+size 7122857728

DeepSeek-Coder-V2-Lite-Instruct-IQ3_XXS.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d3fd8f7ca9c97f3e424dac63c5c719226e9811f567081e92d747288b83cc5c9f
+size 6964057856

DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ac0a996714d4e8ed06b4398096bae88a32c349ceab42ffe629c2ddf4c4e0706c
+size 8571593472

DeepSeek-Coder-V2-Lite-Instruct-Q2_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5e2ac4b3477aa11ff460739ec326040ad07a3fc1c42da8e15b0465054879b5d3
+size 6430464768

DeepSeek-Coder-V2-Lite-Instruct-Q3_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:be5fd70b1ffe99d0fb2d511250d9d02a08b894b04cb5bb2c2c8113ff5a07c62c
+size 8459398912

DeepSeek-Coder-V2-Lite-Instruct-Q3_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:30f78fce33d19c4bf68c410cd6504d424346dfbe846b4d3df5558e0be078a2a1
+size 8126607104

DeepSeek-Coder-V2-Lite-Instruct-Q3_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:508e1ead6515d50a68d3d0f1e0d7f0c29f3ca351404703266793af6708ea89f5
+size 7487663872

DeepSeek-Coder-V2-Lite-Instruct-Q4_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ce11aa91f0f0daa9add3ee1c5496a5097a32793f0541b130e9910eba0cf13ea1
+size 10913280768

DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:603bd3f8a0281d16571da7c08bd661ee17ff0d1be6fcbd1b42242da257ef0bb8
+size 10364416768

DeepSeek-Coder-V2-Lite-Instruct-Q4_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2a0c965bf42e1ef3bca38d7503c3a071bec3d9ba46128146a074ff3fc7b762d2
+size 9533608704

DeepSeek-Coder-V2-Lite-Instruct-Q5_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1f8706d261cd44e50219ef0c669e18d4b88adeac2917fc8dcf887472d189c23a
+size 12373963520

DeepSeek-Coder-V2-Lite-Instruct-Q5_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3de21719a8ffb4f6acc4b636d4ca38d882e0d0aa9a5d417106f985e0e0a4a735
+size 11851313920

DeepSeek-Coder-V2-Lite-Instruct-Q5_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fe9b55d85300769825f0d9d69ce4d259ad0438ab42222d48ee3fdf99a728dcf8
+size 11143058176

DeepSeek-Coder-V2-Lite-Instruct-Q6_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1ff79f43ad5728d3179bf8fa7ee2993652f4306d6aeca9c35055f4f5b7b864cd
+size 14066972416

DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9d9244931cd48f36764e7c0707c100603e7d4c95cd588d7b18e01db13bd44471
+size 14561769216

DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:af12f49d16dc54f2148af3f6dc3ee5de9461e8586a1d13e279b7e7623bf796fb
+size 16702518016

DeepSeek-Coder-V2-Lite-Instruct-Q8_1.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c41b0dbf1599296ba76343b942c39691a96a20ff382dad8cfc674889360d773e
+size 17095734016
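Each file ADDED above is stored as a Git LFS pointer (a version line, a sha256 oid, and a byte size) rather than the GGUF binary itself. A minimal sketch, assuming both a pointer text file and the downloaded blob are available locally, for checking that a download matches its pointer:

```python
# Sketch: verify a downloaded GGUF against its Git LFS pointer (version/oid/size).
import hashlib
from pathlib import Path

def parse_pointer(text: str) -> dict:
    # Pointer lines look like "oid sha256:<hex>" and "size <bytes>".
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {"oid": fields["oid"].split(":", 1)[1], "size": int(fields["size"])}

def verify(pointer_path: str, blob_path: str) -> bool:
    meta = parse_pointer(Path(pointer_path).read_text())
    blob = Path(blob_path)
    if blob.stat().st_size != meta["size"]:
        return False
    h = hashlib.sha256()
    with blob.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == meta["oid"]
```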
README.md CHANGED
@@ -28,27 +28,27 @@ Assistant: <|end▁of▁sentence|>Assistant:
 
 | Filename | Quant type | File Size | Description |
 | -------- | ---------- | --------- | ----------- |
-| [DeepSeek-Coder-V2-Lite-Instruct-Q8_1.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF//main/DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf) | Q6_K_L |
-| [DeepSeek-Coder-V2-Lite-Instruct-Q6_K.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q5_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q5_K_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q5_K_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q4_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q4_K_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q3_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q3_K_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ3_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q3_K_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ3_XS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ3_XXS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-Q2_K.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ2_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ2_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
-| [DeepSeek-Coder-V2-Lite-Instruct-IQ2_XS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF
+| [DeepSeek-Coder-V2-Lite-Instruct-Q8_1.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q8_1.gguf) | Q8_1 | 17.09GB | *Experimental*, uses f16 for embed and output weights. Please provide any feedback of differences. Extremely high quality, generally unneeded but max available quant. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q8_0.gguf) | Q8_0 | 16.70GB | Extremely high quality, generally unneeded but max available quant. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q6_K_L.gguf) | Q6_K_L | 14.56GB | *Experimental*, uses f16 for embed and output weights. Please provide any feedback of differences. Very high quality, near perfect, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q6_K.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q6_K.gguf) | Q6_K | 14.06GB | Very high quality, near perfect, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q5_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q5_K_L.gguf) | Q5_K_L | 12.37GB | *Experimental*, uses f16 for embed and output weights. Please provide any feedback of differences. High quality, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q5_K_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q5_K_M.gguf) | Q5_K_M | 11.85GB | High quality, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q5_K_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q5_K_S.gguf) | Q5_K_S | 11.14GB | High quality, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q4_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q4_K_L.gguf) | Q4_K_L | 10.91GB | *Experimental*, uses f16 for embed and output weights. Please provide any feedback of differences. Good quality, uses about 4.83 bits per weight, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf) | Q4_K_M | 10.36GB | Good quality, uses about 4.83 bits per weight, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q4_K_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q4_K_S.gguf) | Q4_K_S | 9.53GB | Slightly lower quality with more space savings, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ4_XS.gguf) | IQ4_XS | 8.57GB | Decent quality, smaller than Q4_K_S with similar performance, *recommended*. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q3_K_L.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q3_K_L.gguf) | Q3_K_L | 8.45GB | Lower quality but usable, good for low RAM availability. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q3_K_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q3_K_M.gguf) | Q3_K_M | 8.12GB | Even lower quality. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ3_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ3_M.gguf) | IQ3_M | 7.55GB | Medium-low quality, new method with decent performance comparable to Q3_K_M. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q3_K_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q3_K_S.gguf) | Q3_K_S | 7.48GB | Low quality, not recommended. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ3_XS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ3_XS.gguf) | IQ3_XS | 7.12GB | Lower quality, new method with decent performance, slightly better than Q3_K_S. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ3_XXS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ3_XXS.gguf) | IQ3_XXS | 6.96GB | Lower quality, new method with decent performance, comparable to Q3 quants. |
+| [DeepSeek-Coder-V2-Lite-Instruct-Q2_K.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-Q2_K.gguf) | Q2_K | 6.43GB | Very low quality but surprisingly usable. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ2_M.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ2_M.gguf) | IQ2_M | 6.32GB | Very low quality, uses SOTA techniques to also be surprisingly usable. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ2_S.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ2_S.gguf) | IQ2_S | 6.00GB | Very low quality, uses SOTA techniques to be usable. |
+| [DeepSeek-Coder-V2-Lite-Instruct-IQ2_XS.gguf](https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Lite-Instruct-IQ2_XS.gguf) | IQ2_XS | 5.96GB | Very low quality, uses SOTA techniques to be usable. |
 
 ## Downloading using huggingface-cli
 
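The README section after the table documents downloading with huggingface-cli. As an alternative sketch using the huggingface_hub Python API (the Q4_K_M pick and the local_dir name here are just examples), one of the quants listed above can be fetched like this:

```python
# Sketch: download a single quant from this repo with the huggingface_hub API.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF",
    filename="DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf",
    local_dir="DeepSeek-Coder-V2-Lite-Instruct-GGUF",  # optional; omit to use the HF cache
)
print(path)  # local path to the downloaded GGUF
```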