Update README.md
README.md
CHANGED
@@ -19,18 +19,21 @@ Only the second layers of both MLPs in each MMDiT block of SD3.5 Large models ha
-###
-- [sd3.5_large-q4_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_4_0.gguf): Exacty same size as q4_0,
-- [sd3.5_large-q4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q4_0.gguf): Same size as q4_k_4_0, Not recommended (use q4_k_4_0 instead)
@@ -47,6 +50,7 @@ Sorted by model size (Note that q4_0 and q4_k_4_0 are the exact same size)
## Files:

### Non-Linear Type:
- [sd3.5_large-iq4_nl.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-iq4_nl.gguf): Same size as q4_k_4_0 and q4_0, runs faster than q4_k_4_0 (on Vulkan at least), and provides image quality somewhat comparable to the q5_1 model. Recommended

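Why can a 4-bit non-linear type look closer to q5_1 while staying the same size as q4_0? A toy sketch of the idea (this is *not* ggml's actual iq4_nl code or value table): both quantizers spend 16 levels (4 bits) per weight, but a non-uniform grid can place those levels where the weights actually are.

```python
import numpy as np

# Toy comparison of a linear vs. a non-linear 4-bit grid. Illustrative only:
# ggml's real iq4_nl uses a fixed hand-tuned value table, not per-tensor Lloyd.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 1.0, 4096)  # stand-in for a tensor of weights

def snap(w, levels):
    """Snap each weight to the nearest representable level."""
    idx = np.abs(w[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx]

# Linear grid (q4_0-style): 16 evenly spaced levels over the weight range.
linear = np.linspace(weights.min(), weights.max(), 16)

# Non-linear grid: refine those same 16 levels with Lloyd iterations
# (1-D k-means), which packs levels densely where weights are common.
nonlinear = linear.copy()
for _ in range(25):
    idx = np.abs(weights[:, None] - nonlinear[None, :]).argmin(axis=1)
    for j in range(16):
        cluster = weights[idx == j]
        if cluster.size:
            nonlinear[j] = cluster.mean()

mse_linear = np.mean((weights - snap(weights, linear)) ** 2)
mse_nonlinear = np.mean((weights - snap(weights, nonlinear)) ** 2)
print(f"linear grid MSE:     {mse_linear:.5f}")
print(f"non-linear grid MSE: {mse_nonlinear:.5f}")  # lower, at identical size
```

Both grids cost exactly 4 bits per weight to index, so the file size is unchanged; only the reconstruction error drops.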
### Mixed Types:

- [sd3.5_large-q2_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q2_k_4_0.gguf): Smallest quantization yet. Use this if you can't afford anything bigger.
- [sd3.5_large-q3_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q3_k_4_0.gguf): Degraded, but usable at a high step count.
- [sd3.5_large-q4_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_4_0.gguf): Exactly the same size as q4_0 and iq4_nl; I recommend using iq4_nl instead.
- [sd3.5_large-q4_k_4_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_4_1.gguf): Smaller than q4_1, with comparable degradation. Recommended
- [sd3.5_large-q4_k_5_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_5_0.gguf): Smaller than q5_0, with comparable degradation. Already very close to the original f16. Recommended

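The mixed-type names above (q2_k_4_0, q4_k_5_0, …) pair two quantization types per file: a primary type for most tensors, plus a second type for the tensors this README singles out (the second layers of both MLPs in each MMDiT block). A hypothetical sketch of such a per-tensor selection rule — the tensor names and the rule itself are illustrative guesses, not sdcpp's actual code:

```python
import re

# Hypothetical per-tensor type picker for a mixed file like "q2_k_4_0":
# primary type for most tensors, secondary type for the second linear layer
# of each MLP in the MMDiT (joint) blocks. Names are illustrative.
PRIMARY, SECONDARY = "q2_k", "q4_0"

def pick_quant_type(tensor_name: str) -> str:
    # e.g. "joint_blocks.3.x_block.mlp.fc2.weight" = second MLP layer
    if re.search(r"joint_blocks\.\d+\.\w+\.mlp\.fc2\.weight$", tensor_name):
        return SECONDARY
    return PRIMARY

print(pick_quant_type("joint_blocks.3.x_block.mlp.fc2.weight"))        # q4_0
print(pick_quant_type("joint_blocks.3.x_block.mlp.fc1.weight"))        # q2_k
print(pick_quant_type("joint_blocks.3.context_block.mlp.fc2.weight"))  # q4_0
```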
### Legacy types:

- [sd3.5_large-q4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q4_0.gguf): Same size as q4_k_4_0. Not recommended (use iq4_nl or q4_k_4_0 instead)
- [sd3.5_large-q4_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q4_1.gguf): Not recommended (q4_k_4_1 is better and smaller)
- [sd3.5_large-q5_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q5_0.gguf): Barely better, and bigger, than q4_k_5_0
- [sd3.5_large-q5_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q5_1.gguf): Better and bigger than q5_0
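The size claims in the lists above ("same size as q4_0", "smaller than q4_1", …) can be sanity-checked from the per-block layouts of ggml's non-k quantization types. The byte counts below are how I understand ggml's block structs; treat them as an assumption, not authoritative:

```python
# Back-of-the-envelope bits-per-weight for ggml's non-k block formats
# (assumed layouts, each block holding 32 weights):
#   q4_0:   fp16 scale (2 B) + 32 x 4-bit quants (16 B)             = 18 B
#   q4_1:   fp16 scale + fp16 min (4 B) + 16 B                      = 20 B
#   q5_0:   fp16 scale (2 B) + 4 B of 5th bits + 16 B               = 22 B
#   q5_1:   fp16 scale + fp16 min (4 B) + 4 B of 5th bits + 16 B    = 24 B
#   iq4_nl: fp16 scale (2 B) + 16 B of 4-bit table indices          = 18 B
block_bytes = {"q4_0": 18, "q4_1": 20, "q5_0": 22, "q5_1": 24, "iq4_nl": 18}
bpw = {name: nbytes * 8 / 32 for name, nbytes in block_bytes.items()}
for name, bits in bpw.items():
    print(f"{name}: {bits} bits/weight")
# q4_0 and iq4_nl come out identical (4.5 bpw), which is why the q4_0,
# q4_k_4_0 and iq4_nl files all end up the exact same size.
```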

| q3_k_4_0 | ![q3_k_4_0](Images/q3_k_4_0.png) | ![q3_k_4_0](Images/1_q3_k_4_0.png) | ![q3_k_4_0](Images/2_q3_k_4_0.png) |
| q4_0 | ![q4_0](Images/q4_0.png) | ![q4_0](Images/1_q4_0.png) | ![q4_0](Images/2_q4_0.png) |
| q4_k_4_0 | ![q4_k_4_0](Images/q4_k_4_0.png) | ![q4_k_4_0](Images/1_q4_k_4_0.png) | ![q4_k_4_0](Images/2_q4_k_4_0.png) |
| iq4_nl | ![iq4_nl](Images/iq4_nl.png) | ![iq4_nl](Images/1_iq4_nl.png) | ![iq4_nl](Images/2_iq4_nl.png) |
| q4_k_4_1 | ![q4_k_4_1](Images/q4_k_4_1.png) | ![q4_k_4_1](Images/1_q4_k_4_1.png) | ![q4_k_4_1](Images/2_q4_k_4_1.png) |
| q4_1 | ![q4_1](Images/q4_1.png) | ![q4_1](Images/1_q4_1.png) | ![q4_1](Images/2_q4_1.png) |
| q4_k_5_0 | ![q4_k_5_0](Images/q4_k_5_0.png) | ![q4_k_5_0](Images/1_q4_k_5_0.png) | ![q4_k_5_0](Images/2_q4_k_5_0.png) |