legraphista committed
Commit 6ca4727 · verified · 1 parent: c3d8112

Upload README.md with huggingface_hub
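The commit message points to an upload made with `huggingface_hub`. As a rough illustration only (not the uploader's actual script; authentication is assumed to come from a locally configured HF token), a single-file push like this one can be done with the library's `upload_file` API:

```python
from huggingface_hub import HfApi

# Push one file to the model repo; the token is read from the local
# environment (e.g. after `huggingface-cli login`).
api = HfApi()
api.upload_file(
    path_or_fileobj="README.md",
    path_in_repo="README.md",
    repo_id="legraphista/Llama-Guard-3-8B-IMat-GGUF",
    commit_message="Upload README.md with huggingface_hub",
)
```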

Files changed (1)
1. README.md (+2 -2)
README.md CHANGED
@@ -234,7 +234,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/
  | [Llama-Guard-3-8B.Q8_0.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No
  | [Llama-Guard-3-8B.Q6_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
  | [Llama-Guard-3-8B.Q4_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No
- | Llama-Guard-3-8B.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Llama-Guard-3-8B.Q3_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Llama-Guard-3-8B.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -


@@ -251,7 +251,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/
  | Llama-Guard-3-8B.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
  | Llama-Guard-3-8B.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
  | Llama-Guard-3-8B.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
- | Llama-Guard-3-8B.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Llama-Guard-3-8B.Q3_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Llama-Guard-3-8B.Q3_K_L | Q3_K_L | - | ⏳ Processing | 🟢 IMatrix | -
  | Llama-Guard-3-8B.Q3_K_S | Q3_K_S | - | ⏳ Processing | 🟢 IMatrix | -
  | Llama-Guard-3-8B.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | -
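
With this change the Q3_K quant is marked ✅ Available in the table. As a minimal sketch (standard `huggingface_hub` download call; cache location and revision are left at their defaults), the newly listed file could be fetched like this:

```python
from huggingface_hub import hf_hub_download

# Download the Q3_K quant listed as available in the table above;
# returns the local path of the cached file.
path = hf_hub_download(
    repo_id="legraphista/Llama-Guard-3-8B-IMat-GGUF",
    filename="Llama-Guard-3-8B.Q3_K.gguf",
)
print(path)
```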