legraphista committed
Commit 2546beb · verified · 1 parent: c6bb2f5

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -248,7 +248,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/
  | [Llama-Guard-3-8B.Q5_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q5_K.gguf) | Q5_K | 5.73GB | ✅ Available | ⚪ Static | 📦 No
  | [Llama-Guard-3-8B.Q5_K_S.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q5_K_S.gguf) | Q5_K_S | 5.60GB | ✅ Available | ⚪ Static | 📦 No
  | [Llama-Guard-3-8B.Q4_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No
- | Llama-Guard-3-8B.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Llama-Guard-3-8B.Q4_K_S.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q4_K_S.gguf) | Q4_K_S | 4.69GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Llama-Guard-3-8B.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
  | Llama-Guard-3-8B.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | [Llama-Guard-3-8B.Q3_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No
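
For context, once a quant flips to ✅ Available in the table above, it can be fetched with `huggingface_hub` (the library named in the commit message). A minimal sketch, assuming the standard `hf_hub_download` API and the newly added Q4_K_S file; this snippet is illustrative and not part of the commit:

```python
# Minimal sketch (not part of this commit): download the newly available
# Q4_K_S quant with huggingface_hub's hf_hub_download.
from huggingface_hub import hf_hub_download

# repo_id and filename are taken from the table above; adjust as needed.
path = hf_hub_download(
    repo_id="legraphista/Llama-Guard-3-8B-IMat-GGUF",
    filename="Llama-Guard-3-8B.Q4_K_S.gguf",
)
print(path)  # local path to the ~4.69GB GGUF file
```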