legraphista committed
Commit 49a9dfb · verified · 1 Parent(s): 5712619

Upload README.md with huggingface_hub
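The commit message indicates the file was pushed with the huggingface_hub library. A minimal sketch of such an upload, assuming `HfApi.upload_file` with the default token from `huggingface-cli login`; the exact script used for this commit is not part of the diff:

```python
from huggingface_hub import HfApi

api = HfApi()  # picks up the token from HF_TOKEN or a prior `huggingface-cli login`
api.upload_file(
    path_or_fileobj="README.md",   # local file to push
    path_in_repo="README.md",      # destination path inside the repo
    repo_id="legraphista/Llama-Guard-3-8B-IMat-GGUF",
    commit_message="Upload README.md with huggingface_hub",
)
```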

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -233,7 +233,7 @@ Link: [here](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | [Llama-Guard-3-8B.Q8_0.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-Guard-3-8B.Q6_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
-| Llama-Guard-3-8B.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -
+| [Llama-Guard-3-8B.Q4_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No
 | Llama-Guard-3-8B.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
 | Llama-Guard-3-8B.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
 
@@ -242,12 +242,12 @@ Link: [here](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/
 | Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
 | -------- | ---------- | --------- | ------ | ------------ | -------- |
 | [Llama-Guard-3-8B.BF16.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.BF16.gguf) | BF16 | 16.07GB | ✅ Available | ⚪ Static | 📦 No
-| Llama-Guard-3-8B.FP16 | F16 | - | ⏳ Processing | ⚪ Static | -
+| [Llama-Guard-3-8B.FP16.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.FP16.gguf) | F16 | 16.07GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-Guard-3-8B.Q8_0.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-Guard-3-8B.Q6_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-Guard-3-8B.Q5_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q5_K.gguf) | Q5_K | 5.73GB | ✅ Available | ⚪ Static | 📦 No
 | [Llama-Guard-3-8B.Q5_K_S.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q5_K_S.gguf) | Q5_K_S | 5.60GB | ✅ Available | ⚪ Static | 📦 No
-| Llama-Guard-3-8B.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | -
+| [Llama-Guard-3-8B.Q4_K.gguf](https://huggingface.co/legraphista/Llama-Guard-3-8B-IMat-GGUF/blob/main/Llama-Guard-3-8B.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No
 | Llama-Guard-3-8B.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
 | Llama-Guard-3-8B.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
 | Llama-Guard-3-8B.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
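
With the Q4_K and FP16 quants now marked available, they can be fetched like any other file in the repo. A minimal download sketch using `hf_hub_download` (the repo_id and filename come from the table above; cache location is the library default, not something this commit specifies):

```python
from huggingface_hub import hf_hub_download

# Downloads the file into the local Hugging Face cache and returns its path.
gguf_path = hf_hub_download(
    repo_id="legraphista/Llama-Guard-3-8B-IMat-GGUF",
    filename="Llama-Guard-3-8B.Q4_K.gguf",
)
print(gguf_path)
```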