legraphista committed
Commit fe3321b • 1 Parent(s): 22349a0
Upload README.md with huggingface_hub
README.md
CHANGED
@@ -242,8 +242,8 @@ Link: [here](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-
  | [Meta-Llama-3.1-8B-Instruct.Q8_0.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q8_0.gguf) | Q8_0 | 8.54GB | ✅ Available | ⚪ Static | 📦 No
  | [Meta-Llama-3.1-8B-Instruct.Q6_K.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q6_K.gguf) | Q6_K | 6.60GB | ✅ Available | ⚪ Static | 📦 No
  | [Meta-Llama-3.1-8B-Instruct.Q4_K.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q4_K.gguf) | Q4_K | 4.92GB | ✅ Available | 🟢 IMatrix | 📦 No
- | Meta-Llama-3.1-8B-Instruct.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
- | Meta-Llama-3.1-8B-Instruct.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Meta-Llama-3.1-8B-Instruct.Q3_K.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No
+ | [Meta-Llama-3.1-8B-Instruct.Q2_K.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q2_K.gguf) | Q2_K | 3.18GB | ✅ Available | 🟢 IMatrix | 📦 No


  ### All Quants

@@ -259,14 +259,14 @@ Link: [here](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-
  | Meta-Llama-3.1-8B-Instruct.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
- | Meta-Llama-3.1-8B-Instruct.Q3_K | Q3_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Meta-Llama-3.1-8B-Instruct.Q3_K.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q3_K.gguf) | Q3_K | 4.02GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Meta-Llama-3.1-8B-Instruct.Q3_K_L | Q3_K_L | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.Q3_K_S | Q3_K_S | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | -
- | Meta-Llama-3.1-8B-Instruct.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Meta-Llama-3.1-8B-Instruct.Q2_K.gguf](https://huggingface.co/legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF/blob/main/Meta-Llama-3.1-8B-Instruct.Q2_K.gguf) | Q2_K | 3.18GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Meta-Llama-3.1-8B-Instruct.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | -
  | Meta-Llama-3.1-8B-Instruct.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | -
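For anyone pulling one of the quants this commit marks as available, here is a minimal sketch of fetching a file with the `huggingface_hub` library named in the commit message; the repo id and filename are taken from the table above, and the returned path points into your local Hugging Face cache.

```python
from huggingface_hub import hf_hub_download

# Download a single GGUF quant from the repo listed in the table above.
# hf_hub_download returns the local path of the cached file.
path = hf_hub_download(
    repo_id="legraphista/Meta-Llama-3.1-8B-Instruct-IMat-GGUF",
    filename="Meta-Llama-3.1-8B-Instruct.Q3_K.gguf",  # any ✅ Available filename from the table works
)
print(path)
```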