legraphista committed
Commit f30fbb5 • 1 Parent(s): 0a1cf13

Upload README.md with huggingface_hub
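
The commit message indicates the README was pushed with the huggingface_hub client. For reference, below is a minimal sketch of how such an upload is typically done; the exact call used for this commit is not recorded, so the repo_type and token handling shown here are assumptions.

```python
from huggingface_hub import HfApi

api = HfApi()  # picks up the cached login / HF_TOKEN by default
api.upload_file(
    path_or_fileobj="README.md",   # local file to push
    path_in_repo="README.md",      # destination path inside the repo
    repo_id="legraphista/xLAM-8x7b-r-IMat-GGUF",
    repo_type="model",             # assumed; this is a model repo
    commit_message="Upload README.md with huggingface_hub",
)
```

A call like this produces a single commit on the default branch containing only the updated README.md, which matches the "Files changed (1)" summary below.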

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -82,7 +82,7 @@ Link: [here](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/
  | [xLAM-8x7b-r.Q5_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q5_K.gguf) | Q5_K | 33.23GB | ✅ Available | ⚪ Static | 📦 No
  | [xLAM-8x7b-r.Q5_K_S.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q5_K_S.gguf) | Q5_K_S | 32.23GB | ✅ Available | ⚪ Static | 📦 No
  | [xLAM-8x7b-r.Q4_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q4_K.gguf) | Q4_K | 28.45GB | ✅ Available | 🟢 IMatrix | 📦 No
- | xLAM-8x7b-r.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | -
+ | [xLAM-8x7b-r.Q4_K_S.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q4_K_S.gguf) | Q4_K_S | 26.75GB | ✅ Available | 🟢 IMatrix | 📦 No
  | xLAM-8x7b-r.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
  | xLAM-8x7b-r.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | [xLAM-8x7b-r.Q3_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q3_K.gguf) | Q3_K | 22.55GB | ✅ Available | 🟢 IMatrix | 📦 No