MKLLM-7B-Instruct-exl2

EXL2 quants of trajkovnikola/MKLLM-7B-Instruct

The "main" branch contains only the measurement.json (which can be used for further conversions); download one of the other branches for the model weights:

- 6.5 bits per weight with 6 lm_head bits
- 5.0 bits per weight with 6 lm_head bits
- 4.25 bits per weight with 6 lm_head bits
- 3.5 bits per weight with 6 lm_head bits

Download instructions

With git:

```shell
git clone --single-branch --branch 5.0bpw-h6 https://huggingface.co/martinkozle/MKLLM-7B-Instruct-exl2 MKLLM-7B-Instruct-exl2-5.0bpw-h6
```

With huggingface hub:

To download a specific branch, use the --revision parameter.

```shell
huggingface-cli download martinkozle/MKLLM-7B-Instruct-exl2 --revision 5.0bpw-h6 --local-dir MKLLM-7B-Instruct-exl2-5.0bpw-h6 --local-dir-use-symlinks False
```
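If you want to script downloads for several quants, the command above can be assembled programmatically. The helper below is a sketch, not part of this repo; it assumes every quant branch follows the `<bpw>bpw-h<lm_head bits>` naming pattern seen in the examples above.

```python
def hf_download_cmd(repo_id: str, bpw: str, head_bits: int = 6) -> str:
    """Build a huggingface-cli download command for one quant branch.

    Assumes branches are named "<bpw>bpw-h<head_bits>" (e.g. "5.0bpw-h6"),
    matching the branch names listed in this model card.
    """
    branch = f"{bpw}bpw-h{head_bits}"
    # Mirror the naming used above: <repo name>-<branch> as the local dir.
    local_dir = f"{repo_id.split('/')[-1]}-{branch}"
    return (
        f"huggingface-cli download {repo_id} "
        f"--revision {branch} --local-dir {local_dir}"
    )

print(hf_download_cmd("martinkozle/MKLLM-7B-Instruct-exl2", "4.25"))
```

Running the printed command fetches the 4.25bpw-h6 branch into a matching local directory.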