EXL2 quants of [failspy/llama-3-70B-Instruct-abliterated](https://huggingface.co/failspy/llama-3-70B-Instruct-abliterated)

Branches:

- `main` -- `measurement.json`
- `2.25b6h` -- 2.25bpw, 6bit lm_head
- `3.5b6h` -- 3.5bpw, 6bit lm_head
- `3.75b6h` -- 3.75bpw, 6bit lm_head
- `4b6h` -- 4bpw, 6bit lm_head
- `4.5b6h` -- 4.5bpw, 6bit lm_head
- `4.65b6h` -- 4.65bpw, 6bit lm_head
- `6b6h` -- 6bpw, 6bit lm_head
- `8b8h` -- 8bpw, 8bit lm_head

No rpcal for this one, just a normal batch of standard-settings EXL2 quants.
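
Since each quant level lives on its own branch, you need to pull a single revision rather than the whole repo. A minimal sketch using the `huggingface_hub` Python package, assuming you pick the branch that fits your VRAM; the `repo_id` shown is a placeholder for this quant repo, not a confirmed path:

```python
# Minimal sketch: download one EXL2 branch with huggingface_hub.
# The repo_id is a placeholder -- substitute the actual path of this quant repo,
# and set revision to whichever branch from the list above you want.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="your-username/llama-3-70B-Instruct-abliterated-exl2",  # placeholder
    revision="4.5b6h",  # branch name = quant level (4.5bpw, 6bit lm_head)
    local_dir="llama-3-70B-Instruct-abliterated-4.5b6h",
)
print(f"Downloaded to {local_path}")
```

The resulting folder can then be loaded directly by an EXL2-capable backend such as exllamav2.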