6.5 bpw EXL2 quant of Acolyte-22B
Acolyte-22B
A LoRA trained on an assortment of datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at a weight of 0.5. Decent enough for its size. See the LoRA repository for dataset details.
Use the Mistral V2 & V3 template.
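For reference, the Mistral V2/V3 instruct format wraps each user turn in `[INST]` tags after the BOS token. A minimal prompt-building sketch (the exact whitespace convention is an assumption based on common Mistral chat templates; when in doubt, rely on the tokenizer's bundled chat template instead):

```python
def format_mistral_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user turn in the Mistral V2/V3 instruct template.

    Assumption: no space after <s>, single spaces inside the [INST] tags,
    and the system prompt (if any) prepended to the first user message.
    """
    if system_prompt:
        user_message = f"{system_prompt}\n\n{user_message}"
    return f"<s>[INST] {user_message} [/INST]"

prompt = format_mistral_prompt("Write a haiku about autumn.")
print(prompt)
```

If your inference frontend (e.g. TabbyAPI or text-generation-webui) already applies a chat template, send the raw message instead to avoid double-wrapping.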
Base model: rAIfle/Acolyte-22B