---
license: gpl-3.0
inference: false
---
ExLlamaV2 quants of https://huggingface.co/CausalLM/34b-beta at 6.85 bits per weight (bpw).
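
Below is a minimal loading sketch using the exllamav2 Python library, assuming the files from this repo have been downloaded to a local directory (the path is a placeholder, not part of this repo):

```python
# Minimal sketch: load the 6.85 bpw EXL2 quants with exllamav2 and run a short generation.
# The model directory below is a placeholder; point it at your local download of this repo.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./34b-beta-6.85bpw-exl2"  # placeholder local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate the KV cache lazily for autosplit loading
model.load_autosplit(cache)                # split weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Hello, my name is", settings, 128))
```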