---
license: gpl-3.0
inference: false
---

# CausalLM 34B β

ExLlamaV2 6.85 bpw quants of https://huggingface.co/CausalLM/34b-beta
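
The quants can be loaded with the ExLlamaV2 Python API. Below is a minimal sketch, not a definitive recipe: the local directory path is a placeholder, and the calls follow ExLlamaV2's example scripts, so exact details may differ between library versions.

```python
# Minimal sketch: load a 6.85 bpw EXL2 quant with the exllamav2 Python API.
# The model_dir path is a placeholder; adjust it to wherever the quant is downloaded.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/CausalLM-34b-beta-6.85bpw-exl2"  # placeholder local directory
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # lazy cache so autosplit can size it per GPU
model.load_autosplit(cache)                # split the weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

print(generator.generate_simple("Hello, my name is", settings, 64))
```

At 6.85 bits per weight, a 34B model occupies roughly 30 GB for the weights alone, so the autosplit loading shown above (or a single GPU with sufficient VRAM) is typically required.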