---
license: apache-2.0
---

This model was quantized with the AutoAWQ fork for LLaVA-v1.6: https://github.com/WanBenLe/AutoAWQ-with-llava-v1.6.git

The source model is `llava-hf/llava-v1.6-mistral-7b-hf`.