Support for vllm/lmdeploy?
#1 opened by solankibhargav
Thanks for building and sharing this model. Is there support for a faster inference engine such as lmdeploy or vllm? (I tried both and they currently fail.)
I believe TGI should have support for LLaVA models: https://huggingface.co/docs/text-generation-inference/en/basic_tutorials/visual_language_models
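If TGI does work for this checkpoint, a minimal sketch of querying such a server with the `huggingface_hub` client might look like the following. The endpoint URL, image URL, and inline-image prompt format are assumptions taken from the linked TGI visual-language-model tutorial, not something verified against this particular model:

```python
# Sketch only: assumes a TGI server is already running locally with a
# LLaVA-style checkpoint and listening on port 8080, as in the linked tutorial.
from huggingface_hub import InferenceClient

client = InferenceClient("http://127.0.0.1:8080")  # assumed local TGI endpoint

# TGI's VLM tutorial passes images inline via markdown image syntax in the prompt.
image_url = (
    "https://huggingface.co/datasets/huggingface/documentation-images"
    "/resolve/main/transformers/rabbit.png"
)
prompt = f"![]({image_url})What is this a picture of?\n\n"

# Stream generated tokens back from the server.
for token in client.text_generation(prompt, max_new_tokens=40, stream=True):
    print(token, end="")
```

If that works, the same endpoint can be queried from any HTTP client; the question of vllm/lmdeploy support for this specific architecture would still need an answer from the model authors.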