Could you upload Qwen2-VL-2B ONNX?
#1 opened by meixitu
Thanks for your great work.
I tried to use optimum-cli to export the model, but it failed.
Could you upload Qwen2-VL-2B-Instruct in ONNX format?
Thanks
Hi @meixitu , I used https://github.com/wangzhaode/llm-export to export it. Optimum doesn't support the text-image-to-text task, so I hit the same issue. That exporter should work on the 2B model too. I'm now trying to get it working with transformers.js, which expects a slightly different format.
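For reference, a rough sketch of how the export might look with that tool. This assumes the `llmexport` package from the wangzhaode/llm-export repo and a locally downloaded copy of the model; the exact flags may differ by version, so check the repo's README before running.

```shell
# Install the exporter (assumption: published on PyPI as "llmexport")
pip install llmexport

# Download the model locally first, e.g. with huggingface-cli
huggingface-cli download Qwen/Qwen2-VL-2B-Instruct --local-dir Qwen2-VL-2B-Instruct

# Export to ONNX; --path points at the local model directory
llmexport --path Qwen2-VL-2B-Instruct --export onnx
```

If the CLI entry point isn't available in your version, the repo also documents running the exporter as a Python script; see its README for the current invocation.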