support mllm? such as llava. #77
opened by wangrongsheng
Do you support multimodal LLMs (MLLMs), such as LLaVA?
Hi @wangrongsheng - at the moment we only support architectures and models that llama.cpp supports. If llama.cpp does not support a given model or quantisation, we do not support it here either.
Cheers!
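As a quick local sanity check (a minimal sketch, not part of this Space), you can inspect a model's declared architecture in its `config.json` before attempting a conversion; whether it actually converts still depends on llama.cpp's converter accepting that architecture. The repo id below is only an example.

```python
# Minimal sketch: look at a model's declared architecture before trying a
# GGUF conversion. This does NOT by itself guarantee llama.cpp support;
# the authoritative check is whether llama.cpp's convert_hf_to_gguf.py
# accepts the architecture.
import json
from huggingface_hub import hf_hub_download

repo_id = "liuhaotian/llava-v1.5-7b"  # example repo, swap in your own

config_path = hf_hub_download(repo_id=repo_id, filename="config.json")
with open(config_path) as f:
    config = json.load(f)

print("Declared architectures:", config.get("architectures"))
# e.g. ['LlavaLlamaForCausalLM'] -- if llama.cpp's converter does not handle
# this architecture, the Space cannot quantise the model either.
```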
reach-vb changed discussion status to closed