vLLM does not support MistralForCausalLM

#11 by narenzen - opened

```
ValueError: Model architectures ['MistralForCausalLM'] are not supported for now. Supported architectures: ['AquilaModel', 'BaiChuanForCausalLM', 'BaichuanForCausalLM', 'BloomForCausalLM', 'FalconForCausalLM', 'GPT2LMHeadModel', 'GPTBigCodeForCausalLM', 'GPTJForCausalLM', 'GPTNeoXForCausalLM', 'InternLMForCausalLM', 'LlamaForCausalLM', 'LLaMAForCausalLM', 'MPTForCausalLM', 'OPTForCausalLM', 'QWenLMHeadModel', 'RWForCausalLM']
```
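For context, this is the error vLLM raises when pointed at a Mistral checkpoint on a version that predates Mistral support. A minimal repro sketch, assuming something like the following was run (the exact script isn't shown in this thread, and the model ID is an illustrative assumption):

```python
# Sketch of the kind of call that triggers the ValueError above
# on vLLM versions without Mistral support.
# The model ID "mistralai/Mistral-7B-v0.1" is an assumption for illustration.
from vllm import LLM

llm = LLM(model="mistralai/Mistral-7B-v0.1")  # raises ValueError: unsupported architecture
```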

Mistral AI org

Hey, it will be in the next release! https://github.com/vllm-project/vllm/issues/1089
We were a bit slow addressing some issues with our PR yesterday :)
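Once a vLLM release with Mistral support is out, loading should work through the standard offline inference API. A minimal sketch, assuming support has landed after upgrading (`pip install -U vllm`); the model ID and sampling settings are illustrative assumptions:

```python
# Sketch assuming a vLLM release that includes Mistral support
# (tracked in https://github.com/vllm-project/vllm/issues/1089).
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Mistral-7B-v0.1")  # assumed model ID, for illustration
params = SamplingParams(temperature=0.7, max_tokens=64)

# Generate a completion for a single prompt and print it.
outputs = llm.generate(["The capital of France is"], params)
print(outputs[0].outputs[0].text)
```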

timlacroix changed discussion status to closed
