How to run vLLM on a Mac M1?

#1
by acql - opened

I had a problem running a model with vLLM on my Mac M1.

vLLM is only supported on Linux — see the requirements: https://docs.vllm.ai/en/latest/getting_started/installation.html#requirements

If you want to run a 4-bit model on a Mac, use a GGUF model with llama.cpp or one of the MLX libraries (e.g. mlx-lm) instead.
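A minimal sketch of both routes (the model names and file paths below are placeholders for illustration — substitute any 4-bit GGUF file or MLX-quantized model you actually have):

```shell
# Option 1: llama.cpp — runs GGUF models natively on Apple silicon.
# Install via Homebrew, then point llama-cli at your quantized GGUF file.
brew install llama.cpp
llama-cli -m ./your-model-q4_k_m.gguf -p "Hello, world" -n 64

# Option 2: mlx-lm — Apple's MLX framework for on-device inference.
# The model name below is an example; any MLX-quantized model should work.
pip install mlx-lm
mlx_lm.generate --model mlx-community/Mistral-7B-Instruct-v0.2-4bit \
    --prompt "Hello, world"
```

Both run entirely on the M1's unified memory; llama.cpp takes a local GGUF file, while mlx-lm can pull a quantized model from the Hugging Face Hub.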
