How much GPU VRAM do I need to run this model?

#3
by catworld1212 - opened

How much GPU VRAM do I need to run this model? Can I run it with vLLM?
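For context, a rough first-pass estimate (a sketch, not an authoritative figure: it counts only the weights, assuming fp16/bf16 at 2 bytes per parameter, and ignores KV cache, activations, and the extra headroom vLLM preallocates):

```python
def estimate_weight_vram_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed just to hold the model weights.

    bytes_per_param: 2 for fp16/bf16, 1 for int8, 0.5 for 4-bit quantization.
    Real usage is higher: KV cache and activations add overhead on top.
    """
    return num_params * bytes_per_param / 1e9

# Example: a hypothetical 7B-parameter model in fp16/bf16
print(estimate_weight_vram_gb(7e9, 2))  # 14.0 GB for weights alone
```

In practice people often budget roughly 1.2x or more on top of the weight size for serving, and quantization (int8/4-bit) cuts the weight term proportionally.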
