How much VRAM is required to run inference?

#2
by valdanito - opened

I use an A100 40 GB machine and still get OOM.

I use an A100 80 GB, and ~75 GB are occupied during inference.
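As a back-of-the-envelope check (the thread doesn't name the model, so the numbers below are illustrative assumptions): weight memory alone is roughly parameter count times bytes per parameter, and activations plus KV cache add overhead on top. A minimal sketch:

```python
def estimate_weight_vram_gb(n_params_billion: float, bytes_per_param: int = 2) -> float:
    """Rough VRAM needed just to hold the weights, in GiB.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32, 1 for int8.
    Actual usage during inference is higher (activations, KV cache,
    CUDA context), so treat this as a lower bound.
    """
    return n_params_billion * 1e9 * bytes_per_param / 2**30

# Example: a hypothetical 7B model in fp16 needs ~13 GiB for weights alone.
print(round(estimate_weight_vram_gb(7)))
```

By this estimate, ~75 GB of usage during inference is consistent with a model whose fp16 weights alone exceed 40 GB, which would explain the OOM on the 40 GB card; loading in 8-bit or splitting across GPUs would be the usual workarounds.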