Did anyone figure out how to run it on low VRAM, like 15-25 GB?
#21 opened by zohadev
I just wonder if it is possible to run it on low-VRAM devices, because it's not cheap to run it on an A100 GPU.
I just use ComfyUI and my 3090 is enough. At low resolutions it uses below 15 GB for sure. I use fp16 though, but I don't know if that matters.
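fp16 does matter for VRAM: it halves the memory the model weights occupy compared to fp32. A back-of-the-envelope sketch (the parameter count below is an assumed round number for illustration, not an official figure for this model):

```python
# Rough VRAM estimate for model weights alone (activations and the
# latent/video buffers add more on top of this).

def weight_vram_gib(n_params: float, bytes_per_param: int) -> float:
    """Memory needed just to hold the weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

N = 1.5e9  # assumed parameter count, for illustration only
fp32 = weight_vram_gib(N, 4)  # 4 bytes per weight
fp16 = weight_vram_gib(N, 2)  # 2 bytes per weight
print(f"fp32 weights: {fp32:.1f} GiB, fp16 weights: {fp16:.1f} GiB")
# prints "fp32 weights: 5.6 GiB, fp16 weights: 2.8 GiB"
```

So on the same card, fp16 leaves roughly an extra weights-worth of VRAM free for activations, which is why lower resolutions fit on consumer GPUs.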
Here is a link to my thread where I have posted my solution to the issue: https://huggingface.co/stabilityai/stable-video-diffusion-img2vid-xt/discussions/23
Runs on an 8 GB RTX 3070, I tested it at 512x512.
You can try this repo:
https://github.com/GaussianGuaicai/generative-models