
CUDA Out of Memory

#23 by xings19 - opened

My code:

import torch
from diffusers import StableDiffusionUpscalePipeline

pipeline = StableDiffusionUpscalePipeline.from_pretrained(model_id, revision="fp16", torch_dtype=torch.float16)
pipeline = pipeline.to("cuda")
upscaled_image = pipeline(prompt=prompt, image=in_img).images[0]

Error:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 112.50 GiB (GPU 0; 23.69 GiB total capacity; 3.62 GiB already allocated; 18.78 GiB free; 3.92 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

The input image in_img has size (320, 768).

If you have low GPU RAM available, make sure to add a pipe.enable_attention_slicing() call after sending the pipeline to CUDA for lower VRAM usage (at the cost of speed); see the sketch below.
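
A minimal sketch of how this fits into the snippet from the question, assuming the same model_id, prompt, and in_img variables:

import torch
from diffusers import StableDiffusionUpscalePipeline

pipeline = StableDiffusionUpscalePipeline.from_pretrained(
    model_id, revision="fp16", torch_dtype=torch.float16
)
pipeline = pipeline.to("cuda")
# Compute attention in slices rather than all at once, trading some speed
# for a much lower peak VRAM footprint
pipeline.enable_attention_slicing()
upscaled_image = pipeline(prompt=prompt, image=in_img).images[0]

Slicing the attention computation avoids materializing the upscaler's large attention maps in a single allocation, which is the most likely source of the oversized (112.50 GiB) allocation in the error above.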
