LoRA doesn't work with the fp8 version of FLUX

#176
by xzt1111 - opened

Hi, I tried the fp8 version of FLUX with diffusers, and it is amazing. However, LoRA doesn't seem to work: with or without a LoRA loaded, the fp8 version outputs the same pictures. Here's my code; can someone help me?

import torch
from diffusers import DiffusionPipeline, FluxTransformer2DModel
from optimum.quanto import freeze, qfloat8, quantize

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the fp8 single-file checkpoint, then quantize and freeze the transformer with optimum.quanto
transformer = FluxTransformer2DModel.from_single_file("https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-dev-fp8.safetensors", torch_dtype=torch.bfloat16)
quantize(transformer, weights=qfloat8)
freeze(transformer)

# Build the FLUX.1-dev pipeline and swap in the quantized transformer
pipe = DiffusionPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipe.transformer = transformer
pipe = pipe.to(device)

prompt = "A blue jay standing on a large basket of rainbow macarons, disney style"
#prompt = "The portrait of a brazilian person"

# Baseline: fixed seed, no LoRA loaded
generator = torch.Generator(device="cpu").manual_seed(42)
image = pipe(prompt, generator=generator, guidance_scale=3.5).images[0]
image.save("no_lora.png")

# Load a LoRA adapter on top of the quantized transformer
pipe.load_lora_weights("XLabs-AI/flux-lora-collection", weight_name="disney_lora.safetensors")
#pipe.load_lora_weights("XLabs-AI/flux-RealismLora")

# Same seed, LoRA scale 1.0
generator = torch.Generator(device="cpu").manual_seed(42)
image = pipe(prompt, generator=generator, joint_attention_kwargs={"scale": 1}, guidance_scale=3.5).images[0]
image.save("lora_1_0.png")

# Same seed, LoRA scale 1.75
generator = torch.Generator(device="cpu").manual_seed(42)
image = pipe(prompt, generator=generator, joint_attention_kwargs={"scale": 1.75}, guidance_scale=3.5).images[0]
image.save("lora_1_75.png")

I'm running into the same issue. FLUX LoRAs currently don't work.

Even running the inference examples from popular adapters currently just outputs a no-LoRA image.
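
For reference, this is the kind of adapter example I mean, stripped down to a minimal sketch. It reuses the XLabs-AI/flux-lora-collection repository and the prompt from the code above and skips quantization entirely, so treat the file name and settings as illustrative rather than an official example:

import torch
from diffusers import DiffusionPipeline

# Standard (non-quantized) FLUX pipeline in bfloat16
pipe = DiffusionPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipe = pipe.to("cuda")

# Load the same LoRA used above, without quantizing or freezing anything first
pipe.load_lora_weights("XLabs-AI/flux-lora-collection", weight_name="disney_lora.safetensors")

prompt = "A blue jay standing on a large basket of rainbow macarons, disney style"
generator = torch.Generator(device="cpu").manual_seed(42)
image = pipe(prompt, generator=generator, guidance_scale=3.5).images[0]
image.save("reference_lora.png")

Even this variant gives me the same output as running without the LoRA.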

I have the same problem. Have you solved it yet?
