Error when attempting to run 0.9.1 model

#46
by jdc4429 - opened

I get this error when I attempt to run the newer, smaller 0.9.1 model... The model is only 5.32 GB when downloaded, but it's listed as 5.72 GB in the files section. The 0.9.0 model works fine...

Error(s) in loading state_dict for VideoVAE:
size mismatch for decoder.conv_in.conv.weight: copying a param with shape torch.Size([1024, 128, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([512, 128, 3, 3, 3]).
size mismatch for decoder.conv_in.conv.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for decoder.up_blocks.0.res_blocks.0.conv1.conv.weight: copying a param with shape torch.Size([1024, 1024, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([512, 512, 3, 3, 3]).
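For anyone hitting this, one way to see whether the problem is in the checkpoint file or in the instantiated model is to dump the stored tensor shapes directly. This is only a minimal sketch, assuming the checkpoint is a .safetensors file; the filename is hypothetical and the keys may carry a prefix depending on how the checkpoint was packaged.

```python
# Minimal sketch: print the decoder.conv_in shapes stored in the checkpoint
# so they can be compared against the shapes the VideoVAE class expects.
from safetensors import safe_open

CKPT_PATH = "ltx-video-2b-v0.9.1.safetensors"  # hypothetical filename

with safe_open(CKPT_PATH, framework="pt", device="cpu") as f:
    for key in f.keys():
        if "decoder.conv_in" in key:
            print(key, tuple(f.get_tensor(key).shape))

# If the weight prints as (1024, 128, 3, 3, 3) while the model reports
# (512, 128, 3, 3, 3), the file itself is fine and the loader/node config
# is simply built for the old 0.9.0 decoder.
```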

It works in ComfyUI if you use the default Checkpoint Loader and select the VAE from the Checkpoint Loader as well.

Thank you. I did try switching out the checkpoint loader, which gave another error; I did not try adding the VAE from there.

jdc4429 changed discussion status to closed
jdc4429 changed discussion status to open

Still not working. I also updated everything just in case and still get the same errors as before. I get an error with the VAE using the standard Load Checkpoint node, and an error with the sampler if I try using the LTX one...

I'm having the same issue, and it can be reproduced with the native i2v workflow. 0.9.0 worked fine.
https://comfyanonymous.github.io/ComfyUI_examples/ltxv/

I have a bunch of different workflows... I was only able to get it working with one of them, though.

jdc4429 changed discussion status to closed

This should not be closed. The model configuration does not match the checkpoint. I think this has something to do with the conversion from fp32 to bf16...
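For what it's worth, a dtype conversion by itself should not be able to produce this kind of mismatch; casting from fp32 to bf16 halves the per-element storage but leaves tensor shapes untouched. A tiny illustrative check, not tied to this checkpoint:

```python
import torch

# Casting fp32 -> bf16 changes the dtype and storage size, never the shape,
# so a [1024, ...] vs [512, ...] mismatch has to come from the architecture.
w_fp32 = torch.randn(1024, 128, 3, 3, 3, dtype=torch.float32)
w_bf16 = w_fp32.to(torch.bfloat16)

print(w_fp32.shape == w_bf16.shape)                   # True
print(w_fp32.element_size(), w_bf16.element_size())   # 4 2
```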

Lightricks org

The new VAE decoder has a new architecture with more parameters, and it requires updating the nodes.
I understand that Comfy just updated the nodes to support the new version. Can you try again with the updated nodes?

Thank you, it worked. Sorry for the inconvenience; I had updated it yesterday, and it turns out there was a new update right after.
