Support for diffusers model loading and fp16 variant loading
#1
by handsomesteve - opened
Hello, would it be possible to upload the unet, vae, text_encoder, text_encoder_2, tokenizer, and tokenizer_2 files to this repo, ideally the fp16 variant as well? It would definitely help speed up loading and inference time.
Thank you!
Hello, I have uploaded the diffusers version for v3, and all of my models are already in fp16.
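For anyone loading it with diffusers, here is a minimal sketch of loading the repo in half precision. The repo id below is a placeholder, not the actual one, and `variant="fp16"` is only relevant if a repo ships separately named `*.fp16.safetensors` weight files; since these weights are already fp16, passing `torch_dtype=torch.float16` should be enough.

```python
import torch
from diffusers import DiffusionPipeline

# Placeholder repo id: replace with the actual model repo on the Hub.
pipe = DiffusionPipeline.from_pretrained(
    "your-username/your-model-v3",
    torch_dtype=torch.float16,   # keep the weights in fp16 after loading
    # variant="fp16",            # only needed if the repo has *.fp16.safetensors variants
)
pipe.to("cuda")

image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=25,
).images[0]
image.save("out.png")
```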
thank you
handsomesteve changed discussion status to closed