Is there a LoRA training script?
#10
by paulv - opened
Looking to fine-tune; are there any scripts for this?
Tried to load it up and train a LoRA, and it won't even fit on a single 80 GB GPU. wah.
Well, I figured out the VRAM issues. Here is a guide to train a LoRA; it uses Dev, but you just need to change MODEL_NAME to Schnell's.
https://github.com/bghira/SimpleTuner/blob/main/documentation/quickstart/FLUX.md
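For reference, here is a minimal sketch of the LoRA part outside SimpleTuner, using diffusers + peft: load the Schnell checkpoint instead of Dev and attach a small LoRA adapter to the transformer. The rank, alpha, and target modules below are illustrative assumptions, not values from the quickstart.

```python
import torch
from diffusers import FluxPipeline
from peft import LoraConfig

# Load Schnell instead of Dev; this is the only swap the guide needs (MODEL_NAME).
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
)

# Freeze the base transformer, then attach a small LoRA adapter on the
# attention projections. r / lora_alpha / target_modules are illustrative.
pipe.transformer.requires_grad_(False)
lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    init_lora_weights="gaussian",
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],
)
pipe.transformer.add_adapter(lora_config)

# Only the LoRA weights are trainable now.
n_trainable = sum(p.numel() for p in pipe.transformer.parameters() if p.requires_grad)
print(f"trainable LoRA parameters: {n_trainable:,}")
```

The actual training loop, data handling, and the VRAM-saving options are what the linked quickstart covers.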
@bghira
Did you manage to make LoRA training work for Flux Schnell?
As far as I understand from different threads (thread 1 & thread 2), the results were unsatisfactory.
Does anyone have an update, new findings, or anything else to report?
For example, has anyone used the ostris trick?
@Octree
Perhaps you have something to say about the question above...
No?