This is a smaller checkpoint of flux1-dev intended for ComfyUI users with limited VRAM (under 24 GB).
Both text encoders used by Flux are already bundled in this single safetensors file.
Use it with the Load Checkpoint node in ComfyUI.