- Paper: https://arxiv.org/abs/2411.05007
- Quantization Library: https://github.com/mit-han-lab/deepcompressor
- Inference Engine: https://github.com/mit-han-lab/nunchaku
- Website: https://hanlab.mit.edu/projects/svdquant
- Demo: https://svdquant.mit.edu/
- Blog: https://hanlab.mit.edu/blog/svdquant
- Base model: [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev)
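
For reference, the sketch below shows one way to apply a LoRA from this collection on top of the BF16 FLUX.1-dev base model using the standard diffusers `FluxPipeline` API. The `weight_name` is a placeholder; substitute the actual LoRA file from this repository. For quantized (SVDQuant) inference, use the nunchaku engine linked above instead.

```python
import torch
from diffusers import FluxPipeline

# Load the full-precision FLUX.1-dev base model that these LoRAs target.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

# Attach a LoRA from this collection.
# "example.safetensors" is a placeholder -- replace it with the file name
# of the LoRA you want to use from this repository.
pipe.load_lora_weights(
    "mit-han-lab/FLUX.1-dev-LoRA-Collections",
    weight_name="example.safetensors",
)

image = pipe(
    "a photo of a cat",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("cat.png")
```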