lorraine2 posted an update Dec 4, 2024
New NVIDIA paper: ⚡ Multi-student Diffusion Distillation for Better One-step Generators ⚡

Do you want your diffusion model to (a) run in a single step, (b) run as a smaller model, and (c) produce higher-quality outputs, all at the same time? Check out our multi-student distillation (MSD) method, which is simple and applicable to most diffusion models! The only catch is that we now have to distill (and store) a mixture of expert student generators.
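To make the idea concrete, here is a minimal, hypothetical PyTorch sketch (not the authors' implementation): a pool of small one-step student generators is kept, and each input is routed to exactly one specialized student, so sampling is a single forward pass through a small network. All class names, tensor shapes, and the index-based routing below are illustrative assumptions.

```python
# Illustrative sketch of multi-student distillation at inference time.
# Every module and shape here is a toy placeholder, not the paper's code.
import torch
import torch.nn as nn


class OneStepStudent(nn.Module):
    """Toy stand-in for a distilled one-step generator (noise -> image)."""

    def __init__(self, latent_dim: int = 64, image_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 512),
            nn.SiLU(),
            nn.Linear(512, image_dim),
        )

    def forward(self, noise: torch.Tensor) -> torch.Tensor:
        # A single forward pass replaces the teacher's iterative sampling loop.
        return self.net(noise)


class MultiStudentGenerator(nn.Module):
    """Mixture of expert students: route each input to one small student."""

    def __init__(self, num_students: int = 4, latent_dim: int = 64):
        super().__init__()
        self.students = nn.ModuleList(
            [OneStepStudent(latent_dim) for _ in range(num_students)]
        )

    def forward(self, noise: torch.Tensor, expert_id: int) -> torch.Tensor:
        # Routing here is a plain index; in practice it could depend on the
        # prompt or conditioning that each student was specialized for.
        return self.students[expert_id](noise)


if __name__ == "__main__":
    generator = MultiStudentGenerator(num_students=4)
    noise = torch.randn(2, 64)               # batch of 2 latent noise vectors
    images = generator(noise, expert_id=1)   # one-step generation with student 1
    print(images.shape)                      # torch.Size([2, 256])
```

At inference only the selected student runs, so the per-sample cost is one forward pass through a small network; the trade-off, as noted above, is that all distilled students must be stored.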

Explore the MSD project page to learn more: https://research.nvidia.com/labs/toronto-ai/MSD/

Work led by Yanke Song, along with Weili Nie, Karsten Kreis, and James Lucas.

Check out more work from the Toronto AI Lab here: https://research.nvidia.com/labs/toronto-ai/

this is really cool!