Training Consistency Models with Variational Noise Coupling
Abstract
Consistency Training (CT) has recently emerged as a promising alternative to diffusion models, achieving competitive performance in image generation tasks. However, non-distillation consistency training often suffers from high variance and instability, and analyzing and improving its training dynamics is an active area of research. In this work, we propose a novel CT training approach based on the Flow Matching framework. Our main contribution is a trained noise-coupling scheme inspired by the architecture of Variational Autoencoders (VAE). By training a data-dependent noise emission model implemented as an encoder architecture, our method can indirectly learn the geometry of the noise-to-data mapping, a mapping that is instead fixed by the choice of the forward process in classical CT. Empirical results across diverse image datasets show significant generative improvements: our model outperforms baselines, achieves the state-of-the-art (SoTA) non-distillation CT FID on CIFAR-10, and attains an FID on par with SoTA on ImageNet at 64×64 resolution in 2-step generation. Our code is available at https://github.com/sony/vct.
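To make the mechanism in the abstract concrete, here is a minimal, self-contained PyTorch sketch of the idea as described: a VAE-style encoder emits a data-dependent noise endpoint, the flow-matching interpolation runs between a sample and its coupled noise, and a KL term keeps the noise marginal close to a standard Gaussian. All names and hyperparameters here (`ToyEncoder`, `ToyConsistencyNet`, `consistency_step`, `kl_weight`) are illustrative assumptions, not the authors' implementation; see https://github.com/sony/vct for the real code.

```python
# Hedged sketch of variational noise coupling for consistency training.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyEncoder(nn.Module):
    """Data-dependent noise emission model q(z|x), VAE-style (hypothetical)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 256), nn.SiLU(), nn.Linear(256, 2 * dim))

    def forward(self, x):
        mu, logvar = self.net(x).chunk(2, dim=-1)
        return mu, logvar


class ToyConsistencyNet(nn.Module):
    """Toy consistency model with the boundary condition f(x, 0) = x built in."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 256), nn.SiLU(), nn.Linear(256, dim))

    def forward(self, x, t):  # x: (B, dim), t: (B, 1)
        return x + t * self.net(torch.cat([x, t], dim=-1))


def consistency_step(f, encoder, x, t, s, kl_weight=1e-3):
    """One CT loss evaluation with a learned data-to-noise coupling (0 <= s < t <= 1)."""
    # Sample coupled noise with the reparameterization trick.
    mu, logvar = encoder(x)
    z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    # Linear flow-matching interpolation between data and its coupled noise.
    x_t = (1 - t) * x + t * z
    x_s = (1 - s) * x + s * z

    # Consistency loss: match outputs along the same trajectory, with a
    # stop-gradient target as in standard consistency training.
    target = f(x_s, s).detach()
    ct_loss = F.mse_loss(f(x_t, t), target)

    # KL term keeps the aggregate noise distribution close to N(0, I), so that
    # sampling z from a standard Gaussian at inference time remains valid.
    kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(-1).mean()
    return ct_loss + kl_weight * kl


if __name__ == "__main__":
    dim, batch = 8, 4
    enc, f = ToyEncoder(dim), ToyConsistencyNet(dim)
    x = torch.randn(batch, dim)
    t = torch.rand(batch, 1)
    s = t * torch.rand(batch, 1)  # ensures 0 <= s < t
    loss = consistency_step(f, enc, x, t, s)
    loss.backward()  # gradients flow to both the consistency net and the encoder
    print(float(loss))
```

Note the design choice this sketch highlights: because the encoder is trained jointly with the consistency model, the data-noise pairing is learned rather than fixed by the forward process, while the KL penalty preserves compatibility with Gaussian sampling at generation time.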
Community
We train the data-noise coupling in Consistency Models with a VAE-style loss, achieving SoTA consistency-training performance on CIFAR-10 and beating the equivalent iCT and ECM baselines on all datasets. We also provide a mathematical formalization based on the theory of flow matching; a sketch of the objective follows.
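For readers who want the objective in symbols, the following is a hedged sketch under our reading of the summary: $q_\phi$ is the learned encoder, $f_{\theta^-}$ a stop-gradient (or EMA) target network, $s < t$ two times on the flow-matching path, $d$ a distance such as squared error, and $\beta$ a KL weight. The notation is illustrative rather than quoted from the paper.

```latex
% Coupled flow-matching path: noise endpoint drawn from the learned encoder.
\[
x_t = (1 - t)\,x + t\,z, \qquad z \sim q_\phi(z \mid x),
\]
% Consistency loss along the coupled path plus a VAE-style KL regularizer.
\[
\mathcal{L}(\theta, \phi)
  = \mathbb{E}_{x,\, z,\, s < t}\!\big[ d\big( f_\theta(x_t, t),\, f_{\theta^-}(x_s, s) \big) \big]
  + \beta\, \mathbb{E}_x\!\big[ D_{\mathrm{KL}}\big( q_\phi(z \mid x) \,\|\, \mathcal{N}(0, I) \big) \big].
\]
```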
The following similar papers were recommended automatically by the Librarian Bot via the Semantic Scholar API:
- Improved Training Technique for Latent Consistency Models (2025)
- Score-of-Mixture Training: Training One-Step Generative Models Made Simple via Score Estimation of Mixture Distributions (2025)
- Robust Representation Consistency Model via Contrastive Denoising (2025)
- EQ-VAE: Equivariance Regularized Latent Space for Improved Generative Image Modeling (2025)
- Consistency Training with Physical Constraints (2025)
- Improved Diffusion-based Generative Model with Better Adversarial Robustness (2025)
- TraFlow: Trajectory Distillation on Pre-Trained Rectified Flow (2025)