---
license: apache-2.0
base_model:
- black-forest-labs/FLUX.1-dev
new_version: black-forest-labs/FLUX.1-dev
pipeline_tag: text-to-image
library_name: adapter-transformers
---
# TLCM: Training-efficient Latent Consistency Model for Image Generation with 2-8 Steps

<p align="center">
   📃 <a href="https://arxiv.org/html/2406.05768v5" target="_blank">Paper</a></p>

We propose a two-stage data-free consistency distillation (TDCD) approach to accelerate latent consistency models. The first stage strengthens the consistency constraint via data-free sub-segment consistency distillation (DSCD); the second stage enforces global consistency across segments through data-free consistency distillation (DCD). In addition, we explore several techniques to further boost TLCM’s performance in a data-free manner, yielding the Training-efficient Latent Consistency Model (TLCM) with 2-8 step inference.

TLCM demonstrates a high level of flexibility by allowing the number of sampling steps to be adjusted within the range of 2 to 8 while still producing outputs competitive with full-step approaches.

## This release provides the TLCM LoRA for the FLUX.1-dev base model.
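
Below is a minimal usage sketch with 🤗 Diffusers, assuming the LoRA is applied on top of `black-forest-labs/FLUX.1-dev` via `FluxPipeline`. The repository id, LoRA weight filename, guidance scale, and scheduler choice are placeholders and assumptions, not values confirmed by this card; adjust them to the actual checkpoint and any scheduler the paper prescribes.

```python
import torch
from diffusers import FluxPipeline

# Load the FLUX.1-dev base pipeline (bfloat16 keeps memory usage manageable on a single GPU).
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
).to("cuda")

# Attach the TLCM LoRA from this repository.
# NOTE: repo id and weight filename below are placeholders; substitute the actual values.
pipe.load_lora_weights("<this-repo-id>", weight_name="<tlcm_flux_lora.safetensors>")

# Few-step sampling: TLCM targets 2-8 inference steps.
image = pipe(
    prompt="a photo of an astronaut riding a horse on the moon",
    num_inference_steps=4,       # try values between 2 and 8
    guidance_scale=3.5,          # assumed setting, tune for your use case
    height=1024,
    width=1024,
    generator=torch.Generator("cuda").manual_seed(0),
).images[0]

image.save("tlcm_4step.png")
```

Increasing `num_inference_steps` toward 8 generally trades speed for quality; 2-step sampling is the fastest setting the model is distilled for.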