nyanko7 committed on
Commit
5a8d6ea
1 Parent(s): 5fcd829

Update README.md

Files changed (1): README.md +25 -3
README.md CHANGED
@@ -1,3 +1,25 @@
- ---
- license: mit
- ---
+ ---
+ license: mit
+ language:
+ - en
+ base_model:
+ - black-forest-labs/FLUX.1-dev
+ ---
+
+ # Flux-dev-de-distill
+
+ This is an experiment to de-distill the guidance from FLUX.1-dev. We removed the original distilled guidance and made true classifier-free guidance (CFG) work again.
+
+ ## Model Details
+
+ Following Algorithm 1 in [On Distillation of Guided Diffusion Models](https://arxiv.org/abs/2210.03142), we attempt to reverse the distillation process by re-matching the guidance scale w. We introduce a student model x(zt) that matches the output of the teacher at any time step t ∈ [0, 1] and any guidance scale w ∈ [1, 4]. The student is initialized with the teacher's parameters, except for those related to the w-embedding.
+
+ Since this model uses true CFG instead of distilled CFG, it is not compatible with the Diffusers pipeline. Please use the [inference script](./inference.py) or manually apply guidance in the iteration loop.
+
+ Training: 150K Unsplash images at 1024px square, 6k steps with a global batch size of 32 and a frozen teacher model; approx. 12 hours due to limited compute.
+
+ Examples: Distilled CFG / True CFG
+
+ ![](./example2.webp)
+ ![](./example1.webp)
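Manually applying guidance in the iteration loop, as the README suggests, amounts to running the model twice per step (conditional and unconditional) and mixing the two predictions with the guidance scale w. A minimal sketch, assuming a hypothetical `model(latents, t, cond)` callable; the actual model interface and sampler are in the repo's `inference.py`:

```python
def cfg_step(model, latents, t, cond, uncond, guidance_scale):
    """One true-CFG denoising step on a flat list of latent values.

    Combines the two predictions as:
        pred = uncond_pred + w * (cond_pred - uncond_pred)
    which reduces to the unconditional prediction at w = 0 and the
    conditional prediction at w = 1.
    """
    cond_pred = model(latents, t, cond)      # prediction with the prompt
    uncond_pred = model(latents, t, uncond)  # prediction with the null prompt
    return [u + guidance_scale * (c - u)
            for c, u in zip(cond_pred, uncond_pred)]
```

In a full sampling loop this combined prediction replaces the single (distilled-guidance) model call before the scheduler update.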