![](./assets/teaser_fig.png)

## A Solemn Statement Regarding the Plagiarism Allegations

We regret to hear about the serious accusations from the CTM team.
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">We sadly found out our CTM paper (ICLR24) was plagiarized by TCD! It's unbelievable😢—they not only stole our idea of trajectory consistency but also comitted "verbatim plagiarism," literally copying our proofs word for word! Please help me spread this. <a href="https://t.co/aR6pRjhj5X">pic.twitter.com/aR6pRjhj5X</a></p>— Dongjun Kim (@gimdong58085414) <a href="https://twitter.com/gimdong58085414/status/1772350285270188069?ref_src=twsrc%5Etfw">March 25, 2024</a></blockquote>

Before this post, we had already had several rounds of communication with the CTM authors. We shall proceed to elucidate the situation here.

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">We regret to hear about the serious accusations from the CTM team <a href="https://twitter.com/gimdong58085414?ref_src=twsrc%5Etfw">@gimdong58085414</a>. I shall proceed to elucidate the situation and make an archive here. We already have several rounds of communication with CTM's authors. <a href="https://t.co/BKn3w1jXuh">https://t.co/BKn3w1jXuh</a></p>— Michael (@Merci0318) <a href="https://twitter.com/Merci0318/status/1772502247563559014?ref_src=twsrc%5Etfw">March 26, 2024</a></blockquote>

1. In our first arXiv preprint, we stated that our proof "mainly borrows the proof from CTM", and we never intended to claim credit. As we mentioned in our email, we would like to extend a formal apology to the CTM authors for the clearly inadequate level of referencing in our paper.

2. Our entire sampling algorithm and the whole proof of Theorem 4 are predicated upon DPM-Solver and DEIS, and we also provided that proof in the email.

3. CTM and TCD differ in motivation, method, and experiments. Our experimental results also cannot be obtained from any variant of the CTM algorithm.

3.1 Here we provide a simple method to check: use our sampler to sample from the checkpoint [CTM released](https://github.com/sony/ctm), or vice versa; a sketch of the idea follows this list.

3.2 [CTM](https://github.com/sony/ctm) also provides a training script. We welcome anyone to reproduce the experiments on SDXL based on the CTM algorithm.
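
To make the check in 3.1 concrete, here is a minimal sketch of the sampler-swap idea in `diffusers`, shown in the direction that runs in this codebase (the reverse direction, sampling our checkpoint with CTM's sampler, would run in [CTM's codebase](https://github.com/sony/ctm)). The LoRA id `<path-to-tcd-lora>` is a placeholder for our released checkpoint, `TCDScheduler` is the scheduler shipped with this repository, and `LCMScheduler` stands in for any other few-step sampler; the point of the check is whether the samples survive such a swap.

```python
# A sketch of the sampler-swap check from 3.1 (placeholder ids are marked below).
import torch
from diffusers import StableDiffusionXLPipeline, LCMScheduler

from scheduling_tcd import TCDScheduler  # the scheduler released in this repo

device = "cuda"
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
).to(device)

# Attach the distilled few-step LoRA (placeholder id; see the checkpoint link above).
pipe.load_lora_weights("<path-to-tcd-lora>")
pipe.fuse_lora()

prompt = "a photo of an astronaut riding a horse on mars"

# Sample the same weights under two different few-step samplers and compare.
for scheduler_cls in (TCDScheduler, LCMScheduler):
    pipe.scheduler = scheduler_cls.from_config(pipe.scheduler.config)
    image = pipe(
        prompt=prompt,
        num_inference_steps=4,
        guidance_scale=0,
        eta=0.3,  # TCD's stochasticity parameter; dropped by schedulers that do not accept it
        generator=torch.Generator(device=device).manual_seed(0),  # same seed for both runs
    ).images[0]
    image.save(f"astronaut_{scheduler_cls.__name__}.png")
```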
We believe the assertion of plagiarism is not only severe but also detrimental to the academic integrity of all parties involved. We earnestly hope that everyone involved gains a more comprehensive understanding of this matter.

All related docs can be found [here](https://drive.google.com/file/d/19c1QMfOMgp3McR4FCBk4pjdf22avyp8X/view).

## Introduction

TCD, inspired by [Consistency Models](https://arxiv.org/abs/2303.01469), is a novel distillation technique that distills the knowledge of a pre-trained diffusion model into a few-step sampler. In this repository, we release the inference code and our model TCD-SDXL, which is distilled from [SDXL Base 1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0). We provide the LoRA checkpoint in this [repository]().
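
As a quick start, the snippet below sketches few-step inference with the distilled LoRA. It assumes the `TCDScheduler` from this repository, and the LoRA id is a placeholder until the checkpoint link above is filled in.

```python
# Minimal TCD-SDXL inference sketch (the LoRA id below is a placeholder).
import torch
from diffusers import StableDiffusionXLPipeline

from scheduling_tcd import TCDScheduler  # the scheduler released in this repo

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
).to("cuda")
pipe.scheduler = TCDScheduler.from_config(pipe.scheduler.config)

pipe.load_lora_weights("<path-to-tcd-lora>")  # placeholder for the released checkpoint
pipe.fuse_lora()

image = pipe(
    prompt="a serene mountain lake at sunrise, highly detailed",
    num_inference_steps=4,  # TCD targets very few sampling steps
    guidance_scale=0,       # sampled without classifier-free guidance
    eta=0.3,                # stochasticity parameter of the TCD sampler
    generator=torch.Generator("cuda").manual_seed(0),
).images[0]
image.save("tcd_sdxl_sample.png")
```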