pcuenq (HF staff) committed
Commit 407ace8
1 Parent(s): cab8a49

Update text

Files changed (1):
  app.py (+4 -2)
app.py CHANGED

@@ -382,7 +382,9 @@ with block:
               </h1>
             </div>
             <p style="margin-bottom: 10px; font-size: 94%">
-            Paella is a novel text-to-image model that uses a compressed quantized latent space, based on a f8 VQGAN, and a masked training objective to achieve fast generation in ~10 inference steps.
+            Paella is a novel text-to-image model that uses a compressed quantized latent space, based on a VQGAN, and a masked training objective to achieve fast generation in ~10 inference steps.
+
+            This version builds on top of our initial paper, bringing Paella to a similar level as other state-of-the-art models, while preserving the compactness and clarity of the previous implementations. Please, refer to the resources below for details.
             </p>
           </div>
   """
@@ -432,7 +434,7 @@ with block:
           </div>
           <div class="acknowledgments">
             <p><h4>Resources</h4>
-            <a href="https://arxiv.org/abs/2211.07292" style="text-decoration: underline;">Paper</a>, <a href="https://github.com/dome272/Paella" style="text-decoration: underline;">official implementation</a>.
+            <a href="https://arxiv.org/abs/2211.07292" style="text-decoration: underline;">Paper</a>, <a href="https://github.com/dome272/Paella" style="text-decoration: underline;">official implementation</a>, <a href="https://huggingface.co/dome272/Paella" style="text-decoration: underline;">Model Card</a>.
            </p>
            <p><h4>LICENSE</h4>
            <a href="https://github.com/dome272/Paella/blob/main/LICENSE" style="text-decoration: underline;">MIT</a>.