librarian-bot committed
Commit b74cb8d • 1 Parent(s): f52d59d
Scheduled Commit

Files changed:
- data/2403.10783.json +1 -0
- data/2404.00878.json +1 -0
- data/2404.09512.json +1 -0
data/2403.10783.json
ADDED
@@ -0,0 +1 @@
+{"paper_url": "https://huggingface.co/papers/2403.10783", "comment": "This is an automated message from the [Librarian Bot](https://huggingface.co/librarian-bots). I found the following papers similar to this paper. \n\nThe following papers were recommended by the Semantic Scholar API \n\n* [Magic Clothing: Controllable Garment-Driven Image Synthesis](https://huggingface.co/papers/2404.09512) (2024)\n* [MMTryon: Multi-Modal Multi-Reference Control for High-Quality Fashion Generation](https://huggingface.co/papers/2405.00448) (2024)\n* [TryOn-Adapter: Efficient Fine-Grained Clothing Identity Adaptation for High-Fidelity Virtual Try-On](https://huggingface.co/papers/2404.00878) (2024)\n* [Texture-Preserving Diffusion Models for High-Fidelity Virtual Try-On](https://huggingface.co/papers/2404.01089) (2024)\n* [ViViD: Video Virtual Try-on using Diffusion Models](https://huggingface.co/papers/2405.11794) (2024)\n\n\n Please give a thumbs up to this comment if you found it helpful!\n\n If you want recommendations for any Paper on Hugging Face checkout [this](https://huggingface.co/spaces/librarian-bots/recommend_similar_papers) Space\n\n You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: `@librarian-bot recommend`"}
data/2404.00878.json
ADDED
@@ -0,0 +1 @@
+{"paper_url": "https://huggingface.co/papers/2404.00878", "comment": "This is an automated message from the [Librarian Bot](https://huggingface.co/librarian-bots). I found the following papers similar to this paper. \n\nThe following papers were recommended by the Semantic Scholar API \n\n* [Texture-Preserving Diffusion Models for High-Fidelity Virtual Try-On](https://huggingface.co/papers/2404.01089) (2024)\n* [ViViD: Video Virtual Try-on using Diffusion Models](https://huggingface.co/papers/2405.11794) (2024)\n* [MV-VTON: Multi-View Virtual Try-On with Diffusion Models](https://huggingface.co/papers/2404.17364) (2024)\n* [FLDM-VTON: Faithful Latent Diffusion Model for Virtual Try-on](https://huggingface.co/papers/2404.14162) (2024)\n* [Magic Clothing: Controllable Garment-Driven Image Synthesis](https://huggingface.co/papers/2404.09512) (2024)\n\n\n Please give a thumbs up to this comment if you found it helpful!\n\n If you want recommendations for any Paper on Hugging Face checkout [this](https://huggingface.co/spaces/librarian-bots/recommend_similar_papers) Space\n\n You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: `@librarian-bot recommend`"}
data/2404.09512.json
ADDED
@@ -0,0 +1 @@
+{"paper_url": "https://huggingface.co/papers/2404.09512", "comment": "This is an automated message from the [Librarian Bot](https://huggingface.co/librarian-bots). I found the following papers similar to this paper. \n\nThe following papers were recommended by the Semantic Scholar API \n\n* [From Parts to Whole: A Unified Reference Framework for Controllable Human Image Generation](https://huggingface.co/papers/2404.15267) (2024)\n* [Training-free Subject-Enhanced Attention Guidance for Compositional Text-to-image Generation](https://huggingface.co/papers/2405.06948) (2024)\n* [MoMA: Multimodal LLM Adapter for Fast Personalized Image Generation](https://huggingface.co/papers/2404.05674) (2024)\n* [MasterWeaver: Taming Editability and Identity for Personalized Text-to-Image Generation](https://huggingface.co/papers/2405.05806) (2024)\n* [TryOn-Adapter: Efficient Fine-Grained Clothing Identity Adaptation for High-Fidelity Virtual Try-On](https://huggingface.co/papers/2404.00878) (2024)\n\n\n Please give a thumbs up to this comment if you found it helpful!\n\n If you want recommendations for any Paper on Hugging Face checkout [this](https://huggingface.co/spaces/librarian-bots/recommend_similar_papers) Space\n\n You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: `@librarian-bot recommend`"}