aihao committed
Commit 1736ed6 · 1 Parent(s): 04b31b7
README.md CHANGED
@@ -1,6 +1,6 @@
-# IP Adapter Artist:
+# IP Adapter Art:
 
-<a href='https://huggingface.co/AisingioroHao0/IP-Adapter-Artist'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Model-blue'></a><a href=''><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Dataset-blue'></a> [![**IP Adapter Artist Demo**](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1kV7q3Gzr8GPG9cChdDQ5ncCx84TYjuu3?usp=sharing)
+<a href='https://huggingface.co/AisingioroHao0/IP-Adapter-Art'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Model-blue'></a><a href=''><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Dataset-blue'></a> [![**IP Adapter Art Demo**](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1kV7q3Gzr8GPG9cChdDQ5ncCx84TYjuu3?usp=sharing)
 
 ![image-20240807232402569](./README.assets/main.png)
 
@@ -8,13 +8,13 @@
 
 ## Introduction
 
-IP Adapter Artist is a specialized version that uses a professional style encoder. Its goal is to achieve style control through reference images in the text-to-image diffusion model and solve the problems of instability and incomplete stylization of existing methods. This is a preprint version, and more models and training data coming soon.
+IP Adapter Art is a specialized version that uses a professional style encoder. Its goal is to achieve style control through reference images in the text-to-image diffusion model and solve the problems of instability and incomplete stylization of existing methods. This is a preprint version, and more models and training data coming soon.
 
 ## How to use
 
-[![**IP Adapter Artist Demo**](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1kV7q3Gzr8GPG9cChdDQ5ncCx84TYjuu3?usp=sharing) can be used to conduct experiments directly.
+[![**IP Adapter Art Demo**](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1kV7q3Gzr8GPG9cChdDQ5ncCx84TYjuu3?usp=sharing) can be used to conduct experiments directly.
 
-For local experiments, please refer to a [demo](https://github.com/aihao2000/IP-Adapter-Artist/blob/main/ip_adapter_artist_sdxl_demo.ipynb).
+For local experiments, please refer to a [demo](https://github.com/aihao2000/IP-Adapter-Artist/blob/main/ip_adapter_art_sdxl_demo.ipynb).
 
 Local experiments require a basic torch environment and dependencies:
 
@@ -22,7 +22,7 @@ Local experiments require a basic torch environment and dependencies:
 pip install diffusers
 pip install transformers
 pip install git+https://github.com/openai/CLIP.git
-pip install git+https://github.com/aihao2000/IP-Adapter-Artist.git
+pip install git+https://github.com/aihao2000/IP-Adapter-Art.git
 ```
 
 ## More Examples
@@ -33,12 +33,12 @@ pip install git+https://github.com/aihao2000/IP-Adapter-Artist.git
 ## Citation
 
 ```
-@misc{IP-Adapter-Artist,
+@misc{IP-Adapter-Art,
 author = {Hao Ai, Xiaosai Zhang},
-title = {IP Adapter Artist},
+title = {IP Adapter Art},
 year = {2024},
 publisher = {GitHub},
 journal = {GitHub repository},
-howpublished = {\url{https://github.com/aihao2000/IP-Adapter-Artist}}
+howpublished = {\url{https://github.com/aihao2000/IP-Adapter-Art}}
 }
 ```
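For context on what follows the README's install commands, here is a minimal sketch (not part of this commit) of fetching the two checkpoints that the renamed demo notebook pulls from the Hugging Face repo and loading a base SDXL pipeline. `hf_hub_download` and `StableDiffusionXLPipeline` are standard `huggingface_hub`/`diffusers` APIs; the base-model id `stabilityai/stable-diffusion-xl-base-1.0` is an assumption, and the actual adapter wiring uses the repo's `CSD_CLIP` and `load_ip_adapter` helpers, whose call signatures are not visible in this diff.

```python
# Minimal sketch, assuming the dependencies above are installed; not from the repository.
import torch
from huggingface_hub import hf_hub_download
from diffusers import StableDiffusionXLPipeline

# Checkpoints referenced by ip_adapter_art_sdxl_demo.ipynb in this commit.
csd_clip_path = hf_hub_download(
    repo_id="AisingioroHao0/IP-Adapter-Art", filename="csd_clip.pth"
)
ip_adapter_art_path = hf_hub_download(
    repo_id="AisingioroHao0/IP-Adapter-Art", filename="ip_adapter_art_sdxl_512.pth"
)

# Assumed base model; the README does not pin one.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
)

# The style encoder (CSD_CLIP) and adapter loading (load_ip_adapter) live in
# ip_adapter_art.utils; see the demo notebook for the actual wiring, since their
# signatures are not shown in this diff.
```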
{ip_adapter_artist → ip_adapter_art}/__init__.py RENAMED
File without changes
{ip_adapter_artist → ip_adapter_art}/utils/__init__.py RENAMED
File without changes
{ip_adapter_artist → ip_adapter_art}/utils/csd_clip.py RENAMED
File without changes
{ip_adapter_artist → ip_adapter_art}/utils/ip_adapter.py RENAMED
File without changes
ip_adapter_artist_sdxl_demo.ipynb → ip_adapter_art_sdxl_demo.ipynb RENAMED
@@ -6,14 +6,14 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from ip_adapter_artist.utils.csd_clip import CSD_CLIP\n",
-"from ip_adapter_artist.utils.ip_adapter import (\n",
+"from ip_adapter_art.utils.csd_clip import CSD_CLIP\n",
+"from ip_adapter_art.utils.ip_adapter import (\n",
 " load_ip_adapter,\n",
 ")\n",
 "import torch\n",
 "from transformers import CLIPImageProcessor\n",
 "from PIL import Image\n",
-"from diffusers.utils import make_image_grid,load_image\n",
+"from diffusers.utils import make_image_grid, load_image\n",
 "from huggingface_hub import hf_hub_download\n",
 "from diffusers import StableDiffusionXLPipeline"
 ]
@@ -33,7 +33,7 @@
 "outputs": [],
 "source": [
 "csd_clip_path = hf_hub_download(\n",
-" repo_id=\"AisingioroHao0/IP-Adapter-Artist\", filename=\"csd_clip.pth\"\n",
+" repo_id=\"AisingioroHao0/IP-Adapter-Art\", filename=\"csd_clip.pth\"\n",
 ")"
 ]
 },
@@ -43,8 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"ip_adapter_artist_path = hf_hub_download(\n",
-" repo_id=\"AisingioroHao0/IP-Adapter-Artist\", filename=\"ip_adapter_artist_sdxl_512.pth\"\n",
+"ip_adapter_art_path = hf_hub_download(\n",
+" repo_id=\"AisingioroHao0/IP-Adapter-Art\", filename=\"ip_adapter_art_sdxl_512.pth\"\n",
 ")"
 ]
 },
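Stripped of the notebook's JSON escaping, the updated import cell reads as the plain Python below; this is a direct transcription of the `+` lines above, where the only functional change is the module prefix (`ip_adapter_artist` to `ip_adapter_art`) plus a whitespace fix in the `diffusers.utils` import.

```python
# Updated import cell of ip_adapter_art_sdxl_demo.ipynb after the rename.
from ip_adapter_art.utils.csd_clip import CSD_CLIP
from ip_adapter_art.utils.ip_adapter import (
    load_ip_adapter,
)
import torch
from transformers import CLIPImageProcessor
from PIL import Image
from diffusers.utils import make_image_grid, load_image
from huggingface_hub import hf_hub_download
from diffusers import StableDiffusionXLPipeline

# The two download cells now point at the renamed model repo
# ("AisingioroHao0/IP-Adapter-Art") and the renamed weight file
# ("ip_adapter_art_sdxl_512.pth"), as shown in the hunks above.
```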
setup.py CHANGED
@@ -2,7 +2,7 @@ from setuptools import find_packages, setup
 
 
 setup(
-    name="ip_adapter_artist",
+    name="ip_adapter_art",
     version="0.1",
     description="Using reference images to control style in diffusion models",
     long_description=open("README.md", "r", encoding="utf-8").read(),
@@ -11,7 +11,7 @@ setup(
     license="Apache",
     author="aihao",
     author_email="aihao2000@outlook.com",
-    url="https://github.com/aihao2000/IP-Adapter-Artist",
+    url="https://github.com/aihao2000/IP-Adapter-Art",
     packages=find_packages(),
     python_requires=">=3.8.0",
     install_requires=[
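Because setup.py now registers the distribution as `ip_adapter_art`, an environment that still has an older install will keep resolving the old name. Below is a quick, hypothetical smoke test (not part of the repository) that checks which distribution is present; it relies only on `importlib.metadata`, available in Python 3.8 and later, which `python_requires` already demands.

```python
# Hypothetical check: which distribution name is installed after this rename?
from importlib.metadata import version, PackageNotFoundError

for dist in ("ip_adapter_art", "ip_adapter_artist"):
    try:
        # Expect "ip_adapter_art 0.1" once this commit's setup.py is installed.
        print(dist, version(dist))
    except PackageNotFoundError:
        # The old name should land here after reinstalling from the renamed repo.
        print(dist, "not installed")
```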