---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of w/W/W (Arknights)

This is the dataset of w/W/W (Arknights), containing 500 images and their tags.
The core tags of this character are `horns, short_hair, bangs, breasts, grey_hair, demon_horns, red_eyes, medium_breasts, white_hair, ahoge, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, etc.); the auto-crawling system is powered by the DeepGHS Team (a HuggingFace organization).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|------|--------|------|----------|------|-------------|
| raw | 500 | 1.02 GiB | Download | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| stage3-800 | 1383 | 1.06 GiB | Download | IMG+TXT | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1339 | 1.05 GiB | Download | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| stage3-1200 | 1383 | 1.68 GiB | Download | IMG+TXT | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1339 | 1.67 GiB | Download | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
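The stage3 packages are shipped as IMG+TXT pairs: each image comes with a same-named `.txt` file holding its tags, a common layout for training pipelines. A minimal sketch for reading those tag files after extracting a package (the comma-separated tag format and flat directory layout are assumptions about the archive contents):

```python
import os


def load_tag_files(dataset_dir):
    """Read IMG+TXT tag files from an extracted package directory.

    Assumes each image has a same-named .txt file containing
    comma-separated tags (an assumption about the archive layout).
    Returns a dict mapping the file stem to its list of tags.
    """
    samples = {}
    for name in os.listdir(dataset_dir):
        if name.endswith('.txt'):
            stem = os.path.splitext(name)[0]
            path = os.path.join(dataset_dir, name)
            with open(path, encoding='utf-8') as f:
                # split on commas and drop empty entries
                samples[stem] = [t.strip() for t in f.read().split(',') if t.strip()]
    return samples
```

This keeps the loader dependency-free, so the tag files can be inspected without installing waifuc.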
## Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/w_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
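Once the items are loaded, a common next step is to tally how often each tag appears across the dataset. A minimal sketch, assuming each item's `meta['tags']` is iterable over tag names (e.g. a dict keyed by tag):

```python
from collections import Counter


def count_tags(tag_collections):
    """Tally tag frequencies across a dataset.

    `tag_collections` is any iterable of per-image tag collections;
    each collection is iterated for its tag names.
    """
    counter = Counter()
    for tags in tag_collections:
        counter.update(tags)
    return counter


# Usage with the waifuc source above (the meta layout is an assumption):
# counter = count_tags(item.meta['tags'] for item in source)
# print(counter.most_common(20))
```

This makes it easy to spot dominant tags that you may want to prune before training.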