---
license: cc-by-nc-4.0
task_categories:
- image-segmentation
language:
- en
tags:
- Cloud Detection
- Cloud Segmentation
- Remote Sensing Images
- Satellite Images
- HRC-WHU
- CloudSEN12-High
- GF12MS-WHU
- L8-Biome
---
# Cloud-Adapter-Datasets

This dataset card describes the datasets used in [Cloud-Adapter](https://github.com/XavierJiezou/cloud-adapter): a collection of high-resolution satellite images with semantic segmentation masks for cloud detection and related tasks.
## Install

```bash
pip install huggingface-hub
```
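Installing `huggingface-hub` also provides the `huggingface-cli` command used below. A minimal, optional check that the package is importable before running the download commands:

```python
# Optional sanity check: confirm huggingface_hub is installed and print its version
import huggingface_hub

print(huggingface_hub.__version__)
```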
## Usage

```bash
# Step 1: Download datasets
huggingface-cli download --repo-type dataset XavierJiezou/cloud-adapter-datasets --local-dir data --include hrc_whu.zip
huggingface-cli download --repo-type dataset XavierJiezou/cloud-adapter-datasets --local-dir data --include gf12ms_whu_gf1.zip
huggingface-cli download --repo-type dataset XavierJiezou/cloud-adapter-datasets --local-dir data --include gf12ms_whu_gf2.zip
huggingface-cli download --repo-type dataset XavierJiezou/cloud-adapter-datasets --local-dir data --include cloudsen12_high_l1c.zip
huggingface-cli download --repo-type dataset XavierJiezou/cloud-adapter-datasets --local-dir data --include cloudsen12_high_l2a.zip
huggingface-cli download --repo-type dataset XavierJiezou/cloud-adapter-datasets --local-dir data --include l8_biome.zip

# Step 2: Extract datasets (the ZIP files were downloaded into data/)
cd data
unzip hrc_whu.zip -d hrc_whu
unzip gf12ms_whu_gf1.zip -d gf12ms_whu_gf1
unzip gf12ms_whu_gf2.zip -d gf12ms_whu_gf2
unzip cloudsen12_high_l1c.zip -d cloudsen12_high_l1c
unzip cloudsen12_high_l2a.zip -d cloudsen12_high_l2a
unzip l8_biome.zip -d l8_biome
```
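If you prefer to download and extract everything from Python in one call rather than per file with `huggingface-cli`, a minimal sketch using `snapshot_download` with an `allow_patterns` filter (assuming all dataset archives are top-level `*.zip` files, as in the commands above) could look like this:

```python
import zipfile
from pathlib import Path

from huggingface_hub import snapshot_download

# Download every *.zip in the dataset repo into ./data (same target as the CLI commands above)
local_dir = snapshot_download(
    repo_id="XavierJiezou/cloud-adapter-datasets",
    repo_type="dataset",
    local_dir="data",
    allow_patterns=["*.zip"],
)

# Extract each archive next to it, mirroring the unzip commands above
for zip_path in Path(local_dir).glob("*.zip"):
    target = zip_path.with_suffix("")  # e.g. data/hrc_whu.zip -> data/hrc_whu
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target)
    print(f"Extracted {zip_path.name} -> {target}")
```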
## Example

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from PIL import Image

# Define the dataset repository
repo_id = "XavierJiezou/cloud-adapter-datasets"

# Select the ZIP files of the datasets to download
zip_files = [
    "hrc_whu.zip",
    # "gf12ms_whu_gf1.zip",
    # "gf12ms_whu_gf2.zip",
    # "cloudsen12_high_l1c.zip",
    # "cloudsen12_high_l2a.zip",
    # "l8_biome.zip",
]

# Define a directory to extract the datasets into
output_dir = "cloud_adapter_paper_data"

# Ensure the output directory exists
os.makedirs(output_dir, exist_ok=True)

# Step 1: Download and extract each ZIP file
for zip_file in zip_files:
    print(f"Downloading {zip_file}...")
    # Download the ZIP file from the Hugging Face Hub
    zip_path = hf_hub_download(repo_id=repo_id, filename=zip_file, repo_type="dataset")

    # Extract the ZIP file
    extract_path = os.path.join(output_dir, zip_file.replace(".zip", ""))
    with zipfile.ZipFile(zip_path, "r") as zip_ref:
        print(f"Extracting {zip_file} to {extract_path}...")
        zip_ref.extractall(extract_path)

# Step 2: Explore the extracted datasets
# Example: list the contents of the "hrc_whu" dataset
dataset_path = os.path.join(output_dir, "hrc_whu")
train_images_path = os.path.join(dataset_path, "img_dir", "train")
train_annotations_path = os.path.join(dataset_path, "ann_dir", "train")

# Display some files from the training set
print("Training Images:", os.listdir(train_images_path)[:5])
print("Training Annotations:", os.listdir(train_annotations_path)[:5])

# Example: load and display an image and its annotation
image_path = os.path.join(train_images_path, os.listdir(train_images_path)[0])
annotation_path = os.path.join(train_annotations_path, os.listdir(train_annotations_path)[0])

# Open and display the image and its annotation
image = Image.open(image_path)
annotation = Image.open(annotation_path)

print("Displaying the image...")
image.show()
print("Displaying the annotation...")
annotation.show()
```
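The annotation files are opened as images above; to see which pixel values (class indices) occur in a mask and how frequent they are, a short sketch with NumPy works, reusing `annotation_path` from the example above. The class-to-index mapping differs per dataset and is not documented in this card, so read the printed values as raw pixel indices rather than named classes:

```python
import numpy as np
from PIL import Image

# Count how often each pixel value (class index) appears in one annotation mask
mask = np.array(Image.open(annotation_path))
values, counts = np.unique(mask, return_counts=True)
for value, count in zip(values, counts):
    print(f"pixel value {value}: {count} px ({count / mask.size:.1%})")
```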
## Source Data

- hrc_whu: https://github.com/dr-lizhiwei/HRC_WHU
- gf12ms_whu: https://github.com/whu-ZSC/GF1-GF2MS-WHU
- cloudsen12_high: https://huggingface.co/datasets/csaybar/CloudSEN12-high
- l8_biome: https://landsat.usgs.gov/landsat-8-cloud-cover-assessment-validation-data
## Citation

```bib
@article{hrc_whu,
  title   = {Deep learning based cloud detection for medium and high resolution remote sensing images of different sensors},
  author  = {Zhiwei Li and Huanfeng Shen and Qing Cheng and Yuhao Liu and Shucheng You and Zongyi He},
  journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
  volume  = {150},
  pages   = {197-212},
  year    = {2019}
}

@article{gf12ms_whu,
  title   = {Transferring Deep Models for Cloud Detection in Multisensor Images via Weakly Supervised Learning},
  author  = {Zhu, Shaocong and Li, Zhiwei and Shen, Huanfeng},
  journal = {IEEE Transactions on Geoscience and Remote Sensing},
  volume  = {62},
  pages   = {1-18},
  year    = {2024}
}

@article{cloudsen12_high,
  title   = {CloudSEN12, a global dataset for semantic understanding of cloud and cloud shadow in Sentinel-2},
  author  = {Aybar, Cesar and Ysuhuaylas, Luis and Loja, Jhomira and Gonzales, Karen and Herrera, Fernando and Bautista, Lesly and Yali, Roy and Flores, Angie and Diaz, Lissette and Cuenca, Nicole and others},
  journal = {Scientific Data},
  volume  = {9},
  number  = {1},
  pages   = {782},
  year    = {2022}
}

@article{l8_biome,
  title   = {Cloud detection algorithm comparison and validation for operational Landsat data products},
  author  = {Steve Foga and Pat L. Scaramuzza and Song Guo and Zhe Zhu and Ronald D. Dilley and Tim Beckmann and Gail L. Schmidt and John L. Dwyer and M. {Joseph Hughes} and Brady Laue},
  journal = {Remote Sensing of Environment},
  volume  = {194},
  pages   = {379-390},
  year    = {2017}
}
```
## Contact

For questions, please contact Xavier Jiezou at xuechaozou (at) foxmail (dot) com. |