Datasets:

Hest download fails randomly due to consistency check failure

#7 opened by adiv5

While downloading the data using the instructions mentioned here, I randomly get this error:

OSError: Consistency check failed: file should be of size 493976088 but has size 334638656 (MEND157.h5).
We are sorry for the inconvenience. Please retry with `force_download=True`.
If the issue persists, please let us know by opening an issue on https://github.com/huggingface/huggingface_hub.

Currently I re-run the download script when it fails, but is there a solution to this error? Does restarting the download corrupt the data for which the error was observed (here MEND157.h5), or does it skip that file and start downloading the next sample?
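As the error message suggests, one option is to re-fetch just the corrupted file with `force_download=True` rather than restarting the whole download. A minimal sketch using `huggingface_hub.hf_hub_download` (the helper name and the in-repo file path are illustrative; check the dataset repo's file browser for the actual path of the failed file):

```python
def refetch_corrupted_file(filename, local_dir='./hest_data'):
    """Re-download a single file from the HEST dataset repo.

    `filename` is the path inside the repo (e.g. whichever folder
    holds the file named in the consistency-check error).
    """
    # Imported here so the helper can be defined without the
    # package installed; requires `pip install huggingface_hub`.
    from huggingface_hub import hf_hub_download

    return hf_hub_download(
        repo_id='MahmoodLab/hest',
        repo_type='dataset',
        filename=filename,
        local_dir=local_dir,
        force_download=True,  # discard the truncated copy and download again
    )
```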

AI for Pathology Image Analysis Lab @ HMS / BWH org

This can happen if your connection is unstable. You can work around it with a retry loop:

import time
import datasets

path = './hest_data/'

def load_dataset_until_success():
    while True:
        try:
            dataset = datasets.load_dataset(
                'MahmoodLab/hest',
                cache_dir=path,
                patterns='*'
            )
            return dataset
        except Exception as e:
            print(f"Error occurred: {e}. Retrying...")
            time.sleep(1)

# Call the function
dataset = load_dataset_until_success()
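The loop above retries forever with a fixed one-second pause. If you prefer a retry cap and exponential backoff, a generic sketch (the helper name and parameters are illustrative, not part of the HEST or `datasets` API):

```python
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on failure.

    Raises the last exception once max_attempts is exhausted.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as e:
            if attempt == max_attempts - 1:
                raise
            delay = base_delay * (2 ** attempt)  # 1s, 2s, 4s, ...
            print(f"Error occurred: {e}. Retrying in {delay:.0f}s...")
            time.sleep(delay)
```

You would then call it as `dataset = retry_with_backoff(load_dataset_until_success)` or pass a `lambda` wrapping `datasets.load_dataset` directly.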

Then, to make sure everything is downloaded you can use:

import pandas as pd
from hest import iter_hest

df = pd.read_csv('./assets/HEST_v1_1_0.csv')
ids = df['id'].values.tolist()
for st in iter_hest('./hest_data', id_list=ids):
    print(st)
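If you only want to know which samples are still missing before re-running the download, a quick filesystem check can be cheaper than iterating the whole dataset. A sketch, assuming each sample is stored as `{id}{ext}` under a subfolder of `./hest_data` (the `subdir` and `ext` defaults here are assumptions; adjust them to your local layout):

```python
import os

def missing_ids(ids, data_dir, subdir='st', ext='.h5ad'):
    """Return the sample ids that have no file under data_dir/subdir."""
    folder = os.path.join(data_dir, subdir)
    return [i for i in ids
            if not os.path.isfile(os.path.join(folder, i + ext))]
```

Any ids it returns can then be re-downloaded selectively instead of restarting from scratch.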
