Cannot access the dataset through load_dataset
I am using the code below to load the dataset:
dataset = load_dataset("ShapeNet/ShapeNetCore", token=True)
But it won't load or download the dataset; the output is as below:
Resolving data files: 100% 55/55 [00:09<00:00, 2.10it/s]
I have logged into my Hugging Face account and also have access to all my other datasets.
FileNotFoundError Traceback (most recent call last)
Cell In[7], line 1
----> 1 dataset = load_dataset("ShapeNet/ShapeNetCore",
2 token=True,
3 )
File /nlp-llm/lib/python3.8/site-packages/datasets/load.py:2129, in load_dataset(path, name, data_dir, data_files, split, cache_dir, features, download_config, download_mode, verification_mode, ignore_verifications, keep_in_memory, save_infos, revision, token, use_auth_token, task, streaming, num_proc, storage_options, **config_kwargs)
2124 verification_mode = VerificationMode(
2125 (verification_mode or VerificationMode.BASIC_CHECKS) if not save_infos else VerificationMode.ALL_CHECKS
2126 )
2128 # Create a dataset builder
-> 2129 builder_instance = load_dataset_builder(
2130 path=path,
2131 name=name,
2132 data_dir=data_dir,
2133 data_files=data_files,
2134 cache_dir=cache_dir,
2135 features=features,
2136 download_config=download_config,
2137 download_mode=download_mode,
2138 revision=revision,
2139 token=token,
2140 storage_options=storage_options,
2141 **config_kwargs,
2142 )
2144 # Return iterable dataset in case of streaming
2145 if streaming:
File /nlp-llm/lib/python3.8/site-packages/datasets/load.py:1815, in load_dataset_builder(path, name, data_dir, data_files, cache_dir, features, download_config, download_mode, revision, token, use_auth_token, storage_options, **config_kwargs)
1813 download_config = download_config.copy() if download_config else DownloadConfig()
1814 download_config.storage_options.update(storage_options)
-> 1815 dataset_module = dataset_module_factory(
1816 path,
1817 revision=revision,
1818 download_config=download_config,
1819 download_mode=download_mode,
1820 data_dir=data_dir,
1821 data_files=data_files,
1822 )
1823 # Get dataset builder class from the processing script
1824 builder_kwargs = dataset_module.builder_kwargs
File /nlp-llm/lib/python3.8/site-packages/datasets/load.py:1508, in dataset_module_factory(path, revision, download_config, download_mode, dynamic_modules_path, data_dir, data_files, **download_kwargs)
1506 raise e1 from None
1507 if isinstance(e1, FileNotFoundError):
-> 1508 raise FileNotFoundError(
1509 f"Couldn't find a dataset script at {relative_to_absolute_path(combined_path)} or any data file in the same directory. "
1510 f"Couldn't find '{path}' on the Hugging Face Hub either: {type(e1).name}: {e1}"
1511 ) from None
1512 raise e1 from None
1513 else:
FileNotFoundError: Couldn't find a dataset script at /ShapeNet/ShapeNetCore/ShapeNetCore.py or any data file in the same directory. Couldn't find 'ShapeNet/ShapeNetCore' on the Hugging Face Hub either: FileNotFoundError: No (supported) data files or dataset script found in ShapeNet/ShapeNetCore.
It does seem like loading via load_dataset is broken. I ended up just creating a simple script here: https://gist.github.com/rishub-tamirisa/49bebedc4ec362fe2c7caacf42f90bf6.
Export your Hugging Face token as an env var using export HF_TOKEN=<your_hf_token>, then run that script, and then unzip \*.zip to unzip the files.
Thanks for your comment. I still could not load the dataset through Hugging Face.
@ArmanAsq I downloaded by cloning, like so:
- Create a user access token here: https://huggingface.co/settings/tokens
- Run git clone https://huggingface.co/datasets/ShapeNet/ShapeNetCore
- When prompted, enter your Hugging Face username and the user access token you just created in step 1 as your password
This will clone the main branch of ShapeNetCore into the location you run this in.
If that doesn't work, you can also try using an SSH key (or use an existing one if you have it; instructions here: https://huggingface.co/docs/hub/security-git-ssh) and use git clone git@hf.co:datasets/ShapeNet/ShapeNetCore instead. Now you have the data, and you can just write a script to unzip the files within ShapeNetCore (a sketch of one is below). Hope this helps
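If you prefer Python over calling unzip by hand, a minimal sketch could look like the following. It assumes the clone lives at ./ShapeNetCore and that each archive already contains its own top-level synset folder; adjust the paths if your layout differs:

```python
# Sketch: extract every category zip inside the cloned ShapeNetCore repo.
# Assumes the clone is at ./ShapeNetCore and each zip already contains
# a top-level <synset_id>/ folder (adjust the target directory if not).
import zipfile
from pathlib import Path

repo_dir = Path("ShapeNetCore")
for zip_path in sorted(repo_dir.glob("*.zip")):
    print(f"Extracting {zip_path.name} ...")
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(repo_dir)
```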
How do I download specific objects with their IDs?
@JemuelStanley I am having the same issue. I am able to load a specific subset of the data using this line:
dataset = load_dataset("ShapeNet/ShapeNetCore", data_files="02773838.zip")
However, I'm unsure of how to download a specific .obj file based on an input object_id.
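In case it helps, here is a rough sketch of how one might grab a single model by ID: download just the category zip with huggingface_hub.hf_hub_download and extract only the matching members. The synset ID, the placeholder object_id, and the assumed internal layout (<model_id>/models/model_normalized.obj inside <synset_id>.zip) are guesses based on the usual ShapeNetCore.v2 structure, so check the archive contents first:

```python
# Sketch: fetch one category zip and extract only the model with a given ID.
# The synset ID, object_id placeholder, and internal archive layout are
# assumptions -- inspect zf.namelist() to confirm before relying on this.
import zipfile

from huggingface_hub import hf_hub_download

synset_id = "02773838"            # category zip to fetch (bags), adjust as needed
object_id = "your_model_id_here"  # hypothetical placeholder for the model ID

zip_path = hf_hub_download(
    repo_id="ShapeNet/ShapeNetCore",
    repo_type="dataset",
    filename=f"{synset_id}.zip",
    token=True,                   # use the token from your cached login
)

with zipfile.ZipFile(zip_path) as zf:
    members = [m for m in zf.namelist() if object_id in m]
    if not members:
        raise KeyError(f"{object_id} not found in {synset_id}.zip")
    zf.extractall("extracted", members=members)
```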
This: https://gist.github.com/rishub-tamirisa/49bebedc4ec362fe2c7caacf42f90bf6 worked for me, thanks!