Cannot load "berkeley_gnm_cory_hall"
#2 · opened by vmoens
As reported here, we cannot load the berkeley_gnm_cory_hall dataset:
import datasets

datasets.load_dataset(
    "jxu124/OpenX-Embodiment",
    "berkeley_gnm_cory_hall",
    streaming=False,
    split="train",
    cache_dir="./dump",
    trust_remote_code=True,
)
results in
File "/Users/vmoens/.cache/huggingface/modules/datasets_modules/datasets/jxu124--OpenX-Embodiment/317e9044a9bb97bb1db9ea5aebf1c15f5cc3e1e071e5da025e97892e96dae22b/OpenX-Embodiment.py", line 29, in decode_image
data = data.decode()
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/vmoens/Library/Application Support/JetBrains/PyCharm2023.3/scratches/scratch_7.py", line 15, in <module>
datasets.load_dataset(
File "/Users/vmoens/venv/rl/lib/python3.10/site-packages/datasets/load.py", line 2096, in load_dataset
builder_instance.download_and_prepare(
File "/Users/vmoens/venv/rl/lib/python3.10/site-packages/datasets/builder.py", line 924, in download_and_prepare
self._download_and_prepare(
File "/Users/vmoens/venv/rl/lib/python3.10/site-packages/datasets/builder.py", line 1647, in _download_and_prepare
super()._download_and_prepare(
File "/Users/vmoens/venv/rl/lib/python3.10/site-packages/datasets/builder.py", line 999, in _download_and_prepare
self._prepare_split(split_generator, **prepare_split_kwargs)
File "/Users/vmoens/venv/rl/lib/python3.10/site-packages/datasets/builder.py", line 1485, in _prepare_split
for job_id, done, content in self._prepare_split_single(
File "/Users/vmoens/venv/rl/lib/python3.10/site-packages/datasets/builder.py", line 1642, in _prepare_split_single
raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
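For context, the root cause in the traceback is that `decode_image` calls `.decode()` on raw image bytes: JPEG streams start with `0xFF 0xD8`, which is not valid UTF-8, so text decoding fails immediately. A minimal sketch of the failure and of one possible fix (the `decode_field` helper below is hypothetical, not the dataset script's actual code):

```python
import io

jpeg_header = b"\xff\xd8\xff\xe0"  # first bytes of a typical JPEG stream

try:
    jpeg_header.decode()  # what decode_image effectively did
except UnicodeDecodeError as e:
    print(e)  # "'utf-8' codec can't decode byte 0xff in position 0 ..."

def decode_field(data: bytes):
    """Hypothetical robust loader: branch on the payload type."""
    if data[:2] == b"\xff\xd8":     # JPEG magic number
        return io.BytesIO(data)     # hand raw bytes to an image decoder
    return data.decode("utf-8")     # fall back to text decoding
```

A real fix would route the `BytesIO` buffer through an image library such as Pillow; the point is simply that binary image payloads must never go through `bytes.decode()`.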
Thanks for the feedback; the image-decoding bug has been fixed.