Tasks: Image Segmentation · Modalities: Image · Formats: parquet · Sub-tasks: semantic-segmentation · Size: 1K–10K
How to compress images in parquet?
#3 by Divelix · opened
Hi, thank you for open-sourcing this dataset! When I tried to extract images from your parquet file, I was amazed by the decompression ratio: the parquet is 324 MB, while the output PNG images took around 2.4 GB of disk space!
I found your discussion on the forum (https://discuss.huggingface.co/t/image-dataset-best-practices/13974), but when I tried the dataset.map()
trick, I could only get a parquet file the same size as the original data. Can you please explain how you managed to achieve such a high compression rate?
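One plausible explanation (an assumption on my part, not confirmed in this thread): the parquet may store each image's lossy-compressed bytes (e.g. JPEG), while the extracted files are lossless PNGs, which are typically much larger for photographic content. A quick Pillow sketch on a synthetic image illustrates the size gap:

```python
import io
import random

from PIL import Image  # Pillow

# Synthetic noisy RGB image standing in for a dataset sample (hypothetical data;
# noise is close to the worst case for lossless PNG compression).
random.seed(0)
w, h = 256, 256
img = Image.new("RGB", (w, h))
img.putdata([(random.randrange(256), random.randrange(256), random.randrange(256))
             for _ in range(w * h)])

jpeg_buf, png_buf = io.BytesIO(), io.BytesIO()
img.save(jpeg_buf, format="JPEG", quality=75)  # lossy encoding
img.save(png_buf, format="PNG")                # lossless encoding

print(f"JPEG: {jpeg_buf.tell()} bytes, PNG: {png_buf.tell()} bytes")
```

If that is what is happening here, the 324 MB → 2.4 GB gap would come from the PNG re-encode, not from parquet itself.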
@Divelix how are you extracting this parquet file? Please share any relevant code and the modules you used. Thank you!
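For reference, here is a minimal extraction sketch, assuming the common Hugging Face parquet layout where the image column is a struct holding the encoded bytes under a `"bytes"` key. The column name, output filenames, and layout are assumptions; adjust to the actual schema of this dataset:

```python
import os

import pandas as pd


def extract_images(df: pd.DataFrame, out_dir: str, column: str = "image") -> int:
    """Write each row's embedded image bytes to a file; returns the count written.

    Assumes each cell in `column` is a dict-like struct with the raw encoded
    image bytes under the "bytes" key (a common, but not universal, convention).
    """
    os.makedirs(out_dir, exist_ok=True)
    n = 0
    for i, cell in enumerate(df[column]):
        with open(os.path.join(out_dir, f"img_{i}.png"), "wb") as f:
            f.write(cell["bytes"])
        n += 1
    return n


# Typical use (path is a placeholder; reading parquet requires pyarrow or fastparquet):
# df = pd.read_parquet("data.parquet")
# extract_images(df, "extracted")
```

Writing the bytes out verbatim like this keeps whatever encoding the parquet stores; re-saving through an image library instead would re-encode and can inflate the size.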