pierreguillou committed · Commit 2c46936 · Parent: a617bea · Update README.md

README.md CHANGED
@@ -92,7 +92,7 @@ Citation of the page 3 of the [DocLayNet paper](https://arxiv.org/abs/2206.01062
 
 The size of the DocLayNet large is about 100% of the DocLayNet dataset (random selection respectively in the train, val and test files).
 
-**WARNING** The following code allows to download DocLayNet large but it can not run until the end in Google Colab because of the size needed to store cache data and the CPU RAM to download the data (for example, the cache data in /home/ubuntu/.cache/huggingface/datasets/ needs almost 120 GB during the downloading process).
+**WARNING** The following code allows you to download DocLayNet large, but it cannot run to completion in Google Colab because of the disk space needed to store cache data and the CPU RAM needed to download the data (for example, the cache data in /home/ubuntu/.cache/huggingface/datasets/ needs almost 120 GB during the download process). And even with a suitable instance, downloading the DocLayNet large dataset takes around 1h50.
 
 ```
 # !pip install -q datasets
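Since the WARNING notes that the download fills almost 120 GB of cache under /home/ubuntu/.cache/huggingface/datasets/, a minimal sketch of a pre-flight disk check before calling the `datasets` library could look like the following. The 120 GB figure comes from the README; the helper name `has_enough_disk` and the commented-out dataset id are assumptions for illustration, not part of the original code.

```python
# Sketch: check free disk space before attempting the ~120 GB
# DocLayNet large download with the Hugging Face `datasets` library.
import os
import shutil

REQUIRED_GB = 120  # approximate cache size reported in the README


def has_enough_disk(path: str = os.path.expanduser("~"),
                    required_gb: int = REQUIRED_GB) -> bool:
    """Return True if the filesystem holding `path` has at least
    `required_gb` GB free (shutil.disk_usage reports bytes)."""
    free_bytes = shutil.disk_usage(path).free
    return free_bytes >= required_gb * 1024 ** 3


if has_enough_disk():
    # Only start the long download when the cache can actually hold it.
    # from datasets import load_dataset
    # dataset = load_dataset("pierreguillou/DocLayNet-large")  # assumed id
    print("Enough disk space: safe to start the download.")
else:
    print(f"Less than {REQUIRED_GB} GB free: the download would likely fail.")
```

Guarding the call this way avoids the Colab failure mode the WARNING describes, where the process dies partway through after hours of downloading.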