---
license: other
dataset_info:
  config_name: ko
  features:
  - name: source
    dtype: string
  - name: id
    dtype: string
  - name: text
    dtype: string
  - name: added
    dtype: string
  - name: timestamp
    dtype: timestamp[s]
  - name: metadata
    struct:
    - name: url
      dtype: string
  - name: lang
    struct:
    - name: ko.tfrecord
      dtype: float64
  splits:
  - name: train
    num_bytes: 151177516676
    num_examples: 24035493
  download_size: 16185376673
  dataset_size: 151177516676
configs:
- config_name: ko
  data_files:
  - split: train
    path: ko/train-*
---
mC4, but in an HPC-friendly Parquet format (32 GiB shards).
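A minimal loading sketch (the repo ID below is a placeholder; the `ko` config name and `train` split come from the YAML header above). Streaming reads the Parquet shards lazily instead of materializing the full ~151 GB dataset locally:

```python
from datasets import load_dataset

# Placeholder repo ID; substitute the actual Hub path of this dataset.
REPO_ID = "<user>/<this-dataset>"

# Stream the "ko" config so the 32 GiB Parquet shards are read lazily
# rather than downloading the ~16 GB of compressed data up front.
ds = load_dataset(REPO_ID, "ko", split="train", streaming=True)

# Peek at the first example; field names follow the schema in the YAML header.
for example in ds:
    print(example["source"], example["id"])
    print(example["text"][:200])
    break
```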
Attribution, license, and copyright info: [Google](https://www.tensorflow.org/datasets/catalog/c4) and [AI^2](https://huggingface.co/datasets/allenai/c4) for producing and uploading the data.