# Wiki Sentences

A dataset of all English sentences in Wikipedia.

Taken from the Optimus project: https://github.com/ChunyuanLI/Optimus/blob/master/download_datasets.md

The dataset is 11.8 GB, so it is best loaded in streaming mode:

```python
from datasets import load_dataset
dataset = load_dataset("Fraser/wiki_sentences", split='train', streaming=True)
```