---
language:
- en
size_categories:
- 1B<n<10B
---
A large dataset of LA Times articles spanning over a century (1914-2024). In total, there are 3.6M full-text articles comprising 12B characters, which comes out to 2.6B tokens under the Llama-3 tokenizer.
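As a quick sanity check on the figures above, the stated character and token counts imply roughly 4.6 characters per Llama-3 token, which is typical for English prose. A minimal sketch of that arithmetic (all values taken from this card):

```python
# Corpus statistics as stated on this card.
num_articles = 3_600_000        # 3.6M full-text articles
num_chars = 12_000_000_000      # 12B characters
num_tokens = 2_600_000_000      # 2.6B Llama-3 tokens

# Derived ratios: compression of the tokenizer and average article length.
chars_per_token = num_chars / num_tokens
chars_per_article = num_chars / num_articles

print(f"{chars_per_token:.2f} characters per token")
print(f"{chars_per_article:,.0f} characters per article")
```

The ~4.6 chars/token ratio suggests the token count is consistent with the character count, since English text typically tokenizes at 4-5 characters per token.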