Dataset: Q-bert/tokenized-wikipedia
Modalities: Text
Formats: parquet
Size: 100K - 1M
Libraries: Datasets, Dask, Croissant + 1
License: mit
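Since the card lists the Hugging Face Datasets library and parquet files, the dataset can presumably be loaded directly by its repository id. A minimal sketch, assuming the default configuration and split names resolve automatically (check the dataset viewer for the actual column layout):

from datasets import load_dataset

# Load the parquet-backed dataset by its Hub repository id (assumed default config).
ds = load_dataset("Q-bert/tokenized-wikipedia")

# Inspect the available splits and features before iterating.
print(ds)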
Commit History (refs/convert/parquet)
Delete old duckdb index files · 3cf1b3f (verified) · parquet-converter committed on Mar 7
Update duckdb index files · caa0d8c · parquet-converter committed on Nov 29, 2023
Update parquet files · c67e8ef · parquet-converter committed on Nov 29, 2023
initial commit · fb298da · Q-bert committed on Nov 29, 2023