Datasets:
Upload reddit-3.8million-corpus-webis-tldr-17.parquet
Hi @KnutJaegersberg , thanks for your contribution.
It would be interesting if you could give more context about your contribution: why you made it, where you got this data file format, whether you are planning to propose additional modifications/contributions, etc.
Please also note that at Hugging Face we are in the process of providing data in Parquet format for all datasets on the Hub: it will be accessible from the Git reference https://huggingface.co/datasets/reddit/tree/refs%2Fconvert%2Fparquet
Hi @albertvillanova,
Context: I was looking to sample some data from a particular subreddit, so I grabbed this Reddit dataset to filter for the contents of that subreddit.
I downloaded the JSON from the link in the Python file and loaded it with jsonlite as a dataframe in R.
I thought others might want to work with the data in tabular form, too, so I uploaded the Parquet file I generated for myself.
Good to hear you plan to give access to the data in Parquet files. Have a nice day!
Should I delete this branch, given the changed Reddit terms of service? I don't know how.