Datasets:
Deduped version of fineweb on HuggingFace yields "This dataset has 218 files that have been marked as unsafe."
I'm in the process of publishing a deduped version of the fineweb dataset: no additions, just strict removal of duplicated rows. After uploading most of the files I get "This dataset has 218 files that have been marked as unsafe." Why do I have this problem while this deduplicated version of fineweb-edu does not? I think it's unlikely that the cause is problematic rows present in fineweb but not in fineweb-edu; many of my files are affected, and I would expect at least one or two such rows to have slipped into fineweb-edu as well.
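For reference, the dedup step described above (strict removal of duplicate rows, nothing added) can be sketched with a per-row content hash. This is a minimal illustration, not the exact pipeline used; the `text` column name is an assumption:

```python
import hashlib

def dedup_rows(rows, key="text"):
    """Keep only the first occurrence of each distinct value in `key`."""
    seen = set()
    kept = []
    for row in rows:
        # Hash the row content so we don't hold full texts in memory.
        digest = hashlib.sha256(row[key].encode("utf-8")).digest()
        if digest not in seen:
            seen.add(digest)
            kept.append(row)
    return kept

rows = [{"text": "hello"}, {"text": "world"}, {"text": "hello"}]
print(len(dedup_rows(rows)))  # → 2
```

For a dataset the size of fineweb this would run shard by shard with the `seen` set shared across shards.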
I uploaded this data using the Hugging Face datasets push_to_hub
method. That should ensure the uploaded data is formatted the way HF expects. FWIW, I doubt the flag has anything to do with the actual text in the dataset rows. Here's HF's page on their malware scanning: https://huggingface.co/docs/hub/security-malware
If all that doesn't help, maybe ask the HF folks on their Discord?