saheedniyi committed
Commit f9668aa • Parent(s): f4c2c3e
Update README.md
README.md CHANGED
@@ -82,10 +82,9 @@ The dataset was collected from **Nairaland.com**, extracting **1,795,908 unique
 The cleaning process was conducted using **[Datatrove](https://github.com/huggingface/datatrove)**, the same library employed in cleaning the **[FineWeb](https://huggingface.co/datasets/HuggingFaceFW/fineweb)** dataset, which is known for its high quality. The data cleaning process involved multiple stages of deduplication, filtering, and normalization to ensure the dataset's quality matches that of other high-performing datasets.
 
 ### Data Cleaning Procedure:
-- **
-- **
-- **
-- **Step 4:** Language detection and correction based on predicted language probabilities.
+- **URL Filtering**
+- **Repetition and Quality Filtering**
+- **Personally Identifiable Information (PII) Removal**
 
 ## Example Entry
 
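For readers unfamiliar with Datatrove, the three cleaning steps listed above (URL filtering, repetition and quality filtering, PII removal) map onto standard Datatrove pipeline blocks. Below is a minimal sketch of how such a pipeline could be assembled; the folder paths, task count, and default filter thresholds are illustrative assumptions, not the configuration actually used for this dataset.

```python
# A minimal Datatrove sketch of the three cleaning steps listed above.
# Folder paths, task count, and the filters' default thresholds are
# illustrative assumptions, not the dataset's actual configuration.
from datatrove.executor import LocalPipelineExecutor
from datatrove.pipeline.readers import JsonlReader
from datatrove.pipeline.filters import (
    URLFilter,               # URL filtering: drop documents from blocklisted URLs/domains
    GopherRepetitionFilter,  # repetition filtering: drop documents with excessive repeated lines/n-grams
    GopherQualityFilter,     # quality filtering: drop documents failing simple heuristics
)
from datatrove.pipeline.formatters import PIIFormatter  # PII removal: anonymise emails and IP addresses
from datatrove.pipeline.writers.jsonl import JsonlWriter

pipeline = [
    JsonlReader("data/nairaland_raw/"),    # hypothetical folder of scraped posts in JSONL form
    URLFilter(),
    GopherRepetitionFilter(),
    GopherQualityFilter(),
    PIIFormatter(),
    JsonlWriter("data/nairaland_clean/"),  # hypothetical output folder for the cleaned documents
]

if __name__ == "__main__":
    LocalPipelineExecutor(pipeline=pipeline, tasks=4, logging_dir="logs/").run()
```

Deduplication, which the paragraph above also mentions, is normally run as a separate multi-stage Minhash pipeline in Datatrove and is omitted from this sketch.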