saheedniyi committed
Commit f9668aa
1 Parent(s): f4c2c3e

Update README.md

Files changed (1): README.md +3 -4
README.md CHANGED
@@ -82,10 +82,9 @@ The dataset was collected from **Nairaland.com**, extracting **1,795,908 unique
  The cleaning process was conducted using **[Datatrove](https://github.com/huggingface/datatrove)**, the same library used to clean the high-quality **[FineWeb](https://huggingface.co/datasets/HuggingFaceFW/fineweb)** dataset. Cleaning involved multiple stages of deduplication, filtering, and normalization to bring the dataset's quality in line with other high-performing datasets.
 
  ### Data Cleaning Procedure:
- - **Step 1:** Remove duplicate posts and links.
- - **Step 2:** Filter out non-Nigerian context posts based on domain analysis.
- - **Step 3:** Normalize textual content, removing HTML artifacts and irrelevant metadata.
- - **Step 4:** Language detection and correction based on predicted language probabilities.
+ - **URL Filtering**
+ - **Repetition and Quality Filtering**
+ - **Personally Identifiable Information (PII) Removal**
 
  ## Example Entry
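
For context on the renamed steps: all three correspond to stock [Datatrove](https://github.com/huggingface/datatrove) pipeline blocks. The sketch below shows one plausible wiring of such a pipeline; the input/output paths and task count are illustrative assumptions, not the configuration actually used for this dataset.

```python
# Minimal Datatrove pipeline sketch covering the three steps named in the diff above:
# URL filtering, repetition/quality filtering, and PII removal.
# "data/raw/" and "data/cleaned/" are hypothetical paths.
from datatrove.executor import LocalPipelineExecutor
from datatrove.pipeline.filters import (
    GopherQualityFilter,
    GopherRepetitionFilter,
    URLFilter,
)
from datatrove.pipeline.formatters import PIIFormatter
from datatrove.pipeline.readers import JsonlReader
from datatrove.pipeline.writers import JsonlWriter

executor = LocalPipelineExecutor(
    pipeline=[
        JsonlReader("data/raw/"),      # read one JSONL document per post
        URLFilter(),                   # drop documents whose source URL is blocklisted
        GopherRepetitionFilter(),      # drop documents dominated by repeated lines/n-grams
        GopherQualityFilter(),         # drop documents failing Gopher-style quality heuristics
        PIIFormatter(),                # anonymize emails and public IP addresses in the text
        JsonlWriter("data/cleaned/"),  # write the surviving, scrubbed documents
    ],
    tasks=4,  # illustrative parallelism, not the actual setting
)
executor.run()
```

Deduplication, which the cleaning paragraph also mentions, is normally run in Datatrove as a separate multi-stage Minhash pipeline rather than as a single block in the chain above.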