---
license: mit
datasets:
- ethical-spectacle/biased-corpus
language:
- en
metrics:
- f1
- precision
- recall
library_name: transformers
co2_eq_emissions:
  emissions: 10
  source: Code Carbon
  training_type: fine-tuning
  geographical_location: Albany, New York
  hardware_used: T4
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- Social Bias
---

## How to Use
```python
from transformers import pipeline

# Pass return_all_scores=True to get a score for every label (multi-label).
classifier = pipeline("text-classification", model="maximuspowers/bias-type-classifier")
result = classifier("Tall people are so clumsy")

# Example result:
# [
#   {
#     "label": "physical",
#     "score": 0.9972801208496094
#   }
# ]
```
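When `return_all_scores=True` is set, the pipeline returns one score per label. A minimal sketch of turning those scores into a set of predicted bias types (the label names and scores below are illustrative, and the 0.5 threshold is an assumption, not part of the model card):

```python
# Hypothetical multi-label output: one dict per label with a confidence score.
all_scores = [
    {"label": "physical", "score": 0.9973},
    {"label": "gender", "score": 0.0312},
    {"label": "age", "score": 0.6120},
]

def predicted_labels(scores, threshold=0.5):
    """Keep every label whose score clears the threshold (multi-label)."""
    return [s["label"] for s in scores if s["score"] >= threshold]

print(predicted_labels(all_scores))  # -> ['physical', 'age']
```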

This model was trained on a [synthetic dataset](https://huggingface.co/datasets/ethical-spectacle/biased-corpus) of biased statements and questions, generated by Mistral 7B as part of the [GUS-Net paper](https://arxiv.org/abs/2410.08388).

### Model Performance:
| Label           | F1 Score | Precision | Recall |
|-----------------|----------|-----------|--------|
| **Macro Average** | **0.8998** | **0.9213**  | **0.8807** |
| racial          | 0.8613   | 0.9262    | 0.8049 |
| religious       | 0.9655   | 0.9716    | 0.9595 |
| gender          | 0.9160   | 0.9099    | 0.9223 |
| age             | 0.9185   | 0.9683    | 0.8737 |
| nationality     | 0.9083   | 0.9053    | 0.9113 |
| sexuality       | 0.9304   | 0.9484    | 0.9131 |
| socioeconomic   | 0.8273   | 0.8727    | 0.7864 |
| educational     | 0.8791   | 0.9091    | 0.8511 |
| disability      | 0.8713   | 0.8762    | 0.8665 |
| political       | 0.9127   | 0.8914    | 0.9351 |
| physical        | 0.9069   | 0.9547    | 0.8635 |
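The macro average in the first row is the unweighted mean of the eleven per-label scores, which can be checked directly from the F1 column:

```python
# Per-label F1 scores, copied from the table above.
f1_scores = [0.8613, 0.9655, 0.9160, 0.9185, 0.9083, 0.9304,
             0.8273, 0.8791, 0.8713, 0.9127, 0.9069]

# Macro average: unweighted mean across labels.
macro_f1 = sum(f1_scores) / len(f1_scores)
print(round(macro_f1, 4))  # -> 0.8998
```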

### Training Params:
- **Learning Rate:** 5e-5
- **Batch Size:** 16
- **Epochs:** 3
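These hyperparameters map directly onto a `transformers` `TrainingArguments` config. A sketch of that config fragment, using only the values stated above (the `output_dir` name is illustrative, and all other arguments are left at their defaults as an assumption):

```python
from transformers import TrainingArguments

# Hyperparameters from this card; everything else is a default.
args = TrainingArguments(
    output_dir="bias-type-classifier",  # illustrative name
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    num_train_epochs=3,
)
```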