Michielo committed (verified) · commit bcc8c98 · 1 parent: d355084

Updating stylization

Files changed (1): README.md (+14 −13)
README.md CHANGED
@@ -11,19 +11,6 @@ A tiny comment toxicity classifier model at only 2M parameters. With only ~10MB
 A paper on this model is being released soon.
 
 
-## Benchmarks
-
-The Tiny-Toxic-Detector achieves an impressive 90.26% on the Toxigen benchmark and 87.34% on the Jigsaw-Toxic-Comment-Classification-Challenge. Here we compare our results against other toxic classification models:
-
-
-| Model                             | Size (parameters) | Toxigen (%) | Jigsaw (%) |
-| --------------------------------- | ----------------- | ----------- | ---------- |
-| lmsys/toxicchat-t5-large-v1.0     | 738M              | 72.67       | 88.82      |
-| s-nlp/roberta toxicity classifier | 124M              | 88.41       | **94.92**  |
-| mohsenfayyaz/toxicity-classifier  | 109M              | 81.50       | 83.31      |
-| martin-ha/toxic-comment-model     | 67M               | 68.02       | 91.56      |
-| **Tiny-toxic-detector**           | **2M**            | **90.97**   | 86.98      |
-
 
 ## Usage
 This model uses custom architecture and requires some extra custom code to work. Below you can find the architecture and a fully-usable example.
@@ -204,6 +191,20 @@ with torch.no_grad():
     prediction = "Toxic" if logits > 0.5 else "Not Toxic"
 ```
 
+## Benchmarks
+
+The Tiny-Toxic-Detector achieves an impressive 90.26% on the Toxigen benchmark and 87.34% on the Jigsaw-Toxic-Comment-Classification-Challenge. Here we compare our results against other toxic classification models:
+
+
+| Model                             | Size (parameters) | Toxigen (%) | Jigsaw (%) |
+| --------------------------------- | ----------------- | ----------- | ---------- |
+| lmsys/toxicchat-t5-large-v1.0     | 738M              | 72.67       | 88.82      |
+| s-nlp/roberta toxicity classifier | 124M              | 88.41       | **94.92**  |
+| mohsenfayyaz/toxicity-classifier  | 109M              | 81.50       | 83.31      |
+| martin-ha/toxic-comment-model     | 67M               | 68.02       | 91.56      |
+| **Tiny-toxic-detector**           | **2M**            | **90.97**   | 86.98      |
+
+
 
 ## Usage and Limitations
 
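For context on the hunk above: the README's example thresholds the model output at 0.5 to produce a label. Below is a minimal, self-contained sketch of that step. `DummyToxicClassifier` and the random token ids are hypothetical stand-ins for the custom architecture and tokenizer defined in the full README (neither is part of this diff), and the comparison against 0.5 suggests the output is a sigmoid probability despite the variable name `logits`.

```python
import torch
import torch.nn as nn

class DummyToxicClassifier(nn.Module):
    """Hypothetical stand-in for the README's custom architecture."""

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        # Emit one value in [0, 1] per sequence; the README's
        # `logits > 0.5` check implies a sigmoid-style output.
        return torch.sigmoid(torch.randn(input_ids.shape[0]))

model = DummyToxicClassifier().eval()
input_ids = torch.randint(0, 1000, (1, 16))  # placeholder token ids

# Mirrors the inference step shown in the hunk context above.
with torch.no_grad():
    logits = model(input_ids)

prediction = "Toxic" if logits > 0.5 else "Not Toxic"
print(prediction)
```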