Toxicity Classification Model
This model is fine-tuned for toxicity classification. The training data is the toxic comments dataset released by Jigsaw (Jigsaw 2020). We split it into two parts and fine-tune a DistilBERT base (uncased) model on it. DistilBERT is a distilled version of the BERT base model, introduced by Sanh et al. (2019).
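For reference, a minimal fine-tuning sketch is shown below. The dataset files, column names, and hyperparameters are illustrative assumptions, not the exact setup used for this model.

from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumed layout: two CSV splits with "text" and "label" (0 = non-toxic, 1 = toxic) columns.
dataset = load_dataset("csv", data_files={"train": "train.csv", "test": "test.csv"})

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

def tokenize(batch):
    # Truncate only; the Trainer's default collator pads each batch dynamically.
    return tokenizer(batch["text"], truncation=True)

dataset = dataset.map(tokenize, batched=True)

# Hyperparameters here are placeholders, not the values used to train this checkpoint.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="distilbert-toxicity", num_train_epochs=2, per_device_train_batch_size=16),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,
)
trainer.train()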
How to use
from transformers import pipeline

# Load the fine-tuned toxicity classifier from the Hugging Face Hub.
classifier = pipeline("text-classification", model="tensor-trek/distilbert-toxicity-classifier")

text = "This was a masterpiece. Not completely faithful to the books, but enthralling from beginning to end. Might be my favorite of the three."
print(classifier(text))  # a list with the predicted label and its confidence score
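The pipeline returns a list with one dictionary per input, containing a predicted label and a confidence score. The label strings come from the model's configuration; the output below is illustrative, not a recorded run:

[{'label': 'NEUTRAL', 'score': 0.99}]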