---
language: en
tags:
- distilroberta
- sentiment
- emotion
- twitter
- reddit
widget:
- text: "Oh my God, he's lost it. He's totally lost it."
- text: "What?"
- text: "Wow, congratulations! So excited for you!"
---
# Fine-tuned DistilRoBERTa-base for Emotion Classification 🤬🤢😀😐😭😲
## Model Description
DistilRoBERTa-base is a transformer model for text classification. This checkpoint was fine-tuned on transcripts from the Friends show to classify emotions in dialogue and other text data. It predicts Ekman's six basic emotions plus a neutral class, for seven labels in total: anger, disgust, fear, joy, neutral, sadness, and surprise.

The model is a fine-tuned version of Emotion English DistilRoBERTa-base, which is itself based on DistilRoBERTa-base. Emotion English DistilRoBERTa-base was initially trained on the following six datasets:
Name | anger | disgust | fear | joy | neutral | sadness | surprise |
---|---|---|---|---|---|---|---|
Crowdflower (2016) | Yes | - | - | Yes | Yes | Yes | Yes |
Emotion Dataset, Elvis et al. (2018) | Yes | - | Yes | Yes | - | Yes | Yes |
GoEmotions, Demszky et al. (2020) | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
ISEAR, Vikash (2018) | Yes | Yes | Yes | Yes | - | Yes | - |
MELD, Poria et al. (2019) | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
SemEval-2018, EI-reg, Mohammad et al. (2018) | Yes | - | Yes | Yes | - | Yes | - |
It was fine-tuned on:
Name | anger | disgust | fear | joy | neutral | sadness | surprise |
---|---|---|---|---|---|---|---|
Crowdflower (2016) | Yes | - | - | Yes | Yes | Yes | Yes |
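
The seven labels above map one-to-one onto the model's classification head. As a quick sanity check (a minimal sketch, not part of the original card), the label set can be read from the checkpoint's config before running any inference:

```python
from transformers import AutoConfig

# Load only the configuration and inspect the label mapping of the
# classification head; no model weights are downloaded for this check.
config = AutoConfig.from_pretrained("michellejieli/emotion_text_classifier")
print(config.num_labels)  # expected: 7
print(config.id2label)    # expected to cover anger, disgust, fear, joy, neutral, sadness, surprise
```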
## How to Use
```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis", model="michellejieli/emotion_text_classifier")
classifier("I love this!")
```

Output:

```python
[{'label': 'joy', 'score': 0.9887555241584778}]
```
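
The pipeline call above returns only the highest-scoring label. If you want the full distribution over all seven classes, recent transformers releases accept `top_k=None` on the text-classification pipeline (older releases used the now-deprecated `return_all_scores=True`). The snippet below is a small sketch of that variation, reusing an example sentence from the widget above:

```python
from transformers import pipeline

# top_k=None asks the pipeline to return a score for every label
# instead of only the single best one.
classifier = pipeline(
    "sentiment-analysis",
    model="michellejieli/emotion_text_classifier",
    top_k=None,
)

scores = classifier("Oh my God, he's lost it. He's totally lost it.")
print(scores)  # one score entry per label: anger, disgust, fear, joy, neutral, sadness, surprise
```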
## Contact
Please reach out to michelle.li851@duke.edu if you have any questions or feedback.
## Reference
Jochen Hartmann, "Emotion English DistilRoBERTa-base", https://huggingface.co/j-hartmann/emotion-english-distilroberta-base/, 2022.

Ashritha R. Murthy and K. M. Anil Kumar, 2021, IOP Conf. Ser.: Mater. Sci. Eng., 1110, 012009.