---
language:
- en
tags:
- text-classification
- emotion
- pytorch
license: mit
datasets:
- emotion
metrics:
- accuracy
- precision
- recall
- f1
---
|
|
|
# bert-base-uncased-emotion
|
|
|
## Model description
|
|
|
`bert-base-uncased` fine-tuned on the unify-emotion-datasets (https://github.com/sarnthil/unify-emotion-datasets), then further fine-tuned on a small sample of 10K hand-tagged StockTwits messages. The model is optimized for extracting emotions from financial contexts.
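A minimal inference sketch with the Transformers `pipeline` API is shown below. The Hub model id is a placeholder assumption (the card does not state one), and the example message is invented for illustration; substitute the actual repository name before running.

```python
from transformers import pipeline

# Placeholder model id -- replace with the actual Hub repository name.
classifier = pipeline(
    "text-classification",
    model="bert-base-uncased-emotion",
    top_k=None,  # return a score for every emotion label, not just the top one
)

# Hypothetical StockTwits-style message.
print(classifier("am I ever recovering from these losses? $AAPL"))
```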
|
|
|
Training used a maximum sequence length of 64, a learning rate of 2e-5, a batch size of 128, and 8 epochs.
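For reference, the reported hyperparameters can be collected into a small config sketch. The `steps_per_epoch` helper is illustrative only (it is not from EmTract) and assumes simple ceiling-division batching without dropping the last partial batch:

```python
# Hyperparameters reported in the card.
config = {
    "max_seq_length": 64,
    "learning_rate": 2e-5,
    "batch_size": 128,
    "epochs": 8,
}

def steps_per_epoch(num_examples: int, batch_size: int) -> int:
    """Optimizer steps per epoch, counting the final partial batch."""
    return -(-num_examples // batch_size)  # ceiling division

# With the 10K hand-tagged StockTwits sample mentioned above:
print(steps_per_epoch(10_000, config["batch_size"]))  # 79 steps per epoch
```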
|
|
|
For more details, please visit https://github.com/dvamossy/EmTract.
|
|
|
## Training data
|
|
|
Data came from https://github.com/sarnthil/unify-emotion-datasets.