# nickwong64/bert-base-uncased-finance-sentiment
BERT is a bidirectional Transformer encoder pre-trained with a masked language modeling (MLM) objective. This model is bert-base-uncased fine-tuned on the cyrilzhang/financial_phrasebank_split dataset using the Hugging Face `Trainer` with the following training parameters:

- learning_rate: 2e-5
- batch_size: 8
- num_train_epochs: 6
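A minimal sketch of how a fine-tune with these hyperparameters could be set up. The dataset column name (`sentence`), split names, and evaluation wiring are assumptions, not taken from this card:

```python
# Hedged sketch: reproducing this fine-tune with the stated hyperparameters.
# Column/split names below are assumptions about the dataset's schema.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import load_dataset

dataset = load_dataset('cyrilzhang/financial_phrasebank_split')
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')

def tokenize(batch):
    # 'sentence' is the usual Financial PhraseBank text column (assumed here)
    return tokenizer(batch['sentence'], truncation=True)

tokenized = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    'bert-base-uncased', num_labels=3)

args = TrainingArguments(
    output_dir='finance-sentiment',
    learning_rate=2e-5,               # from the card
    per_device_train_batch_size=8,    # from the card
    num_train_epochs=6,               # from the card
    evaluation_strategy='epoch',
)
trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized['train'],
                  eval_dataset=tokenized['validation'],
                  tokenizer=tokenizer)
trainer.train()
```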
## Model Performance

| Epoch | Training Loss | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|
| 6 | 0.034100 | 0.954745 | 0.853608 | 0.854358 |
## How to Use the Model

```python
from transformers import pipeline

nlp = pipeline(task='text-classification',
               model='nickwong64/bert-base-uncased-finance-sentiment')

p1 = "HK stocks open lower after Fed rate comments"
p2 = "US stocks end lower on earnings worries"
p3 = "Muted Fed, AI hopes send Wall Street higher"

print(nlp(p1))
print(nlp(p2))
print(nlp(p3))

"""
output:
[{'label': 'negative', 'score': 0.9991507530212402}]
[{'label': 'negative', 'score': 0.9997240900993347}]
[{'label': 'neutral', 'score': 0.9834381937980652}]
"""
```
## Dataset

cyrilzhang/financial_phrasebank_split

## Labels

{0: 'negative', 1: 'neutral', 2: 'positive'}
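For illustration only, here is how this id-to-label mapping turns raw classifier logits into a pipeline-style prediction. The logit values are made up, not real model output:

```python
import math

# Illustrative only: apply the card's id-to-label mapping to raw logits.
id2label = {0: 'negative', 1: 'neutral', 2: 'positive'}

logits = [3.2, -0.5, -1.1]  # hypothetical logits for one sentence
exps = [math.exp(x) for x in logits]
scores = [e / sum(exps) for e in exps]  # softmax over the three classes

best = max(range(len(scores)), key=scores.__getitem__)
prediction = {'label': id2label[best], 'score': scores[best]}
print(prediction)  # label 'negative' for these made-up logits
```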
## Evaluation

```python
{'test_loss': 0.9547446370124817,
 'test_accuracy': 0.8536082474226804,
 'test_f1': 0.8543579048224414,
 'test_runtime': 4.9865,
 'test_samples_per_second': 97.263,
 'test_steps_per_second': 12.233}
```
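As a sketch of what `test_accuracy` and a weighted `test_f1` measure, the two metrics can be computed by hand on toy labels (in practice `sklearn` or `evaluate` would be used; the toy data below is invented):

```python
# Sketch: accuracy and weighted F1 computed from scratch on toy labels.
def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred):
    # Per-class F1, averaged with weights proportional to class support.
    total = 0.0
    for c in sorted(set(y_true)):
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += f1 * y_true.count(c) / len(y_true)
    return total

y_true = [0, 0, 1, 1, 2, 2]  # toy gold labels
y_pred = [0, 1, 1, 1, 2, 0]  # toy predictions
print(accuracy(y_true, y_pred))    # 4 of 6 correct
print(weighted_f1(y_true, y_pred))
```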