---
base_model: ProsusAI/finbert
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: finBERT_sentiment_analysis_20e
  results: []
---

# finBERT_sentiment_analysis_20e

This model is a fine-tuned version of [ProsusAI/finbert](https://huggingface.co/ProsusAI/finbert) on a dataset that is not documented in this card.
It achieves the following results on the evaluation set:
- Loss: 0.8337
- Accuracy: 0.9040
- F1: 0.9040
- Precision: 0.9038
- Recall: 0.9044
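
The accuracy, F1, precision, and recall above follow the usual `Trainer` `compute_metrics` pattern. The sketch below shows one way such metrics can be computed with scikit-learn; the `"weighted"` averaging mode is an assumption and may differ from the setting used for this run.

```python
# Hypothetical compute_metrics function for a Hugging Face Trainer.
# The "weighted" averaging is an assumption; the original run may have
# used a different setting (e.g. "macro").
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```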

## Model description

FinBERT ([ProsusAI/finbert](https://huggingface.co/ProsusAI/finbert)) is a BERT model further pre-trained on financial text and tuned for financial sentiment classification. This checkpoint continues from that base and was fine-tuned for 20 epochs (the `_20e` suffix) on an additional sentiment classification dataset that is not documented here.

## Intended uses & limitations

More information needed
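
Pending fuller documentation, a minimal inference sketch is shown below. The repository id is a placeholder assumption, not the confirmed location of this checkpoint, and the label names are whatever the fine-tuned config carries (the ProsusAI/finbert base uses positive / negative / neutral).

```python
# Minimal inference sketch. "your-username/finBERT_sentiment_analysis_20e"
# is a placeholder repository id, not the confirmed location of this model.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="your-username/finBERT_sentiment_analysis_20e",
)

print(classifier("Quarterly revenue beat expectations, sending shares higher."))
# Expected output shape: [{'label': ..., 'score': ...}]
```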

## Training and evaluation data

The identity of the training and evaluation datasets is not documented. From the training log, one epoch corresponds to roughly 378 optimization steps at a train batch size of 64, which implies on the order of 24,000 training examples (assuming no gradient accumulation).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 20
- mixed_precision_training: Native AMP
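
The sketch below maps these values onto `TrainingArguments`. The output path and the evaluation/logging cadence are assumptions that merely match the results table, not settings recovered from the original training script; the Trainer's default optimizer already uses the listed betas and epsilon.

```python
# Reproduction sketch of the hyperparameters listed above. The output path
# and evaluation/logging cadence are assumptions, not settings recovered
# from the original run; the default optimizer already uses
# betas=(0.9, 0.999) and epsilon=1e-08.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finBERT_sentiment_analysis_20e",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    fp16=True,              # "Native AMP" mixed-precision training
    eval_strategy="steps",  # the results table logs every 50 steps
    eval_steps=50,
    logging_steps=50,
)
```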

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.2898        | 0.1323  | 50   | 0.5537          | 0.7885   | 0.7864 | 0.7889    | 0.7896 |
| 0.5067        | 0.2646  | 100  | 0.4689          | 0.8270   | 0.8216 | 0.8290    | 0.8281 |
| 0.4212        | 0.3968  | 150  | 0.4237          | 0.8467   | 0.8458 | 0.8475    | 0.8476 |
| 0.3969        | 0.5291  | 200  | 0.4118          | 0.8502   | 0.8477 | 0.8500    | 0.8511 |
| 0.3891        | 0.6614  | 250  | 0.3767          | 0.8545   | 0.8559 | 0.8586    | 0.8546 |
| 0.3927        | 0.7937  | 300  | 0.3542          | 0.8696   | 0.8695 | 0.8692    | 0.8701 |
| 0.378         | 0.9259  | 350  | 0.3563          | 0.8703   | 0.8689 | 0.8694    | 0.8710 |
| 0.3537        | 1.0582  | 400  | 0.3472          | 0.8686   | 0.8696 | 0.8709    | 0.8689 |
| 0.32          | 1.1905  | 450  | 0.3591          | 0.8721   | 0.8710 | 0.8710    | 0.8727 |
| 0.3119        | 1.3228  | 500  | 0.3545          | 0.8709   | 0.8704 | 0.8707    | 0.8714 |
| 0.2992        | 1.4550  | 550  | 0.3378          | 0.8753   | 0.8763 | 0.8777    | 0.8756 |
| 0.3025        | 1.5873  | 600  | 0.3320          | 0.8785   | 0.8779 | 0.8778    | 0.8790 |
| 0.2913        | 1.7196  | 650  | 0.3835          | 0.8729   | 0.8702 | 0.8737    | 0.8738 |
| 0.3103        | 1.8519  | 700  | 0.3321          | 0.8812   | 0.8805 | 0.8804    | 0.8817 |
| 0.2847        | 1.9841  | 750  | 0.3337          | 0.8832   | 0.8836 | 0.8839    | 0.8835 |
| 0.2037        | 2.1164  | 800  | 0.3848          | 0.8809   | 0.8811 | 0.8812    | 0.8812 |
| 0.2199        | 2.2487  | 850  | 0.4087          | 0.8708   | 0.8690 | 0.8720    | 0.8714 |
| 0.2103        | 2.3810  | 900  | 0.3562          | 0.8794   | 0.8784 | 0.8787    | 0.8799 |
| 0.2178        | 2.5132  | 950  | 0.3473          | 0.8848   | 0.8847 | 0.8846    | 0.8853 |
| 0.2067        | 2.6455  | 1000 | 0.3554          | 0.8864   | 0.8863 | 0.8861    | 0.8869 |
| 0.2327        | 2.7778  | 1050 | 0.3667          | 0.8761   | 0.8778 | 0.8826    | 0.8760 |
| 0.2139        | 2.9101  | 1100 | 0.3657          | 0.8813   | 0.8825 | 0.8847    | 0.8814 |
| 0.1998        | 3.0423  | 1150 | 0.3761          | 0.8802   | 0.8815 | 0.8843    | 0.8804 |
| 0.1387        | 3.1746  | 1200 | 0.3918          | 0.8870   | 0.8874 | 0.8876    | 0.8873 |
| 0.1355        | 3.3069  | 1250 | 0.4323          | 0.8810   | 0.8791 | 0.8818    | 0.8818 |
| 0.1546        | 3.4392  | 1300 | 0.4122          | 0.8853   | 0.8842 | 0.8848    | 0.8860 |
| 0.135         | 3.5714  | 1350 | 0.3948          | 0.8862   | 0.8857 | 0.8857    | 0.8867 |
| 0.1827        | 3.7037  | 1400 | 0.3676          | 0.8846   | 0.8852 | 0.8858    | 0.8848 |
| 0.1551        | 3.8360  | 1450 | 0.3910          | 0.8893   | 0.8887 | 0.8888    | 0.8898 |
| 0.1415        | 3.9683  | 1500 | 0.3669          | 0.8913   | 0.8919 | 0.8926    | 0.8916 |
| 0.1122        | 4.1005  | 1550 | 0.4385          | 0.8876   | 0.8882 | 0.8897    | 0.8879 |
| 0.0996        | 4.2328  | 1600 | 0.4151          | 0.8926   | 0.8922 | 0.8921    | 0.8931 |
| 0.1077        | 4.3651  | 1650 | 0.4277          | 0.8902   | 0.8902 | 0.8899    | 0.8906 |
| 0.1207        | 4.4974  | 1700 | 0.4166          | 0.8859   | 0.8852 | 0.8853    | 0.8864 |
| 0.0993        | 4.6296  | 1750 | 0.4141          | 0.8931   | 0.8934 | 0.8934    | 0.8933 |
| 0.1172        | 4.7619  | 1800 | 0.4173          | 0.8929   | 0.8933 | 0.8937    | 0.8931 |
| 0.121         | 4.8942  | 1850 | 0.4067          | 0.8934   | 0.8926 | 0.8928    | 0.8939 |
| 0.1134        | 5.0265  | 1900 | 0.4496          | 0.8936   | 0.8930 | 0.8931    | 0.8942 |
| 0.0677        | 5.1587  | 1950 | 0.4808          | 0.8896   | 0.8901 | 0.8906    | 0.8899 |
| 0.0722        | 5.2910  | 2000 | 0.4848          | 0.8881   | 0.8889 | 0.8903    | 0.8881 |
| 0.0917        | 5.4233  | 2050 | 0.4863          | 0.8927   | 0.8931 | 0.8937    | 0.8928 |
| 0.0871        | 5.5556  | 2100 | 0.4359          | 0.8973   | 0.8969 | 0.8967    | 0.8977 |
| 0.078         | 5.6878  | 2150 | 0.4410          | 0.8926   | 0.8925 | 0.8924    | 0.8931 |
| 0.0761        | 5.8201  | 2200 | 0.4724          | 0.8949   | 0.8945 | 0.8944    | 0.8954 |
| 0.0925        | 5.9524  | 2250 | 0.4932          | 0.8953   | 0.8944 | 0.8950    | 0.8958 |
| 0.076         | 6.0847  | 2300 | 0.5118          | 0.8885   | 0.8885 | 0.8889    | 0.8887 |
| 0.049         | 6.2169  | 2350 | 0.5233          | 0.8928   | 0.8930 | 0.8930    | 0.8930 |
| 0.0628        | 6.3492  | 2400 | 0.5108          | 0.9001   | 0.8998 | 0.8997    | 0.9006 |
| 0.0661        | 6.4815  | 2450 | 0.5096          | 0.8952   | 0.8952 | 0.8950    | 0.8955 |
| 0.0625        | 6.6138  | 2500 | 0.5538          | 0.8917   | 0.8920 | 0.8921    | 0.8919 |
| 0.0689        | 6.7460  | 2550 | 0.5341          | 0.8929   | 0.8931 | 0.8930    | 0.8932 |
| 0.0614        | 6.8783  | 2600 | 0.5080          | 0.8966   | 0.8968 | 0.8968    | 0.8969 |
| 0.0699        | 7.0106  | 2650 | 0.5037          | 0.8987   | 0.8987 | 0.8986    | 0.8991 |
| 0.0527        | 7.1429  | 2700 | 0.5176          | 0.9002   | 0.9002 | 0.9001    | 0.9006 |
| 0.0553        | 7.2751  | 2750 | 0.5412          | 0.8973   | 0.8980 | 0.8988    | 0.8976 |
| 0.0601        | 7.4074  | 2800 | 0.5279          | 0.8905   | 0.8916 | 0.8939    | 0.8906 |
| 0.0519        | 7.5397  | 2850 | 0.5628          | 0.9008   | 0.9006 | 0.9008    | 0.9013 |
| 0.0418        | 7.6720  | 2900 | 0.5653          | 0.8977   | 0.8974 | 0.8973    | 0.8982 |
| 0.0499        | 7.8042  | 2950 | 0.5412          | 0.8970   | 0.8972 | 0.8973    | 0.8974 |
| 0.0424        | 7.9365  | 3000 | 0.5626          | 0.8977   | 0.8969 | 0.8973    | 0.8982 |
| 0.0324        | 8.0688  | 3050 | 0.6073          | 0.9001   | 0.9000 | 0.8999    | 0.9005 |
| 0.0309        | 8.2011  | 3100 | 0.6108          | 0.8982   | 0.8983 | 0.8982    | 0.8984 |
| 0.03          | 8.3333  | 3150 | 0.6021          | 0.8975   | 0.8973 | 0.8971    | 0.8979 |
| 0.0429        | 8.4656  | 3200 | 0.6003          | 0.8953   | 0.8955 | 0.8955    | 0.8956 |
| 0.0455        | 8.5979  | 3250 | 0.6162          | 0.8947   | 0.8953 | 0.8961    | 0.8948 |
| 0.037         | 8.7302  | 3300 | 0.5923          | 0.8957   | 0.8961 | 0.8962    | 0.8959 |
| 0.0462        | 8.8624  | 3350 | 0.5522          | 0.8979   | 0.8977 | 0.8975    | 0.8983 |
| 0.0356        | 8.9947  | 3400 | 0.5926          | 0.9010   | 0.9009 | 0.9011    | 0.9014 |
| 0.0243        | 9.1270  | 3450 | 0.6353          | 0.8972   | 0.8968 | 0.8967    | 0.8976 |
| 0.0341        | 9.2593  | 3500 | 0.6161          | 0.8925   | 0.8931 | 0.8939    | 0.8926 |
| 0.0271        | 9.3915  | 3550 | 0.6381          | 0.9008   | 0.9007 | 0.9006    | 0.9012 |
| 0.0344        | 9.5238  | 3600 | 0.6282          | 0.9001   | 0.9000 | 0.8998    | 0.9005 |
| 0.0236        | 9.6561  | 3650 | 0.7047          | 0.8982   | 0.8968 | 0.8989    | 0.8988 |
| 0.035         | 9.7884  | 3700 | 0.6561          | 0.8975   | 0.8974 | 0.8974    | 0.8979 |
| 0.0308        | 9.9206  | 3750 | 0.6754          | 0.8973   | 0.8968 | 0.8968    | 0.8978 |
| 0.0404        | 10.0529 | 3800 | 0.6452          | 0.8994   | 0.8988 | 0.8989    | 0.8999 |
| 0.0176        | 10.1852 | 3850 | 0.6636          | 0.8993   | 0.8989 | 0.8988    | 0.8998 |
| 0.0233        | 10.3175 | 3900 | 0.6820          | 0.8953   | 0.8948 | 0.8947    | 0.8957 |
| 0.0192        | 10.4497 | 3950 | 0.6954          | 0.8989   | 0.8981 | 0.8986    | 0.8995 |
| 0.0175        | 10.5820 | 4000 | 0.6959          | 0.8985   | 0.8982 | 0.8980    | 0.8989 |
| 0.0339        | 10.7143 | 4050 | 0.6624          | 0.8990   | 0.8993 | 0.8995    | 0.8993 |
| 0.0259        | 10.8466 | 4100 | 0.6787          | 0.8997   | 0.8993 | 0.8994    | 0.9002 |
| 0.0236        | 10.9788 | 4150 | 0.6708          | 0.8987   | 0.8989 | 0.8992    | 0.8990 |
| 0.0175        | 11.1111 | 4200 | 0.6893          | 0.9021   | 0.9021 | 0.9023    | 0.9026 |
| 0.0233        | 11.2434 | 4250 | 0.6769          | 0.8999   | 0.8998 | 0.8996    | 0.9003 |
| 0.0112        | 11.3757 | 4300 | 0.6949          | 0.8990   | 0.8990 | 0.8988    | 0.8993 |
| 0.017         | 11.5079 | 4350 | 0.6952          | 0.9019   | 0.9015 | 0.9015    | 0.9023 |
| 0.0159        | 11.6402 | 4400 | 0.6913          | 0.9031   | 0.9032 | 0.9032    | 0.9035 |
| 0.0214        | 11.7725 | 4450 | 0.7120          | 0.8996   | 0.8988 | 0.8992    | 0.9001 |
| 0.0257        | 11.9048 | 4500 | 0.6963          | 0.9032   | 0.9031 | 0.9031    | 0.9036 |
| 0.0189        | 12.0370 | 4550 | 0.6746          | 0.9032   | 0.9031 | 0.9031    | 0.9036 |
| 0.0138        | 12.1693 | 4600 | 0.7145          | 0.8996   | 0.9001 | 0.9008    | 0.8998 |
| 0.0095        | 12.3016 | 4650 | 0.7094          | 0.9018   | 0.9017 | 0.9017    | 0.9022 |
| 0.0189        | 12.4339 | 4700 | 0.7084          | 0.9000   | 0.9001 | 0.9000    | 0.9002 |
| 0.0159        | 12.5661 | 4750 | 0.7567          | 0.8937   | 0.8941 | 0.8948    | 0.8938 |
| 0.0127        | 12.6984 | 4800 | 0.7099          | 0.9013   | 0.9011 | 0.9009    | 0.9017 |
| 0.0147        | 12.8307 | 4850 | 0.7231          | 0.9032   | 0.9032 | 0.9030    | 0.9036 |
| 0.0134        | 12.9630 | 4900 | 0.7168          | 0.9008   | 0.9009 | 0.9008    | 0.9011 |
| 0.0121        | 13.0952 | 4950 | 0.7427          | 0.9030   | 0.9027 | 0.9027    | 0.9035 |
| 0.0114        | 13.2275 | 5000 | 0.7568          | 0.8998   | 0.8999 | 0.8998    | 0.9001 |
| 0.0157        | 13.3598 | 5050 | 0.7427          | 0.9024   | 0.9019 | 0.9020    | 0.9029 |
| 0.0104        | 13.4921 | 5100 | 0.7503          | 0.9020   | 0.9014 | 0.9015    | 0.9024 |
| 0.0129        | 13.6243 | 5150 | 0.7438          | 0.9020   | 0.9018 | 0.9017    | 0.9024 |
| 0.0152        | 13.7566 | 5200 | 0.7613          | 0.8984   | 0.8987 | 0.8987    | 0.8986 |
| 0.0072        | 13.8889 | 5250 | 0.7603          | 0.9030   | 0.9026 | 0.9026    | 0.9034 |
| 0.0103        | 14.0212 | 5300 | 0.7771          | 0.9000   | 0.9003 | 0.9004    | 0.9003 |
| 0.0115        | 14.1534 | 5350 | 0.7600          | 0.9031   | 0.9031 | 0.9030    | 0.9035 |
| 0.006         | 14.2857 | 5400 | 0.7614          | 0.9034   | 0.9031 | 0.9030    | 0.9038 |
| 0.0067        | 14.4180 | 5450 | 0.7912          | 0.9021   | 0.9023 | 0.9023    | 0.9023 |
| 0.0089        | 14.5503 | 5500 | 0.7771          | 0.9030   | 0.9031 | 0.9030    | 0.9034 |
| 0.0103        | 14.6825 | 5550 | 0.7795          | 0.9031   | 0.9031 | 0.9029    | 0.9035 |
| 0.0159        | 14.8148 | 5600 | 0.7478          | 0.9040   | 0.9039 | 0.9037    | 0.9043 |
| 0.0089        | 14.9471 | 5650 | 0.7904          | 0.8973   | 0.8978 | 0.8983    | 0.8974 |
| 0.0115        | 15.0794 | 5700 | 0.7904          | 0.8987   | 0.8990 | 0.8989    | 0.8990 |
| 0.0063        | 15.2116 | 5750 | 0.7864          | 0.9033   | 0.9032 | 0.9030    | 0.9037 |
| 0.0078        | 15.3439 | 5800 | 0.7965          | 0.9001   | 0.9005 | 0.9006    | 0.9004 |
| 0.0026        | 15.4762 | 5850 | 0.7972          | 0.9027   | 0.9026 | 0.9024    | 0.9030 |
| 0.0109        | 15.6085 | 5900 | 0.7800          | 0.9031   | 0.9029 | 0.9030    | 0.9036 |
| 0.0075        | 15.7407 | 5950 | 0.7770          | 0.9049   | 0.9047 | 0.9046    | 0.9053 |
| 0.008         | 15.8730 | 6000 | 0.7980          | 0.9013   | 0.9017 | 0.9019    | 0.9015 |
| 0.0039        | 16.0053 | 6050 | 0.7939          | 0.9045   | 0.9044 | 0.9043    | 0.9049 |
| 0.0048        | 16.1376 | 6100 | 0.8197          | 0.9003   | 0.9006 | 0.9007    | 0.9005 |
| 0.0077        | 16.2698 | 6150 | 0.8159          | 0.9030   | 0.9028 | 0.9027    | 0.9035 |
| 0.0047        | 16.4021 | 6200 | 0.8150          | 0.9018   | 0.9019 | 0.9017    | 0.9021 |
| 0.0044        | 16.5344 | 6250 | 0.8150          | 0.9018   | 0.9020 | 0.9019    | 0.9021 |
| 0.0057        | 16.6667 | 6300 | 0.8151          | 0.9025   | 0.9024 | 0.9023    | 0.9028 |
| 0.0089        | 16.7989 | 6350 | 0.8155          | 0.9026   | 0.9022 | 0.9021    | 0.9030 |
| 0.0027        | 16.9312 | 6400 | 0.8215          | 0.9028   | 0.9029 | 0.9029    | 0.9031 |
| 0.0041        | 17.0635 | 6450 | 0.8356          | 0.9011   | 0.9011 | 0.9009    | 0.9015 |
| 0.0058        | 17.1958 | 6500 | 0.8291          | 0.9018   | 0.9018 | 0.9016    | 0.9022 |
| 0.003         | 17.3280 | 6550 | 0.8411          | 0.9017   | 0.9016 | 0.9014    | 0.9021 |
| 0.0086        | 17.4603 | 6600 | 0.8326          | 0.9010   | 0.9010 | 0.9008    | 0.9013 |
| 0.0041        | 17.5926 | 6650 | 0.8296          | 0.9015   | 0.9015 | 0.9013    | 0.9018 |
| 0.0055        | 17.7249 | 6700 | 0.8302          | 0.9014   | 0.9014 | 0.9012    | 0.9017 |
| 0.005         | 17.8571 | 6750 | 0.8357          | 0.9021   | 0.9019 | 0.9017    | 0.9025 |
| 0.0038        | 17.9894 | 6800 | 0.8310          | 0.9015   | 0.9014 | 0.9012    | 0.9018 |
| 0.0065        | 18.1217 | 6850 | 0.8276          | 0.9026   | 0.9027 | 0.9026    | 0.9029 |
| 0.005         | 18.2540 | 6900 | 0.8336          | 0.9011   | 0.9013 | 0.9012    | 0.9014 |
| 0.002         | 18.3862 | 6950 | 0.8343          | 0.9014   | 0.9014 | 0.9012    | 0.9017 |
| 0.0022        | 18.5185 | 7000 | 0.8368          | 0.9033   | 0.9033 | 0.9032    | 0.9036 |
| 0.0045        | 18.6508 | 7050 | 0.8339          | 0.9032   | 0.9032 | 0.9031    | 0.9036 |
| 0.0055        | 18.7831 | 7100 | 0.8346          | 0.9040   | 0.9038 | 0.9037    | 0.9044 |
| 0.0034        | 18.9153 | 7150 | 0.8320          | 0.9038   | 0.9035 | 0.9034    | 0.9042 |
| 0.0037        | 19.0476 | 7200 | 0.8382          | 0.9039   | 0.9035 | 0.9035    | 0.9043 |
| 0.0024        | 19.1799 | 7250 | 0.8398          | 0.9040   | 0.9037 | 0.9038    | 0.9045 |
| 0.0041        | 19.3122 | 7300 | 0.8356          | 0.9035   | 0.9034 | 0.9034    | 0.9040 |
| 0.0037        | 19.4444 | 7350 | 0.8332          | 0.9036   | 0.9036 | 0.9034    | 0.9040 |
| 0.0052        | 19.5767 | 7400 | 0.8342          | 0.9036   | 0.9036 | 0.9034    | 0.9040 |
| 0.0051        | 19.7090 | 7450 | 0.8331          | 0.9039   | 0.9038 | 0.9036    | 0.9043 |
| 0.0043        | 19.8413 | 7500 | 0.8334          | 0.9042   | 0.9041 | 0.9040    | 0.9046 |
| 0.0022        | 19.9735 | 7550 | 0.8337          | 0.9040   | 0.9040 | 0.9038    | 0.9044 |


### Framework versions

- Transformers 4.44.0
- Pytorch 2.2.1+cu121
- Tokenizers 0.19.1
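
A small sketch for verifying a local environment against these pinned versions:

```python
# Check the local environment against the versions listed above.
import tokenizers
import torch
import transformers

assert transformers.__version__.startswith("4.44"), transformers.__version__
assert torch.__version__.startswith("2.2.1"), torch.__version__
assert tokenizers.__version__.startswith("0.19"), tokenizers.__version__
print("Environment matches the reported framework versions.")
```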