Upload ./training.log with huggingface_hub
training.log +258 -0
training.log
ADDED
@@ -0,0 +1,258 @@
+2023-10-19 01:39:56,601 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,602 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): BertModel(
+      (embeddings): BertEmbeddings(
+        (word_embeddings): Embedding(31103, 768)
+        (position_embeddings): Embedding(512, 768)
+        (token_type_embeddings): Embedding(2, 768)
+        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): BertEncoder(
+        (layer): ModuleList(
+          (0-11): 12 x BertLayer(
+            (attention): BertAttention(
+              (self): BertSelfAttention(
+                (query): Linear(in_features=768, out_features=768, bias=True)
+                (key): Linear(in_features=768, out_features=768, bias=True)
+                (value): Linear(in_features=768, out_features=768, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): BertSelfOutput(
+                (dense): Linear(in_features=768, out_features=768, bias=True)
+                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): BertIntermediate(
+              (dense): Linear(in_features=768, out_features=3072, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): BertOutput(
+              (dense): Linear(in_features=3072, out_features=768, bias=True)
+              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+      (pooler): BertPooler(
+        (dense): Linear(in_features=768, out_features=768, bias=True)
+        (activation): Tanh()
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=768, out_features=81, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-19 01:39:56,602 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,602 Corpus: 6900 train + 1576 dev + 1833 test sentences
+2023-10-19 01:39:56,602 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,602 Train: 6900 sentences
+2023-10-19 01:39:56,602 (train_with_dev=False, train_with_test=False)
+2023-10-19 01:39:56,602 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,602 Training Params:
+2023-10-19 01:39:56,603 - learning_rate: "5e-05"
+2023-10-19 01:39:56,603 - mini_batch_size: "16"
+2023-10-19 01:39:56,603 - max_epochs: "10"
+2023-10-19 01:39:56,603 - shuffle: "True"
+2023-10-19 01:39:56,603 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,603 Plugins:
+2023-10-19 01:39:56,603 - TensorboardLogger
+2023-10-19 01:39:56,603 - LinearScheduler | warmup_fraction: '0.1'
+2023-10-19 01:39:56,603 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,603 Final evaluation on model from best epoch (best-model.pt)
+2023-10-19 01:39:56,603 - metric: "('micro avg', 'f1-score')"
+2023-10-19 01:39:56,603 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,603 Computation:
+2023-10-19 01:39:56,603 - compute on device: cuda:0
+2023-10-19 01:39:56,603 - embedding storage: none
+2023-10-19 01:39:56,603 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,603 Model training base path: "autotrain-flair-mobie-gbert_base-bs16-e10-lr5e-05-3"
+2023-10-19 01:39:56,603 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,603 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:39:56,604 Logging anything other than scalars to TensorBoard is currently not supported.
+2023-10-19 01:40:11,037 epoch 1 - iter 43/432 - loss 4.25713314 - time (sec): 14.43 - samples/sec: 430.01 - lr: 0.000005 - momentum: 0.000000
+2023-10-19 01:40:25,630 epoch 1 - iter 86/432 - loss 3.24789196 - time (sec): 29.03 - samples/sec: 420.70 - lr: 0.000010 - momentum: 0.000000
+2023-10-19 01:40:40,727 epoch 1 - iter 129/432 - loss 2.71426774 - time (sec): 44.12 - samples/sec: 420.26 - lr: 0.000015 - momentum: 0.000000
+2023-10-19 01:40:55,443 epoch 1 - iter 172/432 - loss 2.40284727 - time (sec): 58.84 - samples/sec: 419.42 - lr: 0.000020 - momentum: 0.000000
+2023-10-19 01:41:10,081 epoch 1 - iter 215/432 - loss 2.15479567 - time (sec): 73.48 - samples/sec: 420.02 - lr: 0.000025 - momentum: 0.000000
+2023-10-19 01:41:25,394 epoch 1 - iter 258/432 - loss 1.94996109 - time (sec): 88.79 - samples/sec: 417.19 - lr: 0.000030 - momentum: 0.000000
+2023-10-19 01:41:40,178 epoch 1 - iter 301/432 - loss 1.78391625 - time (sec): 103.57 - samples/sec: 418.57 - lr: 0.000035 - momentum: 0.000000
+2023-10-19 01:41:55,533 epoch 1 - iter 344/432 - loss 1.65459407 - time (sec): 118.93 - samples/sec: 414.58 - lr: 0.000040 - momentum: 0.000000
+2023-10-19 01:42:10,188 epoch 1 - iter 387/432 - loss 1.54328420 - time (sec): 133.58 - samples/sec: 415.90 - lr: 0.000045 - momentum: 0.000000
+2023-10-19 01:42:24,418 epoch 1 - iter 430/432 - loss 1.44458615 - time (sec): 147.81 - samples/sec: 417.31 - lr: 0.000050 - momentum: 0.000000
+2023-10-19 01:42:25,052 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:42:25,052 EPOCH 1 done: loss 1.4425 - lr: 0.000050
+2023-10-19 01:42:38,766 DEV : loss 0.45496100187301636 - f1-score (micro avg) 0.7175
+2023-10-19 01:42:38,790 saving best model
+2023-10-19 01:42:39,214 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:42:54,241 epoch 2 - iter 43/432 - loss 0.45728563 - time (sec): 15.03 - samples/sec: 415.63 - lr: 0.000049 - momentum: 0.000000
+2023-10-19 01:43:08,781 epoch 2 - iter 86/432 - loss 0.45661912 - time (sec): 29.57 - samples/sec: 414.74 - lr: 0.000049 - momentum: 0.000000
+2023-10-19 01:43:23,811 epoch 2 - iter 129/432 - loss 0.44475303 - time (sec): 44.60 - samples/sec: 414.57 - lr: 0.000048 - momentum: 0.000000
+2023-10-19 01:43:38,997 epoch 2 - iter 172/432 - loss 0.43971531 - time (sec): 59.78 - samples/sec: 418.02 - lr: 0.000048 - momentum: 0.000000
+2023-10-19 01:43:54,683 epoch 2 - iter 215/432 - loss 0.43320350 - time (sec): 75.47 - samples/sec: 412.79 - lr: 0.000047 - momentum: 0.000000
+2023-10-19 01:44:10,487 epoch 2 - iter 258/432 - loss 0.42523129 - time (sec): 91.27 - samples/sec: 411.43 - lr: 0.000047 - momentum: 0.000000
+2023-10-19 01:44:24,503 epoch 2 - iter 301/432 - loss 0.41730337 - time (sec): 105.29 - samples/sec: 413.24 - lr: 0.000046 - momentum: 0.000000
+2023-10-19 01:44:39,860 epoch 2 - iter 344/432 - loss 0.40943162 - time (sec): 120.64 - samples/sec: 412.31 - lr: 0.000046 - momentum: 0.000000
+2023-10-19 01:44:55,802 epoch 2 - iter 387/432 - loss 0.40131277 - time (sec): 136.59 - samples/sec: 407.01 - lr: 0.000045 - momentum: 0.000000
+2023-10-19 01:45:11,079 epoch 2 - iter 430/432 - loss 0.39353999 - time (sec): 151.86 - samples/sec: 406.04 - lr: 0.000044 - momentum: 0.000000
+2023-10-19 01:45:11,757 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:45:11,758 EPOCH 2 done: loss 0.3936 - lr: 0.000044
+2023-10-19 01:45:25,074 DEV : loss 0.34911495447158813 - f1-score (micro avg) 0.7839
+2023-10-19 01:45:25,098 saving best model
+2023-10-19 01:45:26,362 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:45:42,225 epoch 3 - iter 43/432 - loss 0.23809661 - time (sec): 15.86 - samples/sec: 384.64 - lr: 0.000044 - momentum: 0.000000
+2023-10-19 01:45:56,725 epoch 3 - iter 86/432 - loss 0.25537867 - time (sec): 30.36 - samples/sec: 400.60 - lr: 0.000043 - momentum: 0.000000
+2023-10-19 01:46:12,566 epoch 3 - iter 129/432 - loss 0.25249369 - time (sec): 46.20 - samples/sec: 397.89 - lr: 0.000043 - momentum: 0.000000
+2023-10-19 01:46:28,424 epoch 3 - iter 172/432 - loss 0.24958478 - time (sec): 62.06 - samples/sec: 390.97 - lr: 0.000042 - momentum: 0.000000
+2023-10-19 01:46:43,338 epoch 3 - iter 215/432 - loss 0.24589922 - time (sec): 76.98 - samples/sec: 394.71 - lr: 0.000042 - momentum: 0.000000
+2023-10-19 01:46:57,935 epoch 3 - iter 258/432 - loss 0.24803978 - time (sec): 91.57 - samples/sec: 401.48 - lr: 0.000041 - momentum: 0.000000
+2023-10-19 01:47:13,333 epoch 3 - iter 301/432 - loss 0.25300654 - time (sec): 106.97 - samples/sec: 400.09 - lr: 0.000041 - momentum: 0.000000
+2023-10-19 01:47:27,657 epoch 3 - iter 344/432 - loss 0.25326885 - time (sec): 121.29 - samples/sec: 404.99 - lr: 0.000040 - momentum: 0.000000
+2023-10-19 01:47:42,549 epoch 3 - iter 387/432 - loss 0.25155598 - time (sec): 136.19 - samples/sec: 405.65 - lr: 0.000039 - momentum: 0.000000
+2023-10-19 01:47:58,354 epoch 3 - iter 430/432 - loss 0.24897874 - time (sec): 151.99 - samples/sec: 405.38 - lr: 0.000039 - momentum: 0.000000
+2023-10-19 01:47:58,869 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:47:58,869 EPOCH 3 done: loss 0.2490 - lr: 0.000039
+2023-10-19 01:48:12,087 DEV : loss 0.3246375322341919 - f1-score (micro avg) 0.8108
+2023-10-19 01:48:12,111 saving best model
+2023-10-19 01:48:13,359 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:48:27,749 epoch 4 - iter 43/432 - loss 0.15451661 - time (sec): 14.39 - samples/sec: 429.09 - lr: 0.000038 - momentum: 0.000000
+2023-10-19 01:48:42,381 epoch 4 - iter 86/432 - loss 0.15328832 - time (sec): 29.02 - samples/sec: 427.64 - lr: 0.000038 - momentum: 0.000000
+2023-10-19 01:48:57,422 epoch 4 - iter 129/432 - loss 0.16923728 - time (sec): 44.06 - samples/sec: 422.89 - lr: 0.000037 - momentum: 0.000000
+2023-10-19 01:49:12,798 epoch 4 - iter 172/432 - loss 0.17532061 - time (sec): 59.44 - samples/sec: 418.25 - lr: 0.000037 - momentum: 0.000000
+2023-10-19 01:49:27,957 epoch 4 - iter 215/432 - loss 0.18129161 - time (sec): 74.60 - samples/sec: 411.85 - lr: 0.000036 - momentum: 0.000000
+2023-10-19 01:49:42,076 epoch 4 - iter 258/432 - loss 0.18102947 - time (sec): 88.71 - samples/sec: 414.35 - lr: 0.000036 - momentum: 0.000000
+2023-10-19 01:49:56,693 epoch 4 - iter 301/432 - loss 0.17706809 - time (sec): 103.33 - samples/sec: 414.23 - lr: 0.000035 - momentum: 0.000000
+2023-10-19 01:50:10,961 epoch 4 - iter 344/432 - loss 0.17600081 - time (sec): 117.60 - samples/sec: 420.72 - lr: 0.000034 - momentum: 0.000000
+2023-10-19 01:50:26,404 epoch 4 - iter 387/432 - loss 0.17705983 - time (sec): 133.04 - samples/sec: 415.10 - lr: 0.000034 - momentum: 0.000000
+2023-10-19 01:50:41,675 epoch 4 - iter 430/432 - loss 0.17678007 - time (sec): 148.31 - samples/sec: 415.08 - lr: 0.000033 - momentum: 0.000000
+2023-10-19 01:50:42,245 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:50:42,245 EPOCH 4 done: loss 0.1773 - lr: 0.000033
+2023-10-19 01:50:55,364 DEV : loss 0.29224464297294617 - f1-score (micro avg) 0.8349
+2023-10-19 01:50:55,388 saving best model
+2023-10-19 01:50:56,635 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:51:11,222 epoch 5 - iter 43/432 - loss 0.12751561 - time (sec): 14.59 - samples/sec: 398.73 - lr: 0.000033 - momentum: 0.000000
+2023-10-19 01:51:25,804 epoch 5 - iter 86/432 - loss 0.12363292 - time (sec): 29.17 - samples/sec: 408.91 - lr: 0.000032 - momentum: 0.000000
+2023-10-19 01:51:40,313 epoch 5 - iter 129/432 - loss 0.12950420 - time (sec): 43.68 - samples/sec: 420.14 - lr: 0.000032 - momentum: 0.000000
+2023-10-19 01:51:54,408 epoch 5 - iter 172/432 - loss 0.12809390 - time (sec): 57.77 - samples/sec: 427.66 - lr: 0.000031 - momentum: 0.000000
+2023-10-19 01:52:09,245 epoch 5 - iter 215/432 - loss 0.13394395 - time (sec): 72.61 - samples/sec: 424.89 - lr: 0.000031 - momentum: 0.000000
+2023-10-19 01:52:25,088 epoch 5 - iter 258/432 - loss 0.13182392 - time (sec): 88.45 - samples/sec: 417.72 - lr: 0.000030 - momentum: 0.000000
+2023-10-19 01:52:40,113 epoch 5 - iter 301/432 - loss 0.13212724 - time (sec): 103.48 - samples/sec: 416.20 - lr: 0.000029 - momentum: 0.000000
+2023-10-19 01:52:55,382 epoch 5 - iter 344/432 - loss 0.13098423 - time (sec): 118.75 - samples/sec: 414.14 - lr: 0.000029 - momentum: 0.000000
+2023-10-19 01:53:09,958 epoch 5 - iter 387/432 - loss 0.13216788 - time (sec): 133.32 - samples/sec: 416.82 - lr: 0.000028 - momentum: 0.000000
+2023-10-19 01:53:25,239 epoch 5 - iter 430/432 - loss 0.13046149 - time (sec): 148.60 - samples/sec: 415.11 - lr: 0.000028 - momentum: 0.000000
+2023-10-19 01:53:25,712 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:53:25,713 EPOCH 5 done: loss 0.1305 - lr: 0.000028
+2023-10-19 01:53:39,313 DEV : loss 0.32371342182159424 - f1-score (micro avg) 0.8266
+2023-10-19 01:53:39,360 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:53:55,116 epoch 6 - iter 43/432 - loss 0.08270545 - time (sec): 15.75 - samples/sec: 388.28 - lr: 0.000027 - momentum: 0.000000
+2023-10-19 01:54:10,124 epoch 6 - iter 86/432 - loss 0.08062138 - time (sec): 30.76 - samples/sec: 396.33 - lr: 0.000027 - momentum: 0.000000
+2023-10-19 01:54:25,180 epoch 6 - iter 129/432 - loss 0.08479993 - time (sec): 45.82 - samples/sec: 408.11 - lr: 0.000026 - momentum: 0.000000
+2023-10-19 01:54:39,491 epoch 6 - iter 172/432 - loss 0.09233340 - time (sec): 60.13 - samples/sec: 414.18 - lr: 0.000026 - momentum: 0.000000
+2023-10-19 01:54:54,018 epoch 6 - iter 215/432 - loss 0.09404072 - time (sec): 74.66 - samples/sec: 415.41 - lr: 0.000025 - momentum: 0.000000
+2023-10-19 01:55:09,360 epoch 6 - iter 258/432 - loss 0.09241168 - time (sec): 90.00 - samples/sec: 412.03 - lr: 0.000024 - momentum: 0.000000
+2023-10-19 01:55:24,823 epoch 6 - iter 301/432 - loss 0.09115702 - time (sec): 105.46 - samples/sec: 408.85 - lr: 0.000024 - momentum: 0.000000
+2023-10-19 01:55:40,215 epoch 6 - iter 344/432 - loss 0.09173961 - time (sec): 120.85 - samples/sec: 410.46 - lr: 0.000023 - momentum: 0.000000
+2023-10-19 01:55:55,726 epoch 6 - iter 387/432 - loss 0.09239848 - time (sec): 136.36 - samples/sec: 408.04 - lr: 0.000023 - momentum: 0.000000
+2023-10-19 01:56:10,691 epoch 6 - iter 430/432 - loss 0.09560995 - time (sec): 151.33 - samples/sec: 407.44 - lr: 0.000022 - momentum: 0.000000
+2023-10-19 01:56:11,371 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:56:11,371 EPOCH 6 done: loss 0.0956 - lr: 0.000022
+2023-10-19 01:56:24,571 DEV : loss 0.33836397528648376 - f1-score (micro avg) 0.8366
+2023-10-19 01:56:24,595 saving best model
+2023-10-19 01:56:25,847 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:56:41,610 epoch 7 - iter 43/432 - loss 0.07430404 - time (sec): 15.76 - samples/sec: 395.59 - lr: 0.000022 - momentum: 0.000000
+2023-10-19 01:56:56,421 epoch 7 - iter 86/432 - loss 0.07375232 - time (sec): 30.57 - samples/sec: 420.31 - lr: 0.000021 - momentum: 0.000000
+2023-10-19 01:57:10,786 epoch 7 - iter 129/432 - loss 0.07598234 - time (sec): 44.94 - samples/sec: 416.57 - lr: 0.000021 - momentum: 0.000000
+2023-10-19 01:57:26,303 epoch 7 - iter 172/432 - loss 0.07509303 - time (sec): 60.45 - samples/sec: 410.82 - lr: 0.000020 - momentum: 0.000000
+2023-10-19 01:57:42,174 epoch 7 - iter 215/432 - loss 0.07645940 - time (sec): 76.33 - samples/sec: 406.76 - lr: 0.000019 - momentum: 0.000000
+2023-10-19 01:57:57,853 epoch 7 - iter 258/432 - loss 0.07560066 - time (sec): 92.00 - samples/sec: 401.87 - lr: 0.000019 - momentum: 0.000000
+2023-10-19 01:58:12,315 epoch 7 - iter 301/432 - loss 0.07564389 - time (sec): 106.47 - samples/sec: 404.02 - lr: 0.000018 - momentum: 0.000000
+2023-10-19 01:58:26,427 epoch 7 - iter 344/432 - loss 0.07480741 - time (sec): 120.58 - samples/sec: 407.24 - lr: 0.000018 - momentum: 0.000000
+2023-10-19 01:58:41,447 epoch 7 - iter 387/432 - loss 0.07468539 - time (sec): 135.60 - samples/sec: 408.08 - lr: 0.000017 - momentum: 0.000000
+2023-10-19 01:58:57,103 epoch 7 - iter 430/432 - loss 0.07527820 - time (sec): 151.25 - samples/sec: 407.55 - lr: 0.000017 - momentum: 0.000000
+2023-10-19 01:58:57,823 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:58:57,823 EPOCH 7 done: loss 0.0752 - lr: 0.000017
+2023-10-19 01:59:11,170 DEV : loss 0.366791695356369 - f1-score (micro avg) 0.835
+2023-10-19 01:59:11,195 ----------------------------------------------------------------------------------------------------
+2023-10-19 01:59:26,059 epoch 8 - iter 43/432 - loss 0.05625100 - time (sec): 14.86 - samples/sec: 391.47 - lr: 0.000016 - momentum: 0.000000
+2023-10-19 01:59:41,193 epoch 8 - iter 86/432 - loss 0.05428466 - time (sec): 30.00 - samples/sec: 402.91 - lr: 0.000016 - momentum: 0.000000
+2023-10-19 01:59:56,249 epoch 8 - iter 129/432 - loss 0.05442877 - time (sec): 45.05 - samples/sec: 416.65 - lr: 0.000015 - momentum: 0.000000
+2023-10-19 02:00:10,299 epoch 8 - iter 172/432 - loss 0.05554676 - time (sec): 59.10 - samples/sec: 418.21 - lr: 0.000014 - momentum: 0.000000
+2023-10-19 02:00:26,123 epoch 8 - iter 215/432 - loss 0.05670463 - time (sec): 74.93 - samples/sec: 412.94 - lr: 0.000014 - momentum: 0.000000
+2023-10-19 02:00:42,124 epoch 8 - iter 258/432 - loss 0.05639046 - time (sec): 90.93 - samples/sec: 412.58 - lr: 0.000013 - momentum: 0.000000
+2023-10-19 02:00:58,560 epoch 8 - iter 301/432 - loss 0.05698070 - time (sec): 107.36 - samples/sec: 408.39 - lr: 0.000013 - momentum: 0.000000
+2023-10-19 02:01:12,870 epoch 8 - iter 344/432 - loss 0.05669179 - time (sec): 121.67 - samples/sec: 410.66 - lr: 0.000012 - momentum: 0.000000
+2023-10-19 02:01:27,187 epoch 8 - iter 387/432 - loss 0.05647273 - time (sec): 135.99 - samples/sec: 410.80 - lr: 0.000012 - momentum: 0.000000
+2023-10-19 02:01:42,211 epoch 8 - iter 430/432 - loss 0.05599076 - time (sec): 151.01 - samples/sec: 408.03 - lr: 0.000011 - momentum: 0.000000
+2023-10-19 02:01:42,730 ----------------------------------------------------------------------------------------------------
+2023-10-19 02:01:42,730 EPOCH 8 done: loss 0.0559 - lr: 0.000011
+2023-10-19 02:01:56,092 DEV : loss 0.3890858292579651 - f1-score (micro avg) 0.8455
+2023-10-19 02:01:56,117 saving best model
+2023-10-19 02:01:57,379 ----------------------------------------------------------------------------------------------------
+2023-10-19 02:02:11,342 epoch 9 - iter 43/432 - loss 0.03669925 - time (sec): 13.96 - samples/sec: 437.43 - lr: 0.000011 - momentum: 0.000000
+2023-10-19 02:02:25,561 epoch 9 - iter 86/432 - loss 0.03253165 - time (sec): 28.18 - samples/sec: 443.85 - lr: 0.000010 - momentum: 0.000000
+2023-10-19 02:02:39,678 epoch 9 - iter 129/432 - loss 0.03230387 - time (sec): 42.30 - samples/sec: 436.34 - lr: 0.000009 - momentum: 0.000000
+2023-10-19 02:02:54,485 epoch 9 - iter 172/432 - loss 0.03547786 - time (sec): 57.10 - samples/sec: 434.76 - lr: 0.000009 - momentum: 0.000000
+2023-10-19 02:03:09,360 epoch 9 - iter 215/432 - loss 0.03554073 - time (sec): 71.98 - samples/sec: 431.86 - lr: 0.000008 - momentum: 0.000000
+2023-10-19 02:03:24,437 epoch 9 - iter 258/432 - loss 0.03645877 - time (sec): 87.06 - samples/sec: 428.05 - lr: 0.000008 - momentum: 0.000000
+2023-10-19 02:03:40,009 epoch 9 - iter 301/432 - loss 0.03894331 - time (sec): 102.63 - samples/sec: 422.18 - lr: 0.000007 - momentum: 0.000000
+2023-10-19 02:03:54,976 epoch 9 - iter 344/432 - loss 0.04040420 - time (sec): 117.60 - samples/sec: 419.48 - lr: 0.000007 - momentum: 0.000000
+2023-10-19 02:04:10,792 epoch 9 - iter 387/432 - loss 0.04188062 - time (sec): 133.41 - samples/sec: 414.28 - lr: 0.000006 - momentum: 0.000000
+2023-10-19 02:04:25,690 epoch 9 - iter 430/432 - loss 0.04223260 - time (sec): 148.31 - samples/sec: 416.20 - lr: 0.000006 - momentum: 0.000000
+2023-10-19 02:04:26,228 ----------------------------------------------------------------------------------------------------
+2023-10-19 02:04:26,228 EPOCH 9 done: loss 0.0423 - lr: 0.000006
+2023-10-19 02:04:39,946 DEV : loss 0.42729729413986206 - f1-score (micro avg) 0.8371
+2023-10-19 02:04:39,970 ----------------------------------------------------------------------------------------------------
+2023-10-19 02:04:55,624 epoch 10 - iter 43/432 - loss 0.02081255 - time (sec): 15.65 - samples/sec: 405.15 - lr: 0.000005 - momentum: 0.000000
+2023-10-19 02:05:10,511 epoch 10 - iter 86/432 - loss 0.02735695 - time (sec): 30.54 - samples/sec: 415.62 - lr: 0.000004 - momentum: 0.000000
+2023-10-19 02:05:24,272 epoch 10 - iter 129/432 - loss 0.03060242 - time (sec): 44.30 - samples/sec: 419.49 - lr: 0.000004 - momentum: 0.000000
+2023-10-19 02:05:39,137 epoch 10 - iter 172/432 - loss 0.02883699 - time (sec): 59.17 - samples/sec: 420.60 - lr: 0.000003 - momentum: 0.000000
+2023-10-19 02:05:53,697 epoch 10 - iter 215/432 - loss 0.03080710 - time (sec): 73.73 - samples/sec: 418.95 - lr: 0.000003 - momentum: 0.000000
+2023-10-19 02:06:08,310 epoch 10 - iter 258/432 - loss 0.03245300 - time (sec): 88.34 - samples/sec: 417.77 - lr: 0.000002 - momentum: 0.000000
+2023-10-19 02:06:23,371 epoch 10 - iter 301/432 - loss 0.03296380 - time (sec): 103.40 - samples/sec: 418.36 - lr: 0.000002 - momentum: 0.000000
+2023-10-19 02:06:39,073 epoch 10 - iter 344/432 - loss 0.03338415 - time (sec): 119.10 - samples/sec: 414.26 - lr: 0.000001 - momentum: 0.000000
+2023-10-19 02:06:53,055 epoch 10 - iter 387/432 - loss 0.03464620 - time (sec): 133.08 - samples/sec: 418.93 - lr: 0.000001 - momentum: 0.000000
+2023-10-19 02:07:08,821 epoch 10 - iter 430/432 - loss 0.03437346 - time (sec): 148.85 - samples/sec: 413.85 - lr: 0.000000 - momentum: 0.000000
+2023-10-19 02:07:09,419 ----------------------------------------------------------------------------------------------------
+2023-10-19 02:07:09,420 EPOCH 10 done: loss 0.0344 - lr: 0.000000
+2023-10-19 02:07:22,605 DEV : loss 0.4345364570617676 - f1-score (micro avg) 0.8397
+2023-10-19 02:07:23,067 ----------------------------------------------------------------------------------------------------
+2023-10-19 02:07:23,068 Loading model from best epoch ...
+2023-10-19 02:07:25,281 SequenceTagger predicts: Dictionary with 81 tags: O, S-location-route, B-location-route, E-location-route, I-location-route, S-location-stop, B-location-stop, E-location-stop, I-location-stop, S-trigger, B-trigger, E-trigger, I-trigger, S-organization-company, B-organization-company, E-organization-company, I-organization-company, S-location-city, B-location-city, E-location-city, I-location-city, S-location, B-location, E-location, I-location, S-event-cause, B-event-cause, E-event-cause, I-event-cause, S-location-street, B-location-street, E-location-street, I-location-street, S-time, B-time, E-time, I-time, S-date, B-date, E-date, I-date, S-number, B-number, E-number, I-number, S-duration, B-duration, E-duration, I-duration, S-organization
+2023-10-19 02:07:43,100
+Results:
+- F-score (micro) 0.752
+- F-score (macro) 0.5641
+- Accuracy 0.6451
+
+By class:
+                       precision    recall  f1-score   support
+
+        location-stop     0.8407    0.8484    0.8445       765
+              trigger     0.6564    0.5138    0.5764       833
+             location     0.7890    0.8376    0.8125       665
+        location-city     0.8415    0.8534    0.8474       566
+                 date     0.9040    0.8604    0.8817       394
+      location-street     0.9471    0.8808    0.9128       386
+                 time     0.7862    0.8906    0.8352       256
+       location-route     0.7375    0.6725    0.7035       284
+ organization-company     0.8309    0.6825    0.7495       252
+             distance     0.9881    0.9940    0.9910       167
+               number     0.7135    0.8188    0.7625       149
+             duration     0.3185    0.3067    0.3125       163
+          event-cause     0.0000    0.0000    0.0000         0
+        disaster-type     0.9500    0.2754    0.4270        69
+         organization     0.5200    0.4643    0.4906        28
+               person     0.4444    0.8000    0.5714        10
+                  set     0.0000    0.0000    0.0000         0
+         org-position     0.0000    0.0000    0.0000         1
+                money     0.0000    0.0000    0.0000         0
+
+            micro avg     0.7493    0.7548    0.7520      4988
+            macro avg     0.5930    0.5631    0.5641      4988
+         weighted avg     0.7900    0.7548    0.7673      4988
+
+2023-10-19 02:07:43,100 ----------------------------------------------------------------------------------------------------
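As a quick sanity check on the numbers in this log, both the `lr:` column and the aggregate test scores can be reproduced with a few lines of Python. The schedule function below is an illustrative re-derivation of the `LinearScheduler` behaviour implied by the log (linear warmup over the first 10% of the 432 × 10 = 4320 optimizer steps, then linear decay to zero), not Flair's own code, and the metric checks recompute the micro/macro/weighted rows from the per-class table.

```python
# Sanity checks against the training log above (illustrative sketch, not Flair code).

def linear_schedule_lr(step, total_steps=4320, peak_lr=5e-5, warmup_fraction=0.1):
    """Linear warmup to peak_lr, then linear decay to 0.

    total_steps assumes 432 iterations/epoch x 10 epochs, as logged.
    """
    warmup_steps = int(total_steps * warmup_fraction)  # 432
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# Matches the "lr:" column: ~0.000005 at epoch 1 iter 43, ~0.000050 at iter 430,
# ~0.000049 at epoch 2 iter 43 (global step 432 + 43), and 0.0 at the last step.

def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# (precision, recall, f1, support) rows copied from the "By class" table.
by_class = [
    (0.8407, 0.8484, 0.8445, 765), (0.6564, 0.5138, 0.5764, 833),
    (0.7890, 0.8376, 0.8125, 665), (0.8415, 0.8534, 0.8474, 566),
    (0.9040, 0.8604, 0.8817, 394), (0.9471, 0.8808, 0.9128, 386),
    (0.7862, 0.8906, 0.8352, 256), (0.7375, 0.6725, 0.7035, 284),
    (0.8309, 0.6825, 0.7495, 252), (0.9881, 0.9940, 0.9910, 167),
    (0.7135, 0.8188, 0.7625, 149), (0.3185, 0.3067, 0.3125, 163),
    (0.0000, 0.0000, 0.0000, 0),   (0.9500, 0.2754, 0.4270, 69),
    (0.5200, 0.4643, 0.4906, 28),  (0.4444, 0.8000, 0.5714, 10),
    (0.0000, 0.0000, 0.0000, 0),   (0.0000, 0.0000, 0.0000, 1),
    (0.0000, 0.0000, 0.0000, 0),
]

micro_f1 = f1(0.7493, 0.7548)                           # rounds to 0.7520 (micro avg row)
macro_f1 = sum(r[2] for r in by_class) / len(by_class)  # rounds to 0.5641 (macro avg row)
weighted_f1 = (sum(r[2] * r[3] for r in by_class)
               / sum(r[3] for r in by_class))           # rounds to 0.7673 (weighted avg row)
```

The warmup length (432 steps) explains why the learning rate climbs by 0.000005 every 43 iterations during epoch 1 and only reaches the configured 5e-05 at the end of that epoch.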