2023-10-19 03:28:58,589 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,590 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(31103, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=81, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-19 03:28:58,590 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,590 Corpus: 6900 train + 1576 dev + 1833 test sentences
2023-10-19 03:28:58,591 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,591 Train: 6900 sentences
2023-10-19 03:28:58,591 (train_with_dev=False, train_with_test=False)
2023-10-19 03:28:58,591 ----------------------------------------------------------------------------------------------------
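The corpus split and the module printout above correspond to Flair's transformer fine-tuning setup for sequence tagging (linear head, no RNN, no CRF). A minimal sketch of how such a corpus and tagger could be assembled; the NER_GERMAN_MOBIE loader and the deepset/gbert-base checkpoint are assumptions inferred from the base path "autotrain-flair-mobie-gbert_base-..." further down, not stated in the log:

    from flair.datasets import NER_GERMAN_MOBIE
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger

    # assumed dataset loader; the log only states the 6900/1576/1833 split
    corpus = NER_GERMAN_MOBIE()

    # label dictionary built from the corpus; the tagger expands it to the
    # BIOES scheme, giving the 81 output classes of the linear layer above
    label_dictionary = corpus.make_label_dictionary(label_type="ner", add_unk=False)

    # assumed checkpoint: a German BERT base model matching the 768-dim embeddings above
    embeddings = TransformerWordEmbeddings(
        model="deepset/gbert-base",
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )

    tagger = SequenceTagger(
        hidden_size=256,  # required by the constructor, unused with use_rnn=False
        embeddings=embeddings,
        tag_dictionary=label_dictionary,
        tag_type="ner",
        use_rnn=False,
        use_crf=False,
        reproject_embeddings=False,
    )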
2023-10-19 03:28:58,591 Training Params:
2023-10-19 03:28:58,591 - learning_rate: "5e-05"
2023-10-19 03:28:58,591 - mini_batch_size: "16"
2023-10-19 03:28:58,591 - max_epochs: "10"
2023-10-19 03:28:58,591 - shuffle: "True"
2023-10-19 03:28:58,591 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,591 Plugins:
2023-10-19 03:28:58,591 - TensorboardLogger
2023-10-19 03:28:58,591 - LinearScheduler | warmup_fraction: '0.1'
2023-10-19 03:28:58,591 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,591 Final evaluation on model from best epoch (best-model.pt)
2023-10-19 03:28:58,591 - metric: "('micro avg', 'f1-score')"
2023-10-19 03:28:58,591 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,591 Computation:
2023-10-19 03:28:58,592 - compute on device: cuda:0
2023-10-19 03:28:58,592 - embedding storage: none
2023-10-19 03:28:58,592 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,592 Model training base path: "autotrain-flair-mobie-gbert_base-bs16-e10-lr5e-05-5"
2023-10-19 03:28:58,592 ----------------------------------------------------------------------------------------------------
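Continuing that sketch, the training parameters, plugins, and base path listed above map onto Flair's ModelTrainer.fine_tune; the TensorboardLogger import path and the plugins argument are assumptions that may differ between Flair versions:

    from flair.trainers import ModelTrainer
    from flair.trainers.plugins import TensorboardLogger  # assumed import path

    trainer = ModelTrainer(tagger, corpus)

    # fine_tune defaults to AdamW with a linear schedule; warmup_fraction=0.1
    # reproduces the LinearScheduler warmup logged above
    trainer.fine_tune(
        "autotrain-flair-mobie-gbert_base-bs16-e10-lr5e-05-5",
        learning_rate=5e-05,
        mini_batch_size=16,
        max_epochs=10,
        warmup_fraction=0.1,
        shuffle=True,
        plugins=[TensorboardLogger()],  # assumed way of attaching TensorBoard logging
    )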
2023-10-19 03:28:58,592 ----------------------------------------------------------------------------------------------------
2023-10-19 03:28:58,592 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-19 03:29:11,756 epoch 1 - iter 43/432 - loss 4.47089142 - time (sec): 13.16 - samples/sec: 462.75 - lr: 0.000005 - momentum: 0.000000
2023-10-19 03:29:26,354 epoch 1 - iter 86/432 - loss 3.40687789 - time (sec): 27.76 - samples/sec: 449.44 - lr: 0.000010 - momentum: 0.000000
2023-10-19 03:29:40,695 epoch 1 - iter 129/432 - loss 2.81984328 - time (sec): 42.10 - samples/sec: 453.89 - lr: 0.000015 - momentum: 0.000000
2023-10-19 03:29:54,625 epoch 1 - iter 172/432 - loss 2.48664817 - time (sec): 56.03 - samples/sec: 446.05 - lr: 0.000020 - momentum: 0.000000
2023-10-19 03:30:09,101 epoch 1 - iter 215/432 - loss 2.20938885 - time (sec): 70.51 - samples/sec: 442.36 - lr: 0.000025 - momentum: 0.000000
2023-10-19 03:30:23,495 epoch 1 - iter 258/432 - loss 2.00682608 - time (sec): 84.90 - samples/sec: 442.49 - lr: 0.000030 - momentum: 0.000000
2023-10-19 03:30:38,275 epoch 1 - iter 301/432 - loss 1.83000069 - time (sec): 99.68 - samples/sec: 439.34 - lr: 0.000035 - momentum: 0.000000
2023-10-19 03:30:52,684 epoch 1 - iter 344/432 - loss 1.69649278 - time (sec): 114.09 - samples/sec: 437.00 - lr: 0.000040 - momentum: 0.000000
2023-10-19 03:31:07,161 epoch 1 - iter 387/432 - loss 1.59029602 - time (sec): 128.57 - samples/sec: 433.05 - lr: 0.000045 - momentum: 0.000000
2023-10-19 03:31:22,191 epoch 1 - iter 430/432 - loss 1.48899645 - time (sec): 143.60 - samples/sec: 428.42 - lr: 0.000050 - momentum: 0.000000
2023-10-19 03:31:22,823 ----------------------------------------------------------------------------------------------------
2023-10-19 03:31:22,823 EPOCH 1 done: loss 1.4847 - lr: 0.000050
2023-10-19 03:31:35,250 DEV : loss 0.46572113037109375 - f1-score (micro avg) 0.7239
2023-10-19 03:31:35,274 saving best model
2023-10-19 03:31:35,705 ----------------------------------------------------------------------------------------------------
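With 432 iterations per epoch and 10 epochs there are 432 × 10 = 4320 optimizer steps in total, so the logged warmup_fraction of 0.1 gives 0.1 × 4320 = 432 warmup steps, i.e. exactly one epoch. This is why the learning-rate column ramps linearly up to the peak 5e-05 during epoch 1 (0.000050 at iter 430/432) and then decays linearly toward 0 over the remaining nine epochs; the momentum column stays at 0.000000, consistent with the AdamW optimizer that Flair's fine-tuning setup typically uses rather than SGD with momentum.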
2023-10-19 03:31:49,420 epoch 2 - iter 43/432 - loss 0.47787826 - time (sec): 13.71 - samples/sec: 428.87 - lr: 0.000049 - momentum: 0.000000
2023-10-19 03:32:02,986 epoch 2 - iter 86/432 - loss 0.46422594 - time (sec): 27.28 - samples/sec: 444.51 - lr: 0.000049 - momentum: 0.000000
2023-10-19 03:32:17,113 epoch 2 - iter 129/432 - loss 0.47090059 - time (sec): 41.41 - samples/sec: 450.56 - lr: 0.000048 - momentum: 0.000000
2023-10-19 03:32:30,939 epoch 2 - iter 172/432 - loss 0.46448888 - time (sec): 55.23 - samples/sec: 446.35 - lr: 0.000048 - momentum: 0.000000
2023-10-19 03:32:45,263 epoch 2 - iter 215/432 - loss 0.45313238 - time (sec): 69.56 - samples/sec: 444.26 - lr: 0.000047 - momentum: 0.000000
2023-10-19 03:32:58,608 epoch 2 - iter 258/432 - loss 0.44160239 - time (sec): 82.90 - samples/sec: 450.48 - lr: 0.000047 - momentum: 0.000000
2023-10-19 03:33:13,377 epoch 2 - iter 301/432 - loss 0.43407568 - time (sec): 97.67 - samples/sec: 443.48 - lr: 0.000046 - momentum: 0.000000
2023-10-19 03:33:27,360 epoch 2 - iter 344/432 - loss 0.42190457 - time (sec): 111.65 - samples/sec: 443.35 - lr: 0.000046 - momentum: 0.000000
2023-10-19 03:33:41,325 epoch 2 - iter 387/432 - loss 0.41622439 - time (sec): 125.62 - samples/sec: 439.91 - lr: 0.000045 - momentum: 0.000000
2023-10-19 03:33:55,134 epoch 2 - iter 430/432 - loss 0.40436365 - time (sec): 139.43 - samples/sec: 442.06 - lr: 0.000044 - momentum: 0.000000
2023-10-19 03:33:55,654 ----------------------------------------------------------------------------------------------------
2023-10-19 03:33:55,654 EPOCH 2 done: loss 0.4050 - lr: 0.000044
2023-10-19 03:34:08,178 DEV : loss 0.3259238302707672 - f1-score (micro avg) 0.7826
2023-10-19 03:34:08,202 saving best model
2023-10-19 03:34:09,461 ----------------------------------------------------------------------------------------------------
2023-10-19 03:34:23,428 epoch 3 - iter 43/432 - loss 0.28190490 - time (sec): 13.97 - samples/sec: 452.84 - lr: 0.000044 - momentum: 0.000000
2023-10-19 03:34:36,562 epoch 3 - iter 86/432 - loss 0.26687223 - time (sec): 27.10 - samples/sec: 450.04 - lr: 0.000043 - momentum: 0.000000
2023-10-19 03:34:51,195 epoch 3 - iter 129/432 - loss 0.26449528 - time (sec): 41.73 - samples/sec: 441.48 - lr: 0.000043 - momentum: 0.000000
2023-10-19 03:35:05,122 epoch 3 - iter 172/432 - loss 0.25665322 - time (sec): 55.66 - samples/sec: 439.37 - lr: 0.000042 - momentum: 0.000000
2023-10-19 03:35:19,292 epoch 3 - iter 215/432 - loss 0.25287570 - time (sec): 69.83 - samples/sec: 439.23 - lr: 0.000042 - momentum: 0.000000
2023-10-19 03:35:34,052 epoch 3 - iter 258/432 - loss 0.24910280 - time (sec): 84.59 - samples/sec: 438.36 - lr: 0.000041 - momentum: 0.000000
2023-10-19 03:35:48,084 epoch 3 - iter 301/432 - loss 0.25157812 - time (sec): 98.62 - samples/sec: 438.86 - lr: 0.000041 - momentum: 0.000000
2023-10-19 03:36:02,258 epoch 3 - iter 344/432 - loss 0.25377885 - time (sec): 112.80 - samples/sec: 439.87 - lr: 0.000040 - momentum: 0.000000
2023-10-19 03:36:15,813 epoch 3 - iter 387/432 - loss 0.25537236 - time (sec): 126.35 - samples/sec: 440.41 - lr: 0.000039 - momentum: 0.000000
2023-10-19 03:36:30,192 epoch 3 - iter 430/432 - loss 0.25493575 - time (sec): 140.73 - samples/sec: 438.13 - lr: 0.000039 - momentum: 0.000000
2023-10-19 03:36:30,621 ----------------------------------------------------------------------------------------------------
2023-10-19 03:36:30,621 EPOCH 3 done: loss 0.2546 - lr: 0.000039
2023-10-19 03:36:42,948 DEV : loss 0.3176376521587372 - f1-score (micro avg) 0.8099
2023-10-19 03:36:42,972 saving best model
2023-10-19 03:36:44,252 ----------------------------------------------------------------------------------------------------
2023-10-19 03:36:58,581 epoch 4 - iter 43/432 - loss 0.16639809 - time (sec): 14.33 - samples/sec: 429.53 - lr: 0.000038 - momentum: 0.000000
2023-10-19 03:37:12,557 epoch 4 - iter 86/432 - loss 0.16046383 - time (sec): 28.30 - samples/sec: 449.63 - lr: 0.000038 - momentum: 0.000000
2023-10-19 03:37:25,777 epoch 4 - iter 129/432 - loss 0.16293076 - time (sec): 41.52 - samples/sec: 454.44 - lr: 0.000037 - momentum: 0.000000
2023-10-19 03:37:38,639 epoch 4 - iter 172/432 - loss 0.17169823 - time (sec): 54.39 - samples/sec: 458.05 - lr: 0.000037 - momentum: 0.000000
2023-10-19 03:37:52,354 epoch 4 - iter 215/432 - loss 0.17568654 - time (sec): 68.10 - samples/sec: 455.30 - lr: 0.000036 - momentum: 0.000000
2023-10-19 03:38:05,210 epoch 4 - iter 258/432 - loss 0.17539204 - time (sec): 80.96 - samples/sec: 459.20 - lr: 0.000036 - momentum: 0.000000
2023-10-19 03:38:19,129 epoch 4 - iter 301/432 - loss 0.17616913 - time (sec): 94.88 - samples/sec: 454.33 - lr: 0.000035 - momentum: 0.000000
2023-10-19 03:38:32,506 epoch 4 - iter 344/432 - loss 0.17582288 - time (sec): 108.25 - samples/sec: 453.92 - lr: 0.000034 - momentum: 0.000000
2023-10-19 03:38:46,863 epoch 4 - iter 387/432 - loss 0.17865856 - time (sec): 122.61 - samples/sec: 449.89 - lr: 0.000034 - momentum: 0.000000
2023-10-19 03:39:00,675 epoch 4 - iter 430/432 - loss 0.18280202 - time (sec): 136.42 - samples/sec: 450.86 - lr: 0.000033 - momentum: 0.000000
2023-10-19 03:39:01,048 ----------------------------------------------------------------------------------------------------
2023-10-19 03:39:01,048 EPOCH 4 done: loss 0.1824 - lr: 0.000033
2023-10-19 03:39:12,966 DEV : loss 0.29886704683303833 - f1-score (micro avg) 0.8234
2023-10-19 03:39:12,991 saving best model
2023-10-19 03:39:14,266 ----------------------------------------------------------------------------------------------------
2023-10-19 03:39:28,400 epoch 5 - iter 43/432 - loss 0.11682448 - time (sec): 14.13 - samples/sec: 461.41 - lr: 0.000033 - momentum: 0.000000
2023-10-19 03:39:42,804 epoch 5 - iter 86/432 - loss 0.12275492 - time (sec): 28.54 - samples/sec: 444.72 - lr: 0.000032 - momentum: 0.000000
2023-10-19 03:39:56,165 epoch 5 - iter 129/432 - loss 0.12609206 - time (sec): 41.90 - samples/sec: 441.07 - lr: 0.000032 - momentum: 0.000000
2023-10-19 03:40:09,952 epoch 5 - iter 172/432 - loss 0.12711545 - time (sec): 55.69 - samples/sec: 446.73 - lr: 0.000031 - momentum: 0.000000
2023-10-19 03:40:22,923 epoch 5 - iter 215/432 - loss 0.12886348 - time (sec): 68.66 - samples/sec: 448.19 - lr: 0.000031 - momentum: 0.000000
2023-10-19 03:40:36,641 epoch 5 - iter 258/432 - loss 0.12845048 - time (sec): 82.37 - samples/sec: 449.42 - lr: 0.000030 - momentum: 0.000000
2023-10-19 03:40:50,351 epoch 5 - iter 301/432 - loss 0.12913153 - time (sec): 96.08 - samples/sec: 446.05 - lr: 0.000029 - momentum: 0.000000
2023-10-19 03:41:04,628 epoch 5 - iter 344/432 - loss 0.12936027 - time (sec): 110.36 - samples/sec: 445.78 - lr: 0.000029 - momentum: 0.000000
2023-10-19 03:41:19,188 epoch 5 - iter 387/432 - loss 0.13265379 - time (sec): 124.92 - samples/sec: 443.12 - lr: 0.000028 - momentum: 0.000000
2023-10-19 03:41:32,431 epoch 5 - iter 430/432 - loss 0.13736553 - time (sec): 138.16 - samples/sec: 445.95 - lr: 0.000028 - momentum: 0.000000
2023-10-19 03:41:32,872 ----------------------------------------------------------------------------------------------------
2023-10-19 03:41:32,872 EPOCH 5 done: loss 0.1372 - lr: 0.000028
2023-10-19 03:41:44,920 DEV : loss 0.3204568028450012 - f1-score (micro avg) 0.8314
2023-10-19 03:41:44,948 saving best model
2023-10-19 03:41:46,214 ----------------------------------------------------------------------------------------------------
2023-10-19 03:41:59,586 epoch 6 - iter 43/432 - loss 0.09314269 - time (sec): 13.37 - samples/sec: 447.56 - lr: 0.000027 - momentum: 0.000000
2023-10-19 03:42:13,450 epoch 6 - iter 86/432 - loss 0.09075584 - time (sec): 27.23 - samples/sec: 437.57 - lr: 0.000027 - momentum: 0.000000
2023-10-19 03:42:27,171 epoch 6 - iter 129/432 - loss 0.09591164 - time (sec): 40.95 - samples/sec: 448.35 - lr: 0.000026 - momentum: 0.000000
2023-10-19 03:42:40,635 epoch 6 - iter 172/432 - loss 0.09597865 - time (sec): 54.42 - samples/sec: 454.58 - lr: 0.000026 - momentum: 0.000000
2023-10-19 03:42:54,660 epoch 6 - iter 215/432 - loss 0.09909182 - time (sec): 68.44 - samples/sec: 450.91 - lr: 0.000025 - momentum: 0.000000
2023-10-19 03:43:09,178 epoch 6 - iter 258/432 - loss 0.10174570 - time (sec): 82.96 - samples/sec: 444.40 - lr: 0.000024 - momentum: 0.000000
2023-10-19 03:43:23,824 epoch 6 - iter 301/432 - loss 0.10046164 - time (sec): 97.61 - samples/sec: 437.85 - lr: 0.000024 - momentum: 0.000000
2023-10-19 03:43:36,945 epoch 6 - iter 344/432 - loss 0.10000466 - time (sec): 110.73 - samples/sec: 443.92 - lr: 0.000023 - momentum: 0.000000
2023-10-19 03:43:51,453 epoch 6 - iter 387/432 - loss 0.10122306 - time (sec): 125.24 - samples/sec: 443.84 - lr: 0.000023 - momentum: 0.000000
2023-10-19 03:44:05,912 epoch 6 - iter 430/432 - loss 0.10223935 - time (sec): 139.70 - samples/sec: 441.20 - lr: 0.000022 - momentum: 0.000000
2023-10-19 03:44:06,379 ----------------------------------------------------------------------------------------------------
2023-10-19 03:44:06,380 EPOCH 6 done: loss 0.1020 - lr: 0.000022
2023-10-19 03:44:19,759 DEV : loss 0.33529824018478394 - f1-score (micro avg) 0.8401
2023-10-19 03:44:19,784 saving best model
2023-10-19 03:44:21,049 ----------------------------------------------------------------------------------------------------
2023-10-19 03:44:36,276 epoch 7 - iter 43/432 - loss 0.07203442 - time (sec): 15.23 - samples/sec: 412.93 - lr: 0.000022 - momentum: 0.000000
2023-10-19 03:44:51,038 epoch 7 - iter 86/432 - loss 0.07103726 - time (sec): 29.99 - samples/sec: 414.67 - lr: 0.000021 - momentum: 0.000000
2023-10-19 03:45:05,416 epoch 7 - iter 129/432 - loss 0.06899312 - time (sec): 44.37 - samples/sec: 415.88 - lr: 0.000021 - momentum: 0.000000
2023-10-19 03:45:20,795 epoch 7 - iter 172/432 - loss 0.07194985 - time (sec): 59.75 - samples/sec: 415.65 - lr: 0.000020 - momentum: 0.000000
2023-10-19 03:45:34,874 epoch 7 - iter 215/432 - loss 0.07052196 - time (sec): 73.82 - samples/sec: 417.57 - lr: 0.000019 - momentum: 0.000000
2023-10-19 03:45:49,995 epoch 7 - iter 258/432 - loss 0.07040585 - time (sec): 88.94 - samples/sec: 415.72 - lr: 0.000019 - momentum: 0.000000
2023-10-19 03:46:04,708 epoch 7 - iter 301/432 - loss 0.07018022 - time (sec): 103.66 - samples/sec: 419.87 - lr: 0.000018 - momentum: 0.000000
2023-10-19 03:46:19,073 epoch 7 - iter 344/432 - loss 0.07150833 - time (sec): 118.02 - samples/sec: 419.32 - lr: 0.000018 - momentum: 0.000000
2023-10-19 03:46:33,525 epoch 7 - iter 387/432 - loss 0.07349951 - time (sec): 132.47 - samples/sec: 418.40 - lr: 0.000017 - momentum: 0.000000
2023-10-19 03:46:48,302 epoch 7 - iter 430/432 - loss 0.07369037 - time (sec): 147.25 - samples/sec: 419.18 - lr: 0.000017 - momentum: 0.000000
2023-10-19 03:46:49,000 ----------------------------------------------------------------------------------------------------
2023-10-19 03:46:49,000 EPOCH 7 done: loss 0.0738 - lr: 0.000017
2023-10-19 03:47:01,924 DEV : loss 0.34933581948280334 - f1-score (micro avg) 0.8467
2023-10-19 03:47:01,948 saving best model
2023-10-19 03:47:03,222 ----------------------------------------------------------------------------------------------------
2023-10-19 03:47:17,090 epoch 8 - iter 43/432 - loss 0.05716144 - time (sec): 13.87 - samples/sec: 440.63 - lr: 0.000016 - momentum: 0.000000
2023-10-19 03:47:32,144 epoch 8 - iter 86/432 - loss 0.05847175 - time (sec): 28.92 - samples/sec: 406.80 - lr: 0.000016 - momentum: 0.000000
2023-10-19 03:47:47,486 epoch 8 - iter 129/432 - loss 0.05964240 - time (sec): 44.26 - samples/sec: 406.24 - lr: 0.000015 - momentum: 0.000000
2023-10-19 03:48:01,911 epoch 8 - iter 172/432 - loss 0.05753793 - time (sec): 58.69 - samples/sec: 420.36 - lr: 0.000014 - momentum: 0.000000
2023-10-19 03:48:16,435 epoch 8 - iter 215/432 - loss 0.05591410 - time (sec): 73.21 - samples/sec: 417.99 - lr: 0.000014 - momentum: 0.000000
2023-10-19 03:48:31,234 epoch 8 - iter 258/432 - loss 0.05596529 - time (sec): 88.01 - samples/sec: 417.49 - lr: 0.000013 - momentum: 0.000000
2023-10-19 03:48:46,757 epoch 8 - iter 301/432 - loss 0.05651818 - time (sec): 103.53 - samples/sec: 415.10 - lr: 0.000013 - momentum: 0.000000
2023-10-19 03:49:01,544 epoch 8 - iter 344/432 - loss 0.05762473 - time (sec): 118.32 - samples/sec: 416.82 - lr: 0.000012 - momentum: 0.000000
2023-10-19 03:49:15,864 epoch 8 - iter 387/432 - loss 0.05808607 - time (sec): 132.64 - samples/sec: 418.46 - lr: 0.000012 - momentum: 0.000000
2023-10-19 03:49:31,049 epoch 8 - iter 430/432 - loss 0.05758731 - time (sec): 147.83 - samples/sec: 416.94 - lr: 0.000011 - momentum: 0.000000
2023-10-19 03:49:31,826 ----------------------------------------------------------------------------------------------------
2023-10-19 03:49:31,826 EPOCH 8 done: loss 0.0578 - lr: 0.000011
2023-10-19 03:49:44,970 DEV : loss 0.38339680433273315 - f1-score (micro avg) 0.8393
2023-10-19 03:49:44,995 ----------------------------------------------------------------------------------------------------
2023-10-19 03:49:59,204 epoch 9 - iter 43/432 - loss 0.03477718 - time (sec): 14.21 - samples/sec: 433.13 - lr: 0.000011 - momentum: 0.000000
2023-10-19 03:50:14,060 epoch 9 - iter 86/432 - loss 0.03916332 - time (sec): 29.06 - samples/sec: 431.15 - lr: 0.000010 - momentum: 0.000000
2023-10-19 03:50:27,952 epoch 9 - iter 129/432 - loss 0.03614330 - time (sec): 42.96 - samples/sec: 437.68 - lr: 0.000009 - momentum: 0.000000
2023-10-19 03:50:42,814 epoch 9 - iter 172/432 - loss 0.03823699 - time (sec): 57.82 - samples/sec: 436.70 - lr: 0.000009 - momentum: 0.000000
2023-10-19 03:50:57,585 epoch 9 - iter 215/432 - loss 0.03949486 - time (sec): 72.59 - samples/sec: 427.62 - lr: 0.000008 - momentum: 0.000000
2023-10-19 03:51:12,290 epoch 9 - iter 258/432 - loss 0.04255173 - time (sec): 87.29 - samples/sec: 426.32 - lr: 0.000008 - momentum: 0.000000
2023-10-19 03:51:27,273 epoch 9 - iter 301/432 - loss 0.04281995 - time (sec): 102.28 - samples/sec: 424.55 - lr: 0.000007 - momentum: 0.000000
2023-10-19 03:51:41,879 epoch 9 - iter 344/432 - loss 0.04288552 - time (sec): 116.88 - samples/sec: 422.87 - lr: 0.000007 - momentum: 0.000000
2023-10-19 03:51:57,269 epoch 9 - iter 387/432 - loss 0.04340112 - time (sec): 132.27 - samples/sec: 420.18 - lr: 0.000006 - momentum: 0.000000
2023-10-19 03:52:12,096 epoch 9 - iter 430/432 - loss 0.04288897 - time (sec): 147.10 - samples/sec: 418.86 - lr: 0.000006 - momentum: 0.000000
2023-10-19 03:52:12,574 ----------------------------------------------------------------------------------------------------
2023-10-19 03:52:12,574 EPOCH 9 done: loss 0.0429 - lr: 0.000006
2023-10-19 03:52:25,616 DEV : loss 0.4035045802593231 - f1-score (micro avg) 0.8476
2023-10-19 03:52:25,641 saving best model
2023-10-19 03:52:27,667 ----------------------------------------------------------------------------------------------------
2023-10-19 03:52:41,579 epoch 10 - iter 43/432 - loss 0.02320239 - time (sec): 13.91 - samples/sec: 431.98 - lr: 0.000005 - momentum: 0.000000
2023-10-19 03:52:56,667 epoch 10 - iter 86/432 - loss 0.02686951 - time (sec): 29.00 - samples/sec: 410.53 - lr: 0.000004 - momentum: 0.000000
2023-10-19 03:53:10,356 epoch 10 - iter 129/432 - loss 0.02767008 - time (sec): 42.69 - samples/sec: 426.65 - lr: 0.000004 - momentum: 0.000000
2023-10-19 03:53:25,523 epoch 10 - iter 172/432 - loss 0.02995656 - time (sec): 57.85 - samples/sec: 433.48 - lr: 0.000003 - momentum: 0.000000
2023-10-19 03:53:40,267 epoch 10 - iter 215/432 - loss 0.03134205 - time (sec): 72.60 - samples/sec: 431.03 - lr: 0.000003 - momentum: 0.000000
2023-10-19 03:53:55,194 epoch 10 - iter 258/432 - loss 0.03131445 - time (sec): 87.53 - samples/sec: 426.60 - lr: 0.000002 - momentum: 0.000000
2023-10-19 03:54:08,818 epoch 10 - iter 301/432 - loss 0.03184536 - time (sec): 101.15 - samples/sec: 428.50 - lr: 0.000002 - momentum: 0.000000
2023-10-19 03:54:23,364 epoch 10 - iter 344/432 - loss 0.03434134 - time (sec): 115.70 - samples/sec: 428.91 - lr: 0.000001 - momentum: 0.000000
2023-10-19 03:54:37,614 epoch 10 - iter 387/432 - loss 0.03405121 - time (sec): 129.95 - samples/sec: 426.70 - lr: 0.000001 - momentum: 0.000000
2023-10-19 03:54:50,631 epoch 10 - iter 430/432 - loss 0.03334862 - time (sec): 142.96 - samples/sec: 431.18 - lr: 0.000000 - momentum: 0.000000
2023-10-19 03:54:51,170 ----------------------------------------------------------------------------------------------------
2023-10-19 03:54:51,170 EPOCH 10 done: loss 0.0333 - lr: 0.000000
2023-10-19 03:55:03,307 DEV : loss 0.41629332304000854 - f1-score (micro avg) 0.8463
2023-10-19 03:55:03,786 ----------------------------------------------------------------------------------------------------
2023-10-19 03:55:03,787 Loading model from best epoch ...
2023-10-19 03:55:06,502 SequenceTagger predicts: Dictionary with 81 tags: O, S-location-route, B-location-route, E-location-route, I-location-route, S-location-stop, B-location-stop, E-location-stop, I-location-stop, S-trigger, B-trigger, E-trigger, I-trigger, S-organization-company, B-organization-company, E-organization-company, I-organization-company, S-location-city, B-location-city, E-location-city, I-location-city, S-location, B-location, E-location, I-location, S-event-cause, B-event-cause, E-event-cause, I-event-cause, S-location-street, B-location-street, E-location-street, I-location-street, S-time, B-time, E-time, I-time, S-date, B-date, E-date, I-date, S-number, B-number, E-number, I-number, S-duration, B-duration, E-duration, I-duration, S-organization
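After training, the logged best-model.pt (selected by dev micro-F1) can be reloaded for inference. A minimal usage sketch; the example sentence is made up for illustration:

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # path assumed relative to the training base path logged above
    tagger = SequenceTagger.load(
        "autotrain-flair-mobie-gbert_base-bs16-e10-lr5e-05-5/best-model.pt"
    )

    # hypothetical German sentence, not taken from the corpus
    sentence = Sentence("Die Linie U2 hält wegen einer Störung nicht am Alexanderplatz in Berlin.")
    tagger.predict(sentence)

    for span in sentence.get_spans("ner"):
        print(span.text, span.tag, span.score)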
2023-10-19 03:55:22,259
Results:
- F-score (micro) 0.7689
- F-score (macro) 0.5855
- Accuracy 0.6696
By class:
                     precision    recall  f1-score   support

       location-stop    0.8735    0.8392    0.8560       765
             trigger    0.7202    0.5654    0.6335       833
            location    0.7834    0.8376    0.8096       665
       location-city    0.8197    0.8834    0.8503       566
                date    0.8877    0.8426    0.8646       394
     location-street    0.9356    0.8653    0.8991       386
                time    0.7944    0.8906    0.8398       256
      location-route    0.8525    0.7324    0.7879       284
organization-company    0.8191    0.6468    0.7228       252
            distance    1.0000    0.9940    0.9970       167
              number    0.6932    0.8188    0.7508       149
            duration    0.3533    0.3252    0.3387       163
         event-cause    0.0000    0.0000    0.0000         0
       disaster-type    0.8649    0.4638    0.6038        69
        organization    0.5714    0.5714    0.5714        28
              person    0.4500    0.9000    0.6000        10
                 set    0.0000    0.0000    0.0000         0
        org-position    0.0000    0.0000    0.0000         1
               money    0.0000    0.0000    0.0000         0

           micro avg    0.7694    0.7684    0.7689      4988
           macro avg    0.6010    0.5882    0.5855      4988
        weighted avg    0.8067    0.7684    0.7835      4988
2023-10-19 03:55:22,259 ----------------------------------------------------------------------------------------------------
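The final per-class table is the output of evaluating the best model on the held-out test split; a hedged sketch of how such numbers are typically reproduced, reusing the assumed NER_GERMAN_MOBIE loader from the earlier sketch:

    from flair.datasets import NER_GERMAN_MOBIE
    from flair.models import SequenceTagger

    corpus = NER_GERMAN_MOBIE()  # assumption: same 1833-sentence test split as logged above
    tagger = SequenceTagger.load(
        "autotrain-flair-mobie-gbert_base-bs16-e10-lr5e-05-5/best-model.pt"
    )

    # returns a Result with micro/macro F1 plus the per-class report shown above
    result = tagger.evaluate(corpus.test, gold_label_type="ner", mini_batch_size=16)
    print(result.detailed_results)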