Upload folder using huggingface_hub

- best-model.pt +3 -0
- dev.tsv +0 -0
- loss.tsv +11 -0
- runs/events.out.tfevents.1697589108.3ae7c61396a7.1160.15 +3 -0
- test.tsv +0 -0
- training.log +237 -0
best-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d5235b43b7d0001616ad07ade685556a4c10e62f087b33c911b4ea832c834930
+size 440954373
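The file above is a Git LFS pointer, not the checkpoint itself. A minimal sketch of reading such a pointer (the key/value-per-line format comes from the git-lfs spec linked in the pointer; the `parse_lfs_pointer` helper name is my own):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")  # first space separates key from value
        fields[key] = value
    return fields

pointer = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:d5235b43b7d0001616ad07ade685556a4c10e62f087b33c911b4ea832c834930\n"
    "size 440954373\n"
)
print(pointer["size"])  # 440954373 bytes, i.e. a roughly 440 MB checkpoint
```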
dev.tsv
ADDED
The diff for this file is too large to render.
loss.tsv
ADDED
@@ -0,0 +1,11 @@
+EPOCH  TIMESTAMP  LEARNING_RATE  TRAIN_LOSS  DEV_LOSS  DEV_PRECISION  DEV_RECALL  DEV_F1  DEV_ACCURACY
+1      00:38:50   0.0000         0.4120      0.1611    0.2089         0.5625      0.3046  0.1802
+2      00:46:01   0.0000         0.1949      0.1895    0.2506         0.5701      0.3482  0.2123
+3      00:53:12   0.0000         0.1461      0.2023    0.3385         0.4583      0.3894  0.2427
+4      01:00:15   0.0000         0.1126      0.2687    0.2644         0.5928      0.3657  0.2253
+5      01:07:12   0.0000         0.0836      0.3014    0.2755         0.5833      0.3742  0.2318
+6      01:14:19   0.0000         0.0652      0.3667    0.2537         0.5909      0.3549  0.2167
+7      01:21:25   0.0000         0.0464      0.4333    0.2509         0.6288      0.3587  0.2210
+8      01:28:44   0.0000         0.0373      0.4130    0.2610         0.6174      0.3669  0.2267
+9      01:36:00   0.0000         0.0200      0.4562    0.2818         0.6212      0.3877  0.2410
+10     01:43:01   0.0000         0.0131      0.4727    0.2803         0.6307      0.3881  0.2417
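loss.tsv is easy to post-process. A small sketch that finds the epoch with the best dev F1 (a few rows copied from the table above, space-separated here for readability; `best_epoch` is my own helper name):

```python
import csv
import io

# A few rows from loss.tsv above.
LOSS_TSV = """\
EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
1 00:38:50 0.0000 0.4120 0.1611 0.2089 0.5625 0.3046 0.1802
3 00:53:12 0.0000 0.1461 0.2023 0.3385 0.4583 0.3894 0.2427
10 01:43:01 0.0000 0.0131 0.4727 0.2803 0.6307 0.3881 0.2417
"""

def best_epoch(tsv_text: str, delimiter: str = " "):
    """Return (epoch, dev_f1) for the row with the highest DEV_F1."""
    rows = csv.DictReader(io.StringIO(tsv_text), delimiter=delimiter)
    best = max(rows, key=lambda r: float(r["DEV_F1"]))
    return int(best["EPOCH"]), float(best["DEV_F1"])

print(best_epoch(LOSS_TSV))  # (3, 0.3894)
```

Epoch 3 indeed has the highest dev F1 across all ten epochs, matching the last "saving best model" line in training.log.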
runs/events.out.tfevents.1697589108.3ae7c61396a7.1160.15
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:84f87dd8da4de237226af4e134c5e1f43d699cfcef9a50d217d18016c29e6260
+size 2923780
test.tsv
ADDED
The diff for this file is too large to render.
training.log
ADDED
@@ -0,0 +1,237 @@
+2023-10-18 00:31:48,574 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,576 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): ElectraModel(
+      (embeddings): ElectraEmbeddings(
+        (word_embeddings): Embedding(32001, 768)
+        (position_embeddings): Embedding(512, 768)
+        (token_type_embeddings): Embedding(2, 768)
+        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): ElectraEncoder(
+        (layer): ModuleList(
+          (0-11): 12 x ElectraLayer(
+            (attention): ElectraAttention(
+              (self): ElectraSelfAttention(
+                (query): Linear(in_features=768, out_features=768, bias=True)
+                (key): Linear(in_features=768, out_features=768, bias=True)
+                (value): Linear(in_features=768, out_features=768, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): ElectraSelfOutput(
+                (dense): Linear(in_features=768, out_features=768, bias=True)
+                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): ElectraIntermediate(
+              (dense): Linear(in_features=768, out_features=3072, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): ElectraOutput(
+              (dense): Linear(in_features=3072, out_features=768, bias=True)
+              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=768, out_features=17, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-18 00:31:48,576 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,577 MultiCorpus: 20847 train + 1123 dev + 3350 test sentences
+ - NER_HIPE_2022 Corpus: 20847 train + 1123 dev + 3350 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/de/with_doc_seperator
+2023-10-18 00:31:48,577 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,577 Train: 20847 sentences
+2023-10-18 00:31:48,577 (train_with_dev=False, train_with_test=False)
+2023-10-18 00:31:48,577 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,577 Training Params:
+2023-10-18 00:31:48,577 - learning_rate: "5e-05"
+2023-10-18 00:31:48,577 - mini_batch_size: "4"
+2023-10-18 00:31:48,577 - max_epochs: "10"
+2023-10-18 00:31:48,577 - shuffle: "True"
+2023-10-18 00:31:48,577 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,577 Plugins:
+2023-10-18 00:31:48,578 - TensorboardLogger
+2023-10-18 00:31:48,578 - LinearScheduler | warmup_fraction: '0.1'
+2023-10-18 00:31:48,578 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,578 Final evaluation on model from best epoch (best-model.pt)
+2023-10-18 00:31:48,578 - metric: "('micro avg', 'f1-score')"
+2023-10-18 00:31:48,578 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,578 Computation:
+2023-10-18 00:31:48,578 - compute on device: cuda:0
+2023-10-18 00:31:48,578 - embedding storage: none
+2023-10-18 00:31:48,578 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,578 Model training base path: "hmbench-newseye/de-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
+2023-10-18 00:31:48,578 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,578 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:31:48,579 Logging anything other than scalars to TensorBoard is currently not supported.
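The Training Params and base path above imply a Flair fine-tuning run roughly like the sketch below. This is a reconstruction, not the exact script that produced this log: the corpus and embedding constructor arguments are assumptions inferred from the log and the base-path naming (bs4, e10, lr5e-05, poolingfirst, layers-1, crfFalse).

```python
# Hyperparameters copied from the "Training Params" block above.
TRAINING_PARAMS = {
    "learning_rate": 5e-05,
    "mini_batch_size": 4,
    "max_epochs": 10,
    "shuffle": True,
}

def fine_tune_sketch(base_path: str):
    """Rough reconstruction of the Flair run behind this log.

    Not executed here: it needs `pip install flair`, a GPU, and downloads
    the corpus and the base model. Argument names are assumptions.
    """
    from flair.datasets import NER_HIPE_2022
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    corpus = NER_HIPE_2022(dataset_name="newseye", language="de")
    embeddings = TransformerWordEmbeddings(
        "hmteams/teams-base-historic-multilingual-discriminator",  # from the base path
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )
    tagger = SequenceTagger(
        hidden_size=256,
        embeddings=embeddings,
        tag_dictionary=corpus.make_label_dictionary(label_type="ner"),
        tag_type="ner",
        use_crf=False,  # "crfFalse" in the base path
    )
    ModelTrainer(tagger, corpus).fine_tune(base_path, **TRAINING_PARAMS)
```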
+2023-10-18 00:32:29,918 epoch 1 - iter 521/5212 - loss 1.66906812 - time (sec): 41.34 - samples/sec: 930.71 - lr: 0.000005 - momentum: 0.000000
+2023-10-18 00:33:10,635 epoch 1 - iter 1042/5212 - loss 1.03719361 - time (sec): 82.05 - samples/sec: 909.14 - lr: 0.000010 - momentum: 0.000000
+2023-10-18 00:33:51,571 epoch 1 - iter 1563/5212 - loss 0.79777765 - time (sec): 122.99 - samples/sec: 892.20 - lr: 0.000015 - momentum: 0.000000
+2023-10-18 00:34:33,364 epoch 1 - iter 2084/5212 - loss 0.66095486 - time (sec): 164.78 - samples/sec: 895.33 - lr: 0.000020 - momentum: 0.000000
+2023-10-18 00:35:14,940 epoch 1 - iter 2605/5212 - loss 0.58101719 - time (sec): 206.36 - samples/sec: 885.33 - lr: 0.000025 - momentum: 0.000000
+2023-10-18 00:35:56,337 epoch 1 - iter 3126/5212 - loss 0.52357318 - time (sec): 247.76 - samples/sec: 882.27 - lr: 0.000030 - momentum: 0.000000
+2023-10-18 00:36:37,849 epoch 1 - iter 3647/5212 - loss 0.47988816 - time (sec): 289.27 - samples/sec: 891.55 - lr: 0.000035 - momentum: 0.000000
+2023-10-18 00:37:19,590 epoch 1 - iter 4168/5212 - loss 0.45555744 - time (sec): 331.01 - samples/sec: 894.11 - lr: 0.000040 - momentum: 0.000000
+2023-10-18 00:38:01,595 epoch 1 - iter 4689/5212 - loss 0.43054978 - time (sec): 373.02 - samples/sec: 889.91 - lr: 0.000045 - momentum: 0.000000
+2023-10-18 00:38:42,289 epoch 1 - iter 5210/5212 - loss 0.41213934 - time (sec): 413.71 - samples/sec: 887.73 - lr: 0.000050 - momentum: 0.000000
+2023-10-18 00:38:42,441 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:38:42,442 EPOCH 1 done: loss 0.4120 - lr: 0.000050
+2023-10-18 00:38:50,396 DEV : loss 0.16113218665122986 - f1-score (micro avg) 0.3046
+2023-10-18 00:38:50,458 saving best model
+2023-10-18 00:38:51,056 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:39:33,061 epoch 2 - iter 521/5212 - loss 0.20391832 - time (sec): 42.00 - samples/sec: 861.95 - lr: 0.000049 - momentum: 0.000000
+2023-10-18 00:40:16,191 epoch 2 - iter 1042/5212 - loss 0.20211911 - time (sec): 85.13 - samples/sec: 855.50 - lr: 0.000049 - momentum: 0.000000
+2023-10-18 00:40:57,547 epoch 2 - iter 1563/5212 - loss 0.19999558 - time (sec): 126.49 - samples/sec: 860.72 - lr: 0.000048 - momentum: 0.000000
+2023-10-18 00:41:39,640 epoch 2 - iter 2084/5212 - loss 0.20513933 - time (sec): 168.58 - samples/sec: 854.18 - lr: 0.000048 - momentum: 0.000000
+2023-10-18 00:42:21,573 epoch 2 - iter 2605/5212 - loss 0.20038286 - time (sec): 210.51 - samples/sec: 863.40 - lr: 0.000047 - momentum: 0.000000
+2023-10-18 00:43:02,141 epoch 2 - iter 3126/5212 - loss 0.19864827 - time (sec): 251.08 - samples/sec: 869.12 - lr: 0.000047 - momentum: 0.000000
+2023-10-18 00:43:42,673 epoch 2 - iter 3647/5212 - loss 0.19854493 - time (sec): 291.61 - samples/sec: 867.61 - lr: 0.000046 - momentum: 0.000000
+2023-10-18 00:44:24,671 epoch 2 - iter 4168/5212 - loss 0.19715426 - time (sec): 333.61 - samples/sec: 878.42 - lr: 0.000046 - momentum: 0.000000
+2023-10-18 00:45:07,789 epoch 2 - iter 4689/5212 - loss 0.19743047 - time (sec): 376.73 - samples/sec: 881.73 - lr: 0.000045 - momentum: 0.000000
+2023-10-18 00:45:48,782 epoch 2 - iter 5210/5212 - loss 0.19494220 - time (sec): 417.72 - samples/sec: 879.26 - lr: 0.000044 - momentum: 0.000000
+2023-10-18 00:45:48,936 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:45:48,936 EPOCH 2 done: loss 0.1949 - lr: 0.000044
+2023-10-18 00:46:01,264 DEV : loss 0.1894654929637909 - f1-score (micro avg) 0.3482
+2023-10-18 00:46:01,333 saving best model
+2023-10-18 00:46:02,756 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:46:44,151 epoch 3 - iter 521/5212 - loss 0.15457492 - time (sec): 41.39 - samples/sec: 873.70 - lr: 0.000044 - momentum: 0.000000
+2023-10-18 00:47:25,991 epoch 3 - iter 1042/5212 - loss 0.14322200 - time (sec): 83.23 - samples/sec: 885.31 - lr: 0.000043 - momentum: 0.000000
+2023-10-18 00:48:06,252 epoch 3 - iter 1563/5212 - loss 0.14225428 - time (sec): 123.49 - samples/sec: 875.22 - lr: 0.000043 - momentum: 0.000000
+2023-10-18 00:48:48,040 epoch 3 - iter 2084/5212 - loss 0.14483257 - time (sec): 165.28 - samples/sec: 888.19 - lr: 0.000042 - momentum: 0.000000
+2023-10-18 00:49:30,204 epoch 3 - iter 2605/5212 - loss 0.14346341 - time (sec): 207.44 - samples/sec: 881.51 - lr: 0.000042 - momentum: 0.000000
+2023-10-18 00:50:12,207 epoch 3 - iter 3126/5212 - loss 0.14044970 - time (sec): 249.45 - samples/sec: 882.67 - lr: 0.000041 - momentum: 0.000000
+2023-10-18 00:50:54,038 epoch 3 - iter 3647/5212 - loss 0.14195401 - time (sec): 291.28 - samples/sec: 887.79 - lr: 0.000041 - momentum: 0.000000
+2023-10-18 00:51:36,494 epoch 3 - iter 4168/5212 - loss 0.14437001 - time (sec): 333.73 - samples/sec: 882.83 - lr: 0.000040 - momentum: 0.000000
+2023-10-18 00:52:18,181 epoch 3 - iter 4689/5212 - loss 0.14670163 - time (sec): 375.42 - samples/sec: 877.15 - lr: 0.000039 - momentum: 0.000000
+2023-10-18 00:53:00,008 epoch 3 - iter 5210/5212 - loss 0.14610716 - time (sec): 417.25 - samples/sec: 880.17 - lr: 0.000039 - momentum: 0.000000
+2023-10-18 00:53:00,154 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:53:00,155 EPOCH 3 done: loss 0.1461 - lr: 0.000039
+2023-10-18 00:53:11,968 DEV : loss 0.20233668386936188 - f1-score (micro avg) 0.3894
+2023-10-18 00:53:12,019 saving best model
+2023-10-18 00:53:13,437 ----------------------------------------------------------------------------------------------------
+2023-10-18 00:53:55,104 epoch 4 - iter 521/5212 - loss 0.11142981 - time (sec): 41.66 - samples/sec: 895.43 - lr: 0.000038 - momentum: 0.000000
+2023-10-18 00:54:39,019 epoch 4 - iter 1042/5212 - loss 0.10715561 - time (sec): 85.58 - samples/sec: 863.93 - lr: 0.000038 - momentum: 0.000000
+2023-10-18 00:55:20,313 epoch 4 - iter 1563/5212 - loss 0.11064029 - time (sec): 126.87 - samples/sec: 878.12 - lr: 0.000037 - momentum: 0.000000
+2023-10-18 00:56:01,136 epoch 4 - iter 2084/5212 - loss 0.11448970 - time (sec): 167.69 - samples/sec: 872.48 - lr: 0.000037 - momentum: 0.000000
+2023-10-18 00:56:41,309 epoch 4 - iter 2605/5212 - loss 0.11567949 - time (sec): 207.87 - samples/sec: 877.17 - lr: 0.000036 - momentum: 0.000000
+2023-10-18 00:57:22,189 epoch 4 - iter 3126/5212 - loss 0.11662647 - time (sec): 248.75 - samples/sec: 881.56 - lr: 0.000036 - momentum: 0.000000
+2023-10-18 00:58:03,052 epoch 4 - iter 3647/5212 - loss 0.11508943 - time (sec): 289.61 - samples/sec: 883.76 - lr: 0.000035 - momentum: 0.000000
+2023-10-18 00:58:45,312 epoch 4 - iter 4168/5212 - loss 0.11455258 - time (sec): 331.87 - samples/sec: 886.66 - lr: 0.000034 - momentum: 0.000000
+2023-10-18 00:59:25,409 epoch 4 - iter 4689/5212 - loss 0.11467035 - time (sec): 371.97 - samples/sec: 887.99 - lr: 0.000034 - momentum: 0.000000
+2023-10-18 01:00:03,308 epoch 4 - iter 5210/5212 - loss 0.11262872 - time (sec): 409.87 - samples/sec: 896.20 - lr: 0.000033 - momentum: 0.000000
+2023-10-18 01:00:03,451 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:00:03,452 EPOCH 4 done: loss 0.1126 - lr: 0.000033
+2023-10-18 01:00:15,154 DEV : loss 0.2687111496925354 - f1-score (micro avg) 0.3657
+2023-10-18 01:00:15,206 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:00:54,028 epoch 5 - iter 521/5212 - loss 0.07311031 - time (sec): 38.82 - samples/sec: 928.03 - lr: 0.000033 - momentum: 0.000000
+2023-10-18 01:01:34,435 epoch 5 - iter 1042/5212 - loss 0.08017798 - time (sec): 79.23 - samples/sec: 944.73 - lr: 0.000032 - momentum: 0.000000
+2023-10-18 01:02:15,896 epoch 5 - iter 1563/5212 - loss 0.07619780 - time (sec): 120.69 - samples/sec: 945.18 - lr: 0.000032 - momentum: 0.000000
+2023-10-18 01:02:56,684 epoch 5 - iter 2084/5212 - loss 0.07909195 - time (sec): 161.48 - samples/sec: 924.66 - lr: 0.000031 - momentum: 0.000000
+2023-10-18 01:03:37,694 epoch 5 - iter 2605/5212 - loss 0.08215076 - time (sec): 202.49 - samples/sec: 918.77 - lr: 0.000031 - momentum: 0.000000
+2023-10-18 01:04:18,619 epoch 5 - iter 3126/5212 - loss 0.08334033 - time (sec): 243.41 - samples/sec: 920.85 - lr: 0.000030 - momentum: 0.000000
+2023-10-18 01:04:59,547 epoch 5 - iter 3647/5212 - loss 0.08207112 - time (sec): 284.34 - samples/sec: 917.40 - lr: 0.000029 - momentum: 0.000000
+2023-10-18 01:05:40,848 epoch 5 - iter 4168/5212 - loss 0.08331101 - time (sec): 325.64 - samples/sec: 913.88 - lr: 0.000029 - momentum: 0.000000
+2023-10-18 01:06:20,943 epoch 5 - iter 4689/5212 - loss 0.08261802 - time (sec): 365.73 - samples/sec: 912.70 - lr: 0.000028 - momentum: 0.000000
+2023-10-18 01:07:01,129 epoch 5 - iter 5210/5212 - loss 0.08358709 - time (sec): 405.92 - samples/sec: 905.02 - lr: 0.000028 - momentum: 0.000000
+2023-10-18 01:07:01,270 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:07:01,271 EPOCH 5 done: loss 0.0836 - lr: 0.000028
+2023-10-18 01:07:12,187 DEV : loss 0.301369845867157 - f1-score (micro avg) 0.3742
+2023-10-18 01:07:12,250 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:07:54,992 epoch 6 - iter 521/5212 - loss 0.06866396 - time (sec): 42.74 - samples/sec: 896.81 - lr: 0.000027 - momentum: 0.000000
+2023-10-18 01:08:36,567 epoch 6 - iter 1042/5212 - loss 0.06896073 - time (sec): 84.31 - samples/sec: 882.54 - lr: 0.000027 - momentum: 0.000000
+2023-10-18 01:09:16,846 epoch 6 - iter 1563/5212 - loss 0.06751622 - time (sec): 124.59 - samples/sec: 906.02 - lr: 0.000026 - momentum: 0.000000
+2023-10-18 01:09:57,387 epoch 6 - iter 2084/5212 - loss 0.06455834 - time (sec): 165.13 - samples/sec: 914.12 - lr: 0.000026 - momentum: 0.000000
+2023-10-18 01:10:38,142 epoch 6 - iter 2605/5212 - loss 0.06669693 - time (sec): 205.89 - samples/sec: 905.24 - lr: 0.000025 - momentum: 0.000000
+2023-10-18 01:11:18,567 epoch 6 - iter 3126/5212 - loss 0.06865838 - time (sec): 246.31 - samples/sec: 891.57 - lr: 0.000024 - momentum: 0.000000
+2023-10-18 01:12:00,376 epoch 6 - iter 3647/5212 - loss 0.06873425 - time (sec): 288.12 - samples/sec: 886.06 - lr: 0.000024 - momentum: 0.000000
+2023-10-18 01:12:43,940 epoch 6 - iter 4168/5212 - loss 0.06803666 - time (sec): 331.69 - samples/sec: 877.70 - lr: 0.000023 - momentum: 0.000000
+2023-10-18 01:13:25,958 epoch 6 - iter 4689/5212 - loss 0.06603815 - time (sec): 373.71 - samples/sec: 881.12 - lr: 0.000023 - momentum: 0.000000
+2023-10-18 01:14:07,418 epoch 6 - iter 5210/5212 - loss 0.06522923 - time (sec): 415.17 - samples/sec: 884.91 - lr: 0.000022 - momentum: 0.000000
+2023-10-18 01:14:07,588 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:14:07,588 EPOCH 6 done: loss 0.0652 - lr: 0.000022
+2023-10-18 01:14:18,954 DEV : loss 0.36671698093414307 - f1-score (micro avg) 0.3549
+2023-10-18 01:14:19,009 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:15:01,826 epoch 7 - iter 521/5212 - loss 0.03359695 - time (sec): 42.81 - samples/sec: 885.06 - lr: 0.000022 - momentum: 0.000000
+2023-10-18 01:15:44,187 epoch 7 - iter 1042/5212 - loss 0.03709266 - time (sec): 85.17 - samples/sec: 872.76 - lr: 0.000021 - momentum: 0.000000
+2023-10-18 01:16:24,595 epoch 7 - iter 1563/5212 - loss 0.04253411 - time (sec): 125.58 - samples/sec: 864.90 - lr: 0.000021 - momentum: 0.000000
+2023-10-18 01:17:05,737 epoch 7 - iter 2084/5212 - loss 0.04345550 - time (sec): 166.73 - samples/sec: 867.32 - lr: 0.000020 - momentum: 0.000000
+2023-10-18 01:17:47,521 epoch 7 - iter 2605/5212 - loss 0.04796044 - time (sec): 208.51 - samples/sec: 871.07 - lr: 0.000019 - momentum: 0.000000
+2023-10-18 01:18:28,730 epoch 7 - iter 3126/5212 - loss 0.04973625 - time (sec): 249.72 - samples/sec: 866.88 - lr: 0.000019 - momentum: 0.000000
+2023-10-18 01:19:10,790 epoch 7 - iter 3647/5212 - loss 0.04988937 - time (sec): 291.78 - samples/sec: 871.16 - lr: 0.000018 - momentum: 0.000000
+2023-10-18 01:19:53,123 epoch 7 - iter 4168/5212 - loss 0.04833380 - time (sec): 334.11 - samples/sec: 886.96 - lr: 0.000018 - momentum: 0.000000
+2023-10-18 01:20:33,853 epoch 7 - iter 4689/5212 - loss 0.04815110 - time (sec): 374.84 - samples/sec: 881.91 - lr: 0.000017 - momentum: 0.000000
+2023-10-18 01:21:14,383 epoch 7 - iter 5210/5212 - loss 0.04638672 - time (sec): 415.37 - samples/sec: 884.32 - lr: 0.000017 - momentum: 0.000000
+2023-10-18 01:21:14,530 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:21:14,530 EPOCH 7 done: loss 0.0464 - lr: 0.000017
+2023-10-18 01:21:25,727 DEV : loss 0.43325743079185486 - f1-score (micro avg) 0.3587
+2023-10-18 01:21:25,783 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:22:07,559 epoch 8 - iter 521/5212 - loss 0.04888283 - time (sec): 41.77 - samples/sec: 868.38 - lr: 0.000016 - momentum: 0.000000
+2023-10-18 01:22:51,475 epoch 8 - iter 1042/5212 - loss 0.05588160 - time (sec): 85.69 - samples/sec: 853.90 - lr: 0.000016 - momentum: 0.000000
+2023-10-18 01:23:30,474 epoch 8 - iter 1563/5212 - loss 0.04998288 - time (sec): 124.69 - samples/sec: 860.88 - lr: 0.000015 - momentum: 0.000000
+2023-10-18 01:24:11,992 epoch 8 - iter 2084/5212 - loss 0.04736604 - time (sec): 166.21 - samples/sec: 860.69 - lr: 0.000014 - momentum: 0.000000
+2023-10-18 01:24:54,218 epoch 8 - iter 2605/5212 - loss 0.04423360 - time (sec): 208.43 - samples/sec: 861.12 - lr: 0.000014 - momentum: 0.000000
+2023-10-18 01:25:36,727 epoch 8 - iter 3126/5212 - loss 0.04220802 - time (sec): 250.94 - samples/sec: 867.37 - lr: 0.000013 - momentum: 0.000000
+2023-10-18 01:26:21,477 epoch 8 - iter 3647/5212 - loss 0.03990034 - time (sec): 295.69 - samples/sec: 871.35 - lr: 0.000013 - momentum: 0.000000
+2023-10-18 01:27:07,438 epoch 8 - iter 4168/5212 - loss 0.03988533 - time (sec): 341.65 - samples/sec: 864.12 - lr: 0.000012 - momentum: 0.000000
+2023-10-18 01:27:50,105 epoch 8 - iter 4689/5212 - loss 0.03847361 - time (sec): 384.32 - samples/sec: 859.02 - lr: 0.000012 - momentum: 0.000000
+2023-10-18 01:28:33,305 epoch 8 - iter 5210/5212 - loss 0.03727021 - time (sec): 427.52 - samples/sec: 858.96 - lr: 0.000011 - momentum: 0.000000
+2023-10-18 01:28:33,460 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:28:33,460 EPOCH 8 done: loss 0.0373 - lr: 0.000011
+2023-10-18 01:28:44,615 DEV : loss 0.4129716157913208 - f1-score (micro avg) 0.3669
+2023-10-18 01:28:44,677 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:29:29,811 epoch 9 - iter 521/5212 - loss 0.01869215 - time (sec): 45.13 - samples/sec: 865.28 - lr: 0.000011 - momentum: 0.000000
+2023-10-18 01:30:11,850 epoch 9 - iter 1042/5212 - loss 0.02072276 - time (sec): 87.17 - samples/sec: 861.21 - lr: 0.000010 - momentum: 0.000000
+2023-10-18 01:30:55,037 epoch 9 - iter 1563/5212 - loss 0.02156431 - time (sec): 130.36 - samples/sec: 845.78 - lr: 0.000009 - momentum: 0.000000
+2023-10-18 01:31:38,177 epoch 9 - iter 2084/5212 - loss 0.02107749 - time (sec): 173.50 - samples/sec: 833.83 - lr: 0.000009 - momentum: 0.000000
+2023-10-18 01:32:20,284 epoch 9 - iter 2605/5212 - loss 0.02033570 - time (sec): 215.60 - samples/sec: 833.24 - lr: 0.000008 - momentum: 0.000000
+2023-10-18 01:33:00,969 epoch 9 - iter 3126/5212 - loss 0.02020637 - time (sec): 256.29 - samples/sec: 847.42 - lr: 0.000008 - momentum: 0.000000
+2023-10-18 01:33:41,854 epoch 9 - iter 3647/5212 - loss 0.02018397 - time (sec): 297.17 - samples/sec: 856.84 - lr: 0.000007 - momentum: 0.000000
+2023-10-18 01:34:24,795 epoch 9 - iter 4168/5212 - loss 0.01967850 - time (sec): 340.12 - samples/sec: 863.34 - lr: 0.000007 - momentum: 0.000000
+2023-10-18 01:35:06,699 epoch 9 - iter 4689/5212 - loss 0.01954663 - time (sec): 382.02 - samples/sec: 864.64 - lr: 0.000006 - momentum: 0.000000
+2023-10-18 01:35:47,910 epoch 9 - iter 5210/5212 - loss 0.01998115 - time (sec): 423.23 - samples/sec: 867.76 - lr: 0.000006 - momentum: 0.000000
+2023-10-18 01:35:48,061 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:35:48,061 EPOCH 9 done: loss 0.0200 - lr: 0.000006
+2023-10-18 01:36:00,418 DEV : loss 0.4562455713748932 - f1-score (micro avg) 0.3877
+2023-10-18 01:36:00,486 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:36:44,282 epoch 10 - iter 521/5212 - loss 0.01029065 - time (sec): 43.79 - samples/sec: 856.77 - lr: 0.000005 - momentum: 0.000000
+2023-10-18 01:37:26,680 epoch 10 - iter 1042/5212 - loss 0.01334978 - time (sec): 86.19 - samples/sec: 866.79 - lr: 0.000004 - momentum: 0.000000
+2023-10-18 01:38:06,606 epoch 10 - iter 1563/5212 - loss 0.01386083 - time (sec): 126.12 - samples/sec: 869.64 - lr: 0.000004 - momentum: 0.000000
+2023-10-18 01:38:46,290 epoch 10 - iter 2084/5212 - loss 0.01335369 - time (sec): 165.80 - samples/sec: 867.67 - lr: 0.000003 - momentum: 0.000000
+2023-10-18 01:39:27,175 epoch 10 - iter 2605/5212 - loss 0.01357185 - time (sec): 206.69 - samples/sec: 888.86 - lr: 0.000003 - momentum: 0.000000
+2023-10-18 01:40:07,806 epoch 10 - iter 3126/5212 - loss 0.01324602 - time (sec): 247.32 - samples/sec: 893.64 - lr: 0.000002 - momentum: 0.000000
+2023-10-18 01:40:48,240 epoch 10 - iter 3647/5212 - loss 0.01325955 - time (sec): 287.75 - samples/sec: 904.07 - lr: 0.000002 - momentum: 0.000000
+2023-10-18 01:41:28,545 epoch 10 - iter 4168/5212 - loss 0.01305894 - time (sec): 328.06 - samples/sec: 901.55 - lr: 0.000001 - momentum: 0.000000
+2023-10-18 01:42:08,664 epoch 10 - iter 4689/5212 - loss 0.01288314 - time (sec): 368.18 - samples/sec: 897.88 - lr: 0.000001 - momentum: 0.000000
+2023-10-18 01:42:48,729 epoch 10 - iter 5210/5212 - loss 0.01314818 - time (sec): 408.24 - samples/sec: 899.94 - lr: 0.000000 - momentum: 0.000000
+2023-10-18 01:42:48,862 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:42:48,862 EPOCH 10 done: loss 0.0131 - lr: 0.000000
+2023-10-18 01:43:00,991 DEV : loss 0.47269320487976074 - f1-score (micro avg) 0.3881
+2023-10-18 01:43:01,627 ----------------------------------------------------------------------------------------------------
+2023-10-18 01:43:01,629 Loading model from best epoch ...
+2023-10-18 01:43:04,487 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
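The 17-tag dictionary above is a BIOES scheme: S- marks single-token entities, B-/I-/E- mark the beginning, inside, and end of multi-token entities, and O marks tokens outside any entity. A minimal sketch of decoding such a tag sequence into spans (the `bioes_spans` helper name is my own):

```python
def bioes_spans(tags):
    """Decode a BIOES tag sequence into (label, start, end) spans, end exclusive."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        prefix, _, label = tag.partition("-")
        if prefix == "S":                          # single-token entity
            spans.append((label, i, i + 1))
            start = None
        elif prefix == "B":                        # entity begins
            start = i
        elif prefix == "E" and start is not None:  # entity ends
            spans.append((label, start, i + 1))
            start = None
        elif prefix == "O":                        # outside any entity
            start = None
    return spans

print(bioes_spans(["O", "S-LOC", "B-PER", "I-PER", "E-PER", "O"]))
# [('LOC', 1, 2), ('PER', 2, 5)]
```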
+2023-10-18 01:43:23,620
+Results:
+- F-score (micro) 0.3373
+- F-score (macro) 0.219
+- Accuracy 0.2047
+
+By class:
+              precision    recall  f1-score   support
+
+         LOC     0.4867    0.3624    0.4155      1214
+         PER     0.3677    0.2426    0.2923       808
+         ORG     0.2075    0.1416    0.1684       353
+   HumanProd     0.0000    0.0000    0.0000        15
+
+   micro avg     0.4088    0.2870    0.3373      2390
+   macro avg     0.2655    0.1867    0.2190      2390
+weighted avg     0.4022    0.2870    0.3347      2390
+
+2023-10-18 01:43:23,620 ----------------------------------------------------------------------------------------------------
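The micro-averaged test scores in the table above can be reproduced from per-class counts. A sketch; the (true positive, predicted, gold support) counts below are my own back-calculation, rounded, from the precision/recall/support columns, not values taken from the log:

```python
# (true positives, predicted, gold support) per class, reconstructed
# (rounded) from the classification report above -- an assumption, not logged data.
counts = {
    "LOC": (440, 904, 1214),
    "PER": (196, 533, 808),
    "ORG": (50, 241, 353),
    "HumanProd": (0, 0, 15),
}

tp = sum(c[0] for c in counts.values())      # total true positives
pred = sum(c[1] for c in counts.values())    # total predicted entities
gold = sum(c[2] for c in counts.values())    # total gold entities (2390)

micro_p = tp / pred
micro_r = tp / gold
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(round(micro_p, 4), round(micro_r, 4), round(micro_f1, 4))
# 0.4088 0.287 0.3373
```

Micro averaging pools the counts before computing precision/recall, so the large LOC class dominates; the macro F-score (0.219) is much lower because it averages per-class F1 unweighted, and HumanProd contributes a zero.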