diegokauer committed on
Commit 45a3aca
1 Parent(s): 12e14b1

End of training
README.md ADDED
---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
datasets:
- funsd-layoutlmv3
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: layoutlmv3-fintuned-funsd
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: funsd-layoutlmv3
      type: funsd-layoutlmv3
      config: funsd
      split: test
      args: funsd
    metrics:
    - name: Precision
      type: precision
      value: 0.9130434782608695
    - name: Recall
      type: recall
      value: 0.9180327868852459
    - name: F1
      type: f1
      value: 0.9155313351498637
    - name: Accuracy
      type: accuracy
      value: 0.8387020087959111
---
# layoutlmv3-fintuned-funsd

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the funsd-layoutlmv3 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7000
- Precision: 0.9130
- Recall: 0.9180
- F1: 0.9155
- Accuracy: 0.8387

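As a sanity check, the reported F1 is the harmonic mean of the reported entity-level precision and recall:

```python
import math

# Evaluation-set scores from the model card above.
precision = 0.9130434782608695
recall = 0.9180327868852459

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

assert math.isclose(f1, 0.9155313351498637, rel_tol=1e-9)
```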
## Model description

LayoutLMv3 is a multimodal Transformer that jointly embeds text tokens, their 2D layout (bounding boxes), and image patches. This checkpoint fine-tunes the base model for token classification on scanned forms: each token is tagged with one of seven BIO labels, `O`, `B-HEADER`, `I-HEADER`, `B-QUESTION`, `I-QUESTION`, `B-ANSWER`, `I-ANSWER` (see `config.json` below).

## Intended uses & limitations

The model is intended for form understanding, i.e. labeling tokens in scanned documents as headers, questions (keys), or answers (values). It is released under CC BY-NC-SA 4.0, which restricts commercial use. Since it was fine-tuned only on FUNSD (English, noisy scanned forms), performance may degrade on other languages, layouts, or document types.

## Training and evaluation data

The model was trained on funsd-layoutlmv3, a preprocessed version of the FUNSD dataset (Form Understanding in Noisy Scanned Documents). The metrics above are reported on its test split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 20000

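With a linear scheduler and no warmup steps listed, the learning rate decays linearly from `learning_rate` to zero over the 20,000 training steps. A minimal sketch of that schedule (an illustration, not the Trainer's actual implementation):

```python
def linear_lr(step, base_lr=1e-5, total_steps=20000):
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

assert linear_lr(0) == 1e-5       # full learning rate at the start
assert linear_lr(10000) == 5e-6   # half the rate at the halfway point
assert linear_lr(20000) == 0.0    # fully decayed at the final step
```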
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.33 | 100 | 0.6462 | 0.7683 | 0.8137 | 0.7903 | 0.7808 |
| No log | 2.67 | 200 | 0.4416 | 0.8248 | 0.8887 | 0.8556 | 0.8444 |
| No log | 4.0 | 300 | 0.5276 | 0.8563 | 0.9061 | 0.8805 | 0.8351 |
| No log | 5.33 | 400 | 0.5038 | 0.8319 | 0.8922 | 0.8610 | 0.8317 |
| 0.5539 | 6.67 | 500 | 0.5489 | 0.8864 | 0.9151 | 0.9005 | 0.8583 |
| 0.5539 | 8.0 | 600 | 0.5947 | 0.8646 | 0.9165 | 0.8898 | 0.8383 |
| 0.5539 | 9.33 | 700 | 0.7367 | 0.8796 | 0.8857 | 0.8827 | 0.8362 |
| 0.5539 | 10.67 | 800 | 0.8299 | 0.8911 | 0.9230 | 0.9068 | 0.8350 |
| 0.5539 | 12.0 | 900 | 0.6754 | 0.8934 | 0.9116 | 0.9024 | 0.8465 |
| 0.1203 | 13.33 | 1000 | 0.8242 | 0.8814 | 0.9011 | 0.8912 | 0.8420 |
| 0.1203 | 14.67 | 1100 | 0.9349 | 0.8835 | 0.8857 | 0.8846 | 0.8208 |
| 0.1203 | 16.0 | 1200 | 1.0205 | 0.8853 | 0.8783 | 0.8818 | 0.8131 |
| 0.1203 | 17.33 | 1300 | 0.8790 | 0.8865 | 0.8962 | 0.8913 | 0.8542 |
| 0.1203 | 18.67 | 1400 | 0.9262 | 0.8870 | 0.8967 | 0.8918 | 0.8482 |
| 0.0522 | 20.0 | 1500 | 0.9744 | 0.8979 | 0.8952 | 0.8965 | 0.8323 |
| 0.0522 | 21.33 | 1600 | 1.0198 | 0.8976 | 0.9146 | 0.9060 | 0.8401 |
| 0.0522 | 22.67 | 1700 | 1.0466 | 0.9114 | 0.9146 | 0.9130 | 0.8368 |
| 0.0522 | 24.0 | 1800 | 0.9874 | 0.8944 | 0.9086 | 0.9014 | 0.8462 |
| 0.0522 | 25.33 | 1900 | 1.1240 | 0.9051 | 0.9141 | 0.9095 | 0.8364 |
| 0.0208 | 26.67 | 2000 | 1.0658 | 0.9021 | 0.9205 | 0.9112 | 0.8399 |
| 0.0208 | 28.0 | 2100 | 1.2349 | 0.8934 | 0.9116 | 0.9024 | 0.8227 |
| 0.0208 | 29.33 | 2200 | 1.2906 | 0.8930 | 0.9081 | 0.9005 | 0.8111 |
| 0.0208 | 30.67 | 2300 | 1.2133 | 0.9020 | 0.9096 | 0.9058 | 0.8398 |
| 0.0208 | 32.0 | 2400 | 1.2202 | 0.9055 | 0.9096 | 0.9076 | 0.8375 |
| 0.0154 | 33.33 | 2500 | 1.2454 | 0.8909 | 0.9131 | 0.9019 | 0.8393 |
| 0.0154 | 34.67 | 2600 | 1.2065 | 0.9058 | 0.9126 | 0.9092 | 0.8364 |
| 0.0154 | 36.0 | 2700 | 1.2165 | 0.8998 | 0.9096 | 0.9046 | 0.8284 |
| 0.0154 | 37.33 | 2800 | 1.3195 | 0.8913 | 0.9160 | 0.9035 | 0.8272 |
| 0.0154 | 38.67 | 2900 | 1.3834 | 0.8957 | 0.9086 | 0.9021 | 0.8240 |
| 0.007 | 40.0 | 3000 | 1.3035 | 0.9001 | 0.9086 | 0.9043 | 0.8306 |
| 0.007 | 41.33 | 3100 | 1.3553 | 0.8991 | 0.9121 | 0.9055 | 0.8241 |
| 0.007 | 42.67 | 3200 | 1.3227 | 0.9054 | 0.9036 | 0.9045 | 0.8343 |
| 0.007 | 44.0 | 3300 | 1.2750 | 0.9262 | 0.9170 | 0.9216 | 0.8460 |
| 0.007 | 45.33 | 3400 | 1.2563 | 0.8980 | 0.9101 | 0.9040 | 0.8408 |
| 0.0035 | 46.67 | 3500 | 1.2013 | 0.9015 | 0.9136 | 0.9075 | 0.8383 |
| 0.0035 | 48.0 | 3600 | 1.2035 | 0.8997 | 0.9180 | 0.9088 | 0.8487 |
| 0.0035 | 49.33 | 3700 | 1.3997 | 0.9206 | 0.9041 | 0.9123 | 0.8312 |
| 0.0035 | 50.67 | 3800 | 1.3818 | 0.9117 | 0.9180 | 0.9149 | 0.8337 |
| 0.0035 | 52.0 | 3900 | 1.3568 | 0.9045 | 0.9086 | 0.9066 | 0.8410 |
| 0.0057 | 53.33 | 4000 | 1.3255 | 0.8935 | 0.9170 | 0.9051 | 0.8313 |
| 0.0057 | 54.67 | 4100 | 1.2954 | 0.8950 | 0.9151 | 0.9049 | 0.8343 |
| 0.0057 | 56.0 | 4200 | 1.4201 | 0.9028 | 0.9141 | 0.9084 | 0.8234 |
| 0.0057 | 57.33 | 4300 | 1.3858 | 0.9016 | 0.9061 | 0.9039 | 0.8238 |
| 0.0057 | 58.67 | 4400 | 1.3748 | 0.8976 | 0.9056 | 0.9016 | 0.8286 |
| 0.0155 | 60.0 | 4500 | 1.4346 | 0.9024 | 0.9051 | 0.9038 | 0.8274 |
| 0.0155 | 61.33 | 4600 | 1.4059 | 0.9073 | 0.9136 | 0.9104 | 0.8324 |
| 0.0155 | 62.67 | 4700 | 1.3779 | 0.9076 | 0.9121 | 0.9098 | 0.8358 |
| 0.0155 | 64.0 | 4800 | 1.3785 | 0.9064 | 0.9235 | 0.9149 | 0.8389 |
| 0.0155 | 65.33 | 4900 | 1.3995 | 0.9014 | 0.9220 | 0.9116 | 0.8190 |
| 0.0043 | 66.67 | 5000 | 1.2618 | 0.9087 | 0.9101 | 0.9094 | 0.8329 |
| 0.0043 | 68.0 | 5100 | 1.2878 | 0.9093 | 0.9165 | 0.9129 | 0.8526 |
| 0.0043 | 69.33 | 5200 | 1.4384 | 0.9063 | 0.9126 | 0.9094 | 0.8372 |
| 0.0043 | 70.67 | 5300 | 1.5029 | 0.9093 | 0.9165 | 0.9129 | 0.8347 |
| 0.0043 | 72.0 | 5400 | 1.4592 | 0.9107 | 0.9071 | 0.9089 | 0.8345 |
| 0.0045 | 73.33 | 5500 | 1.4604 | 0.9099 | 0.9136 | 0.9118 | 0.8263 |
| 0.0045 | 74.67 | 5600 | 1.5632 | 0.8933 | 0.9066 | 0.8999 | 0.8162 |
| 0.0045 | 76.0 | 5700 | 1.5839 | 0.9114 | 0.9096 | 0.9105 | 0.8335 |
| 0.0045 | 77.33 | 5800 | 1.5557 | 0.9099 | 0.9081 | 0.9090 | 0.8392 |
| 0.0045 | 78.67 | 5900 | 1.4348 | 0.9024 | 0.9051 | 0.9038 | 0.8266 |
| 0.0089 | 80.0 | 6000 | 1.2747 | 0.9026 | 0.9160 | 0.9093 | 0.8429 |
| 0.0089 | 81.33 | 6100 | 1.3560 | 0.8963 | 0.9230 | 0.9094 | 0.8376 |
| 0.0089 | 82.67 | 6200 | 1.2859 | 0.8987 | 0.9165 | 0.9075 | 0.8480 |
| 0.0089 | 84.0 | 6300 | 1.3389 | 0.9032 | 0.9220 | 0.9125 | 0.8315 |
| 0.0089 | 85.33 | 6400 | 1.3922 | 0.8922 | 0.9126 | 0.9023 | 0.8413 |
| 0.0045 | 86.67 | 6500 | 1.3723 | 0.9003 | 0.9066 | 0.9035 | 0.8353 |
| 0.0045 | 88.0 | 6600 | 1.3220 | 0.9011 | 0.9141 | 0.9075 | 0.8441 |
| 0.0045 | 89.33 | 6700 | 1.4433 | 0.9063 | 0.9126 | 0.9094 | 0.8213 |
| 0.0045 | 90.67 | 6800 | 1.5350 | 0.8977 | 0.9195 | 0.9085 | 0.8266 |
| 0.0045 | 92.0 | 6900 | 1.3681 | 0.9115 | 0.9215 | 0.9165 | 0.8467 |
| 0.002 | 93.33 | 7000 | 1.3141 | 0.8981 | 0.9240 | 0.9109 | 0.8398 |
| 0.002 | 94.67 | 7100 | 1.3836 | 0.9069 | 0.9146 | 0.9107 | 0.8310 |
| 0.002 | 96.0 | 7200 | 1.4456 | 0.8995 | 0.9155 | 0.9074 | 0.8310 |
| 0.002 | 97.33 | 7300 | 1.4218 | 0.9018 | 0.9031 | 0.9025 | 0.8304 |
| 0.002 | 98.67 | 7400 | 1.5428 | 0.8958 | 0.9225 | 0.9090 | 0.8252 |
| 0.0016 | 100.0 | 7500 | 1.4982 | 0.8974 | 0.9215 | 0.9093 | 0.8280 |
| 0.0016 | 101.33 | 7600 | 1.4787 | 0.9025 | 0.9240 | 0.9131 | 0.8330 |
| 0.0016 | 102.67 | 7700 | 1.5694 | 0.8982 | 0.9026 | 0.9004 | 0.8148 |
| 0.0016 | 104.0 | 7800 | 1.4361 | 0.8985 | 0.9146 | 0.9065 | 0.8244 |
| 0.0016 | 105.33 | 7900 | 1.5643 | 0.8912 | 0.9280 | 0.9092 | 0.8265 |
| 0.0065 | 106.67 | 8000 | 1.5890 | 0.9017 | 0.9155 | 0.9086 | 0.8269 |
| 0.0065 | 108.0 | 8100 | 1.5755 | 0.8901 | 0.9170 | 0.9034 | 0.8209 |
| 0.0065 | 109.33 | 8200 | 1.7716 | 0.9006 | 0.9051 | 0.9029 | 0.8105 |
| 0.0065 | 110.67 | 8300 | 1.6814 | 0.8973 | 0.9160 | 0.9066 | 0.8168 |
| 0.0065 | 112.0 | 8400 | 1.5316 | 0.9002 | 0.9230 | 0.9115 | 0.8199 |
| 0.0012 | 113.33 | 8500 | 1.5376 | 0.9041 | 0.9136 | 0.9088 | 0.8338 |
| 0.0012 | 114.67 | 8600 | 1.5773 | 0.9085 | 0.9175 | 0.9130 | 0.8334 |
| 0.0012 | 116.0 | 8700 | 1.5998 | 0.9050 | 0.9086 | 0.9068 | 0.8380 |
| 0.0012 | 117.33 | 8800 | 1.6401 | 0.8985 | 0.9101 | 0.9042 | 0.8298 |
| 0.0012 | 118.67 | 8900 | 1.6894 | 0.9055 | 0.8897 | 0.8975 | 0.8248 |
| 0.0011 | 120.0 | 9000 | 1.6945 | 0.9004 | 0.9031 | 0.9018 | 0.8275 |
| 0.0011 | 121.33 | 9100 | 1.5659 | 0.9038 | 0.9240 | 0.9138 | 0.8332 |
| 0.0011 | 122.67 | 9200 | 1.5270 | 0.8947 | 0.9240 | 0.9091 | 0.8358 |
| 0.0011 | 124.0 | 9300 | 1.5225 | 0.9081 | 0.9230 | 0.9155 | 0.8304 |
| 0.0011 | 125.33 | 9400 | 1.6064 | 0.8906 | 0.9141 | 0.9022 | 0.8232 |
| 0.0035 | 126.67 | 9500 | 1.5898 | 0.9034 | 0.9240 | 0.9136 | 0.8294 |
| 0.0035 | 128.0 | 9600 | 1.5404 | 0.8949 | 0.9225 | 0.9085 | 0.8336 |
| 0.0035 | 129.33 | 9700 | 1.4890 | 0.9074 | 0.9250 | 0.9161 | 0.8460 |
| 0.0035 | 130.67 | 9800 | 1.5620 | 0.9049 | 0.9175 | 0.9112 | 0.8315 |
| 0.0035 | 132.0 | 9900 | 1.5565 | 0.9050 | 0.9180 | 0.9115 | 0.8279 |
| 0.0014 | 133.33 | 10000 | 1.5553 | 0.8989 | 0.9230 | 0.9108 | 0.8424 |
| 0.0014 | 134.67 | 10100 | 1.5287 | 0.9060 | 0.9195 | 0.9127 | 0.8356 |
| 0.0014 | 136.0 | 10200 | 1.5282 | 0.9109 | 0.9146 | 0.9127 | 0.8398 |
| 0.0014 | 137.33 | 10300 | 1.5280 | 0.9073 | 0.9141 | 0.9107 | 0.8437 |
| 0.0014 | 138.67 | 10400 | 1.5719 | 0.9092 | 0.9151 | 0.9121 | 0.8387 |
| 0.0035 | 140.0 | 10500 | 1.5059 | 0.9074 | 0.9155 | 0.9115 | 0.8426 |
| 0.0035 | 141.33 | 10600 | 1.5702 | 0.9013 | 0.9250 | 0.9130 | 0.8355 |
| 0.0035 | 142.67 | 10700 | 1.5080 | 0.9035 | 0.9260 | 0.9146 | 0.8455 |
| 0.0035 | 144.0 | 10800 | 1.4643 | 0.9097 | 0.9255 | 0.9175 | 0.8467 |
| 0.0035 | 145.33 | 10900 | 1.5316 | 0.9037 | 0.9230 | 0.9132 | 0.8387 |
| 0.0011 | 146.67 | 11000 | 1.5314 | 0.9114 | 0.9195 | 0.9154 | 0.8392 |
| 0.0011 | 148.0 | 11100 | 1.4988 | 0.9114 | 0.9200 | 0.9157 | 0.8493 |
| 0.0011 | 149.33 | 11200 | 1.4546 | 0.9121 | 0.9275 | 0.9197 | 0.8538 |
| 0.0011 | 150.67 | 11300 | 1.5075 | 0.9062 | 0.9170 | 0.9116 | 0.8456 |
| 0.0011 | 152.0 | 11400 | 1.4556 | 0.8973 | 0.9076 | 0.9024 | 0.8393 |
| 0.0009 | 153.33 | 11500 | 1.5058 | 0.8911 | 0.9185 | 0.9046 | 0.8272 |
| 0.0009 | 154.67 | 11600 | 1.5903 | 0.9197 | 0.9101 | 0.9149 | 0.8318 |
| 0.0009 | 156.0 | 11700 | 1.5263 | 0.9164 | 0.9146 | 0.9155 | 0.8413 |
| 0.0009 | 157.33 | 11800 | 1.5729 | 0.9129 | 0.9160 | 0.9145 | 0.8386 |
| 0.0009 | 158.67 | 11900 | 1.5880 | 0.9086 | 0.9131 | 0.9108 | 0.8398 |
| 0.0009 | 160.0 | 12000 | 1.5907 | 0.9090 | 0.9126 | 0.9108 | 0.8399 |
| 0.0009 | 161.33 | 12100 | 1.5714 | 0.9111 | 0.9111 | 0.9111 | 0.8373 |
| 0.0009 | 162.67 | 12200 | 1.5848 | 0.9135 | 0.9126 | 0.9130 | 0.8378 |
| 0.0009 | 164.0 | 12300 | 1.5816 | 0.9112 | 0.9175 | 0.9144 | 0.8405 |
| 0.0009 | 165.33 | 12400 | 1.5425 | 0.9080 | 0.9121 | 0.9100 | 0.8386 |
| 0.0 | 166.67 | 12500 | 1.5837 | 0.9046 | 0.9136 | 0.9090 | 0.8362 |
| 0.0 | 168.0 | 12600 | 1.6781 | 0.9025 | 0.9195 | 0.9109 | 0.8290 |
| 0.0 | 169.33 | 12700 | 1.6219 | 0.9028 | 0.9185 | 0.9106 | 0.8326 |
| 0.0 | 170.67 | 12800 | 1.5786 | 0.9076 | 0.9126 | 0.9101 | 0.8380 |
| 0.0 | 172.0 | 12900 | 1.6212 | 0.9020 | 0.9146 | 0.9082 | 0.8322 |
| 0.0018 | 173.33 | 13000 | 1.6451 | 0.9086 | 0.9141 | 0.9113 | 0.8315 |
| 0.0018 | 174.67 | 13100 | 1.6730 | 0.9064 | 0.9185 | 0.9124 | 0.8293 |
| 0.0018 | 176.0 | 13200 | 1.6106 | 0.9026 | 0.9071 | 0.9049 | 0.8354 |
| 0.0018 | 177.33 | 13300 | 1.6403 | 0.9081 | 0.9180 | 0.9130 | 0.8402 |
| 0.0018 | 178.67 | 13400 | 1.6343 | 0.9043 | 0.9200 | 0.9121 | 0.8361 |
| 0.0012 | 180.0 | 13500 | 1.5853 | 0.9096 | 0.9195 | 0.9145 | 0.8431 |
| 0.0012 | 181.33 | 13600 | 1.5859 | 0.9101 | 0.9205 | 0.9153 | 0.8432 |
| 0.0012 | 182.67 | 13700 | 1.6137 | 0.9071 | 0.9215 | 0.9142 | 0.8394 |
| 0.0012 | 184.0 | 13800 | 1.6416 | 0.9002 | 0.9185 | 0.9093 | 0.8299 |
| 0.0012 | 185.33 | 13900 | 1.5497 | 0.9085 | 0.9126 | 0.9105 | 0.8457 |
| 0.0002 | 186.67 | 14000 | 1.6534 | 0.9015 | 0.9141 | 0.9077 | 0.8322 |
| 0.0002 | 188.0 | 14100 | 1.6003 | 0.9044 | 0.9116 | 0.9080 | 0.8290 |
| 0.0002 | 189.33 | 14200 | 1.5269 | 0.9046 | 0.9185 | 0.9115 | 0.8443 |
| 0.0002 | 190.67 | 14300 | 1.5977 | 0.9069 | 0.9141 | 0.9104 | 0.8360 |
| 0.0002 | 192.0 | 14400 | 1.5968 | 0.9090 | 0.9131 | 0.9110 | 0.8374 |
| 0.0004 | 193.33 | 14500 | 1.5945 | 0.9090 | 0.9131 | 0.9110 | 0.8376 |
| 0.0004 | 194.67 | 14600 | 1.6041 | 0.9117 | 0.9126 | 0.9121 | 0.8388 |
| 0.0004 | 196.0 | 14700 | 1.6038 | 0.9071 | 0.9121 | 0.9096 | 0.8362 |
| 0.0004 | 197.33 | 14800 | 1.6108 | 0.9059 | 0.9131 | 0.9095 | 0.8347 |
| 0.0004 | 198.67 | 14900 | 1.5873 | 0.9087 | 0.9151 | 0.9119 | 0.8393 |
| 0.0 | 200.0 | 15000 | 1.6047 | 0.9042 | 0.9146 | 0.9094 | 0.8417 |
| 0.0 | 201.33 | 15100 | 1.6125 | 0.9017 | 0.9155 | 0.9086 | 0.8348 |
| 0.0 | 202.67 | 15200 | 1.6191 | 0.9039 | 0.9155 | 0.9097 | 0.8360 |
| 0.0 | 204.0 | 15300 | 1.6626 | 0.9095 | 0.9131 | 0.9113 | 0.8320 |
| 0.0 | 205.33 | 15400 | 1.5967 | 0.9051 | 0.9190 | 0.9120 | 0.8427 |
| 0.0003 | 206.67 | 15500 | 1.5989 | 0.8982 | 0.9116 | 0.9048 | 0.8318 |
| 0.0003 | 208.0 | 15600 | 1.5990 | 0.8995 | 0.9116 | 0.9055 | 0.8307 |
| 0.0003 | 209.33 | 15700 | 1.6338 | 0.9043 | 0.9111 | 0.9077 | 0.8325 |
| 0.0003 | 210.67 | 15800 | 1.6390 | 0.9034 | 0.9101 | 0.9067 | 0.8329 |
| 0.0003 | 212.0 | 15900 | 1.6372 | 0.9015 | 0.9096 | 0.9055 | 0.8368 |
| 0.0001 | 213.33 | 16000 | 1.6020 | 0.9045 | 0.9131 | 0.9088 | 0.8378 |
| 0.0001 | 214.67 | 16100 | 1.5761 | 0.9071 | 0.9170 | 0.9121 | 0.8397 |
| 0.0001 | 216.0 | 16200 | 1.6536 | 0.9010 | 0.9131 | 0.9070 | 0.8293 |
| 0.0001 | 217.33 | 16300 | 1.6549 | 0.9023 | 0.9131 | 0.9077 | 0.8290 |
| 0.0001 | 218.67 | 16400 | 1.6737 | 0.8948 | 0.9081 | 0.9014 | 0.8292 |
| 0.0002 | 220.0 | 16500 | 1.6918 | 0.9106 | 0.9155 | 0.9131 | 0.8402 |
| 0.0002 | 221.33 | 16600 | 1.6726 | 0.9102 | 0.9165 | 0.9134 | 0.8379 |
| 0.0002 | 222.67 | 16700 | 1.6962 | 0.9121 | 0.9175 | 0.9148 | 0.8369 |
| 0.0002 | 224.0 | 16800 | 1.6974 | 0.9038 | 0.9146 | 0.9091 | 0.8367 |
| 0.0002 | 225.33 | 16900 | 1.7147 | 0.9126 | 0.9185 | 0.9156 | 0.8376 |
| 0.0006 | 226.67 | 17000 | 1.7000 | 0.9130 | 0.9180 | 0.9155 | 0.8387 |
| 0.0006 | 228.0 | 17100 | 1.6951 | 0.9083 | 0.9155 | 0.9119 | 0.8374 |
| 0.0006 | 229.33 | 17200 | 1.7014 | 0.9097 | 0.9160 | 0.9129 | 0.8369 |
| 0.0006 | 230.67 | 17300 | 1.7029 | 0.9102 | 0.9165 | 0.9134 | 0.8369 |
| 0.0006 | 232.0 | 17400 | 1.7039 | 0.9112 | 0.9170 | 0.9141 | 0.8374 |
| 0.0 | 233.33 | 17500 | 1.6516 | 0.9157 | 0.9116 | 0.9136 | 0.8355 |
| 0.0 | 234.67 | 17600 | 1.6536 | 0.9148 | 0.9126 | 0.9137 | 0.8348 |
| 0.0 | 236.0 | 17700 | 1.6548 | 0.9144 | 0.9131 | 0.9137 | 0.8348 |
| 0.0 | 237.33 | 17800 | 1.7110 | 0.9068 | 0.9185 | 0.9126 | 0.8360 |
| 0.0 | 238.67 | 17900 | 1.7115 | 0.9073 | 0.9185 | 0.9129 | 0.8362 |
| 0.0 | 240.0 | 18000 | 1.7124 | 0.9054 | 0.9180 | 0.9117 | 0.8362 |
| 0.0 | 241.33 | 18100 | 1.7146 | 0.9072 | 0.9175 | 0.9123 | 0.8376 |
| 0.0 | 242.67 | 18200 | 1.7217 | 0.9100 | 0.9141 | 0.9120 | 0.8317 |
| 0.0 | 244.0 | 18300 | 1.7225 | 0.9096 | 0.9146 | 0.9121 | 0.8315 |
| 0.0 | 245.33 | 18400 | 1.7159 | 0.9070 | 0.9155 | 0.9112 | 0.8323 |
| 0.0001 | 246.67 | 18500 | 1.7164 | 0.9074 | 0.9155 | 0.9115 | 0.8322 |
| 0.0001 | 248.0 | 18600 | 1.6927 | 0.9009 | 0.9165 | 0.9086 | 0.8326 |
| 0.0001 | 249.33 | 18700 | 1.6767 | 0.9034 | 0.9155 | 0.9094 | 0.8335 |
| 0.0001 | 250.67 | 18800 | 1.6773 | 0.9034 | 0.9155 | 0.9094 | 0.8335 |
| 0.0001 | 252.0 | 18900 | 1.6885 | 0.9029 | 0.9151 | 0.9090 | 0.8334 |
| 0.0002 | 253.33 | 19000 | 1.7032 | 0.9053 | 0.9165 | 0.9109 | 0.8312 |
| 0.0002 | 254.67 | 19100 | 1.7036 | 0.9057 | 0.9160 | 0.9108 | 0.8307 |
| 0.0002 | 256.0 | 19200 | 1.7041 | 0.9053 | 0.9160 | 0.9106 | 0.8310 |
| 0.0002 | 257.33 | 19300 | 1.7045 | 0.9053 | 0.9160 | 0.9106 | 0.8310 |
| 0.0002 | 258.67 | 19400 | 1.7049 | 0.9053 | 0.9160 | 0.9106 | 0.8310 |
| 0.0 | 260.0 | 19500 | 1.7069 | 0.9057 | 0.9165 | 0.9111 | 0.8310 |
| 0.0 | 261.33 | 19600 | 1.7062 | 0.9076 | 0.9170 | 0.9123 | 0.8312 |
| 0.0 | 262.67 | 19700 | 1.7071 | 0.9071 | 0.9170 | 0.9121 | 0.8312 |
| 0.0 | 264.0 | 19800 | 1.7083 | 0.9067 | 0.9170 | 0.9118 | 0.8313 |
| 0.0 | 265.33 | 19900 | 1.7084 | 0.9058 | 0.9170 | 0.9114 | 0.8316 |
| 0.0 | 266.67 | 20000 | 1.7086 | 0.9058 | 0.9170 | 0.9114 | 0.8316 |
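
The Epoch column in the table follows from the batch size: FUNSD's training split is conventionally 149 forms, so with `train_batch_size` 2 one epoch is about 75 steps. A quick consistency check, assuming that split size:

```python
import math

TRAIN_DOCS = 149   # FUNSD training split size (assumption based on the standard split)
BATCH_SIZE = 2     # train_batch_size from the hyperparameters above

steps_per_epoch = math.ceil(TRAIN_DOCS / BATCH_SIZE)
assert steps_per_epoch == 75
assert round(100 / steps_per_epoch, 2) == 1.33      # first logged row: step 100, epoch 1.33
assert round(20000 / steps_per_epoch, 2) == 266.67  # final step, epoch 266.67
```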


### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.14.1
config.json ADDED
{
  "_name_or_path": "microsoft/layoutlmv3-base",
  "architectures": ["LayoutLMv3ForTokenClassification"],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "coordinate_size": 128,
  "eos_token_id": 2,
  "has_relative_attention_bias": true,
  "has_spatial_attention_bias": true,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {"0": "O", "1": "B-HEADER", "2": "I-HEADER", "3": "B-QUESTION", "4": "I-QUESTION", "5": "B-ANSWER", "6": "I-ANSWER"},
  "initializer_range": 0.02,
  "input_size": 224,
  "intermediate_size": 3072,
  "label2id": {"B-ANSWER": 5, "B-HEADER": 1, "B-QUESTION": 3, "I-ANSWER": 6, "I-HEADER": 2, "I-QUESTION": 4, "O": 0},
  "layer_norm_eps": 1e-05,
  "max_2d_position_embeddings": 1024,
  "max_position_embeddings": 514,
  "max_rel_2d_pos": 256,
  "max_rel_pos": 128,
  "model_type": "layoutlmv3",
  "num_attention_heads": 12,
  "num_channels": 3,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "patch_size": 16,
  "rel_2d_pos_bins": 64,
  "rel_pos_bins": 32,
  "second_input_size": 112,
  "shape_size": 128,
  "text_embed": true,
  "torch_dtype": "float32",
  "transformers_version": "4.34.1",
  "type_vocab_size": 1,
  "visual_embed": true,
  "vocab_size": 50265
}
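
The `id2label` map in this config uses BIO tagging. A minimal sketch of grouping per-token label ids into entity spans (`decode_entities` is a hypothetical helper for illustration, not part of the released code):

```python
# id2label from the config above.
ID2LABEL = {0: "O", 1: "B-HEADER", 2: "I-HEADER", 3: "B-QUESTION",
            4: "I-QUESTION", 5: "B-ANSWER", 6: "I-ANSWER"}

def decode_entities(label_ids):
    """Group a sequence of BIO label ids into (entity_type, start, end) spans.

    `end` is exclusive. An I- tag without a matching open span starts a new one.
    """
    spans, current = [], None  # current = [type, start, end)
    for i, lid in enumerate(label_ids):
        label = ID2LABEL[lid]
        if label.startswith("B-") or (label.startswith("I-") and
                (current is None or current[0] != label[2:])):
            if current:
                spans.append(tuple(current))
            current = [label[2:], i, i + 1]
        elif label.startswith("I-"):
            current[2] = i + 1  # extend the open span
        else:  # "O" closes any open span
            if current:
                spans.append(tuple(current))
            current = None
    if current:
        spans.append(tuple(current))
    return spans

# Tokens tagged B-QUESTION I-QUESTION O B-ANSWER I-ANSWER I-ANSWER
assert decode_entities([3, 4, 0, 5, 6, 6]) == [("QUESTION", 0, 2), ("ANSWER", 3, 6)]
```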
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
preprocessor_config.json ADDED
{
  "apply_ocr": false,
  "do_normalize": true,
  "do_rescale": true,
  "do_resize": true,
  "feature_extractor_type": "LayoutLMv3FeatureExtractor",
  "image_mean": [0.5, 0.5, 0.5],
  "image_processor_type": "LayoutLMv3ImageProcessor",
  "image_std": [0.5, 0.5, 0.5],
  "ocr_lang": null,
  "processor_class": "LayoutLMv3Processor",
  "resample": 2,
  "rescale_factor": 0.00392156862745098,
  "size": {"height": 224, "width": 224},
  "tesseract_config": ""
}
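
Per this config, pixel values are rescaled by 1/255 and then normalized with per-channel mean and std 0.5, mapping [0, 255] to roughly [-1, 1]. A minimal sketch of that per-pixel transform (an illustration of the two steps, not the image processor's actual code):

```python
RESCALE_FACTOR = 1 / 255  # "rescale_factor" above (0.00392156862745098)
MEAN, STD = 0.5, 0.5      # "image_mean" / "image_std", identical per channel

def normalize_pixel(value):
    """Apply the rescale then normalize steps from the preprocessor config."""
    return (value * RESCALE_FACTOR - MEAN) / STD

assert abs(normalize_pixel(0) + 1.0) < 1e-9    # black maps to about -1
assert abs(normalize_pixel(255) - 1.0) < 1e-9  # white maps to about +1
```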
pytorch_model.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:ac3d27cc23880b14bcf7e1e218c00041cc27c3e99179fcce994a2c1e5a02487a
size 501403746
special_tokens_map.json ADDED
{
  "bos_token": {"content": "<s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
  "cls_token": {"content": "<s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
  "eos_token": {"content": "</s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
  "mask_token": {"content": "<mask>", "lstrip": true, "normalized": true, "rstrip": false, "single_word": false},
  "pad_token": {"content": "<pad>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
  "sep_token": {"content": "</s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
  "unk_token": {"content": "<unk>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false}
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
{
  "add_prefix_space": true,
  "added_tokens_decoder": {
    "0": {"content": "<s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true},
    "1": {"content": "<pad>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true},
    "2": {"content": "</s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true},
    "3": {"content": "<unk>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true},
    "50264": {"content": "<mask>", "lstrip": true, "normalized": true, "rstrip": false, "single_word": false, "special": true}
  },
  "apply_ocr": false,
  "bos_token": "<s>",
  "clean_up_tokenization_spaces": true,
  "cls_token": "<s>",
  "cls_token_box": [0, 0, 0, 0],
  "eos_token": "</s>",
  "errors": "replace",
  "mask_token": "<mask>",
  "model_max_length": 512,
  "only_label_first_subword": true,
  "pad_token": "<pad>",
  "pad_token_box": [0, 0, 0, 0],
  "pad_token_label": -100,
  "processor_class": "LayoutLMv3Processor",
  "sep_token": "</s>",
  "sep_token_box": [0, 0, 0, 0],
  "tokenizer_class": "LayoutLMv3Tokenizer",
  "trim_offsets": true,
  "unk_token": "<unk>"
}
training_args.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:c712b2159d186b838942fb4f76406bc60647ad8bbdea8f85e0eeb87a02aec347
size 4536
vocab.json ADDED
The diff for this file is too large to render. See raw diff