layoutlm-custom_no_text
This model is a fine-tuned version of microsoft/layoutlm-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the metric list):
- Loss: 0.5142
- Noise: precision 0.6765, recall 0.6695, F1 0.6730 (584 entities)
- Signal: precision 0.6298, recall 0.6233, F1 0.6265 (584 entities)
- Overall Precision: 0.6531
- Overall Recall: 0.6464
- Overall F1: 0.6497
- Overall Accuracy: 0.9156
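The card reports per-entity scores for two label types, Noise and Signal. Below is a minimal sketch of loading the checkpoint for token classification, assuming it is published under the repo id shown on this page; the BIO label layout is an assumption, so inspect `model.config.id2label` before relying on it.

```python
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

# Repo id taken from this card; swap in your own checkpoint path if needed.
model_id = "uttam333/layoutlm-custom_no_text"

tokenizer = LayoutLMTokenizerFast.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# The per-entity metrics above cover "Noise" and "Signal"; the label map is
# assumed to follow the usual BIO scheme (e.g. O, B-NOISE, I-NOISE, ...).
print(model.config.id2label)
```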
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a TrainingArguments sketch follows this list):
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
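The list above maps directly onto Hugging Face `TrainingArguments`. A minimal sketch under those settings is shown below; the output directory is a placeholder, and Adam's betas/epsilon match the library defaults, so they need no explicit flags.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters above.
# output_dir is a placeholder; adam_beta1/beta2/epsilon already default to
# 0.9 / 0.999 / 1e-8, matching the optimizer settings listed in this card.
training_args = TrainingArguments(
    output_dir="layoutlm-custom_no_text",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # inferred: the results table logs metrics once per epoch
)
```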
Training results
Training Loss | Epoch | Step | Validation Loss | Noise | Signal | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|
0.5712 | 1.0 | 18 | 0.4356 | {'precision': 0.3712374581939799, 'recall': 0.3801369863013699, 'f1': 0.3756345177664975, 'number': 584} | {'precision': 0.3294314381270903, 'recall': 0.3373287671232877, 'f1': 0.3333333333333333, 'number': 584} | 0.3503 | 0.3587 | 0.3545 | 0.8008 |
0.4233 | 2.0 | 36 | 0.3745 | {'precision': 0.4048964218455744, 'recall': 0.3681506849315068, 'f1': 0.38565022421524664, 'number': 584} | {'precision': 0.3483992467043315, 'recall': 0.3167808219178082, 'f1': 0.33183856502242154, 'number': 584} | 0.3766 | 0.3425 | 0.3587 | 0.8287 |
0.3817 | 3.0 | 54 | 0.3632 | {'precision': 0.45740740740740743, 'recall': 0.4229452054794521, 'f1': 0.43950177935943063, 'number': 584} | {'precision': 0.3851851851851852, 'recall': 0.3561643835616438, 'f1': 0.3701067615658363, 'number': 584} | 0.4213 | 0.3896 | 0.4048 | 0.8413 |
0.3472 | 4.0 | 72 | 0.3133 | {'precision': 0.5143953934740882, 'recall': 0.4589041095890411, 'f1': 0.4850678733031674, 'number': 584} | {'precision': 0.43378119001919385, 'recall': 0.386986301369863, 'f1': 0.40904977375565615, 'number': 584} | 0.4741 | 0.4229 | 0.4471 | 0.8550 |
0.3132 | 5.0 | 90 | 0.3254 | {'precision': 0.5112016293279023, 'recall': 0.4297945205479452, 'f1': 0.4669767441860465, 'number': 584} | {'precision': 0.4460285132382892, 'recall': 0.375, 'f1': 0.4074418604651162, 'number': 584} | 0.4786 | 0.4024 | 0.4372 | 0.8525 |
0.282 | 6.0 | 108 | 0.3033 | {'precision': 0.5387453874538746, 'recall': 0.5, 'f1': 0.5186500888099467, 'number': 584} | {'precision': 0.46863468634686345, 'recall': 0.4349315068493151, 'f1': 0.45115452930728245, 'number': 584} | 0.5037 | 0.4675 | 0.4849 | 0.8656 |
0.2486 | 7.0 | 126 | 0.2827 | {'precision': 0.5498220640569395, 'recall': 0.5291095890410958, 'f1': 0.5392670157068062, 'number': 584} | {'precision': 0.5071174377224199, 'recall': 0.488013698630137, 'f1': 0.49738219895287955, 'number': 584} | 0.5285 | 0.5086 | 0.5183 | 0.8773 |
0.2276 | 8.0 | 144 | 0.2798 | {'precision': 0.5597826086956522, 'recall': 0.5291095890410958, 'f1': 0.5440140845070423, 'number': 584} | {'precision': 0.5235507246376812, 'recall': 0.4948630136986301, 'f1': 0.5088028169014084, 'number': 584} | 0.5417 | 0.5120 | 0.5264 | 0.8793 |
0.197 | 9.0 | 162 | 0.2778 | {'precision': 0.5948905109489051, 'recall': 0.5582191780821918, 'f1': 0.5759717314487632, 'number': 584} | {'precision': 0.5602189781021898, 'recall': 0.5256849315068494, 'f1': 0.5424028268551236, 'number': 584} | 0.5776 | 0.5420 | 0.5592 | 0.8891 |
0.1812 | 10.0 | 180 | 0.2932 | {'precision': 0.5907407407407408, 'recall': 0.5462328767123288, 'f1': 0.5676156583629893, 'number': 584} | {'precision': 0.5666666666666667, 'recall': 0.523972602739726, 'f1': 0.5444839857651246, 'number': 584} | 0.5787 | 0.5351 | 0.5560 | 0.8888 |
0.1611 | 11.0 | 198 | 0.2785 | {'precision': 0.6156648451730419, 'recall': 0.5787671232876712, 'f1': 0.5966460723742276, 'number': 584} | {'precision': 0.5719489981785064, 'recall': 0.5376712328767124, 'f1': 0.5542806707855252, 'number': 584} | 0.5938 | 0.5582 | 0.5755 | 0.8991 |
0.1441 | 12.0 | 216 | 0.2738 | {'precision': 0.6263537906137184, 'recall': 0.5941780821917808, 'f1': 0.6098418277680141, 'number': 584} | {'precision': 0.5776173285198556, 'recall': 0.547945205479452, 'f1': 0.562390158172232, 'number': 584} | 0.6020 | 0.5711 | 0.5861 | 0.9016 |
0.1294 | 13.0 | 234 | 0.3072 | {'precision': 0.6201413427561837, 'recall': 0.601027397260274, 'f1': 0.6104347826086957, 'number': 584} | {'precision': 0.5795053003533569, 'recall': 0.5616438356164384, 'f1': 0.5704347826086956, 'number': 584} | 0.5998 | 0.5813 | 0.5904 | 0.8989 |
0.1218 | 14.0 | 252 | 0.2963 | {'precision': 0.629695885509839, 'recall': 0.6027397260273972, 'f1': 0.6159230096237971, 'number': 584} | {'precision': 0.5849731663685152, 'recall': 0.559931506849315, 'f1': 0.5721784776902886, 'number': 584} | 0.6073 | 0.5813 | 0.5941 | 0.9030 |
0.1032 | 15.0 | 270 | 0.3365 | {'precision': 0.6106194690265486, 'recall': 0.5907534246575342, 'f1': 0.6005221932114881, 'number': 584} | {'precision': 0.5681415929203539, 'recall': 0.5496575342465754, 'f1': 0.5587467362924282, 'number': 584} | 0.5894 | 0.5702 | 0.5796 | 0.8991 |
0.0981 | 16.0 | 288 | 0.3342 | {'precision': 0.631858407079646, 'recall': 0.6113013698630136, 'f1': 0.6214099216710183, 'number': 584} | {'precision': 0.5893805309734513, 'recall': 0.5702054794520548, 'f1': 0.5796344647519582, 'number': 584} | 0.6106 | 0.5908 | 0.6005 | 0.9039 |
0.0844 | 17.0 | 306 | 0.3543 | {'precision': 0.6502636203866432, 'recall': 0.6335616438356164, 'f1': 0.6418039895923676, 'number': 584} | {'precision': 0.5957820738137083, 'recall': 0.5804794520547946, 'f1': 0.5880312228967911, 'number': 584} | 0.6230 | 0.6070 | 0.6149 | 0.9050 |
0.0763 | 18.0 | 324 | 0.3559 | {'precision': 0.6392294220665499, 'recall': 0.625, 'f1': 0.632034632034632, 'number': 584} | {'precision': 0.5989492119089317, 'recall': 0.5856164383561644, 'f1': 0.5922077922077922, 'number': 584} | 0.6191 | 0.6053 | 0.6121 | 0.9075 |
0.0682 | 19.0 | 342 | 0.3599 | {'precision': 0.6666666666666666, 'recall': 0.6335616438356164, 'f1': 0.6496927129060578, 'number': 584} | {'precision': 0.618018018018018, 'recall': 0.5873287671232876, 'f1': 0.6022827041264267, 'number': 584} | 0.6423 | 0.6104 | 0.6260 | 0.9086 |
0.0685 | 20.0 | 360 | 0.3574 | {'precision': 0.670863309352518, 'recall': 0.6386986301369864, 'f1': 0.6543859649122807, 'number': 584} | {'precision': 0.6151079136690647, 'recall': 0.5856164383561644, 'f1': 0.6, 'number': 584} | 0.6430 | 0.6122 | 0.6272 | 0.9114 |
0.0591 | 21.0 | 378 | 0.3742 | {'precision': 0.6573426573426573, 'recall': 0.6438356164383562, 'f1': 0.6505190311418684, 'number': 584} | {'precision': 0.6171328671328671, 'recall': 0.6044520547945206, 'f1': 0.6107266435986158, 'number': 584} | 0.6372 | 0.6241 | 0.6306 | 0.9100 |
0.0521 | 22.0 | 396 | 0.4063 | {'precision': 0.6566901408450704, 'recall': 0.6386986301369864, 'f1': 0.6475694444444444, 'number': 584} | {'precision': 0.6161971830985915, 'recall': 0.5993150684931506, 'f1': 0.607638888888889, 'number': 584} | 0.6364 | 0.6190 | 0.6276 | 0.9095 |
0.0492 | 23.0 | 414 | 0.3971 | {'precision': 0.649737302977233, 'recall': 0.6352739726027398, 'f1': 0.6424242424242426, 'number': 584} | {'precision': 0.5971978984238179, 'recall': 0.583904109589041, 'f1': 0.5904761904761905, 'number': 584} | 0.6235 | 0.6096 | 0.6165 | 0.9086 |
0.045 | 24.0 | 432 | 0.4198 | {'precision': 0.6448275862068965, 'recall': 0.6404109589041096, 'f1': 0.6426116838487972, 'number': 584} | {'precision': 0.5948275862068966, 'recall': 0.5907534246575342, 'f1': 0.5927835051546393, 'number': 584} | 0.6198 | 0.6156 | 0.6177 | 0.9061 |
0.0391 | 25.0 | 450 | 0.4477 | {'precision': 0.643979057591623, 'recall': 0.6318493150684932, 'f1': 0.6378565254969749, 'number': 584} | {'precision': 0.5986038394415357, 'recall': 0.5873287671232876, 'f1': 0.5929127052722557, 'number': 584} | 0.6213 | 0.6096 | 0.6154 | 0.9061 |
0.0411 | 26.0 | 468 | 0.4080 | {'precision': 0.6400679117147708, 'recall': 0.6455479452054794, 'f1': 0.6427962489343563, 'number': 584} | {'precision': 0.597623089983022, 'recall': 0.6027397260273972, 'f1': 0.6001705029838021, 'number': 584} | 0.6188 | 0.6241 | 0.6215 | 0.9084 |
0.0369 | 27.0 | 486 | 0.4339 | {'precision': 0.6614035087719298, 'recall': 0.6455479452054794, 'f1': 0.6533795493934141, 'number': 584} | {'precision': 0.6105263157894737, 'recall': 0.5958904109589042, 'f1': 0.6031195840554593, 'number': 584} | 0.6360 | 0.6207 | 0.6282 | 0.9103 |
0.0315 | 28.0 | 504 | 0.4303 | {'precision': 0.6637931034482759, 'recall': 0.6592465753424658, 'f1': 0.6615120274914089, 'number': 584} | {'precision': 0.6137931034482759, 'recall': 0.6095890410958904, 'f1': 0.6116838487972508, 'number': 584} | 0.6388 | 0.6344 | 0.6366 | 0.9117 |
0.0332 | 29.0 | 522 | 0.4253 | {'precision': 0.6643717728055077, 'recall': 0.660958904109589, 'f1': 0.6626609442060085, 'number': 584} | {'precision': 0.6179001721170396, 'recall': 0.6147260273972602, 'f1': 0.6163090128755364, 'number': 584} | 0.6411 | 0.6378 | 0.6395 | 0.9134 |
0.0272 | 30.0 | 540 | 0.4594 | {'precision': 0.6495726495726496, 'recall': 0.6506849315068494, 'f1': 0.6501283147989735, 'number': 584} | {'precision': 0.5931623931623932, 'recall': 0.5941780821917808, 'f1': 0.5936698032506416, 'number': 584} | 0.6214 | 0.6224 | 0.6219 | 0.9078 |
0.027 | 31.0 | 558 | 0.4680 | {'precision': 0.6621160409556314, 'recall': 0.6643835616438356, 'f1': 0.6632478632478632, 'number': 584} | {'precision': 0.6143344709897611, 'recall': 0.6164383561643836, 'f1': 0.6153846153846154, 'number': 584} | 0.6382 | 0.6404 | 0.6393 | 0.9111 |
0.0295 | 32.0 | 576 | 0.4367 | {'precision': 0.6719022687609075, 'recall': 0.6592465753424658, 'f1': 0.6655142610198791, 'number': 584} | {'precision': 0.612565445026178, 'recall': 0.601027397260274, 'f1': 0.6067415730337079, 'number': 584} | 0.6422 | 0.6301 | 0.6361 | 0.9120 |
0.0216 | 33.0 | 594 | 0.4674 | {'precision': 0.681260945709282, 'recall': 0.666095890410959, 'f1': 0.6735930735930735, 'number': 584} | {'precision': 0.6357267950963222, 'recall': 0.6215753424657534, 'f1': 0.6285714285714286, 'number': 584} | 0.6585 | 0.6438 | 0.6511 | 0.9139 |
0.0212 | 34.0 | 612 | 0.4702 | {'precision': 0.6666666666666666, 'recall': 0.6643835616438356, 'f1': 0.6655231560891938, 'number': 584} | {'precision': 0.6202749140893471, 'recall': 0.6181506849315068, 'f1': 0.6192109777015438, 'number': 584} | 0.6435 | 0.6413 | 0.6424 | 0.9103 |
0.0227 | 35.0 | 630 | 0.4637 | {'precision': 0.657672849915683, 'recall': 0.6678082191780822, 'f1': 0.6627017841971112, 'number': 584} | {'precision': 0.6155143338954469, 'recall': 0.625, 'f1': 0.6202209005947323, 'number': 584} | 0.6366 | 0.6464 | 0.6415 | 0.9109 |
0.0196 | 36.0 | 648 | 0.4639 | {'precision': 0.6660899653979239, 'recall': 0.6592465753424658, 'f1': 0.6626506024096386, 'number': 584} | {'precision': 0.6141868512110726, 'recall': 0.6078767123287672, 'f1': 0.6110154905335629, 'number': 584} | 0.6401 | 0.6336 | 0.6368 | 0.9125 |
0.0183 | 37.0 | 666 | 0.4656 | {'precision': 0.6632478632478632, 'recall': 0.6643835616438356, 'f1': 0.6638152266894781, 'number': 584} | {'precision': 0.6, 'recall': 0.601027397260274, 'f1': 0.6005132591958939, 'number': 584} | 0.6316 | 0.6327 | 0.6322 | 0.9131 |
0.0209 | 38.0 | 684 | 0.4754 | {'precision': 0.6649214659685864, 'recall': 0.6523972602739726, 'f1': 0.658599827139153, 'number': 584} | {'precision': 0.6073298429319371, 'recall': 0.5958904109589042, 'f1': 0.6015557476231633, 'number': 584} | 0.6361 | 0.6241 | 0.6301 | 0.9131 |
0.0166 | 39.0 | 702 | 0.4703 | {'precision': 0.6695352839931153, 'recall': 0.666095890410959, 'f1': 0.6678111587982833, 'number': 584} | {'precision': 0.612736660929432, 'recall': 0.6095890410958904, 'f1': 0.6111587982832618, 'number': 584} | 0.6411 | 0.6378 | 0.6395 | 0.9151 |
0.0152 | 40.0 | 720 | 0.4739 | {'precision': 0.6626712328767124, 'recall': 0.6626712328767124, 'f1': 0.6626712328767124, 'number': 584} | {'precision': 0.6215753424657534, 'recall': 0.6215753424657534, 'f1': 0.6215753424657534, 'number': 584} | 0.6421 | 0.6421 | 0.6421 | 0.9139 |
0.0173 | 41.0 | 738 | 0.4839 | {'precision': 0.6610738255033557, 'recall': 0.6746575342465754, 'f1': 0.6677966101694915, 'number': 584} | {'precision': 0.6191275167785235, 'recall': 0.6318493150684932, 'f1': 0.6254237288135593, 'number': 584} | 0.6401 | 0.6533 | 0.6466 | 0.9139 |
0.0162 | 42.0 | 756 | 0.4854 | {'precision': 0.6610455311973018, 'recall': 0.6712328767123288, 'f1': 0.6661002548853017, 'number': 584} | {'precision': 0.6138279932546374, 'recall': 0.6232876712328768, 'f1': 0.6185216652506373, 'number': 584} | 0.6374 | 0.6473 | 0.6423 | 0.9156 |
0.0186 | 43.0 | 774 | 0.4747 | {'precision': 0.666095890410959, 'recall': 0.666095890410959, 'f1': 0.666095890410959, 'number': 584} | {'precision': 0.6061643835616438, 'recall': 0.6061643835616438, 'f1': 0.6061643835616438, 'number': 584} | 0.6361 | 0.6361 | 0.6361 | 0.9156 |
0.0149 | 44.0 | 792 | 0.4920 | {'precision': 0.6695501730103807, 'recall': 0.6626712328767124, 'f1': 0.666092943201377, 'number': 584} | {'precision': 0.6141868512110726, 'recall': 0.6078767123287672, 'f1': 0.6110154905335629, 'number': 584} | 0.6419 | 0.6353 | 0.6386 | 0.9139 |
0.0126 | 45.0 | 810 | 0.4911 | {'precision': 0.6621392190152802, 'recall': 0.6678082191780822, 'f1': 0.6649616368286446, 'number': 584} | {'precision': 0.6146010186757216, 'recall': 0.6198630136986302, 'f1': 0.6172208013640239, 'number': 584} | 0.6384 | 0.6438 | 0.6411 | 0.9117 |
0.0142 | 46.0 | 828 | 0.4932 | {'precision': 0.671280276816609, 'recall': 0.6643835616438356, 'f1': 0.6678141135972462, 'number': 584} | {'precision': 0.6228373702422145, 'recall': 0.6164383561643836, 'f1': 0.6196213425129088, 'number': 584} | 0.6471 | 0.6404 | 0.6437 | 0.9123 |
0.0107 | 47.0 | 846 | 0.5057 | {'precision': 0.6730103806228374, 'recall': 0.666095890410959, 'f1': 0.6695352839931152, 'number': 584} | {'precision': 0.6245674740484429, 'recall': 0.6181506849315068, 'f1': 0.6213425129087781, 'number': 584} | 0.6488 | 0.6421 | 0.6454 | 0.9139 |
0.0127 | 48.0 | 864 | 0.5076 | {'precision': 0.6800699300699301, 'recall': 0.666095890410959, 'f1': 0.6730103806228375, 'number': 584} | {'precision': 0.6293706293706294, 'recall': 0.6164383561643836, 'f1': 0.6228373702422145, 'number': 584} | 0.6547 | 0.6413 | 0.6479 | 0.9156 |
0.0116 | 49.0 | 882 | 0.5185 | {'precision': 0.6759098786828422, 'recall': 0.6678082191780822, 'f1': 0.6718346253229973, 'number': 584} | {'precision': 0.6291161178509532, 'recall': 0.6215753424657534, 'f1': 0.6253229974160206, 'number': 584} | 0.6525 | 0.6447 | 0.6486 | 0.9148 |
0.0099 | 50.0 | 900 | 0.5142 | {'precision': 0.6764705882352942, 'recall': 0.6695205479452054, 'f1': 0.6729776247848538, 'number': 584} | {'precision': 0.629757785467128, 'recall': 0.6232876712328768, 'f1': 0.6265060240963856, 'number': 584} | 0.6531 | 0.6464 | 0.6497 | 0.9156 |
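The per-epoch columns above (per-entity dicts plus overall precision, recall, F1, and accuracy) are the standard output of a seqeval-based `compute_metrics` hook. The sketch below shows the conventional recipe using the `evaluate` library's seqeval wrapper; it is not necessarily the exact code used for this run.

```python
import numpy as np
import evaluate

# seqeval produces the per-entity dicts and overall scores seen in the table.
seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred, label_list):
    """Convert token-classification logits into seqeval metrics,
    skipping special tokens labelled -100."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)

    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```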
Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0