# layoutlmv2-sroie-test
This model is a fine-tuned version of microsoft/layoutlm-base-uncased on an unspecified dataset (the auto-generated card lists `None`; the model name and the entity types below suggest the SROIE receipt dataset). It achieves the following results on the evaluation set:
- Loss: 0.0312

| Entity  | Precision | Recall | F1     | Support |
|---------|-----------|--------|--------|---------|
| Address | 0.9916    | 0.9941 | 0.9928 | 3907    |
| Company | 0.9665    | 0.9866 | 0.9764 | 1491    |
| Date    | 1.0000    | 0.9860 | 0.9929 | 428     |
| Total   | 0.8783    | 0.8949 | 0.8865 | 371     |

- Overall Precision: 0.9792
- Overall Recall: 0.9858
- Overall F1: 0.9825
- Overall Accuracy: 0.9947
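The per-entity F1 scores are the harmonic mean of precision and recall. As a quick sanity check, the Address-entity numbers above can be recombined in a few lines of Python:

```python
# F1 is the harmonic mean of precision and recall: F1 = 2PR / (P + R).
# Values taken from the Address-entity evaluation results above.
precision = 0.9915751850906306
recall = 0.9941131302789864

f1 = 2 * precision * recall / (precision + recall)
print(f1)  # ≈ 0.9928, the reported Address F1
```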
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
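A linear scheduler decays the learning rate from its initial value to zero over the total number of training steps (600 here: 15 epochs × 40 steps per epoch, per the results table below). A minimal sketch of that schedule, assuming zero warmup steps since none are listed in the card:

```python
# Sketch of the linear LR schedule implied by the hyperparameters above,
# assuming no warmup phase (warmup steps are not listed in the card).
def linear_lr(step, base_lr=3e-05, total_steps=600):
    """Linearly decay from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))    # 3e-05 at the start of training
print(linear_lr(300))  # 1.5e-05 halfway through
print(linear_lr(600))  # 0.0 at the final step
```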
### Training results
Per-entity columns report precision / recall / F1, rounded to four decimals; support is constant across epochs (Address: 3907, Company: 1491, Date: 428, Total: 371).

| Training Loss | Epoch | Step | Validation Loss | Address (P / R / F1) | Company (P / R / F1) | Date (P / R / F1) | Total (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.3558 | 1.0 | 40 | 0.0698 | 0.9855 / 0.9933 / 0.9894 | 0.8678 / 0.9859 / 0.9231 | 0.8384 / 0.9579 / 0.8942 | 0.0000 / 0.0000 / 0.0000 | 0.9412 | 0.9296 | 0.9354 | 0.9808 |
| 0.0489 | 2.0 | 80 | 0.0374 | 0.9918 / 0.9852 / 0.9884 | 0.9263 / 0.9940 / 0.9589 | 0.9699 / 0.9790 / 0.9744 | 0.7655 / 0.7305 / 0.7476 | 0.9607 | 0.9716 | 0.9661 | 0.9899 |
| 0.0282 | 3.0 | 120 | 0.0277 | 0.9913 / 0.9939 / 0.9926 | 0.9634 / 0.9886 / 0.9758 | 0.9929 / 0.9813 / 0.9871 | 0.8376 / 0.7925 / 0.8144 | 0.9759 | 0.9797 | 0.9778 | 0.9933 |
| 0.0194 | 4.0 | 160 | 0.0259 | 0.9901 / 0.9946 / 0.9923 | 0.9754 / 0.9846 / 0.9800 | 0.9929 / 0.9836 / 0.9883 | 0.8320 / 0.8544 / 0.8431 | 0.9771 | 0.9831 | 0.9801 | 0.9939 |
| 0.0148 | 5.0 | 200 | 0.0259 | 0.9903 / 0.9946 / 0.9925 | 0.9597 / 0.9906 / 0.9749 | 0.9953 / 0.9883 / 0.9918 | 0.8622 / 0.8598 / 0.8610 | 0.9756 | 0.9852 | 0.9803 | 0.9940 |
| 0.0113 | 6.0 | 240 | 0.0255 | 0.9911 / 0.9944 / 0.9927 | 0.9659 / 0.9886 / 0.9771 | 0.9976 / 0.9883 / 0.9930 | 0.9009 / 0.8329 / 0.8655 | 0.9804 | 0.9829 | 0.9816 | 0.9944 |
| 0.0094 | 7.0 | 280 | 0.0267 | 0.9908 / 0.9949 / 0.9928 | 0.9627 / 0.9879 / 0.9752 | 0.9953 / 0.9860 / 0.9906 | 0.8788 / 0.8598 / 0.8692 | 0.9777 | 0.9845 | 0.9811 | 0.9942 |
| 0.0082 | 8.0 | 320 | 0.0274 | 0.9916 / 0.9941 / 0.9928 | 0.9671 / 0.9866 / 0.9768 | 0.9930 / 0.9883 / 0.9906 | 0.8898 / 0.8706 / 0.8801 | 0.9798 | 0.9845 | 0.9821 | 0.9946 |
| 0.0069 | 9.0 | 360 | 0.0273 | 0.9916 / 0.9941 / 0.9928 | 0.9722 / 0.9852 / 0.9787 | 1.0000 / 0.9813 / 0.9906 | 0.8800 / 0.8895 / 0.8847 | 0.9807 | 0.9848 | 0.9828 | 0.9948 |
| 0.0055 | 10.0 | 400 | 0.0291 | 0.9906 / 0.9941 / 0.9923 | 0.9671 / 0.9866 / 0.9768 | 0.9976 / 0.9860 / 0.9918 | 0.9026 / 0.8491 / 0.8750 | 0.9804 | 0.9831 | 0.9817 | 0.9944 |
| 0.0045 | 11.0 | 440 | 0.0292 | 0.9916 / 0.9941 / 0.9928 | 0.9696 / 0.9846 / 0.9770 | 1.0000 / 0.9860 / 0.9929 | 0.8817 / 0.8841 / 0.8829 | 0.9802 | 0.9847 | 0.9825 | 0.9947 |
| 0.0042 | 12.0 | 480 | 0.0310 | 0.9913 / 0.9941 / 0.9927 | 0.9684 / 0.9859 / 0.9771 | 1.0000 / 0.9836 / 0.9918 | 0.8763 / 0.8787 / 0.8775 | 0.9795 | 0.9845 | 0.9820 | 0.9945 |
| 0.0038 | 13.0 | 520 | 0.0316 | 0.9916 / 0.9941 / 0.9928 | 0.9652 / 0.9866 / 0.9758 | 0.9953 / 0.9860 / 0.9906 | 0.8649 / 0.8976 / 0.8810 | 0.9776 | 0.9860 | 0.9818 | 0.9944 |
| 0.0035 | 14.0 | 560 | 0.0311 | 0.9916 / 0.9941 / 0.9928 | 0.9659 / 0.9866 / 0.9761 | 1.0000 / 0.9860 / 0.9929 | 0.8790 / 0.8814 / 0.8802 | 0.9791 | 0.9850 | 0.9821 | 0.9945 |
| 0.0032 | 15.0 | 600 | 0.0312 | 0.9916 / 0.9941 / 0.9928 | 0.9665 / 0.9866 / 0.9764 | 1.0000 / 0.9860 / 0.9929 | 0.8783 / 0.8949 / 0.8865 | 0.9792 | 0.9858 | 0.9825 | 0.9947 |
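The validation loss bottoms out at epoch 6 (0.0255) and drifts upward afterward while overall F1 stays roughly flat, a mild overfitting signature. A small stdlib sketch of selecting the best checkpoint from (epoch, validation-loss) pairs transcribed from the table above:

```python
# Hypothetical sketch: pick the epoch with the lowest validation loss.
# (epoch, validation_loss) pairs transcribed from the training-results table.
history = [
    (1, 0.0698), (2, 0.0374), (3, 0.0277), (4, 0.0259), (5, 0.0259),
    (6, 0.0255), (7, 0.0267), (8, 0.0274), (9, 0.0273), (10, 0.0291),
    (11, 0.0292), (12, 0.0310), (13, 0.0316), (14, 0.0311), (15, 0.0312),
]

best_epoch, best_loss = min(history, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # → 6 0.0255
```

Transformers' `Trainer` can do this automatically via `load_best_model_at_end=True`; the card does not say whether that option was used here.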
### Framework versions
- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3