---
tags:
- generated_from_trainer
model-index:
- name: layoutlm-sroie
  results: []
---
# layoutlm-sroie
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the SROIE receipt dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 0.0325
- Address: precision 0.9045, recall 0.9280, F1 0.9161 (347 fields)
- Company: precision 0.9197, recall 0.9568, F1 0.9379 (347 fields)
- Date: precision 0.9828, recall 0.9885, F1 0.9856 (347 fields)
- Total: precision 0.8914, recall 0.9222, F1 0.9065 (347 fields)
- Overall Precision: 0.9242
- Overall Recall: 0.9488
- Overall F1: 0.9364
- Overall Accuracy: 0.9947
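
As a rough illustration of how this checkpoint can be used for token classification over OCR'd receipt words, here is a minimal, hedged inference sketch. The repo id, example words, and bounding boxes are placeholders; LayoutLM expects one bounding box per wordpiece token, normalized to a 0-1000 coordinate grid.

```python
# Hedged sketch: running the fine-tuned checkpoint on OCR'd receipt words.
# The repo id, words, and boxes below are placeholders, not values from this card.
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizer

model_id = "layoutlm-sroie"  # placeholder: replace with the actual repo id
tokenizer = LayoutLMTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained(model_id)
model.eval()

# One OCR'd word per entry, each with its box normalized to a 0-1000 grid.
words = ["TOTAL", "12.50"]                            # placeholder OCR output
boxes = [[637, 773, 693, 782], [698, 773, 733, 782]]  # placeholder boxes

# LayoutLM needs one box per wordpiece token, so repeat each word's box
# for every subtoken it is split into.
input_ids = [tokenizer.cls_token_id]
token_boxes = [[0, 0, 0, 0]]  # conventional [CLS] box
for word, box in zip(words, boxes):
    word_ids = tokenizer.encode(word, add_special_tokens=False)
    input_ids.extend(word_ids)
    token_boxes.extend([box] * len(word_ids))
input_ids.append(tokenizer.sep_token_id)
token_boxes.append([1000, 1000, 1000, 1000])  # conventional [SEP] box

input_ids = torch.tensor([input_ids])
bbox = torch.tensor([token_boxes])
attention_mask = torch.ones_like(input_ids)

with torch.no_grad():
    logits = model(input_ids=input_ids, bbox=bbox, attention_mask=attention_mask).logits

predictions = logits.argmax(-1).squeeze(0).tolist()
labels = [model.config.id2label[p] for p in predictions]
print(list(zip(tokenizer.convert_ids_to_tokens(input_ids.squeeze(0).tolist()), labels)))
```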
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows this list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
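
These settings correspond roughly to the `TrainingArguments` below. This is a hedged reconstruction rather than the original training script; the output directory and the per-epoch evaluation strategy are assumptions.

```python
# Hedged reconstruction of the hyperparameters above as TrainingArguments
# (Transformers 4.28). output_dir and evaluation_strategy are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-sroie",     # assumption
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    evaluation_strategy="epoch",     # assumption: eval ran once per epoch
)
```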
### Training results
| Training Loss | Epoch | Step | Validation Loss | Address (P / R / F1) | Company (P / R / F1) | Date (P / R / F1) | Total (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4787 | 1.0 | 40 | 0.1051 | 0.8115 / 0.8934 / 0.8505 | 0.6814 / 0.8012 / 0.7364 | 0.7438 / 0.8703 / 0.8021 | 0.0 / 0.0 / 0.0 | 0.7441 | 0.6412 | 0.6889 | 0.9705 |
| 0.0698 | 2.0 | 80 | 0.0453 | 0.8480 / 0.9164 / 0.8809 | 0.8081 / 0.9222 / 0.8614 | 0.9282 / 0.9683 / 0.9478 | 0.6992 / 0.8040 / 0.7480 | 0.8179 | 0.9027 | 0.8582 | 0.9882 |
| 0.0326 | 3.0 | 120 | 0.0317 | 0.8764 / 0.9193 / 0.8973 | 0.8883 / 0.9395 / 0.9132 | 0.9713 / 0.9769 / 0.9741 | 0.8260 / 0.8617 / 0.8434 | 0.8897 | 0.9244 | 0.9067 | 0.9928 |
| 0.0222 | 4.0 | 160 | 0.0333 | 0.8923 / 0.9308 / 0.9111 | 0.8984 / 0.9424 / 0.9198 | 0.9912 / 0.9769 / 0.9840 | 0.7349 / 0.9107 / 0.8134 | 0.8712 | 0.9402 | 0.9044 | 0.9921 |
| 0.0185 | 5.0 | 200 | 0.0288 | 0.9209 / 0.9395 / 0.9301 | 0.8856 / 0.9597 / 0.9212 | 0.9913 / 0.9856 / 0.9884 | 0.8548 / 0.8991 / 0.8764 | 0.9118 | 0.9460 | 0.9286 | 0.9938 |
| 0.0141 | 6.0 | 240 | 0.0269 | 0.8992 / 0.9251 / 0.9119 | 0.9109 / 0.9424 / 0.9263 | 0.9884 / 0.9856 / 0.9870 | 0.8796 / 0.9049 / 0.8920 | 0.9190 | 0.9395 | 0.9291 | 0.9944 |
| 0.0117 | 7.0 | 280 | 0.0281 | 0.9178 / 0.9337 / 0.9257 | 0.9139 / 0.9481 / 0.9307 | 0.9856 / 0.9856 / 0.9856 | 0.8986 / 0.8934 / 0.8960 | 0.9288 | 0.9402 | 0.9345 | 0.9945 |
| 0.0104 | 8.0 | 320 | 0.0313 | 0.9101 / 0.9337 / 0.9218 | 0.9044 / 0.9539 / 0.9285 | 0.9718 / 0.9914 / 0.9815 | 0.8686 / 0.9337 / 0.9000 | 0.9130 | 0.9532 | 0.9327 | 0.9941 |
| 0.009 | 9.0 | 360 | 0.0282 | 0.9205 / 0.9337 / 0.9270 | 0.9171 / 0.9568 / 0.9365 | 0.9829 / 0.9914 / 0.9871 | 0.8886 / 0.9193 / 0.9037 | 0.9269 | 0.9503 | 0.9385 | 0.9949 |
| 0.0081 | 10.0 | 400 | 0.0313 | 0.9048 / 0.9308 / 0.9176 | 0.9218 / 0.9510 / 0.9362 | 0.9828 / 0.9885 / 0.9856 | 0.8974 / 0.9078 / 0.9026 | 0.9265 | 0.9445 | 0.9354 | 0.9945 |
| 0.0064 | 11.0 | 440 | 0.0318 | 0.9048 / 0.9308 / 0.9176 | 0.9139 / 0.9481 / 0.9307 | 0.9828 / 0.9885 / 0.9856 | 0.8791 / 0.9222 / 0.9001 | 0.9196 | 0.9474 | 0.9333 | 0.9944 |
| 0.0063 | 12.0 | 480 | 0.0335 | 0.8994 / 0.9280 / 0.9135 | 0.9088 / 0.9481 / 0.9281 | 0.9828 / 0.9885 / 0.9856 | 0.8840 / 0.9222 / 0.9027 | 0.9182 | 0.9467 | 0.9322 | 0.9943 |
| 0.0054 | 13.0 | 520 | 0.0312 | 0.9070 / 0.9280 / 0.9174 | 0.9194 / 0.9539 / 0.9364 | 0.9828 / 0.9885 / 0.9856 | 0.8840 / 0.9222 / 0.9027 | 0.9229 | 0.9481 | 0.9353 | 0.9947 |
| 0.0054 | 14.0 | 560 | 0.0326 | 0.9045 / 0.9280 / 0.9161 | 0.9222 / 0.9568 / 0.9392 | 0.9828 / 0.9885 / 0.9856 | 0.8911 / 0.9193 / 0.9050 | 0.9248 | 0.9481 | 0.9363 | 0.9946 |
| 0.0048 | 15.0 | 600 | 0.0325 | 0.9045 / 0.9280 / 0.9161 | 0.9197 / 0.9568 / 0.9379 | 0.9828 / 0.9885 / 0.9856 | 0.8914 / 0.9222 / 0.9065 | 0.9242 | 0.9488 | 0.9364 | 0.9947 |

Entity columns show precision / recall / F1, with a support of 347 fields per entity at every evaluation step; a sketch of the metric computation follows the table.
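
The per-field numbers above have the shape of seqeval entity-level metrics. Below is a hedged sketch of a `compute_metrics` function that produces this structure for a token-classification `Trainer`; the label list and the `evaluate` dependency are assumptions, and this is not necessarily the exact function used for this run.

```python
# Hedged sketch of a seqeval-based compute_metrics for token classification.
# `label_list` is an assumed set of BIO tags; `evaluate` is an assumed extra dependency.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ADDRESS", "I-ADDRESS", "B-COMPANY", "I-COMPANY",
              "B-DATE", "I-DATE", "B-TOTAL", "I-TOTAL"]  # assumption

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Drop padding and special tokens, which carry the ignore index -100.
    true_predictions = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    # Returns one dict per entity type (precision/recall/f1/number) plus
    # overall_precision, overall_recall, overall_f1, and overall_accuracy.
    return seqeval.compute(predictions=true_predictions, references=true_labels)
```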
### Framework versions
- Transformers 4.28.0
- PyTorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3