layoutlm-sroie-dacn

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on an unspecified dataset (the card's metadata names no dataset; the model name suggests SROIE, but this is not stated). It achieves the following results on the evaluation set (a hedged usage sketch follows the metrics):

  • Loss: 0.1268
  • Address: precision 0.9906, recall 0.9949, F1 0.9927 (support: 3907)
  • Company: precision 0.9705, recall 0.9913, F1 0.9808 (support: 1491)
  • Date: precision 1.0000, recall 0.9930, F1 0.9965 (support: 428)
  • Total: precision 0.8862, recall 0.8814, F1 0.8838 (support: 371)
  • Overall Precision: 0.9801
  • Overall Recall: 0.9871
  • Overall F1: 0.9836
  • Overall Accuracy: 0.9950
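
Since the usage sections below are still placeholders, here is a minimal, hedged loading sketch only. It assumes the checkpoint is a standard LayoutLM token-classification model saved by the Transformers Trainer; the repo id, example words, and bounding boxes are placeholders and are not taken from this card.

```python
# Hedged sketch: load the checkpoint and run one forward pass.
# Assumptions: hypothetical repo id, standard LayoutLM token-classification head.
import torch
from transformers import LayoutLMTokenizerFast, LayoutLMForTokenClassification

model_id = "your-username/layoutlm-sroie-dacn"  # placeholder; replace with the actual repo id
tokenizer = LayoutLMTokenizerFast.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained(model_id)
model.eval()

# LayoutLM expects token ids plus one bounding box per token, normalized to 0-1000.
words = ["TOTAL", "9.80"]                                  # illustrative OCR words
boxes = [[637, 773, 693, 782], [770, 773, 810, 782]]       # illustrative normalized boxes

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
word_ids = encoding.word_ids(batch_index=0)
# Map each sub-token back to its word's box; special tokens get a dummy box.
token_boxes = [boxes[idx] if idx is not None else [0, 0, 0, 0] for idx in word_ids]

with torch.no_grad():
    outputs = model(
        input_ids=encoding["input_ids"],
        attention_mask=encoding["attention_mask"],
        bbox=torch.tensor([token_boxes]),
    )

predictions = outputs.logits.argmax(-1).squeeze(0).tolist()
labels = [model.config.id2label[p] for p in predictions]
print(list(zip(word_ids, labels)))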

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a TrainingArguments sketch follows the list:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • label_smoothing_factor: 0.02
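
As a hedged sketch of how these values map onto the Transformers TrainingArguments API (4.28-era), with the output directory and the per-epoch evaluation/logging strategy being assumptions rather than values stated in this card:

```python
# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-sroie-dacn",   # placeholder output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    label_smoothing_factor=0.02,
    adam_beta1=0.9,                     # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",        # assumption: the results table logs metrics once per epoch
    logging_strategy="epoch",
)
```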

Training results

Per-entity cells show precision / recall / F1; support counts are constant across epochs (Address 3907, Company 1491, Date 428, Total 371).

| Training Loss | Epoch | Step | Validation Loss | Address (P / R / F1) | Company (P / R / F1) | Date (P / R / F1) | Total (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.3877 | 1.0 | 40 | 0.1597 | 0.9883 / 0.9939 / 0.9911 | 0.8616 / 0.9940 / 0.9231 | 0.8380 / 0.9790 / 0.9030 | 0.9091 / 0.0270 / 0.0524 | 0.9406 | 0.9350 | 0.9378 | 0.9812 |
| 0.1439 | 2.0 | 80 | 0.1310 | 0.9941 / 0.9908 / 0.9924 | 0.9518 / 0.9933 / 0.9721 | 0.9790 / 0.9790 / 0.9790 | 0.7128 / 0.7493 / 0.7306 | 0.9651 | 0.9761 | 0.9706 | 0.9911 |
| 0.1267 | 3.0 | 120 | 0.1283 | 0.9891 / 0.9946 / 0.9918 | 0.9648 / 0.9926 / 0.9785 | 0.9953 / 0.9836 / 0.9894 | 0.7834 / 0.8383 / 0.8099 | 0.9706 | 0.9840 | 0.9772 | 0.9931 |
| 0.12 | 4.0 | 160 | 0.1280 | 0.9841 / 0.9962 / 0.9901 | 0.9679 / 0.9906 / 0.9791 | 0.9929 / 0.9860 / 0.9894 | 0.8220 / 0.8464 / 0.8340 | 0.9709 | 0.9852 | 0.9780 | 0.9932 |
| 0.1167 | 5.0 | 200 | 0.1254 | 0.9903 / 0.9936 / 0.9920 | 0.9692 / 0.9926 / 0.9808 | 0.9929 / 0.9860 / 0.9894 | 0.8445 / 0.8491 / 0.8468 | 0.9766 | 0.9842 | 0.9804 | 0.9940 |
| 0.1128 | 6.0 | 240 | 0.1273 | 0.9866 / 0.9956 / 0.9911 | 0.9699 / 0.9940 / 0.9818 | 1.0000 / 0.9790 / 0.9894 | 0.8575 / 0.8598 / 0.8587 | 0.9757 | 0.9860 | 0.9808 | 0.9941 |
| 0.1119 | 7.0 | 280 | 0.1250 | 0.9944 / 0.9928 / 0.9936 | 0.9717 / 0.9920 / 0.9817 | 0.9976 / 0.9883 / 0.9930 | 0.9038 / 0.8356 / 0.8683 | 0.9840 | 0.9829 | 0.9835 | 0.9950 |
| 0.1111 | 8.0 | 320 | 0.1264 | 0.9901 / 0.9941 / 0.9921 | 0.9654 / 0.9933 / 0.9792 | 0.9953 / 0.9930 / 0.9942 | 0.8904 / 0.8544 / 0.8721 | 0.9787 | 0.9855 | 0.9821 | 0.9946 |
| 0.1097 | 9.0 | 360 | 0.1264 | 0.9891 / 0.9944 / 0.9917 | 0.9680 / 0.9933 / 0.9805 | 1.0000 / 0.9883 / 0.9941 | 0.8920 / 0.8679 / 0.8798 | 0.9790 | 0.9861 | 0.9826 | 0.9947 |
| 0.1091 | 10.0 | 400 | 0.1256 | 0.9918 / 0.9949 / 0.9934 | 0.9730 / 0.9899 / 0.9814 | 1.0000 / 0.9907 / 0.9953 | 0.8989 / 0.8625 / 0.8803 | 0.9825 | 0.9855 | 0.9840 | 0.9951 |
| 0.1085 | 11.0 | 440 | 0.1270 | 0.9913 / 0.9949 / 0.9931 | 0.9673 / 0.9933 / 0.9801 | 1.0000 / 0.9907 / 0.9953 | 0.8828 / 0.8733 / 0.8780 | 0.9797 | 0.9869 | 0.9833 | 0.9949 |
| 0.1089 | 12.0 | 480 | 0.1263 | 0.9903 / 0.9949 / 0.9926 | 0.9693 / 0.9953 / 0.9821 | 1.0000 / 0.9907 / 0.9953 | 0.9060 / 0.8571 / 0.8809 | 0.9811 | 0.9864 | 0.9837 | 0.9950 |
| 0.1081 | 13.0 | 520 | 0.1258 | 0.9913 / 0.9949 / 0.9931 | 0.9711 / 0.9933 / 0.9821 | 1.0000 / 0.9907 / 0.9953 | 0.8886 / 0.8814 / 0.8850 | 0.9809 | 0.9874 | 0.9842 | 0.9952 |
| 0.1081 | 14.0 | 560 | 0.1275 | 0.9903 / 0.9949 / 0.9926 | 0.9655 / 0.9953 / 0.9802 | 1.0000 / 0.9907 / 0.9953 | 0.8838 / 0.8814 / 0.8826 | 0.9786 | 0.9879 | 0.9832 | 0.9949 |
| 0.108 | 15.0 | 600 | 0.1276 | 0.9901 / 0.9949 / 0.9925 | 0.9611 / 0.9953 / 0.9779 | 1.0000 / 0.9930 / 0.9965 | 0.8937 / 0.8841 / 0.8889 | 0.9780 | 0.9882 | 0.9831 | 0.9948 |
| 0.1077 | 16.0 | 640 | 0.1268 | 0.9901 / 0.9949 / 0.9925 | 0.9655 / 0.9940 / 0.9795 | 1.0000 / 0.9930 / 0.9965 | 0.8886 / 0.8814 / 0.8850 | 0.9787 | 0.9877 | 0.9832 | 0.9949 |
| 0.1074 | 17.0 | 680 | 0.1266 | 0.9903 / 0.9949 / 0.9926 | 0.9680 / 0.9926 / 0.9801 | 1.0000 / 0.9930 / 0.9965 | 0.8923 / 0.8706 / 0.8813 | 0.9798 | 0.9868 | 0.9833 | 0.9949 |
| 0.1072 | 18.0 | 720 | 0.1266 | 0.9906 / 0.9949 / 0.9927 | 0.9711 / 0.9906 / 0.9807 | 1.0000 / 0.9930 / 0.9965 | 0.8814 / 0.8814 / 0.8814 | 0.9800 | 0.9869 | 0.9834 | 0.9950 |
| 0.1071 | 19.0 | 760 | 0.1266 | 0.9906 / 0.9949 / 0.9927 | 0.9705 / 0.9913 / 0.9808 | 1.0000 / 0.9930 / 0.9965 | 0.8859 / 0.8787 / 0.8823 | 0.9801 | 0.9869 | 0.9835 | 0.9950 |
| 0.1076 | 20.0 | 800 | 0.1268 | 0.9906 / 0.9949 / 0.9927 | 0.9705 / 0.9913 / 0.9808 | 1.0000 / 0.9930 / 0.9965 | 0.8862 / 0.8814 / 0.8838 | 0.9801 | 0.9871 | 0.9836 | 0.9950 |
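
The per-entity columns are entity-level precision/recall/F1 with a fixed support per field. As a minimal sketch of how such entity-level metrics are typically computed from BIO-tagged sequences with the seqeval library (the tag names below are illustrative assumptions, not taken from the model config):

```python
# Hedged sketch: entity-level metrics as commonly computed with seqeval.
# Tags such as B-COMPANY / B-TOTAL are assumptions based on the entity names above.
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

y_true = [["B-COMPANY", "I-COMPANY", "O", "B-TOTAL"]]
y_pred = [["B-COMPANY", "I-COMPANY", "O", "B-TOTAL"]]

print(precision_score(y_true, y_pred))
print(recall_score(y_true, y_pred))
print(f1_score(y_true, y_pred))
print(classification_report(y_true, y_pred))  # per-entity precision/recall/F1 with support
```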

Framework versions

  • Transformers 4.28.0
  • PyTorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.13.3