
layoutlm-base-uncased-finetuned-invoices-1

This model is a fine-tuned version of riteshbehera857/layoutlm-base-uncased-finetuned-invoices-0 on an unspecified invoice dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0241
  • Overall Precision: 0.9822
  • Overall Recall: 0.9853
  • Overall F1: 0.9837
  • Overall Accuracy: 0.9945

Per-entity metrics (rounded to four decimals):

| Entity      | Precision | Recall | F1     | Support |
|-------------|-----------|--------|--------|---------|
| B-adress    | 0.9670    | 0.9621 | 0.9646 | 1188    |
| B-name      | 0.9630    | 0.9854 | 0.9741 | 343     |
| Gst no      | 0.9683    | 0.9839 | 0.9760 | 124     |
| Invoice no  | 0.9727    | 0.9817 | 0.9772 | 109     |
| Order date  | 0.9841    | 0.9764 | 0.9802 | 127     |
| Order id    | 0.9769    | 0.9769 | 0.9769 | 130     |
| S-adress    | 0.9961    | 0.9961 | 0.9961 | 2063    |
| S-name      | 0.9838    | 0.9980 | 0.9908 | 488     |
| Total gross | 0.9483    | 1.0000 | 0.9735 | 55      |
| Total net   | 0.9840    | 0.9919 | 0.9880 | 124     |

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 15

Training results

Per-entity precision, recall, and F1 were also logged at each epoch; the table below summarizes the overall metrics. The checkpoint reported above corresponds to epoch 4 (validation loss 0.0241).

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---------------|-------|------|-----------------|-------------------|----------------|------------|------------------|
| 0.0357        | 1.0   | 19   | 0.0235          | 0.9782            | 0.9827         | 0.9805     | 0.9935           |
| 0.0318        | 2.0   | 38   | 0.0267          | 0.9670            | 0.9880         | 0.9774     | 0.9925           |
| 0.0251        | 3.0   | 57   | 0.0237          | 0.9783            | 0.9859         | 0.9821     | 0.9941           |
| 0.0208        | 4.0   | 76   | 0.0241          | 0.9822            | 0.9853         | 0.9837     | 0.9945           |
| 0.0174        | 5.0   | 95   | 0.0230          | 0.9822            | 0.9848         | 0.9835     | 0.9946           |
| 0.0137        | 6.0   | 114  | 0.0292          | 0.9774            | 0.9842         | 0.9808     | 0.9937           |
| 0.0133        | 7.0   | 133  | 0.0291          | 0.9715            | 0.9754         | 0.9734     | 0.9924           |
| 0.0119        | 8.0   | 152  | 0.0298          | 0.9722            | 0.9773         | 0.9747     | 0.9930           |
| 0.0108        | 9.0   | 171  | 0.0323          | 0.9749            | 0.9811         | 0.9780     | 0.9934           |
| 0.0100        | 10.0  | 190  | 0.0308          | 0.9745            | 0.9821         | 0.9783     | 0.9936           |
| 0.0093        | 11.0  | 209  | 0.0314          | 0.9721            | 0.9771         | 0.9746     | 0.9928           |
| 0.0080        | 12.0  | 228  | 0.0327          | 0.9705            | 0.9756         | 0.9730     | 0.9924           |
| 0.0077        | 13.0  | 247  | 0.0328          | 0.9719            | 0.9764         | 0.9742     | 0.9928           |
| 0.0075        | 14.0  | 266  | 0.0331          | 0.9717            | 0.9758         | 0.9737     | 0.9927           |
| 0.0076        | 15.0  | 285  | 0.0333          | 0.9719            | 0.9762         | 0.9741     | 0.9927           |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1
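To reproduce this environment, the pinned versions above would translate to something like the following (assuming a CUDA 12.1 PyTorch wheel, per the `2.4.1+cu121` tag):

```shell
pip install "transformers==4.44.2" "datasets==3.0.0" "tokenizers==0.19.1"
pip install "torch==2.4.1" --index-url https://download.pytorch.org/whl/cu121
```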
Model size: 113M parameters (F32, Safetensors format)
