riteshbehera857 committed on
Commit
cb56e9b
1 Parent(s): 632e133

End of training

Browse files
README.md ADDED
@@ -0,0 +1,87 @@
+ ---
+ library_name: transformers
+ license: mit
+ base_model: riteshbehera857/layoutlm-base-uncased-finetuned-invoices-1
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: layoutlm-base-uncased-finetuned-invoices-2
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # layoutlm-base-uncased-finetuned-invoices-2
+
+ This model is a fine-tuned version of [riteshbehera857/layoutlm-base-uncased-finetuned-invoices-1](https://huggingface.co/riteshbehera857/layoutlm-base-uncased-finetuned-invoices-1) on an invoice token-classification dataset (not further specified).
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0342
+ - B-adress: {'precision': 0.9669491525423729, 'recall': 0.971063829787234, 'f1': 0.9690021231422504, 'number': 1175}
+ - B-name: {'precision': 0.9795918367346939, 'recall': 0.9739130434782609, 'f1': 0.9767441860465117, 'number': 345}
+ - Gst no: {'precision': 1.0, 'recall': 0.9689922480620154, 'f1': 0.9842519685039369, 'number': 129}
+ - Invoice no: {'precision': 0.9702970297029703, 'recall': 0.9607843137254902, 'f1': 0.9655172413793103, 'number': 102}
+ - Order date: {'precision': 0.9672131147540983, 'recall': 0.9752066115702479, 'f1': 0.9711934156378601, 'number': 121}
+ - Order id: {'precision': 0.9770992366412213, 'recall': 0.9922480620155039, 'f1': 0.9846153846153846, 'number': 129}
+ - S-adress: {'precision': 0.978021978021978, 'recall': 0.9941489361702127, 'f1': 0.9860195199155896, 'number': 1880}
+ - S-name: {'precision': 0.9860834990059643, 'recall': 0.9575289575289575, 'f1': 0.9715964740450539, 'number': 518}
+ - Total gross: {'precision': 0.9298245614035088, 'recall': 1.0, 'f1': 0.9636363636363636, 'number': 53}
+ - Total net: {'precision': 0.9923076923076923, 'recall': 1.0, 'f1': 0.9961389961389961, 'number': 129}
+ - Overall Precision: 0.9761
+ - Overall Recall: 0.9808
+ - Overall F1: 0.9784
+ - Overall Accuracy: 0.9944
+
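Each per-label F1 above is the harmonic mean of that label's precision and recall. A quick pure-Python check against the reported B-adress numbers (values copied from the list above):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Reported B-adress metrics from the evaluation set above.
p, r = 0.9669491525423729, 0.971063829787234
print(f1_score(p, r))  # ≈ 0.9690021231422504, matching the reported f1
```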
35
+ ## Model description
36
+
37
+ More information needed
38
+
39
+ ## Intended uses & limitations
40
+
41
+ More information needed
42
+
43
+ ## Training and evaluation data
44
+
45
+ More information needed
46
+
47
+ ## Training procedure
48
+
49
+ ### Training hyperparameters
50
+
51
+ The following hyperparameters were used during training:
52
+ - learning_rate: 5e-05
53
+ - train_batch_size: 16
54
+ - eval_batch_size: 8
55
+ - seed: 42
56
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
57
+ - lr_scheduler_type: linear
58
+ - lr_scheduler_warmup_steps: 10
59
+ - num_epochs: 15
60
+
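The `linear` scheduler with 10 warmup steps ramps the learning rate from 0 up to 5e-05, then decays it linearly to 0 over the remaining steps (285 total here: 19 steps/epoch × 15 epochs). A minimal sketch of that schedule, mirroring (not calling) `transformers.get_linear_schedule_with_warmup`:

```python
def linear_lr(step: int, base_lr: float = 5e-5,
              warmup_steps: int = 10, total_steps: int = 285) -> float:
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    # Decay from base_lr at the end of warmup down to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)

print(linear_lr(0))    # 0.0
print(linear_lr(10))   # 5e-05 (warmup complete)
print(linear_lr(285))  # 0.0
```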
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.0192        | 1.0   | 19   | 0.0257          | 0.9726            | 0.9760         | 0.9743     | 0.9932           |
+ | 0.0179        | 2.0   | 38   | 0.0272          | 0.9748            | 0.9799         | 0.9774     | 0.9940           |
+ | 0.0149        | 3.0   | 57   | 0.0284          | 0.9743            | 0.9773         | 0.9758     | 0.9935           |
+ | 0.0135        | 4.0   | 76   | 0.0295          | 0.9708            | 0.9812         | 0.9760     | 0.9936           |
+ | 0.01          | 5.0   | 95   | 0.0308          | 0.9727            | 0.9804         | 0.9765     | 0.9938           |
+ | 0.009         | 6.0   | 114  | 0.0315          | 0.9727            | 0.9795         | 0.9761     | 0.9936           |
+ | 0.0081        | 7.0   | 133  | 0.0322          | 0.9750            | 0.9797         | 0.9774     | 0.9940           |
+ | 0.0069        | 8.0   | 152  | 0.0324          | 0.9746            | 0.9784         | 0.9765     | 0.9938           |
+ | 0.0064        | 9.0   | 171  | 0.0344          | 0.9729            | 0.9808         | 0.9768     | 0.9940           |
+ | 0.0058        | 10.0  | 190  | 0.0338          | 0.9748            | 0.9801         | 0.9775     | 0.9941           |
+ | 0.0057        | 11.0  | 209  | 0.0342          | 0.9761            | 0.9808         | 0.9784     | 0.9944           |
+ | 0.0051        | 12.0  | 228  | 0.0352          | 0.9748            | 0.9810         | 0.9779     | 0.9943           |
+ | 0.0051        | 13.0  | 247  | 0.0352          | 0.9740            | 0.9814         | 0.9777     | 0.9942           |
+ | 0.0046        | 14.0  | 266  | 0.0356          | 0.9738            | 0.9810         | 0.9774     | 0.9941           |
+ | 0.0044        | 15.0  | 285  | 0.0355          | 0.9742            | 0.9810         | 0.9776     | 0.9942           |
+
+ Per-label precision, recall, and F1 for the reported checkpoint (epoch 11, validation loss 0.0342, the best overall F1) are listed in the evaluation results at the top of this card.
+
+ ### Framework versions
+
+ - Transformers 4.44.2
+ - Pytorch 2.4.1+cu121
+ - Datasets 3.0.0
+ - Tokenizers 0.19.1
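LayoutLM models like this one take word bounding boxes normalized to a 0–1000 grid alongside the tokens (cf. `max_2d_position_embeddings` in the config below). A minimal sketch of that preprocessing step; the function name and the A4 page size in the example are illustrative, not from this repo:

```python
def normalize_box(box, page_width, page_height):
    """Scale an (x0, y0, x1, y1) pixel box to LayoutLM's 0-1000 grid."""
    x0, y0, x1, y1 = box
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]

# A word box on a 595x842-pt (A4) invoice page.
print(normalize_box((100, 200, 300, 220), 595, 842))  # [168, 237, 504, 261]
```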
config.json ADDED
@@ -0,0 +1,52 @@
+ {
+   "_name_or_path": "riteshbehera857/layoutlm-base-uncased-finetuned-invoices-1",
+   "architectures": [
+     "LayoutLMForTokenClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "O",
+     "1": "Invoice no",
+     "2": "Order id",
+     "3": "Order date",
+     "4": "GST no",
+     "5": "Total net",
+     "6": "Total gross",
+     "7": "S-name",
+     "8": "B-name",
+     "9": "S-adress",
+     "10": "B-adress"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "B-adress": 10,
+     "B-name": 8,
+     "GST no": 4,
+     "Invoice no": 1,
+     "O": 0,
+     "Order date": 3,
+     "Order id": 2,
+     "S-adress": 9,
+     "S-name": 7,
+     "Total gross": 6,
+     "Total net": 5
+   },
+   "layer_norm_eps": 1e-12,
+   "max_2d_position_embeddings": 1024,
+   "max_position_embeddings": 512,
+   "model_type": "layoutlm",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "output_past": true,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 30522
+ }
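The `id2label` and `label2id` maps above must be exact inverses of each other, or Trainer metrics and pipeline outputs will disagree on class names. A quick round-trip check (dict literal copied from the config above):

```python
id2label = {
    0: "O", 1: "Invoice no", 2: "Order id", 3: "Order date", 4: "GST no",
    5: "Total net", 6: "Total gross", 7: "S-name", 8: "B-name",
    9: "S-adress", 10: "B-adress",
}
label2id = {label: i for i, label in id2label.items()}

# Round-trip: every id maps to a label and back to the same id.
assert all(label2id[id2label[i]] == i for i in id2label)
print(len(id2label))  # 11 classes, matching the model's output head
```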
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d6103613111ddd3bbb8b736aadd2fa860ddee045806f7592352e9ca4cf2ed07d
+ size 450570516
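`model.safetensors` is stored as a Git LFS pointer: a tiny text file recording the spec version, the SHA-256 of the real object, and its size in bytes (about 450 MB here). A minimal parser for that key-value format, fed the pointer contents shown above:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:d6103613111ddd3bbb8b736aadd2fa860ddee045806f7592352e9ca4cf2ed07d
size 450570516
"""
info = parse_lfs_pointer(pointer)
print(int(info["size"]) / 1e6)  # ≈ 450.57 MB
```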
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+   "cls_token": {
+     "content": "[CLS]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "[MASK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "[PAD]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "[SEP]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_lower_case": true,
+   "mask_token": "[MASK]",
+   "max_len": 512,
+   "model_max_length": 512,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "LayoutLMTokenizer",
+   "unk_token": "[UNK]"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8c9c4a2b145445e7c7083dd9d6a2585893db9fb6ff1959c27746e082e3580f47
+ size 5176
vocab.txt ADDED
The diff for this file is too large to render. See raw diff