
lmv2-g-w2-300-doc-09-08

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unspecified dataset (evidently scanned W-2 tax forms, per the model name and the field labels below). It achieves the following results on the evaluation set:

  • Loss: 0.0262

Per-field results (each row lists entity-level Precision, Recall, F1, and Number, the support, i.e. the count of gold entities of that field in the evaluation set; a sketch of how such metrics are computed appears after the overall figures below):

| Field | Precision | Recall | F1 | Number |
|---|---|---|---|---|
| Control Number | 1.0 | 1.0 | 1.0 | 17 |
| EIN | 1.0 | 0.9833 | 0.9916 | 60 |
| Employee’s Address | 0.9667 | 0.9831 | 0.9748 | 59 |
| Employee’s Name | 0.9833 | 1.0 | 0.9916 | 59 |
| Employee’s SSN | 0.9836 | 1.0 | 0.9917 | 60 |
| Employer’s Address | 0.9833 | 0.9672 | 0.9752 | 61 |
| Employer’s Name | 0.9833 | 0.9833 | 0.9833 | 60 |
| Federal Income Tax Withheld | 1.0 | 1.0 | 1.0 | 60 |
| Medicare Tax Withheld | 1.0 | 1.0 | 1.0 | 60 |
| Medicare Wages Tips | 1.0 | 1.0 | 1.0 | 60 |
| Social Security Tax Withheld | 1.0 | 0.9836 | 0.9917 | 61 |
| Social Security Wages | 0.9833 | 1.0 | 0.9916 | 59 |
| Wages Tips | 1.0 | 0.9836 | 0.9917 | 61 |

  • Overall Precision: 0.9905
  • Overall Recall: 0.9905
  • Overall F1: 0.9905
  • Overall Accuracy: 0.9973
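
These per-field figures are entity-level metrics in the seqeval style, where the support ("Number") counts the gold entities of each type. Below is a minimal sketch of such an evaluation; the BIO tag names are illustrative placeholders, since the card does not document the model's actual label set:

```python
# Entity-level precision/recall/F1 of the kind reported above, via seqeval.
# The BIO tags below are illustrative placeholders, not this model's label set.
from seqeval.metrics import classification_report

y_true = [["B-EIN", "I-EIN", "O", "B-WAGES_TIPS", "O"]]
y_pred = [["B-EIN", "I-EIN", "O", "B-WAGES_TIPS", "O"]]

# "support" in the report corresponds to the "Number" column above.
print(classification_report(y_true, y_pred, digits=4))
```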

Model description

More information needed

Intended uses & limitations

More information needed
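
No official usage example is provided. The following is a minimal inference sketch only, assuming the standard LayoutLMv2 token-classification setup: the Hub model id and input file are placeholders (neither is documented in this card), and the processor's built-in OCR requires detectron2 and pytesseract to be installed.

```python
# Hedged inference sketch for a LayoutLMv2 token-classification checkpoint.
# "your-namespace/lmv2-g-w2-300-doc-09-08" and "w2_sample.png" are placeholders.
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("your-namespace/lmv2-g-w2-300-doc-09-08")

image = Image.open("w2_sample.png").convert("RGB")  # a scanned W-2 form
encoding = processor(image, return_tensors="pt")    # built-in OCR extracts words + boxes

outputs = model(**encoding)
pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
pred_labels = [model.config.id2label[i] for i in pred_ids]
print(pred_labels)
```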

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 4e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant
  • num_epochs: 30
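
These settings map directly onto transformers.TrainingArguments; a hedged reconstruction is below. The fine-tuning dataset is not documented, so the Trainer wiring at the end is indicative only.

```python
# Sketch mapping the listed hyperparameters onto transformers.TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="lmv2-g-w2-300-doc-09-08",
    learning_rate=4e-5,              # learning_rate: 4e-05
    per_device_train_batch_size=1,   # train_batch_size: 1
    per_device_eval_batch_size=1,    # eval_batch_size: 1
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # epsilon=1e-08
    lr_scheduler_type="constant",
    num_train_epochs=30,
)

# Indicative wiring only; the W-2 dataset is not public, so train_ds / eval_ds
# are undefined placeholders here:
# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```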

Training results

Each field column reports Precision / Recall / F1 / Number; the final column reports overall Precision / Recall / F1 / Accuracy.

| Training Loss | Epoch | Step | Validation Loss | Control Number | EIN | Employee’s Address | Employee’s Name | Employee’s SSN | Employer’s Address | Employer’s Name | Federal Income Tax Withheld | Medicare Tax Withheld | Medicare Wages Tips | Social Security Tax Withheld | Social Security Wages | Wages Tips | Overall (P / R / F1 / Acc) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.7717 | 1.0 | 240 | 0.9856 | 0.0 / 0.0 / 0.0 / 17 | 0.9206 / 0.9667 / 0.9431 / 60 | 0.6824 / 0.9831 / 0.8056 / 59 | 0.2333 / 0.5932 / 0.3349 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.7609 / 0.5738 / 0.6542 / 61 | 0.3654 / 0.3167 / 0.3393 / 60 | 0.0 / 0.0 / 0.0 / 60 | 0.8194 / 0.9833 / 0.8939 / 60 | 0.6064 / 0.95 / 0.7403 / 60 | 0.5050 / 0.8361 / 0.6296 / 61 | 0.0 / 0.0 / 0.0 / 59 | 0.5859 / 0.9508 / 0.725 / 61 | 0.5954 / 0.6649 / 0.6282 / 0.9558 |
| 0.5578 | 2.0 | 480 | 0.2957 | 0.8462 / 0.6471 / 0.7333 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9048 / 0.9661 / 0.9344 / 59 | 0.8358 / 0.9492 / 0.8889 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8125 / 0.8525 / 0.8320 / 61 | 0.8462 / 0.9167 / 0.8800 / 60 | 0.9672 / 0.9833 / 0.9752 / 60 | 0.9524 / 1.0 / 0.9756 / 60 | 0.9194 / 0.95 / 0.9344 / 60 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9508 / 0.9831 / 0.9667 / 59 | 0.9516 / 0.9672 / 0.9593 / 61 | 0.9212 / 0.9512 / 0.9359 / 0.9891 |
| 0.223 | 3.0 | 720 | 0.1626 | 0.5 / 0.6471 / 0.5641 / 17 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9672 / 1.0 / 0.9833 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8769 / 0.9344 / 0.9048 / 61 | 0.9508 / 0.9667 / 0.9587 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8769 / 0.95 / 0.912 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9516 / 0.9672 / 0.9593 / 61 | 0.9370 / 0.9688 / 0.9526 / 0.9923 |
| 0.1305 | 4.0 | 960 | 0.1025 | 0.9444 / 1.0 / 0.9714 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9194 / 0.9661 / 0.9421 / 59 | 0.9508 / 0.9831 / 0.9667 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9219 / 0.9672 / 0.944 / 61 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 0.9524 / 1.0 / 0.9756 / 60 | 0.8906 / 0.95 / 0.9194 / 60 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9516 / 0.9672 / 0.9593 / 61 | 0.9511 / 0.9756 / 0.9632 / 0.9947 |
| 0.0852 | 5.0 | 1200 | 0.0744 | 0.7391 / 1.0 / 0.85 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9344 / 0.9344 / 0.9344 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9365 / 0.9833 / 0.9593 / 60 | 0.9677 / 1.0 / 0.9836 / 60 | 0.95 / 0.95 / 0.9500 / 60 | 0.9836 / 0.9836 / 0.9836 / 61 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9626 / 0.9783 / 0.9704 / 0.9953 |
| 0.0583 | 6.0 | 1440 | 0.0554 | 0.7727 / 1.0 / 0.8718 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9048 / 0.9344 / 0.9194 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 0.9344 / 0.95 / 0.9421 / 60 | 1.0 / 0.9672 / 0.9833 / 61 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9677 / 0.9756 / 0.9716 / 0.9957 |
| 0.0431 | 7.0 | 1680 | 0.0471 | 0.9444 / 1.0 / 0.9714 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9016 / 0.9322 / 0.9167 / 59 | 0.95 / 0.9661 / 0.9580 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8676 / 0.9672 / 0.9147 / 61 | 0.9831 / 0.9667 / 0.9748 / 60 | 1.0 / 0.9833 / 0.9916 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9516 / 0.9833 / 0.9672 / 60 | 0.9836 / 0.9836 / 0.9836 / 61 | 0.9831 / 0.9831 / 0.9831 / 59 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9625 / 0.9756 / 0.9690 / 0.9947 |
| 0.0314 | 8.0 | 1920 | 0.0359 | 1.0 / 1.0 / 1.0 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9516 / 0.9672 / 0.9593 / 61 | 1.0 / 0.9667 / 0.9831 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9516 / 0.9833 / 0.9672 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9831 / 0.9831 / 0.9831 / 59 | 0.9672 / 0.9672 / 0.9672 / 61 | 0.9771 / 0.9824 / 0.9797 / 0.9969 |
| 0.0278 | 9.0 | 2160 | 0.0338 | 0.8947 / 1.0 / 0.9444 / 17 | 0.9833 / 0.9833 / 0.9833 / 60 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9667 / 0.9831 / 0.9748 / 59 | 1.0 / 1.0 / 1.0 / 60 | 0.9365 / 0.9672 / 0.9516 / 61 | 0.9672 / 0.9833 / 0.9752 / 60 | 1.0 / 0.9833 / 0.9916 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9516 / 0.9833 / 0.9672 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9672 / 0.9672 / 0.9672 / 61 | 0.9705 / 0.9837 / 0.9771 / 0.9965 |
| 0.0231 | 10.0 | 2400 | 0.0332 | 0.9444 / 1.0 / 0.9714 / 17 | 0.9831 / 0.9667 / 0.9748 / 60 | 0.9508 / 0.9831 / 0.9667 / 59 | 0.9048 / 0.9661 / 0.9344 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9667 / 0.9508 / 0.9587 / 61 | 0.9667 / 0.9667 / 0.9667 / 60 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9365 / 0.9833 / 0.9593 / 60 | 1.0 / 0.9672 / 0.9833 / 61 | 0.9831 / 0.9831 / 0.9831 / 59 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9690 / 0.9769 / 0.9730 / 0.9964 |
| 0.0189 | 11.0 | 2640 | 0.0342 | 1.0 / 1.0 / 1.0 / 17 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.8657 / 0.9831 / 0.9206 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8594 / 0.9016 / 0.88 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9516 / 0.9672 / 0.9593 / 61 | 0.964 / 0.9810 / 0.9724 / 0.9958 |
| 0.0187 | 12.0 | 2880 | 0.0255 | 1.0 / 1.0 / 1.0 / 17 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9508 / 0.9831 / 0.9667 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9667 / 0.9508 / 0.9587 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9672 / 0.9833 / 0.9752 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9824 / 0.9851 / 0.9837 / 0.9976 |
| 0.0126 | 13.0 | 3120 | 0.0257 | 1.0 / 1.0 / 1.0 / 17 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9344 / 0.9661 / 0.95 / 59 | 0.8889 / 0.9492 / 0.9180 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8788 / 0.9508 / 0.9134 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9836 / 1.0 / 0.9917 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9672 / 0.9833 / 61 | 0.9508 / 0.9831 / 0.9667 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9652 / 0.9796 / 0.9724 / 0.9971 |
| 0.012 | 14.0 | 3360 | 0.0227 | 1.0 / 1.0 / 1.0 / 17 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9516 / 1.0 / 0.9752 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9194 / 0.9344 / 0.9268 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9672 / 0.9833 / 0.9752 / 60 | 1.0 / 0.9833 / 0.9916 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9836 / 0.9836 / 0.9836 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9784 / 0.9851 / 0.9817 / 0.9977 |
| 0.0119 | 15.0 | 3600 | 0.0284 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 1.0 / 1.0 / 60 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 1.0 / 1.0 / 60 | 0.9167 / 0.9016 / 0.9091 / 61 | 0.9661 / 0.95 / 0.9580 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9810 / 0.9824 / 0.9817 / 0.9965 |
| 0.0103 | 16.0 | 3840 | 0.0289 | 0.9444 / 1.0 / 0.9714 / 17 | 0.9672 / 0.9833 / 0.9752 / 60 | 0.9344 / 0.9661 / 0.95 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 1.0 / 1.0 / 60 | 0.8088 / 0.9016 / 0.8527 / 61 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9666 / 0.9810 / 0.9737 / 0.9963 |
| 0.01 | 17.0 | 4080 | 0.0305 | 0.8947 / 1.0 / 0.9444 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9516 / 1.0 / 0.9752 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9355 / 0.9508 / 0.9431 / 61 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.8955 / 1.0 / 0.9449 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9694 / 0.9891 / 0.9792 / 0.9961 |
| 0.0082 | 18.0 | 4320 | 0.0256 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9508 / 0.9831 / 0.9667 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8636 / 0.9344 / 0.8976 / 61 | 0.9831 / 0.9667 / 0.9748 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9785 / 0.9864 / 0.9824 / 0.9970 |
| 0.0059 | 19.0 | 4560 | 0.0255 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9355 / 0.9508 / 0.9431 / 61 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9865 / 0.9891 / 0.9878 / 0.9974 |
| 0.0078 | 20.0 | 4800 | 0.0293 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9508 / 0.9831 / 0.9667 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9 / 0.8852 / 0.8926 / 61 | 0.9661 / 0.95 / 0.9580 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9810 / 0.9810 / 0.9810 / 0.9966 |
| 0.009 | 21.0 | 5040 | 0.0264 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9206 / 0.9831 / 0.9508 / 59 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8889 / 0.9180 / 0.9032 / 61 | 0.9672 / 0.9833 / 0.9752 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9836 / 0.9836 / 0.9836 / 61 | 0.9831 / 0.9831 / 0.9831 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9745 / 0.9837 / 0.9791 / 0.9969 |
| 0.0046 | 22.0 | 5280 | 0.0271 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9355 / 0.9831 / 0.9587 / 59 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9032 / 0.9180 / 0.9106 / 61 | 0.9672 / 0.9833 / 0.9752 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9784 / 0.9851 / 0.9817 / 0.9970 |
| 0.0087 | 23.0 | 5520 | 0.0278 | 0.9444 / 1.0 / 0.9714 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9194 / 0.9661 / 0.9421 / 59 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8657 / 0.9508 / 0.9062 / 61 | 0.9836 / 1.0 / 0.9917 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9733 / 0.9878 / 0.9805 / 0.9958 |
| 0.0054 | 24.0 | 5760 | 0.0276 | 0.9444 / 1.0 / 0.9714 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.95 / 0.9661 / 0.9580 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9355 / 0.9508 / 0.9431 / 61 | 0.9831 / 0.9667 / 0.9748 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9355 / 0.9667 / 0.9508 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9784 / 0.9837 / 0.9811 / 0.9971 |
| 0.0057 | 25.0 | 6000 | 0.0260 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9667 / 0.9831 / 60 | 0.9077 / 1.0 / 0.9516 / 59 | 0.95 / 0.9661 / 0.9580 / 59 | 0.9677 / 1.0 / 0.9836 / 60 | 0.9508 / 0.9508 / 0.9508 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9672 / 0.9833 / 61 | 0.9672 / 1.0 / 0.9833 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9771 / 0.9837 / 0.9804 / 0.9971 |
| 0.0074 | 26.0 | 6240 | 0.0340 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9180 / 0.9492 / 0.9333 / 59 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8906 / 0.9344 / 0.9120 / 61 | 0.9831 / 0.9667 / 0.9748 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 0.9836 / 0.9836 / 61 | 0.9757 / 0.9824 / 0.9790 / 0.9959 |
| 0.0047 | 27.0 | 6480 | 0.0306 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 1.0 / 1.0 / 60 | 0.8923 / 0.9831 / 0.9355 / 59 | 0.9672 / 1.0 / 0.9833 / 59 | 1.0 / 1.0 / 1.0 / 60 | 0.9016 / 0.9016 / 0.9016 / 61 | 0.9667 / 0.9667 / 0.9667 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9672 / 0.9833 / 61 | 0.8551 / 1.0 / 0.9219 / 59 | 1.0 / 0.8525 / 0.9204 / 61 | 0.9624 / 0.9715 / 0.9669 / 0.9961 |
| 0.0052 | 28.0 | 6720 | 0.0262 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9667 / 0.9831 / 0.9748 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9833 / 0.9672 / 0.9752 / 61 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9833 / 1.0 / 0.9916 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9905 / 0.9905 / 0.9905 / 0.9973 |
| 0.0033 | 29.0 | 6960 | 0.0320 | 0.9444 / 1.0 / 0.9714 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.8406 / 0.9831 / 0.9062 / 59 | 0.9672 / 1.0 / 0.9833 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.8852 / 0.8852 / 0.8852 / 61 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 0.9667 / 0.9831 / 60 | 1.0 / 1.0 / 1.0 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9365 / 1.0 / 0.9672 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9627 / 0.9796 / 0.9711 / 0.9960 |
| 0.0048 | 30.0 | 7200 | 0.0215 | 1.0 / 1.0 / 1.0 / 17 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9672 / 1.0 / 0.9833 / 59 | 0.9833 / 1.0 / 0.9916 / 59 | 0.9836 / 1.0 / 0.9917 / 60 | 0.9833 / 0.9672 / 0.9752 / 61 | 1.0 / 0.9833 / 0.9916 / 60 | 0.9833 / 0.9833 / 0.9833 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 1.0 / 1.0 / 60 | 1.0 / 0.9672 / 0.9833 / 61 | 0.9672 / 1.0 / 0.9833 / 59 | 1.0 / 0.9836 / 0.9917 / 61 | 0.9891 / 0.9891 / 0.9891 / 0.9980 |

Framework versions

  • Transformers 4.22.0.dev0
  • Pytorch 1.12.1+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1