layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the funsd dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 0.7010
  • Answer: precision 0.6882, recall 0.7886, F1 0.7350 (support: 809)
  • Header: precision 0.2857, recall 0.3025, F1 0.2939 (support: 119)
  • Question: precision 0.7772, recall 0.8188, F1 0.7974 (support: 1065)
  • Overall Precision: 0.7108
  • Overall Recall: 0.7757
  • Overall F1: 0.7418
  • Overall Accuracy: 0.8054
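
As a minimal inference sketch, the snippet below loads the checkpoint for token classification. It assumes the tokenizer was pushed alongside the model (otherwise load it from microsoft/layoutlm-base-uncased); the example words and bounding boxes are hypothetical OCR output, and boxes must already be normalized to the 0-1000 scale that LayoutLM expects.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("jfrish/layoutlm-funsd")
model = AutoModelForTokenClassification.from_pretrained("jfrish/layoutlm-funsd")

# Hypothetical OCR output: one word and one 0-1000-normalized box per word
words = ["Date:", "January", "15,", "1993"]
boxes = [[84, 109, 136, 119], [140, 109, 182, 119],
         [186, 109, 200, 119], [204, 109, 230, 119]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# LayoutLM needs one box per token, so repeat each word's box for its sub-tokens
token_boxes = [[0, 0, 0, 0] if idx is None else boxes[idx]
               for idx in encoding.word_ids(0)]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
predicted = [model.config.id2label[i] for i in logits.argmax(-1).squeeze().tolist()]
print(list(zip(encoding.tokens(0), predicted)))
```

Predictions are per-sub-token BIO tags over the FUNSD label set (header, question, answer, other); in practice you would aggregate them back to word level.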

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
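
A minimal sketch of how these settings map onto the Transformers Trainer API, assuming the FUNSD splits have already been tokenized; `model`, `train_dataset`, `eval_dataset`, and the output directory are placeholders, and metric computation is omitted.

```python
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",          # placeholder output directory
    learning_rate=3e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                  # AdamW, betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                            # Native AMP mixed-precision training
)

trainer = Trainer(
    model=model,                          # placeholder: LayoutLMForTokenClassification
    args=training_args,
    train_dataset=train_dataset,          # placeholder: tokenized FUNSD training split
    eval_dataset=eval_dataset,            # placeholder: tokenized FUNSD evaluation split
)
trainer.train()
```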

Training results

| Training Loss | Epoch | Step | Validation Loss | Answer P / R / F1 (n=809) | Header P / R / F1 (n=119) | Question P / R / F1 (n=1065) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.8001 | 1.0 | 10 | 1.6097 | 0.0153 / 0.0136 / 0.0144 | 0.0000 / 0.0000 / 0.0000 | 0.2779 / 0.1915 / 0.2268 | 0.1480 | 0.1079 | 0.1248 | 0.3542 |
| 1.4627 | 2.0 | 20 | 1.2809 | 0.1598 / 0.1384 / 0.1483 | 0.0000 / 0.0000 / 0.0000 | 0.4247 / 0.4977 / 0.4583 | 0.3294 | 0.3221 | 0.3257 | 0.5862 |
| 1.1306 | 3.0 | 30 | 1.0105 | 0.4145 / 0.4252 / 0.4198 | 0.1176 / 0.0336 / 0.0523 | 0.5564 / 0.6488 / 0.5990 | 0.4934 | 0.5213 | 0.5070 | 0.6955 |
| 0.8744 | 4.0 | 40 | 0.8187 | 0.5657 / 0.6230 / 0.5929 | 0.2642 / 0.1176 / 0.1628 | 0.6539 / 0.7202 / 0.6854 | 0.6070 | 0.6448 | 0.6253 | 0.7427 |
| 0.6901 | 5.0 | 50 | 0.7311 | 0.6351 / 0.7293 / 0.6789 | 0.2593 / 0.1765 / 0.2100 | 0.6712 / 0.7934 / 0.7272 | 0.6417 | 0.7306 | 0.6832 | 0.7707 |
| 0.5703 | 6.0 | 60 | 0.7127 | 0.6549 / 0.7787 / 0.7115 | 0.2556 / 0.1933 / 0.2201 | 0.7044 / 0.7718 / 0.7366 | 0.6647 | 0.7401 | 0.7004 | 0.7837 |
| 0.4964 | 7.0 | 70 | 0.6823 | 0.6730 / 0.7911 / 0.7273 | 0.2647 / 0.2269 / 0.2443 | 0.7511 / 0.8019 / 0.7757 | 0.6945 | 0.7632 | 0.7272 | 0.7962 |
| 0.4347 | 8.0 | 80 | 0.6763 | 0.6754 / 0.7923 / 0.7292 | 0.2353 / 0.2353 / 0.2353 | 0.7530 / 0.8131 / 0.7819 | 0.6921 | 0.7702 | 0.7290 | 0.8003 |
| 0.386 | 9.0 | 90 | 0.6695 | 0.6803 / 0.7812 / 0.7273 | 0.2941 / 0.2521 / 0.2715 | 0.7622 / 0.8094 / 0.7851 | 0.7049 | 0.7647 | 0.7336 | 0.8091 |
| 0.3668 | 10.0 | 100 | 0.6898 | 0.6730 / 0.7936 / 0.7283 | 0.2936 / 0.2689 / 0.2807 | 0.7696 / 0.8122 / 0.7903 | 0.7037 | 0.7722 | 0.7364 | 0.8064 |
| 0.3174 | 11.0 | 110 | 0.6929 | 0.6783 / 0.7948 / 0.7319 | 0.2903 / 0.3025 / 0.2963 | 0.7741 / 0.8207 / 0.7967 | 0.7056 | 0.7792 | 0.7406 | 0.8031 |
| 0.3134 | 12.0 | 120 | 0.6977 | 0.6860 / 0.7886 / 0.7338 | 0.2941 / 0.2941 / 0.2941 | 0.7797 / 0.8141 / 0.7965 | 0.7126 | 0.7727 | 0.7415 | 0.8053 |
| 0.2851 | 13.0 | 130 | 0.7022 | 0.6895 / 0.7849 / 0.7341 | 0.2773 / 0.2773 / 0.2773 | 0.7773 / 0.8225 / 0.7993 | 0.7125 | 0.7747 | 0.7423 | 0.8056 |
| 0.2693 | 14.0 | 140 | 0.7003 | 0.6897 / 0.7886 / 0.7359 | 0.2869 / 0.2941 / 0.2905 | 0.7755 / 0.8207 / 0.7974 | 0.7116 | 0.7762 | 0.7425 | 0.8048 |
| 0.2636 | 15.0 | 150 | 0.7010 | 0.6882 / 0.7886 / 0.7350 | 0.2857 / 0.3025 / 0.2939 | 0.7772 / 0.8188 / 0.7974 | 0.7108 | 0.7757 | 0.7418 | 0.8054 |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.5.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3