---
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:

- Loss: 0.6801

| Label    | Precision | Recall | F1     | Support |
|:--------:|:---------:|:------:|:------:|:-------:|
| Answer   | 0.6866    | 0.7960 | 0.7373 | 809     |
| Header   | 0.3071    | 0.3613 | 0.3320 | 119     |
| Question | 0.7744    | 0.8282 | 0.8004 | 1065    |

- Overall Precision: 0.7077
- Overall Recall: 0.7873
- Overall F1: 0.7454
- Overall Accuracy: 0.8029
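The overall figures are consistent with micro-averaging the per-label counts (as seqeval-style evaluation does). As a quick sanity check, the overall precision, recall, and F1 can be reconstructed from the per-label precision, recall, and support reported above:

```python
# Sketch: reconstruct the "Overall" scores by micro-averaging the
# per-label evaluation metrics reported in this card.
labels = {
    "answer":   {"precision": 0.6865671641791045,  "recall": 0.796044499381953,   "number": 809},
    "header":   {"precision": 0.30714285714285716, "recall": 0.36134453781512604, "number": 119},
    "question": {"precision": 0.7743634767339772,  "recall": 0.828169014084507,   "number": 1065},
}

tp = pred = gold = 0.0
for m in labels.values():
    label_tp = m["recall"] * m["number"]   # true positives for this label
    tp += label_tp
    pred += label_tp / m["precision"]      # spans predicted for this label
    gold += m["number"]                    # gold spans for this label

precision = tp / pred
recall = tp / gold
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 4), round(recall, 4), round(f1, 4))  # → 0.7077 0.7873 0.7454
```

The reconstructed values match the Overall Precision, Recall, and F1 rows above.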

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was fine-tuned and evaluated on the FUNSD dataset (Form Understanding in Noisy Scanned Documents), a collection of scanned forms annotated with question, answer, and header entities.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
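These hyperparameters correspond to a standard 🤗 Transformers `Trainer` run. A minimal sketch of the equivalent `TrainingArguments` (the `output_dir` name is illustrative; the Adam betas and epsilon shown are the library defaults, spelled out to match the list above):

```python
from transformers import TrainingArguments

# Assumed reconstruction of the training configuration listed above.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",    # illustrative checkpoint directory
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
)
```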

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.7872 | 1.0 | 10 | 1.5976 | {'precision': 0.020486555697823303, 'recall': 0.019777503090234856, 'f1': 0.02012578616352201, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2535014005602241, 'recall': 0.1699530516431925, 'f1': 0.20348510399100617, 'number': 1065} | 0.1318 | 0.0988 | 0.1130 | 0.3743 |
| 1.4377 | 2.0 | 20 | 1.2582 | {'precision': 0.20262869660460023, 'recall': 0.22867737948084055, 'f1': 0.21486643437862948, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.4521497919556172, 'recall': 0.612206572769953, 'f1': 0.5201435979258078, 'number': 1065} | 0.3553 | 0.4200 | 0.3849 | 0.5972 |
| 1.0609 | 3.0 | 30 | 0.9282 | {'precision': 0.4720496894409938, 'recall': 0.5636588380716935, 'f1': 0.5138028169014084, 'number': 809} | {'precision': 0.08, 'recall': 0.01680672268907563, 'f1': 0.02777777777777778, 'number': 119} | {'precision': 0.5755947812739831, 'recall': 0.704225352112676, 'f1': 0.6334459459459459, 'number': 1065} | 0.5266 | 0.6061 | 0.5636 | 0.7029 |
| 0.8126 | 4.0 | 40 | 0.7805 | {'precision': 0.5814176245210728, 'recall': 0.7503090234857849, 'f1': 0.6551538046411225, 'number': 809} | {'precision': 0.2, 'recall': 0.10084033613445378, 'f1': 0.1340782122905028, 'number': 119} | {'precision': 0.6699916874480466, 'recall': 0.7568075117370892, 'f1': 0.710758377425044, 'number': 1065} | 0.6177 | 0.7150 | 0.6628 | 0.7550 |
| 0.6594 | 5.0 | 50 | 0.7015 | {'precision': 0.6331967213114754, 'recall': 0.7639060568603214, 'f1': 0.6924369747899161, 'number': 809} | {'precision': 0.2222222222222222, 'recall': 0.13445378151260504, 'f1': 0.16753926701570682, 'number': 119} | {'precision': 0.7241681260945709, 'recall': 0.7765258215962442, 'f1': 0.7494336202990485, 'number': 1065} | 0.6671 | 0.7331 | 0.6985 | 0.7820 |
| 0.5617 | 6.0 | 60 | 0.6732 | {'precision': 0.6566844919786097, 'recall': 0.7589616810877626, 'f1': 0.7041284403669724, 'number': 809} | {'precision': 0.2, 'recall': 0.21008403361344538, 'f1': 0.20491803278688528, 'number': 119} | {'precision': 0.7147385103011094, 'recall': 0.8469483568075117, 'f1': 0.7752470992694457, 'number': 1065} | 0.6637 | 0.7732 | 0.7143 | 0.7867 |
| 0.4814 | 7.0 | 70 | 0.6633 | {'precision': 0.6609989373007439, 'recall': 0.7688504326328801, 'f1': 0.7108571428571427, 'number': 809} | {'precision': 0.27586206896551724, 'recall': 0.2689075630252101, 'f1': 0.27234042553191484, 'number': 119} | {'precision': 0.7466442953020134, 'recall': 0.8356807511737089, 'f1': 0.7886575099689852, 'number': 1065} | 0.6865 | 0.7747 | 0.7280 | 0.7961 |
| 0.4351 | 8.0 | 80 | 0.6481 | {'precision': 0.6829533116178067, 'recall': 0.7775030902348579, 'f1': 0.7271676300578035, 'number': 809} | {'precision': 0.2846153846153846, 'recall': 0.31092436974789917, 'f1': 0.29718875502008035, 'number': 119} | {'precision': 0.7567796610169492, 'recall': 0.8384976525821596, 'f1': 0.7955456570155902, 'number': 1065} | 0.6988 | 0.7822 | 0.7382 | 0.7985 |
| 0.3819 | 9.0 | 90 | 0.6559 | {'precision': 0.6789473684210526, 'recall': 0.7972805933250927, 'f1': 0.7333712336554862, 'number': 809} | {'precision': 0.3170731707317073, 'recall': 0.3277310924369748, 'f1': 0.32231404958677684, 'number': 119} | {'precision': 0.7789566755083996, 'recall': 0.8272300469483568, 'f1': 0.802367941712204, 'number': 1065} | 0.7101 | 0.7852 | 0.7458 | 0.8087 |
| 0.349 | 10.0 | 100 | 0.6553 | {'precision': 0.6794055201698513, 'recall': 0.7911001236093943, 'f1': 0.7310108509423187, 'number': 809} | {'precision': 0.33076923076923076, 'recall': 0.36134453781512604, 'f1': 0.34538152610441764, 'number': 119} | {'precision': 0.7757417102966842, 'recall': 0.8347417840375587, 'f1': 0.8041610131162371, 'number': 1065} | 0.7087 | 0.7888 | 0.7466 | 0.8055 |
| 0.3137 | 11.0 | 110 | 0.6590 | {'precision': 0.6915584415584416, 'recall': 0.7898640296662547, 'f1': 0.7374495095210617, 'number': 809} | {'precision': 0.2971014492753623, 'recall': 0.3445378151260504, 'f1': 0.31906614785992216, 'number': 119} | {'precision': 0.7753496503496503, 'recall': 0.8328638497652582, 'f1': 0.8030783159800814, 'number': 1065} | 0.7103 | 0.7863 | 0.7464 | 0.8082 |
| 0.3015 | 12.0 | 120 | 0.6652 | {'precision': 0.6789862724392819, 'recall': 0.7948084054388134, 'f1': 0.7323462414578588, 'number': 809} | {'precision': 0.3049645390070922, 'recall': 0.36134453781512604, 'f1': 0.3307692307692308, 'number': 119} | {'precision': 0.77117903930131, 'recall': 0.8291079812206573, 'f1': 0.7990950226244343, 'number': 1065} | 0.7026 | 0.7873 | 0.7425 | 0.7986 |
| 0.2804 | 13.0 | 130 | 0.6745 | {'precision': 0.6993464052287581, 'recall': 0.7935723114956736, 'f1': 0.7434858135495078, 'number': 809} | {'precision': 0.31386861313868614, 'recall': 0.36134453781512604, 'f1': 0.3359375, 'number': 119} | {'precision': 0.7880143112701252, 'recall': 0.8272300469483568, 'f1': 0.8071461291800274, 'number': 1065} | 0.7207 | 0.7858 | 0.7518 | 0.8055 |
| 0.2658 | 14.0 | 140 | 0.6757 | {'precision': 0.6935483870967742, 'recall': 0.7972805933250927, 'f1': 0.7418056354226568, 'number': 809} | {'precision': 0.31386861313868614, 'recall': 0.36134453781512604, 'f1': 0.3359375, 'number': 119} | {'precision': 0.7793468667255075, 'recall': 0.8291079812206573, 'f1': 0.8034576888080072, 'number': 1065} | 0.7141 | 0.7883 | 0.7493 | 0.8031 |
| 0.2581 | 15.0 | 150 | 0.6801 | {'precision': 0.6865671641791045, 'recall': 0.796044499381953, 'f1': 0.7372638809387521, 'number': 809} | {'precision': 0.30714285714285716, 'recall': 0.36134453781512604, 'f1': 0.33204633204633205, 'number': 119} | {'precision': 0.7743634767339772, 'recall': 0.828169014084507, 'f1': 0.8003629764065335, 'number': 1065} | 0.7077 | 0.7873 | 0.7454 | 0.8029 |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1