---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd1
    results: []
---

# layoutlm-funsd1

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.6985

| Entity   | Precision | Recall | F1     | Support |
|----------|-----------|--------|--------|---------|
| Answer   | 0.7292    | 0.8022 | 0.7640 | 809     |
| Header   | 0.2963    | 0.3361 | 0.3150 | 119     |
| Question | 0.7711    | 0.8225 | 0.7960 | 1065    |

- Overall Precision: 0.7242
- Overall Recall: 0.7852
- Overall F1: 0.7535
- Overall Accuracy: 0.8108
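
The snippet below is a minimal inference sketch, not code from this card: the repo id `Benedict-L/layoutlm-funsd1` is inferred from this page, and the words and bounding boxes are illustrative stand-ins for real OCR output (LayoutLM expects boxes normalized to a 0–1000 grid).

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizer

model_id = "Benedict-L/layoutlm-funsd1"  # ASSUMED repo id, inferred from this page
tokenizer = LayoutLMTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# Illustrative OCR output; real inputs come from an OCR engine, with each
# word's box normalized to LayoutLM's 0-1000 coordinate grid.
words = ["Date:", "2024-01-01"]
boxes = [[60, 40, 140, 60], [150, 40, 260, 60]]

# Repeat each word's box for every sub-token it produces, then pad the
# special tokens: [0,0,0,0] for [CLS] and [1000,1000,1000,1000] for [SEP].
token_boxes = []
for word, box in zip(words, boxes):
    token_boxes.extend([box] * len(tokenizer.tokenize(word)))
token_boxes = [[0, 0, 0, 0]] + token_boxes + [[1000, 1000, 1000, 1000]]

encoding = tokenizer(" ".join(words), return_tensors="pt")
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(tok, model.config.id2label[i]) for tok, i in zip(tokens, predicted_ids)])
```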

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
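
These settings map onto transformers' `TrainingArguments` roughly as below. This is a reconstruction, not the actual training script; the commented-out `Trainer` call assumes a model and preprocessed FUNSD splits that are not defined in this card.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="layoutlm-funsd1",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,  # "Native AMP" mixed-precision training
)

# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default
# AdamW optimizer, so no explicit optimizer needs to be passed:
#
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset,  # assumed: tokenized FUNSD train split
#                   eval_dataset=eval_dataset)    # assumed: tokenized FUNSD test split
# trainer.train()
```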

### Training results

Each entity cell lists precision / recall / F1 (support: Answer 809, Header 119, Question 1065); a sketch of how these span-level scores are computed follows the table.

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7326 | 1.0 | 10 | 1.5225 | 0.0576 / 0.0667 / 0.0619 | 0.0000 / 0.0000 / 0.0000 | 0.2127 / 0.2235 / 0.2179 | 0.1420 | 0.1465 | 0.1442 | 0.4302 |
| 1.3559 | 2.0 | 20 | 1.1907 | 0.2647 / 0.2225 / 0.2418 | 0.0000 / 0.0000 / 0.0000 | 0.4852 / 0.5540 / 0.5173 | 0.4055 | 0.3864 | 0.3957 | 0.5967 |
| 1.0329 | 3.0 | 30 | 0.9021 | 0.4880 / 0.5006 / 0.4942 | 0.1000 / 0.0420 / 0.0592 | 0.6476 / 0.6817 / 0.6642 | 0.5677 | 0.5700 | 0.5689 | 0.7304 |
| 0.7790 | 4.0 | 40 | 0.7524 | 0.6258 / 0.7070 / 0.6640 | 0.2568 / 0.1597 / 0.1969 | 0.6597 / 0.7390 / 0.6971 | 0.6318 | 0.6914 | 0.6603 | 0.7734 |
| 0.6249 | 5.0 | 50 | 0.6899 | 0.6616 / 0.7466 / 0.7015 | 0.3158 / 0.2017 / 0.2462 | 0.6818 / 0.7746 / 0.7253 | 0.6608 | 0.7291 | 0.6932 | 0.7938 |
| 0.5376 | 6.0 | 60 | 0.6911 | 0.6774 / 0.7837 / 0.7266 | 0.2941 / 0.2101 / 0.2451 | 0.7166 / 0.7765 / 0.7454 | 0.6832 | 0.7456 | 0.7131 | 0.7926 |
| 0.4627 | 7.0 | 70 | 0.6573 | 0.6984 / 0.7985 / 0.7451 | 0.2883 / 0.2689 / 0.2783 | 0.7355 / 0.8094 / 0.7707 | 0.6975 | 0.7727 | 0.7332 | 0.8012 |
| 0.4082 | 8.0 | 80 | 0.6650 | 0.6872 / 0.8146 / 0.7455 | 0.2844 / 0.2605 / 0.2719 | 0.7447 / 0.8188 / 0.7800 | 0.6976 | 0.7837 | 0.7382 | 0.8040 |
| 0.3665 | 9.0 | 90 | 0.6682 | 0.7012 / 0.7948 / 0.7451 | 0.3077 / 0.3025 / 0.3051 | 0.7520 / 0.8113 / 0.7805 | 0.7068 | 0.7742 | 0.7390 | 0.8071 |
| 0.3554 | 10.0 | 100 | 0.6680 | 0.7168 / 0.7948 / 0.7538 | 0.3333 / 0.3529 / 0.3429 | 0.7586 / 0.8263 / 0.7910 | 0.7169 | 0.7852 | 0.7495 | 0.8101 |
| 0.3056 | 11.0 | 110 | 0.6786 | 0.7070 / 0.8084 / 0.7543 | 0.2960 / 0.3109 / 0.3033 | 0.7668 / 0.8338 / 0.7989 | 0.7151 | 0.7923 | 0.7517 | 0.8087 |
| 0.2977 | 12.0 | 120 | 0.6900 | 0.7291 / 0.7985 / 0.7622 | 0.3258 / 0.3613 / 0.3426 | 0.7727 / 0.8235 / 0.7973 | 0.7274 | 0.7858 | 0.7554 | 0.8097 |
| 0.2788 | 13.0 | 130 | 0.6937 | 0.7225 / 0.8109 / 0.7641 | 0.3023 / 0.3277 / 0.3145 | 0.7725 / 0.8225 / 0.7967 | 0.7236 | 0.7883 | 0.7546 | 0.8099 |
| 0.2593 | 14.0 | 140 | 0.6981 | 0.7279 / 0.8035 / 0.7638 | 0.2985 / 0.3361 / 0.3162 | 0.7715 / 0.8244 / 0.7971 | 0.7242 | 0.7868 | 0.7542 | 0.8110 |
| 0.2581 | 15.0 | 150 | 0.6985 | 0.7292 / 0.8022 / 0.7640 | 0.2963 / 0.3361 / 0.3150 | 0.7711 / 0.8225 / 0.7960 | 0.7242 | 0.7852 | 0.7535 | 0.8108 |
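
The per-entity numbers above appear to follow the span-level convention of the seqeval library, which the standard token-classification training scripts typically use to score FUNSD-style labels. The sketch below, with made-up label sequences, shows how such scores are computed.

```python
# pip install seqeval
from seqeval.metrics import classification_report

# Made-up reference and predicted BIO tag sequences for one document.
y_true = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER", "O"]]
y_pred = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O", "O"]]

# Scores are per entity span, not per token: a predicted span only counts as
# correct when both its type and its exact boundaries match the reference.
print(classification_report(y_true, y_pred))
```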

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1