
layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6858
  • Answer: precision 0.7110, recall 0.8059, F1 0.7555 (support: 809)
  • Header: precision 0.2657, recall 0.3193, F1 0.2901 (support: 119)
  • Question: precision 0.7805, recall 0.8413, F1 0.8098 (support: 1065)
  • Overall Precision: 0.7183
  • Overall Recall: 0.7958
  • Overall F1: 0.7551
  • Overall Accuracy: 0.8018
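
The overall precision, recall, and F1 are micro-averages over the three entity types. A minimal sketch in plain Python (assuming the usual seqeval-style micro-averaging) recovers them from the per-entity scores and supports listed above:

```python
# Per-entity (precision, recall, support) from the evaluation set above.
scores = {
    "Answer":   (0.7110, 0.8059, 809),
    "Header":   (0.2657, 0.3193, 119),
    "Question": (0.7805, 0.8413, 1065),
}

tp = pred = gold = 0
for p, r, n in scores.values():
    e_tp = round(r * n)        # true positives for this entity type
    tp += e_tp
    pred += round(e_tp / p)    # spans predicted as this type
    gold += n                  # gold spans (support)

precision = tp / pred          # 1586 / 2208 ≈ 0.7183
recall = tp / gold             # 1586 / 1993 ≈ 0.7958
f1 = 2 * tp / (pred + gold)    # ≈ 0.7551
```

Rounding the results to four decimals reproduces the overall numbers reported above.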

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
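
With a linear scheduler, the learning rate decays from 3e-05 to 0 over the 150 training steps (15 epochs × 10 steps per epoch, per the results table). A minimal sketch of that schedule — warmup steps are not listed, so zero warmup is assumed here:

```python
def linear_lr(step, base_lr=3e-05, total_steps=150, warmup_steps=0):
    """Linear schedule: optional warmup, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of training.
print(linear_lr(0), linear_lr(75), linear_lr(150))
```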

Training results

Per-entity cells show precision / recall / F1 (supports: Answer 809, Header 119, Question 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7962 | 1.0 | 10 | 1.5480 | 0.0181 / 0.0173 / 0.0177 | 0.0 / 0.0 / 0.0 | 0.3031 / 0.2009 / 0.2417 | 0.1540 | 0.1144 | 0.1313 | 0.3738 |
| 1.4016 | 2.0 | 20 | 1.1898 | 0.1557 / 0.1607 / 0.1582 | 0.0 / 0.0 / 0.0 | 0.4646 / 0.6216 / 0.5317 | 0.3504 | 0.3974 | 0.3724 | 0.5895 |
| 1.0485 | 3.0 | 30 | 0.8955 | 0.5087 / 0.5797 / 0.5419 | 0.0541 / 0.0168 / 0.0256 | 0.5728 / 0.7277 / 0.6410 | 0.5389 | 0.6252 | 0.5789 | 0.7253 |
| 0.8064 | 4.0 | 40 | 0.7481 | 0.6227 / 0.7528 / 0.6816 | 0.2154 / 0.1176 / 0.1522 | 0.6829 / 0.7643 / 0.7213 | 0.6430 | 0.7210 | 0.6798 | 0.7738 |
| 0.6605 | 5.0 | 50 | 0.7053 | 0.6500 / 0.7392 / 0.6917 | 0.2841 / 0.2101 / 0.2415 | 0.6649 / 0.8047 / 0.7281 | 0.6443 | 0.7426 | 0.6900 | 0.7825 |
| 0.5604 | 6.0 | 60 | 0.6779 | 0.6486 / 0.7849 / 0.7103 | 0.2826 / 0.2185 / 0.2464 | 0.7038 / 0.7897 / 0.7442 | 0.6628 | 0.7536 | 0.7053 | 0.7900 |
| 0.4892 | 7.0 | 70 | 0.6542 | 0.6759 / 0.7812 / 0.7248 | 0.2906 / 0.2857 / 0.2881 | 0.7476 / 0.8066 / 0.7760 | 0.6929 | 0.7652 | 0.7272 | 0.8019 |
| 0.4295 | 8.0 | 80 | 0.6489 | 0.6817 / 0.8022 / 0.7371 | 0.2397 / 0.2437 / 0.2417 | 0.7458 / 0.8376 / 0.7890 | 0.6919 | 0.7878 | 0.7367 | 0.8013 |
| 0.3837 | 9.0 | 90 | 0.6428 | 0.6863 / 0.7923 / 0.7355 | 0.2787 / 0.2857 / 0.2822 | 0.7790 / 0.8310 / 0.8042 | 0.7117 | 0.7827 | 0.7455 | 0.8070 |
| 0.3704 | 10.0 | 100 | 0.6588 | 0.7003 / 0.7973 / 0.7457 | 0.2681 / 0.3109 / 0.2879 | 0.7738 / 0.8319 / 0.8018 | 0.7114 | 0.7868 | 0.7472 | 0.8066 |
| 0.3247 | 11.0 | 110 | 0.6610 | 0.7027 / 0.8121 / 0.7534 | 0.2803 / 0.3109 / 0.2948 | 0.7650 / 0.8404 / 0.8009 | 0.7103 | 0.7973 | 0.7513 | 0.8034 |
| 0.3073 | 12.0 | 120 | 0.6618 | 0.7091 / 0.8133 / 0.7576 | 0.2878 / 0.3361 / 0.3101 | 0.7948 / 0.8366 / 0.8152 | 0.7262 | 0.7973 | 0.7601 | 0.8080 |
| 0.2955 | 13.0 | 130 | 0.6810 | 0.7087 / 0.7998 / 0.7515 | 0.2932 / 0.3277 / 0.3095 | 0.7779 / 0.8451 / 0.8101 | 0.7199 | 0.7958 | 0.7560 | 0.8021 |
| 0.2715 | 14.0 | 140 | 0.6852 | 0.7047 / 0.8084 / 0.7530 | 0.2746 / 0.3277 / 0.2989 | 0.7787 / 0.8394 / 0.8080 | 0.7155 | 0.7963 | 0.7537 | 0.8027 |
| 0.2679 | 15.0 | 150 | 0.6858 | 0.7110 / 0.8059 / 0.7555 | 0.2657 / 0.3193 / 0.2901 | 0.7805 / 0.8413 / 0.8098 | 0.7183 | 0.7958 | 0.7551 | 0.8018 |
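
The Answer, Header, and Question scores are entity-level rather than token-level: a prediction only counts as correct when the whole span and its label match the reference, as seqeval computes them. A simplified sketch of that scoring on a hypothetical five-token example (stray I- tags are treated as O here, which seqeval's default mode handles differently):

```python
def spans(tags):
    """Extract (label, start, end) spans from a BIO tag sequence.
    Simplified: an I- tag without a preceding B- is treated as O."""
    out, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if tag.startswith("B-") or tag == "O" or (label and tag != "I-" + label):
            if label is not None:
                out.append((label, start, i))
            start, label = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return out

# Hypothetical reference and prediction over five tokens.
gold = ["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER"]
pred = ["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]

true_positives = len(set(spans(gold)) & set(spans(pred)))
precision = true_positives / len(spans(pred))  # 1 / 2 = 0.5
recall = true_positives / len(spans(gold))     # 1 / 2 = 0.5
```

The truncated ANSWER prediction gets no credit, which is why span-level scores are typically lower than token-level accuracy (0.8018 overall accuracy vs. 0.7551 overall F1 above).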

Framework versions

  • Transformers 4.41.0
  • PyTorch 2.3.0+cu118
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model size

  • 113M parameters (F32, Safetensors)