---
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: my-new-model-id
    results: []
---

# my-new-model-id

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset. It achieves the following results on the evaluation set:

- Loss: 0.6585
- Answer: {'precision': 0.7224043715846995, 'recall': 0.8170580964153276, 'f1': 0.7668213457076567, 'number': 809}
- Header: {'precision': 0.2777777777777778, 'recall': 0.33613445378151263, 'f1': 0.30418250950570347, 'number': 119}
- Question: {'precision': 0.7818343722172751, 'recall': 0.8244131455399061, 'f1': 0.8025594149908593, 'number': 1065}
- Overall Precision: 0.7236
- Overall Recall: 0.7923
- Overall F1: 0.7564
- Overall Accuracy: 0.8164
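As a quick consistency check (not part of the original card), each reported F1 is the harmonic mean of its precision and recall; a minimal sketch using the numbers above:

```python
# Each reported F1 is the harmonic mean of precision and recall.
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Overall Precision 0.7236 and Overall Recall 0.7923 reproduce Overall F1:
print(round(f1_score(0.7236, 0.7923), 4))  # 0.7564
```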

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
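The schedule behind `lr_scheduler_type: linear` can be sketched in plain Python. Assuming zero warmup steps (the Trainer default), the learning rate decays linearly from 3e-05 to 0 over the 150 optimizer steps logged in the training results (10 steps per epoch × 15 epochs):

```python
# Sketch of the linear learning-rate schedule (lr_scheduler_type: linear),
# assuming the Trainer default of zero warmup steps.
LEARNING_RATE = 3e-05
TOTAL_STEPS = 150  # 15 epochs x 10 optimizer steps per epoch, per the results log

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    return LEARNING_RATE * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)

print(lr_at(0), lr_at(75), lr_at(150))  # 3e-05 1.5e-05 0.0
```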

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.8057 | 1.0 | 10 | 1.5463 | {'precision': 0.016229712858926344, 'recall': 0.016069221260815822, 'f1': 0.01614906832298137, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.16498993963782696, 'recall': 0.07699530516431925, 'f1': 0.10499359795134443, 'number': 1065} | 0.0732 | 0.0477 | 0.0577 | 0.3858 |
| 1.4605 | 2.0 | 20 | 1.2487 | {'precision': 0.24860335195530725, 'recall': 0.3300370828182942, 'f1': 0.2835900159320234, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.45215485756026297, 'recall': 0.5812206572769953, 'f1': 0.5086277732128184, 'number': 1065} | 0.3627 | 0.4446 | 0.3995 | 0.5992 |
| 1.1219 | 3.0 | 30 | 0.9461 | {'precision': 0.45073375262054505, 'recall': 0.5315203955500618, 'f1': 0.4878048780487805, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.5785597381342062, 'recall': 0.6638497652582159, 'f1': 0.6182772190642762, 'number': 1065} | 0.5204 | 0.5705 | 0.5443 | 0.6840 |
| 0.847 | 4.0 | 40 | 0.7978 | {'precision': 0.5685131195335277, 'recall': 0.723114956736712, 'f1': 0.6365614798694234, 'number': 809} | {'precision': 0.07142857142857142, 'recall': 0.025210084033613446, 'f1': 0.037267080745341616, 'number': 119} | {'precision': 0.6571180555555556, 'recall': 0.7107981220657277, 'f1': 0.6829048263419035, 'number': 1065} | 0.6050 | 0.6749 | 0.6380 | 0.7407 |
| 0.6811 | 5.0 | 50 | 0.7134 | {'precision': 0.6322444678609063, 'recall': 0.7416563658838071, 'f1': 0.6825938566552902, 'number': 809} | {'precision': 0.2328767123287671, 'recall': 0.14285714285714285, 'f1': 0.17708333333333334, 'number': 119} | {'precision': 0.7031924072476272, 'recall': 0.7652582159624414, 'f1': 0.7329136690647481, 'number': 1065} | 0.6566 | 0.7185 | 0.6862 | 0.7763 |
| 0.5706 | 6.0 | 60 | 0.6581 | {'precision': 0.663820704375667, 'recall': 0.7688504326328801, 'f1': 0.7124856815578464, 'number': 809} | {'precision': 0.23376623376623376, 'recall': 0.15126050420168066, 'f1': 0.1836734693877551, 'number': 119} | {'precision': 0.7145187601957586, 'recall': 0.8225352112676056, 'f1': 0.7647315582714972, 'number': 1065} | 0.6768 | 0.7607 | 0.7163 | 0.7990 |
| 0.5016 | 7.0 | 70 | 0.6413 | {'precision': 0.6694386694386695, 'recall': 0.796044499381953, 'f1': 0.7272727272727273, 'number': 809} | {'precision': 0.19626168224299065, 'recall': 0.17647058823529413, 'f1': 0.18584070796460178, 'number': 119} | {'precision': 0.7557446808510638, 'recall': 0.8338028169014085, 'f1': 0.7928571428571429, 'number': 1065} | 0.6921 | 0.7792 | 0.7331 | 0.8033 |
| 0.4435 | 8.0 | 80 | 0.6286 | {'precision': 0.6945031712473573, 'recall': 0.8121137206427689, 'f1': 0.7487179487179487, 'number': 809} | {'precision': 0.23931623931623933, 'recall': 0.23529411764705882, 'f1': 0.23728813559322035, 'number': 119} | {'precision': 0.7668122270742358, 'recall': 0.8244131455399061, 'f1': 0.7945701357466064, 'number': 1065} | 0.7079 | 0.7842 | 0.7441 | 0.8108 |
| 0.4078 | 9.0 | 90 | 0.6405 | {'precision': 0.6957470010905126, 'recall': 0.788627935723115, 'f1': 0.7392815758980301, 'number': 809} | {'precision': 0.2711864406779661, 'recall': 0.2689075630252101, 'f1': 0.270042194092827, 'number': 119} | {'precision': 0.7794508414526129, 'recall': 0.8262910798122066, 'f1': 0.8021877848678213, 'number': 1065} | 0.7163 | 0.7777 | 0.7457 | 0.8137 |
| 0.3657 | 10.0 | 100 | 0.6364 | {'precision': 0.7142857142857143, 'recall': 0.8096415327564895, 'f1': 0.7589803012746235, 'number': 809} | {'precision': 0.2807017543859649, 'recall': 0.2689075630252101, 'f1': 0.27467811158798283, 'number': 119} | {'precision': 0.7870452528837621, 'recall': 0.8328638497652582, 'f1': 0.8093065693430658, 'number': 1065} | 0.7294 | 0.7898 | 0.7584 | 0.8098 |
| 0.335 | 11.0 | 110 | 0.6427 | {'precision': 0.7027896995708155, 'recall': 0.8096415327564895, 'f1': 0.7524411257897761, 'number': 809} | {'precision': 0.26865671641791045, 'recall': 0.3025210084033613, 'f1': 0.2845849802371542, 'number': 119} | {'precision': 0.7813620071684588, 'recall': 0.8187793427230047, 'f1': 0.7996331957817515, 'number': 1065} | 0.7163 | 0.7842 | 0.7487 | 0.8132 |
| 0.3103 | 12.0 | 120 | 0.6505 | {'precision': 0.7311946902654868, 'recall': 0.8170580964153276, 'f1': 0.7717454757734967, 'number': 809} | {'precision': 0.2595419847328244, 'recall': 0.2857142857142857, 'f1': 0.27199999999999996, 'number': 119} | {'precision': 0.7859054415700267, 'recall': 0.8272300469483568, 'f1': 0.8060384263494967, 'number': 1065} | 0.7310 | 0.7908 | 0.7597 | 0.8160 |
| 0.3007 | 13.0 | 130 | 0.6494 | {'precision': 0.7219193020719739, 'recall': 0.8182941903584673, 'f1': 0.7670915411355737, 'number': 809} | {'precision': 0.27692307692307694, 'recall': 0.3025210084033613, 'f1': 0.2891566265060241, 'number': 119} | {'precision': 0.7930419268510259, 'recall': 0.8347417840375587, 'f1': 0.8133577310155535, 'number': 1065} | 0.7320 | 0.7963 | 0.7628 | 0.8161 |
| 0.2831 | 14.0 | 140 | 0.6593 | {'precision': 0.7202185792349727, 'recall': 0.8145859085290482, 'f1': 0.7645011600928074, 'number': 809} | {'precision': 0.273972602739726, 'recall': 0.33613445378151263, 'f1': 0.3018867924528302, 'number': 119} | {'precision': 0.7793468667255075, 'recall': 0.8291079812206573, 'f1': 0.8034576888080072, 'number': 1065} | 0.7211 | 0.7938 | 0.7557 | 0.8144 |
| 0.2913 | 15.0 | 150 | 0.6585 | {'precision': 0.7224043715846995, 'recall': 0.8170580964153276, 'f1': 0.7668213457076567, 'number': 809} | {'precision': 0.2777777777777778, 'recall': 0.33613445378151263, 'f1': 0.30418250950570347, 'number': 119} | {'precision': 0.7818343722172751, 'recall': 0.8244131455399061, 'f1': 0.8025594149908593, 'number': 1065} | 0.7236 | 0.7923 | 0.7564 | 0.8164 |
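The Overall columns are consistent with micro-averaging over the three entity types. A quick sketch (values copied from the final epoch) recovers the integer span counts from recall × support and rebuilds the overall metrics:

```python
# Per-type precision, recall, and support from the final epoch of the table above.
per_type = {
    "Answer":   {"precision": 0.7224043715846995, "recall": 0.8170580964153276, "number": 809},
    "Header":   {"precision": 0.2777777777777778, "recall": 0.33613445378151263, "number": 119},
    "Question": {"precision": 0.7818343722172751, "recall": 0.8244131455399061, "number": 1065},
}

# Recover integer counts: true positives = recall * support,
# predicted spans = true positives / precision.
tp = sum(round(m["recall"] * m["number"]) for m in per_type.values())
pred = sum(round(m["recall"] * m["number"] / m["precision"]) for m in per_type.values())
gold = sum(m["number"] for m in per_type.values())

precision = tp / pred          # micro precision over all spans
recall = tp / gold             # micro recall over all spans
f1 = 2 * tp / (pred + gold)    # micro F1
print(round(precision, 4), round(recall, 4), round(f1, 4))  # 0.7236 0.7923 0.7564
```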

### Framework versions

- Transformers 4.31.0.dev0
- Pytorch 2.0.0+cpu
- Datasets 2.1.0
- Tokenizers 0.13.3