---
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
results: []
---
# layoutlm-funsd
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6559
- Overall Precision: 0.7215
- Overall Recall: 0.7863
- Overall F1: 0.7525
- Overall Accuracy: 0.8165

Per-entity metrics:

| Entity   | Precision | Recall | F1     | Number |
|:---------|----------:|-------:|-------:|-------:|
| Answer   | 0.7160    | 0.8010 | 0.7561 | 809    |
| Header   | 0.3193    | 0.3193 | 0.3193 | 119    |
| Question | 0.7674    | 0.8272 | 0.7962 | 1065   |
## Model description
The model performs token classification on scanned forms: given OCR words and their 2D bounding-box positions, it tags each token as part of a question, answer, or header entity (the entity types reported above). A minimal inference sketch follows.
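The snippet below is a sketch, not the exact pipeline used to produce this card. The repository id `your-username/layoutlm-funsd`, the example words, and the bounding boxes are placeholders; LayoutLM expects word-level boxes normalized to a 0-1000 page scale, and the checkpoint is assumed to carry the `id2label` mapping produced during fine-tuning.

```python
import torch
from transformers import LayoutLMTokenizerFast, LayoutLMForTokenClassification

# Placeholder repository id; replace with the actual Hub path of this checkpoint.
model = LayoutLMForTokenClassification.from_pretrained("your-username/layoutlm-funsd")
tokenizer = LayoutLMTokenizerFast.from_pretrained("microsoft/layoutlm-base-uncased")

# Example words and word-level boxes (0-1000 scale), e.g. produced by an OCR engine.
words = ["Date:", "01/03/1999"]
boxes = [[68, 72, 120, 90], [130, 72, 210, 90]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# LayoutLM needs one box per sub-token: repeat each word's box for its pieces
# and use a dummy box for the special tokens ([CLS], [SEP]).
word_ids = encoding.word_ids(batch_index=0)
token_boxes = [boxes[i] if i is not None else [0, 0, 0, 0] for i in word_ids]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(t, model.config.id2label[p]) for t, p in zip(tokens, predicted_ids)])
```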
## Intended uses & limitations
The model is intended for form-understanding tasks (extracting question, answer, and header fields) on documents similar to FUNSD, i.e. noisy scanned English forms. As the evaluation above shows, performance on the header class (F1 ≈ 0.32) is much weaker than on questions and answers (F1 ≈ 0.76-0.80), and the model has not been evaluated on document types outside FUNSD.
## Training and evaluation data
The model was fine-tuned on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 annotated scanned forms with word-level bounding boxes and entity labels (question, answer, header, other); the results above are computed on its evaluation split. A loading sketch is shown below.
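One way to load FUNSD is through 🤗 Datasets. `nielsr/funsd` is a commonly used Hub copy of the dataset, and its column names (`words`, `bboxes`, `ner_tags`) are assumptions about how the data was prepared here; this card only records the `funsd` dataset tag.

```python
from datasets import load_dataset

# "nielsr/funsd" is an assumed Hub copy of FUNSD, not something recorded in this card.
dataset = load_dataset("nielsr/funsd")

print(dataset)                   # train / test splits
example = dataset["train"][0]
print(example["words"][:5])      # OCR words
print(example["bboxes"][:5])     # word-level boxes on a 0-1000 scale
print(example["ner_tags"][:5])   # integer entity labels (question/answer/header/other)
```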
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an approximate `TrainingArguments` reconstruction follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
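
The original training script is not part of this card, but the values above map directly onto 🤗 `TrainingArguments`. The sketch below is an approximate reconstruction; everything not listed is left at the Trainer defaults (which include the Adam betas/epsilon and linear schedule above), and `evaluation_strategy="epoch"` is an assumption based on the one-evaluation-per-epoch results table that follows.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above; all other
# settings stay at the 🤗 Trainer defaults.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch rows below
)
```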
### Training results
| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.8319 | 1.0 | 10 | 1.6114 | {'precision': 0.02668213457076566, 'recall': 0.02843016069221261, 'f1': 0.02752842609216038, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.21885521885521886, 'recall': 0.18309859154929578, 'f1': 0.19938650306748468, 'number': 1065} | 0.1244 | 0.1094 | 0.1164 | 0.3478 |
| 1.4535 | 2.0 | 20 | 1.2624 | {'precision': 0.2141119221411192, 'recall': 0.21755253399258342, 'f1': 0.2158185162477008, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.45236250968241676, 'recall': 0.5483568075117371, 'f1': 0.4957555178268252, 'number': 1065} | 0.3597 | 0.3813 | 0.3702 | 0.5768 |
| 1.099 | 3.0 | 30 | 0.9496 | {'precision': 0.46866840731070497, 'recall': 0.4437577255871446, 'f1': 0.4558730158730159, 'number': 809} | {'precision': 0.05405405405405406, 'recall': 0.01680672268907563, 'f1': 0.02564102564102564, 'number': 119} | {'precision': 0.6174957118353345, 'recall': 0.676056338028169, 'f1': 0.6454504706409682, 'number': 1065} | 0.5490 | 0.5424 | 0.5457 | 0.7045 |
| 0.8218 | 4.0 | 40 | 0.7695 | {'precision': 0.5814606741573034, 'recall': 0.7676143386897404, 'f1': 0.6616941928609482, 'number': 809} | {'precision': 0.1935483870967742, 'recall': 0.10084033613445378, 'f1': 0.13259668508287292, 'number': 119} | {'precision': 0.6691983122362869, 'recall': 0.7446009389671362, 'f1': 0.7048888888888889, 'number': 1065} | 0.6160 | 0.7155 | 0.6620 | 0.7620 |
| 0.6633 | 5.0 | 50 | 0.7008 | {'precision': 0.6237006237006237, 'recall': 0.7416563658838071, 'f1': 0.6775832862789385, 'number': 809} | {'precision': 0.2571428571428571, 'recall': 0.15126050420168066, 'f1': 0.19047619047619044, 'number': 119} | {'precision': 0.7088055797733217, 'recall': 0.7633802816901408, 'f1': 0.7350813743218807, 'number': 1065} | 0.6567 | 0.7180 | 0.6860 | 0.7819 |
| 0.5651 | 6.0 | 60 | 0.6659 | {'precision': 0.6533192834562698, 'recall': 0.7663782447466008, 'f1': 0.7053469852104665, 'number': 809} | {'precision': 0.2564102564102564, 'recall': 0.25210084033613445, 'f1': 0.2542372881355932, 'number': 119} | {'precision': 0.7251655629139073, 'recall': 0.8225352112676056, 'f1': 0.7707875054993402, 'number': 1065} | 0.6711 | 0.7657 | 0.7153 | 0.7976 |
| 0.4862 | 7.0 | 70 | 0.6514 | {'precision': 0.6496815286624203, 'recall': 0.7564894932014833, 'f1': 0.6990291262135921, 'number': 809} | {'precision': 0.30927835051546393, 'recall': 0.25210084033613445, 'f1': 0.2777777777777778, 'number': 119} | {'precision': 0.7352206494587843, 'recall': 0.8291079812206573, 'f1': 0.7793468667255075, 'number': 1065} | 0.6808 | 0.7652 | 0.7205 | 0.8038 |
| 0.4421 | 8.0 | 80 | 0.6342 | {'precision': 0.6720085470085471, 'recall': 0.7775030902348579, 'f1': 0.7209169054441262, 'number': 809} | {'precision': 0.3017241379310345, 'recall': 0.29411764705882354, 'f1': 0.29787234042553185, 'number': 119} | {'precision': 0.7461928934010152, 'recall': 0.828169014084507, 'f1': 0.7850467289719626, 'number': 1065} | 0.6920 | 0.7757 | 0.7315 | 0.8087 |
| 0.3898 | 9.0 | 90 | 0.6485 | {'precision': 0.7045203969128997, 'recall': 0.7898640296662547, 'f1': 0.7447552447552448, 'number': 809} | {'precision': 0.32038834951456313, 'recall': 0.2773109243697479, 'f1': 0.29729729729729737, 'number': 119} | {'precision': 0.7669902912621359, 'recall': 0.815962441314554, 'f1': 0.7907188353048227, 'number': 1065} | 0.7191 | 0.7732 | 0.7452 | 0.8099 |
| 0.3531 | 10.0 | 100 | 0.6380 | {'precision': 0.7058177826564215, 'recall': 0.7948084054388134, 'f1': 0.7476744186046511, 'number': 809} | {'precision': 0.33980582524271846, 'recall': 0.29411764705882354, 'f1': 0.31531531531531537, 'number': 119} | {'precision': 0.7579672695951766, 'recall': 0.8262910798122066, 'f1': 0.7906558849955077, 'number': 1065} | 0.7163 | 0.7817 | 0.7476 | 0.8155 |
| 0.3226 | 11.0 | 110 | 0.6484 | {'precision': 0.72, 'recall': 0.8009888751545118, 'f1': 0.7583382094792276, 'number': 809} | {'precision': 0.2962962962962963, 'recall': 0.2689075630252101, 'f1': 0.28193832599118945, 'number': 119} | {'precision': 0.7819481680071493, 'recall': 0.8215962441314554, 'f1': 0.8012820512820512, 'number': 1065} | 0.7311 | 0.7802 | 0.7549 | 0.8171 |
| 0.3066 | 12.0 | 120 | 0.6399 | {'precision': 0.7007616974972797, 'recall': 0.796044499381953, 'f1': 0.7453703703703702, 'number': 809} | {'precision': 0.3181818181818182, 'recall': 0.29411764705882354, 'f1': 0.3056768558951965, 'number': 119} | {'precision': 0.7610544217687075, 'recall': 0.8403755868544601, 'f1': 0.7987505577867025, 'number': 1065} | 0.7138 | 0.7898 | 0.7499 | 0.8195 |
| 0.2932 | 13.0 | 130 | 0.6628 | {'precision': 0.7155555555555555, 'recall': 0.796044499381953, 'f1': 0.7536571094207138, 'number': 809} | {'precision': 0.288, 'recall': 0.3025210084033613, 'f1': 0.2950819672131147, 'number': 119} | {'precision': 0.7783783783783784, 'recall': 0.8112676056338028, 'f1': 0.7944827586206896, 'number': 1065} | 0.7232 | 0.7747 | 0.7481 | 0.8163 |
| 0.2739 | 14.0 | 140 | 0.6550 | {'precision': 0.7190265486725663, 'recall': 0.8034610630407911, 'f1': 0.7589025102159953, 'number': 809} | {'precision': 0.3157894736842105, 'recall': 0.3025210084033613, 'f1': 0.30901287553648066, 'number': 119} | {'precision': 0.766695576756288, 'recall': 0.8300469483568075, 'f1': 0.7971145175834085, 'number': 1065} | 0.7232 | 0.7878 | 0.7541 | 0.8173 |
| 0.2715 | 15.0 | 150 | 0.6559 | {'precision': 0.7160220994475138, 'recall': 0.8009888751545118, 'f1': 0.7561260210035006, 'number': 809} | {'precision': 0.31932773109243695, 'recall': 0.31932773109243695, 'f1': 0.31932773109243695, 'number': 119} | {'precision': 0.7674216027874564, 'recall': 0.8272300469483568, 'f1': 0.7962042476276546, 'number': 1065} | 0.7215 | 0.7863 | 0.7525 | 0.8165 |
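
The per-entity columns above follow seqeval's report format (`precision`/`recall`/`f1`/`number` per entity type). Below is a sketch of a `compute_metrics` function that produces this kind of report; the `label_list` (BIO tag names such as `B-ANSWER`) is an assumption about the label set used during fine-tuning and is not recorded in this card.

```python
import numpy as np
from datasets import load_metric

metric = load_metric("seqeval")  # requires the `seqeval` package; still available in datasets 2.14.5

# Assumed FUNSD BIO label set; the exact ordering comes from the dataset preparation.
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]

def compute_metrics(eval_pred):
    """Sketch of the evaluation behind the table above (seqeval-style report)."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)

    # Drop positions labelled -100 (special tokens / padding).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = metric.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
        # Per-entity dicts (Answer/Header/Question) as shown in the table:
        **{k: v for k, v in results.items() if not k.startswith("overall")},
    }
```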
### Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1