---
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# layoutlm-funsd
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6659
- Answer: precision 0.7130, recall 0.8109, F1 0.7588 (support: 809)
- Header: precision 0.3058, recall 0.3109, F1 0.3083 (support: 119)
- Question: precision 0.7858, recall 0.8338, F1 0.8091 (support: 1065)
- Overall Precision: 0.7282
- Overall Recall: 0.7933
- Overall F1: 0.7594
- Overall Accuracy: 0.8113
## Model description

LayoutLM extends a BERT-style text encoder with 2-D position embeddings derived from word bounding boxes, so the model can use both the text and the layout of a scanned document. This checkpoint fine-tunes [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) for token classification on FUNSD, tagging each token of a form as part of a header, question, or answer entity (or as other).
## Intended uses & limitations

The model is intended for semantic entity labelling on scanned forms similar to those in FUNSD, typically downstream of an OCR step that supplies the words and their bounding boxes. As the evaluation figures above show, the header class is considerably harder than question and answer (F1 of roughly 0.31 versus 0.76 and 0.81), and performance on documents that differ substantially from FUNSD forms has not been evaluated.
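
The snippet below is a minimal inference sketch, not the exact pipeline behind this card: it assumes the fine-tuned weights are available locally under `./layoutlm-funsd` (substitute the Hub repository id if the checkpoint has been pushed) and that the words and their 0-1000-normalized bounding boxes come from an upstream OCR step.

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

# Assumption: the fine-tuned checkpoint was saved locally by the Trainer;
# replace the path with the Hub repository id if the model has been pushed.
model = LayoutLMForTokenClassification.from_pretrained("./layoutlm-funsd")
tokenizer = AutoTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")

# Toy input: in practice the words and boxes come from an OCR engine, with
# boxes normalized to the 0-1000 coordinate space that LayoutLM expects.
words = ["Date:", "12/03/1999"]
boxes = [[68, 40, 120, 56], [130, 40, 220, 56]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Repeat each word-level box for its subword tokens; special tokens get a dummy box.
token_boxes = [[0, 0, 0, 0] if i is None else boxes[i] for i in encoding.word_ids()]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist())
print([(token, model.config.id2label[i]) for token, i in zip(tokens, predicted_ids)])
```

If the saved `config.json` does not carry the FUNSD label names, the printed labels fall back to generic `LABEL_k` identifiers.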
## Training and evaluation data

The model was fine-tuned on [FUNSD](https://guillaumejaume.github.io/FUNSD/) (Form Understanding in Noisy Scanned Documents), a dataset of 199 scanned forms annotated with word-level bounding boxes and semantic entity labels (header, question, answer, other). The standard split provides 149 forms for training and 50 for evaluation.
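
A minimal loading sketch is shown below; it assumes the publicly hosted FUNSD copy at `nielsr/funsd` on the Hugging Face Hub, whereas the original run may have used a local `funsd` loading script instead.

```python
from datasets import load_dataset

# Assumption: load the publicly hosted FUNSD copy from the Hub; the original
# run may instead have used a local "funsd" dataset script.
funsd = load_dataset("nielsr/funsd")

print(funsd)                     # DatasetDict with train and test splits
print(funsd["train"][0].keys())  # per-form words, bounding boxes, NER tags, image path
```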
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
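
For reference, the `TrainingArguments` sketch below mirrors the hyperparameters listed above; the output directory and evaluation cadence are assumptions (they are not recorded in this card), and the optimizer entry above matches the Trainer's default AdamW settings, so no explicit optimizer argument is passed.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration above; output_dir and
# evaluation_strategy are placeholders rather than values taken from the run.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",
)
```

With 149 training forms and a batch size of 16, one epoch corresponds to 10 optimization steps, which matches the step counts in the results table below.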
### Training results
| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | |
|:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:| |
| 1.7894 | 1.0 | 10 | 1.6087 | {'precision': 0.022050716648291068, 'recall': 0.024721878862793572, 'f1': 0.023310023310023312, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.21468926553672316, 'recall': 0.2140845070422535, 'f1': 0.21438645980253881, 'number': 1065} | 0.1260 | 0.1244 | 0.1252 | 0.3753 | |
| 1.4429 | 2.0 | 20 | 1.2246 | {'precision': 0.2103861517976032, 'recall': 0.19530284301606923, 'f1': 0.20256410256410257, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.4474885844748858, 'recall': 0.5521126760563381, 'f1': 0.4943253467843632, 'number': 1065} | 0.3613 | 0.3743 | 0.3677 | 0.5866 | |
| 1.0606 | 3.0 | 30 | 0.9253 | {'precision': 0.5022075055187638, 'recall': 0.5624227441285538, 'f1': 0.5306122448979591, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.6054006968641115, 'recall': 0.6525821596244131, 'f1': 0.6281066425666515, 'number': 1065} | 0.5518 | 0.5770 | 0.5641 | 0.7066 | |
| 0.8153 | 4.0 | 40 | 0.7559 | {'precision': 0.6192893401015228, 'recall': 0.754017305315204, 'f1': 0.6800445930880714, 'number': 809} | {'precision': 0.21153846153846154, 'recall': 0.09243697478991597, 'f1': 0.1286549707602339, 'number': 119} | {'precision': 0.6809480401093893, 'recall': 0.7014084507042253, 'f1': 0.6910268270120259, 'number': 1065} | 0.6410 | 0.6864 | 0.6630 | 0.7565 | |
| 0.6686 | 5.0 | 50 | 0.6983 | {'precision': 0.6512378902045209, 'recall': 0.7478368355995055, 'f1': 0.6962025316455697, 'number': 809} | {'precision': 0.25301204819277107, 'recall': 0.17647058823529413, 'f1': 0.20792079207920794, 'number': 119} | {'precision': 0.6876075731497419, 'recall': 0.7502347417840376, 'f1': 0.7175572519083969, 'number': 1065} | 0.6555 | 0.7150 | 0.6839 | 0.7797 | |
| 0.5578 | 6.0 | 60 | 0.6618 | {'precision': 0.6344969199178645, 'recall': 0.7639060568603214, 'f1': 0.6932136848008974, 'number': 809} | {'precision': 0.27586206896551724, 'recall': 0.20168067226890757, 'f1': 0.23300970873786409, 'number': 119} | {'precision': 0.6968724939855654, 'recall': 0.815962441314554, 'f1': 0.7517301038062284, 'number': 1065} | 0.6547 | 0.7582 | 0.7026 | 0.7895 | |
| 0.4916 | 7.0 | 70 | 0.6501 | {'precision': 0.6787234042553192, 'recall': 0.788627935723115, 'f1': 0.729559748427673, 'number': 809} | {'precision': 0.2523364485981308, 'recall': 0.226890756302521, 'f1': 0.23893805309734512, 'number': 119} | {'precision': 0.7281964436917866, 'recall': 0.8075117370892019, 'f1': 0.7658058771148708, 'number': 1065} | 0.6845 | 0.7652 | 0.7226 | 0.7975 | |
| 0.4501 | 8.0 | 80 | 0.6401 | {'precision': 0.6938110749185668, 'recall': 0.7898640296662547, 'f1': 0.738728323699422, 'number': 809} | {'precision': 0.26126126126126126, 'recall': 0.24369747899159663, 'f1': 0.25217391304347825, 'number': 119} | {'precision': 0.7434154630416313, 'recall': 0.8215962441314554, 'f1': 0.7805530776092775, 'number': 1065} | 0.6985 | 0.7742 | 0.7344 | 0.8066 | |
| 0.3986 | 9.0 | 90 | 0.6403 | {'precision': 0.7054945054945055, 'recall': 0.7935723114956736, 'f1': 0.7469458987783596, 'number': 809} | {'precision': 0.2537313432835821, 'recall': 0.2857142857142857, 'f1': 0.26877470355731226, 'number': 119} | {'precision': 0.7491496598639455, 'recall': 0.8272300469483568, 'f1': 0.786256135653726, 'number': 1065} | 0.7014 | 0.7812 | 0.7391 | 0.8069 | |
| 0.3621 | 10.0 | 100 | 0.6501 | {'precision': 0.7071038251366121, 'recall': 0.799752781211372, 'f1': 0.7505800464037122, 'number': 809} | {'precision': 0.29245283018867924, 'recall': 0.2605042016806723, 'f1': 0.27555555555555555, 'number': 119} | {'precision': 0.7715289982425307, 'recall': 0.8244131455399061, 'f1': 0.7970948706309579, 'number': 1065} | 0.7207 | 0.7807 | 0.7495 | 0.8085 | |
| 0.328 | 11.0 | 110 | 0.6625 | {'precision': 0.707742639040349, 'recall': 0.8022249690976514, 'f1': 0.7520278099652375, 'number': 809} | {'precision': 0.28688524590163933, 'recall': 0.29411764705882354, 'f1': 0.2904564315352697, 'number': 119} | {'precision': 0.7820738137082601, 'recall': 0.8356807511737089, 'f1': 0.8079891057648662, 'number': 1065} | 0.7230 | 0.7898 | 0.7549 | 0.8075 | |
| 0.3134 | 12.0 | 120 | 0.6655 | {'precision': 0.711038961038961, 'recall': 0.8121137206427689, 'f1': 0.7582227351413734, 'number': 809} | {'precision': 0.3135593220338983, 'recall': 0.31092436974789917, 'f1': 0.31223628691983124, 'number': 119} | {'precision': 0.7838078291814946, 'recall': 0.8272300469483568, 'f1': 0.8049337597076289, 'number': 1065} | 0.7271 | 0.7903 | 0.7574 | 0.8089 | |
| 0.2962 | 13.0 | 130 | 0.6583 | {'precision': 0.7161716171617162, 'recall': 0.8046971569839307, 'f1': 0.7578579743888243, 'number': 809} | {'precision': 0.3064516129032258, 'recall': 0.31932773109243695, 'f1': 0.31275720164609055, 'number': 119} | {'precision': 0.7808098591549296, 'recall': 0.8328638497652582, 'f1': 0.8059972739663789, 'number': 1065} | 0.7266 | 0.7908 | 0.7573 | 0.8089 | |
| 0.2823 | 14.0 | 140 | 0.6638 | {'precision': 0.7167755991285403, 'recall': 0.8133498145859085, 'f1': 0.7620150550086855, 'number': 809} | {'precision': 0.3135593220338983, 'recall': 0.31092436974789917, 'f1': 0.31223628691983124, 'number': 119} | {'precision': 0.7834960070984915, 'recall': 0.8291079812206573, 'f1': 0.8056569343065694, 'number': 1065} | 0.7295 | 0.7918 | 0.7594 | 0.8102 | |
| 0.2796 | 15.0 | 150 | 0.6659 | {'precision': 0.7130434782608696, 'recall': 0.8108776266996292, 'f1': 0.7588201272411799, 'number': 809} | {'precision': 0.30578512396694213, 'recall': 0.31092436974789917, 'f1': 0.30833333333333335, 'number': 119} | {'precision': 0.7858407079646018, 'recall': 0.8338028169014085, 'f1': 0.8091116173120729, 'number': 1065} | 0.7282 | 0.7933 | 0.7594 | 0.8113 | |
### Framework versions

- Transformers 4.30.2
- PyTorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3