---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv2-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlmv2-base-uncased_finetuned_docvqa
results: []
---
# layoutlmv2-base-uncased_finetuned_docvqa
This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unspecified dataset; the model name indicates fine-tuning for DocVQA-style document question answering.
It achieves the following results on the evaluation set:
- Loss: 4.9529 (final checkpoint; the lowest validation loss reached during training was 2.3026 at step 850)
## Model description
No model description was provided. The base model, LayoutLMv2, is a multimodal Transformer for document image understanding that jointly encodes text, layout (token bounding boxes), and visual features. Based on its name, this checkpoint appears to add an extractive question-answering head fine-tuned for DocVQA-style tasks.
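The snippet below is a minimal inference sketch, assuming the checkpoint was trained with a `LayoutLMv2ForQuestionAnswering` head; the repository id, image path, and question are placeholders, and the processor requires `pytesseract` (for OCR) and `detectron2` (for the visual backbone) to be installed.

```python
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForQuestionAnswering

# Placeholder: substitute the actual Hub repo id or local path of this checkpoint.
model_id = "layoutlmv2-base-uncased_finetuned_docvqa"

# The base processor bundles a tokenizer and an image processor that runs OCR via pytesseract.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForQuestionAnswering.from_pretrained(model_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder document image
question = "What is the invoice number?"           # placeholder question

# Encode the question together with the OCR'd words and their bounding boxes.
encoding = processor(image, question, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Extractive QA: pick the most likely start/end token positions and decode the span.
start_idx = outputs.start_logits.argmax(-1).item()
end_idx = outputs.end_logits.argmax(-1).item()
answer = processor.tokenizer.decode(encoding.input_ids[0][start_idx : end_idx + 1])
print(answer)
```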
## Intended uses & limitations
The checkpoint is presumably intended for extractive question answering over document images (scans, forms, invoices) in the style of DocVQA. The cc-by-nc-sa-4.0 license inherited from the base model restricts it to non-commercial use. Because the training dataset is not documented, generalization beyond its domain is unknown; in addition, the validation loss in the table below rises steadily after roughly epoch 4, which may indicate that the final checkpoint is overfit.
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
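A minimal sketch of how these settings map onto `transformers.TrainingArguments` as used by the `Trainer` that produced this card; `output_dir` and the 50-step evaluation/logging cadence are assumptions, the latter inferred from the step spacing in the results table below.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv2-base-uncased_finetuned_docvqa",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    # Adam(betas=(0.9, 0.999), eps=1e-8) is the Trainer's default optimizer configuration.
    evaluation_strategy="steps",  # assumed: validation loss is reported every 50 steps
    eval_steps=50,
    logging_steps=50,
)
```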
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 5.2903 | 0.22 | 50 | 4.6096 |
| 4.441 | 0.44 | 100 | 4.1809 |
| 4.1512 | 0.66 | 150 | 3.8270 |
| 3.9297 | 0.88 | 200 | 3.6180 |
| 3.7006 | 1.11 | 250 | 3.3508 |
| 3.1238 | 1.33 | 300 | 3.4886 |
| 3.177 | 1.55 | 350 | 3.0878 |
| 2.8817 | 1.77 | 400 | 2.8975 |
| 2.6113 | 1.99 | 450 | 3.1366 |
| 2.9929 | 2.21 | 500 | 4.2811 |
| 2.8507 | 2.43 | 550 | 3.1442 |
| 2.6294 | 2.65 | 600 | 2.7537 |
| 2.9134 | 2.88 | 650 | 4.0845 |
| 2.7527 | 3.1 | 700 | 2.6888 |
| 2.2184 | 3.32 | 750 | 2.6068 |
| 1.9832 | 3.54 | 800 | 2.3920 |
| 1.8607 | 3.76 | 850 | 2.3026 |
| 1.6756 | 3.98 | 900 | 2.4535 |
| 1.594 | 4.2 | 950 | 2.3539 |
| 1.3695 | 4.42 | 1000 | 2.9487 |
| 1.4473 | 4.65 | 1050 | 2.3269 |
| 1.0998 | 4.87 | 1100 | 2.5812 |
| 1.043 | 5.09 | 1150 | 2.7785 |
| 1.0655 | 5.31 | 1200 | 3.1658 |
| 1.2366 | 5.53 | 1250 | 3.5025 |
| 1.1033 | 5.75 | 1300 | 3.0308 |
| 1.1406 | 5.97 | 1350 | 2.4193 |
| 0.7332 | 6.19 | 1400 | 3.0098 |
| 0.7752 | 6.42 | 1450 | 3.0226 |
| 0.9816 | 6.64 | 1500 | 3.1292 |
| 0.794 | 6.86 | 1550 | 3.4569 |
| 0.6923 | 7.08 | 1600 | 3.5805 |
| 0.4034 | 7.3 | 1650 | 3.9237 |
| 0.4836 | 7.52 | 1700 | 3.4433 |
| 0.6216 | 7.74 | 1750 | 3.1084 |
| 0.6027 | 7.96 | 1800 | 3.5491 |
| 0.4783 | 8.19 | 1850 | 3.7448 |
| 0.4513 | 8.41 | 1900 | 3.4646 |
| 0.4544 | 8.63 | 1950 | 3.7954 |
| 0.5161 | 8.85 | 2000 | 3.7831 |
| 0.1872 | 9.07 | 2050 | 3.6736 |
| 0.506 | 9.29 | 2100 | 3.7390 |
| 0.2257 | 9.51 | 2150 | 3.9423 |
| 0.2648 | 9.73 | 2200 | 3.7982 |
| 0.3953 | 9.96 | 2250 | 3.2984 |
| 0.1601 | 10.18 | 2300 | 3.6460 |
| 0.2689 | 10.4 | 2350 | 3.9842 |
| 0.2762 | 10.62 | 2400 | 3.2707 |
| 0.3091 | 10.84 | 2450 | 3.4759 |
| 0.2036 | 11.06 | 2500 | 3.7818 |
| 0.1104 | 11.28 | 2550 | 3.8338 |
| 0.1555 | 11.5 | 2600 | 3.7824 |
| 0.2794 | 11.73 | 2650 | 3.7954 |
| 0.2728 | 11.95 | 2700 | 3.5966 |
| 0.2168 | 12.17 | 2750 | 4.2583 |
| 0.1133 | 12.39 | 2800 | 4.3897 |
| 0.293 | 12.61 | 2850 | 3.9776 |
| 0.1307 | 12.83 | 2900 | 4.4287 |
| 0.2012 | 13.05 | 2950 | 4.0434 |
| 0.1583 | 13.27 | 3000 | 3.8509 |
| 0.1016 | 13.5 | 3050 | 3.9090 |
| 0.0329 | 13.72 | 3100 | 4.2917 |
| 0.1034 | 13.94 | 3150 | 4.3789 |
| 0.0928 | 14.16 | 3200 | 4.4046 |
| 0.1318 | 14.38 | 3250 | 4.2611 |
| 0.1015 | 14.6 | 3300 | 4.4932 |
| 0.1499 | 14.82 | 3350 | 4.4150 |
| 0.1858 | 15.04 | 3400 | 4.1948 |
| 0.1402 | 15.27 | 3450 | 4.3734 |
| 0.0584 | 15.49 | 3500 | 4.3949 |
| 0.0288 | 15.71 | 3550 | 4.6144 |
| 0.0554 | 15.93 | 3600 | 4.8472 |
| 0.0853 | 16.15 | 3650 | 4.7406 |
| 0.0111 | 16.37 | 3700 | 5.0774 |
| 0.1094 | 16.59 | 3750 | 4.9672 |
| 0.0102 | 16.81 | 3800 | 4.9885 |
| 0.0884 | 17.04 | 3850 | 5.0612 |
| 0.0318 | 17.26 | 3900 | 5.1363 |
| 0.1083 | 17.48 | 3950 | 4.7403 |
| 0.0891 | 17.7 | 4000 | 4.6907 |
| 0.0495 | 17.92 | 4050 | 4.7827 |
| 0.015 | 18.14 | 4100 | 5.0118 |
| 0.0554 | 18.36 | 4150 | 4.9823 |
| 0.084 | 18.58 | 4200 | 4.9539 |
| 0.0714 | 18.81 | 4250 | 4.8877 |
| 0.0573 | 19.03 | 4300 | 4.9120 |
| 0.012 | 19.25 | 4350 | 4.9568 |
| 0.0381 | 19.47 | 4400 | 4.9459 |
| 0.0126 | 19.69 | 4450 | 4.9544 |
| 0.0591 | 19.91 | 4500 | 4.9529 |
### Framework versions
- Transformers 4.32.1
- Pytorch 2.1.0
- Datasets 2.14.6
- Tokenizers 0.13.3