---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv2-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlmv2-base-uncased_finetuned_docvqa
results: []
---
# layoutlmv2-base-uncased_finetuned_docvqa
This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased). The training dataset was not recorded by the Trainer; the model name suggests a DocVQA-style document visual question answering dataset.
It achieves the following result on the evaluation set (final checkpoint, step 4500):
- Loss: 4.4423
## Model description
LayoutLMv2 is a multimodal Transformer that jointly encodes the text, layout (bounding boxes), and image of a document page. This checkpoint fine-tunes the base model for extractive document question answering (the model name points to DocVQA-style data); beyond that, details of the training setup were not provided.
## Intended uses & limitations
The model appears intended for extractive question answering over document images (scanned pages, forms, invoices). Known limitations:
- The base model is uncased and was pretrained on English documents, so it is unlikely to work well on other languages or case-sensitive content.
- Running it requires `detectron2` for the visual backbone, and the default `LayoutLMv2Processor` relies on `pytesseract` for OCR.
- The license is CC BY-NC-SA 4.0, which excludes commercial use.
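A minimal inference sketch follows, assuming the checkpoint carries a question-answering head and that `detectron2` and `pytesseract` are installed; the repository id, image path, and question are placeholders, not values from the original training run.

```python
# Hypothetical usage sketch: repo id, image path and question are placeholders.
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForQuestionAnswering

repo_id = "your-namespace/layoutlmv2-base-uncased_finetuned_docvqa"  # placeholder repo id
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForQuestionAnswering.from_pretrained(repo_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # any scanned document page
question = "What is the invoice total?"

# The processor OCRs the page (pytesseract) and packs words, boxes and pixels together.
encoding = processor(image, question, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Extractive QA: take the most likely start/end tokens and decode the answer span.
start = outputs.start_logits.argmax(-1).item()
end = outputs.end_logits.argmax(-1).item()
answer = processor.tokenizer.decode(encoding.input_ids.squeeze(0)[start : end + 1])
print(answer)
```

The span decoding above is the simplest possible strategy; a more careful setup would mask out question tokens and pick the best valid (start, end) pair.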
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
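For reference, a sketch of how these hyperparameters could be expressed as `TrainingArguments` in Transformers 4.35.2; the output directory is a placeholder, while the 50-step evaluation cadence matches the interval in the results table below.

```python
# Hypothetical reconstruction of the hyperparameters listed above;
# output_dir is a placeholder, not recovered from the original run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv2-base-uncased_finetuned_docvqa",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=50,    # the results table reports a validation loss every 50 steps
    logging_steps=50,
)
```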
### Training results
Validation loss bottoms out at 2.2914 around step 650 (epoch ~2.9) and rises steadily afterwards, so the final loss reported above does not reflect the best checkpoint.
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 5.2693 | 0.22 | 50 | 4.4222 |
| 4.3703 | 0.44 | 100 | 4.1079 |
| 4.1363 | 0.66 | 150 | 3.9209 |
| 3.7332 | 0.88 | 200 | 3.6332 |
| 3.4591 | 1.11 | 250 | 3.5577 |
| 3.1781 | 1.33 | 300 | 3.1319 |
| 3.3388 | 1.55 | 350 | 3.0866 |
| 2.8356 | 1.77 | 400 | 2.7820 |
| 2.4286 | 1.99 | 450 | 2.8378 |
| 2.0496 | 2.21 | 500 | 2.5224 |
| 1.9469 | 2.43 | 550 | 2.5281 |
| 1.8342 | 2.65 | 600 | 2.5674 |
| 1.6589 | 2.88 | 650 | 2.2914 |
| 1.6939 | 3.1 | 700 | 2.4427 |
| 1.3883 | 3.32 | 750 | 2.5626 |
| 1.3944 | 3.54 | 800 | 2.3736 |
| 1.2459 | 3.76 | 850 | 2.7994 |
| 1.5218 | 3.98 | 900 | 2.5390 |
| 1.1471 | 4.2 | 950 | 2.5951 |
| 0.8888 | 4.42 | 1000 | 2.7430 |
| 0.971 | 4.65 | 1050 | 2.5219 |
| 1.0425 | 4.87 | 1100 | 2.5474 |
| 0.7665 | 5.09 | 1150 | 2.9321 |
| 0.8039 | 5.31 | 1200 | 2.7369 |
| 0.6426 | 5.53 | 1250 | 3.1309 |
| 0.6628 | 5.75 | 1300 | 3.1167 |
| 0.906 | 5.97 | 1350 | 3.8550 |
| 0.6223 | 6.19 | 1400 | 3.4892 |
| 0.6274 | 6.42 | 1450 | 3.2927 |
| 0.4732 | 6.64 | 1500 | 3.4192 |
| 0.5962 | 6.86 | 1550 | 3.2867 |
| 0.6761 | 7.08 | 1600 | 3.0610 |
| 0.4096 | 7.3 | 1650 | 3.5926 |
| 0.457 | 7.52 | 1700 | 3.2824 |
| 0.3721 | 7.74 | 1750 | 3.4383 |
| 0.4547 | 7.96 | 1800 | 3.4794 |
| 0.4231 | 8.19 | 1850 | 3.7591 |
| 0.3292 | 8.41 | 1900 | 3.8104 |
| 0.4401 | 8.63 | 1950 | 3.7450 |
| 0.446 | 8.85 | 2000 | 3.5815 |
| 0.3362 | 9.07 | 2050 | 3.6245 |
| 0.1832 | 9.29 | 2100 | 3.7162 |
| 0.2085 | 9.51 | 2150 | 3.8565 |
| 0.3248 | 9.73 | 2200 | 3.4577 |
| 0.4722 | 9.96 | 2250 | 3.6518 |
| 0.2575 | 10.18 | 2300 | 3.8701 |
| 0.2336 | 10.4 | 2350 | 3.7511 |
| 0.2864 | 10.62 | 2400 | 3.7999 |
| 0.2091 | 10.84 | 2450 | 3.8716 |
| 0.2371 | 11.06 | 2500 | 3.7909 |
| 0.1582 | 11.28 | 2550 | 4.0463 |
| 0.2519 | 11.5 | 2600 | 3.9798 |
| 0.1223 | 11.73 | 2650 | 4.3331 |
| 0.1838 | 11.95 | 2700 | 4.1601 |
| 0.1204 | 12.17 | 2750 | 4.2846 |
| 0.1797 | 12.39 | 2800 | 4.1595 |
| 0.123 | 12.61 | 2850 | 4.2625 |
| 0.2177 | 12.83 | 2900 | 4.0050 |
| 0.1728 | 13.05 | 2950 | 4.0885 |
| 0.1525 | 13.27 | 3000 | 3.9733 |
| 0.0388 | 13.5 | 3050 | 4.1072 |
| 0.0788 | 13.72 | 3100 | 4.2446 |
| 0.1629 | 13.94 | 3150 | 4.0483 |
| 0.0377 | 14.16 | 3200 | 4.2435 |
| 0.0966 | 14.38 | 3250 | 4.1510 |
| 0.0943 | 14.6 | 3300 | 4.2591 |
| 0.048 | 14.82 | 3350 | 4.1876 |
| 0.097 | 15.04 | 3400 | 4.2489 |
| 0.0188 | 15.27 | 3450 | 4.3612 |
| 0.1163 | 15.49 | 3500 | 4.2931 |
| 0.0754 | 15.71 | 3550 | 4.3306 |
| 0.1044 | 15.93 | 3600 | 4.2243 |
| 0.0316 | 16.15 | 3650 | 4.3932 |
| 0.005 | 16.37 | 3700 | 4.4173 |
| 0.0389 | 16.59 | 3750 | 4.3939 |
| 0.0505 | 16.81 | 3800 | 4.3207 |
| 0.0501 | 17.04 | 3850 | 4.3601 |
| 0.0491 | 17.26 | 3900 | 4.3211 |
| 0.0048 | 17.48 | 3950 | 4.3425 |
| 0.0043 | 17.7 | 4000 | 4.3461 |
| 0.0309 | 17.92 | 4050 | 4.3733 |
| 0.0246 | 18.14 | 4100 | 4.3912 |
| 0.0055 | 18.36 | 4150 | 4.4020 |
| 0.0078 | 18.58 | 4200 | 4.4256 |
| 0.0057 | 18.81 | 4250 | 4.4462 |
| 0.0352 | 19.03 | 4300 | 4.4558 |
| 0.0451 | 19.25 | 4350 | 4.4557 |
| 0.063 | 19.47 | 4400 | 4.4395 |
| 0.0123 | 19.69 | 4450 | 4.4428 |
| 0.0291 | 19.91 | 4500 | 4.4423 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0