---
library_name: transformers
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv2-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlmv2-base-uncased_finetuned_docvqa
  results: []
---

# layoutlmv2-base-uncased_finetuned_docvqa

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9710

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
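
The original training script is not included in this card, so the block below is only a minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`. The `output_dir` and the evaluation/logging intervals are assumptions (the 5-step spacing of the validation-loss log below suggests step-based evaluation); the listed Adam betas and epsilon match the library defaults.

```python
# Minimal sketch, not the original training script: the hyperparameters listed
# above expressed as transformers.TrainingArguments. Dataset preparation and the
# Trainer / data-collator setup are omitted; output_dir and the eval/logging
# intervals are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv2-base-uncased_finetuned_docvqa",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    # AdamW with betas=(0.9, 0.999) and epsilon=1e-08 is the library default.
    eval_strategy="steps",  # inferred from the 5-step validation-loss log below
    eval_steps=5,
    logging_steps=5,
)
```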

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 6.12          | 0.0221 | 5    | 5.8130          |
| 5.8642        | 0.0442 | 10   | 5.5440          |
| 5.739         | 0.0664 | 15   | 5.3407          |
| 5.24          | 0.0885 | 20   | 5.1918          |
| 5.2382        | 0.1106 | 25   | 5.0621          |
| 5.0044        | 0.1327 | 30   | 4.9099          |
| 4.8735        | 0.1549 | 35   | 4.7621          |
| 4.5752        | 0.1770 | 40   | 4.7436          |
| 4.9789        | 0.1991 | 45   | 4.6436          |
| 5.3167        | 0.2212 | 50   | 4.5981          |
| 5.1172        | 0.2434 | 55   | 4.6847          |
| 4.7205        | 0.2655 | 60   | 4.5649          |
| 4.5686        | 0.2876 | 65   | 4.5079          |
| 4.774         | 0.3097 | 70   | 4.3704          |
| 4.2153        | 0.3319 | 75   | 4.3057          |
| 4.5881        | 0.3540 | 80   | 4.2297          |
| 4.4437        | 0.3761 | 85   | 4.2064          |
| 4.1528        | 0.3982 | 90   | 4.1870          |
| 4.2176        | 0.4204 | 95   | 4.2060          |
| 4.145         | 0.4425 | 100  | 4.1738          |
| 4.487         | 0.4646 | 105  | 4.1157          |
| 4.215         | 0.4867 | 110  | 4.1209          |
| 4.2117        | 0.5088 | 115  | 4.0113          |
| 4.2441        | 0.5310 | 120  | 3.9862          |
| 3.8206        | 0.5531 | 125  | 4.0846          |
| 4.418         | 0.5752 | 130  | 3.9696          |
| 3.8883        | 0.5973 | 135  | 3.9478          |
| 3.9334        | 0.6195 | 140  | 3.9126          |
| 4.2097        | 0.6416 | 145  | 3.8813          |
| 4.0268        | 0.6637 | 150  | 3.9252          |
| 4.126         | 0.6858 | 155  | 3.8643          |
| 4.0452        | 0.7080 | 160  | 3.9387          |
| 3.9409        | 0.7301 | 165  | 3.8127          |
| 3.9958        | 0.7522 | 170  | 3.7989          |
| 3.8162        | 0.7743 | 175  | 3.8034          |
| 3.5596        | 0.7965 | 180  | 3.8704          |
| 4.081         | 0.8186 | 185  | 3.7822          |
| 4.1374        | 0.8407 | 190  | 3.7431          |
| 4.1355        | 0.8628 | 195  | 3.7494          |
| 4.0031        | 0.8850 | 200  | 3.7118          |
| 4.0624        | 0.9071 | 205  | 3.8061          |
| 3.7152        | 0.9292 | 210  | 3.7471          |
| 4.301         | 0.9513 | 215  | 3.9199          |
| 4.0595        | 0.9735 | 220  | 3.7722          |
| 4.1836        | 0.9956 | 225  | 3.6203          |
| 3.6276        | 1.0177 | 230  | 3.6073          |
| 3.4787        | 1.0398 | 235  | 3.5770          |
| 3.3633        | 1.0619 | 240  | 3.5469          |
| 3.2999        | 1.0841 | 245  | 3.6939          |
| 3.4353        | 1.1062 | 250  | 3.7339          |
| 3.663         | 1.1283 | 255  | 3.5301          |
| 3.283         | 1.1504 | 260  | 3.5172          |
| 3.5445        | 1.1726 | 265  | 3.5076          |
| 3.1999        | 1.1947 | 270  | 3.5342          |
| 3.4036        | 1.2168 | 275  | 3.4955          |
| 3.31          | 1.2389 | 280  | 3.4295          |
| 3.3661        | 1.2611 | 285  | 3.4398          |
| 3.2727        | 1.2832 | 290  | 3.4223          |
| 3.3522        | 1.3053 | 295  | 3.4298          |
| 3.1652        | 1.3274 | 300  | 3.4076          |
| 2.9084        | 1.3496 | 305  | 3.3806          |
| 3.2943        | 1.3717 | 310  | 3.3692          |
| 3.2965        | 1.3938 | 315  | 3.3601          |
| 3.2069        | 1.4159 | 320  | 3.3893          |
| 3.285         | 1.4381 | 325  | 3.4980          |
| 3.1824        | 1.4602 | 330  | 3.4643          |
| 3.4277        | 1.4823 | 335  | 3.3506          |
| 3.1088        | 1.5044 | 340  | 3.2569          |
| 3.1225        | 1.5265 | 345  | 3.2182          |
| 2.9275        | 1.5487 | 350  | 3.3265          |
| 3.0438        | 1.5708 | 355  | 3.3541          |
| 3.2014        | 1.5929 | 360  | 3.2822          |
| 3.0306        | 1.6150 | 365  | 3.2362          |
| 2.9716        | 1.6372 | 370  | 3.2018          |
| 3.0015        | 1.6593 | 375  | 3.1488          |
| 2.8433        | 1.6814 | 380  | 3.1138          |
| 3.0251        | 1.7035 | 385  | 3.0836          |
| 3.0188        | 1.7257 | 390  | 3.1137          |
| 2.8269        | 1.7478 | 395  | 3.1072          |
| 3.2609        | 1.7699 | 400  | 3.1077          |
| 2.8849        | 1.7920 | 405  | 3.1659          |
| 2.6843        | 1.8142 | 410  | 3.2268          |
| 2.9859        | 1.8363 | 415  | 3.2020          |
| 2.5574        | 1.8584 | 420  | 3.1025          |
| 2.9709        | 1.8805 | 425  | 3.1188          |
| 3.1064        | 1.9027 | 430  | 3.0549          |
| 2.7347        | 1.9248 | 435  | 2.9965          |
| 2.6075        | 1.9469 | 440  | 2.9799          |
| 2.9998        | 1.9690 | 445  | 3.0093          |
| 2.4259        | 1.9912 | 450  | 3.1338          |
| 2.5547        | 2.0133 | 455  | 3.3225          |
| 2.9147        | 2.0354 | 460  | 3.3662          |
| 3.004         | 2.0575 | 465  | 3.2570          |
| 2.4481        | 2.0796 | 470  | 3.1761          |
| 2.5156        | 2.1018 | 475  | 3.1332          |
| 2.5695        | 2.1239 | 480  | 3.0219          |
| 2.3243        | 2.1460 | 485  | 3.0122          |
| 2.4268        | 2.1681 | 490  | 3.0692          |
| 2.3157        | 2.1903 | 495  | 3.1625          |
| 2.6856        | 2.2124 | 500  | 3.1868          |
| 2.3567        | 2.2345 | 505  | 3.1789          |
| 2.3799        | 2.2566 | 510  | 3.1141          |
| 2.3814        | 2.2788 | 515  | 3.0845          |
| 2.6517        | 2.3009 | 520  | 3.0001          |
| 2.8808        | 2.3230 | 525  | 2.9786          |
| 2.2501        | 2.3451 | 530  | 3.0351          |
| 2.4319        | 2.3673 | 535  | 3.0998          |
| 2.4569        | 2.3894 | 540  | 3.1180          |
| 2.1893        | 2.4115 | 545  | 3.0840          |
| 2.5029        | 2.4336 | 550  | 3.0379          |
| 2.5414        | 2.4558 | 555  | 2.9775          |
| 2.414         | 2.4779 | 560  | 2.9478          |
| 2.4732        | 2.5    | 565  | 2.9530          |
| 2.7319        | 2.5221 | 570  | 2.9462          |
| 2.3984        | 2.5442 | 575  | 2.9199          |
| 2.1631        | 2.5664 | 580  | 2.9257          |
| 2.1815        | 2.5885 | 585  | 2.9564          |
| 2.4294        | 2.6106 | 590  | 2.9570          |
| 2.298         | 2.6327 | 595  | 2.9290          |
| 2.2535        | 2.6549 | 600  | 2.9287          |
| 2.1774        | 2.6770 | 605  | 2.9196          |
| 2.2014        | 2.6991 | 610  | 2.9162          |
| 2.1422        | 2.7212 | 615  | 2.9466          |
| 2.494         | 2.7434 | 620  | 2.9844          |
| 2.6516        | 2.7655 | 625  | 2.9899          |
| 2.1923        | 2.7876 | 630  | 2.9580          |
| 2.3944        | 2.8097 | 635  | 2.9432          |
| 2.0892        | 2.8319 | 640  | 2.9422          |
| 2.129         | 2.8540 | 645  | 2.9597          |
| 2.4273        | 2.8761 | 650  | 2.9647          |
| 2.1467        | 2.8982 | 655  | 2.9614          |
| 2.1653        | 2.9204 | 660  | 2.9596          |
| 2.1992        | 2.9425 | 665  | 2.9642          |
| 2.1921        | 2.9646 | 670  | 2.9702          |
| 2.2585        | 2.9867 | 675  | 2.9710          |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1
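
No usage example is included above, so the snippet below is only a hedged sketch of extractive document question answering with this checkpoint. It assumes the fine-tuned weights are loadable under the model name shown on this card (adjust the repo id or local path as needed), reuses the base checkpoint's processor because no processor files are described here, and requires `detectron2` and `pytesseract`, which LayoutLMv2 needs for its visual backbone and built-in OCR.

```python
# Hedged usage sketch: extractive DocVQA-style question answering with LayoutLMv2.
# The repo id and image path below are placeholders, not confirmed by the card.
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForQuestionAnswering

# Reuse the base checkpoint's processor (applies OCR via pytesseract by default).
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForQuestionAnswering.from_pretrained(
    "layoutlmv2-base-uncased_finetuned_docvqa"  # placeholder repo id / local path
)
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder document image
question = "What is the invoice number?"

# The processor OCRs the image, tokenizes the question plus recognized words,
# and aligns each token with its bounding box and the image features.
encoding = processor(image, question, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Extractive QA head: take the most likely start/end positions and decode the span.
start = outputs.start_logits.argmax(-1).item()
end = outputs.end_logits.argmax(-1).item()
answer = processor.tokenizer.decode(encoding["input_ids"][0, start : end + 1])
print(answer)
```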