---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
  - generated_from_trainer
datasets:
  - datasetprepfrom_gcp
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: model10.0
    results:
      - task:
          name: Token Classification
          type: token-classification
        dataset:
          name: datasetprepfrom_gcp
          type: datasetprepfrom_gcp
          config: discharge
          split: test
          args: discharge
        metrics:
          - name: Precision
            type: precision
            value: 0.7824019024970273
          - name: Recall
            type: recall
            value: 0.7396028475084301
          - name: F1
            type: f1
            value: 0.7604006163328197
          - name: Accuracy
            type: accuracy
            value: 0.9480585417540451
---

# model10.0

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on the datasetprepfrom_gcp dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

- Loss: 0.3117
- Precision: 0.7824
- Recall: 0.7396
- F1: 0.7604
- Accuracy: 0.9481
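
Below is a minimal, hedged inference sketch in Python. The repo id `Jyotiyadav/model10.0` and the input file name are assumptions (the card does not state where the checkpoint is hosted), and `apply_ocr=True` relies on Tesseract/pytesseract being installed.

```python
# Minimal inference sketch for this LayoutLMv3 token-classification checkpoint.
# Assumptions: the repo id and the sample image path below are placeholders;
# apply_ocr=True makes the processor run Tesseract OCR to extract words/boxes.
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

repo_id = "Jyotiyadav/model10.0"  # assumed repo id; substitute the real checkpoint
processor = AutoProcessor.from_pretrained(repo_id, apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

image = Image.open("discharge_page.png").convert("RGB")  # hypothetical input scan
encoding = processor(image, return_tensors="pt", truncation=True)

outputs = model(**encoding)
# Note: the predictions still include special tokens; filter them for real use.
pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in pred_ids])
```

With `apply_ocr=False`, words and normalized bounding boxes from an external OCR engine can be passed to the processor instead of relying on the built-in OCR.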

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 10000
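
These settings map directly onto `transformers.TrainingArguments`. A sketch under stated assumptions: the output directory is a placeholder, the 100-step evaluation cadence is inferred from the results table below, and the actual training script, dataset loading, and label set are not published with this card.

```python
# Hedged sketch of TrainingArguments matching the hyperparameters above.
# Assumptions: output_dir is a placeholder; eval every 100 steps is inferred
# from the training-results table; the original script is unpublished.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="model10.0",           # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    max_steps=10_000,                 # training_steps: 10000
    lr_scheduler_type="linear",
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=100,                   # matches the 100-step rows below
)
```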

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 0.1094 | 100 | 1.1070 | 0.0820 | 0.0037 | 0.0072 | 0.8202 |
| No log | 0.2188 | 200 | 0.9957 | 0.0449 | 0.0049 | 0.0088 | 0.8244 |
| No log | 0.3282 | 300 | 0.9355 | 0.1949 | 0.0590 | 0.0906 | 0.8338 |
| No log | 0.4376 | 400 | 0.8255 | 0.2373 | 0.1223 | 0.1614 | 0.8453 |
| 1.0172 | 0.5470 | 500 | 0.7511 | 0.3185 | 0.1510 | 0.2049 | 0.8489 |
| 1.0172 | 0.6565 | 600 | 0.7203 | 0.3609 | 0.1997 | 0.2571 | 0.8510 |
| 1.0172 | 0.7659 | 700 | 0.6571 | 0.4256 | 0.2565 | 0.3200 | 0.8755 |
| 1.0172 | 0.8753 | 800 | 0.5845 | 0.4510 | 0.3127 | 0.3693 | 0.8873 |
| 1.0172 | 0.9847 | 900 | 0.5614 | 0.5053 | 0.3477 | 0.4119 | 0.8948 |
| 0.45 | 1.0941 | 1000 | 0.5126 | 0.5561 | 0.3818 | 0.4527 | 0.9042 |
| 0.45 | 1.2035 | 1100 | 0.5350 | 0.5790 | 0.4176 | 0.4852 | 0.9092 |
| 0.45 | 1.3129 | 1200 | 0.4846 | 0.5449 | 0.4562 | 0.4966 | 0.9068 |
| 0.45 | 1.4223 | 1300 | 0.4681 | 0.6153 | 0.4843 | 0.5420 | 0.9149 |
| 0.45 | 1.5317 | 1400 | 0.4807 | 0.6572 | 0.4874 | 0.5598 | 0.9164 |
| 0.2388 | 1.6411 | 1500 | 0.4096 | 0.5848 | 0.5264 | 0.5541 | 0.9146 |
| 0.2388 | 1.7505 | 1600 | 0.3991 | 0.6344 | 0.5474 | 0.5877 | 0.9203 |
| 0.2388 | 1.8600 | 1700 | 0.4044 | 0.5996 | 0.5691 | 0.5840 | 0.9185 |
| 0.2388 | 1.9694 | 1800 | 0.4065 | 0.6607 | 0.5869 | 0.6216 | 0.9258 |
| 0.2388 | 2.0788 | 1900 | 0.4159 | 0.6018 | 0.5802 | 0.5908 | 0.9123 |
| 0.1768 | 2.1882 | 2000 | 0.4203 | 0.6846 | 0.5822 | 0.6293 | 0.9268 |
| 0.1768 | 2.2976 | 2100 | 0.3934 | 0.6566 | 0.5954 | 0.6245 | 0.9276 |
| 0.1768 | 2.4070 | 2200 | 0.3879 | 0.7137 | 0.6038 | 0.6542 | 0.9313 |
| 0.1768 | 2.5164 | 2300 | 0.3829 | 0.5685 | 0.6326 | 0.5989 | 0.9175 |
| 0.1768 | 2.6258 | 2400 | 0.3508 | 0.7175 | 0.6191 | 0.6647 | 0.9328 |
| 0.1399 | 2.7352 | 2500 | 0.3215 | 0.6869 | 0.6311 | 0.6578 | 0.9327 |
| 0.1399 | 2.8446 | 2600 | 0.3271 | 0.7248 | 0.6171 | 0.6666 | 0.9358 |
| 0.1399 | 2.9540 | 2700 | 0.3226 | 0.6491 | 0.6544 | 0.6517 | 0.9277 |
| 0.1399 | 3.0635 | 2800 | 0.3336 | 0.6596 | 0.6386 | 0.6490 | 0.9278 |
| 0.1399 | 3.1729 | 2900 | 0.3423 | 0.6480 | 0.6624 | 0.6551 | 0.9314 |
| 0.1083 | 3.2823 | 3000 | 0.3698 | 0.7509 | 0.6566 | 0.7006 | 0.9372 |
| 0.1083 | 3.3917 | 3100 | 0.3353 | 0.6457 | 0.6649 | 0.6552 | 0.9287 |
| 0.1083 | 3.5011 | 3200 | 0.3391 | 0.7518 | 0.6626 | 0.7044 | 0.9383 |
| 0.1083 | 3.6105 | 3300 | 0.3314 | 0.7350 | 0.6699 | 0.7010 | 0.9381 |
| 0.1083 | 3.7199 | 3400 | 0.3338 | 0.6728 | 0.6832 | 0.6779 | 0.9347 |
| 0.0988 | 3.8293 | 3500 | 0.3239 | 0.7509 | 0.6753 | 0.7111 | 0.9369 |
| 0.0988 | 3.9387 | 3600 | 0.3481 | 0.7555 | 0.6564 | 0.7025 | 0.9395 |
| 0.0988 | 4.0481 | 3700 | 0.3231 | 0.6749 | 0.6883 | 0.6815 | 0.9348 |
| 0.0988 | 4.1575 | 3800 | 0.3581 | 0.7669 | 0.6699 | 0.7151 | 0.9411 |
| 0.0988 | 4.2670 | 3900 | 0.3213 | 0.7174 | 0.6873 | 0.7021 | 0.9389 |
| 0.0775 | 4.3764 | 4000 | 0.3244 | 0.7433 | 0.6738 | 0.7069 | 0.9387 |
| 0.0775 | 4.4858 | 4100 | 0.3275 | 0.7370 | 0.6868 | 0.7110 | 0.9405 |
| 0.0775 | 4.5952 | 4200 | 0.3197 | 0.7405 | 0.6997 | 0.7195 | 0.9413 |
| 0.0775 | 4.7046 | 4300 | 0.3183 | 0.7419 | 0.6935 | 0.7169 | 0.9415 |
| 0.0775 | 4.8140 | 4400 | 0.2961 | 0.7445 | 0.6933 | 0.7180 | 0.9408 |
| 0.0771 | 4.9234 | 4500 | 0.3195 | 0.7542 | 0.6986 | 0.7253 | 0.9426 |
| 0.0771 | 5.0328 | 4600 | 0.3295 | 0.7637 | 0.7010 | 0.7310 | 0.9435 |
| 0.0771 | 5.1422 | 4700 | 0.3204 | 0.7603 | 0.7006 | 0.7293 | 0.9434 |
| 0.0771 | 5.2516 | 4800 | 0.2992 | 0.7443 | 0.6995 | 0.7212 | 0.9395 |
| 0.0771 | 5.3611 | 4900 | 0.2978 | 0.7312 | 0.7033 | 0.7170 | 0.9393 |
| 0.0647 | 5.4705 | 5000 | 0.3324 | 0.7608 | 0.7079 | 0.7334 | 0.9432 |
| 0.0647 | 5.5799 | 5100 | 0.3356 | 0.7635 | 0.7038 | 0.7324 | 0.9430 |
| 0.0647 | 5.6893 | 5200 | 0.3121 | 0.7634 | 0.7121 | 0.7368 | 0.9430 |
| 0.0647 | 5.7987 | 5300 | 0.3392 | 0.7858 | 0.7003 | 0.7406 | 0.9448 |
| 0.0647 | 5.9081 | 5400 | 0.2952 | 0.7265 | 0.7220 | 0.7242 | 0.9412 |
| 0.0573 | 6.0175 | 5500 | 0.3070 | 0.7311 | 0.7211 | 0.7260 | 0.9429 |
| 0.0573 | 6.1269 | 5600 | 0.3207 | 0.7414 | 0.7241 | 0.7326 | 0.9435 |
| 0.0573 | 6.2363 | 5700 | 0.3130 | 0.7685 | 0.7231 | 0.7451 | 0.9455 |
| 0.0573 | 6.3457 | 5800 | 0.3441 | 0.7752 | 0.7139 | 0.7433 | 0.9447 |
| 0.0573 | 6.4551 | 5900 | 0.3196 | 0.7818 | 0.7128 | 0.7457 | 0.9458 |
| 0.0529 | 6.5646 | 6000 | 0.3369 | 0.7907 | 0.7164 | 0.7517 | 0.9456 |
| 0.0529 | 6.6740 | 6100 | 0.3059 | 0.7394 | 0.7267 | 0.7330 | 0.9435 |
| 0.0529 | 6.7834 | 6200 | 0.3043 | 0.7624 | 0.7231 | 0.7422 | 0.9444 |
| 0.0529 | 6.8928 | 6300 | 0.3028 | 0.7527 | 0.7252 | 0.7387 | 0.9441 |
| 0.0529 | 7.0022 | 6400 | 0.3089 | 0.7596 | 0.7293 | 0.7441 | 0.9457 |
| 0.0542 | 7.1116 | 6500 | 0.2927 | 0.7306 | 0.7286 | 0.7296 | 0.9408 |
| 0.0542 | 7.2210 | 6600 | 0.3178 | 0.7785 | 0.7274 | 0.7521 | 0.9456 |
| 0.0542 | 7.3304 | 6700 | 0.3267 | 0.7653 | 0.7304 | 0.7474 | 0.9450 |
| 0.0542 | 7.4398 | 6800 | 0.3254 | 0.7618 | 0.7280 | 0.7445 | 0.9450 |
| 0.0542 | 7.5492 | 6900 | 0.3240 | 0.7856 | 0.7254 | 0.7543 | 0.9464 |
| 0.0416 | 7.6586 | 7000 | 0.3203 | 0.7682 | 0.7319 | 0.7496 | 0.9463 |
| 0.0416 | 7.7681 | 7100 | 0.3176 | 0.7801 | 0.7299 | 0.7542 | 0.9468 |
| 0.0416 | 7.8775 | 7200 | 0.3012 | 0.7601 | 0.7355 | 0.7476 | 0.9470 |
| 0.0416 | 7.9869 | 7300 | 0.3092 | 0.7336 | 0.7377 | 0.7357 | 0.9436 |
| 0.0416 | 8.0963 | 7400 | 0.3025 | 0.7782 | 0.7349 | 0.7559 | 0.9480 |
| 0.0422 | 8.2057 | 7500 | 0.3046 | 0.7594 | 0.7340 | 0.7465 | 0.9459 |
| 0.0422 | 8.3151 | 7600 | 0.3113 | 0.7640 | 0.7332 | 0.7483 | 0.9458 |
| 0.0422 | 8.4245 | 7700 | 0.3002 | 0.7579 | 0.7394 | 0.7485 | 0.9461 |
| 0.0422 | 8.5339 | 7800 | 0.3173 | 0.7742 | 0.7321 | 0.7526 | 0.9464 |
| 0.0422 | 8.6433 | 7900 | 0.3084 | 0.7766 | 0.7334 | 0.7544 | 0.9467 |
| 0.041 | 8.7527 | 8000 | 0.3118 | 0.7829 | 0.7325 | 0.7569 | 0.9477 |
| 0.041 | 8.8621 | 8100 | 0.3145 | 0.7788 | 0.7389 | 0.7583 | 0.9473 |
| 0.041 | 8.9716 | 8200 | 0.3123 | 0.7788 | 0.7366 | 0.7571 | 0.9480 |
| 0.041 | 9.0810 | 8300 | 0.3088 | 0.7754 | 0.7398 | 0.7572 | 0.9476 |
| 0.041 | 9.1904 | 8400 | 0.3101 | 0.7804 | 0.7415 | 0.7604 | 0.9491 |
| 0.0323 | 9.2998 | 8500 | 0.3152 | 0.7829 | 0.7357 | 0.7585 | 0.9482 |
| 0.0323 | 9.4092 | 8600 | 0.3061 | 0.7734 | 0.7398 | 0.7562 | 0.9476 |
| 0.0323 | 9.5186 | 8700 | 0.3086 | 0.7636 | 0.7437 | 0.7535 | 0.9476 |
| 0.0323 | 9.6280 | 8800 | 0.3162 | 0.7723 | 0.7390 | 0.7553 | 0.9476 |
| 0.0323 | 9.7374 | 8900 | 0.3070 | 0.7605 | 0.7419 | 0.7511 | 0.9467 |
| 0.0357 | 9.8468 | 9000 | 0.3117 | 0.7824 | 0.7396 | 0.7604 | 0.9481 |
| 0.0357 | 9.9562 | 9100 | 0.3130 | 0.7750 | 0.7396 | 0.7569 | 0.9472 |
| 0.0357 | 10.0656 | 9200 | 0.3095 | 0.7673 | 0.7405 | 0.7537 | 0.9476 |
| 0.0357 | 10.1751 | 9300 | 0.3179 | 0.7868 | 0.7357 | 0.7604 | 0.9477 |
| 0.0357 | 10.2845 | 9400 | 0.3077 | 0.7645 | 0.7405 | 0.7523 | 0.9472 |
| 0.0359 | 10.3939 | 9500 | 0.3128 | 0.7798 | 0.7366 | 0.7576 | 0.9476 |
| 0.0359 | 10.5033 | 9600 | 0.3151 | 0.7784 | 0.7377 | 0.7575 | 0.9475 |
| 0.0359 | 10.6127 | 9700 | 0.3138 | 0.7744 | 0.7420 | 0.7579 | 0.9478 |
| 0.0359 | 10.7221 | 9800 | 0.3115 | 0.7688 | 0.7415 | 0.7549 | 0.9475 |
| 0.0359 | 10.8315 | 9900 | 0.3097 | 0.7673 | 0.7411 | 0.7540 | 0.9472 |
| 0.0301 | 10.9409 | 10000 | 0.3095 | 0.7674 | 0.7409 | 0.7539 | 0.9474 |
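
The precision, recall, and F1 columns above are entity-level metrics of the kind usually computed for token classification with the `evaluate`/`seqeval` stack. A hedged sketch of such a `compute_metrics` function follows; the label list is hypothetical, since the card publishes neither the label set nor the actual metric code.

```python
# Hedged sketch of a compute_metrics function for token classification.
# Assumptions: the label list is hypothetical and the card's actual metric
# code is unpublished; seqeval reports entity-level precision/recall/F1.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # hypothetical label set

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)
    # Positions labelled -100 are special/padded tokens and must be ignored.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```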

### Framework versions

- Transformers 4.41.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1