
SwinV2-Base-Document-Classifier

This model is a fine-tuned version of microsoft/swinv2-base-patch4-window16-256 for document classification; the fine-tuning dataset is not documented. It achieves the following results on the evaluation set:

  • Loss: 2.0511
  • Accuracy: 0.7904
  • F1: 0.7080
  • Precision: 0.7454
  • Recall: 0.6989

Model description

SwinV2-Base is a Swin Transformer V2 image encoder (patch size 4, window size 16, 256×256 input resolution) with a classification head, fine-tuned here to classify document images. The number and names of the document classes are not documented in this card; they are recorded in the repository's config (id2label).

Intended uses & limitations

The model is intended for classifying document images into the label set it was fine-tuned on. Because the training data, class labels, and failure modes are not documented, evaluate it on your own data before relying on it; an inference sketch follows below.
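
As an illustration only, here is a minimal inference sketch using the standard Transformers image-classification API; the input file name is a placeholder, and the label names come from the repository config rather than this card:

```python
# Minimal inference sketch (not taken from the source card); assumes the
# standard Transformers image-classification API.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "amaye15/SwinV2-Base-Document-Classifier"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder input file
inputs = processor(images=image, return_tensors="pt")  # resizes to 256x256

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # class names are defined in the repo config
```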

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 5000
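
These settings map onto a standard Trainer configuration roughly as follows; this is a sketch, not the authors' script, and output_dir plus the evaluation/logging cadence (every 20 steps, matching the results table below) are assumptions:

```python
# Hedged sketch of a TrainingArguments object matching the listed
# hyperparameters; output_dir, eval_strategy/eval_steps, and logging_steps
# are assumptions, not taken from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-base-document-classifier",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=5000,         # "training_steps: 5000" above
    eval_strategy="steps",  # the results table evaluates every 20 steps
    eval_steps=20,
    logging_steps=20,
)
# The Trainer's default AdamW optimizer uses the betas=(0.9, 0.999) and
# epsilon=1e-08 listed above.
```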

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|:---------:|:------:|
| 0.8169 | 0.004 | 20 | 1.5288 | 0.3019 | 0.2413 | 0.2918 | 0.3691 |
| 0.3313 | 0.008 | 40 | 1.4209 | 0.4957 | 0.4772 | 0.5469 | 0.5327 |
| 0.1596 | 0.012 | 60 | 1.3420 | 0.6057 | 0.5825 | 0.5851 | 0.6313 |
| 0.1548 | 0.016 | 80 | 1.1491 | 0.6777 | 0.6146 | 0.6104 | 0.6343 |
| 0.0697 | 0.02 | 100 | 1.3103 | 0.7192 | 0.6588 | 0.6647 | 0.6722 |
| 0.1706 | 0.024 | 120 | 1.3826 | 0.7058 | 0.6304 | 0.6722 | 0.6510 |
| 0.0444 | 0.028 | 140 | 1.5106 | 0.6552 | 0.6201 | 0.6130 | 0.6549 |
| 0.0726 | 0.032 | 160 | 1.5560 | 0.6724 | 0.6267 | 0.6289 | 0.6507 |
| 0.065 | 0.036 | 180 | 2.2979 | 0.5478 | 0.5452 | 0.5887 | 0.6163 |
| 0.089 | 0.04 | 200 | 1.5792 | 0.7126 | 0.6410 | 0.6621 | 0.6422 |
| 0.0666 | 0.044 | 220 | 1.6487 | 0.7553 | 0.6607 | 0.7106 | 0.6670 |
| 0.0588 | 0.048 | 240 | 1.6536 | 0.7368 | 0.6607 | 0.6628 | 0.6719 |
| 0.0552 | 0.052 | 260 | 1.6955 | 0.7502 | 0.6611 | 0.6885 | 0.6682 |
| 0.0621 | 0.056 | 280 | 1.5985 | 0.7553 | 0.6730 | 0.6865 | 0.6702 |
| 0.1101 | 0.06 | 300 | 1.6365 | 0.7085 | 0.6489 | 0.6486 | 0.6757 |
| 0.0794 | 0.064 | 320 | 1.6918 | 0.7447 | 0.6598 | 0.6861 | 0.6488 |
| 0.0659 | 0.068 | 340 | 1.8215 | 0.7026 | 0.6319 | 0.6480 | 0.6519 |
| 0.0283 | 0.072 | 360 | 2.0111 | 0.7415 | 0.6423 | 0.7066 | 0.6446 |
| 0.068 | 0.076 | 380 | 1.7918 | 0.7304 | 0.6666 | 0.6767 | 0.6785 |
| 0.0647 | 0.08 | 400 | 1.7306 | 0.7472 | 0.6691 | 0.6863 | 0.6799 |
| 0.0471 | 0.084 | 420 | 1.8406 | 0.7619 | 0.6604 | 0.7178 | 0.6664 |
| 0.0376 | 0.088 | 440 | 1.8206 | 0.7324 | 0.6689 | 0.6676 | 0.6835 |
| 0.0479 | 0.092 | 460 | 1.8339 | 0.7460 | 0.6631 | 0.6885 | 0.6724 |
| 0.0423 | 0.096 | 480 | 1.9314 | 0.7562 | 0.6582 | 0.7107 | 0.6682 |
| 0.0532 | 0.1 | 500 | 1.6011 | 0.7710 | 0.6969 | 0.6979 | 0.7024 |
| 0.0351 | 0.104 | 520 | 1.7001 | 0.7649 | 0.6882 | 0.6939 | 0.6940 |
| 0.0986 | 0.108 | 540 | 1.6234 | 0.7570 | 0.6811 | 0.6970 | 0.6769 |
| 0.059 | 0.112 | 560 | 1.6405 | 0.7555 | 0.6740 | 0.6904 | 0.6798 |
| 0.0257 | 0.116 | 580 | 2.1886 | 0.7313 | 0.6391 | 0.7047 | 0.6479 |
| 0.0595 | 0.12 | 600 | 1.8580 | 0.7600 | 0.6606 | 0.7189 | 0.6597 |
| 0.0362 | 0.124 | 620 | 1.7232 | 0.7687 | 0.6859 | 0.7063 | 0.6807 |
| 0.0346 | 0.128 | 640 | 1.8170 | 0.7396 | 0.6558 | 0.6848 | 0.6718 |
| 0.0509 | 0.132 | 660 | 1.7384 | 0.7509 | 0.6771 | 0.6783 | 0.6843 |
| 0.0356 | 0.136 | 680 | 1.7770 | 0.7642 | 0.6838 | 0.6939 | 0.6868 |
| 0.0782 | 0.14 | 700 | 1.7917 | 0.7313 | 0.6619 | 0.6682 | 0.6728 |
| 0.0305 | 0.144 | 720 | 1.9640 | 0.7611 | 0.6711 | 0.7247 | 0.6532 |
| 0.0547 | 0.148 | 740 | 1.7882 | 0.7672 | 0.6891 | 0.7156 | 0.6758 |
| 0.0574 | 0.152 | 760 | 1.6707 | 0.7619 | 0.6914 | 0.6875 | 0.6997 |
| 0.0341 | 0.156 | 780 | 1.8867 | 0.7776 | 0.6887 | 0.7298 | 0.6838 |
| 0.0486 | 0.16 | 800 | 1.8698 | 0.7651 | 0.6860 | 0.7039 | 0.6891 |
| 0.0304 | 0.164 | 820 | 1.9863 | 0.7725 | 0.6864 | 0.7145 | 0.6879 |
| 0.0529 | 0.168 | 840 | 1.8715 | 0.7744 | 0.6933 | 0.7091 | 0.6890 |
| 0.0428 | 0.172 | 860 | 1.8680 | 0.7434 | 0.6784 | 0.6743 | 0.6914 |
| 0.0303 | 0.176 | 880 | 1.9197 | 0.7708 | 0.6892 | 0.7115 | 0.6947 |
| 0.0214 | 0.18 | 900 | 1.9956 | 0.7404 | 0.6761 | 0.6846 | 0.6887 |
| 0.0667 | 0.184 | 920 | 1.8409 | 0.7651 | 0.6913 | 0.7060 | 0.6871 |
| 0.0459 | 0.188 | 940 | 1.9177 | 0.7742 | 0.6840 | 0.7192 | 0.6760 |
| 0.0232 | 0.192 | 960 | 1.8778 | 0.7566 | 0.6826 | 0.6922 | 0.6943 |
| 0.0187 | 0.196 | 980 | 2.2011 | 0.7600 | 0.6612 | 0.7209 | 0.6717 |
| 0.0293 | 0.2 | 1000 | 2.1118 | 0.7744 | 0.6806 | 0.7236 | 0.6870 |
| 0.058 | 0.204 | 1020 | 2.0024 | 0.7372 | 0.6554 | 0.6851 | 0.6742 |
| 0.0471 | 0.208 | 1040 | 2.0062 | 0.7485 | 0.6651 | 0.7007 | 0.6723 |
| 0.0247 | 0.212 | 1060 | 2.1716 | 0.7630 | 0.6709 | 0.7303 | 0.6608 |
| 0.0228 | 0.216 | 1080 | 2.0076 | 0.7704 | 0.6807 | 0.7219 | 0.6835 |
| 0.024 | 0.22 | 1100 | 1.9334 | 0.7789 | 0.6872 | 0.7222 | 0.6843 |
| 0.0244 | 0.224 | 1120 | 1.9520 | 0.7710 | 0.6920 | 0.7007 | 0.6998 |
| 0.0395 | 0.228 | 1140 | 2.0621 | 0.7702 | 0.6927 | 0.7118 | 0.6934 |
| 0.0365 | 0.232 | 1160 | 2.0195 | 0.7517 | 0.6833 | 0.6829 | 0.6969 |
| 0.022 | 0.236 | 1180 | 1.9342 | 0.7672 | 0.6909 | 0.7003 | 0.6930 |
| 0.0418 | 0.24 | 1200 | 1.9085 | 0.7649 | 0.6929 | 0.6970 | 0.6912 |
| 0.0376 | 0.244 | 1220 | 1.9928 | 0.7655 | 0.6729 | 0.7273 | 0.6592 |
| 0.0304 | 0.248 | 1240 | 1.8051 | 0.7725 | 0.6933 | 0.7044 | 0.6919 |
| 0.0145 | 0.252 | 1260 | 2.1409 | 0.7602 | 0.6596 | 0.7256 | 0.6556 |
| 0.0136 | 0.256 | 1280 | 1.9489 | 0.7628 | 0.6825 | 0.7030 | 0.6871 |
| 0.0398 | 0.26 | 1300 | 1.9823 | 0.7547 | 0.6741 | 0.6948 | 0.6741 |
| 0.051 | 0.264 | 1320 | 1.9092 | 0.7651 | 0.6875 | 0.7127 | 0.6793 |
| 0.0293 | 0.268 | 1340 | 2.1178 | 0.7664 | 0.6670 | 0.7354 | 0.6615 |
| 0.023 | 0.272 | 1360 | 2.0888 | 0.7608 | 0.6731 | 0.7221 | 0.6734 |
| 0.0164 | 0.276 | 1380 | 1.9806 | 0.7776 | 0.6869 | 0.7386 | 0.6787 |
| 0.0191 | 0.28 | 1400 | 2.1181 | 0.7732 | 0.6766 | 0.7423 | 0.6648 |
| 0.0068 | 0.284 | 1420 | 2.0415 | 0.7715 | 0.6765 | 0.7282 | 0.6799 |
| 0.0237 | 0.288 | 1440 | 2.0610 | 0.7672 | 0.6720 | 0.7234 | 0.6791 |
| 0.0316 | 0.292 | 1460 | 1.9141 | 0.7876 | 0.7030 | 0.7296 | 0.6995 |
| 0.0317 | 0.296 | 1480 | 1.9383 | 0.7855 | 0.6985 | 0.7340 | 0.6946 |
| 0.0265 | 0.3 | 1500 | 1.9830 | 0.7827 | 0.6951 | 0.7234 | 0.6960 |
| 0.0171 | 0.304 | 1520 | 2.0997 | 0.7802 | 0.6883 | 0.7418 | 0.6805 |
| 0.0248 | 0.308 | 1540 | 2.2015 | 0.7534 | 0.6693 | 0.7211 | 0.6662 |
| 0.0328 | 0.312 | 1560 | 2.0481 | 0.7804 | 0.6938 | 0.7327 | 0.6893 |
| 0.0274 | 0.316 | 1580 | 2.0649 | 0.7738 | 0.6926 | 0.7201 | 0.6873 |
| 0.0032 | 0.32 | 1600 | 2.2288 | 0.7659 | 0.6778 | 0.7335 | 0.6642 |
| 0.0281 | 0.324 | 1620 | 1.9919 | 0.7662 | 0.6825 | 0.7225 | 0.6751 |
| 0.0332 | 0.328 | 1640 | 1.9466 | 0.7719 | 0.6884 | 0.7283 | 0.6772 |
| 0.0532 | 0.332 | 1660 | 1.9138 | 0.7534 | 0.6870 | 0.7011 | 0.6794 |
| 0.0498 | 0.336 | 1680 | 1.8470 | 0.7672 | 0.6858 | 0.7213 | 0.6763 |
| 0.027 | 0.34 | 1700 | 1.7451 | 0.7808 | 0.6991 | 0.7236 | 0.6950 |
| 0.0259 | 0.344 | 1720 | 1.7737 | 0.7874 | 0.7064 | 0.7363 | 0.6983 |
| 0.0184 | 0.348 | 1740 | 1.9273 | 0.7585 | 0.6849 | 0.6966 | 0.6997 |
| 0.0216 | 0.352 | 1760 | 2.1094 | 0.7674 | 0.6828 | 0.7252 | 0.6826 |
| 0.0343 | 0.356 | 1780 | 2.0939 | 0.7574 | 0.6696 | 0.7151 | 0.6612 |
| 0.0213 | 0.36 | 1800 | 2.0420 | 0.7698 | 0.6827 | 0.7218 | 0.6757 |
| 0.0144 | 0.364 | 1820 | 2.0380 | 0.7747 | 0.6928 | 0.7251 | 0.6850 |
| 0.0113 | 0.368 | 1840 | 1.8928 | 0.7817 | 0.7039 | 0.7135 | 0.7056 |
| 0.0093 | 0.372 | 1860 | 1.9707 | 0.7834 | 0.7049 | 0.7176 | 0.7065 |
| 0.0277 | 0.376 | 1880 | 2.3124 | 0.7485 | 0.6676 | 0.7109 | 0.6715 |
| 0.0089 | 0.38 | 1900 | 2.2395 | 0.7566 | 0.6740 | 0.7106 | 0.6762 |
| 0.0064 | 0.384 | 1920 | 2.2374 | 0.7696 | 0.6842 | 0.7250 | 0.6794 |
| 0.0497 | 0.388 | 1940 | 2.2056 | 0.7666 | 0.6838 | 0.7207 | 0.6812 |
| 0.0029 | 0.392 | 1960 | 2.0368 | 0.7808 | 0.7027 | 0.7137 | 0.7049 |
| 0.0553 | 0.396 | 1980 | 2.1463 | 0.7666 | 0.6873 | 0.7153 | 0.6873 |
| 0.042 | 0.4 | 2000 | 1.9924 | 0.7870 | 0.7016 | 0.7342 | 0.6946 |
| 0.0622 | 0.404 | 2020 | 1.8710 | 0.7753 | 0.6946 | 0.7156 | 0.6984 |
| 0.0228 | 0.408 | 2040 | 1.8997 | 0.7719 | 0.6919 | 0.7138 | 0.6952 |
| 0.023 | 0.412 | 2060 | 1.8278 | 0.7915 | 0.7097 | 0.7346 | 0.7052 |
| 0.0515 | 0.416 | 2080 | 1.7625 | 0.7689 | 0.6975 | 0.6948 | 0.7066 |
| 0.0205 | 0.42 | 2100 | 1.8979 | 0.7742 | 0.6884 | 0.7240 | 0.6867 |
| 0.0129 | 0.424 | 2120 | 1.9146 | 0.7893 | 0.7032 | 0.7430 | 0.6947 |
| 0.0117 | 0.428 | 2140 | 1.8620 | 0.7864 | 0.7088 | 0.7245 | 0.7046 |
| 0.021 | 0.432 | 2160 | 1.9294 | 0.7742 | 0.6968 | 0.7160 | 0.6992 |
| 0.0521 | 0.436 | 2180 | 2.1119 | 0.7364 | 0.6495 | 0.7039 | 0.6618 |
| 0.0151 | 0.44 | 2200 | 1.8735 | 0.7751 | 0.6901 | 0.7172 | 0.6922 |
| 0.0335 | 0.444 | 2220 | 1.8854 | 0.7795 | 0.6956 | 0.7336 | 0.6884 |
| 0.0242 | 0.448 | 2240 | 1.7997 | 0.7878 | 0.7050 | 0.7304 | 0.7020 |
| 0.0293 | 0.452 | 2260 | 1.9462 | 0.7817 | 0.6973 | 0.7333 | 0.6879 |
| 0.0171 | 0.456 | 2280 | 1.9591 | 0.7851 | 0.6927 | 0.7373 | 0.6855 |
| 0.0207 | 0.46 | 2300 | 1.9415 | 0.7834 | 0.6983 | 0.7273 | 0.6964 |
| 0.0042 | 0.464 | 2320 | 2.1175 | 0.7770 | 0.6868 | 0.7369 | 0.6821 |
| 0.0649 | 0.468 | 2340 | 2.0327 | 0.7817 | 0.6863 | 0.7437 | 0.6821 |
| 0.0147 | 0.472 | 2360 | 1.9038 | 0.7889 | 0.7044 | 0.7331 | 0.6999 |
| 0.0112 | 0.476 | 2380 | 1.9565 | 0.7802 | 0.6957 | 0.7258 | 0.6945 |
| 0.0145 | 0.48 | 2400 | 1.9352 | 0.7881 | 0.7081 | 0.7280 | 0.7006 |
| 0.0264 | 0.484 | 2420 | 1.9185 | 0.7887 | 0.7073 | 0.7292 | 0.7004 |
| 0.0513 | 0.488 | 2440 | 2.0005 | 0.7598 | 0.6823 | 0.7052 | 0.6864 |
| 0.0106 | 0.492 | 2460 | 1.8639 | 0.7795 | 0.7012 | 0.7139 | 0.7029 |
| 0.0102 | 0.496 | 2480 | 1.8810 | 0.7795 | 0.6985 | 0.7200 | 0.6964 |
| 0.0324 | 0.5 | 2500 | 2.0004 | 0.7674 | 0.6846 | 0.7165 | 0.6824 |
| 0.0066 | 0.504 | 2520 | 2.0025 | 0.7834 | 0.7006 | 0.7280 | 0.6990 |
| 0.0325 | 0.508 | 2540 | 2.0293 | 0.7812 | 0.6961 | 0.7328 | 0.6902 |
| 0.0218 | 0.512 | 2560 | 2.0315 | 0.7764 | 0.6964 | 0.7208 | 0.6931 |
| 0.0495 | 0.516 | 2580 | 2.1118 | 0.7659 | 0.6848 | 0.7133 | 0.6885 |
| 0.0116 | 0.52 | 2600 | 2.2202 | 0.7591 | 0.6738 | 0.7197 | 0.6715 |
| 0.0087 | 0.524 | 2620 | 2.0047 | 0.7759 | 0.6917 | 0.7258 | 0.6852 |
| 0.0413 | 0.528 | 2640 | 2.0134 | 0.7772 | 0.6919 | 0.7306 | 0.6851 |
| 0.0196 | 0.532 | 2660 | 2.1001 | 0.7732 | 0.6852 | 0.7337 | 0.6785 |
| 0.0124 | 0.536 | 2680 | 2.1026 | 0.7772 | 0.6883 | 0.7426 | 0.6757 |
| 0.0192 | 0.54 | 2700 | 2.0533 | 0.7706 | 0.6889 | 0.7246 | 0.6798 |
| 0.0155 | 0.544 | 2720 | 2.0434 | 0.7830 | 0.6947 | 0.7402 | 0.6838 |
| 0.0203 | 0.548 | 2740 | 2.0044 | 0.7832 | 0.6961 | 0.7354 | 0.6849 |
| 0.0236 | 0.552 | 2760 | 1.9366 | 0.7842 | 0.6987 | 0.7292 | 0.6918 |
| 0.0249 | 0.556 | 2780 | 1.9613 | 0.7851 | 0.7002 | 0.7325 | 0.6971 |
| 0.0044 | 0.56 | 2800 | 1.9085 | 0.7861 | 0.7035 | 0.7276 | 0.7019 |
| 0.0146 | 0.564 | 2820 | 2.0575 | 0.7891 | 0.7029 | 0.7461 | 0.6940 |
| 0.0163 | 0.568 | 2840 | 2.0780 | 0.7872 | 0.7008 | 0.7434 | 0.6959 |
| 0.022 | 0.572 | 2860 | 2.0810 | 0.7864 | 0.6995 | 0.7385 | 0.6955 |
| 0.0309 | 0.576 | 2880 | 1.9523 | 0.7887 | 0.7055 | 0.7305 | 0.7025 |
| 0.0209 | 0.58 | 2900 | 2.0953 | 0.7851 | 0.6978 | 0.7388 | 0.6932 |
| 0.0027 | 0.584 | 2920 | 2.2845 | 0.7696 | 0.6821 | 0.7357 | 0.6709 |
| 0.0224 | 0.588 | 2940 | 2.1609 | 0.7730 | 0.6883 | 0.7309 | 0.6813 |
| 0.0564 | 0.592 | 2960 | 2.0449 | 0.7744 | 0.6969 | 0.7202 | 0.6928 |
| 0.0108 | 0.596 | 2980 | 2.0108 | 0.7832 | 0.7031 | 0.7279 | 0.7026 |
| 0.0187 | 0.6 | 3000 | 2.0438 | 0.7727 | 0.6902 | 0.7199 | 0.6927 |
| 0.0063 | 0.604 | 3020 | 2.1797 | 0.7645 | 0.6818 | 0.7229 | 0.6824 |
| 0.0096 | 0.608 | 3040 | 1.9905 | 0.7798 | 0.7003 | 0.7250 | 0.6972 |
| 0.0038 | 0.612 | 3060 | 1.9953 | 0.7793 | 0.7006 | 0.7195 | 0.7025 |
| 0.0287 | 0.616 | 3080 | 2.2580 | 0.7460 | 0.6669 | 0.7043 | 0.6804 |
| 0.0609 | 0.62 | 3100 | 2.0312 | 0.7795 | 0.6988 | 0.7295 | 0.6926 |
| 0.0004 | 0.624 | 3120 | 2.0029 | 0.7876 | 0.7070 | 0.7357 | 0.6997 |
| 0.001 | 0.628 | 3140 | 2.1025 | 0.7857 | 0.6999 | 0.7471 | 0.6892 |
| 0.0007 | 0.632 | 3160 | 2.0503 | 0.7802 | 0.6994 | 0.7305 | 0.6917 |
| 0.0173 | 0.636 | 3180 | 1.9860 | 0.7866 | 0.7071 | 0.7318 | 0.6994 |
| 0.0397 | 0.64 | 3200 | 1.9786 | 0.7783 | 0.7040 | 0.7139 | 0.7026 |
| 0.0094 | 0.644 | 3220 | 2.1086 | 0.7587 | 0.6851 | 0.7006 | 0.6912 |
| 0.0366 | 0.648 | 3240 | 2.0076 | 0.7774 | 0.6992 | 0.7179 | 0.7009 |
| 0.0067 | 0.652 | 3260 | 2.0249 | 0.7885 | 0.7056 | 0.7372 | 0.7010 |
| 0.0342 | 0.656 | 3280 | 2.0464 | 0.7821 | 0.7009 | 0.7309 | 0.7010 |
| 0.0223 | 0.66 | 3300 | 2.0162 | 0.7853 | 0.7025 | 0.7320 | 0.7014 |
| 0.0392 | 0.664 | 3320 | 2.1415 | 0.7810 | 0.6940 | 0.7450 | 0.6835 |
| 0.0076 | 0.668 | 3340 | 2.2212 | 0.7674 | 0.6837 | 0.7318 | 0.6745 |
| 0.0204 | 0.672 | 3360 | 2.0552 | 0.7832 | 0.6987 | 0.7343 | 0.6940 |
| 0.0081 | 0.676 | 3380 | 2.0096 | 0.7832 | 0.7026 | 0.7293 | 0.6990 |
| 0.0259 | 0.68 | 3400 | 2.0168 | 0.7781 | 0.6961 | 0.7260 | 0.6958 |
| 0.031 | 0.684 | 3420 | 2.0094 | 0.7810 | 0.7000 | 0.7328 | 0.6979 |
| 0.0426 | 0.688 | 3440 | 1.9587 | 0.7876 | 0.7044 | 0.7380 | 0.6980 |
| 0.0468 | 0.692 | 3460 | 1.9638 | 0.7847 | 0.7005 | 0.7344 | 0.6976 |
| 0.0024 | 0.696 | 3480 | 1.9529 | 0.7908 | 0.7076 | 0.7417 | 0.7015 |
| 0.0021 | 0.7 | 3500 | 1.9416 | 0.7902 | 0.7084 | 0.7392 | 0.7036 |
| 0.002 | 0.704 | 3520 | 2.0436 | 0.7825 | 0.7004 | 0.7315 | 0.6996 |
| 0.0116 | 0.708 | 3540 | 2.0268 | 0.7851 | 0.7047 | 0.7300 | 0.7049 |
| 0.0072 | 0.712 | 3560 | 2.0174 | 0.7781 | 0.7002 | 0.7184 | 0.7030 |
| 0.0009 | 0.716 | 3580 | 2.0239 | 0.7842 | 0.7041 | 0.7278 | 0.7050 |
| 0.0137 | 0.72 | 3600 | 2.0557 | 0.7785 | 0.6975 | 0.7239 | 0.6998 |
| 0.0055 | 0.724 | 3620 | 2.0745 | 0.7830 | 0.6990 | 0.7349 | 0.6981 |
| 0.0325 | 0.728 | 3640 | 1.9863 | 0.7847 | 0.7056 | 0.7240 | 0.7071 |
| 0.0013 | 0.732 | 3660 | 2.0067 | 0.7830 | 0.7034 | 0.7232 | 0.7047 |
| 0.0225 | 0.736 | 3680 | 2.0209 | 0.7834 | 0.7009 | 0.7294 | 0.7012 |
| 0.0218 | 0.74 | 3700 | 1.9571 | 0.7919 | 0.7133 | 0.7335 | 0.7108 |
| 0.0273 | 0.744 | 3720 | 1.9922 | 0.7936 | 0.7124 | 0.7437 | 0.7062 |
| 0.0155 | 0.748 | 3740 | 1.9423 | 0.7951 | 0.7140 | 0.7411 | 0.7085 |
| 0.0147 | 0.752 | 3760 | 1.9515 | 0.7942 | 0.7125 | 0.7428 | 0.7057 |
| 0.0265 | 0.756 | 3780 | 1.9910 | 0.7883 | 0.7051 | 0.7312 | 0.7057 |
| 0.0136 | 0.76 | 3800 | 2.0000 | 0.7800 | 0.6978 | 0.7219 | 0.6987 |
| 0.0001 | 0.764 | 3820 | 2.0066 | 0.7776 | 0.6965 | 0.7185 | 0.6983 |
| 0.0009 | 0.768 | 3840 | 1.9357 | 0.7895 | 0.7098 | 0.7293 | 0.7072 |
| 0.0109 | 0.772 | 3860 | 2.0023 | 0.7827 | 0.7025 | 0.7286 | 0.6992 |
| 0.0419 | 0.776 | 3880 | 1.9449 | 0.7840 | 0.7048 | 0.7245 | 0.7053 |
| 0.0062 | 0.78 | 3900 | 2.0165 | 0.7842 | 0.7003 | 0.7321 | 0.6999 |
| 0.0099 | 0.784 | 3920 | 2.0016 | 0.7859 | 0.7038 | 0.7322 | 0.7023 |
| 0.0523 | 0.788 | 3940 | 1.9272 | 0.7868 | 0.7065 | 0.7317 | 0.7018 |
| 0.0108 | 0.792 | 3960 | 1.9112 | 0.7885 | 0.7077 | 0.7287 | 0.7059 |
| 0.0147 | 0.796 | 3980 | 1.9271 | 0.7800 | 0.7005 | 0.7176 | 0.7027 |
| 0.0115 | 0.8 | 4000 | 1.9350 | 0.7823 | 0.7021 | 0.7214 | 0.7035 |
| 0.0274 | 0.804 | 4020 | 1.8936 | 0.7915 | 0.7119 | 0.7301 | 0.7111 |
| 0.0279 | 0.808 | 4040 | 1.8748 | 0.8021 | 0.7208 | 0.7477 | 0.7148 |
| 0.0004 | 0.812 | 4060 | 1.9129 | 0.8014 | 0.7196 | 0.7506 | 0.7129 |
| 0.0171 | 0.816 | 4080 | 1.9970 | 0.7940 | 0.7105 | 0.7447 | 0.7072 |
| 0.0002 | 0.82 | 4100 | 1.9983 | 0.7949 | 0.7083 | 0.7466 | 0.7026 |
| 0.0019 | 0.824 | 4120 | 2.0107 | 0.7951 | 0.7084 | 0.7473 | 0.7029 |
| 0.0059 | 0.828 | 4140 | 2.0663 | 0.7857 | 0.7013 | 0.7453 | 0.6952 |
| 0.0304 | 0.832 | 4160 | 2.0433 | 0.7853 | 0.7024 | 0.7373 | 0.7022 |
| 0.0416 | 0.836 | 4180 | 2.0629 | 0.7855 | 0.7010 | 0.7454 | 0.6971 |
| 0.0489 | 0.84 | 4200 | 2.0777 | 0.7840 | 0.6992 | 0.7406 | 0.7008 |
| 0.0105 | 0.844 | 4220 | 2.0575 | 0.7859 | 0.7009 | 0.7389 | 0.7011 |
| 0.0198 | 0.848 | 4240 | 2.0226 | 0.7912 | 0.7071 | 0.7441 | 0.7032 |
| 0.0221 | 0.852 | 4260 | 2.0201 | 0.7917 | 0.7078 | 0.7451 | 0.7043 |
| 0.0168 | 0.856 | 4280 | 1.9936 | 0.7966 | 0.7141 | 0.7476 | 0.7079 |
| 0.0057 | 0.86 | 4300 | 1.9740 | 0.7978 | 0.7156 | 0.7481 | 0.7077 |
| 0.0449 | 0.864 | 4320 | 2.0593 | 0.7887 | 0.7046 | 0.7509 | 0.6932 |
| 0.0151 | 0.868 | 4340 | 2.0453 | 0.7887 | 0.7056 | 0.7504 | 0.6935 |
| 0.0044 | 0.872 | 4360 | 2.0808 | 0.7853 | 0.7029 | 0.7488 | 0.6891 |
| 0.0071 | 0.876 | 4380 | 2.0784 | 0.7847 | 0.7025 | 0.7473 | 0.6895 |
| 0.0 | 0.88 | 4400 | 2.0776 | 0.7855 | 0.7040 | 0.7469 | 0.6923 |
| 0.0171 | 0.884 | 4420 | 2.0440 | 0.7878 | 0.7062 | 0.7467 | 0.6937 |
| 0.0185 | 0.888 | 4440 | 2.0283 | 0.7887 | 0.7085 | 0.7457 | 0.6987 |
| 0.0337 | 0.892 | 4460 | 2.0318 | 0.7881 | 0.7056 | 0.7440 | 0.6963 |
| 0.018 | 0.896 | 4480 | 2.0252 | 0.7915 | 0.7094 | 0.7474 | 0.6997 |
| 0.0033 | 0.9 | 4500 | 1.9966 | 0.7942 | 0.7133 | 0.7451 | 0.7056 |
| 0.0002 | 0.904 | 4520 | 2.0223 | 0.7902 | 0.7094 | 0.7446 | 0.7004 |
| 0.018 | 0.908 | 4540 | 2.0072 | 0.7874 | 0.7074 | 0.7397 | 0.6974 |
| 0.0032 | 0.912 | 4560 | 2.0435 | 0.7876 | 0.7064 | 0.7432 | 0.6957 |
| 0.0242 | 0.916 | 4580 | 2.0097 | 0.7940 | 0.7132 | 0.7450 | 0.7050 |
| 0.0208 | 0.92 | 4600 | 1.9747 | 0.7938 | 0.7138 | 0.7397 | 0.7071 |
| 0.0114 | 0.924 | 4620 | 2.0074 | 0.7936 | 0.7126 | 0.7445 | 0.7043 |
| 0.0188 | 0.928 | 4640 | 2.0167 | 0.7940 | 0.7123 | 0.7462 | 0.7040 |
| 0.0057 | 0.932 | 4660 | 2.0379 | 0.7908 | 0.7088 | 0.7467 | 0.6996 |
| 0.0103 | 0.936 | 4680 | 2.0309 | 0.7934 | 0.7111 | 0.7479 | 0.7024 |
| 0.0005 | 0.94 | 4700 | 2.0406 | 0.7908 | 0.7085 | 0.7466 | 0.6994 |
| 0.0066 | 0.944 | 4720 | 2.0348 | 0.7925 | 0.7104 | 0.7468 | 0.7009 |
| 0.0199 | 0.948 | 4740 | 2.0125 | 0.7942 | 0.7127 | 0.7456 | 0.7041 |
| 0.0046 | 0.952 | 4760 | 2.0125 | 0.7944 | 0.7132 | 0.7458 | 0.7045 |
| 0.0155 | 0.956 | 4780 | 2.0372 | 0.7919 | 0.7098 | 0.7461 | 0.7004 |
| 0.0126 | 0.96 | 4800 | 2.0294 | 0.7927 | 0.7109 | 0.7460 | 0.7017 |
| 0.0065 | 0.964 | 4820 | 2.0284 | 0.7942 | 0.7122 | 0.7461 | 0.7041 |
| 0.0148 | 0.968 | 4840 | 2.0355 | 0.7929 | 0.7105 | 0.7462 | 0.7022 |
| 0.0133 | 0.972 | 4860 | 2.0427 | 0.7921 | 0.7092 | 0.7459 | 0.7002 |
| 0.0021 | 0.976 | 4880 | 2.0548 | 0.7904 | 0.7074 | 0.7454 | 0.6980 |
| 0.0001 | 0.98 | 4900 | 2.0543 | 0.7906 | 0.7076 | 0.7455 | 0.6983 |
| 0.008 | 0.984 | 4920 | 2.0467 | 0.7919 | 0.7090 | 0.7459 | 0.6999 |
| 0.0045 | 0.988 | 4940 | 2.0489 | 0.7917 | 0.7087 | 0.7459 | 0.6996 |
| 0.0225 | 0.992 | 4960 | 2.0517 | 0.7904 | 0.7077 | 0.7453 | 0.6985 |
| 0.0057 | 0.996 | 4980 | 2.0513 | 0.7904 | 0.7080 | 0.7454 | 0.6989 |
| 0.0097 | 1.0 | 5000 | 2.0511 | 0.7904 | 0.7080 | 0.7454 | 0.6989 |
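
The card does not record how accuracy, F1, precision, and recall were computed. Since F1 trails accuracy throughout, a macro average over classes is a plausible guess; the compute_metrics sketch below reproduces that setup, with the averaging mode explicitly an assumption:

```python
# Hedged sketch of a compute_metrics function that could produce the
# columns above; "macro" averaging is an assumption, not stated in the card.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```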

Framework versions

  • Transformers 4.43.3
  • Pytorch 2.4.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1