kematangan-pisang-vit-h-14-100eph-224-telyu

This model is a fine-tuned version of google/vit-huge-patch14-224-in21k on the kematangan pisang primer dataset (a primary banana-ripeness image dataset). It achieves the following results on the evaluation set at the final epoch:

  • Loss: 0.1179
  • Accuracy: 0.9760
  • F1: 0.9697
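
A minimal inference sketch (the local image file name below is a placeholder):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an image-classification pipeline.
classifier = pipeline(
    "image-classification",
    model="aryap2/kematangan-pisang-vit-h-14-100eph-224-telyu",
)

# "banana.jpg" is a placeholder path to a banana photo.
print(classifier("banana.jpg"))
```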

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
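
These map directly onto `TrainingArguments`. A minimal reproduction sketch, assuming a local ImageFolder copy of the dataset; the data directory, the `validation` split name, and the number of ripeness classes (4 here) are illustrative placeholders:

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

checkpoint = "google/vit-huge-patch14-224-in21k"
processor = AutoImageProcessor.from_pretrained(checkpoint)

# Placeholder path: expects train/validation splits of banana images by class.
dataset = load_dataset("imagefolder", data_dir="path/to/kematangan-pisang-primer")

def transform(batch):
    # Resize and normalize images to the 224x224 ViT input format.
    inputs = processor(batch["image"], return_tensors="pt")
    inputs["labels"] = batch["label"]
    return inputs

dataset = dataset.with_transform(transform)

def collate_fn(batch):
    return {
        "pixel_values": torch.stack([x["pixel_values"] for x in batch]),
        "labels": torch.tensor([x["labels"] for x in batch]),
    }

model = AutoModelForImageClassification.from_pretrained(
    checkpoint,
    num_labels=4,  # placeholder: set to the actual number of ripeness classes
)

args = TrainingArguments(
    output_dir="kematangan-pisang-vit-h-14-100eph-224-telyu",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",   # Adam betas/epsilon above are the defaults
    evaluation_strategy="epoch",
    remove_unused_columns=False,  # keep the "image" column for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    data_collator=collate_fn,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()
```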

Training results

| Training Loss | Epoch | Step  | Validation Loss | Recall | Specificity | Precision | NPV    | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:-----------:|:---------:|:------:|:--------:|:------:|
| No log        | 1.0   | 187   | 0.3757          | 0.8980 | 0.9768      | 0.9387    | 0.9798 | 0.9307   | 0.9066 |
| No log        | 2.0   | 374   | 0.2434          | 0.9469 | 0.9854      | 0.9464    | 0.9851 | 0.9547   | 0.9464 |
| 0.3226        | 3.0   | 561   | 0.1817          | 0.9616 | 0.9899      | 0.9593    | 0.9896 | 0.9680   | 0.9604 |
| 0.3226        | 4.0   | 748   | 0.1778          | 0.9436 | 0.9862      | 0.9498    | 0.9865 | 0.9573   | 0.9459 |
| 0.3226        | 5.0   | 935   | 0.1710          | 0.9395 | 0.9842      | 0.9490    | 0.9847 | 0.9520   | 0.9428 |
| 0.1007        | 6.0   | 1122  | 0.1100          | 0.9658 | 0.9907      | 0.9636    | 0.9904 | 0.9707   | 0.9646 |
| 0.1007        | 7.0   | 1309  | 0.1017          | 0.9689 | 0.9918      | 0.9640    | 0.9914 | 0.9733   | 0.9661 |
| 0.1007        | 8.0   | 1496  | 0.0911          | 0.9715 | 0.9926      | 0.9677    | 0.9923 | 0.9760   | 0.9694 |
| 0.0882        | 9.0   | 1683  | 0.2328          | 0.8882 | 0.9735      | 0.9223    | 0.9762 | 0.9200   | 0.8943 |
| 0.0882        | 10.0  | 1870  | 0.1409          | 0.9405 | 0.9851      | 0.9518    | 0.9857 | 0.9547   | 0.9442 |
| 0.0606        | 11.0  | 2057  | 0.1400          | 0.9388 | 0.9859      | 0.9570    | 0.9871 | 0.9573   | 0.9446 |
| 0.0606        | 12.0  | 2244  | 0.0992          | 0.9541 | 0.9889      | 0.9577    | 0.9890 | 0.9653   | 0.9556 |
| 0.0606        | 13.0  | 2431  | 0.0883          | 0.9747 | 0.9928      | 0.9669    | 0.9922 | 0.9760   | 0.9697 |
| 0.0495        | 14.0  | 2618  | 0.1169          | 0.9601 | 0.9878      | 0.9492    | 0.9869 | 0.9600   | 0.9521 |
| 0.0495        | 15.0  | 2805  | 0.1043          | 0.9547 | 0.9888      | 0.9595    | 0.9890 | 0.9653   | 0.9566 |
| 0.0495        | 16.0  | 2992  | 0.1056          | 0.9572 | 0.9903      | 0.9708    | 0.9912 | 0.9707   | 0.9624 |
| 0.0524        | 17.0  | 3179  | 0.2722          | 0.8751 | 0.9733      | 0.9335    | 0.9779 | 0.9200   | 0.8825 |
| 0.0524        | 18.0  | 3366  | 0.0712          | 0.9667 | 0.9923      | 0.9713    | 0.9926 | 0.9760   | 0.9688 |
| 0.0343        | 19.0  | 3553  | 0.0973          | 0.9576 | 0.9911      | 0.9762    | 0.9923 | 0.9733   | 0.9644 |
| 0.0343        | 20.0  | 3740  | 0.0733          | 0.9626 | 0.9905      | 0.9664    | 0.9906 | 0.9707   | 0.9642 |
| 0.0343        | 21.0  | 3927  | 0.0747          | 0.9789 | 0.9937      | 0.9701    | 0.9931 | 0.9787   | 0.9731 |
| 0.0359        | 22.0  | 4114  | 0.1891          | 0.9069 | 0.9794      | 0.9498    | 0.9825 | 0.9387   | 0.9164 |
| 0.0359        | 23.0  | 4301  | 0.0650          | 0.9773 | 0.9936      | 0.9703    | 0.9931 | 0.9787   | 0.9730 |
| 0.0359        | 24.0  | 4488  | 0.2470          | 0.8837 | 0.9741      | 0.9389    | 0.9781 | 0.9227   | 0.8924 |
| 0.0345        | 25.0  | 4675  | 0.0792          | 0.9619 | 0.9920      | 0.9784    | 0.9931 | 0.9760   | 0.9681 |
| 0.0345        | 26.0  | 4862  | 0.1752          | 0.9323 | 0.9848      | 0.9610    | 0.9867 | 0.9547   | 0.9407 |
| 0.0329        | 27.0  | 5049  | 0.0451          | 0.9801 | 0.9939      | 0.9789    | 0.9937 | 0.9813   | 0.9794 |
| 0.0329        | 28.0  | 5236  | 0.0707          | 0.9786 | 0.9924      | 0.9708    | 0.9917 | 0.9760   | 0.9740 |
| 0.0329        | 29.0  | 5423  | 0.1154          | 0.9594 | 0.9903      | 0.9706    | 0.9909 | 0.9707   | 0.9637 |
| 0.0206        | 30.0  | 5610  | 0.1213          | 0.9594 | 0.9903      | 0.9706    | 0.9909 | 0.9707   | 0.9637 |
| 0.0206        | 31.0  | 5797  | 0.0892          | 0.9652 | 0.9912      | 0.9708    | 0.9915 | 0.9733   | 0.9676 |
| 0.0206        | 32.0  | 5984  | 0.0765          | 0.9589 | 0.9895      | 0.9659    | 0.9898 | 0.9680   | 0.9617 |
| 0.0208        | 33.0  | 6171  | 0.1340          | 0.9598 | 0.9911      | 0.9759    | 0.9921 | 0.9733   | 0.9658 |
| 0.0208        | 34.0  | 6358  | 0.0987          | 0.9572 | 0.9903      | 0.9708    | 0.9912 | 0.9707   | 0.9624 |
| 0.0242        | 35.0  | 6545  | 0.1154          | 0.9572 | 0.9903      | 0.9708    | 0.9912 | 0.9707   | 0.9624 |
| 0.0242        | 36.0  | 6732  | 0.3260          | 0.8859 | 0.9741      | 0.9382    | 0.9778 | 0.9227   | 0.8946 |
| 0.0242        | 37.0  | 6919  | 0.1599          | 0.9557 | 0.9892      | 0.9710    | 0.9901 | 0.9680   | 0.9612 |
| 0.0218        | 38.0  | 7106  | 0.1251          | 0.9535 | 0.9893      | 0.9712    | 0.9904 | 0.9680   | 0.9598 |
| 0.0218        | 39.0  | 7293  | 0.2713          | 0.9112 | 0.9803      | 0.9516    | 0.9832 | 0.9413   | 0.9206 |
| 0.0218        | 40.0  | 7480  | 0.1112          | 0.9598 | 0.9911      | 0.9759    | 0.9921 | 0.9733   | 0.9658 |
| 0.0175        | 41.0  | 7667  | 0.0745          | 0.9695 | 0.9921      | 0.9734    | 0.9923 | 0.9760   | 0.9711 |
| 0.0175        | 42.0  | 7854  | 0.0876          | 0.9610 | 0.9904      | 0.9683    | 0.9908 | 0.9707   | 0.9640 |
| 0.0200        | 43.0  | 8041  | 0.0901          | 0.9588 | 0.9904      | 0.9683    | 0.9910 | 0.9707   | 0.9626 |
| 0.0200        | 44.0  | 8228  | 0.1402          | 0.9529 | 0.9894      | 0.9685    | 0.9904 | 0.9680   | 0.9587 |
| 0.0200        | 45.0  | 8415  | 0.1325          | 0.9572 | 0.9903      | 0.9708    | 0.9912 | 0.9707   | 0.9624 |
| 0.0163        | 46.0  | 8602  | 0.0987          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0163        | 47.0  | 8789  | 0.1056          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0163        | 48.0  | 8976  | 0.0599          | 0.9800 | 0.9948      | 0.9811    | 0.9949 | 0.9840   | 0.9805 |
| 0.0091        | 49.0  | 9163  | 0.0611          | 0.9705 | 0.9929      | 0.9780    | 0.9934 | 0.9787   | 0.9735 |
| 0.0091        | 50.0  | 9350  | 0.0606          | 0.9757 | 0.9939      | 0.9784    | 0.9941 | 0.9813   | 0.9770 |
| 0.0123        | 51.0  | 9537  | 0.0622          | 0.9816 | 0.9949      | 0.9797    | 0.9948 | 0.9840   | 0.9806 |
| 0.0123        | 52.0  | 9724  | 0.1036          | 0.9588 | 0.9904      | 0.9683    | 0.9910 | 0.9707   | 0.9626 |
| 0.0123        | 53.0  | 9911  | 0.0854          | 0.9715 | 0.9930      | 0.9758    | 0.9933 | 0.9787   | 0.9734 |
| 0.0076        | 54.0  | 10098 | 0.0600          | 0.9773 | 0.9940      | 0.9769    | 0.9940 | 0.9813   | 0.9771 |
| 0.0076        | 55.0  | 10285 | 0.1901          | 0.9429 | 0.9875      | 0.9674    | 0.9891 | 0.9627   | 0.9508 |
| 0.0076        | 56.0  | 10472 | 0.1208          | 0.9614 | 0.9912      | 0.9731    | 0.9919 | 0.9733   | 0.9660 |
| 0.0118        | 57.0  | 10659 | 0.2020          | 0.9386 | 0.9866      | 0.9654    | 0.9884 | 0.9600   | 0.9469 |
| 0.0118        | 58.0  | 10846 | 0.0568          | 0.9806 | 0.9942      | 0.9749    | 0.9938 | 0.9813   | 0.9774 |
| 0.0161        | 59.0  | 11033 | 0.0507          | 0.9806 | 0.9942      | 0.9749    | 0.9938 | 0.9813   | 0.9774 |
| 0.0161        | 60.0  | 11220 | 0.0830          | 0.9757 | 0.9939      | 0.9784    | 0.9941 | 0.9813   | 0.9770 |
| 0.0161        | 61.0  | 11407 | 0.0931          | 0.9757 | 0.9939      | 0.9784    | 0.9941 | 0.9813   | 0.9770 |
| 0.0086        | 62.0  | 11594 | 0.1000          | 0.9715 | 0.9930      | 0.9758    | 0.9933 | 0.9787   | 0.9734 |
| 0.0086        | 63.0  | 11781 | 0.1215          | 0.9588 | 0.9904      | 0.9683    | 0.9910 | 0.9707   | 0.9626 |
| 0.0086        | 64.0  | 11968 | 0.0695          | 0.9858 | 0.9958      | 0.9825    | 0.9956 | 0.9867   | 0.9841 |
| 0.0048        | 65.0  | 12155 | 0.0747          | 0.9752 | 0.9941      | 0.9767    | 0.9942 | 0.9813   | 0.9759 |
| 0.0048        | 66.0  | 12342 | 0.0733          | 0.9795 | 0.9940      | 0.9772    | 0.9938 | 0.9813   | 0.9783 |
| 0.0066        | 67.0  | 12529 | 0.1429          | 0.9551 | 0.9894      | 0.9683    | 0.9902 | 0.9680   | 0.9601 |
| 0.0066        | 68.0  | 12716 | 0.1146          | 0.9673 | 0.9922      | 0.9733    | 0.9925 | 0.9760   | 0.9699 |
| 0.0066        | 69.0  | 12903 | 0.1022          | 0.9737 | 0.9930      | 0.9760    | 0.9931 | 0.9787   | 0.9747 |
| 0.0068        | 70.0  | 13090 | 0.0850          | 0.9789 | 0.9941      | 0.9757    | 0.9939 | 0.9813   | 0.9772 |
| 0.0068        | 71.0  | 13277 | 0.1619          | 0.9509 | 0.9885      | 0.9660    | 0.9894 | 0.9653   | 0.9564 |
| 0.0068        | 72.0  | 13464 | 0.1334          | 0.9636 | 0.9911      | 0.9730    | 0.9917 | 0.9733   | 0.9674 |
| 0.0046        | 73.0  | 13651 | 0.1099          | 0.9636 | 0.9911      | 0.9730    | 0.9917 | 0.9733   | 0.9674 |
| 0.0046        | 74.0  | 13838 | 0.2110          | 0.9493 | 0.9884      | 0.9691    | 0.9896 | 0.9653   | 0.9561 |
| 0.0087        | 75.0  | 14025 | 0.1417          | 0.9572 | 0.9903      | 0.9708    | 0.9912 | 0.9707   | 0.9624 |
| 0.0087        | 76.0  | 14212 | 0.2051          | 0.9471 | 0.9884      | 0.9695    | 0.9899 | 0.9653   | 0.9546 |
| 0.0087        | 77.0  | 14399 | 0.1403          | 0.9641 | 0.9920      | 0.9781    | 0.9928 | 0.9760   | 0.9694 |
| 0.0061        | 78.0  | 14586 | 0.2232          | 0.9408 | 0.9866      | 0.9650    | 0.9882 | 0.9600   | 0.9485 |
| 0.0061        | 79.0  | 14773 | 0.0962          | 0.9757 | 0.9939      | 0.9784    | 0.9941 | 0.9813   | 0.9770 |
| 0.0061        | 80.0  | 14960 | 0.1113          | 0.9683 | 0.9928      | 0.9803    | 0.9936 | 0.9787   | 0.9731 |
| 0.0076        | 81.0  | 15147 | 0.1090          | 0.9683 | 0.9928      | 0.9803    | 0.9936 | 0.9787   | 0.9731 |
| 0.0076        | 82.0  | 15334 | 0.0826          | 0.9741 | 0.9938      | 0.9803    | 0.9942 | 0.9813   | 0.9768 |
| 0.0023        | 83.0  | 15521 | 0.0644          | 0.9832 | 0.9950      | 0.9787    | 0.9947 | 0.9840   | 0.9807 |
| 0.0023        | 84.0  | 15708 | 0.0605          | 0.9832 | 0.9950      | 0.9787    | 0.9947 | 0.9840   | 0.9807 |
| 0.0023        | 85.0  | 15895 | 0.1132          | 0.9594 | 0.9903      | 0.9706    | 0.9909 | 0.9707   | 0.9637 |
| 0.0061        | 86.0  | 16082 | 0.0592          | 0.9773 | 0.9940      | 0.9769    | 0.9940 | 0.9813   | 0.9771 |
| 0.0061        | 87.0  | 16269 | 0.1200          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0061        | 88.0  | 16456 | 0.1636          | 0.9578 | 0.9902      | 0.9734    | 0.9911 | 0.9707   | 0.9635 |
| 0.0011        | 89.0  | 16643 | 0.1011          | 0.9699 | 0.9929      | 0.9779    | 0.9934 | 0.9787   | 0.9733 |
| 0.0011        | 90.0  | 16830 | 0.1491          | 0.9598 | 0.9911      | 0.9759    | 0.9921 | 0.9733   | 0.9658 |
| 0.0012        | 91.0  | 17017 | 0.1201          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0012        | 92.0  | 17204 | 0.0848          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0012        | 93.0  | 17391 | 0.0739          | 0.9757 | 0.9939      | 0.9784    | 0.9941 | 0.9813   | 0.9770 |
| 0.0014        | 94.0  | 17578 | 0.0755          | 0.9757 | 0.9939      | 0.9784    | 0.9941 | 0.9813   | 0.9770 |
| 0.0014        | 95.0  | 17765 | 0.1235          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0014        | 96.0  | 17952 | 0.1216          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0017        | 97.0  | 18139 | 0.1223          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0017        | 98.0  | 18326 | 0.1069          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0035        | 99.0  | 18513 | 0.1174          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
| 0.0035        | 100.0 | 18700 | 0.1179          | 0.9657 | 0.9921      | 0.9755    | 0.9927 | 0.9760   | 0.9697 |
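
Recall, Specificity, Precision, and NPV above are multi-class summaries; the card does not state the averaging scheme, so the sketch below assumes macro averaging over per-class one-vs-rest confusion counts:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def one_vs_rest_metrics(y_true, y_pred):
    """Macro-averaged recall, specificity, precision, and NPV from
    per-class one-vs-rest counts of a multi-class confusion matrix."""
    cm = confusion_matrix(y_true, y_pred)
    tp = np.diag(cm).astype(float)
    fn = cm.sum(axis=1) - tp  # true class i, predicted as something else
    fp = cm.sum(axis=0) - tp  # predicted class i, actually something else
    tn = cm.sum() - tp - fn - fp
    return {
        "recall": np.mean(tp / (tp + fn)),
        "specificity": np.mean(tn / (tn + fp)),
        "precision": np.mean(tp / (tp + fp)),
        "npv": np.mean(tn / (tn + fn)),
    }

# Toy example with three ripeness classes encoded as 0/1/2.
print(one_vs_rest_metrics([0, 1, 2, 2, 1], [0, 1, 2, 1, 1]))
```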

Framework versions

  • Transformers 4.28.1
  • Pytorch 2.0.1+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3
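
A quick sanity check that a reproduction environment matches the versions above:

```python
# Verify installed versions against those this model was trained with.
import datasets
import tokenizers
import torch
import transformers

assert transformers.__version__ == "4.28.1"
assert torch.__version__ == "2.0.1+cu118"
assert datasets.__version__ == "2.12.0"
assert tokenizers.__version__ == "0.13.3"
```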