---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-base-patch16-224-ve-U13b-80RX1
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.10869565217391304
---

# vit-base-patch16-224-ve-U13b-80RX1

This model is a fine-tuned version of google/vit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 25872499347325405328572416.0000
- Accuracy: 0.1087
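
The snippet below is a minimal inference sketch for an image-classification checkpoint like this one. The repository id `Augusto777/vit-base-patch16-224-ve-U13b-80RX1` and the input file name are assumptions, not confirmed by this card; substitute the actual location of the weights.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repository id for this checkpoint; adjust to where the weights are hosted.
repo_id = "Augusto777/vit-base-patch16-224-ve-U13b-80RX1"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# Placeholder input image; any RGB image works.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```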

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 5.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
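
For reference, here is a minimal sketch of how these settings could be expressed with the 🤗 Transformers `TrainingArguments`. The `output_dir` and the per-epoch evaluation strategy are assumptions added for illustration; the remaining values mirror the list above.

```python
from transformers import TrainingArguments

# Sketch only: output_dir and evaluation_strategy are assumptions,
# the other values are taken directly from the hyperparameter list.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-ve-U13b-80RX1",
    learning_rate=5.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=40,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch results table below
)
```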

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 21407918734188223332876288.0000 | 1.0 | 103 | 25872499347325405328572416.0000 | 0.1087 |
| 19230841377306649816989696.0000 | 2.0 | 206 | 25872499347325405328572416.0000 | 0.1087 |
| 22859301179210058793222144.0000 | 3.0 | 309 | 25872499347325405328572416.0000 | 0.1087 |
| 23584992401720978670878720.0000 | 4.0 | 412 | 25872499347325405328572416.0000 | 0.1087 |
| 24310687313580712431452160.0000 | 5.0 | 515 | 25872499347325405328572416.0000 | 0.1087 |
| 24310687313580712431452160.0000 | 6.0 | 618 | 25872499347325405328572416.0000 | 0.1087 |
| 22496457412629005795852288.0000 | 7.0 | 721 | 25872499347325405328572416.0000 | 0.1087 |
| 21045071278258356452589568.0000 | 8.0 | 824 | 25872499347325405328572416.0000 | 0.1087 |
| 21045071278258356452589568.0000 | 9.0 | 927 | 25872499347325405328572416.0000 | 0.1087 |
| 23343098402008016231596032.0000 | 10.0 | 1030 | 25872499347325405328572416.0000 | 0.1087 |
| 23222148635139925673508864.0000 | 11.0 | 1133 | 25872499347325405328572416.0000 | 0.1087 |
| 23222150479814334762450944.0000 | 12.0 | 1236 | 25872499347325405328572416.0000 | 0.1087 |
| 21407918734188223332876288.0000 | 13.0 | 1339 | 25872499347325405328572416.0000 | 0.1087 |
| 21407916889513814243934208.0000 | 14.0 | 1442 | 25872499347325405328572416.0000 | 0.1087 |
| 21770764345443681124220928.0000 | 15.0 | 1545 | 25872499347325405328572416.0000 | 0.1087 |
| 22496455567954601001877504.0000 | 16.0 | 1648 | 25872499347325405328572416.0000 | 0.1087 |
| 22859303023884467882164224.0000 | 17.0 | 1751 | 25872499347325405328572416.0000 | 0.1087 |
| 19593686988562107608334336.0000 | 18.0 | 1854 | 25872499347325405328572416.0000 | 0.1087 |
| 22859304868558872676139008.0000 | 19.0 | 1957 | 25872499347325405328572416.0000 | 0.1087 |
| 21528866656381904802021376.0000 | 20.0 | 2060 | 25872499347325405328572416.0000 | 0.1087 |
| 17053764020425078448586752.0000 | 21.0 | 2163 | 25872499347325405328572416.0000 | 0.1087 |
| 22133609956699138915565568.0000 | 22.0 | 2266 | 25872499347325405328572416.0000 | 0.1087 |
| 21045074967607170335506432.0000 | 23.0 | 2369 | 25872499347325405328572416.0000 | 0.1087 |
| 21407915044839405154992128.0000 | 24.0 | 2472 | 25872499347325405328572416.0000 | 0.1087 |
| 21770762500769272035278848.0000 | 25.0 | 2575 | 25872499347325405328572416.0000 | 0.1087 |
| 23947841702325254640107520.0000 | 26.0 | 2678 | 25872499347325405328572416.0000 | 0.1087 |
| 21045071278258356452589568.0000 | 27.0 | 2781 | 25872499347325405328572416.0000 | 0.1087 |
| 21770762500769272035278848.0000 | 28.0 | 2884 | 25872499347325405328572416.0000 | 0.1087 |
| 21407918734188223332876288.0000 | 29.0 | 2987 | 25872499347325405328572416.0000 | 0.1087 |
| 21528866656381904802021376.0000 | 30.0 | 3090 | 25872499347325405328572416.0000 | 0.1087 |
| 21045073122932761246564352.0000 | 31.0 | 3193 | 25872499347325405328572416.0000 | 0.1087 |
| 23584994246395387759820800.0000 | 32.0 | 3296 | 25872499347325405328572416.0000 | 0.1087 |
| 21045069433583947363647488.0000 | 33.0 | 3399 | 25872499347325405328572416.0000 | 0.1087 |
| 22859304868558872676139008.0000 | 34.0 | 3502 | 25872499347325405328572416.0000 | 0.1087 |
| 21407920578862628126851072.0000 | 35.0 | 3605 | 25872499347325405328572416.0000 | 0.1087 |
| 21045074967607170335506432.0000 | 36.0 | 3708 | 25872499347325405328572416.0000 | 0.1087 |
| 21770764345443681124220928.0000 | 37.0 | 3811 | 25872499347325405328572416.0000 | 0.1087 |
| 22496457412629005795852288.0000 | 38.0 | 3914 | 25872499347325405328572416.0000 | 0.1087 |
| 21407918734188223332876288.0000 | 39.0 | 4017 | 25872499347325405328572416.0000 | 0.1087 |
| 23222148635139925673508864.0000 | 40.0 | 4120 | 25872499347325405328572416.0000 | 0.1087 |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0