Augusto777 committed
Commit dbb94b2
1 Parent(s): 3d2d094

Model save

README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
     metrics:
     - name: Accuracy
       type: accuracy
-      value: 0.10869565217391304
+      value: 0.717391304347826
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 25872499347325405328572416.0000
-- Accuracy: 0.1087
+- Loss: 25847261397527471458109882368.0000
+- Accuracy: 0.7174
 
 ## Model description
 
@@ -56,6 +56,8 @@ The following hyperparameters were used during training:
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 16
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.05
@@ -63,48 +65,48 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Accuracy |
-|:-------------------------------:|:-----:|:----:|:-------------------------------:|:--------:|
-| 21407918734188223332876288.0000 | 1.0 | 103 | 25872499347325405328572416.0000 | 0.1087 |
-| 19230841377306649816989696.0000 | 2.0 | 206 | 25872499347325405328572416.0000 | 0.1087 |
-| 22859301179210058793222144.0000 | 3.0 | 309 | 25872499347325405328572416.0000 | 0.1087 |
-| 23584992401720978670878720.0000 | 4.0 | 412 | 25872499347325405328572416.0000 | 0.1087 |
-| 24310687313580712431452160.0000 | 5.0 | 515 | 25872499347325405328572416.0000 | 0.1087 |
-| 24310687313580712431452160.0000 | 6.0 | 618 | 25872499347325405328572416.0000 | 0.1087 |
-| 22496457412629005795852288.0000 | 7.0 | 721 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045071278258356452589568.0000 | 8.0 | 824 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045071278258356452589568.0000 | 9.0 | 927 | 25872499347325405328572416.0000 | 0.1087 |
-| 23343098402008016231596032.0000 | 10.0 | 1030 | 25872499347325405328572416.0000 | 0.1087 |
-| 23222148635139925673508864.0000 | 11.0 | 1133 | 25872499347325405328572416.0000 | 0.1087 |
-| 23222150479814334762450944.0000 | 12.0 | 1236 | 25872499347325405328572416.0000 | 0.1087 |
-| 21407918734188223332876288.0000 | 13.0 | 1339 | 25872499347325405328572416.0000 | 0.1087 |
-| 21407916889513814243934208.0000 | 14.0 | 1442 | 25872499347325405328572416.0000 | 0.1087 |
-| 21770764345443681124220928.0000 | 15.0 | 1545 | 25872499347325405328572416.0000 | 0.1087 |
-| 22496455567954601001877504.0000 | 16.0 | 1648 | 25872499347325405328572416.0000 | 0.1087 |
-| 22859303023884467882164224.0000 | 17.0 | 1751 | 25872499347325405328572416.0000 | 0.1087 |
-| 19593686988562107608334336.0000 | 18.0 | 1854 | 25872499347325405328572416.0000 | 0.1087 |
-| 22859304868558872676139008.0000 | 19.0 | 1957 | 25872499347325405328572416.0000 | 0.1087 |
-| 21528866656381904802021376.0000 | 20.0 | 2060 | 25872499347325405328572416.0000 | 0.1087 |
-| 17053764020425078448586752.0000 | 21.0 | 2163 | 25872499347325405328572416.0000 | 0.1087 |
-| 22133609956699138915565568.0000 | 22.0 | 2266 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045074967607170335506432.0000 | 23.0 | 2369 | 25872499347325405328572416.0000 | 0.1087 |
-| 21407915044839405154992128.0000 | 24.0 | 2472 | 25872499347325405328572416.0000 | 0.1087 |
-| 21770762500769272035278848.0000 | 25.0 | 2575 | 25872499347325405328572416.0000 | 0.1087 |
-| 23947841702325254640107520.0000 | 26.0 | 2678 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045071278258356452589568.0000 | 27.0 | 2781 | 25872499347325405328572416.0000 | 0.1087 |
-| 21770762500769272035278848.0000 | 28.0 | 2884 | 25872499347325405328572416.0000 | 0.1087 |
-| 21407918734188223332876288.0000 | 29.0 | 2987 | 25872499347325405328572416.0000 | 0.1087 |
-| 21528866656381904802021376.0000 | 30.0 | 3090 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045073122932761246564352.0000 | 31.0 | 3193 | 25872499347325405328572416.0000 | 0.1087 |
-| 23584994246395387759820800.0000 | 32.0 | 3296 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045069433583947363647488.0000 | 33.0 | 3399 | 25872499347325405328572416.0000 | 0.1087 |
-| 22859304868558872676139008.0000 | 34.0 | 3502 | 25872499347325405328572416.0000 | 0.1087 |
-| 21407920578862628126851072.0000 | 35.0 | 3605 | 25872499347325405328572416.0000 | 0.1087 |
-| 21045074967607170335506432.0000 | 36.0 | 3708 | 25872499347325405328572416.0000 | 0.1087 |
-| 21770764345443681124220928.0000 | 37.0 | 3811 | 25872499347325405328572416.0000 | 0.1087 |
-| 22496457412629005795852288.0000 | 38.0 | 3914 | 25872499347325405328572416.0000 | 0.1087 |
-| 21407918734188223332876288.0000 | 39.0 | 4017 | 25872499347325405328572416.0000 | 0.1087 |
-| 23222148635139925673508864.0000 | 40.0 | 4120 | 25872499347325405328572416.0000 | 0.1087 |
+| Training Loss | Epoch | Step | Validation Loss | Accuracy |
+|:----------------------------------:|:-----:|:----:|:----------------------------------:|:--------:|
+| 62421131753362939742087282688.0000 | 0.99 | 51 | 25847261397527471458109882368.0000 | 0.3696 |
+| 68366009359745128248184406016.0000 | 2.0 | 103 | 25847261397527471458109882368.0000 | 0.5435 |
+| 56476265480660303086264254464.0000 | 2.99 | 154 | 25847261397527471458109882368.0000 | 0.7174 |
+| 59448700505958211923205947392.0000 | 4.0 | 206 | 25847261397527471458109882368.0000 | 0.7391 |
+| 57962481104362666995704922112.0000 | 4.99 | 257 | 25847261397527471458109882368.0000 | 0.7391 |
+| 56476269258553484104324612096.0000 | 6.0 | 309 | 25847261397527471458109882368.0000 | 0.7174 |
+| 60934916129660575832646615040.0000 | 6.99 | 360 | 25847261397527471458109882368.0000 | 0.8043 |
+| 69852217427661121325411336192.0000 | 8.0 | 412 | 25847261397527471458109882368.0000 | 0.7174 |
+| 59448704283851401737359327232.0000 | 8.99 | 463 | 25847261397527471458109882368.0000 | 0.7609 |
+| 81741965084639127505391845376.0000 | 10.0 | 515 | 25847261397527471458109882368.0000 | 0.7609 |
+| 57962484882255848013765279744.0000 | 10.99 | 566 | 25847261397527471458109882368.0000 | 0.7609 |
+| 57962473548576287367398162432.0000 | 12.0 | 618 | 25847261397527471458109882368.0000 | 0.8043 |
+| 57962484882255848013765279744.0000 | 12.99 | 669 | 25847261397527471458109882368.0000 | 0.7391 |
+| 60934916129660575832646615040.0000 | 14.0 | 721 | 25847261397527471458109882368.0000 | 0.8043 |
+| 57962477326469477181551542272.0000 | 14.99 | 772 | 25847261397527471458109882368.0000 | 0.7826 |
+| 50531391652171295598227488768.0000 | 16.0 | 824 | 25847261397527471458109882368.0000 | 0.7826 |
+| 57962481104362666995704922112.0000 | 16.99 | 875 | 25847261397527471458109882368.0000 | 0.7174 |
+| 47558960404766567779346153472.0000 | 18.0 | 927 | 25847261397527471458109882368.0000 | 0.7174 |
+| 68366001803958757415970668544.0000 | 18.99 | 978 | 25847261397527471458109882368.0000 | 0.7826 |
+| 60439517218248428756995670016.0000 | 20.0 | 1030 | 25847261397527471458109882368.0000 | 0.7174 |
+| 75797095034043309831508459520.0000 | 20.99 | 1081 | 25847261397527471458109882368.0000 | 0.7391 |
+| 59448700505958211923205947392.0000 | 22.0 | 1133 | 25847261397527471458109882368.0000 | 0.7609 |
+| 63907354932851674483741687808.0000 | 22.99 | 1184 | 25847261397527471458109882368.0000 | 0.7609 |
+| 78769533837234408482603532288.0000 | 24.0 | 1236 | 25847261397527471458109882368.0000 | 0.7609 |
+| 66879786180256393506530000896.0000 | 24.99 | 1287 | 25847261397527471458109882368.0000 | 0.8043 |
+| 56476269258553484104324612096.0000 | 26.0 | 1339 | 25847261397527471458109882368.0000 | 0.7609 |
+| 66879786180256393506530000896.0000 | 26.99 | 1390 | 25847261397527471458109882368.0000 | 0.7609 |
+| 60934919907553756850706972672.0000 | 28.0 | 1442 | 25847261397527471458109882368.0000 | 0.7174 |
+| 54990046079064749362670206976.0000 | 28.99 | 1493 | 25847261397527471458109882368.0000 | 0.7174 |
+| 69852232539233862989838811136.0000 | 30.0 | 1545 | 25847261397527471458109882368.0000 | 0.7826 |
+| 71338440607149856067065741312.0000 | 30.99 | 1596 | 25847261397527471458109882368.0000 | 0.7609 |
+| 66879793736042764338743738368.0000 | 32.0 | 1648 | 25847261397527471458109882368.0000 | 0.7609 |
+| 44586525379468658942404460544.0000 | 32.99 | 1699 | 25847261397527471458109882368.0000 | 0.7391 |
+| 59448700505958211923205947392.0000 | 34.0 | 1751 | 25847261397527471458109882368.0000 | 0.7391 |
+| 63907347377065294855434928128.0000 | 34.99 | 1802 | 25847261397527471458109882368.0000 | 0.7391 |
+| 75797095034043309831508459520.0000 | 36.0 | 1854 | 25847261397527471458109882368.0000 | 0.7391 |
+| 62421135531256120760147640320.0000 | 36.99 | 1905 | 25847261397527471458109882368.0000 | 0.7174 |
+| 53503830455362394249322561536.0000 | 38.0 | 1957 | 25847261397527471458109882368.0000 | 0.7174 |
+| 56476265480660303086264254464.0000 | 38.99 | 2008 | 25847261397527471458109882368.0000 | 0.7174 |
+| 53503826677469204435169181696.0000 | 39.61 | 2040 | 25847261397527471458109882368.0000 | 0.7174 |
 
 
 ### Framework versions
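
The substantive change in this commit is gradient accumulation: with a per-device batch size of 8 and `gradient_accumulation_steps: 2`, the effective batch is 8 × 2 = 16, matching the new `total_train_batch_size: 16` line. It also explains the fractional epochs in the new table: the earlier run took 103 optimizer steps per epoch (8 samples each, ≈824 training images), while the new run takes ≈51.5 steps per epoch, so logged epoch boundaries alternate between x.99 and x.0 and training stops at step 2040 (epoch 39.61). A minimal sketch of how these hyperparameters would map onto `transformers.TrainingArguments`; `output_dir` and the learning rate are hypothetical placeholders, since neither value appears in this hunk:

```python
from transformers import TrainingArguments

# Sketch of the configuration this commit describes. output_dir is a
# placeholder; the learning rate is omitted because the hunk doesn't show it.
args = TrainingArguments(
    output_dir="vit-base-patch16-224-finetuned",  # placeholder
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=2,   # added in this commit
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,               # lr_scheduler_warmup_ratio: 0.05
    num_train_epochs=40,             # the results table runs to epoch ~40
)

# Effective batch size, reported in the card as total_train_batch_size:
assert args.per_device_train_batch_size * args.gradient_accumulation_steps == 16
```

The optimizer line (Adam with betas=(0.9,0.999) and epsilon=1e-08) matches the `TrainingArguments` defaults, so it needs no explicit arguments in this sketch.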
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:be4ba10430aa4877aaaebe6a61cd4a72ca924a6aef6724190dc7ec126cdeb749
+oid sha256:a889cc7da9e5a5c934a2c6fb7893131a2701e7d17aedaaab3a2a0fec5a1f470d
 size 343230128
runs/Jun23_15-37-09_DESKTOP-SKBE9FB/events.out.tfevents.1719178630.DESKTOP-SKBE9FB.2132.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:43227618e7cfb3e624216af33b8516a92644fe93ed332dca2b55bbb522af004f
-size 23769
+oid sha256:86b68944fb909c23f32d16748e53e343496cd2b48d92e21e7cb370cbdd8e6b96
+size 49764
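
Both binary files in this commit are tracked with Git LFS, so the diff only touches their three-line pointer files: the spec version, the SHA-256 of the stored blob (`oid`), and its size in bytes. `model.safetensors` keeps its exact size of 343230128 bytes, as expected when only the weight values change and not the architecture, while the TensorBoard event file grows from 23769 to 49764 bytes. A small sketch, assuming the real file (not just the pointer) has been pulled locally, for checking that a download matches the `oid` recorded in its pointer:

```python
import hashlib

def lfs_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file the way a Git LFS pointer records it: plain SHA-256 of the blob."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large weight files don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Should print the new oid from the pointer above:
# a889cc7da9e5a5c934a2c6fb7893131a2701e7d17aedaaab3a2a0fec5a1f470d
print(lfs_sha256("model.safetensors"))
```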