# daigram_detr_r50_albumentations
This model is a fine-tuned version of facebook/detr-resnet-50 on the bpmn-shapes dataset. It achieves the following results on the evaluation set:
- Loss: 1.0088
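
The card does not ship a usage snippet, so here is a minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub and loadable with the standard DETR classes. The repo id, image path, and score threshold are placeholders, not values from this card.

```python
# Minimal inference sketch (not part of the original card).
import torch
from PIL import Image
from transformers import DetrImageProcessor, DetrForObjectDetection

repo_id = "your-username/daigram_detr_r50_albumentations"  # hypothetical repo id
processor = DetrImageProcessor.from_pretrained(repo_id)
model = DetrForObjectDetection.from_pretrained(repo_id)

image = Image.open("bpmn_diagram.png").convert("RGB")  # any BPMN diagram image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples above a threshold.
target_sizes = torch.tensor([image.size[::-1]])  # PIL gives (width, height); we need (height, width)
results = processor.post_process_object_detection(
    outputs, target_sizes=target_sizes, threshold=0.5
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```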
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 500
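
The sketch below maps the listed hyperparameters onto `TrainingArguments`. Only the values from the list above come from the card; the output directory, evaluation/logging cadence, and the remaining flags are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="daigram_detr_r50_albumentations",  # hypothetical output dir
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=500,
    evaluation_strategy="steps",   # assumed: the results table reports validation loss every 50 steps
    eval_steps=50,
    logging_steps=50,
    save_total_limit=2,            # assumed housekeeping setting
    remove_unused_columns=False,   # commonly required for DETR-style image/annotation inputs
)
```

The optimizer line in the list corresponds to the Trainer's default Adam settings (betas (0.9, 0.999), epsilon 1e-08), so no explicit optimizer configuration is shown.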
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
3.8163 | 2.63 | 50 | 3.0660 |
2.9036 | 5.26 | 100 | 2.8878 |
2.7516 | 7.89 | 150 | 2.8043 |
2.6278 | 10.53 | 200 | 2.6820 |
2.4806 | 13.16 | 250 | 2.5676 |
2.3781 | 15.79 | 300 | 2.4282 |
2.253 | 18.42 | 350 | 2.3161 |
2.1405 | 21.05 | 400 | 2.1735 |
2.0263 | 23.68 | 450 | 2.0909 |
1.9732 | 26.32 | 500 | 2.0120 |
1.8647 | 28.95 | 550 | 1.9260 |
1.7793 | 31.58 | 600 | 1.8655 |
1.7706 | 34.21 | 650 | 1.8166 |
1.6792 | 36.84 | 700 | 1.7325 |
1.5654 | 39.47 | 750 | 1.7061 |
1.5802 | 42.11 | 800 | 1.6463 |
1.5053 | 44.74 | 850 | 1.5985 |
1.4858 | 47.37 | 900 | 1.6060 |
1.4186 | 50.0 | 950 | 1.5563 |
1.4391 | 52.63 | 1000 | 1.5219 |
1.3938 | 55.26 | 1050 | 1.4995 |
1.3734 | 57.89 | 1100 | 1.4661 |
1.3379 | 60.53 | 1150 | 1.4451 |
1.341 | 63.16 | 1200 | 1.4854 |
1.3647 | 65.79 | 1250 | 1.4509 |
1.3198 | 68.42 | 1300 | 1.4116 |
1.3054 | 71.05 | 1350 | 1.3821 |
1.2945 | 73.68 | 1400 | 1.3952 |
1.2899 | 76.32 | 1450 | 1.3868 |
1.2533 | 78.95 | 1500 | 1.3580 |
1.2655 | 81.58 | 1550 | 1.3374 |
1.2649 | 84.21 | 1600 | 1.3451 |
1.2286 | 86.84 | 1650 | 1.2973 |
1.2497 | 89.47 | 1700 | 1.3322 |
1.2456 | 92.11 | 1750 | 1.3289 |
1.2234 | 94.74 | 1800 | 1.3080 |
1.1695 | 97.37 | 1850 | 1.3218 |
1.2265 | 100.0 | 1900 | 1.3280 |
1.1899 | 102.63 | 1950 | 1.2834 |
1.1914 | 105.26 | 2000 | 1.2931 |
1.1698 | 107.89 | 2050 | 1.3176 |
1.177 | 110.53 | 2100 | 1.2896 |
1.1625 | 113.16 | 2150 | 1.2936 |
1.1626 | 115.79 | 2200 | 1.2614 |
1.1698 | 118.42 | 2250 | 1.2545 |
1.1703 | 121.05 | 2300 | 1.2398 |
1.1659 | 123.68 | 2350 | 1.2254 |
1.1734 | 126.32 | 2400 | 1.2489 |
1.1234 | 128.95 | 2450 | 1.2072 |
1.1464 | 131.58 | 2500 | 1.1707 |
1.1268 | 134.21 | 2550 | 1.1971 |
1.1511 | 136.84 | 2600 | 1.2247 |
1.1234 | 139.47 | 2650 | 1.1921 |
1.0923 | 142.11 | 2700 | 1.1751 |
1.1267 | 144.74 | 2750 | 1.1905 |
1.1021 | 147.37 | 2800 | 1.1885 |
1.1075 | 150.0 | 2850 | 1.1780 |
1.1116 | 152.63 | 2900 | 1.1666 |
1.0987 | 155.26 | 2950 | 1.1694 |
1.0974 | 157.89 | 3000 | 1.1931 |
1.0867 | 160.53 | 3050 | 1.1461 |
1.1076 | 163.16 | 3100 | 1.1501 |
1.0912 | 165.79 | 3150 | 1.1611 |
1.0671 | 168.42 | 3200 | 1.1718 |
1.0981 | 171.05 | 3250 | 1.1961 |
1.0602 | 173.68 | 3300 | 1.1786 |
1.0305 | 176.32 | 3350 | 1.1640 |
1.0647 | 178.95 | 3400 | 1.1416 |
1.0628 | 181.58 | 3450 | 1.1296 |
1.0856 | 184.21 | 3500 | 1.1140 |
1.0626 | 186.84 | 3550 | 1.1214 |
1.0782 | 189.47 | 3600 | 1.1449 |
1.0601 | 192.11 | 3650 | 1.1441 |
1.0906 | 194.74 | 3700 | 1.1396 |
1.0376 | 197.37 | 3750 | 1.1271 |
1.0625 | 200.0 | 3800 | 1.1397 |
1.057 | 202.63 | 3850 | 1.1121 |
1.0448 | 205.26 | 3900 | 1.1376 |
1.0747 | 207.89 | 3950 | 1.1475 |
1.0605 | 210.53 | 4000 | 1.0916 |
1.0344 | 213.16 | 4050 | 1.1001 |
1.0443 | 215.79 | 4100 | 1.0976 |
1.0202 | 218.42 | 4150 | 1.1240 |
1.078 | 221.05 | 4200 | 1.1024 |
1.0251 | 223.68 | 4250 | 1.0793 |
1.0353 | 226.32 | 4300 | 1.1153 |
1.0047 | 228.95 | 4350 | 1.0972 |
1.0143 | 231.58 | 4400 | 1.0948 |
1.0172 | 234.21 | 4450 | 1.1265 |
1.0299 | 236.84 | 4500 | 1.1038 |
0.9968 | 239.47 | 4550 | 1.0901 |
1.0233 | 242.11 | 4600 | 1.0945 |
0.9943 | 244.74 | 4650 | 1.0918 |
1.0321 | 247.37 | 4700 | 1.1270 |
1.0113 | 250.0 | 4750 | 1.1060 |
1.0229 | 252.63 | 4800 | 1.0859 |
0.9945 | 255.26 | 4850 | 1.0875 |
1.0073 | 257.89 | 4900 | 1.0976 |
1.0096 | 260.53 | 4950 | 1.0933 |
1.0 | 263.16 | 5000 | 1.0821 |
1.0326 | 265.79 | 5050 | 1.0747 |
0.997 | 268.42 | 5100 | 1.0931 |
1.0056 | 271.05 | 5150 | 1.0853 |
0.9858 | 273.68 | 5200 | 1.0945 |
1.0005 | 276.32 | 5250 | 1.0669 |
1.0217 | 278.95 | 5300 | 1.0497 |
0.9777 | 281.58 | 5350 | 1.0672 |
0.9888 | 284.21 | 5400 | 1.0844 |
0.9662 | 286.84 | 5450 | 1.0524 |
1.0029 | 289.47 | 5500 | 1.0519 |
0.984 | 292.11 | 5550 | 1.0538 |
0.9724 | 294.74 | 5600 | 1.0524 |
0.991 | 297.37 | 5650 | 1.0553 |
0.9936 | 300.0 | 5700 | 1.0601 |
0.9817 | 302.63 | 5750 | 1.0524 |
0.9868 | 305.26 | 5800 | 1.0644 |
0.9982 | 307.89 | 5850 | 1.0523 |
0.9814 | 310.53 | 5900 | 1.0611 |
0.9761 | 313.16 | 5950 | 1.0505 |
0.9507 | 315.79 | 6000 | 1.0361 |
0.9786 | 318.42 | 6050 | 1.0275 |
0.9684 | 321.05 | 6100 | 1.0292 |
0.9759 | 323.68 | 6150 | 1.0529 |
0.9442 | 326.32 | 6200 | 1.0689 |
0.9653 | 328.95 | 6250 | 1.0696 |
0.9579 | 331.58 | 6300 | 1.0572 |
1.0016 | 334.21 | 6350 | 1.0660 |
0.9462 | 336.84 | 6400 | 1.0525 |
0.9596 | 339.47 | 6450 | 1.0505 |
0.9655 | 342.11 | 6500 | 1.0514 |
0.9713 | 344.74 | 6550 | 1.0616 |
0.952 | 347.37 | 6600 | 1.0497 |
0.9433 | 350.0 | 6650 | 1.0389 |
0.9619 | 352.63 | 6700 | 1.0404 |
0.9594 | 355.26 | 6750 | 1.0332 |
0.9586 | 357.89 | 6800 | 1.0323 |
0.9582 | 360.53 | 6850 | 1.0294 |
0.9437 | 363.16 | 6900 | 1.0329 |
0.9585 | 365.79 | 6950 | 1.0361 |
0.9661 | 368.42 | 7000 | 1.0428 |
0.9603 | 371.05 | 7050 | 1.0299 |
0.9619 | 373.68 | 7100 | 1.0416 |
0.9766 | 376.32 | 7150 | 1.0471 |
0.9547 | 378.95 | 7200 | 1.0498 |
0.967 | 381.58 | 7250 | 1.0318 |
0.9463 | 384.21 | 7300 | 1.0238 |
0.9531 | 386.84 | 7350 | 1.0329 |
0.9342 | 389.47 | 7400 | 1.0354 |
0.939 | 392.11 | 7450 | 1.0312 |
0.9635 | 394.74 | 7500 | 1.0325 |
0.9261 | 397.37 | 7550 | 1.0245 |
0.962 | 400.0 | 7600 | 1.0381 |
0.9385 | 402.63 | 7650 | 1.0243 |
0.9422 | 405.26 | 7700 | 1.0235 |
0.9285 | 407.89 | 7750 | 1.0286 |
0.9598 | 410.53 | 7800 | 1.0353 |
0.9529 | 413.16 | 7850 | 1.0361 |
0.928 | 415.79 | 7900 | 1.0316 |
0.935 | 418.42 | 7950 | 1.0263 |
0.9456 | 421.05 | 8000 | 1.0368 |
0.9387 | 423.68 | 8050 | 1.0440 |
0.9321 | 426.32 | 8100 | 1.0440 |
0.9236 | 428.95 | 8150 | 1.0394 |
0.9448 | 431.58 | 8200 | 1.0467 |
0.9151 | 434.21 | 8250 | 1.0516 |
0.9373 | 436.84 | 8300 | 1.0383 |
0.9577 | 439.47 | 8350 | 1.0190 |
0.9199 | 442.11 | 8400 | 1.0215 |
0.9321 | 444.74 | 8450 | 1.0184 |
0.9387 | 447.37 | 8500 | 1.0236 |
0.9382 | 450.0 | 8550 | 1.0259 |
0.9391 | 452.63 | 8600 | 1.0282 |
0.9392 | 455.26 | 8650 | 1.0193 |
0.9438 | 457.89 | 8700 | 1.0124 |
0.9398 | 460.53 | 8750 | 1.0060 |
0.9246 | 463.16 | 8800 | 1.0140 |
0.9383 | 465.79 | 8850 | 1.0145 |
0.9267 | 468.42 | 8900 | 1.0122 |
0.9253 | 471.05 | 8950 | 1.0144 |
0.9238 | 473.68 | 9000 | 1.0065 |
0.9082 | 476.32 | 9050 | 1.0136 |
0.9287 | 478.95 | 9100 | 1.0120 |
0.9161 | 481.58 | 9150 | 1.0120 |
0.9093 | 484.21 | 9200 | 1.0128 |
0.9264 | 486.84 | 9250 | 1.0125 |
0.9487 | 489.47 | 9300 | 1.0131 |
0.9398 | 492.11 | 9350 | 1.0101 |
0.9039 | 494.74 | 9400 | 1.0090 |
0.908 | 497.37 | 9450 | 1.0097 |
0.944 | 500.0 | 9500 | 1.0088 |
### Framework versions
- Transformers 4.34.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1