---
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
datasets:
- dsi
model-index:
- name: detr_finetuned_airdataset
results: []
---
# detr_finetuned_airdataset
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the dsi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8959
- Map: 0.3195
- Map 50: 0.7784
- Map 75: 0.1925
- Map Small: 0.3211
- Map Medium: 0.0079
- Map Large: -1.0
- Mar 1: 0.0256
- Mar 10: 0.1995
- Mar 100: 0.487
- Mar Small: 0.4896
- Mar Medium: 0.0061
- Mar Large: -1.0
- Map Falciparum Trophozoite: 0.3195
- Mar 100 Falciparum Trophozoite: 0.487
- Map Wbc: -1.0
- Mar 100 Wbc: -1.0

Values of -1.0 follow the COCO-evaluator convention for classes or object-size buckets with no ground-truth instances in the evaluation set (here, large objects and the wbc class).
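A minimal sketch of running inference with this checkpoint via the Transformers object-detection API follows; the local path `detr_finetuned_airdataset`, the input filename, and the 0.5 score threshold are assumptions, not part of the original card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumed local path / repo id of the fine-tuned checkpoint; adjust as needed.
checkpoint = "detr_finetuned_airdataset"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("blood_smear.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into per-image detections at an assumed 0.5 threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```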
## Model description

Conditional DETR is a transformer-based object detector that conditions its cross-attention on spatial reference points for faster training convergence; this checkpoint pairs it with a ResNet-50 backbone. It is fine-tuned here to detect two classes, falciparum trophozoite and wbc (white blood cell), which suggests a malaria blood-smear microscopy setting.
## Intended uses & limitations

The model is intended for detecting the two classes above in images similar to the dsi training data. Note the limitations visible in the evaluation numbers: the evaluation set contains no large objects and no wbc ground truth (both reported as -1.0), so performance on those is unvalidated, and mAP on medium objects is near zero (0.0079).
## Training and evaluation data

The model was trained and evaluated on the dsi dataset; the card does not describe the dataset further. Judging by the per-size metrics, the evaluation split consists almost entirely of small objects.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a `TrainingArguments` sketch mirroring them follows the list:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
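For reference, a minimal sketch of how these settings map onto `TrainingArguments` (the output directory is a placeholder, and any option not listed above is left at its default; the Adam settings shown in the list are the Transformers defaults):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="detr_finetuned_airdataset",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer setup.
)
```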
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Falciparum Trophozoite | Mar 100 Falciparum Trophozoite | Map Wbc | Mar 100 Wbc |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------------:|:------------------------------:|:-------:|:-----------:|
| No log | 1.0 | 209 | 1.2206 | 0.1424 | 0.401 | 0.07 | 0.1429 | 0.0328 | -1.0 | 0.0168 | 0.1239 | 0.4185 | 0.4204 | 0.0612 | -1.0 | 0.1424 | 0.4185 | -1.0 | -1.0 |
| No log | 2.0 | 418 | 1.1354 | 0.2136 | 0.585 | 0.1077 | 0.2145 | 0.0224 | -1.0 | 0.0212 | 0.1608 | 0.4102 | 0.4123 | 0.0224 | -1.0 | 0.2136 | 0.4102 | -1.0 | -1.0 |
| 1.3747 | 3.0 | 627 | 1.0729 | 0.2353 | 0.6428 | 0.1092 | 0.2365 | 0.0229 | -1.0 | 0.0216 | 0.1669 | 0.4247 | 0.4268 | 0.0245 | -1.0 | 0.2353 | 0.4247 | -1.0 | -1.0 |
| 1.3747 | 4.0 | 836 | 1.0260 | 0.2548 | 0.6701 | 0.1339 | 0.2563 | 0.0178 | -1.0 | 0.0234 | 0.1792 | 0.4424 | 0.4447 | 0.0163 | -1.0 | 0.2548 | 0.4424 | -1.0 | -1.0 |
| 1.0467 | 5.0 | 1045 | 1.0116 | 0.2576 | 0.6811 | 0.1321 | 0.2589 | 0.0208 | -1.0 | 0.0229 | 0.1773 | 0.4422 | 0.4445 | 0.0184 | -1.0 | 0.2576 | 0.4422 | -1.0 | -1.0 |
| 1.0467 | 6.0 | 1254 | 1.0150 | 0.2526 | 0.6842 | 0.1191 | 0.2537 | 0.0089 | -1.0 | 0.0226 | 0.1724 | 0.4463 | 0.4486 | 0.0082 | -1.0 | 0.2526 | 0.4463 | -1.0 | -1.0 |
| 1.0467 | 7.0 | 1463 | 0.9933 | 0.2627 | 0.699 | 0.1376 | 0.2639 | 0.0211 | -1.0 | 0.0215 | 0.1773 | 0.4458 | 0.4481 | 0.0224 | -1.0 | 0.2627 | 0.4458 | -1.0 | -1.0 |
| 0.9905 | 8.0 | 1672 | 0.9642 | 0.2797 | 0.7188 | 0.1511 | 0.2809 | 0.0112 | -1.0 | 0.0241 | 0.1858 | 0.459 | 0.4614 | 0.0143 | -1.0 | 0.2797 | 0.459 | -1.0 | -1.0 |
| 0.9905 | 9.0 | 1881 | 0.9641 | 0.2786 | 0.7209 | 0.1453 | 0.2803 | 0.0103 | -1.0 | 0.0231 | 0.1861 | 0.4534 | 0.4558 | 0.0102 | -1.0 | 0.2786 | 0.4534 | -1.0 | -1.0 |
| 0.955 | 10.0 | 2090 | 0.9869 | 0.2685 | 0.7158 | 0.1366 | 0.27 | 0.0023 | -1.0 | 0.0225 | 0.1789 | 0.4442 | 0.4465 | 0.0041 | -1.0 | 0.2685 | 0.4442 | -1.0 | -1.0 |
| 0.955 | 11.0 | 2299 | 0.9612 | 0.2837 | 0.7238 | 0.1534 | 0.2856 | 0.0067 | -1.0 | 0.0242 | 0.1878 | 0.4568 | 0.4592 | 0.0082 | -1.0 | 0.2837 | 0.4568 | -1.0 | -1.0 |
| 0.9248 | 12.0 | 2508 | 0.9437 | 0.2938 | 0.7368 | 0.1635 | 0.2954 | 0.005 | -1.0 | 0.0239 | 0.1882 | 0.4701 | 0.4727 | 0.0041 | -1.0 | 0.2938 | 0.4701 | -1.0 | -1.0 |
| 0.9248 | 13.0 | 2717 | 0.9390 | 0.289 | 0.7371 | 0.16 | 0.2903 | 0.0149 | -1.0 | 0.0254 | 0.191 | 0.4685 | 0.471 | 0.0122 | -1.0 | 0.289 | 0.4685 | -1.0 | -1.0 |
| 0.9248 | 14.0 | 2926 | 0.9321 | 0.2986 | 0.7428 | 0.1744 | 0.3002 | 0.005 | -1.0 | 0.0251 | 0.1928 | 0.4743 | 0.4768 | 0.0041 | -1.0 | 0.2986 | 0.4743 | -1.0 | -1.0 |
| 0.9027 | 15.0 | 3135 | 0.9448 | 0.2911 | 0.7418 | 0.1588 | 0.2924 | 0.0139 | -1.0 | 0.0241 | 0.1877 | 0.4678 | 0.4702 | 0.0122 | -1.0 | 0.2911 | 0.4678 | -1.0 | -1.0 |
| 0.9027 | 16.0 | 3344 | 0.9259 | 0.3033 | 0.7549 | 0.174 | 0.3047 | 0.005 | -1.0 | 0.0249 | 0.1931 | 0.4736 | 0.4762 | 0.0041 | -1.0 | 0.3033 | 0.4736 | -1.0 | -1.0 |
| 0.8725 | 17.0 | 3553 | 0.9200 | 0.3039 | 0.7554 | 0.1795 | 0.3055 | 0.0069 | -1.0 | 0.0259 | 0.1949 | 0.4764 | 0.479 | 0.0061 | -1.0 | 0.3039 | 0.4764 | -1.0 | -1.0 |
| 0.8725 | 18.0 | 3762 | 0.9129 | 0.3068 | 0.7622 | 0.1786 | 0.3083 | 0.0089 | -1.0 | 0.026 | 0.1961 | 0.4817 | 0.4842 | 0.0082 | -1.0 | 0.3068 | 0.4817 | -1.0 | -1.0 |
| 0.8725 | 19.0 | 3971 | 0.9053 | 0.3129 | 0.7699 | 0.182 | 0.3146 | 0.0119 | -1.0 | 0.0253 | 0.1986 | 0.4806 | 0.4832 | 0.0102 | -1.0 | 0.3129 | 0.4806 | -1.0 | -1.0 |
| 0.8532 | 20.0 | 4180 | 0.9124 | 0.3076 | 0.7661 | 0.1794 | 0.3093 | 0.0069 | -1.0 | 0.0252 | 0.1972 | 0.4798 | 0.4823 | 0.0061 | -1.0 | 0.3076 | 0.4798 | -1.0 | -1.0 |
| 0.8532 | 21.0 | 4389 | 0.9060 | 0.3129 | 0.7694 | 0.182 | 0.3146 | 0.0139 | -1.0 | 0.0254 | 0.1988 | 0.4811 | 0.4837 | 0.0122 | -1.0 | 0.3129 | 0.4811 | -1.0 | -1.0 |
| 0.8362 | 22.0 | 4598 | 0.9007 | 0.3157 | 0.7733 | 0.1886 | 0.3173 | 0.0079 | -1.0 | 0.0255 | 0.2005 | 0.4834 | 0.4859 | 0.0061 | -1.0 | 0.3157 | 0.4834 | -1.0 | -1.0 |
| 0.8362 | 23.0 | 4807 | 0.9036 | 0.3148 | 0.7702 | 0.1859 | 0.3159 | 0.0119 | -1.0 | 0.0255 | 0.1982 | 0.4859 | 0.4884 | 0.0102 | -1.0 | 0.3148 | 0.4859 | -1.0 | -1.0 |
| 0.8211 | 24.0 | 5016 | 0.8988 | 0.3159 | 0.7733 | 0.1875 | 0.3172 | 0.005 | -1.0 | 0.0253 | 0.1988 | 0.4844 | 0.487 | 0.0041 | -1.0 | 0.3159 | 0.4844 | -1.0 | -1.0 |
| 0.8211 | 25.0 | 5225 | 0.8989 | 0.3175 | 0.7741 | 0.1888 | 0.3189 | 0.0079 | -1.0 | 0.0256 | 0.1995 | 0.486 | 0.4886 | 0.0061 | -1.0 | 0.3175 | 0.486 | -1.0 | -1.0 |
| 0.8211 | 26.0 | 5434 | 0.8980 | 0.3188 | 0.776 | 0.1918 | 0.3204 | 0.005 | -1.0 | 0.0258 | 0.1998 | 0.4867 | 0.4893 | 0.0041 | -1.0 | 0.3188 | 0.4867 | -1.0 | -1.0 |
| 0.8091 | 27.0 | 5643 | 0.8953 | 0.3204 | 0.7786 | 0.1931 | 0.3219 | 0.0079 | -1.0 | 0.026 | 0.2002 | 0.4863 | 0.4889 | 0.0061 | -1.0 | 0.3204 | 0.4863 | -1.0 | -1.0 |
| 0.8091 | 28.0 | 5852 | 0.8973 | 0.3192 | 0.7784 | 0.1911 | 0.3208 | 0.0079 | -1.0 | 0.0255 | 0.199 | 0.4867 | 0.4892 | 0.0061 | -1.0 | 0.3192 | 0.4867 | -1.0 | -1.0 |
| 0.8001 | 29.0 | 6061 | 0.8962 | 0.3196 | 0.7785 | 0.1926 | 0.3211 | 0.0079 | -1.0 | 0.0257 | 0.1994 | 0.487 | 0.4896 | 0.0061 | -1.0 | 0.3196 | 0.487 | -1.0 | -1.0 |
| 0.8001 | 30.0 | 6270 | 0.8959 | 0.3195 | 0.7784 | 0.1925 | 0.3211 | 0.0079 | -1.0 | 0.0256 | 0.1995 | 0.487 | 0.4896 | 0.0061 | -1.0 | 0.3195 | 0.487 | -1.0 | -1.0 |
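The Map/Mar columns above are COCO-style detection metrics. A small sketch of how such numbers can be reproduced with `torchmetrics` (the boxes, scores, and label ids below are toy values, not taken from the dsi dataset; a real evaluation would loop over the full evaluation set):

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# One toy prediction/target pair, boxes in xyxy pixel coordinates.
preds = [
    {
        "boxes": torch.tensor([[10.0, 10.0, 40.0, 40.0]]),
        "scores": torch.tensor([0.9]),
        "labels": torch.tensor([0]),  # 0 = falciparum trophozoite (assumed id)
    }
]
target = [
    {
        "boxes": torch.tensor([[12.0, 12.0, 38.0, 38.0]]),
        "labels": torch.tensor([0]),
    }
]

metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)
metric.update(preds, target)
print(metric.compute())  # includes map, map_50, map_75, mar_1, mar_10, mar_100, ...
```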
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1