# scenario-kd-po-ner-half_data-univner_full66
This model is a fine-tuned version of haryoaw/scenario-TCR-NER_data-univner_full on an unspecified dataset (listed as `None`). It achieves the following results on the evaluation set:
- Loss: 0.4030
- Precision: 0.8318
- Recall: 0.8233
- F1: 0.8275
- Accuracy: 0.9820
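The precision, recall, and F1 above are presumably entity-level (seqeval-style) for NER: a predicted entity counts as correct only if its span boundaries and label both match a gold entity exactly. A minimal sketch with a hypothetical helper `prf` (the function name and span format are illustrative, not from this repository):

```python
def prf(gold, pred):
    """Entity-level precision/recall/F1 over sets of (start, end, label) spans."""
    tp = len(gold & pred)  # exact span-and-label matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    denom = precision + recall
    f1 = 2 * precision * recall / denom if denom else 0.0
    return precision, recall, f1

gold = {(0, 2, "PER"), (5, 6, "LOC"), (9, 11, "ORG")}
pred = {(0, 2, "PER"), (5, 6, "ORG"), (9, 11, "ORG")}
# 2 of 3 predictions match exactly, so precision = recall = 2/3
print(prf(gold, pred))
```

Note that a span with the right boundaries but the wrong label (here `(5, 6, "ORG")`) counts against both precision and recall, which is why entity-level scores run well below token accuracy.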
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 66
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
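With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 3e-05 to zero over the run. A pure-Python sketch of that schedule (the total of 45000 steps is taken from the results table below; `warmup_steps=0` is an assumption since the card lists none):

```python
def linear_lr(step, base_lr=3e-5, total_steps=45000, warmup_steps=0):
    # Linear warmup (none listed in this card) followed by linear decay
    # to zero, mirroring lr_scheduler_type: linear.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))      # full base rate at the start
print(linear_lr(22500))  # half the base rate midway
print(linear_lr(45000))  # decayed to zero at the end
```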
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
0.9611 | 0.2910 | 500 | 0.7195 | 0.7523 | 0.7474 | 0.7498 | 0.9755 |
0.5668 | 0.5821 | 1000 | 0.6390 | 0.7418 | 0.7831 | 0.7619 | 0.9760 |
0.504 | 0.8731 | 1500 | 0.5837 | 0.7781 | 0.7787 | 0.7784 | 0.9781 |
0.4371 | 1.1641 | 2000 | 0.5616 | 0.7705 | 0.8035 | 0.7866 | 0.9783 |
0.3945 | 1.4552 | 2500 | 0.5404 | 0.7753 | 0.8116 | 0.7930 | 0.9792 |
0.387 | 1.7462 | 3000 | 0.5423 | 0.7901 | 0.7945 | 0.7923 | 0.9791 |
0.3689 | 2.0373 | 3500 | 0.5189 | 0.8069 | 0.7941 | 0.8005 | 0.9795 |
0.3275 | 2.3283 | 4000 | 0.5097 | 0.8050 | 0.7899 | 0.7974 | 0.9797 |
0.3242 | 2.6193 | 4500 | 0.5003 | 0.7948 | 0.8068 | 0.8007 | 0.9802 |
0.3184 | 2.9104 | 5000 | 0.4956 | 0.7883 | 0.8186 | 0.8032 | 0.9803 |
0.2894 | 3.2014 | 5500 | 0.4917 | 0.8059 | 0.8032 | 0.8045 | 0.9802 |
0.2869 | 3.4924 | 6000 | 0.4975 | 0.7900 | 0.8147 | 0.8022 | 0.9797 |
0.2752 | 3.7835 | 6500 | 0.5105 | 0.7956 | 0.8114 | 0.8034 | 0.9803 |
0.2758 | 4.0745 | 7000 | 0.4720 | 0.8 | 0.8132 | 0.8065 | 0.9804 |
0.2492 | 4.3655 | 7500 | 0.4781 | 0.8065 | 0.8103 | 0.8084 | 0.9803 |
0.2519 | 4.6566 | 8000 | 0.4719 | 0.8052 | 0.8146 | 0.8099 | 0.9805 |
0.251 | 4.9476 | 8500 | 0.4701 | 0.8164 | 0.8075 | 0.8119 | 0.9805 |
0.2316 | 5.2386 | 9000 | 0.4599 | 0.8222 | 0.8145 | 0.8183 | 0.9813 |
0.2256 | 5.5297 | 9500 | 0.4585 | 0.8050 | 0.8114 | 0.8082 | 0.9804 |
0.2314 | 5.8207 | 10000 | 0.4556 | 0.8184 | 0.8120 | 0.8152 | 0.9812 |
0.223 | 6.1118 | 10500 | 0.4556 | 0.8092 | 0.8182 | 0.8137 | 0.9808 |
0.2104 | 6.4028 | 11000 | 0.4585 | 0.8111 | 0.8022 | 0.8066 | 0.9806 |
0.2088 | 6.6938 | 11500 | 0.4556 | 0.8146 | 0.8120 | 0.8133 | 0.9811 |
0.2079 | 6.9849 | 12000 | 0.4520 | 0.8256 | 0.8201 | 0.8228 | 0.9816 |
0.1927 | 7.2759 | 12500 | 0.4534 | 0.8201 | 0.8107 | 0.8154 | 0.9808 |
0.195 | 7.5669 | 13000 | 0.4424 | 0.8109 | 0.8140 | 0.8124 | 0.9807 |
0.1943 | 7.8580 | 13500 | 0.4554 | 0.8110 | 0.8137 | 0.8124 | 0.9807 |
0.1888 | 8.1490 | 14000 | 0.4530 | 0.8243 | 0.8022 | 0.8131 | 0.9808 |
0.1807 | 8.4400 | 14500 | 0.4438 | 0.8203 | 0.8158 | 0.8180 | 0.9812 |
0.1809 | 8.7311 | 15000 | 0.4385 | 0.8227 | 0.8146 | 0.8186 | 0.9814 |
0.1811 | 9.0221 | 15500 | 0.4367 | 0.8199 | 0.8218 | 0.8209 | 0.9815 |
0.172 | 9.3132 | 16000 | 0.4341 | 0.8138 | 0.8323 | 0.8230 | 0.9816 |
0.1708 | 9.6042 | 16500 | 0.4364 | 0.8167 | 0.8264 | 0.8215 | 0.9813 |
0.1675 | 9.8952 | 17000 | 0.4340 | 0.8197 | 0.8201 | 0.8199 | 0.9815 |
0.1654 | 10.1863 | 17500 | 0.4337 | 0.8153 | 0.8230 | 0.8191 | 0.9816 |
0.1605 | 10.4773 | 18000 | 0.4284 | 0.8282 | 0.8129 | 0.8204 | 0.9815 |
0.1596 | 10.7683 | 18500 | 0.4338 | 0.8204 | 0.8198 | 0.8201 | 0.9816 |
0.1572 | 11.0594 | 19000 | 0.4252 | 0.8228 | 0.8220 | 0.8224 | 0.9818 |
0.1529 | 11.3504 | 19500 | 0.4360 | 0.8201 | 0.8175 | 0.8188 | 0.9817 |
0.1523 | 11.6414 | 20000 | 0.4332 | 0.8190 | 0.8224 | 0.8207 | 0.9816 |
0.1549 | 11.9325 | 20500 | 0.4305 | 0.8210 | 0.8184 | 0.8197 | 0.9816 |
0.1499 | 12.2235 | 21000 | 0.4286 | 0.8250 | 0.8194 | 0.8221 | 0.9816 |
0.1459 | 12.5146 | 21500 | 0.4271 | 0.8185 | 0.8240 | 0.8213 | 0.9814 |
0.1478 | 12.8056 | 22000 | 0.4313 | 0.8239 | 0.8147 | 0.8193 | 0.9815 |
0.144 | 13.0966 | 22500 | 0.4280 | 0.8230 | 0.8244 | 0.8237 | 0.9815 |
0.1401 | 13.3877 | 23000 | 0.4262 | 0.8224 | 0.8186 | 0.8205 | 0.9815 |
0.1409 | 13.6787 | 23500 | 0.4241 | 0.8247 | 0.8218 | 0.8232 | 0.9817 |
0.1416 | 13.9697 | 24000 | 0.4273 | 0.8295 | 0.8127 | 0.8210 | 0.9815 |
0.1353 | 14.2608 | 24500 | 0.4313 | 0.8235 | 0.8114 | 0.8174 | 0.9811 |
0.1351 | 14.5518 | 25000 | 0.4281 | 0.8203 | 0.8186 | 0.8195 | 0.9815 |
0.1354 | 14.8428 | 25500 | 0.4322 | 0.8239 | 0.8166 | 0.8202 | 0.9815 |
0.134 | 15.1339 | 26000 | 0.4210 | 0.8267 | 0.8100 | 0.8182 | 0.9813 |
0.1302 | 15.4249 | 26500 | 0.4141 | 0.8246 | 0.8189 | 0.8218 | 0.9817 |
0.1315 | 15.7159 | 27000 | 0.4157 | 0.8276 | 0.8179 | 0.8227 | 0.9817 |
0.1308 | 16.0070 | 27500 | 0.4177 | 0.8307 | 0.8179 | 0.8243 | 0.9818 |
0.127 | 16.2980 | 28000 | 0.4243 | 0.8240 | 0.8212 | 0.8226 | 0.9815 |
0.1269 | 16.5891 | 28500 | 0.4226 | 0.8337 | 0.8129 | 0.8231 | 0.9817 |
0.1278 | 16.8801 | 29000 | 0.4130 | 0.8285 | 0.8205 | 0.8245 | 0.9819 |
0.124 | 17.1711 | 29500 | 0.4186 | 0.8263 | 0.8220 | 0.8241 | 0.9817 |
0.1243 | 17.4622 | 30000 | 0.4101 | 0.8290 | 0.8217 | 0.8253 | 0.9818 |
0.1226 | 17.7532 | 30500 | 0.4171 | 0.8276 | 0.8199 | 0.8237 | 0.9818 |
0.1225 | 18.0442 | 31000 | 0.4133 | 0.8313 | 0.8221 | 0.8267 | 0.9819 |
0.1197 | 18.3353 | 31500 | 0.4121 | 0.8343 | 0.8143 | 0.8242 | 0.9818 |
0.1205 | 18.6263 | 32000 | 0.4108 | 0.8329 | 0.8215 | 0.8272 | 0.9822 |
0.1191 | 18.9173 | 32500 | 0.4221 | 0.8307 | 0.8195 | 0.8250 | 0.9820 |
0.118 | 19.2084 | 33000 | 0.4111 | 0.8250 | 0.8202 | 0.8226 | 0.9819 |
0.118 | 19.4994 | 33500 | 0.4161 | 0.8225 | 0.8159 | 0.8192 | 0.9814 |
0.1175 | 19.7905 | 34000 | 0.4045 | 0.8254 | 0.8280 | 0.8267 | 0.9820 |
0.117 | 20.0815 | 34500 | 0.4030 | 0.8235 | 0.8261 | 0.8248 | 0.9820 |
0.1149 | 20.3725 | 35000 | 0.4094 | 0.8317 | 0.8269 | 0.8293 | 0.9820 |
0.1155 | 20.6636 | 35500 | 0.4058 | 0.8314 | 0.8191 | 0.8252 | 0.9821 |
0.1139 | 20.9546 | 36000 | 0.4124 | 0.8331 | 0.8212 | 0.8271 | 0.9819 |
0.1126 | 21.2456 | 36500 | 0.4099 | 0.8292 | 0.8173 | 0.8232 | 0.9819 |
0.1125 | 21.5367 | 37000 | 0.4099 | 0.8322 | 0.8209 | 0.8266 | 0.9821 |
0.1124 | 21.8277 | 37500 | 0.4059 | 0.8329 | 0.8208 | 0.8268 | 0.9819 |
0.1119 | 22.1187 | 38000 | 0.4119 | 0.8300 | 0.8121 | 0.8210 | 0.9814 |
0.1101 | 22.4098 | 38500 | 0.4065 | 0.8263 | 0.8224 | 0.8244 | 0.9817 |
0.1111 | 22.7008 | 39000 | 0.4039 | 0.8252 | 0.8235 | 0.8244 | 0.9818 |
0.1111 | 22.9919 | 39500 | 0.4066 | 0.8323 | 0.8198 | 0.8260 | 0.9819 |
0.1096 | 23.2829 | 40000 | 0.4060 | 0.8277 | 0.8178 | 0.8227 | 0.9819 |
0.108 | 23.5739 | 40500 | 0.4091 | 0.8282 | 0.8182 | 0.8232 | 0.9817 |
0.1088 | 23.8650 | 41000 | 0.4048 | 0.8295 | 0.8197 | 0.8245 | 0.9818 |
0.1091 | 24.1560 | 41500 | 0.4008 | 0.8330 | 0.8238 | 0.8284 | 0.9821 |
0.1073 | 24.4470 | 42000 | 0.4023 | 0.8290 | 0.8256 | 0.8273 | 0.9819 |
0.1076 | 24.7381 | 42500 | 0.4007 | 0.8324 | 0.8241 | 0.8282 | 0.9823 |
0.1064 | 25.0291 | 43000 | 0.4008 | 0.8304 | 0.8233 | 0.8268 | 0.9821 |
0.1066 | 25.3201 | 43500 | 0.3961 | 0.8337 | 0.8234 | 0.8285 | 0.9821 |
0.1053 | 25.6112 | 44000 | 0.4043 | 0.8325 | 0.8222 | 0.8273 | 0.9819 |
0.1074 | 25.9022 | 44500 | 0.4007 | 0.8283 | 0.8205 | 0.8244 | 0.9818 |
0.1047 | 26.1932 | 45000 | 0.4030 | 0.8318 | 0.8233 | 0.8275 | 0.9820 |
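The F1 column is the harmonic mean of the precision and recall columns; the final row (which matches the headline evaluation results) can be checked directly:

```python
p, r = 0.8318, 0.8233        # final-row precision and recall
f1 = 2 * p * r / (p + r)     # harmonic mean
print(round(f1, 4))          # 0.8275, matching the table
```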
### Framework versions
- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1
## Model tree

- Base model: FacebookAI/xlm-roberta-base