# fine_tuned_xlm-roberta-large_2April
This model is a fine-tuned version of xlm-roberta-large on an unspecified dataset. It achieves the following results on the evaluation set:
- Best F1: 75.9147
- Loss: 1.6634
- Exact: 39.3876
- F1: 56.7272
- Total: 3821
- HasAns Exact: 56.6152
- HasAns F1: 81.5886
- HasAns Total: 2653
- NoAns Exact: 0.2568
- NoAns F1: 0.2568
- NoAns Total: 1168
- Best Exact: 60.4292
- Best Exact Thresh: 0.6508
- Best F1 Thresh: 0.9299
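The "Best Exact"/"Best F1" numbers and their thresholds follow the SQuAD v2-style evaluation: each example gets a null score (higher means the model is more confident the question is unanswerable), and a threshold is swept so that "no answer" is predicted whenever the null score exceeds it. The sketch below (toy data, not the official evaluation script) shows how such a best score and threshold are found for exact match:

```python
# Hedged sketch of SQuAD v2-style best-threshold selection (toy data,
# not the official evaluation script).

def best_exact_and_thresh(null_scores, exact_if_answered, has_answer):
    n = len(null_scores)
    # Threshold below every score: every example is predicted "no answer",
    # which is correct exactly for the unanswerable examples.
    cur = best = sum(1 for ans in has_answer if not ans)
    best_thresh = min(null_scores) - 1.0
    # Raise the threshold past one example at a time (ascending null score);
    # that example flips from "no answer" to its predicted span.
    for i in sorted(range(n), key=null_scores.__getitem__):
        cur += exact_if_answered[i] if has_answer[i] else -1
        if cur > best:
            best, best_thresh = cur, null_scores[i]
    return 100.0 * best / n, best_thresh

# Toy run: 4 examples, 2 answerable.
score, thresh = best_exact_and_thresh(
    null_scores=[0.9, 0.1, 0.5, 0.8],
    exact_if_answered=[0, 1, 0, 0],
    has_answer=[False, True, True, False],
)
print(score, thresh)  # 75.0 0.1
```

The same sweep with per-example F1 contributions instead of 0/1 exact matches yields "Best F1" and "Best F1 Thresh".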
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
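The hyperparameters above can be expressed as a `transformers.TrainingArguments` configuration. This is a sketch only: the actual training script for this model is not published, so the output path and the exact argument set are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; output_dir is a hypothetical path.
args = TrainingArguments(
    output_dir="fine_tuned_xlm-roberta-large_2April",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # effective train batch size: 1 * 16 = 16
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the `transformers` optimizer defaults, so no extra optimizer flags are needed.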
### Training results
Training Loss | Epoch | Step | Best F1 | Validation Loss | Exact | F1 | Total | HasAns Exact | HasAns F1 | HasAns Total | NoAns Exact | NoAns F1 | NoAns Total | Best Exact | Best Exact Thresh | Best F1 Thresh
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
2.0619 | 0.26 | 500 | 54.8058 | 1.4812 | 32.2952 | 51.4988 | 3821 | 46.5134 | 74.1714 | 2653 | 0.0 | 0.0 | 1168 | 41.4551 | 0.7820 | 0.8887 |
1.3598 | 0.53 | 1000 | 65.8410 | 1.2289 | 35.9068 | 54.1665 | 3821 | 51.7150 | 78.0136 | 2653 | 0.0 | 0.0 | 1168 | 51.7666 | 0.8149 | 0.8993 |
1.2347 | 0.79 | 1500 | 67.6083 | 1.1868 | 37.2416 | 55.0317 | 3821 | 53.6374 | 79.2597 | 2653 | 0.0 | 0.0 | 1168 | 53.7032 | 0.7981 | 0.8571 |
1.0961 | 1.05 | 2000 | 71.4132 | 1.1491 | 38.9165 | 55.8262 | 3821 | 56.0498 | 80.4041 | 2653 | 0.0 | 0.0 | 1168 | 57.5504 | 0.7733 | 0.8679 |
0.9003 | 1.32 | 2500 | 72.3291 | 1.2053 | 38.4193 | 56.0447 | 3821 | 55.3336 | 80.7188 | 2653 | 0.0 | 0.0 | 1168 | 57.5766 | 0.8557 | 0.9662 |
0.8705 | 1.58 | 3000 | 71.7222 | 1.1239 | 38.5239 | 56.1153 | 3821 | 55.4844 | 80.8204 | 2653 | 0.0 | 0.0 | 1168 | 56.6344 | 0.7408 | 0.8452 |
0.8655 | 1.84 | 3500 | 73.6273 | 1.0855 | 39.0212 | 56.4204 | 3821 | 56.2005 | 81.2599 | 2653 | 0.0 | 0.0 | 1168 | 58.7543 | 0.7574 | 0.8970 |
0.7431 | 2.11 | 4000 | 74.8323 | 1.1817 | 39.6231 | 56.5930 | 3821 | 56.9544 | 81.3954 | 2653 | 0.2568 | 0.2568 | 1168 | 59.7226 | 0.8032 | 0.9219 |
0.5738 | 2.37 | 4500 | 74.4675 | 1.2047 | 38.8642 | 56.6456 | 3821 | 55.8236 | 81.4334 | 2653 | 0.3425 | 0.3425 | 1168 | 58.8851 | 0.7003 | 0.8792 |
0.5904 | 2.63 | 5000 | 74.7345 | 1.1571 | 38.5763 | 56.4366 | 3821 | 55.5221 | 81.2455 | 2653 | 0.0856 | 0.0856 | 1168 | 59.1206 | 0.7922 | 0.8458 |
0.5831 | 2.89 | 5500 | 74.9378 | 1.1537 | 39.7278 | 56.6778 | 3821 | 57.2182 | 81.6306 | 2653 | 0.0 | 0.0 | 1168 | 59.9581 | 0.7947 | 0.8767 |
0.4785 | 3.16 | 6000 | 75.0234 | 1.3432 | 39.3353 | 56.8437 | 3821 | 56.1628 | 81.3795 | 2653 | 1.1130 | 1.1130 | 1168 | 59.2515 | 0.7999 | 0.8258 |
0.3774 | 3.42 | 6500 | 74.8641 | 1.4903 | 39.3876 | 56.6856 | 3821 | 56.2759 | 81.1894 | 2653 | 1.0274 | 1.0274 | 1168 | 59.6964 | 0.7004 | 0.9268 |
0.3882 | 3.68 | 7000 | 75.0504 | 1.3418 | 38.7857 | 56.4315 | 3821 | 55.8613 | 81.2759 | 2653 | 0.0 | 0.0 | 1168 | 59.4609 | 0.6782 | 0.9456 |
0.3747 | 3.95 | 7500 | 75.3181 | 1.3673 | 39.4399 | 56.6455 | 3821 | 56.8036 | 81.5840 | 2653 | 0.0 | 0.0 | 1168 | 59.6441 | 0.7541 | 0.9554 |
0.2748 | 4.21 | 8000 | 75.5835 | 1.5103 | 39.2829 | 56.6922 | 3821 | 56.5398 | 81.6136 | 2653 | 0.0856 | 0.0856 | 1168 | 60.1152 | 0.7237 | 0.9792 |
0.2346 | 4.47 | 8500 | 75.8283 | 1.6566 | 40.0157 | 57.2731 | 3821 | 57.0675 | 81.9225 | 2653 | 1.2842 | 1.2842 | 1168 | 60.5339 | 0.7154 | 0.9550 |
0.2339 | 4.74 | 9000 | 75.7051 | 1.6699 | 39.0474 | 56.5165 | 3821 | 56.0874 | 81.2475 | 2653 | 0.3425 | 0.3425 | 1168 | 60.3245 | 0.8887 | 0.9799 |
0.23 | 5.0 | 9500 | 75.9147 | 1.6634 | 39.3876 | 56.7272 | 3821 | 56.6152 | 81.5886 | 2653 | 0.2568 | 0.2568 | 1168 | 60.4292 | 0.6508 | 0.9299 |
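As a quick consistency check (assuming standard SQuAD v2 reporting), the overall Exact score in the final row is the example-count-weighted average of the Exact scores on the answerable and unanswerable subsets:

```python
# Final-epoch numbers taken from the table above.
hasans_exact, hasans_total = 56.6152, 2653  # answerable subset
noans_exact, noans_total = 0.2568, 1168     # unanswerable subset
total = hasans_total + noans_total          # 3821 evaluation examples

overall_exact = (hasans_exact * hasans_total + noans_exact * noans_total) / total
print(round(overall_exact, 4))  # matches the reported Exact of 39.3876
```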
### Framework versions
- Transformers 4.37.2
- Pytorch 1.13.1+cu117
- Datasets 2.16.1
- Tokenizers 0.15.2
## Model tree

- Model: Kudod/fine_tuned_xlm-roberta-large_2April
- Base model: FacebookAI/xlm-roberta-large