---
base_model: haryoaw/scenario-TCR-NER_data-univner_full
library_name: transformers
license: mit
metrics:
- precision
- recall
- f1
- accuracy
tags:
- generated_from_trainer
model-index:
- name: scenario-kd-scr-ner-half-xlmr_data-univner_full66
results: []
---
# scenario-kd-scr-ner-half-xlmr_data-univner_full66
This model is a fine-tuned version of [haryoaw/scenario-TCR-NER_data-univner_full](https://huggingface.co/haryoaw/scenario-TCR-NER_data-univner_full) on the univner_full dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2164
- Precision: 0.4268
- Recall: 0.3872
- F1: 0.4061
- Accuracy: 0.9451
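As a quick sanity check (not part of the card itself), the reported F1 is consistent with the harmonic mean of the reported precision and recall, up to the rounding of the published metrics:

```python
import math

# Reported evaluation metrics (rounded to 4 decimals in the card).
precision = 0.4268
recall = 0.3872

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

# Agrees with the reported 0.4061 within the rounding error of the inputs.
assert math.isclose(f1, 0.4061, abs_tol=1e-3)
```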
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 66
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
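The effective batch size follows directly from these values, and the linear schedule decays the learning rate to zero over training. A minimal sketch, assuming zero warmup steps (the card does not list any) and roughly 17,180 total optimizer steps (about 1,718 per epoch, implied by the step/epoch columns of the results table):

```python
# Effective batch size: per-device batch size times accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 32  # matches the card

# Linear LR schedule with no warmup (assumed, since none is listed):
# decays from 3e-05 at step 0 to 0 at the final step.
def linear_lr(step, total_steps, base_lr=3e-05):
    return base_lr * max(0.0, 1.0 - step / total_steps)

total_steps = 17180  # ~10 epochs at ~1718 optimizer steps per epoch
assert linear_lr(0, total_steps) == 3e-05
assert linear_lr(total_steps, total_steps) == 0.0
```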
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 2.8528 | 0.2911 | 500 | 2.3182 | 0.5357 | 0.0022 | 0.0043 | 0.9241 |
| 2.1207 | 0.5822 | 1000 | 2.0259 | 0.2078 | 0.0100 | 0.0190 | 0.9243 |
| 1.8727 | 0.8732 | 1500 | 1.8417 | 0.2116 | 0.0853 | 0.1216 | 0.9265 |
| 1.7263 | 1.1643 | 2000 | 1.7362 | 0.1592 | 0.0693 | 0.0965 | 0.9268 |
| 1.6268 | 1.4554 | 2500 | 1.7251 | 0.2612 | 0.1594 | 0.1980 | 0.9303 |
| 1.5859 | 1.7465 | 3000 | 1.6161 | 0.3013 | 0.1824 | 0.2272 | 0.9331 |
| 1.4992 | 2.0375 | 3500 | 1.5799 | 0.3257 | 0.2106 | 0.2558 | 0.9348 |
| 1.4072 | 2.3286 | 4000 | 1.5337 | 0.3402 | 0.2483 | 0.2871 | 0.9358 |
| 1.374 | 2.6197 | 4500 | 1.5234 | 0.3113 | 0.2886 | 0.2995 | 0.9363 |
| 1.3602 | 2.9108 | 5000 | 1.4697 | 0.3426 | 0.2717 | 0.3030 | 0.9376 |
| 1.2661 | 3.2019 | 5500 | 1.4345 | 0.3421 | 0.2953 | 0.3170 | 0.9386 |
| 1.2524 | 3.4929 | 6000 | 1.4146 | 0.3843 | 0.3011 | 0.3376 | 0.9395 |
| 1.2057 | 3.7840 | 6500 | 1.4308 | 0.3767 | 0.2730 | 0.3166 | 0.9400 |
| 1.2018 | 4.0751 | 7000 | 1.3902 | 0.3836 | 0.3069 | 0.3410 | 0.9406 |
| 1.1243 | 4.3662 | 7500 | 1.3595 | 0.3811 | 0.3255 | 0.3511 | 0.9414 |
| 1.1248 | 4.6573 | 8000 | 1.3407 | 0.3940 | 0.3122 | 0.3484 | 0.9414 |
| 1.1234 | 4.9483 | 8500 | 1.3333 | 0.3802 | 0.3196 | 0.3473 | 0.9415 |
| 1.0707 | 5.2394 | 9000 | 1.3303 | 0.3937 | 0.3301 | 0.3591 | 0.9422 |
| 1.0384 | 5.5305 | 9500 | 1.2940 | 0.3962 | 0.3370 | 0.3642 | 0.9425 |
| 1.0239 | 5.8216 | 10000 | 1.2959 | 0.3967 | 0.3486 | 0.3711 | 0.9420 |
| 1.007 | 6.1126 | 10500 | 1.2798 | 0.4070 | 0.3653 | 0.3850 | 0.9430 |
| 0.9654 | 6.4037 | 11000 | 1.2714 | 0.3904 | 0.3634 | 0.3764 | 0.9424 |
| 0.9657 | 6.6948 | 11500 | 1.2591 | 0.3861 | 0.3774 | 0.3817 | 0.9428 |
| 0.9678 | 6.9859 | 12000 | 1.2546 | 0.4209 | 0.3509 | 0.3827 | 0.9435 |
| 0.9217 | 7.2770 | 12500 | 1.2610 | 0.4124 | 0.3686 | 0.3893 | 0.9433 |
| 0.9056 | 7.5680 | 13000 | 1.2403 | 0.4238 | 0.3744 | 0.3976 | 0.9442 |
| 0.9146 | 7.8591 | 13500 | 1.2396 | 0.4242 | 0.3779 | 0.3997 | 0.9445 |
| 0.8974 | 8.1502 | 14000 | 1.2246 | 0.4213 | 0.3910 | 0.4056 | 0.9448 |
| 0.8572 | 8.4413 | 14500 | 1.2233 | 0.4232 | 0.3831 | 0.4022 | 0.9447 |
| 0.8703 | 8.7324 | 15000 | 1.2265 | 0.4228 | 0.3740 | 0.3969 | 0.9450 |
| 0.8774 | 9.0234 | 15500 | 1.2190 | 0.4415 | 0.3806 | 0.4088 | 0.9454 |
| 0.8581 | 9.3145 | 16000 | 1.2245 | 0.4251 | 0.3838 | 0.4034 | 0.9449 |
| 0.8411 | 9.6056 | 16500 | 1.2153 | 0.4298 | 0.3982 | 0.4134 | 0.9453 |
| 0.8466 | 9.8967 | 17000 | 1.2164 | 0.4268 | 0.3872 | 0.4061 | 0.9451 |
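Note that the final checkpoint is not the strongest by validation F1: step 16500 (epoch ~9.61) reaches 0.4134, versus 0.4061 at the last logged step. A small sketch of selecting the best checkpoint from the logged (step, F1) pairs in the tail of the table above:

```python
# (step, validation F1) pairs taken from the last rows of the table.
logged = [
    (15500, 0.4088),
    (16000, 0.4034),
    (16500, 0.4134),
    (17000, 0.4061),
]

# Pick the checkpoint with the highest validation F1.
best_step, best_f1 = max(logged, key=lambda row: row[1])
assert (best_step, best_f1) == (16500, 0.4134)
```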
### Framework versions
- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1