# roberta-base-finetuned-ner
This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unspecified dataset (the training data is not documented in this card). It achieves the following results on the evaluation set:
- Loss: 0.9020
- Precision: 0.6105
- Recall: 0.6545
- F1: 0.6317
- Accuracy: 0.8984
## Model description
More information needed
## Intended uses & limitations
More information needed
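No usage notes are documented. As a starting point, here is a minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub as `roncmic/roberta-base-finetuned-ner` and carries its NER label mapping in the config:

```python
from transformers import pipeline

# Assumes the checkpoint is available on the Hub under this repo id
# and that id2label in its config maps to the fine-tuned NER tags.
ner = pipeline(
    "token-classification",
    model="roncmic/roberta-base-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)

print(ner("Hugging Face is based in New York City."))
```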
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
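These settings map onto `TrainingArguments` in transformers roughly as follows. This is a reconstruction from the list above, not the original training script; the model, dataset, and data collator are undocumented, and `output_dir` is a guess:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above; output_dir is a guess,
# and the dataset/model/collator used for training are not documented.
args = TrainingArguments(
    output_dir="roberta-base-finetuned-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=60,
    eval_strategy="epoch",  # the results table reports metrics once per epoch
)
```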
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 63 | 0.7317 | 0.6254 | 0.6378 | 0.6315 | 0.9019 |
| No log | 2.0 | 126 | 0.7668 | 0.6130 | 0.6482 | 0.6301 | 0.9000 |
| No log | 3.0 | 189 | 0.7691 | 0.6123 | 0.6545 | 0.6327 | 0.8992 |
| No log | 4.0 | 252 | 0.7907 | 0.6061 | 0.6232 | 0.6145 | 0.8956 |
| No log | 5.0 | 315 | 0.8165 | 0.5798 | 0.6482 | 0.6121 | 0.8957 |
| No log | 6.0 | 378 | 0.7758 | 0.6008 | 0.6534 | 0.6260 | 0.8999 |
| No log | 7.0 | 441 | 0.8109 | 0.6018 | 0.6357 | 0.6183 | 0.8984 |
| 0.0018 | 8.0 | 504 | 0.7892 | 0.6018 | 0.6388 | 0.6197 | 0.8992 |
| 0.0018 | 9.0 | 567 | 0.8051 | 0.5878 | 0.6461 | 0.6156 | 0.8964 |
| 0.0018 | 10.0 | 630 | 0.7913 | 0.6123 | 0.6430 | 0.6273 | 0.8999 |
| 0.0018 | 11.0 | 693 | 0.8088 | 0.6012 | 0.6545 | 0.6267 | 0.8979 |
| 0.0018 | 12.0 | 756 | 0.8206 | 0.6072 | 0.6534 | 0.6295 | 0.8974 |
| 0.0018 | 13.0 | 819 | 0.8240 | 0.5858 | 0.6482 | 0.6155 | 0.8962 |
| 0.0018 | 14.0 | 882 | 0.8369 | 0.5961 | 0.6409 | 0.6177 | 0.8971 |
| 0.0018 | 15.0 | 945 | 0.8515 | 0.5951 | 0.6367 | 0.6152 | 0.8960 |
| 0.0012 | 16.0 | 1008 | 0.8743 | 0.5881 | 0.6096 | 0.5987 | 0.8949 |
| 0.0012 | 17.0 | 1071 | 0.8835 | 0.5945 | 0.6336 | 0.6134 | 0.8960 |
| 0.0012 | 18.0 | 1134 | 0.8633 | 0.5803 | 0.6409 | 0.6091 | 0.8946 |
| 0.0012 | 19.0 | 1197 | 0.8553 | 0.5899 | 0.6127 | 0.6011 | 0.8942 |
| 0.0012 | 20.0 | 1260 | 0.8715 | 0.5841 | 0.6232 | 0.6030 | 0.8938 |
| 0.0012 | 21.0 | 1323 | 0.8922 | 0.5881 | 0.6305 | 0.6086 | 0.8909 |
| 0.0012 | 22.0 | 1386 | 0.8716 | 0.5926 | 0.6482 | 0.6191 | 0.8935 |
| 0.0012 | 23.0 | 1449 | 0.8853 | 0.5915 | 0.6545 | 0.6214 | 0.8942 |
| 0.0008 | 24.0 | 1512 | 0.8494 | 0.6132 | 0.6388 | 0.6258 | 0.8973 |
| 0.0008 | 25.0 | 1575 | 0.8698 | 0.5901 | 0.6461 | 0.6168 | 0.8937 |
| 0.0008 | 26.0 | 1638 | 0.8622 | 0.5996 | 0.6409 | 0.6196 | 0.8946 |
| 0.0008 | 27.0 | 1701 | 0.8517 | 0.6057 | 0.6430 | 0.6238 | 0.8970 |
| 0.0008 | 28.0 | 1764 | 0.8696 | 0.6108 | 0.6388 | 0.6245 | 0.8977 |
| 0.0008 | 29.0 | 1827 | 0.8753 | 0.5979 | 0.6503 | 0.6230 | 0.8978 |
| 0.0008 | 30.0 | 1890 | 0.8519 | 0.6026 | 0.6409 | 0.6211 | 0.8973 |
| 0.0008 | 31.0 | 1953 | 0.8588 | 0.6086 | 0.6524 | 0.6297 | 0.8992 |
| 0.0007 | 32.0 | 2016 | 0.8713 | 0.5968 | 0.6305 | 0.6132 | 0.8970 |
| 0.0007 | 33.0 | 2079 | 0.8761 | 0.5982 | 0.6388 | 0.6179 | 0.8975 |
| 0.0007 | 34.0 | 2142 | 0.8733 | 0.5947 | 0.6357 | 0.6145 | 0.8967 |
| 0.0007 | 35.0 | 2205 | 0.8793 | 0.5996 | 0.6378 | 0.6181 | 0.8977 |
| 0.0007 | 36.0 | 2268 | 0.8959 | 0.5950 | 0.6503 | 0.6214 | 0.8971 |
| 0.0007 | 37.0 | 2331 | 0.8795 | 0.6078 | 0.6534 | 0.6298 | 0.8986 |
| 0.0007 | 38.0 | 2394 | 0.8856 | 0.6208 | 0.6597 | 0.6397 | 0.9000 |
| 0.0007 | 39.0 | 2457 | 0.8897 | 0.6155 | 0.6534 | 0.6339 | 0.8992 |
| 0.0005 | 40.0 | 2520 | 0.8901 | 0.6098 | 0.6524 | 0.6304 | 0.8988 |
| 0.0005 | 41.0 | 2583 | 0.8881 | 0.6142 | 0.6482 | 0.6308 | 0.8984 |
| 0.0005 | 42.0 | 2646 | 0.8857 | 0.6193 | 0.6503 | 0.6344 | 0.8989 |
| 0.0005 | 43.0 | 2709 | 0.8911 | 0.6121 | 0.6524 | 0.6316 | 0.8973 |
| 0.0005 | 44.0 | 2772 | 0.8988 | 0.6015 | 0.6493 | 0.6245 | 0.8968 |
| 0.0005 | 45.0 | 2835 | 0.8927 | 0.6169 | 0.6472 | 0.6317 | 0.8978 |
| 0.0005 | 46.0 | 2898 | 0.8974 | 0.6137 | 0.6649 | 0.6383 | 0.8978 |
| 0.0005 | 47.0 | 2961 | 0.8991 | 0.6115 | 0.6555 | 0.6327 | 0.8968 |
| 0.0004 | 48.0 | 3024 | 0.9001 | 0.6087 | 0.6545 | 0.6308 | 0.8966 |
| 0.0004 | 49.0 | 3087 | 0.9015 | 0.6071 | 0.6566 | 0.6309 | 0.8968 |
| 0.0004 | 50.0 | 3150 | 0.8986 | 0.6109 | 0.6524 | 0.6310 | 0.8968 |
| 0.0004 | 51.0 | 3213 | 0.9014 | 0.6083 | 0.6597 | 0.6329 | 0.8984 |
| 0.0004 | 52.0 | 3276 | 0.9018 | 0.6091 | 0.6587 | 0.6329 | 0.8988 |
| 0.0004 | 53.0 | 3339 | 0.8991 | 0.6107 | 0.6534 | 0.6314 | 0.8986 |
| 0.0004 | 54.0 | 3402 | 0.9000 | 0.6084 | 0.6534 | 0.6301 | 0.8985 |
| 0.0004 | 55.0 | 3465 | 0.9015 | 0.6081 | 0.6545 | 0.6305 | 0.8988 |
| 0.0003 | 56.0 | 3528 | 0.9019 | 0.6054 | 0.6503 | 0.6271 | 0.8982 |
| 0.0003 | 57.0 | 3591 | 0.9011 | 0.6086 | 0.6524 | 0.6297 | 0.8982 |
| 0.0003 | 58.0 | 3654 | 0.9017 | 0.6080 | 0.6524 | 0.6294 | 0.8984 |
| 0.0003 | 59.0 | 3717 | 0.9019 | 0.6121 | 0.6555 | 0.6331 | 0.8985 |
| 0.0003 | 60.0 | 3780 | 0.9020 | 0.6105 | 0.6545 | 0.6317 | 0.8984 |
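The precision, recall, and F1 columns are entity-level scores of the kind computed by the seqeval library, the usual choice for token-classification cards (an assumption; the evaluation script is not documented). A toy illustration with made-up label sequences:

```python
from seqeval.metrics import f1_score, precision_score, recall_score

# Made-up BIO-tagged sequences for illustration only; the card's actual
# evaluation data is not documented.
y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "O"]]

print(precision_score(y_true, y_pred))  # 1.0  -- the one predicted entity is correct
print(recall_score(y_true, y_pred))     # 0.5  -- one of two gold entities found
print(f1_score(y_true, y_pred))         # ~0.67 -- harmonic mean of the two
```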
### Framework versions
- Transformers 4.45.2
- Pytorch 2.4.1
- Datasets 2.18.0
- Tokenizers 0.20.0