
distilroberta-base-climate-detector-finetuned-ner

This model is a fine-tuned version of climatebert/distilroberta-base-climate-detector on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 0.7681
  • Precision: 0.6184
  • Recall: 0.6816
  • F1: 0.6485
  • Accuracy: 0.9046
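
A minimal inference sketch using the transformers pipeline with this repository's model id. The entity label set is not documented in this card, so the tags printed below depend on the (undocumented) fine-tuning data; the example sentence is illustrative only:

```python
from transformers import pipeline

# Hypothetical usage sketch; inspect the outputs to discover the tag set,
# since the card does not document the NER labels.
ner = pipeline(
    "token-classification",
    model="roncmic/distilroberta-base-climate-detector-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

text = "The company pledged to cut Scope 1 emissions by 2030."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```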

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments reconstruction follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 80
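
These settings map onto the standard transformers Trainer setup. A sketch reconstructing them as TrainingArguments, assuming the transformers 4.45 API; the output directory and the per-epoch evaluation cadence are assumptions (the latter inferred from the results table below), not stated in the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilroberta-base-climate-detector-finetuned-ner",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=80,
    lr_scheduler_type="linear",
    adam_beta1=0.9,     # matches the listed optimizer settings,
    adam_beta2=0.999,   # which are also the AdamW defaults
    adam_epsilon=1e-8,
    eval_strategy="epoch",  # the table reports one evaluation per epoch
)
```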

Training results

Training Loss Epoch Step Validation Loss Precision Recall F1 Accuracy
No log 1.0 63 0.6432 0.6034 0.6670 0.6336 0.9018
No log 2.0 126 0.6383 0.6167 0.6618 0.6385 0.9046
No log 3.0 189 0.6370 0.6219 0.6764 0.6480 0.9060
No log 4.0 252 0.6658 0.5821 0.6775 0.6261 0.8988
No log 5.0 315 0.6742 0.5990 0.6566 0.6265 0.9019
No log 6.0 378 0.6467 0.6127 0.6722 0.6411 0.9044
No log 7.0 441 0.6676 0.6024 0.6848 0.6409 0.9040
0.0043 8.0 504 0.6633 0.5824 0.6754 0.6254 0.9012
0.0043 9.0 567 0.6581 0.6078 0.6827 0.6431 0.9027
0.0043 10.0 630 0.6672 0.6032 0.6681 0.6340 0.9035
0.0043 11.0 693 0.6648 0.6192 0.6858 0.6508 0.9064
0.0043 12.0 756 0.6556 0.6174 0.6806 0.6475 0.9064
0.0043 13.0 819 0.6698 0.6114 0.6733 0.6408 0.9064
0.0043 14.0 882 0.7113 0.6024 0.6879 0.6423 0.9035
0.0043 15.0 945 0.6913 0.5962 0.6827 0.6365 0.9026
0.0022 16.0 1008 0.7037 0.5941 0.6691 0.6294 0.9025
0.0022 17.0 1071 0.7171 0.6069 0.6816 0.6421 0.9042
0.0022 18.0 1134 0.6988 0.5960 0.6608 0.6267 0.9022
0.0022 19.0 1197 0.6803 0.6142 0.6879 0.6489 0.9056
0.0022 20.0 1260 0.7230 0.5924 0.6795 0.6330 0.9007
0.0022 21.0 1323 0.7249 0.5925 0.6587 0.6238 0.9019
0.0022 22.0 1386 0.7053 0.5880 0.6660 0.6246 0.9015
0.0022 23.0 1449 0.7108 0.6097 0.6848 0.6450 0.9026
0.0016 24.0 1512 0.7155 0.5963 0.6722 0.6320 0.9038
0.0016 25.0 1575 0.7228 0.6038 0.6649 0.6329 0.9018
0.0016 26.0 1638 0.7411 0.5927 0.6608 0.6249 0.9005
0.0016 27.0 1701 0.7427 0.5976 0.6743 0.6336 0.9031
0.0016 28.0 1764 0.7455 0.6082 0.6806 0.6424 0.9029
0.0016 29.0 1827 0.7326 0.6075 0.6754 0.6396 0.9040
0.0016 30.0 1890 0.7594 0.6101 0.6681 0.6378 0.9025
0.0016 31.0 1953 0.7516 0.6024 0.6691 0.6340 0.9056
0.0011 32.0 2016 0.7382 0.6053 0.6722 0.6370 0.9042
0.0011 33.0 2079 0.7523 0.6028 0.6733 0.6361 0.9023
0.0011 34.0 2142 0.7516 0.5983 0.6764 0.6350 0.9014
0.0011 35.0 2205 0.7380 0.6060 0.6743 0.6383 0.9034
0.0011 36.0 2268 0.7636 0.6030 0.6691 0.6343 0.9042
0.0011 37.0 2331 0.7734 0.6009 0.6743 0.6355 0.9014
0.0011 38.0 2394 0.7619 0.6039 0.6764 0.6381 0.9029
0.0011 39.0 2457 0.7605 0.6084 0.6649 0.6354 0.9025
0.001 40.0 2520 0.7642 0.6061 0.6681 0.6356 0.9056
0.001 41.0 2583 0.7526 0.6202 0.6681 0.6432 0.9052
0.001 42.0 2646 0.7774 0.6021 0.6681 0.6333 0.9012
0.001 43.0 2709 0.7645 0.6033 0.6795 0.6392 0.9014
0.001 44.0 2772 0.7594 0.6113 0.6764 0.6422 0.9052
0.001 45.0 2835 0.7616 0.6066 0.6712 0.6373 0.9025
0.001 46.0 2898 0.7566 0.6071 0.6743 0.6390 0.9045
0.001 47.0 2961 0.7614 0.6082 0.6775 0.6410 0.9046
0.0006 48.0 3024 0.7800 0.6159 0.6545 0.6346 0.9026
0.0006 49.0 3087 0.7841 0.6043 0.6681 0.6346 0.9026
0.0006 50.0 3150 0.7697 0.6035 0.6848 0.6416 0.9040
0.0006 51.0 3213 0.7667 0.6121 0.6785 0.6436 0.9031
0.0006 52.0 3276 0.7769 0.5985 0.6691 0.6318 0.9030
0.0006 53.0 3339 0.7842 0.6023 0.6701 0.6344 0.9030
0.0006 54.0 3402 0.7934 0.5856 0.6534 0.6177 0.9008
0.0006 55.0 3465 0.7906 0.6067 0.6649 0.6345 0.9041
0.0007 56.0 3528 0.7948 0.5885 0.6628 0.6235 0.9000
0.0007 57.0 3591 0.7703 0.6071 0.6628 0.6337 0.9027
0.0007 58.0 3654 0.7581 0.6089 0.6712 0.6385 0.9038
0.0007 59.0 3717 0.7659 0.6004 0.6681 0.6324 0.9018
0.0007 60.0 3780 0.7633 0.6069 0.6754 0.6393 0.9020
0.0007 61.0 3843 0.7606 0.6087 0.6722 0.6389 0.9034
0.0007 62.0 3906 0.7586 0.6256 0.6733 0.6486 0.9060
0.0007 63.0 3969 0.7560 0.6175 0.6639 0.6398 0.9038
0.0009 64.0 4032 0.7556 0.6183 0.6628 0.6398 0.9023
0.0009 65.0 4095 0.7544 0.6189 0.6764 0.6464 0.9048
0.0009 66.0 4158 0.7590 0.6158 0.6743 0.6437 0.9045
0.0009 67.0 4221 0.7615 0.6146 0.6691 0.6407 0.9044
0.0009 68.0 4284 0.7634 0.6135 0.6712 0.6411 0.9038
0.0009 69.0 4347 0.7646 0.6240 0.6670 0.6448 0.9046
0.0009 70.0 4410 0.7686 0.6195 0.6712 0.6443 0.9040
0.0009 71.0 4473 0.7685 0.6152 0.6691 0.6410 0.9041
0.0007 72.0 4536 0.7678 0.6162 0.6670 0.6406 0.9035
0.0007 73.0 4599 0.7712 0.6128 0.6775 0.6435 0.9038
0.0007 74.0 4662 0.7713 0.6091 0.6733 0.6396 0.9029
0.0007 75.0 4725 0.7690 0.6181 0.6827 0.6488 0.9052
0.0007 76.0 4788 0.7684 0.6201 0.6816 0.6494 0.9051
0.0007 77.0 4851 0.7688 0.6157 0.6806 0.6465 0.9048
0.0007 78.0 4914 0.7685 0.6134 0.6806 0.6452 0.9041
0.0007 79.0 4977 0.7683 0.6163 0.6806 0.6468 0.9041
0.0005 80.0 5040 0.7681 0.6184 0.6816 0.6485 0.9046
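
The precision, recall, F1, and accuracy columns are consistent with the seqeval-based compute_metrics function used in the standard transformers token-classification example. A sketch under that assumption; the tag list shown is hypothetical, since the dataset and label set are not documented here:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # hypothetical: the real tag set is undocumented

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop positions labeled -100, which mask special tokens and padding.
    true_preds = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```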

Framework versions

  • Transformers 4.45.2
  • PyTorch 2.4.1
  • Datasets 2.18.0
  • Tokenizers 0.20.0