---
license: mit
base_model: roberta-base
tags:
  - generated_from_trainer
metrics:
  - f1
model-index:
  - name: cc_narratives_robertamodel2
    results: []
widget:
  - text: >-
      I believe the net zero target lacks legitimacy, and without a referendum
      the current climate change policy lacks the explicit consent of the
      people.
  - text: >-
      Solar panels installed in new homes would be far cheaper than
      retrofitting after construction. This would help with home running costs
      and assist our climate policy pledge on carbon emissions.
  - text: >-
      It is environmentally irresponsible to allow garden space occupied by
      grass and other plant life (which processes CO2 and supports wildlife) to
      be replaced by plastic, which does not biodegrade.
---

# cc_narratives_robertamodel2

This model is a fine-tuned version of roberta-base on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.9156
- F1: 0.7112
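A minimal inference sketch with the `transformers` text-classification pipeline. The Hub repository id below is an assumption based on the model name and may differ, and the label names returned depend on the fine-tuning config:

```python
from transformers import pipeline

# Assumed Hub repository id (not stated in this card); adjust if it differs.
MODEL_ID = "nnisbett/cc_narratives_robertamodel2"

def classify(texts):
    """Classify a list of texts with the fine-tuned model (downloads weights)."""
    clf = pipeline("text-classification", model=MODEL_ID)
    return clf(texts)

if __name__ == "__main__":
    print(classify([
        "Solar panels installed in new homes would be far cheaper "
        "than retrofitting after construction."
    ]))
```

Each result is a dict with `label` and `score` keys; the mapping from label ids to narrative categories comes from the model's `config.json`.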

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
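With `lr_scheduler_type: linear` and no warmup, the learning rate decays linearly from 2e-05 to 0 over the full run (22 optimizer steps per epoch × 10 epochs = 220 steps, matching the step column in the results table). A minimal pure-Python sketch of that schedule:

```python
def linear_lr(step, total_steps=220, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional warmup, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * (remaining / max(1, total_steps - warmup_steps))

# Learning rate at the start, midpoint, and end of training:
print(linear_lr(0), linear_lr(110), linear_lr(220))  # → 2e-05 1e-05 0.0
```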

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.0269        | 1.0   | 22   | 0.9767          | 0.3192 |
| 0.9372        | 2.0   | 44   | 0.9233          | 0.4689 |
| 0.7988        | 3.0   | 66   | 0.8628          | 0.5678 |
| 0.6139        | 4.0   | 88   | 0.8515          | 0.6001 |
| 0.4226        | 5.0   | 110  | 0.9094          | 0.6003 |
| 0.2551        | 6.0   | 132  | 1.0029          | 0.6192 |
| 0.1439        | 7.0   | 154  | 1.0345          | 0.6581 |
| 0.0872        | 8.0   | 176  | 1.1825          | 0.6431 |
| 0.0702        | 9.0   | 198  | 1.2059          | 0.6468 |
| 0.0497        | 10.0  | 220  | 1.2089          | 0.6403 |
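The F1 column is the harmonic mean of precision and recall; the card does not state which averaging (binary, macro, or weighted) was used. For reference, a minimal binary-F1 sketch:

```python
def f1_score(y_true, y_pred, positive=1):
    """Binary F1: harmonic mean of precision and recall for one class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# One true positive, one false positive, one false negative:
print(f1_score([1, 1, 0, 0], [1, 0, 1, 0]))  # → 0.5
```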

### Framework versions

- Transformers 4.33.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3