
t5-small-finetuned-feedback

This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set (a hedged inference sketch follows the metrics):

  • Loss: 1.6145
  • Rouge1: 51.2809
  • Rouge2: 27.3229
  • RougeL: 49.2287
  • RougeLsum: 49.211
  • Gen Len: 10.1736
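
The card does not document intended uses, but given the t5-small base and the ROUGE scores above, the checkpoint is presumably a short-text summarization model. Below is a minimal inference sketch; the repo id `t5-small-finetuned-feedback` and the `summarize:` task prefix are assumptions, not facts stated by this card.

```python
# Minimal inference sketch. Assumptions: the checkpoint is loadable under the
# id/path "t5-small-finetuned-feedback" and expects the standard T5
# "summarize: " prefix; substitute the real repo id and prefix as needed.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="t5-small-finetuned-feedback",  # hypothetical model id or local path
)

feedback = (
    "The checkout page kept timing out on mobile and I had to retry the "
    "payment three times before the order finally went through."
)

result = summarizer("summarize: " + feedback, max_length=32, min_length=4)
print(result[0]["summary_text"])
```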

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged trainer sketch follows this list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 12
  • mixed_precision_training: Native AMP
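
The following is a sketch of a `Seq2SeqTrainer` setup that mirrors the hyperparameters listed above. The training data, preprocessing, and metric function are not documented in this card, so those parts are left as placeholders.

```python
# Sketch of training arguments matching the listed hyperparameters
# (Transformers 4.40.1 API). Dataset loading and tokenization are omitted
# because the training data is not documented.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
tokenizer = AutoTokenizer.from_pretrained("t5-small")

args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-feedback",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,              # Adam defaults, as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=12,
    fp16=True,                   # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",
    predict_with_generate=True,  # needed so eval can compute ROUGE on generations
)

# trainer = Seq2SeqTrainer(
#     model=model,
#     args=args,
#     train_dataset=...,         # not documented in this card
#     eval_dataset=...,
#     tokenizer=tokenizer,
#     data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
#     compute_metrics=...,       # e.g. ROUGE; see the sketch after the results table
# )
# trainer.train()
```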

Training results

Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len
--------------|-------|------|-----------------|---------|---------|---------|-----------|--------
No log        | 1.0   | 61   | 2.9832          | 24.9931 | 10.0881 | 21.9651 | 22.0687   | 16.4876
No log        | 2.0   | 122  | 2.1822          | 36.3348 | 17.5969 | 34.3034 | 34.2834   | 12.1653
No log        | 3.0   | 183  | 1.9607          | 43.7295 | 21.5907 | 41.8815 | 41.929    | 10.5372
No log        | 4.0   | 244  | 1.8412          | 48.7074 | 25.1744 | 46.8382 | 46.8399   | 10.405
No log        | 5.0   | 305  | 1.7674          | 50.1972 | 26.4116 | 48.1456 | 48.0538   | 10.2066
No log        | 6.0   | 366  | 1.7195          | 51.0984 | 27.8685 | 48.9483 | 49.0108   | 10.3554
No log        | 7.0   | 427  | 1.6832          | 50.272  | 27.3168 | 48.4083 | 48.4307   | 10.0331
No log        | 8.0   | 488  | 1.6558          | 50.6829 | 27.5132 | 48.6684 | 48.735    | 10.2727
2.363         | 9.0   | 549  | 1.6357          | 50.0286 | 27.0674 | 48.0211 | 48.0783   | 10.1736
2.363         | 10.0  | 610  | 1.6240          | 50.8207 | 26.8345 | 48.6528 | 48.6903   | 10.1983
2.363         | 11.0  | 671  | 1.6166          | 50.9796 | 27.0236 | 48.8888 | 48.8958   | 10.1901
2.363         | 12.0  | 732  | 1.6145          | 51.2809 | 27.3229 | 49.2287 | 49.211    | 10.1736
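
The ROUGE columns above are on a 0-100 scale. As a hedged illustration (the exact evaluation script is not documented), comparable scores can be computed with the `evaluate` library, whose `rouge` metric returns fractions that need scaling by 100:

```python
# Hedged scoring sketch: compute ROUGE the way the table above reports it
# (0-100 scale). Requires: pip install evaluate rouge_score
import evaluate

rouge = evaluate.load("rouge")

predictions = ["checkout keeps timing out on mobile"]      # example model outputs
references = ["mobile checkout times out during payment"]  # example gold summaries

scores = rouge.compute(predictions=predictions, references=references)
# `scores` maps rouge1 / rouge2 / rougeL / rougeLsum to values in [0, 1];
# multiply by 100 to match the table's scale.
print({name: round(value * 100, 4) for name, value in scores.items()})
```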

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.0
  • Tokenizers 0.19.1