---
license: apache-2.0
base_model: t5-small
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: t5-small_finetuned_Informal_text-to-Formal_text
    results: []
---

t5-small_finetuned_Informal_text-to-Formal_text

This model is a fine-tuned version of t5-small on an unspecified dataset (the auto-generated card records it as "None"). It achieves the following results on the evaluation set:

  • Loss: 10.375
  • Bleu: 0.0
  • Gen Len: 0.0
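
A BLEU of 0.0 together with a generation length of 0.0 means the model produced empty outputs at the final evaluation; the training results below show the validation loss plateauing at 10.375 from epoch 4 onward, which suggests training diverged early (a learning rate of 0.01 is unusually high for fine-tuning T5). For reference, these two metrics are typically produced by a compute_metrics hook passed to the Trainer. The sketch below is an assumption modeled on the standard transformers sequence-to-sequence example, not this repository's actual code:

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
metric = evaluate.load("sacrebleu")  # the card only says "bleu"; sacrebleu is an assumption

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Replace the -100 used for loss masking with the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = metric.compute(predictions=decoded_preds,
                            references=[[label] for label in decoded_labels])
    # Gen Len: average count of non-pad tokens in the generated sequences.
    gen_len = np.mean([np.count_nonzero(pred != tokenizer.pad_token_id)
                       for pred in preds])
    return {"bleu": result["score"], "gen_len": gen_len}
```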

Model description

More information needed

Intended uses & limitations

More information needed
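
In the absence of documented usage, here is a minimal inference sketch. The repository id and the task prefix are assumptions (T5 fine-tunes often use a prefix, but this card does not record one), and given the evaluation metrics above the checkpoint is unlikely to produce useful rewrites:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repository id; substitute the actual model path.
model_id = "moanlb/t5-small_finetuned_Informal_text-to-Formal_text"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The "make this formal:" prefix is assumed, not documented in the card.
inputs = tokenizer("make this formal: gonna be late, sorry!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```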

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch of the equivalent Seq2SeqTrainingArguments follows the list:

  • learning_rate: 0.01
  • train_batch_size: 12
  • eval_batch_size: 12
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
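
As referenced above, this is roughly how the listed hyperparameters map onto Seq2SeqTrainingArguments. The output_dir, evaluation_strategy, and predict_with_generate values are assumptions; the Adam betas and epsilon listed above are the transformers defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small_finetuned_Informal_text-to-Formal_text",  # assumed
    learning_rate=1e-2,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumed; the table reports one eval per epoch
    predict_with_generate=True,   # assumed; needed for Bleu/Gen Len during eval
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer,
# so no optimizer-specific arguments are required.
```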

Training results

| Training Loss | Epoch | Step   | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:------:|:---------------:|:------:|:-------:|
| 9.3669        | 1.0   | 5229   | 9.4520          | 0.0023 | 19.0    |
| 10.2293       | 2.0   | 10458  | 10.2588         | 0.1433 | 6.0     |
| 10.3618       | 3.0   | 15687  | 10.3648         | 0.0    | 0.0     |
| 10.375        | 4.0   | 20916  | 10.375          | 0.0    | 0.0     |
| 10.375        | 5.0   | 26145  | 10.375          | 0.0    | 0.0     |
| 10.375        | 6.0   | 31374  | 10.375          | 0.0    | 0.0     |
| 10.375        | 7.0   | 36603  | 10.375          | 0.0    | 0.0     |
| 10.375        | 8.0   | 41832  | 10.375          | 0.0    | 0.0     |
| 10.375        | 9.0   | 47061  | 10.375          | 0.0    | 0.0     |
| 10.375        | 10.0  | 52290  | 10.375          | 0.0    | 0.0     |
| 10.375        | 11.0  | 57519  | 10.375          | 0.0    | 0.0     |
| 10.375        | 12.0  | 62748  | 10.375          | 0.0    | 0.0     |
| 10.375        | 13.0  | 67977  | 10.375          | 0.0    | 0.0     |
| 10.375        | 14.0  | 73206  | 10.375          | 0.0    | 0.0     |
| 10.375        | 15.0  | 78435  | 10.375          | 0.0    | 0.0     |
| 10.375        | 16.0  | 83664  | 10.375          | 0.0    | 0.0     |
| 10.375        | 17.0  | 88893  | 10.375          | 0.0    | 0.0     |
| 10.375        | 18.0  | 94122  | 10.375          | 0.0    | 0.0     |
| 10.375        | 19.0  | 99351  | 10.375          | 0.0    | 0.0     |
| 10.375        | 20.0  | 104580 | 10.375          | 0.0    | 0.0     |
| 10.375        | 21.0  | 109809 | 10.375          | 0.0    | 0.0     |
| 10.375        | 22.0  | 115038 | 10.375          | 0.0    | 0.0     |
| 10.375        | 23.0  | 120267 | 10.375          | 0.0    | 0.0     |
| 10.375        | 24.0  | 125496 | 10.375          | 0.0    | 0.0     |
| 10.375        | 25.0  | 130725 | 10.375          | 0.0    | 0.0     |
| 10.375        | 26.0  | 135954 | 10.375          | 0.0    | 0.0     |
| 10.375        | 27.0  | 141183 | 10.375          | 0.0    | 0.0     |
| 10.375        | 28.0  | 146412 | 10.375          | 0.0    | 0.0     |
| 10.375        | 29.0  | 151641 | 10.375          | 0.0    | 0.0     |
| 10.375        | 30.0  | 156870 | 10.375          | 0.0    | 0.0     |
| 10.375        | 31.0  | 162099 | 10.375          | 0.0    | 0.0     |
| 10.375        | 32.0  | 167328 | 10.375          | 0.0    | 0.0     |
| 10.375        | 33.0  | 172557 | 10.375          | 0.0    | 0.0     |
| 10.375        | 34.0  | 177786 | 10.375          | 0.0    | 0.0     |
| 10.375        | 35.0  | 183015 | 10.375          | 0.0    | 0.0     |
| 10.375        | 36.0  | 188244 | 10.375          | 0.0    | 0.0     |
| 10.375        | 37.0  | 193473 | 10.375          | 0.0    | 0.0     |
| 10.375        | 38.0  | 198702 | 10.375          | 0.0    | 0.0     |
| 10.375        | 39.0  | 203931 | 10.375          | 0.0    | 0.0     |
| 10.375        | 40.0  | 209160 | 10.375          | 0.0    | 0.0     |
| 10.375        | 41.0  | 214389 | 10.375          | 0.0    | 0.0     |
| 10.375        | 42.0  | 219618 | 10.375          | 0.0    | 0.0     |
| 10.375        | 43.0  | 224847 | 10.375          | 0.0    | 0.0     |
| 10.375        | 44.0  | 230076 | 10.375          | 0.0    | 0.0     |
| 10.375        | 45.0  | 235305 | 10.375          | 0.0    | 0.0     |
| 10.375        | 46.0  | 240534 | 10.375          | 0.0    | 0.0     |
| 10.375        | 47.0  | 245763 | 10.375          | 0.0    | 0.0     |
| 10.375        | 48.0  | 250992 | 10.375          | 0.0    | 0.0     |
| 10.375        | 49.0  | 256221 | 10.375          | 0.0    | 0.0     |
| 10.375        | 50.0  | 261450 | 10.375          | 0.0    | 0.0     |
| 10.375        | 51.0  | 266679 | 10.375          | 0.0    | 0.0     |
| 10.375        | 52.0  | 271908 | 10.375          | 0.0    | 0.0     |
| 10.375        | 53.0  | 277137 | 10.375          | 0.0    | 0.0     |
| 10.375        | 54.0  | 282366 | 10.375          | 0.0    | 0.0     |
| 10.375        | 55.0  | 287595 | 10.375          | 0.0    | 0.0     |
| 10.375        | 56.0  | 292824 | 10.375          | 0.0    | 0.0     |
| 10.375        | 57.0  | 298053 | 10.375          | 0.0    | 0.0     |
| 10.375        | 58.0  | 303282 | 10.375          | 0.0    | 0.0     |
| 10.375        | 59.0  | 308511 | 10.375          | 0.0    | 0.0     |
| 10.375        | 60.0  | 313740 | 10.375          | 0.0    | 0.0     |
| 10.375        | 61.0  | 318969 | 10.375          | 0.0    | 0.0     |
| 10.375        | 62.0  | 324198 | 10.375          | 0.0    | 0.0     |
| 10.375        | 63.0  | 329427 | 10.375          | 0.0    | 0.0     |
| 10.375        | 64.0  | 334656 | 10.375          | 0.0    | 0.0     |
| 10.375        | 65.0  | 339885 | 10.375          | 0.0    | 0.0     |
| 10.375        | 66.0  | 345114 | 10.375          | 0.0    | 0.0     |
| 10.375        | 67.0  | 350343 | 10.375          | 0.0    | 0.0     |
| 10.375        | 68.0  | 355572 | 10.375          | 0.0    | 0.0     |
| 10.375        | 69.0  | 360801 | 10.375          | 0.0    | 0.0     |
| 10.375        | 70.0  | 366030 | 10.375          | 0.0    | 0.0     |
| 10.375        | 71.0  | 371259 | 10.375          | 0.0    | 0.0     |
| 10.375        | 72.0  | 376488 | 10.375          | 0.0    | 0.0     |
| 10.375        | 73.0  | 381717 | 10.375          | 0.0    | 0.0     |
| 10.375        | 74.0  | 386946 | 10.375          | 0.0    | 0.0     |
| 10.375        | 75.0  | 392175 | 10.375          | 0.0    | 0.0     |
| 10.375        | 76.0  | 397404 | 10.375          | 0.0    | 0.0     |
| 10.375        | 77.0  | 402633 | 10.375          | 0.0    | 0.0     |
| 10.375        | 78.0  | 407862 | 10.375          | 0.0    | 0.0     |
| 10.375        | 79.0  | 413091 | 10.375          | 0.0    | 0.0     |
| 10.375        | 80.0  | 418320 | 10.375          | 0.0    | 0.0     |
| 10.375        | 81.0  | 423549 | 10.375          | 0.0    | 0.0     |
| 10.375        | 82.0  | 428778 | 10.375          | 0.0    | 0.0     |
| 10.375        | 83.0  | 434007 | 10.375          | 0.0    | 0.0     |
| 10.375        | 84.0  | 439236 | 10.375          | 0.0    | 0.0     |
| 10.375        | 85.0  | 444465 | 10.375          | 0.0    | 0.0     |
| 10.375        | 86.0  | 449694 | 10.375          | 0.0    | 0.0     |
| 10.375        | 87.0  | 454923 | 10.375          | 0.0    | 0.0     |
| 10.375        | 88.0  | 460152 | 10.375          | 0.0    | 0.0     |
| 10.375        | 89.0  | 465381 | 10.375          | 0.0    | 0.0     |
| 10.375        | 90.0  | 470610 | 10.375          | 0.0    | 0.0     |
| 10.375        | 91.0  | 475839 | 10.375          | 0.0    | 0.0     |
| 10.375        | 92.0  | 481068 | 10.375          | 0.0    | 0.0     |
| 10.375        | 93.0  | 486297 | 10.375          | 0.0    | 0.0     |
| 10.375        | 94.0  | 491526 | 10.375          | 0.0    | 0.0     |
| 10.375        | 95.0  | 496755 | 10.375          | 0.0    | 0.0     |
| 10.375        | 96.0  | 501984 | 10.375          | 0.0    | 0.0     |
| 10.375        | 97.0  | 507213 | 10.375          | 0.0    | 0.0     |
| 10.375        | 98.0  | 512442 | 10.375          | 0.0    | 0.0     |
| 10.375        | 99.0  | 517671 | 10.375          | 0.0    | 0.0     |
| 10.375        | 100.0 | 522900 | 10.375          | 0.0    | 0.0     |

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1