mt5-small-finetuned-amazon-en-de

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.5965
  • Rouge1: 19.1764
  • Rouge2: 10.6855
  • RougeL: 18.7602
  • RougeLsum: 18.8956
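The ROUGE scores above measure n-gram overlap between generated and reference summaries. As a simplified sketch of what ROUGE-1 F1 computes (the real `rouge_score` package additionally applies tokenization and stemming, so values will differ slightly):

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Simplified ROUGE-1 F1: unigram overlap between prediction and reference.

    Uses whitespace tokenization only; no stemming, unlike rouge_score.
    """
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    # Count unigrams appearing in both, respecting multiplicity
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

ROUGE-2 is the same computation over bigrams, and ROUGE-L/ROUGE-Lsum use longest-common-subsequence matching instead of exact n-gram counts.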

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 6
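The linear scheduler decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that schedule, assuming no warmup steps (none are listed above) and taking the total optimizer step count from the results table (6 epochs × 1301 steps per epoch):

```python
BASE_LR = 5.5e-5       # learning_rate from the hyperparameters above
TOTAL_STEPS = 7806     # 6 epochs x 1301 optimizer steps per epoch

def linear_lr(step: int, total_steps: int = TOTAL_STEPS,
              base_lr: float = BASE_LR) -> float:
    """Learning rate at a given optimizer step under a linear decay
    schedule with zero warmup (a simplification of Transformers'
    get_linear_schedule_with_warmup)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

Halfway through training (step 3903), the learning rate has fallen to half its initial value.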

Training results

Training Loss  Epoch  Step  Validation Loss  Rouge1   Rouge2   RougeL   RougeLsum
2.9041         1.0    1301  2.6000           17.3749  10.0728  16.9903  17.0336
2.7440         2.0    2602  2.5874           17.7266   9.2481  17.2785  17.3827
2.6641         3.0    3903  2.6001           19.0052  10.6312  18.7604  18.7540
2.6189         4.0    5204  2.6012           18.8340  10.1299  18.4209  18.5351
2.6029         5.0    6505  2.5944           19.3375  10.5370  18.8614  19.0826
2.6086         6.0    7806  2.5965           19.1764  10.6855  18.7602  18.8956
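The evaluation-set figures reported at the top of this card are those of the final epoch (step 7806). Note that validation loss was actually lowest at epoch 2; a quick sketch of selecting a checkpoint by that criterion (epoch/loss pairs copied from the table above):

```python
# (epoch, validation loss) pairs from the training results table
history = [
    (1, 2.6000),
    (2, 2.5874),
    (3, 2.6001),
    (4, 2.6012),
    (5, 2.5944),
    (6, 2.5965),
]

# Pick the epoch with the lowest validation loss
best_epoch, best_loss = min(history, key=lambda row: row[1])
```

In the Transformers `Trainer`, the same behavior is available via `load_best_model_at_end=True` with `metric_for_best_model` set appropriately.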

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1
