t5-base-finetuned-billsum

This model is a fine-tuned version of google-t5/t5-base on the FiscalNote/billsum dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1725
  • Rouge1: 54.1481
  • Rouge2: 33.3953
  • Rougel: 42.8337
  • Rougelsum: 47.5287
  • Gen Len: 116.8581

Model description

More information needed

Intended uses & limitations

More information needed
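
A minimal inference sketch, assuming the standard transformers summarization pipeline and the usual "summarize: " T5 task prefix; the generation lengths below are illustrative, not part of the original card:

```python
from transformers import pipeline

# Load this checkpoint as a summarization pipeline.
summarizer = pipeline("summarization", model="luluw/t5-base-finetuned-billsum")

bill_text = "..."  # replace with the full text of a bill
summary = summarizer(
    "summarize: " + bill_text,  # T5-style task prefix (assumed)
    max_length=200,             # illustrative generation limits
    min_length=40,
)
print(summary[0]["summary_text"])
```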

Training and evaluation data

More information needed
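
For reference, the billsum dataset can be loaded from the Hub as sketched below; the exact splits and preprocessing used for this fine-tune are not documented here:

```python
from datasets import load_dataset

# FiscalNote/billsum ships train, test and ca_test splits with
# "text", "summary" and "title" columns.
billsum = load_dataset("FiscalNote/billsum")
print(billsum)
print(billsum["train"][0]["text"][:500])  # first bill, truncated for display
```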

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 5
  • mixed_precision_training: Native AMP
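
The listed values map onto Seq2SeqTrainingArguments as in the sketch below. This is a hedged reconstruction, not the original training script: output_dir is a hypothetical path, and the Adam settings above correspond to the library's default optimizer configuration.

```python
from transformers import Seq2SeqTrainingArguments

# Hedged reconstruction of the hyperparameters listed above; output_dir and
# predict_with_generate are illustrative, not taken from the original run.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-billsum",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    fp16=True,                     # Native AMP mixed-precision training
    predict_with_generate=True,    # assumed, needed to report ROUGE during evaluation
)
```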

Training results

| Training Loss | Epoch  | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len  |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| 2.5944        | 0.4219 | 500  | 1.2582          | 50.6899 | 31.6418 | 40.2325 | 44.2687   | 111.7541 |
| 1.3588        | 0.8439 | 1000 | 1.1591          | 55.865  | 35.992  | 44.7636 | 49.2805   | 114.3552 |
| 1.275         | 1.2658 | 1500 | 1.1214          | 56.3449 | 37.0781 | 45.604  | 49.9711   | 110.7724 |
| 1.3266        | 1.6878 | 2000 | 1.1791          | 54.4797 | 33.8689 | 43.1813 | 47.8507   | 114.8278 |
| 1.3591        | 2.1097 | 2500 | 1.1725          | 54.243  | 33.5179 | 42.9187 | 47.6231   | 116.4601 |
| 1.3484        | 2.5316 | 3000 | 1.1724          | 54.1433 | 33.3914 | 42.8348 | 47.5267   | 116.7736 |
| 1.3467        | 2.9536 | 3500 | 1.1724          | 54.1359 | 33.3794 | 42.8167 | 47.5153   | 116.7819 |
| 1.3483        | 3.3755 | 4000 | 1.1724          | 54.1446 | 33.3947 | 42.8274 | 47.5313   | 116.8529 |
| 1.342         | 3.7975 | 4500 | 1.1724          | 54.1341 | 33.3888 | 42.8239 | 47.5291   | 116.7957 |
| 1.3475        | 4.2194 | 5000 | 1.1725          | 54.1411 | 33.3931 | 42.8224 | 47.5218   | 116.8229 |
| 1.3542        | 4.6414 | 5500 | 1.1725          | 54.1481 | 33.3953 | 42.8337 | 47.5287   | 116.8581 |
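
The ROUGE columns above are on a 0-100 scale and match the output keys of the evaluate library's rouge metric; a hedged sketch of that computation, with placeholder strings rather than the actual evaluation set:

```python
import evaluate

# Hypothetical prediction/reference pair, only to illustrate the metric call.
rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["this bill amends the tax code to ..."],
    references=["the bill amends the internal revenue code of 1986 to ..."],
    use_stemmer=True,
)
print({name: round(value * 100, 4) for name, value in scores.items()})
```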

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.1.0+cu118
  • Datasets 2.20.0
  • Tokenizers 0.19.1