biobart-finetune

This model is a fine-tuned version of GanjinZero/biobart-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.9186
  • ROUGE-1: 22.5525
  • ROUGE-2: 4.3362
  • ROUGE-L: 15.8156
  • ROUGE-Lsum: 19.1059
  • Gen Len: 42.9050 (average generated sequence length, in tokens)
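
These are the standard ROUGE scores as reported by the Hugging Face `evaluate` library, scaled by 100. A minimal sketch of how scores in this form are computed; the example texts are hypothetical placeholders:

```python
# Minimal sketch: computing ROUGE with the `evaluate` library.
# The example texts below are hypothetical placeholders.
import evaluate

rouge = evaluate.load("rouge")  # requires the `rouge_score` package
predictions = ["the patient was treated with antibiotics"]
references = ["the patient received antibiotic treatment"]

scores = rouge.compute(predictions=predictions, references=references)
# `evaluate` returns fractions in [0, 1]; this card reports them scaled by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```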

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 3
  • mixed_precision_training: Native AMP
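
For reference, a sketch of `Seq2SeqTrainingArguments` mirroring the hyperparameters above. The `output_dir` is a placeholder, and the 100-step evaluation cadence is inferred from the results table below; everything else follows the list:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="biobart-finetune",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=3,
    fp16=True,                       # Native AMP mixed-precision training
    eval_strategy="steps",           # inferred from the 100-step eval cadence
    eval_steps=100,
    predict_with_generate=True,      # needed to compute ROUGE and Gen Len
)
```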

Training results

Training Loss | Epoch  | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len
------------- | ------ | ---- | --------------- | ------- | ------- | ------- | ---------- | -------
6.4594        | 0.2252 | 100  | 6.4329          | 18.9117 | 2.0188  | 11.7200 | 14.0691    | 87.4745
4.9261        | 0.4505 | 200  | 4.3777          | 18.5212 | 2.0711  | 11.6242 | 13.9036    | 93.2281
3.7235        | 0.6757 | 300  | 3.4460          | 17.5924 | 2.6991  | 12.4698 | 13.9427    | 31.1940
3.5392        | 0.9009 | 400  | 3.2915          | 17.2541 | 2.7861  | 12.9556 | 14.3577    | 21.3399
3.3947        | 1.1261 | 500  | 3.1847          | 17.2388 | 2.8528  | 13.0146 | 14.3760    | 19.3028
3.3591        | 1.3514 | 600  | 3.1129          | 17.9652 | 3.0939  | 13.4691 | 14.8890    | 20.4810
3.2893        | 1.5766 | 700  | 3.0270          | 19.5473 | 3.3778  | 14.3209 | 16.0940    | 25.6393
3.2196        | 1.8018 | 800  | 2.9678          | 21.1542 | 3.9248  | 15.1733 | 17.7524    | 32.2133
3.1616        | 2.0270 | 900  | 2.9470          | 22.2155 | 4.3290  | 15.6293 | 18.7960    | 41.2048
3.1339        | 2.2523 | 1000 | 2.9354          | 22.5585 | 4.2939  | 15.5387 | 19.1468    | 47.4577
3.1307        | 2.4775 | 1100 | 2.9255          | 22.6986 | 4.3846  | 15.8222 | 19.2784    | 44.6423
3.1409        | 2.7027 | 1200 | 2.9202          | 22.7305 | 4.3966  | 15.8445 | 19.2561    | 43.9401
3.1098        | 2.9279 | 1300 | 2.9186          | 22.5525 | 4.3362  | 15.8156 | 19.1059    | 42.9050

Framework versions

  • PEFT 0.14.0
  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
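
The PEFT version above suggests this checkpoint is a PEFT adapter on top of GanjinZero/biobart-base rather than a full model. Assuming that, a minimal loading-and-generation sketch (the input text is hypothetical):

```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the base model, then apply the adapter weights on top of it.
base = AutoModelForSeq2SeqLM.from_pretrained("GanjinZero/biobart-base")
model = PeftModel.from_pretrained(base, "pendar02/biobart-finetune")
tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-base")

text = "Example biomedical abstract to summarize."  # hypothetical input
inputs = tokenizer(text, return_tensors="pt")
summary_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```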