---
tags:
- generated_from_trainer
datasets:
- xlsum
model-index:
- name: flan-t5-base-xlsum
  results: []
---

# flan-t5-base-xlsum

This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the xlsum dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4057
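
For intuition, this loss can be read as a token-level perplexity, assuming it is the mean cross-entropy over target tokens (the Trainer's default seq2seq loss):

```python
# Convert mean token-level cross-entropy (in nats) to perplexity.
import math

eval_loss = 0.4057
print(math.exp(eval_loss))  # ~1.50: roughly 1.5-way uncertainty per target token
```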

## Model description

More information needed

## Intended uses & limitations

More information needed
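
A minimal inference sketch is shown below; the Hub repo id and the `summarize:` prompt prefix are assumptions (the prefix follows T5 convention and is not confirmed by this card):

```python
# Minimal summarization sketch. The repo id below is hypothetical;
# substitute the actual path or Hub id of this checkpoint.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "flan-t5-base-xlsum"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "Your article text here..."
# The "summarize:" prefix follows T5 convention; it is an assumption here.
inputs = tokenizer("summarize: " + article, return_tensors="pt",
                   truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```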

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 6
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 5
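
A sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments`; the output directory and the 100-step evaluation cadence (inferred from the results table below) are assumptions, not the author's script:

```python
# Reconstruction of the training configuration above, under stated assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-xlsum",   # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=12,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=5,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
    evaluation_strategy="steps",       # assumed from the per-100-step eval logs
    eval_steps=100,
)
```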

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
0.4372 | 0.05 | 100 | 0.3986 |
0.4257 | 0.09 | 200 | 0.3988 |
0.3988 | 0.14 | 300 | 0.4002 |
0.4148 | 0.18 | 400 | 0.4011 |
0.4156 | 0.23 | 500 | 0.4010 |
0.4102 | 0.28 | 600 | 0.4012 |
0.4198 | 0.32 | 700 | 0.4014 |
0.4085 | 0.37 | 800 | 0.4013 |
0.4199 | 0.42 | 900 | 0.4014 |
0.4143 | 0.46 | 1000 | 0.4008 |
0.4176 | 0.51 | 1100 | 0.4003 |
0.4188 | 0.55 | 1200 | 0.4007 |
0.4151 | 0.6 | 1300 | 0.4005 |
0.4221 | 0.65 | 1400 | 0.3990 |
0.416 | 0.69 | 1500 | 0.4004 |
0.4093 | 0.74 | 1600 | 0.3992 |
0.4111 | 0.79 | 1700 | 0.3995 |
0.4214 | 0.83 | 1800 | 0.3997 |
0.4061 | 0.88 | 1900 | 0.3998 |
0.4307 | 0.92 | 2000 | 0.3999 |
0.4301 | 0.97 | 2100 | 0.3994 |
0.4049 | 1.02 | 2200 | 0.4006 |
0.386 | 1.06 | 2300 | 0.4008 |
0.3948 | 1.11 | 2400 | 0.4015 |
0.3909 | 1.16 | 2500 | 0.4013 |
0.3852 | 1.2 | 2600 | 0.4005 |
0.3927 | 1.25 | 2700 | 0.4011 |
0.3973 | 1.29 | 2800 | 0.4021 |
0.3895 | 1.34 | 2900 | 0.4014 |
0.386 | 1.39 | 3000 | 0.4006 |
0.4033 | 1.43 | 3100 | 0.4013 |
0.3931 | 1.48 | 3200 | 0.4009 |
0.4035 | 1.53 | 3300 | 0.4003 |
0.4073 | 1.57 | 3400 | 0.4003 |
0.3914 | 1.62 | 3500 | 0.4001 |
0.3875 | 1.66 | 3600 | 0.4007 |
0.4051 | 1.71 | 3700 | 0.4007 |
0.3878 | 1.76 | 3800 | 0.4016 |
0.3891 | 1.8 | 3900 | 0.4005 |
0.3916 | 1.85 | 4000 | 0.4014 |
0.4147 | 1.9 | 4100 | 0.3999 |
0.4037 | 1.94 | 4200 | 0.3994 |
0.4137 | 1.99 | 4300 | 0.3992 |
0.3811 | 2.03 | 4400 | 0.4028 |
0.3702 | 2.08 | 4500 | 0.4030 |
0.3607 | 2.13 | 4600 | 0.4031 |
0.3705 | 2.17 | 4700 | 0.4030 |
0.3771 | 2.22 | 4800 | 0.4030 |
0.3643 | 2.27 | 4900 | 0.4026 |
0.3933 | 2.31 | 5000 | 0.4030 |
0.3948 | 2.36 | 5100 | 0.4024 |
0.3772 | 2.4 | 5200 | 0.4023 |
0.3791 | 2.45 | 5300 | 0.4036 |
0.3705 | 2.5 | 5400 | 0.4036 |
0.3806 | 2.54 | 5500 | 0.4035 |
0.377 | 2.59 | 5600 | 0.4026 |
0.3768 | 2.64 | 5700 | 0.4020 |
0.3765 | 2.68 | 5800 | 0.4031 |
0.3819 | 2.73 | 5900 | 0.4029 |
0.3715 | 2.77 | 6000 | 0.4022 |
0.3808 | 2.82 | 6100 | 0.4014 |
0.3905 | 2.87 | 6200 | 0.4016 |
0.3905 | 2.91 | 6300 | 0.4018 |
0.3798 | 2.96 | 6400 | 0.4007 |
0.3705 | 3.01 | 6500 | 0.4013 |
0.376 | 3.05 | 6600 | 0.4042 |
0.3599 | 3.1 | 6700 | 0.4048 |
0.3642 | 3.14 | 6800 | 0.4044 |
0.368 | 3.19 | 6900 | 0.4055 |
0.3709 | 3.24 | 7000 | 0.4051 |
0.3594 | 3.28 | 7100 | 0.4046 |
0.3723 | 3.33 | 7200 | 0.4045 |
0.3564 | 3.37 | 7300 | 0.4051 |
0.3695 | 3.42 | 7400 | 0.4040 |
0.354 | 3.47 | 7500 | 0.4038 |
0.3695 | 3.51 | 7600 | 0.4040 |
0.3769 | 3.56 | 7700 | 0.4040 |
0.361 | 3.61 | 7800 | 0.4044 |
0.3727 | 3.65 | 7900 | 0.4035 |
0.3591 | 3.7 | 8000 | 0.4042 |
0.3695 | 3.74 | 8100 | 0.4036 |
0.3747 | 3.79 | 8200 | 0.4043 |
0.3562 | 3.84 | 8300 | 0.4038 |
0.3512 | 3.88 | 8400 | 0.4037 |
0.3647 | 3.93 | 8500 | 0.4038 |
0.3657 | 3.98 | 8600 | 0.4041 |
0.3534 | 4.02 | 8700 | 0.4042 |
0.3517 | 4.07 | 8800 | 0.4052 |
0.3483 | 4.11 | 8900 | 0.4052 |
0.3514 | 4.16 | 9000 | 0.4056 |
0.3544 | 4.21 | 9100 | 0.4056 |
0.3599 | 4.25 | 9200 | 0.4054 |
0.3559 | 4.3 | 9300 | 0.4056 |
0.3738 | 4.35 | 9400 | 0.4056 |
0.3572 | 4.39 | 9500 | 0.4056 |
0.3444 | 4.44 | 9600 | 0.4056 |
0.3555 | 4.48 | 9700 | 0.4058 |
0.3583 | 4.53 | 9800 | 0.4059 |
0.3746 | 4.58 | 9900 | 0.4057 |
0.3496 | 4.62 | 10000 | 0.4059 |
0.3625 | 4.67 | 10100 | 0.4059 |
0.3529 | 4.72 | 10200 | 0.4058 |
0.3584 | 4.76 | 10300 | 0.4055 |
0.3503 | 4.81 | 10400 | 0.4056 |
0.3681 | 4.85 | 10500 | 0.4057 |
0.3542 | 4.9 | 10600 | 0.4057 |
0.3539 | 4.95 | 10700 | 0.4057 |
0.3591 | 4.99 | 10800 | 0.4057 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
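
A quick way to check a local environment against these versions (a convenience sketch, not part of the original card):

```python
# Print installed versions to compare with the list above.
import datasets
import tokenizers
import torch
import transformers

for pkg in (transformers, torch, datasets, tokenizers):
    print(pkg.__name__, pkg.__version__)
```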