# MRPC
This model is a fine-tuned version of google-t5/t5-base on the GLUE MRPC dataset. It achieves the following results on the evaluation set:
- Loss: 0.5629
- Accuracy: 0.8971
- F1: 0.9268
- Combined Score: 0.9119
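Because the base model is T5, the checkpoint is text-to-text: paraphrase detection is framed as generating a label word rather than predicting a class index. Below is a minimal usage sketch; the `mrpc sentence1: ... sentence2: ...` prefix and the `equivalent`/`not_equivalent` target words follow the original T5 GLUE setup and are assumptions, since this card does not state the exact input format. If the checkpoint was instead fine-tuned with a classification head, load it with `AutoModelForSequenceClassification` instead.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "du33169/t5-base-finetuned-GLUE-MRPC"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Two sentences to test for semantic equivalence (the MRPC task).
sentence1 = "The company reported strong quarterly earnings on Tuesday."
sentence2 = "On Tuesday, the company said its quarterly earnings were strong."

# Assumed input format: the standard T5 task prefix for GLUE MRPC.
text = f"mrpc sentence1: {sentence1} sentence2: {sentence2}"
inputs = tokenizer(text, return_tensors="pt")

# The model generates the label as text: "equivalent" or "not_equivalent".
output_ids = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```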
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0
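These settings map directly onto `transformers.TrainingArguments`, as sketched below. The `output_dir` and the per-epoch evaluation strategy are assumptions (the results table reports one evaluation per epoch); the Adam betas/epsilon and linear scheduler listed above are the library defaults, written out explicitly here.

```python
from transformers import TrainingArguments

# A sketch of how the hyperparameters above map onto TrainingArguments.
# output_dir and eval_strategy are assumptions, not taken from this card.
training_args = TrainingArguments(
    output_dir="t5-base-finetuned-GLUE-MRPC",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,               # Adam defaults, spelled out above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",        # assumed: the table reports one evaluation per epoch
)
```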
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| No log        | 1.0   | 115  | 0.5476          | 0.7108   | 0.8234 | 0.7671         |
| No log        | 2.0   | 230  | 0.3523          | 0.8701   | 0.9100 | 0.8901         |
| No log        | 3.0   | 345  | 0.3624          | 0.8725   | 0.9122 | 0.8924         |
| No log        | 4.0   | 460  | 0.3646          | 0.8775   | 0.9123 | 0.8949         |
| 0.3744        | 5.0   | 575  | 0.4054          | 0.8946   | 0.9252 | 0.9099         |
| 0.3744        | 6.0   | 690  | 0.4624          | 0.8897   | 0.9217 | 0.9057         |
| 0.3744        | 7.0   | 805  | 0.5530          | 0.8873   | 0.9212 | 0.9042         |
| 0.3744        | 8.0   | 920  | 0.5405          | 0.8897   | 0.9220 | 0.9059         |
| 0.0877        | 9.0   | 1035 | 0.5629          | 0.8971   | 0.9268 | 0.9119         |
| 0.0877        | 10.0  | 1150 | 0.5856          | 0.8922   | 0.9241 | 0.9081         |
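The combined score appears to be the unweighted mean of accuracy and F1 (consistent with how the Transformers GLUE example script computes `combined_score`); this is inferred from the numbers above rather than stated in the card. For example, for the epoch-5 row:

```python
# Assumed relationship (not stated in the card): combined score = mean(accuracy, F1).
# Checking it against the epoch-5 row of the table above:
accuracy, f1 = 0.8946, 0.9252
print(f"{(accuracy + f1) / 2:.4f}")  # 0.9099 -- matches the Combined Score column
```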
### Framework versions
- Transformers 4.43.3
- Pytorch 1.11.0+cu113
- Datasets 2.20.0
- Tokenizers 0.19.1