---
library_name: transformers
license: apache-2.0
base_model: google-t5/t5-small
tags:
- translation
- generated_from_trainer
metrics:
- bleu
model-index:
- name: t5-small-finetuned-english-to-hausa
  results: []
---

# t5-small-finetuned-english-to-hausa

This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unspecified English–Hausa dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6851
- Bleu: 71.9442
- Gen Len: 14.3679

## Model description

A T5-small checkpoint fine-tuned for English-to-Hausa machine translation, evaluated with BLEU. Further details have not been provided.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0008
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
| 1.2594        | 1.0   | 1497  | 0.8236          | 59.41   | 14.2172 |
| 0.7848        | 2.0   | 2994  | 0.6581          | 64.4839 | 14.219  |
| 0.6172        | 3.0   | 4491  | 0.5897          | 66.4564 | 14.2357 |
| 0.5151        | 4.0   | 5988  | 0.5619          | 68.0986 | 14.4905 |
| 0.4457        | 5.0   | 7485  | 0.5477          | 69.2175 | 14.4141 |
| 0.3938        | 6.0   | 8982  | 0.5413          | 70.0663 | 14.4059 |
| 0.3555        | 7.0   | 10479 | 0.5338          | 70.1734 | 14.4734 |
| 0.3154        | 8.0   | 11976 | 0.5485          | 70.3692 | 14.3035 |
| 0.2837        | 9.0   | 13473 | 0.5454          | 70.7837 | 14.4556 |
| 0.2507        | 10.0  | 14970 | 0.5616          | 70.976  | 14.3807 |
| 0.2265        | 11.0  | 16467 | 0.5728          | 71.2008 | 14.3692 |
| 0.2041        | 12.0  | 17964 | 0.5808          | 71.4766 | 14.362  |
| 0.1848        | 13.0  | 19461 | 0.5981          | 71.3804 | 14.3114 |
| 0.1715        | 14.0  | 20958 | 0.6122          | 71.43   | 14.4295 |
| 0.1547        | 15.0  | 22455 | 0.6309          | 71.753  | 14.351  |
| 0.1417        | 16.0  | 23952 | 0.6411          | 71.7608 | 14.3513 |
| 0.1267        | 17.0  | 25449 | 0.6612          | 71.93   | 14.4243 |
| 0.1208        | 18.0  | 26946 | 0.6662          | 71.8591 | 14.3486 |
| 0.1076        | 19.0  | 28443 | 0.6799          | 72.0417 | 14.3862 |
| 0.1046        | 20.0  | 29940 | 0.6851          | 71.9442 | 14.3679 |

### Framework versions

- Transformers 4.44.2
- PyTorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
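
## Training configuration sketch

The hyperparameters listed above match a standard `Seq2SeqTrainer` setup from Transformers. The sketch below is a hedged reconstruction of such a configuration, not the authors' actual script; the dataset, tokenization, metric function, and output directory are placeholders.

```python
# Hedged reconstruction of the training setup implied by the hyperparameters above.
# Dataset loading, tokenization, the metric function, and the output directory are placeholders.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_model = "google-t5/t5-small"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSeq2SeqLM.from_pretrained(base_model)

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-english-to-hausa",  # placeholder
    learning_rate=8e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                    # "Native AMP" mixed precision
    eval_strategy="epoch",        # per-epoch validation, matching the results table
    predict_with_generate=True,   # required to compute BLEU / Gen Len during evaluation
    # The Adam betas (0.9, 0.999) and epsilon 1e-08 listed above are the optimizer defaults.
)

# trainer = Seq2SeqTrainer(
#     model=model,
#     args=training_args,
#     train_dataset=tokenized_train,  # placeholder: tokenized English-to-Hausa pairs
#     eval_dataset=tokenized_eval,    # placeholder
#     data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
#     tokenizer=tokenizer,
#     compute_metrics=compute_bleu,   # placeholder: BLEU metric function
# )
# trainer.train()
```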
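
## Example usage

A minimal inference sketch, assuming the fine-tuned checkpoint is available locally or on the Hub under a path like `t5-small-finetuned-english-to-hausa` (the exact repository id is not given in this card). The `translate English to Hausa:` prefix is a common T5 convention and an assumption here, since the prefix used during fine-tuning is not documented.

```python
# Minimal inference sketch; the checkpoint path below is a placeholder,
# not a confirmed Hub repository id.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "t5-small-finetuned-english-to-hausa"  # placeholder path / repo id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# T5 is a text-to-text model; this task prefix is a common convention, but the
# prefix actually used during fine-tuning is not documented in this card.
text = "translate English to Hausa: How are you today?"
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```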