---
license: apache-2.0
base_model: openai/whisper-base
tags:
- generated_from_trainer
metrics:
- bleu
- wer
- chrf
model-index:
- name: Whisper Base GA-EN Speech Translation
  results: []
datasets:
- ymoslem/IWSLT2023-GA-EN
- ymoslem/FLEURS-GA-EN
language:
- ga
- en
library_name: transformers
---

# Whisper Base GA-EN Speech Translation

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) for Irish-to-English speech translation, trained on the [ymoslem/IWSLT2023-GA-EN](https://huggingface.co/datasets/ymoslem/IWSLT2023-GA-EN) and [ymoslem/FLEURS-GA-EN](https://huggingface.co/datasets/ymoslem/FLEURS-GA-EN) datasets. The best model (this version) is at checkpoint 1000, epoch 2.54, and it achieves the following results on the evaluation set:
- Loss: 1.9005
- BLEU: 21.83
- chrF: 37.13
- WER: 80.4593

## Model description

A Whisper Base encoder-decoder model fine-tuned end-to-end for speech translation: it takes spoken Irish (ga) audio as input and generates English (en) text directly. A minimal inference sketch is included in the "How to use" section at the end of this card.

## Intended uses & limitations

The model is intended for translating spoken Irish into English text. Given the evaluation WER of 80.46 and BLEU of 21.83, it is suited to research and experimentation rather than production use.

## Training and evaluation data

Training data combines the [ymoslem/IWSLT2023-GA-EN](https://huggingface.co/datasets/ymoslem/IWSLT2023-GA-EN) and [ymoslem/FLEURS-GA-EN](https://huggingface.co/datasets/ymoslem/FLEURS-GA-EN) datasets (FLEURS was added in experiment v1.2).

## Training procedure

### Experiment

- Data (v1.1: IWSLT2023-GA-EN; v1.2: + FLEURS-GA-EN)

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch with the same values follows the framework versions below):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.03
- training_steps: 1000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | BLEU  | chrF  | WER      |
|:-------------:|:-----:|:----:|:---------------:|:-----:|:-----:|:--------:|
| 2.6826        | 0.25  | 100  | 2.0993          | 7.23  | 22.29 | 100.7654 |
| 2.1287        | 0.51  | 200  | 1.9451          | 9.37  | 27.74 | 125.9343 |
| 1.8482        | 0.76  | 300  | 1.8356          | 13.11 | 30.65 | 103.5570 |
| 1.2977        | 1.02  | 400  | 1.8643          | 10.56 | 30.86 | 128.5907 |
| 0.8068        | 1.27  | 500  | 1.8658          | 18.23 | 35.17 | 82.6204  |
| 0.7257        | 1.52  | 600  | 1.8493          | 17.81 | 34.13 | 90.7249  |
| 0.6202        | 1.78  | 700  | 1.8312          | 17.6  | 35.19 | 92.2107  |
| 0.4348        | 2.03  | 800  | 1.8771          | 17.9  | 35.66 | 91.9856  |
| 0.2566        | 2.28  | 900  | 1.9088          | 20.14 | 36.79 | 81.4498  |
| 0.2301        | 2.54  | 1000 | 1.9005          | 21.83 | 37.13 | 80.4593  |

BLEU and chrF are reported on a 0–100 scale; WER is a percentage and can exceed 100 when hypotheses contain many insertions relative to the references, as in the early checkpoints above.

### Framework versions

- Transformers 4.39.3
- PyTorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
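
### Reproducing the training arguments

A sketch of the hyperparameters above expressed as `Seq2SeqTrainingArguments`. This is a reconstruction under stated assumptions, not the original training script: dataset preparation, the model, and the data collator are omitted, and `output_dir` plus the 100-step save interval are hypothetical (inferred from the evaluation interval in the results table).

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-base-ga2en",   # hypothetical output path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.03,
    max_steps=1000,                    # training_steps: 1000
    fp16=True,                         # Native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=100,                    # matches the 100-step interval in the results table
    save_steps=100,                    # assumed from the same interval
    predict_with_generate=True,        # needed to compute BLEU/chrF/WER during eval
)
```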
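
### Computing the evaluation metrics

A sketch of computing the reported metrics with the Hugging Face `evaluate` library, assuming `sacrebleu` for BLEU and chrF (the exact scorers used in training are not documented in this card); the prediction and reference strings are hypothetical.

```python
import evaluate

bleu = evaluate.load("sacrebleu")
chrf = evaluate.load("chrf")
wer = evaluate.load("wer")

predictions = ["the weather is fine today"]   # hypothetical model output
references = ["the weather is nice today"]    # hypothetical gold translation

# sacrebleu and chrf expect one list of references per prediction.
print(bleu.compute(predictions=predictions, references=[[r] for r in references])["score"])
print(chrf.compute(predictions=predictions, references=[[r] for r in references])["score"])
print(wer.compute(predictions=predictions, references=references))
```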
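
## How to use

A minimal inference sketch using the standard `transformers` Whisper API; this is not the authors' code. The repo id and audio file name are placeholders, and the fine-tuned checkpoint is assumed to emit English text directly, so no task or language forcing is shown.

```python
import torchaudio
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "your-username/whisper-base-ga2en"  # hypothetical: substitute this card's repo id
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Whisper expects 16 kHz mono audio.
waveform, sample_rate = torchaudio.load("speech_ga.wav")  # hypothetical input file
if sample_rate != 16000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16000)

# Log-Mel input features -> autoregressive decoding into English text.
inputs = processor(waveform.mean(dim=0).numpy(),  # downmix to mono
                   sampling_rate=16000, return_tensors="pt")
generated_ids = model.generate(inputs.input_features, max_new_tokens=128)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```

An equivalent one-liner is `pipeline("automatic-speech-recognition", model=model_id)`, which also handles audio loading and resampling.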