---
language:
  - he
license: apache-2.0
base_model: openai/whisper-base
tags:
  - hf-asr-leaderboard
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: he-cantillation
    results: []
---

# he-cantillation

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2018
- Wer: 13.4397
- Avg Precision Exact: 0.8729
- Avg Recall Exact: 0.8750
- Avg F1 Exact: 0.8735
- Avg Precision Letter Shift: 0.8927
- Avg Recall Letter Shift: 0.8951
- Avg F1 Letter Shift: 0.8935
- Avg Precision Word Level: 0.8956
- Avg Recall Word Level: 0.8980
- Avg F1 Word Level: 0.8964
- Avg Precision Word Shift: 0.9607
- Avg Recall Word Shift: 0.9633
- Avg F1 Word Shift: 0.9615
- Precision Median Exact: 0.9375
- Recall Median Exact: 1.0
- F1 Median Exact: 0.9565
- Precision Max Exact: 1.0
- Recall Max Exact: 1.0
- F1 Max Exact: 1.0
- Precision Min Exact: 0.0
- Recall Min Exact: 0.0
- F1 Min Exact: 0.0
- Precision Min Letter Shift: 0.0
- Recall Min Letter Shift: 0.0
- F1 Min Letter Shift: 0.0
- Precision Min Word Level: 0.0
- Recall Min Word Level: 0.0
- F1 Min Word Level: 0.0
- Precision Min Word Shift: 0.1429
- Recall Min Word Shift: 0.125
- F1 Min Word Shift: 0.1333

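The card does not include usage code, so here is a minimal transcription sketch. It assumes the checkpoint is published under the hub ID `cantillation/he-cantillation` (inferred from the model name above, not stated in the card) and that a local audio file is available.

```python
# Minimal inference sketch. Assumptions: the checkpoint lives at
# "cantillation/he-cantillation" and "example_cantillation.wav" exists;
# substitute the actual hub ID and audio path.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="cantillation/he-cantillation",  # hypothetical hub ID
    chunk_length_s=30,                     # Whisper operates on 30 s windows
)

# Whisper is multilingual; pinning the language avoids misdetection on short clips.
result = asr(
    "example_cantillation.wav",            # hypothetical local audio file
    generate_kwargs={"language": "hebrew", "task": "transcribe"},
)
print(result["text"])
```
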
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 300000
- mixed_precision_training: Native AMP
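
For reproduction, the listed hyperparameters map onto `Seq2SeqTrainingArguments` roughly as sketched below. The `output_dir` is a placeholder, the Adam betas/epsilon above are the `transformers` defaults (so they need no explicit arguments), and eval/save cadence is not recorded in the card, so it is omitted.

```python
# Hedged sketch of the recorded hyperparameters; values not listed in the
# card (output_dir, logging/eval cadence) are placeholders, not the authors'.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./he-cantillation",   # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=300_000,
    fp16=True,                        # "Native AMP" mixed precision
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```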

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0.0001 | 1 | 9.3041 | 101.5608 | 0.0004 | 0.0014 | 0.0006 | 0.0097 | 0.0097 | 0.0095 | 0.0046 | 0.0245 | 0.0076 | 0.0745 | 0.0770 | 0.0745 | 0.0 | 0.0 | 0.0 | 0.125 | 0.5 | 0.2 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0912 | 0.5167 | 10000 | 0.1643 | 24.2519 | 0.7744 | 0.7794 | 0.7763 | 0.8031 | 0.8084 | 0.8050 | 0.8086 | 0.8139 | 0.8105 | 0.9118 | 0.9194 | 0.9147 | 0.8667 | 0.875 | 0.875 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.031 | 1.0334 | 20000 | 0.1517 | 19.9125 | 0.8108 | 0.8070 | 0.8082 | 0.8351 | 0.8312 | 0.8325 | 0.8397 | 0.8368 | 0.8376 | 0.9358 | 0.9337 | 0.9339 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.1111 | 0.1 |
| 0.0202 | 1.5501 | 30000 | 0.1538 | 19.0346 | 0.8207 | 0.8188 | 0.8191 | 0.8440 | 0.8422 | 0.8425 | 0.8484 | 0.8461 | 0.8466 | 0.9362 | 0.9355 | 0.9350 | 0.9167 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1 | 0.0833 | 0.0909 |
| 0.0092 | 2.0668 | 40000 | 0.1561 | 17.4297 | 0.8346 | 0.8349 | 0.8342 | 0.8570 | 0.8575 | 0.8567 | 0.8610 | 0.8615 | 0.8607 | 0.9425 | 0.9438 | 0.9424 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1667 | 0.1429 |
| 0.0087 | 2.5834 | 50000 | 0.1619 | 16.5864 | 0.8456 | 0.8472 | 0.8459 | 0.8688 | 0.8706 | 0.8692 | 0.8728 | 0.8746 | 0.8732 | 0.9476 | 0.9502 | 0.9482 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0204 | 3.1001 | 60000 | 0.1678 | 16.4165 | 0.8450 | 0.8476 | 0.8458 | 0.8670 | 0.8697 | 0.8678 | 0.8704 | 0.8731 | 0.8712 | 0.9463 | 0.9505 | 0.9478 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0714 | 0.0741 |
| 0.01 | 3.6168 | 70000 | 0.1698 | 16.5707 | 0.8442 | 0.8487 | 0.8460 | 0.8658 | 0.8706 | 0.8676 | 0.8693 | 0.8741 | 0.8712 | 0.9425 | 0.9481 | 0.9446 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0028 | 4.1335 | 80000 | 0.1773 | 15.8249 | 0.8503 | 0.8547 | 0.8520 | 0.8716 | 0.8762 | 0.8734 | 0.8750 | 0.8798 | 0.8769 | 0.9475 | 0.9535 | 0.9498 | 0.9231 | 0.9286 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.0909 | 0.1176 |
| 0.0027 | 4.6502 | 90000 | 0.1759 | 16.0137 | 0.8473 | 0.8495 | 0.8479 | 0.8687 | 0.8711 | 0.8694 | 0.8728 | 0.8753 | 0.8735 | 0.9477 | 0.9516 | 0.9491 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0714 | 0.0741 |
| 0.0027 | 5.1669 | 100000 | 0.1789 | 15.5543 | 0.8518 | 0.8527 | 0.8517 | 0.8725 | 0.8739 | 0.8727 | 0.8765 | 0.8774 | 0.8764 | 0.9514 | 0.9529 | 0.9515 | 0.9286 | 0.9286 | 0.9333 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0031 | 5.6836 | 110000 | 0.1830 | 15.4064 | 0.8546 | 0.8553 | 0.8545 | 0.8758 | 0.8766 | 0.8757 | 0.8797 | 0.8804 | 0.8795 | 0.9528 | 0.9537 | 0.9527 | 0.9286 | 0.9286 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0008 | 6.2003 | 120000 | 0.1847 | 15.0980 | 0.8598 | 0.8610 | 0.8599 | 0.8810 | 0.8823 | 0.8812 | 0.8844 | 0.8857 | 0.8845 | 0.9538 | 0.9571 | 0.9549 | 0.9286 | 0.9286 | 0.9333 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0011 | 6.7170 | 130000 | 0.1879 | 15.3151 | 0.8550 | 0.8584 | 0.8562 | 0.8758 | 0.8793 | 0.8770 | 0.8790 | 0.8825 | 0.8802 | 0.9525 | 0.9562 | 0.9537 | 0.9286 | 0.9286 | 0.9333 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0625 | 0.125 | 0.0870 |
| 0.0022 | 7.2336 | 140000 | 0.1953 | 15.4253 | 0.8545 | 0.8597 | 0.8566 | 0.8768 | 0.8823 | 0.8790 | 0.8801 | 0.8856 | 0.8823 | 0.9511 | 0.9562 | 0.9530 | 0.9231 | 0.9286 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0005 | 7.7503 | 150000 | 0.1929 | 15.0791 | 0.8608 | 0.8615 | 0.8607 | 0.8813 | 0.8823 | 0.8813 | 0.8849 | 0.8859 | 0.8849 | 0.9546 | 0.9564 | 0.9549 | 0.9286 | 0.9286 | 0.9333 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0011 | 8.2670 | 160000 | 0.1946 | 15.0036 | 0.8579 | 0.8609 | 0.8589 | 0.8792 | 0.8824 | 0.8803 | 0.8827 | 0.8858 | 0.8838 | 0.9553 | 0.9597 | 0.9569 | 0.9286 | 0.9333 | 0.9412 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0014 | 8.7837 | 170000 | 0.1922 | 14.6732 | 0.8618 | 0.8625 | 0.8616 | 0.8827 | 0.8836 | 0.8826 | 0.8862 | 0.8871 | 0.8861 | 0.9560 | 0.9582 | 0.9565 | 0.9333 | 0.9333 | 0.9474 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2222 | 0.1667 | 0.1905 |
| 0.0003 | 9.3004 | 180000 | 0.1957 | 14.5505 | 0.8620 | 0.8657 | 0.8634 | 0.8824 | 0.8863 | 0.8839 | 0.8858 | 0.8896 | 0.8872 | 0.9546 | 0.9586 | 0.9560 | 0.9286 | 0.9333 | 0.9474 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0005 | 9.8171 | 190000 | 0.1965 | 14.7928 | 0.8615 | 0.8637 | 0.8622 | 0.8828 | 0.8853 | 0.8836 | 0.8858 | 0.8883 | 0.8866 | 0.9539 | 0.9578 | 0.9552 | 0.9286 | 0.9333 | 0.9412 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.125 | 0.125 |
| 0.0007 | 10.3338 | 200000 | 0.1991 | 14.3208 | 0.8643 | 0.8655 | 0.8644 | 0.8854 | 0.8868 | 0.8856 | 0.8884 | 0.8899 | 0.8887 | 0.9576 | 0.9600 | 0.9582 | 0.9333 | 0.9333 | 0.9474 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0001 | 10.8505 | 210000 | 0.1999 | 14.4466 | 0.8598 | 0.8636 | 0.8612 | 0.8801 | 0.8841 | 0.8816 | 0.8836 | 0.8876 | 0.8851 | 0.9553 | 0.9597 | 0.9569 | 0.9286 | 0.9333 | 0.9474 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0001 | 11.3672 | 220000 | 0.2056 | 14.8148 | 0.8566 | 0.8592 | 0.8574 | 0.8777 | 0.8805 | 0.8786 | 0.8810 | 0.8841 | 0.8821 | 0.9523 | 0.9577 | 0.9544 | 0.9286 | 0.9286 | 0.9333 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0013 | 11.8838 | 230000 | 0.1986 | 13.9054 | 0.8690 | 0.8695 | 0.8688 | 0.8891 | 0.8896 | 0.8889 | 0.8921 | 0.8928 | 0.8920 | 0.9580 | 0.9599 | 0.9584 | 0.9333 | 0.9333 | 0.9524 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 12.4005 | 240000 | 0.1983 | 13.7670 | 0.8722 | 0.8725 | 0.8719 | 0.8918 | 0.8922 | 0.8916 | 0.8949 | 0.8954 | 0.8947 | 0.9604 | 0.9616 | 0.9604 | 0.9375 | 0.9412 | 0.9565 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 12.9172 | 250000 | 0.1994 | 13.9274 | 0.8673 | 0.8693 | 0.8679 | 0.8865 | 0.8887 | 0.8872 | 0.8898 | 0.8920 | 0.8904 | 0.9583 | 0.9612 | 0.9592 | 0.9333 | 0.9412 | 0.9565 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 13.4339 | 260000 | 0.2031 | 13.8771 | 0.8669 | 0.8688 | 0.8674 | 0.8864 | 0.8884 | 0.8869 | 0.8894 | 0.8914 | 0.8899 | 0.9582 | 0.9609 | 0.9590 | 0.9333 | 0.9412 | 0.9524 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 13.9506 | 270000 | 0.2013 | 13.7323 | 0.8723 | 0.8727 | 0.8721 | 0.8921 | 0.8926 | 0.8919 | 0.8951 | 0.8959 | 0.8950 | 0.9594 | 0.9612 | 0.9598 | 0.9375 | 0.9375 | 0.9565 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 14.4673 | 280000 | 0.2016 | 13.5247 | 0.8713 | 0.8733 | 0.8719 | 0.8910 | 0.8933 | 0.8917 | 0.8940 | 0.8962 | 0.8947 | 0.9604 | 0.9628 | 0.9611 | 0.9375 | 1.0 | 0.9565 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 14.9840 | 290000 | 0.2016 | 13.4837 | 0.8731 | 0.8751 | 0.8737 | 0.8930 | 0.8951 | 0.8936 | 0.8957 | 0.8979 | 0.8964 | 0.9602 | 0.9628 | 0.9609 | 0.9375 | 1.0 | 0.9565 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 15.5007 | 300000 | 0.2018 | 13.4397 | 0.8729 | 0.8750 | 0.8735 | 0.8927 | 0.8951 | 0.8935 | 0.8956 | 0.8980 | 0.8964 | 0.9607 | 0.9633 | 0.9615 | 0.9375 | 1.0 | 0.9565 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
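
The `Wer` column above (like the headline Wer of 13.4397) appears to be expressed as a percentage; note the value above 100 at step 1, which WER permits when insertions dominate. A minimal sketch of computing such a score with the `evaluate` library follows; the card does not show the actual evaluation code, so treat this as an illustration only.

```python
# Hedged sketch: computing a percentage WER like the table's "Wer" column
# with the `evaluate` library (not necessarily the authors' exact method).
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["example transcript"]   # hypothetical model outputs
references = ["example transcript"]    # hypothetical ground-truth texts

# evaluate's WER is a fraction; multiply by 100 for a percentage.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```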

### Framework versions

- Transformers 4.41.2
- Pytorch 2.2.1
- Datasets 2.20.0
- Tokenizers 0.19.1