---
language:
- he
license: apache-2.0
base_model: cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-06-07-2024_20-19
tags:
- hf-asr-leaderboard
- generated_from_trainer
metrics:
- wer
model-index:
- name: he-cantillation
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# he-cantillation
This model is a fine-tuned version of [cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-06-07-2024_20-19](https://huggingface.co/cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-06-07-2024_20-19) on an unknown dataset.
It achieves the following results on the evaluation set (a hedged usage and scoring sketch follows the metric list):
- Loss: 3.9592
- Wer: 100.0
- Avg Precision Exact: 0.0053
- Avg Recall Exact: 0.0482
- Avg F1 Exact: 0.0094
- Avg Precision Letter Shift: 0.0129
- Avg Recall Letter Shift: 0.1183
- Avg F1 Letter Shift: 0.0228
- Avg Precision Word Level: 0.0173
- Avg Recall Word Level: 0.1573
- Avg F1 Word Level: 0.0306
- Avg Precision Word Shift: 0.0334
- Avg Recall Word Shift: 0.2899
- Avg F1 Word Shift: 0.0588
- Precision Median Exact: 0.0
- Recall Median Exact: 0.0
- F1 Median Exact: 0.0
- Precision Max Exact: 0.1667
- Recall Max Exact: 1.0
- F1 Max Exact: 0.25
- Precision Min Exact: 0.0
- Recall Min Exact: 0.0
- F1 Min Exact: 0.0
- Precision Min Letter Shift: 0.0
- Recall Min Letter Shift: 0.0
- F1 Min Letter Shift: 0.0
- Precision Min Word Level: 0.0
- Recall Min Word Level: 0.0
- F1 Min Word Level: 0.0
- Precision Min Word Shift: 0.0
- Recall Min Word Shift: 0.0
- F1 Min Word Shift: 0.0
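
The headline WER of 100.0 means the final checkpoint's hypotheses share essentially no aligned words with the references. As a hedged illustration of how such numbers are typically reproduced for a Whisper-style ASR checkpoint, the sketch below loads the model with the `transformers` ASR pipeline and scores one transcription with the `evaluate` WER metric. The repository id, audio path, and reference string are placeholders, not values taken from this card.

```python
# Hedged sketch: load the checkpoint and score one transcription with WER.
# The repo id, audio path, and reference text below are placeholders.
from transformers import pipeline
import evaluate

asr = pipeline(
    "automatic-speech-recognition",
    model="cantillation/he-cantillation",   # assumed repository id
)

prediction = asr("example.wav")["text"]     # placeholder audio file
reference = "<ground-truth transcription>"  # placeholder reference text

wer = evaluate.load("wer")
score = 100 * wer.compute(predictions=[prediction], references=[reference])
print(f"WER: {score:.2f}")  # reported in this card as a percentage
```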
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 50000
- mixed_precision_training: Native AMP
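
As a reference for reproducing the run, the sketch below maps the list above onto `Seq2SeqTrainingArguments` from `transformers`. This is an assumption about how the trainer was configured (the actual training script is not included in this card); the output directory is a placeholder, and data loading plus the `Seq2SeqTrainer` call are omitted.

```python
# Hedged sketch: the hyperparameters above expressed as Seq2SeqTrainingArguments.
# The output directory is a placeholder; data loading and the Trainer call are omitted.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./he-cantillation",      # placeholder
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=50000,
    fp16=True,                           # "Native AMP" mixed precision
)
```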
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|
| No log | 0.0001 | 1 | 0.2806 | 16.9452 | 0.8385 | 0.8403 | 0.8390 | 0.8609 | 0.8628 | 0.8613 | 0.8657 | 0.8679 | 0.8663 | 0.9468 | 0.9505 | 0.9481 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 6.379 | 0.1033 | 2000 | 1.1953 | 120.7433 | 0.1765 | 0.2759 | 0.1896 | 0.2338 | 0.3541 | 0.2490 | 0.2533 | 0.3945 | 0.2751 | 0.4039 | 0.6029 | 0.4412 | 0.1538 | 0.2 | 0.1667 | 1.0 | 1.0 | 0.9412 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.4428 | 0.2067 | 4000 | 4.7348 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 5.1556 | 0.3100 | 6000 | 4.6051 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4.935 | 0.4134 | 8000 | 4.4172 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4.7728 | 0.5167 | 10000 | 4.1229 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0026 | 0.0005 | 0.0008 | 0.0092 | 0.0015 | 0.0012 | 0.0141 | 0.0022 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.636 | 0.6200 | 12000 | 4.0112 | 100.0378 | 0.0071 | 0.0423 | 0.0116 | 0.0155 | 0.0901 | 0.0251 | 0.0260 | 0.1528 | 0.0422 | 0.0567 | 0.3257 | 0.0922 | 0.0 | 0.0 | 0.0 | 0.2222 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4949 | 0.7234 | 14000 | 4.0080 | 100.0 | 0.0040 | 0.0390 | 0.0072 | 0.0097 | 0.0899 | 0.0171 | 0.0180 | 0.1677 | 0.0317 | 0.0349 | 0.3220 | 0.0615 | 0.0 | 0.0 | 0.0 | 0.1818 | 1.0 | 0.3077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4359 | 0.8267 | 16000 | 4.0252 | 100.0 | 0.0031 | 0.0347 | 0.0057 | 0.0075 | 0.0849 | 0.0138 | 0.0144 | 0.1589 | 0.0262 | 0.0250 | 0.2827 | 0.0457 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.3834 | 0.9300 | 18000 | 4.0642 | 100.0 | 0.0025 | 0.0273 | 0.0045 | 0.0063 | 0.0700 | 0.0115 | 0.0125 | 0.1372 | 0.0228 | 0.0225 | 0.2534 | 0.0412 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.317 | 1.0334 | 20000 | 4.0775 | 100.0 | 0.0017 | 0.0183 | 0.0031 | 0.0043 | 0.0470 | 0.0079 | 0.0080 | 0.0860 | 0.0146 | 0.0150 | 0.1663 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.269 | 1.1367 | 22000 | 4.1070 | 100.0 | 0.0010 | 0.0108 | 0.0018 | 0.0025 | 0.0266 | 0.0046 | 0.0044 | 0.0447 | 0.0079 | 0.0080 | 0.0834 | 0.0144 | 0.0 | 0.0 | 0.0 | 0.125 | 1.0 | 0.2222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2529 | 1.2401 | 24000 | 4.0914 | 100.0 | 0.0008 | 0.0086 | 0.0015 | 0.0021 | 0.0215 | 0.0038 | 0.0033 | 0.0339 | 0.0060 | 0.0064 | 0.0658 | 0.0115 | 0.0 | 0.0 | 0.0 | 0.125 | 1.0 | 0.2222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1986 | 1.3434 | 26000 | 4.0861 | 100.0 | 0.0012 | 0.0131 | 0.0022 | 0.0025 | 0.0255 | 0.0044 | 0.0038 | 0.0404 | 0.0070 | 0.0081 | 0.0865 | 0.0146 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1805 | 1.4467 | 28000 | 4.0716 | 100.0 | 0.0016 | 0.0171 | 0.0029 | 0.0034 | 0.0362 | 0.0062 | 0.0055 | 0.0579 | 0.0099 | 0.0103 | 0.1118 | 0.0188 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1489 | 1.5501 | 30000 | 4.0526 | 100.0 | 0.0027 | 0.0306 | 0.0050 | 0.0065 | 0.0701 | 0.0117 | 0.0096 | 0.1033 | 0.0174 | 0.0174 | 0.1901 | 0.0316 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0979 | 1.6534 | 32000 | 4.0399 | 100.0 | 0.0034 | 0.0389 | 0.0062 | 0.0082 | 0.0924 | 0.0149 | 0.0121 | 0.1331 | 0.0220 | 0.0217 | 0.2389 | 0.0394 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0667 | 1.7567 | 34000 | 4.0224 | 100.0 | 0.0042 | 0.0473 | 0.0076 | 0.0099 | 0.1091 | 0.0179 | 0.0143 | 0.1549 | 0.0259 | 0.0258 | 0.2749 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0602 | 1.8601 | 36000 | 4.0042 | 100.0 | 0.0040 | 0.0466 | 0.0073 | 0.0103 | 0.1182 | 0.0187 | 0.0142 | 0.1606 | 0.0259 | 0.0254 | 0.2818 | 0.0461 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0584 | 1.9634 | 38000 | 3.9922 | 100.0 | 0.0038 | 0.0441 | 0.0069 | 0.0104 | 0.1215 | 0.0190 | 0.0140 | 0.1623 | 0.0256 | 0.0244 | 0.2807 | 0.0446 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0015 | 2.0668 | 40000 | 3.9825 | 100.0 | 0.0042 | 0.0506 | 0.0077 | 0.0111 | 0.1293 | 0.0203 | 0.0149 | 0.1726 | 0.0272 | 0.0254 | 0.2923 | 0.0464 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0073 | 2.1701 | 42000 | 3.9722 | 100.0 | 0.0044 | 0.0508 | 0.0080 | 0.0112 | 0.1285 | 0.0205 | 0.0149 | 0.1704 | 0.0272 | 0.0255 | 0.2891 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0284 | 2.2734 | 44000 | 3.9651 | 100.0 | 0.0047 | 0.0506 | 0.0084 | 0.0116 | 0.1258 | 0.0211 | 0.0156 | 0.1669 | 0.0281 | 0.0278 | 0.2904 | 0.0501 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9935 | 2.3768 | 46000 | 3.9619 | 100.0 | 0.0051 | 0.0507 | 0.0092 | 0.0122 | 0.1230 | 0.0219 | 0.0165 | 0.1640 | 0.0295 | 0.0298 | 0.2889 | 0.0532 | 0.0 | 0.0 | 0.0 | 0.1667 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0324 | 2.4801 | 48000 | 3.9600 | 100.0 | 0.0052 | 0.0485 | 0.0092 | 0.0126 | 0.1197 | 0.0225 | 0.0171 | 0.1597 | 0.0304 | 0.0326 | 0.2909 | 0.0577 | 0.0 | 0.0 | 0.0 | 0.1667 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.003 | 2.5834 | 50000 | 3.9592 | 100.0 | 0.0053 | 0.0482 | 0.0094 | 0.0129 | 0.1183 | 0.0228 | 0.0173 | 0.1573 | 0.0306 | 0.0334 | 0.2899 | 0.0588 | 0.0 | 0.0 | 0.0 | 0.1667 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.2.1
- Datasets 2.20.0
- Tokenizers 0.19.1