---
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2674
- Wer: 8.9178
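
Neither the training data nor the published checkpoint id is stated in this card, so the following is only a minimal usage sketch: the repo id is a placeholder, and passing the language hint through `generate_kwargs` assumes a recent `transformers` release.

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub id of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",
)

# Whisper accepts language/task hints at generation time; "dutch" matches
# the `nl` tag in the card metadata.
result = asr(
    "audio.wav",
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```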

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
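
Since this card was generated by the `Trainer`, the values above most likely came from a `Seq2SeqTrainingArguments` object. Below is a hedged reconstruction, not the authors' actual script: the output directory is a placeholder, the 30-step evaluation interval is inferred from the results table that follows, and dataset loading, the data collator, and `compute_metrics` are omitted.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v2-nl",  # placeholder name
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    evaluation_strategy="steps",
    eval_steps=30,                     # inferred from the results table below
    predict_with_generate=True,        # needed to score WER on generated text
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Trainer's default
# optimizer, so it needs no explicit argument here.
```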

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer (%) |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.5984        | 0.09  | 30   | 0.3391          | 13.4234 |
| 0.3844        | 0.19  | 60   | 0.2936          | 16.0882 |
| 0.3245        | 0.28  | 90   | 0.2801          | 12.7436 |
| 0.2967        | 0.38  | 120  | 0.2602          | 12.8549 |
| 0.2526        | 0.47  | 150  | 0.2604          | 17.7364 |
| 0.2889        | 0.57  | 180  | 0.2466          | 13.2940 |
| 0.2378        | 0.66  | 210  | 0.2506          | 15.9919 |
| 0.237         | 0.76  | 240  | 0.2500          | 17.4176 |
| 0.2769        | 0.85  | 270  | 0.2340          | 15.0956 |
| 0.2579        | 0.95  | 300  | 0.2365          | 13.3482 |
| 0.1979        | 1.04  | 330  | 0.2461          | 15.3333 |
| 0.1336        | 1.14  | 360  | 0.2416          | 13.3331 |
| 0.1415        | 1.23  | 390  | 0.2380          | 14.3918 |
| 0.1307        | 1.33  | 420  | 0.2397          | 11.2879 |
| 0.1489        | 1.42  | 450  | 0.2389          | 11.0954 |
| 0.1311        | 1.52  | 480  | 0.2378          | 14.1783 |
| 0.1256        | 1.61  | 510  | 0.2333          | 12.2895 |
| 0.1283        | 1.71  | 540  | 0.2318          | 10.5901 |
| 0.1418        | 1.8   | 570  | 0.2317          | 14.6084 |
| 0.1346        | 1.9   | 600  | 0.2284          | 12.2564 |
| 0.1357        | 1.99  | 630  | 0.2212          | 10.5029 |
| 0.0641        | 2.09  | 660  | 0.2369          | 11.4894 |
| 0.0587        | 2.18  | 690  | 0.2383          | 9.7690  |
| 0.0585        | 2.28  | 720  | 0.2378          | 11.6037 |
| 0.0601        | 2.37  | 750  | 0.2409          | 11.6609 |
| 0.0645        | 2.47  | 780  | 0.2397          | 10.4397 |
| 0.0648        | 2.56  | 810  | 0.2430          | 10.2984 |
| 0.0616        | 2.66  | 840  | 0.2421          | 10.3946 |
| 0.0668        | 2.75  | 870  | 0.2351          | 13.2489 |
| 0.0553        | 2.85  | 900  | 0.2343          | 10.6563 |
| 0.0576        | 2.94  | 930  | 0.2359          | 10.2262 |
| 0.0468        | 3.04  | 960  | 0.2433          | 10.1329 |
| 0.0253        | 3.13  | 990  | 0.2496          | 10.0638 |
| 0.025         | 3.23  | 1020 | 0.2480          | 11.0864 |
| 0.0232        | 3.32  | 1050 | 0.2550          | 9.9916  |
| 0.0252        | 3.42  | 1080 | 0.2531          | 9.3269  |
| 0.0254        | 3.51  | 1110 | 0.2472          | 9.0381  |
| 0.0225        | 3.61  | 1140 | 0.2549          | 9.2908  |
| 0.0218        | 3.7   | 1170 | 0.2496          | 9.5404  |
| 0.0242        | 3.8   | 1200 | 0.2432          | 9.9284  |
| 0.0223        | 3.89  | 1230 | 0.2462          | 10.8277 |
| 0.0204        | 3.99  | 1260 | 0.2522          | 9.6637  |
| 0.0115        | 4.08  | 1290 | 0.2585          | 8.8426  |
| 0.0094        | 4.18  | 1320 | 0.2622          | 9.4923  |
| 0.0092        | 4.27  | 1350 | 0.2638          | 10.6773 |
| 0.009         | 4.37  | 1380 | 0.2640          | 10.0999 |
| 0.009         | 4.46  | 1410 | 0.2664          | 10.0036 |
| 0.0087        | 4.56  | 1440 | 0.2666          | 9.9705  |
| 0.0075        | 4.65  | 1470 | 0.2672          | 9.8622  |
| 0.0077        | 4.75  | 1500 | 0.2658          | 9.1254  |
| 0.0069        | 4.84  | 1530 | 0.2667          | 9.0442  |
| 0.0081        | 4.94  | 1560 | 0.2674          | 8.9178  |
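
The WER column is a percentage-scale word error rate. A minimal sketch of how such numbers can be reproduced with the `evaluate` library (an assumption; the card does not include the actual evaluation script):

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["de kat zit op de mat"]  # model transcripts (illustrative)
references = ["de kat zat op de mat"]   # ground-truth transcripts (illustrative)

# compute() returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"{wer:.4f}")  # 16.6667: one substitution over six reference words
```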


### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0