---
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unspecified Dutch (`nl`) speech dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3047
- WER: 10.4756
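
The card does not include a usage snippet, so here is a minimal, untested sketch of loading a checkpoint like this one with the 🤗 Transformers `pipeline` API for Dutch transcription. The repository id and audio file name are placeholders, not values from this card.

```python
# Minimal usage sketch (not part of the original card). The repository id
# and audio path are placeholders; substitute the actual checkpoint.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",  # placeholder repository id
    chunk_length_s=30,  # Whisper decodes 30 s windows; enables long-form audio
)

result = asr(
    "sample.wav",  # placeholder audio file
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```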

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
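
As a rough illustration only, the hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as follows; `output_dir` and any setting not listed above are placeholders or library defaults, not values from this card.

```python
# Hedged sketch of training arguments matching the listed hyperparameters.
# output_dir is a placeholder; anything not listed above stays at its default.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v2-nl",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
    # default AdamW settings, so no optimizer arguments are needed here.
)
```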

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.5862        | 0.09  | 30   | 0.3770          | 15.4837 |
| 0.3186        | 0.19  | 60   | 0.3302          | 13.7743 |
| 0.2867        | 0.28  | 90   | 0.3126          | 13.5958 |
| 0.288         | 0.38  | 120  | 0.2984          | 12.1001 |
| 0.2647        | 0.47  | 150  | 0.2963          | 14.9480 |
| 0.2578        | 0.57  | 180  | 0.2984          | 13.6251 |
| 0.2943        | 0.66  | 210  | 0.2910          | 15.0124 |
| 0.2584        | 0.76  | 240  | 0.2758          | 14.6729 |
| 0.2741        | 0.85  | 270  | 0.2724          | 11.9040 |
| 0.2595        | 0.95  | 300  | 0.2743          | 14.1753 |
| 0.2164        | 1.04  | 330  | 0.2688          | 12.1469 |
| 0.1197        | 1.14  | 360  | 0.2665          | 12.0006 |
| 0.1275        | 1.23  | 390  | 0.2690          | 11.4035 |
| 0.1342        | 1.33  | 420  | 0.2742          | 12.2025 |
| 0.1271        | 1.42  | 450  | 0.2695          | 12.0972 |
| 0.1335        | 1.52  | 480  | 0.2728          | 11.3508 |
| 0.1385        | 1.61  | 510  | 0.2669          | 11.5908 |
| 0.1326        | 1.71  | 540  | 0.2631          | 11.8045 |
| 0.1245        | 1.8   | 570  | 0.2621          | 12.0884 |
| 0.1232        | 1.9   | 600  | 0.2597          | 11.6611 |
| 0.1325        | 1.99  | 630  | 0.2576          | 11.6054 |
| 0.0615        | 2.09  | 660  | 0.2724          | 12.8055 |
| 0.0615        | 2.18  | 690  | 0.2703          | 12.1908 |
| 0.0575        | 2.28  | 720  | 0.2699          | 12.0474 |
| 0.0568        | 2.37  | 750  | 0.2722          | 11.8425 |
| 0.0562        | 2.47  | 780  | 0.2734          | 12.9987 |
| 0.0568        | 2.56  | 810  | 0.2696          | 11.2630 |
| 0.0567        | 2.66  | 840  | 0.2749          | 10.9557 |
| 0.058         | 2.75  | 870  | 0.2783          | 11.6025 |
| 0.0608        | 2.85  | 900  | 0.2733          | 11.1605 |
| 0.0586        | 2.94  | 930  | 0.2678          | 11.9830 |
| 0.044         | 3.04  | 960  | 0.2753          | 11.2601 |
| 0.0236        | 3.13  | 990  | 0.2814          | 10.8825 |
| 0.0235        | 3.23  | 1020 | 0.2853          | 11.0376 |
| 0.0229        | 3.32  | 1050 | 0.2865          | 10.7654 |
| 0.0217        | 3.42  | 1080 | 0.2848          | 10.6776 |
| 0.0233        | 3.51  | 1110 | 0.2838          | 10.6600 |
| 0.0223        | 3.61  | 1140 | 0.2867          | 10.6981 |
| 0.0208        | 3.7   | 1170 | 0.2791          | 10.3761 |
| 0.0195        | 3.8   | 1200 | 0.2832          | 10.5020 |
| 0.02          | 3.89  | 1230 | 0.2841          | 10.9176 |
| 0.0204        | 3.99  | 1260 | 0.2817          | 10.4610 |
| 0.0092        | 4.08  | 1290 | 0.2933          | 10.5312 |
| 0.0078        | 4.18  | 1320 | 0.2992          | 10.4727 |
| 0.0068        | 4.27  | 1350 | 0.3026          | 10.3264 |
| 0.0076        | 4.37  | 1380 | 0.3064          | 10.7361 |
| 0.0077        | 4.46  | 1410 | 0.3070          | 10.5752 |
| 0.0073        | 4.56  | 1440 | 0.3070          | 10.5459 |
| 0.0078        | 4.65  | 1470 | 0.3053          | 10.5254 |
| 0.0083        | 4.75  | 1500 | 0.3035          | 10.4317 |
| 0.009         | 4.84  | 1530 | 0.3042          | 10.4669 |
| 0.0074        | 4.94  | 1560 | 0.3047          | 10.4756 |
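
For reference, the WER values in the table are computed from decoded predictions against reference transcripts. A hedged, self-contained example with the 🤗 `evaluate` library (the strings are made up, not taken from the unspecified evaluation set):

```python
# Illustration of the WER metric reported above; the strings are invented
# examples, not data from the actual evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["dit is een test", "hallo wereld"]
references = ["dit is een test", "hallo daar wereld"]

# evaluate returns WER as a fraction; the table reports it as a percentage.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```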


### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0