---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0040
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# whisper_charsplit_new_0040

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the training and evaluation sets:
- Train Loss: 0.0035
- Train Accuracy: 0.0795
- Train Wermet: 10.6833
- Validation Loss: 0.5276
- Validation Accuracy: 0.0757
- Validation Wermet: 8.9798
- Epoch: 39
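
The "Wermet" columns above appear to be a WER-style (word error rate) metric; the card does not include the exact implementation used during training, and the reported values are not bounded by 1, so the sketch below is only for context. It shows the standard Levenshtein-based word error rate on which such metrics are typically built:

```python
# Minimal word error rate (WER) sketch. NOTE: this is a standard
# Levenshtein-based WER, shown only to illustrate the kind of metric the
# "Wermet" columns report; the training script's actual metric is not
# documented in this card and may differ.

def wer(reference: str, hypothesis: str) -> float:
    """Word-level edit distance divided by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat", "the cat sat"))  # 0.0 — perfect match
print(wer("the cat sat", "the bat sat"))  # one substitution in three words
```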

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
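
The optimizer above is the TensorFlow `AdamWeightDecay` from `transformers`, i.e. Adam with decoupled weight decay (AdamW). As a sketch of what one update step does with the listed hyperparameters, here is a scalar re-implementation of the rule (illustrative only, not the code used in training):

```python
# Single AdamW-style update step for a scalar parameter, using the
# hyperparameters listed above. Illustrative sketch only: training used
# transformers' TF AdamWeightDecay, not this re-implementation.

LR, BETA1, BETA2, EPS, WD = 1e-05, 0.9, 0.999, 1e-07, 0.01

def adamw_step(param, grad, m, v, t):
    """One decoupled-weight-decay Adam step (t is the 1-based step count)."""
    m = BETA1 * m + (1 - BETA1) * grad           # first-moment estimate
    v = BETA2 * v + (1 - BETA2) * grad * grad    # second-moment estimate
    m_hat = m / (1 - BETA1 ** t)                 # bias correction
    v_hat = v / (1 - BETA2 ** t)
    # Decoupled weight decay: applied to the parameter directly rather
    # than folded into the gradient — the defining feature of AdamW.
    param -= LR * (m_hat / (v_hat ** 0.5 + EPS) + WD * param)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.5, m=m, v=v, t=1)
print(p)  # slightly below 1.0 after one step
```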

### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733     | 0.0602         | 13.0686      | 0.6470          | 0.0676              | 11.4066           | 0     |
| 0.5740     | 0.0666         | 12.7778      | 0.5113          | 0.0706              | 11.1022           | 1     |
| 0.4553     | 0.0692         | 12.2404      | 0.4371          | 0.0723              | 10.9105           | 2     |
| 0.3813     | 0.0708         | 11.9157      | 0.3935          | 0.0733              | 9.4615            | 3     |
| 0.3292     | 0.0720         | 11.5732      | 0.3630          | 0.0740              | 9.9885            | 4     |
| 0.2886     | 0.0729         | 11.5171      | 0.3403          | 0.0745              | 9.8042            | 5     |
| 0.2561     | 0.0736         | 11.3173      | 0.3256          | 0.0749              | 9.9431            | 6     |
| 0.2282     | 0.0743         | 11.7308      | 0.3159          | 0.0752              | 9.2086            | 7     |
| 0.2036     | 0.0748         | 11.4503      | 0.3071          | 0.0754              | 9.5236            | 8     |
| 0.1820     | 0.0754         | 11.7175      | 0.3005          | 0.0756              | 10.0755           | 9     |
| 0.1628     | 0.0758         | 11.7056      | 0.2993          | 0.0757              | 9.9497            | 10    |
| 0.1450     | 0.0762         | 11.7637      | 0.2971          | 0.0758              | 10.1481           | 11    |
| 0.1287     | 0.0766         | 11.8509      | 0.3029          | 0.0759              | 10.2042           | 12    |
| 0.1140     | 0.0770         | 12.1100      | 0.3004          | 0.0760              | 10.3873           | 13    |
| 0.0998     | 0.0773         | 11.9502      | 0.3025          | 0.0761              | 10.7066           | 14    |
| 0.0872     | 0.0777         | 12.3196      | 0.3129          | 0.0759              | 10.7707           | 15    |
| 0.0760     | 0.0779         | 12.2637      | 0.3142          | 0.0761              | 10.2638           | 16    |
| 0.0651     | 0.0782         | 12.1215      | 0.3192          | 0.0761              | 10.0750           | 17    |
| 0.0547     | 0.0785         | 12.0551      | 0.3294          | 0.0761              | 10.4732           | 18    |
| 0.0463     | 0.0787         | 11.9677      | 0.3402          | 0.0760              | 10.2814           | 19    |
| 0.0386     | 0.0789         | 11.6855      | 0.3517          | 0.0760              | 10.0599           | 20    |
| 0.0318     | 0.0790         | 11.6314      | 0.3628          | 0.0760              | 9.6652            | 21    |
| 0.0262     | 0.0792         | 11.4603      | 0.3728          | 0.0760              | 10.0035           | 22    |
| 0.0224     | 0.0792         | 11.4330      | 0.3824          | 0.0760              | 9.1995            | 23    |
| 0.0181     | 0.0793         | 11.3124      | 0.3982          | 0.0759              | 9.8710            | 24    |
| 0.0142     | 0.0794         | 11.3562      | 0.4057          | 0.0760              | 9.6831            | 25    |
| 0.0118     | 0.0794         | 11.0532      | 0.4207          | 0.0759              | 9.7227            | 26    |
| 0.0101     | 0.0794         | 11.2963      | 0.4282          | 0.0760              | 9.5792            | 27    |
| 0.0114     | 0.0794         | 11.3093      | 0.4431          | 0.0758              | 9.5545            | 28    |
| 0.0109     | 0.0794         | 11.4214      | 0.4419          | 0.0760              | 9.4377            | 29    |
| 0.0084     | 0.0794         | 10.9143      | 0.4474          | 0.0760              | 9.3668            | 30    |
| 0.0043     | 0.0795         | 10.9497      | 0.4525          | 0.0761              | 9.3202            | 31    |
| 0.0036     | 0.0795         | 10.7759      | 0.4667          | 0.0761              | 9.0385            | 32    |
| 0.0047     | 0.0795         | 10.7613      | 0.4788          | 0.0759              | 9.4065            | 33    |
| 0.0130     | 0.0793         | 11.1022      | 0.4748          | 0.0760              | 9.4521            | 34    |
| 0.0074     | 0.0794         | 10.9738      | 0.4730          | 0.0760              | 9.3348            | 35    |
| 0.0032     | 0.0795         | 10.6370      | 0.4750          | 0.0762              | 8.8298            | 36    |
| 0.0020     | 0.0795         | 10.7428      | 0.4835          | 0.0762              | 9.0566            | 37    |
| 0.0014     | 0.0795         | 10.6908      | 0.4937          | 0.0761              | 9.2445            | 38    |
| 0.0035     | 0.0795         | 10.6833      | 0.5276          | 0.0757              | 8.9798            | 39    |


### Framework versions

- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3