cantillation committed on
Commit b1b2f17
Parent: d3a43d3

Model save
README.md ADDED
@@ -0,0 +1,117 @@
---
license: apache-2.0
base_model: cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-06-07-2024_20-19
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-07-07-2024_08-34
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-07-07-2024_08-34

This model is a fine-tuned version of [cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-06-07-2024_20-19](https://huggingface.co/cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-06-07-2024_20-19) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.9592
- Wer: 100.0
- Avg Precision Exact: 0.0053
- Avg Recall Exact: 0.0482
- Avg F1 Exact: 0.0094
- Avg Precision Letter Shift: 0.0129
- Avg Recall Letter Shift: 0.1183
- Avg F1 Letter Shift: 0.0228
- Avg Precision Word Level: 0.0173
- Avg Recall Word Level: 0.1573
- Avg F1 Word Level: 0.0306
- Avg Precision Word Shift: 0.0334
- Avg Recall Word Shift: 0.2899
- Avg F1 Word Shift: 0.0588
- Precision Median Exact: 0.0
- Recall Median Exact: 0.0
- F1 Median Exact: 0.0
- Precision Max Exact: 0.1667
- Recall Max Exact: 1.0
- F1 Max Exact: 0.25
- Precision Min Exact: 0.0
- Recall Min Exact: 0.0
- F1 Min Exact: 0.0
- Precision Min Letter Shift: 0.0
- Recall Min Letter Shift: 0.0
- F1 Min Letter Shift: 0.0
- Precision Min Word Level: 0.0
- Recall Min Word Level: 0.0
- F1 Min Word Level: 0.0
- Precision Min Word Shift: 0.0
- Recall Min Word Shift: 0.0
- F1 Min Word Shift: 0.0
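
For a quick sanity check of the reported WER, here is a minimal sketch (not part of the original card) that transcribes one recording with the Transformers ASR pipeline and scores it with the `evaluate` library. It assumes the repository id matches the model name above; the audio path and reference transcript are hypothetical placeholders.

```python
# Minimal sketch: transcribe one file and compute WER against a reference transcript.
# The repo id is assumed from the model name; the audio path and reference are placeholders.
import torch
from transformers import pipeline
import evaluate

asr = pipeline(
    "automatic-speech-recognition",
    model="cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-07-07-2024_08-34",
    device=0 if torch.cuda.is_available() else -1,
)

prediction = asr("example_recording.wav")["text"]   # hypothetical audio file
reference = "<ground-truth transcript here>"        # hypothetical reference text

wer_metric = evaluate.load("wer")
print("WER (%):", 100 * wer_metric.compute(predictions=[prediction], references=[reference]))
```
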
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 50000
- mixed_precision_training: Native AMP
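
As an illustration only, the values listed above roughly correspond to the `Seq2SeqTrainingArguments` sketch below. Only the settings recorded in this card are filled in; the output directory is a hypothetical placeholder, all other arguments keep their defaults, and fp16 is assumed to be what produced the "Native AMP" entry.

```python
# Illustrative mapping of the listed hyperparameters onto Seq2SeqTrainingArguments.
# "whisper-tiny-teamim" is a hypothetical output directory; unlisted options keep defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-teamim",    # hypothetical
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=50_000,
    fp16=True,                           # assumed source of the "Native AMP" mixed precision
)
```
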
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 0.0001 | 1 | 0.2806 | 16.9452 | 0.8385 | 0.8403 | 0.8390 | 0.8609 | 0.8628 | 0.8613 | 0.8657 | 0.8679 | 0.8663 | 0.9468 | 0.9505 | 0.9481 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 6.379 | 0.1033 | 2000 | 1.1953 | 120.7433 | 0.1765 | 0.2759 | 0.1896 | 0.2338 | 0.3541 | 0.2490 | 0.2533 | 0.3945 | 0.2751 | 0.4039 | 0.6029 | 0.4412 | 0.1538 | 0.2 | 0.1667 | 1.0 | 1.0 | 0.9412 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.4428 | 0.2067 | 4000 | 4.7348 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 5.1556 | 0.3100 | 6000 | 4.6051 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4.935 | 0.4134 | 8000 | 4.4172 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4.7728 | 0.5167 | 10000 | 4.1229 | 100.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0026 | 0.0005 | 0.0008 | 0.0092 | 0.0015 | 0.0012 | 0.0141 | 0.0022 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.636 | 0.6200 | 12000 | 4.0112 | 100.0378 | 0.0071 | 0.0423 | 0.0116 | 0.0155 | 0.0901 | 0.0251 | 0.0260 | 0.1528 | 0.0422 | 0.0567 | 0.3257 | 0.0922 | 0.0 | 0.0 | 0.0 | 0.2222 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4949 | 0.7234 | 14000 | 4.0080 | 100.0 | 0.0040 | 0.0390 | 0.0072 | 0.0097 | 0.0899 | 0.0171 | 0.0180 | 0.1677 | 0.0317 | 0.0349 | 0.3220 | 0.0615 | 0.0 | 0.0 | 0.0 | 0.1818 | 1.0 | 0.3077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4359 | 0.8267 | 16000 | 4.0252 | 100.0 | 0.0031 | 0.0347 | 0.0057 | 0.0075 | 0.0849 | 0.0138 | 0.0144 | 0.1589 | 0.0262 | 0.0250 | 0.2827 | 0.0457 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.3834 | 0.9300 | 18000 | 4.0642 | 100.0 | 0.0025 | 0.0273 | 0.0045 | 0.0063 | 0.0700 | 0.0115 | 0.0125 | 0.1372 | 0.0228 | 0.0225 | 0.2534 | 0.0412 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.317 | 1.0334 | 20000 | 4.0775 | 100.0 | 0.0017 | 0.0183 | 0.0031 | 0.0043 | 0.0470 | 0.0079 | 0.0080 | 0.0860 | 0.0146 | 0.0150 | 0.1663 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.269 | 1.1367 | 22000 | 4.1070 | 100.0 | 0.0010 | 0.0108 | 0.0018 | 0.0025 | 0.0266 | 0.0046 | 0.0044 | 0.0447 | 0.0079 | 0.0080 | 0.0834 | 0.0144 | 0.0 | 0.0 | 0.0 | 0.125 | 1.0 | 0.2222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2529 | 1.2401 | 24000 | 4.0914 | 100.0 | 0.0008 | 0.0086 | 0.0015 | 0.0021 | 0.0215 | 0.0038 | 0.0033 | 0.0339 | 0.0060 | 0.0064 | 0.0658 | 0.0115 | 0.0 | 0.0 | 0.0 | 0.125 | 1.0 | 0.2222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1986 | 1.3434 | 26000 | 4.0861 | 100.0 | 0.0012 | 0.0131 | 0.0022 | 0.0025 | 0.0255 | 0.0044 | 0.0038 | 0.0404 | 0.0070 | 0.0081 | 0.0865 | 0.0146 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1805 | 1.4467 | 28000 | 4.0716 | 100.0 | 0.0016 | 0.0171 | 0.0029 | 0.0034 | 0.0362 | 0.0062 | 0.0055 | 0.0579 | 0.0099 | 0.0103 | 0.1118 | 0.0188 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1489 | 1.5501 | 30000 | 4.0526 | 100.0 | 0.0027 | 0.0306 | 0.0050 | 0.0065 | 0.0701 | 0.0117 | 0.0096 | 0.1033 | 0.0174 | 0.0174 | 0.1901 | 0.0316 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0979 | 1.6534 | 32000 | 4.0399 | 100.0 | 0.0034 | 0.0389 | 0.0062 | 0.0082 | 0.0924 | 0.0149 | 0.0121 | 0.1331 | 0.0220 | 0.0217 | 0.2389 | 0.0394 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0667 | 1.7567 | 34000 | 4.0224 | 100.0 | 0.0042 | 0.0473 | 0.0076 | 0.0099 | 0.1091 | 0.0179 | 0.0143 | 0.1549 | 0.0259 | 0.0258 | 0.2749 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0602 | 1.8601 | 36000 | 4.0042 | 100.0 | 0.0040 | 0.0466 | 0.0073 | 0.0103 | 0.1182 | 0.0187 | 0.0142 | 0.1606 | 0.0259 | 0.0254 | 0.2818 | 0.0461 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0584 | 1.9634 | 38000 | 3.9922 | 100.0 | 0.0038 | 0.0441 | 0.0069 | 0.0104 | 0.1215 | 0.0190 | 0.0140 | 0.1623 | 0.0256 | 0.0244 | 0.2807 | 0.0446 | 0.0 | 0.0 | 0.0 | 0.2 | 1.0 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0015 | 2.0668 | 40000 | 3.9825 | 100.0 | 0.0042 | 0.0506 | 0.0077 | 0.0111 | 0.1293 | 0.0203 | 0.0149 | 0.1726 | 0.0272 | 0.0254 | 0.2923 | 0.0464 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0073 | 2.1701 | 42000 | 3.9722 | 100.0 | 0.0044 | 0.0508 | 0.0080 | 0.0112 | 0.1285 | 0.0205 | 0.0149 | 0.1704 | 0.0272 | 0.0255 | 0.2891 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0284 | 2.2734 | 44000 | 3.9651 | 100.0 | 0.0047 | 0.0506 | 0.0084 | 0.0116 | 0.1258 | 0.0211 | 0.0156 | 0.1669 | 0.0281 | 0.0278 | 0.2904 | 0.0501 | 0.0 | 0.0 | 0.0 | 0.1429 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9935 | 2.3768 | 46000 | 3.9619 | 100.0 | 0.0051 | 0.0507 | 0.0092 | 0.0122 | 0.1230 | 0.0219 | 0.0165 | 0.1640 | 0.0295 | 0.0298 | 0.2889 | 0.0532 | 0.0 | 0.0 | 0.0 | 0.1667 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0324 | 2.4801 | 48000 | 3.9600 | 100.0 | 0.0052 | 0.0485 | 0.0092 | 0.0126 | 0.1197 | 0.0225 | 0.0171 | 0.1597 | 0.0304 | 0.0326 | 0.2909 | 0.0577 | 0.0 | 0.0 | 0.0 | 0.1667 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.003 | 2.5834 | 50000 | 3.9592 | 100.0 | 0.0053 | 0.0482 | 0.0094 | 0.0129 | 0.1183 | 0.0228 | 0.0173 | 0.1573 | 0.0306 | 0.0334 | 0.2899 | 0.0588 | 0.0 | 0.0 | 0.0 | 0.1667 | 1.0 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.2.1
- Datasets 2.20.0
- Tokenizers 0.19.1

generation_config.json ADDED
@@ -0,0 +1,250 @@
{
  "alignment_heads": [[2, 2], [3, 0], [3, 2], [3, 3], [3, 4], [3, 5]],
  "begin_suppress_tokens": [220, 50257],
  "bos_token_id": 50257,
  "decoder_start_token_id": 50258,
  "eos_token_id": 50257,
  "forced_decoder_ids": [[1, null], [2, 50359]],
  "is_multilingual": true,
  "lang_to_id": {
    "<|af|>": 50327, "<|am|>": 50334, "<|ar|>": 50272, "<|as|>": 50350, "<|az|>": 50304,
    "<|ba|>": 50355, "<|be|>": 50330, "<|bg|>": 50292, "<|bn|>": 50302, "<|bo|>": 50347,
    "<|br|>": 50309, "<|bs|>": 50315, "<|ca|>": 50270, "<|cs|>": 50283, "<|cy|>": 50297,
    "<|da|>": 50285, "<|de|>": 50261, "<|el|>": 50281, "<|en|>": 50259, "<|es|>": 50262,
    "<|et|>": 50307, "<|eu|>": 50310, "<|fa|>": 50300, "<|fi|>": 50277, "<|fo|>": 50338,
    "<|fr|>": 50265, "<|gl|>": 50319, "<|gu|>": 50333, "<|haw|>": 50352, "<|ha|>": 50354,
    "<|he|>": 50279, "<|hi|>": 50276, "<|hr|>": 50291, "<|ht|>": 50339, "<|hu|>": 50286,
    "<|hy|>": 50312, "<|id|>": 50275, "<|is|>": 50311, "<|it|>": 50274, "<|ja|>": 50266,
    "<|jw|>": 50356, "<|ka|>": 50329, "<|kk|>": 50316, "<|km|>": 50323, "<|kn|>": 50306,
    "<|ko|>": 50264, "<|la|>": 50294, "<|lb|>": 50345, "<|ln|>": 50353, "<|lo|>": 50336,
    "<|lt|>": 50293, "<|lv|>": 50301, "<|mg|>": 50349, "<|mi|>": 50295, "<|mk|>": 50308,
    "<|ml|>": 50296, "<|mn|>": 50314, "<|mr|>": 50320, "<|ms|>": 50282, "<|mt|>": 50343,
    "<|my|>": 50346, "<|ne|>": 50313, "<|nl|>": 50271, "<|nn|>": 50342, "<|no|>": 50288,
    "<|oc|>": 50328, "<|pa|>": 50321, "<|pl|>": 50269, "<|ps|>": 50340, "<|pt|>": 50267,
    "<|ro|>": 50284, "<|ru|>": 50263, "<|sa|>": 50344, "<|sd|>": 50332, "<|si|>": 50322,
    "<|sk|>": 50298, "<|sl|>": 50305, "<|sn|>": 50324, "<|so|>": 50326, "<|sq|>": 50317,
    "<|sr|>": 50303, "<|su|>": 50357, "<|sv|>": 50273, "<|sw|>": 50318, "<|ta|>": 50287,
    "<|te|>": 50299, "<|tg|>": 50331, "<|th|>": 50289, "<|tk|>": 50341, "<|tl|>": 50348,
    "<|tr|>": 50268, "<|tt|>": 50351, "<|uk|>": 50280, "<|ur|>": 50290, "<|uz|>": 50337,
    "<|vi|>": 50278, "<|yi|>": 50335, "<|yo|>": 50325, "<|zh|>": 50260
  },
  "language": "he",
  "max_initial_timestamp_index": 50,
  "max_length": 448,
  "no_timestamps_token_id": 50363,
  "pad_token_id": 50257,
  "prev_sot_token_id": 50361,
  "return_timestamps": false,
  "suppress_tokens": [
    1, 2, 7, 8, 9, 10, 14, 25, 26, 27, 28, 29, 31, 58, 59, 60, 61, 62, 63, 90, 91, 92, 93,
    359, 503, 522, 542, 873, 893, 902, 918, 922, 931, 1350, 1853, 1982, 2460, 2627, 3246,
    3253, 3268, 3536, 3846, 3961, 4183, 4667, 6585, 6647, 7273, 9061, 9383, 10428, 10929,
    11938, 12033, 12331, 12562, 13793, 14157, 14635, 15265, 15618, 16553, 16604, 18362,
    18956, 20075, 21675, 22520, 26130, 26161, 26435, 28279, 29464, 31650, 32302, 32470,
    36865, 42863, 47425, 49870, 50254, 50258, 50358, 50359, 50360, 50361, 50362
  ],
  "task_to_id": {
    "transcribe": 50359,
    "translate": 50358
  },
  "transformers_version": "4.41.2",
  "use_cache": false
}
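
For context, a brief sketch (not from the repository) of how this file is consumed at inference time: the settings above ship with the checkpoint as its `GenerationConfig`, so `generate()` defaults to Hebrew transcription without timestamps. The repository id is assumed from the model name, and `input_features` stands in for log-Mel features produced by a `WhisperProcessor`.

```python
# Sketch: inspect the generation config that ships with the checkpoint and use it for decoding.
# The repo id is assumed from the model name; `input_features` is a placeholder for real audio features.
from transformers import WhisperForConditionalGeneration, WhisperProcessor

repo = "cantillation/Teamim-tiny_DropOut-0.5_Augmented_Combined-Data_date-07-07-2024_08-34"
model = WhisperForConditionalGeneration.from_pretrained(repo)
processor = WhisperProcessor.from_pretrained(repo)

print(model.generation_config.language)      # "he"
print(model.generation_config.task_to_id)    # {"transcribe": 50359, "translate": 50358}

# With prepared features, e.g. processor(audio, sampling_rate=16000, return_tensors="pt").input_features:
# predicted_ids = model.generate(input_features, language="he", task="transcribe")
# text = processor.batch_decode(predicted_ids, skip_special_tokens=True)[0]
```
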
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:78927a31949bd10667e4a6dabc9afaba2f898b655fa994069ad53e7a76eea85a
+ oid sha256:fceba3c48ea741423d9ad002b85cb2c80a213bcfba96d383d6563870d90ac241
  size 151109288
runs/Jul07_08-40-30_f4f6a0a71b85/events.out.tfevents.1720341631.f4f6a0a71b85.1.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:63c56fae0e339bb7ddb237a9c36278cd5d84b03fba3cbf5422ec150d03c9f85a
- size 472094
+ oid sha256:62658b7b32f96559769ed24a79d5eef8ce24eb42736921c5db83fda5dd0b5447
+ size 491909