End of training
- README.md +177 -0
- model.safetensors +1 -1
README.md
ADDED
@@ -0,0 +1,177 @@
---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
model-index:
- name: ft_0213_korean
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ft_0213_korean

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6093
- Cer: 0.0958

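Here "Cer" is the character error rate on the evaluation set (lower is better). As a minimal sketch of how such a score can be computed, not taken from the original training script, the `evaluate` library's `cer` metric (which relies on `jiwer`) can be applied to transcripts; the strings below are illustrative only:

```python
# Minimal sketch: computing character error rate (CER) for ASR output.
# The example strings are placeholders, not data from this model's eval set.
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["안녕하세요 반갑습니다"]  # hypothetical model transcript
references = ["안녕하세요, 반갑습니다"]  # hypothetical reference transcript

cer = cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer:.4f}")  # character edits per reference character
```
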
## Model description

More information needed

## Intended uses & limitations

More information needed

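Since the checkpoint follows the standard wav2vec2 CTC layout, inference should work like any other `Wav2Vec2ForCTC` model. The sketch below is an assumption based on that layout, not taken from this card; the repository id and audio path are placeholders:

```python
# Hedged usage sketch for Korean speech recognition with this checkpoint.
# Replace model_id with the actual repo id or a local output directory.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "your-username/ft_0213_korean"  # placeholder repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLS-R checkpoints expect 16 kHz mono audio.
waveform, sample_rate = torchaudio.load("example.wav")  # placeholder file
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```
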
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP

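The values above correspond roughly to the following `transformers` `TrainingArguments`; this is a reconstruction from the reported hyperparameters, not the original training script, and the output directory is an assumption:

```python
# Hedged reconstruction of the training configuration from the values
# reported above; output_dir and anything not listed is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ft_0213_korean",   # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=20,
    fp16=True,                     # "Native AMP" mixed precision
)
```
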
### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 24.3697 | 0.17 | 500 | 5.0804 | 1.0 |
| 4.8016 | 0.34 | 1000 | 5.1173 | 1.0 |
| 4.6791 | 0.51 | 1500 | 4.7037 | 1.0000 |
| 4.562 | 0.68 | 2000 | 4.6273 | 0.9779 |
| 4.4539 | 0.84 | 2500 | 4.2212 | 0.9370 |
| 3.5358 | 1.01 | 3000 | 2.7001 | 0.5326 |
| 2.6771 | 1.18 | 3500 | 2.1532 | 0.4519 |
| 2.2226 | 1.35 | 4000 | 1.7409 | 0.3787 |
| 1.9143 | 1.52 | 4500 | 1.4978 | 0.3372 |
| 1.6892 | 1.69 | 5000 | 1.3429 | 0.3112 |
| 1.5503 | 1.86 | 5500 | 1.1997 | 0.2812 |
| 1.4184 | 2.03 | 6000 | 1.1011 | 0.2624 |
| 1.2758 | 2.19 | 6500 | 1.0286 | 0.2551 |
| 1.2045 | 2.36 | 7000 | 0.9572 | 0.2373 |
| 1.1666 | 2.53 | 7500 | 0.9170 | 0.2251 |
| 1.1007 | 2.7 | 8000 | 0.8521 | 0.2142 |
| 1.0391 | 2.87 | 8500 | 0.8260 | 0.2140 |
| 0.9761 | 3.04 | 9000 | 0.8005 | 0.2071 |
| 0.9166 | 3.21 | 9500 | 0.7572 | 0.1941 |
| 0.864 | 3.38 | 10000 | 0.7375 | 0.1935 |
| 0.8579 | 3.54 | 10500 | 0.7404 | 0.1933 |
| 0.8442 | 3.71 | 11000 | 0.7080 | 0.1799 |
| 0.8114 | 3.88 | 11500 | 0.6816 | 0.1766 |
| 0.7863 | 4.05 | 12000 | 0.6921 | 0.1753 |
| 0.7454 | 4.22 | 12500 | 0.6831 | 0.1759 |
| 0.7077 | 4.39 | 13000 | 0.6610 | 0.1689 |
| 0.6974 | 4.56 | 13500 | 0.6864 | 0.1687 |
| 0.7001 | 4.73 | 14000 | 0.6450 | 0.1641 |
| 0.6636 | 4.9 | 14500 | 0.6303 | 0.1585 |
| 0.6423 | 5.06 | 15000 | 0.6465 | 0.1597 |
| 0.5828 | 5.23 | 15500 | 0.6224 | 0.1550 |
| 0.6085 | 5.4 | 16000 | 0.6154 | 0.1534 |
| 0.5877 | 5.57 | 16500 | 0.6112 | 0.1510 |
| 0.586 | 5.74 | 17000 | 0.6022 | 0.1485 |
| 0.5656 | 5.91 | 17500 | 0.6022 | 0.1491 |
| 0.5366 | 6.08 | 18000 | 0.5894 | 0.1468 |
| 0.5134 | 6.25 | 18500 | 0.5779 | 0.1435 |
| 0.5217 | 6.41 | 19000 | 0.5960 | 0.1449 |
| 0.5049 | 6.58 | 19500 | 0.5813 | 0.1408 |
| 0.4961 | 6.75 | 20000 | 0.5582 | 0.1382 |
| 0.5089 | 6.92 | 20500 | 0.5898 | 0.1385 |
| 0.4769 | 7.09 | 21000 | 0.5739 | 0.1361 |
| 0.4552 | 7.26 | 21500 | 0.5700 | 0.1369 |
| 0.4552 | 7.43 | 22000 | 0.5956 | 0.1367 |
| 0.4476 | 7.6 | 22500 | 0.5885 | 0.1342 |
| 0.4449 | 7.77 | 23000 | 0.5501 | 0.1314 |
| 0.4333 | 7.93 | 23500 | 0.5474 | 0.1302 |
| 0.3946 | 8.1 | 24000 | 0.6018 | 0.1327 |
| 0.3993 | 8.27 | 24500 | 0.5680 | 0.1295 |
| 0.3892 | 8.44 | 25000 | 0.5575 | 0.1309 |
| 0.3936 | 8.61 | 25500 | 0.5666 | 0.1288 |
| 0.3957 | 8.78 | 26000 | 0.5546 | 0.1262 |
| 0.4006 | 8.95 | 26500 | 0.5702 | 0.1264 |
| 0.3456 | 9.12 | 27000 | 0.5614 | 0.1247 |
| 0.3459 | 9.28 | 27500 | 0.5608 | 0.1242 |
| 0.3511 | 9.45 | 28000 | 0.5527 | 0.1236 |
| 0.3504 | 9.62 | 28500 | 0.5479 | 0.1201 |
| 0.3529 | 9.79 | 29000 | 0.5525 | 0.1200 |
| 0.3397 | 9.96 | 29500 | 0.5451 | 0.1201 |
| 0.314 | 10.13 | 30000 | 0.5549 | 0.1184 |
| 0.3048 | 10.3 | 30500 | 0.5616 | 0.1180 |
| 0.3021 | 10.47 | 31000 | 0.5634 | 0.1184 |
| 0.3136 | 10.63 | 31500 | 0.5753 | 0.1166 |
| 0.3116 | 10.8 | 32000 | 0.5410 | 0.1149 |
| 0.3098 | 10.97 | 32500 | 0.5354 | 0.1143 |
| 0.2852 | 11.14 | 33000 | 0.5482 | 0.1144 |
| 0.2807 | 11.31 | 33500 | 0.5465 | 0.1126 |
| 0.2771 | 11.48 | 34000 | 0.5452 | 0.1147 |
| 0.2865 | 11.65 | 34500 | 0.5538 | 0.1128 |
| 0.2783 | 11.82 | 35000 | 0.5374 | 0.1118 |
| 0.2775 | 11.99 | 35500 | 0.5418 | 0.1121 |
| 0.2649 | 12.15 | 36000 | 0.5468 | 0.1104 |
| 0.2558 | 12.32 | 36500 | 0.5498 | 0.1108 |
| 0.2632 | 12.49 | 37000 | 0.5699 | 0.1118 |
| 0.2488 | 12.66 | 37500 | 0.5523 | 0.1088 |
| 0.2552 | 12.83 | 38000 | 0.5532 | 0.1090 |
| 0.2577 | 13.0 | 38500 | 0.5480 | 0.1078 |
| 0.2334 | 13.17 | 39000 | 0.5716 | 0.1078 |
| 0.2387 | 13.34 | 39500 | 0.5740 | 0.1080 |
| 0.2364 | 13.5 | 40000 | 0.5587 | 0.1066 |
| 0.2253 | 13.67 | 40500 | 0.5544 | 0.1071 |
| 0.2536 | 13.84 | 41000 | 0.5680 | 0.1055 |
| 0.2254 | 14.01 | 41500 | 0.5605 | 0.1058 |
| 0.2207 | 14.18 | 42000 | 0.5776 | 0.1049 |
| 0.2127 | 14.35 | 42500 | 0.5762 | 0.1046 |
| 0.2121 | 14.52 | 43000 | 0.5637 | 0.1043 |
| 0.2048 | 14.69 | 43500 | 0.5647 | 0.1048 |
| 0.2085 | 14.85 | 44000 | 0.5658 | 0.1032 |
| 0.2031 | 15.02 | 44500 | 0.5789 | 0.1026 |
| 0.1923 | 15.19 | 45000 | 0.5627 | 0.1011 |
| 0.1956 | 15.36 | 45500 | 0.5698 | 0.1016 |
| 0.1989 | 15.53 | 46000 | 0.5950 | 0.1016 |
| 0.1996 | 15.7 | 46500 | 0.5833 | 0.1003 |
| 0.1895 | 15.87 | 47000 | 0.5872 | 0.1003 |
| 0.1893 | 16.04 | 47500 | 0.5861 | 0.1001 |
| 0.1837 | 16.21 | 48000 | 0.5947 | 0.0998 |
| 0.1875 | 16.37 | 48500 | 0.5898 | 0.0994 |
| 0.1773 | 16.54 | 49000 | 0.5885 | 0.1001 |
| 0.1834 | 16.71 | 49500 | 0.5964 | 0.0995 |
| 0.1787 | 16.88 | 50000 | 0.5935 | 0.0994 |
| 0.1719 | 17.05 | 50500 | 0.5990 | 0.0987 |
| 0.1697 | 17.22 | 51000 | 0.5917 | 0.0987 |
| 0.1736 | 17.39 | 51500 | 0.5988 | 0.0988 |
| 0.1695 | 17.56 | 52000 | 0.5988 | 0.0978 |
| 0.1663 | 17.72 | 52500 | 0.6062 | 0.0979 |
| 0.1621 | 17.89 | 53000 | 0.5993 | 0.0976 |
| 0.1653 | 18.06 | 53500 | 0.6049 | 0.0973 |
| 0.1639 | 18.23 | 54000 | 0.6169 | 0.0976 |
| 0.1574 | 18.4 | 54500 | 0.6063 | 0.0973 |
| 0.1557 | 18.57 | 55000 | 0.5953 | 0.0959 |
| 0.1608 | 18.74 | 55500 | 0.5943 | 0.0963 |
| 0.1621 | 18.91 | 56000 | 0.5966 | 0.0961 |
| 0.1534 | 19.07 | 56500 | 0.6086 | 0.0961 |
| 0.1441 | 19.24 | 57000 | 0.6128 | 0.0962 |
| 0.169 | 19.41 | 57500 | 0.6053 | 0.0957 |
| 0.1516 | 19.58 | 58000 | 0.6066 | 0.0960 |
| 0.1474 | 19.75 | 58500 | 0.6080 | 0.0958 |
| 0.1478 | 19.92 | 59000 | 0.6093 | 0.0958 |


### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:224fc235c053d6e6a1976716b4989da7ba2be932e0f2f8738e6acb05f2a5bf0b
 size 1267768904