gary109 committed on
Commit
90a6495
1 Parent(s): 51627ab

update model card README.md

Files changed (1)
  1. README.md +125 -114
README.md CHANGED
@@ -1,7 +1,5 @@
 ---
 tags:
- - automatic-speech-recognition
- - gary109/AI_Light_Dance
 - generated_from_trainer
 datasets:
 - ai_light_dance
@@ -9,7 +7,20 @@ metrics:
 - wer
 model-index:
 - name: ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1
- results: []
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -17,10 +28,10 @@ should probably proofread and complete it, then remove this comment. -->

 # ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1

- This model is a fine-tuned version of [gary109/ai-light-dance_drums_pretrain_wav2vec2-base-new](https://huggingface.co/gary109/ai-light-dance_drums_pretrain_wav2vec2-base-new) on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-MDB-ENST2 dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.6712
- - Wer: 0.3762

 ## Model description

@@ -39,122 +50,122 @@ More information needed
 ### Training hyperparameters

 The following hyperparameters were used during training:
- - learning_rate: 0.0004
- - train_batch_size: 4
- - eval_batch_size: 4
 - seed: 42
- - gradient_accumulation_steps: 4
- - total_train_batch_size: 16
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 30
 - num_epochs: 100.0
 - mixed_precision_training: Native AMP

 ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Wer |
- |:-------------:|:-----:|:----:|:---------------:|:------:|
- | 17.2739 | 0.99 | 35 | 2.9041 | 1.0 |
- | 1.8424 | 1.99 | 70 | 3.5055 | 1.0 |
- | 1.7092 | 2.99 | 105 | 2.0046 | 1.0 |
- | 1.5022 | 3.99 | 140 | 1.9662 | 0.9675 |
- | 1.2964 | 4.99 | 175 | 1.9017 | 0.5504 |
- | 1.1235 | 5.99 | 210 | 1.9875 | 0.4644 |
- | 1.1056 | 6.99 | 245 | 1.8791 | 0.4762 |
- | 0.8907 | 7.99 | 280 | 1.4811 | 0.4673 |
- | 0.8605 | 8.99 | 315 | 1.9114 | 0.4479 |
- | 0.8498 | 9.99 | 350 | 1.2107 | 0.4728 |
- | 0.7205 | 10.99 | 385 | 1.5744 | 0.4526 |
- | 0.8417 | 11.99 | 420 | 1.4689 | 0.4534 |
- | 0.7734 | 12.99 | 455 | 1.3531 | 0.4551 |
- | 0.7762 | 13.99 | 490 | 1.2924 | 0.4665 |
- | 0.6812 | 14.99 | 525 | 1.0827 | 0.4100 |
- | 0.7245 | 15.99 | 560 | 1.4070 | 0.4353 |
- | 0.6508 | 16.99 | 595 | 1.0520 | 0.4087 |
- | 0.7144 | 17.99 | 630 | 1.0729 | 0.4209 |
- | 0.6566 | 18.99 | 665 | 1.1672 | 0.4053 |
- | 0.5802 | 19.99 | 700 | 1.0129 | 0.4015 |
- | 0.5924 | 20.99 | 735 | 1.0762 | 0.4007 |
- | 0.7051 | 21.99 | 770 | 1.0253 | 0.4028 |
- | 0.5669 | 22.99 | 805 | 1.0526 | 0.4188 |
- | 0.6209 | 23.99 | 840 | 1.0177 | 0.4213 |
- | 0.635 | 24.99 | 875 | 0.9299 | 0.4019 |
- | 0.5914 | 25.99 | 910 | 1.0058 | 0.4142 |
- | 0.5983 | 26.99 | 945 | 0.9720 | 0.4142 |
- | 0.5631 | 27.99 | 980 | 0.8983 | 0.4028 |
- | 0.552 | 28.99 | 1015 | 0.9148 | 0.4184 |
- | 0.5213 | 29.99 | 1050 | 1.0817 | 0.4142 |
- | 0.5387 | 30.99 | 1085 | 0.9432 | 0.4188 |
- | 0.5276 | 31.99 | 1120 | 1.1207 | 0.4007 |
- | 0.5778 | 32.99 | 1155 | 0.9254 | 0.4150 |
- | 0.5001 | 33.99 | 1190 | 1.0393 | 0.4192 |
- | 0.5329 | 34.99 | 1225 | 0.9109 | 0.3965 |
- | 0.5168 | 35.99 | 1260 | 0.8983 | 0.4298 |
- | 0.4918 | 36.99 | 1295 | 0.8412 | 0.4087 |
- | 0.5651 | 37.99 | 1330 | 0.8560 | 0.4218 |
- | 0.438 | 38.99 | 1365 | 0.8556 | 0.4171 |
- | 0.4808 | 39.99 | 1400 | 0.8320 | 0.4175 |
- | 0.5372 | 40.99 | 1435 | 0.9745 | 0.3956 |
- | 0.4814 | 41.99 | 1470 | 0.8033 | 0.4121 |
- | 0.4416 | 42.99 | 1505 | 0.8195 | 0.3990 |
- | 0.4958 | 43.99 | 1540 | 0.8264 | 0.3956 |
- | 0.4665 | 44.99 | 1575 | 0.8172 | 0.4070 |
- | 0.4196 | 45.99 | 1610 | 0.7971 | 0.3952 |
- | 0.4088 | 46.99 | 1645 | 0.7417 | 0.3880 |
- | 0.4308 | 47.99 | 1680 | 0.7806 | 0.3931 |
- | 0.4173 | 48.99 | 1715 | 0.7380 | 0.3922 |
- | 0.4653 | 49.99 | 1750 | 0.8962 | 0.4028 |
- | 0.4406 | 50.99 | 1785 | 0.7790 | 0.3935 |
- | 0.4664 | 51.99 | 1820 | 0.9173 | 0.3893 |
- | 0.4486 | 52.99 | 1855 | 0.8235 | 0.3922 |
- | 0.4137 | 53.99 | 1890 | 0.8032 | 0.3927 |
- | 0.4402 | 54.99 | 1925 | 0.7658 | 0.3830 |
- | 0.4101 | 55.99 | 1960 | 0.8621 | 0.3994 |
- | 0.5239 | 56.99 | 1995 | 0.7903 | 0.3956 |
- | 0.4151 | 57.99 | 2030 | 0.7849 | 0.3872 |
- | 0.4766 | 58.99 | 2065 | 0.8306 | 0.3918 |
- | 0.4882 | 59.99 | 2100 | 0.8134 | 0.3927 |
- | 0.4583 | 60.99 | 2135 | 0.9527 | 0.3851 |
- | 0.4284 | 61.99 | 2170 | 0.9743 | 0.3998 |
- | 0.46 | 62.99 | 2205 | 0.7807 | 0.3830 |
- | 0.4039 | 63.99 | 2240 | 0.8864 | 0.3884 |
- | 0.3868 | 64.99 | 2275 | 0.7304 | 0.3817 |
- | 0.3934 | 65.99 | 2310 | 0.8758 | 0.3846 |
- | 0.3776 | 66.99 | 2345 | 0.8156 | 0.3762 |
- | 0.3499 | 67.99 | 2380 | 0.8143 | 0.3889 |
- | 0.4055 | 68.99 | 2415 | 0.7503 | 0.3796 |
- | 0.3505 | 69.99 | 2450 | 0.7138 | 0.3804 |
- | 0.3755 | 70.99 | 2485 | 0.8072 | 0.3800 |
- | 0.3594 | 71.99 | 2520 | 0.7692 | 0.3851 |
- | 0.3167 | 72.99 | 2555 | 0.6995 | 0.3745 |
- | 0.3915 | 73.99 | 2590 | 0.6712 | 0.3762 |
- | 0.3741 | 74.99 | 2625 | 0.7139 | 0.3800 |
- | 0.3708 | 75.99 | 2660 | 0.7065 | 0.3834 |
- | 0.3731 | 76.99 | 2695 | 0.7316 | 0.3754 |
- | 0.3785 | 77.99 | 2730 | 0.7071 | 0.3758 |
- | 0.3466 | 78.99 | 2765 | 0.7362 | 0.3834 |
- | 0.3505 | 79.99 | 2800 | 0.6965 | 0.3800 |
- | 0.4003 | 80.99 | 2835 | 0.7521 | 0.3766 |
- | 0.3723 | 81.99 | 2870 | 0.7617 | 0.3749 |
- | 0.4029 | 82.99 | 2905 | 0.7659 | 0.3813 |
- | 0.3478 | 83.99 | 2940 | 0.7077 | 0.3834 |
- | 0.3363 | 84.99 | 2975 | 0.7333 | 0.3787 |
- | 0.4228 | 85.99 | 3010 | 0.7196 | 0.3745 |
- | 0.3823 | 86.99 | 3045 | 0.7195 | 0.3754 |
- | 0.3574 | 87.99 | 3080 | 0.7137 | 0.3796 |
- | 0.3371 | 88.99 | 3115 | 0.7164 | 0.3762 |
- | 0.3548 | 89.99 | 3150 | 0.7766 | 0.3792 |
- | 0.4042 | 90.99 | 3185 | 0.7588 | 0.3766 |
- | 0.3989 | 91.99 | 3220 | 0.7311 | 0.3775 |
- | 0.3625 | 92.99 | 3255 | 0.7475 | 0.3745 |
- | 0.3036 | 93.99 | 3290 | 0.7138 | 0.3716 |
- | 0.5157 | 94.99 | 3325 | 0.7246 | 0.3787 |
- | 0.4072 | 95.99 | 3360 | 0.7322 | 0.3762 |
- | 0.3406 | 96.99 | 3395 | 0.7134 | 0.3771 |
- | 0.2987 | 97.99 | 3430 | 0.6951 | 0.3754 |
- | 0.3355 | 98.99 | 3465 | 0.7005 | 0.3766 |
- | 0.341 | 99.99 | 3500 | 0.6990 | 0.3771 |


  ### Framework versions
 
 ---
 tags:
 - generated_from_trainer
 datasets:
 - ai_light_dance
 metrics:
 - wer
 model-index:
 - name: ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1
+   results:
+   - task:
+       name: Automatic Speech Recognition
+       type: automatic-speech-recognition
+     dataset:
+       name: ai_light_dance
+       type: ai_light_dance
+       config: onset-idmt-mdb-enst2
+       split: train
+       args: onset-idmt-mdb-enst2
+     metrics:
+     - name: Wer
+       type: wer
+       value: 0.32686630113876003
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You

 # ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1

+ This model is a fine-tuned version of [gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1](https://huggingface.co/gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v6-1) on the ai_light_dance dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.6187
+ - Wer: 0.3269
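Wer above is the word error rate: word-level edit distance between hypothesis and reference, divided by the number of reference words. The Trainer typically delegates this to a metrics library; the sketch below is a minimal plain-Python illustration of the same computation, and the drum-label strings in the example are hypothetical:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# Hypothetical drum-event transcripts: one inserted word over a 3-word reference.
print(wer("kick snare hat", "kick snare snare hat"))  # 0.3333333333333333
```

A Wer of 0.3269 therefore means roughly one word-level error per three reference tokens on the evaluation set.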

 ## Model description


 ### Training hyperparameters

 The following hyperparameters were used during training:
+ - learning_rate: 0.0001
+ - train_batch_size: 2
+ - eval_batch_size: 2
 - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 4
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 10
 - num_epochs: 100.0
 - mixed_precision_training: Native AMP
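Two of the values above are derived rather than set directly. The sketch below (plain Python) shows how the effective batch size follows from the per-device batch size and gradient accumulation, and the learning-rate curve a linear scheduler with warmup produces; `total_steps = 14100` is taken from the last row of the training-results table, and treating it as the scheduler's horizon is an assumption:

```python
# Effective batch size = per-device train batch size * gradient accumulation steps.
train_batch_size = 2
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 4

# Linear schedule with warmup: ramp from 0 to base_lr over `warmup_steps`,
# then decay linearly to 0 at `total_steps`.
def lr_at_step(step, base_lr=1e-4, warmup_steps=10, total_steps=14100):
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(total_train_batch_size)  # 4, matching total_train_batch_size above
print(lr_at_step(5))           # halfway through warmup: half the base learning rate
print(lr_at_step(14100))       # end of training: 0.0
```

With only 10 warmup steps out of 14100, the schedule is effectively a pure linear decay from 1e-4 for almost all of training.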

 ### Training results

+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-----:|:-----:|:---------------:|:------:|
+ | 0.2764 | 1.0 | 141 | 0.7125 | 0.3619 |
+ | 0.5415 | 2.0 | 282 | 0.7252 | 0.3682 |
+ | 0.3324 | 3.0 | 423 | 0.6779 | 0.3728 |
+ | 0.4244 | 4.0 | 564 | 0.7403 | 0.3737 |
+ | 0.5234 | 5.0 | 705 | 0.8086 | 0.3534 |
+ | 0.3339 | 6.0 | 846 | 0.7187 | 0.3619 |
+ | 0.5016 | 7.0 | 987 | 0.8582 | 0.3602 |
+ | 0.3376 | 8.0 | 1128 | 0.8801 | 0.3674 |
+ | 0.3507 | 9.0 | 1269 | 0.8524 | 0.3560 |
+ | 0.4844 | 10.0 | 1410 | 0.7152 | 0.3648 |
+ | 0.4282 | 11.0 | 1551 | 0.6719 | 0.3475 |
+ | 0.4398 | 12.0 | 1692 | 0.7130 | 0.3686 |
+ | 0.331 | 13.0 | 1833 | 0.6425 | 0.3627 |
+ | 0.4488 | 14.0 | 1974 | 0.6483 | 0.3648 |
+ | 0.3876 | 15.0 | 2115 | 0.6375 | 0.3509 |
+ | 0.3361 | 16.0 | 2256 | 0.6791 | 0.3703 |
+ | 0.344 | 17.0 | 2397 | 0.7279 | 0.3551 |
+ | 0.3198 | 18.0 | 2538 | 0.6801 | 0.3509 |
+ | 0.2753 | 19.0 | 2679 | 0.6239 | 0.3509 |
+ | 0.2962 | 20.0 | 2820 | 0.7419 | 0.3442 |
+ | 0.7503 | 21.0 | 2961 | 0.7279 | 0.3501 |
+ | 0.4013 | 22.0 | 3102 | 0.6899 | 0.3792 |
+ | 0.5134 | 23.0 | 3243 | 0.6572 | 0.3787 |
+ | 0.3144 | 24.0 | 3384 | 0.5882 | 0.3543 |
+ | 0.3534 | 25.0 | 3525 | 0.5661 | 0.3416 |
+ | 0.2555 | 26.0 | 3666 | 0.5977 | 0.3589 |
+ | 0.3524 | 27.0 | 3807 | 0.5953 | 0.3585 |
+ | 0.314 | 28.0 | 3948 | 0.6359 | 0.3593 |
+ | 0.2565 | 29.0 | 4089 | 0.6192 | 0.3615 |
+ | 0.5023 | 30.0 | 4230 | 0.6229 | 0.3378 |
+ | 0.3025 | 31.0 | 4371 | 0.6002 | 0.3442 |
+ | 0.3329 | 32.0 | 4512 | 0.6235 | 0.3513 |
+ | 0.3744 | 33.0 | 4653 | 0.5782 | 0.3416 |
+ | 0.2899 | 34.0 | 4794 | 0.5835 | 0.3336 |
+ | 0.306 | 35.0 | 4935 | 0.6061 | 0.3496 |
+ | 0.2519 | 36.0 | 5076 | 0.5958 | 0.3652 |
+ | 0.3201 | 37.0 | 5217 | 0.5778 | 0.3652 |
+ | 0.3011 | 38.0 | 5358 | 0.6238 | 0.3589 |
+ | 0.2882 | 39.0 | 5499 | 0.6501 | 0.3361 |
+ | 0.2542 | 40.0 | 5640 | 0.6341 | 0.3488 |
+ | 0.2717 | 41.0 | 5781 | 0.5890 | 0.3530 |
+ | 0.3197 | 42.0 | 5922 | 0.5877 | 0.3471 |
+ | 0.2816 | 43.0 | 6063 | 0.6614 | 0.3420 |
+ | 0.3301 | 44.0 | 6204 | 0.6334 | 0.3475 |
+ | 0.2466 | 45.0 | 6345 | 0.6663 | 0.3429 |
+ | 0.2908 | 46.0 | 6486 | 0.5941 | 0.3475 |
+ | 0.2785 | 47.0 | 6627 | 0.6337 | 0.3568 |
+ | 0.2361 | 48.0 | 6768 | 0.5845 | 0.3399 |
+ | 0.4729 | 49.0 | 6909 | 0.6466 | 0.3425 |
+ | 0.5103 | 50.0 | 7050 | 0.7112 | 0.3416 |
+ | 0.2676 | 51.0 | 7191 | 0.6260 | 0.3307 |
+ | 0.3533 | 52.0 | 7332 | 0.7327 | 0.3454 |
+ | 0.3308 | 53.0 | 7473 | 0.7150 | 0.3277 |
+ | 0.2617 | 54.0 | 7614 | 0.6412 | 0.3391 |
+ | 0.2901 | 55.0 | 7755 | 0.6225 | 0.3391 |
+ | 0.2847 | 56.0 | 7896 | 0.7385 | 0.3391 |
+ | 0.2621 | 57.0 | 8037 | 0.7241 | 0.3496 |
+ | 0.2477 | 58.0 | 8178 | 0.6957 | 0.3429 |
+ | 0.3147 | 59.0 | 8319 | 0.6808 | 0.3425 |
+ | 0.3761 | 60.0 | 8460 | 0.6710 | 0.3450 |
+ | 0.2609 | 61.0 | 8601 | 0.6629 | 0.3345 |
+ | 0.388 | 62.0 | 8742 | 0.6688 | 0.3463 |
+ | 0.3684 | 63.0 | 8883 | 0.7018 | 0.3340 |
+ | 0.2494 | 64.0 | 9024 | 0.6611 | 0.3399 |
+ | 0.2641 | 65.0 | 9165 | 0.6828 | 0.3399 |
+ | 0.2716 | 66.0 | 9306 | 0.6409 | 0.3294 |
+ | 0.2595 | 67.0 | 9447 | 0.6056 | 0.3231 |
+ | 0.2683 | 68.0 | 9588 | 0.6203 | 0.3332 |
+ | 0.2571 | 69.0 | 9729 | 0.6484 | 0.3336 |
+ | 0.2593 | 70.0 | 9870 | 0.6597 | 0.3294 |
+ | 0.229 | 71.0 | 10011 | 0.6354 | 0.3235 |
+ | 0.281 | 72.0 | 10152 | 0.6398 | 0.3294 |
+ | 0.3779 | 73.0 | 10293 | 0.6871 | 0.3345 |
+ | 0.2998 | 74.0 | 10434 | 0.7329 | 0.3323 |
+ | 0.2095 | 75.0 | 10575 | 0.7365 | 0.3239 |
+ | 0.247 | 76.0 | 10716 | 0.6384 | 0.3290 |
+ | 0.2095 | 77.0 | 10857 | 0.6703 | 0.3345 |
+ | 0.2074 | 78.0 | 10998 | 0.6577 | 0.3425 |
+ | 0.2519 | 79.0 | 11139 | 0.6359 | 0.3370 |
+ | 0.2046 | 80.0 | 11280 | 0.6222 | 0.3256 |
+ | 1.3195 | 81.0 | 11421 | 0.6126 | 0.3345 |
+ | 0.2821 | 82.0 | 11562 | 0.6193 | 0.3294 |
+ | 0.3256 | 83.0 | 11703 | 0.6140 | 0.3336 |
+ | 0.2743 | 84.0 | 11844 | 0.6204 | 0.3290 |
+ | 0.2761 | 85.0 | 11985 | 0.6599 | 0.3252 |
+ | 0.224 | 86.0 | 12126 | 0.6580 | 0.3294 |
+ | 0.2106 | 87.0 | 12267 | 0.6298 | 0.3294 |
+ | 0.2706 | 88.0 | 12408 | 0.6411 | 0.3281 |
+ | 0.2523 | 89.0 | 12549 | 0.6243 | 0.3264 |
+ | 0.3635 | 90.0 | 12690 | 0.6297 | 0.3290 |
+ | 0.353 | 91.0 | 12831 | 0.6145 | 0.3235 |
+ | 0.2491 | 92.0 | 12972 | 0.6296 | 0.3197 |
+ | 0.1999 | 93.0 | 13113 | 0.6329 | 0.3222 |
+ | 0.2417 | 94.0 | 13254 | 0.6200 | 0.3222 |
+ | 0.2397 | 95.0 | 13395 | 0.6137 | 0.3269 |
+ | 0.2275 | 96.0 | 13536 | 0.6237 | 0.3277 |
+ | 0.207 | 97.0 | 13677 | 0.6230 | 0.3235 |
+ | 0.2704 | 98.0 | 13818 | 0.6239 | 0.3281 |
+ | 0.2119 | 99.0 | 13959 | 0.6224 | 0.3277 |
+ | 0.2561 | 100.0 | 14100 | 0.6187 | 0.3269 |


  ### Framework versions