# ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-2
This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-2 on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-2 dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):
- Loss: 0.2794
- Wer: 0.2733
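As a quick start, here is a minimal inference sketch. It assumes the checkpoint loads as a CTC model with a bundled processor and expects 16 kHz mono audio (standard for wav2vec2-base); the file path and the greedy decoding are illustrative, not taken from the original training setup.

```python
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

model_id = "gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-2"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# "drums.wav" is a placeholder path; wav2vec2-base expects 16 kHz mono input.
waveform, sr = torchaudio.load("drums.wav")
waveform = torchaudio.transforms.Resample(orig_freq=sr, new_freq=16_000)(waveform)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the predicted token ids.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```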
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 30
- num_epochs: 100.0
- mixed_precision_training: Native AMP
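The training script itself is not included in this card; the snippet below is a sketch of how the listed values map onto Transformers `TrainingArguments`. The output directory and the per-epoch evaluation strategy are assumptions, and the Adam betas and epsilon listed above are the `Trainer` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-onset-idmt-2",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,   # 4 x 4 = total train batch size of 16
    lr_scheduler_type="linear",
    warmup_steps=30,
    num_train_epochs=100.0,
    fp16=True,                       # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",     # assumed: the log shows one eval per epoch
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```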
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:---:|:---:|:---:|:---:|:---:|
No log | 1.0 | 9 | 0.3089 | 0.2856 |
0.2871 | 2.0 | 18 | 0.3208 | 0.28 |
0.2997 | 3.0 | 27 | 0.3948 | 0.2878 |
0.299 | 4.0 | 36 | 0.3137 | 0.3011 |
0.3462 | 5.0 | 45 | 0.3067 | 0.2689 |
0.3098 | 6.0 | 54 | 0.3271 | 0.2811 |
0.2812 | 7.0 | 63 | 0.4907 | 0.26 |
0.3151 | 8.0 | 72 | 0.5852 | 0.2778 |
0.3038 | 9.0 | 81 | 0.2981 | 0.2767 |
0.3248 | 10.0 | 90 | 0.3129 | 0.2811 |
0.3248 | 11.0 | 99 | 0.4090 | 0.2767 |
0.3106 | 12.0 | 108 | 0.5354 | 0.3 |
0.2702 | 13.0 | 117 | 0.5543 | 0.3 |
0.3021 | 14.0 | 126 | 0.5437 | 0.2689 |
0.2622 | 15.0 | 135 | 0.5898 | 0.2778 |
0.2465 | 16.0 | 144 | 0.2900 | 0.2722 |
0.3077 | 17.0 | 153 | 0.4407 | 0.2544 |
0.2959 | 18.0 | 162 | 0.4079 | 0.2944 |
0.2843 | 19.0 | 171 | 0.5042 | 0.2722 |
0.254 | 20.0 | 180 | 0.3851 | 0.2878 |
0.254 | 21.0 | 189 | 0.3912 | 0.2678 |
0.2532 | 22.0 | 198 | 0.4699 | 0.2578 |
0.3011 | 23.0 | 207 | 0.7466 | 0.2744 |
0.2601 | 24.0 | 216 | 0.4238 | 0.28 |
0.2873 | 25.0 | 225 | 0.3817 | 0.2456 |
0.2791 | 26.0 | 234 | 0.3488 | 0.2489 |
0.2399 | 27.0 | 243 | 0.2980 | 0.2611 |
0.2592 | 28.0 | 252 | 0.2942 | 0.27 |
0.2191 | 29.0 | 261 | 0.2921 | 0.2833 |
0.2285 | 30.0 | 270 | 0.2851 | 0.2744 |
0.2285 | 31.0 | 279 | 0.2794 | 0.2733 |
0.2489 | 32.0 | 288 | 0.3036 | 0.2678 |
0.2445 | 33.0 | 297 | 0.2851 | 0.2678 |
0.2261 | 34.0 | 306 | 0.2864 | 0.2733 |
0.2391 | 35.0 | 315 | 0.3055 | 0.2611 |
0.3939 | 36.0 | 324 | 0.2927 | 0.26 |
0.2521 | 37.0 | 333 | 0.3470 | 0.2578 |
0.2378 | 38.0 | 342 | 0.2841 | 0.2656 |
0.2653 | 39.0 | 351 | 0.2889 | 0.2389 |
0.2235 | 40.0 | 360 | 0.3176 | 0.25 |
0.2235 | 41.0 | 369 | 0.3188 | 0.2667 |
0.2474 | 42.0 | 378 | 0.3782 | 0.2633 |
0.222 | 43.0 | 387 | 0.3201 | 0.2767 |
0.2411 | 44.0 | 396 | 0.3416 | 0.2722 |
0.2561 | 45.0 | 405 | 0.3050 | 0.2711 |
0.2169 | 46.0 | 414 | 0.3968 | 0.2511 |
0.2296 | 47.0 | 423 | 0.3721 | 0.2567 |
0.1989 | 48.0 | 432 | 0.3205 | 0.2667 |
0.2408 | 49.0 | 441 | 0.4524 | 0.2489 |
0.2163 | 50.0 | 450 | 0.4850 | 0.2567 |
0.2163 | 51.0 | 459 | 0.3777 | 0.2711 |
0.2001 | 52.0 | 468 | 0.5526 | 0.2644 |
0.2373 | 53.0 | 477 | 0.5141 | 0.2589 |
0.2132 | 54.0 | 486 | 0.5408 | 0.2611 |
0.2687 | 55.0 | 495 | 0.5389 | 0.2678 |
0.2244 | 56.0 | 504 | 0.5729 | 0.2578 |
0.2102 | 57.0 | 513 | 0.6249 | 0.2489 |
0.2076 | 58.0 | 522 | 0.5538 | 0.25 |
0.208 | 59.0 | 531 | 0.5499 | 0.2467 |
0.2167 | 60.0 | 540 | 0.6481 | 0.2433 |
0.2167 | 61.0 | 549 | 0.6797 | 0.2589 |
0.2218 | 62.0 | 558 | 0.5401 | 0.2656 |
0.2102 | 63.0 | 567 | 0.5152 | 0.26 |
0.2176 | 64.0 | 576 | 0.5581 | 0.26 |
0.2068 | 65.0 | 585 | 0.7225 | 0.2533 |
0.2123 | 66.0 | 594 | 0.6330 | 0.2633 |
0.2212 | 67.0 | 603 | 0.5943 | 0.2589 |
0.2013 | 68.0 | 612 | 0.7557 | 0.25 |
0.2304 | 69.0 | 621 | 0.9144 | 0.2467 |
0.209 | 70.0 | 630 | 0.7790 | 0.24 |
0.209 | 71.0 | 639 | 0.6203 | 0.2411 |
0.191 | 72.0 | 648 | 0.6280 | 0.2322 |
0.2313 | 73.0 | 657 | 0.5491 | 0.2378 |
0.1869 | 74.0 | 666 | 0.4653 | 0.2411 |
0.2313 | 75.0 | 675 | 0.6016 | 0.2489 |
0.1806 | 76.0 | 684 | 0.6492 | 0.2478 |
0.1934 | 77.0 | 693 | 0.6185 | 0.2478 |
0.1954 | 78.0 | 702 | 0.5618 | 0.2489 |
0.2077 | 79.0 | 711 | 0.5760 | 0.2522 |
0.2052 | 80.0 | 720 | 0.6172 | 0.25 |
0.2052 | 81.0 | 729 | 0.6859 | 0.2467 |
0.1804 | 82.0 | 738 | 0.7643 | 0.2422 |
0.1995 | 83.0 | 747 | 0.8360 | 0.2367 |
0.1869 | 84.0 | 756 | 0.6984 | 0.2489 |
0.2135 | 85.0 | 765 | 0.6759 | 0.2422 |
0.178 | 86.0 | 774 | 0.6791 | 0.2444 |
0.1734 | 87.0 | 783 | 0.7284 | 0.2411 |
0.1881 | 88.0 | 792 | 0.8172 | 0.2344 |
0.1625 | 89.0 | 801 | 0.8061 | 0.2356 |
0.181 | 90.0 | 810 | 0.7644 | 0.2389 |
0.181 | 91.0 | 819 | 0.7413 | 0.24 |
0.1942 | 92.0 | 828 | 0.6439 | 0.2433 |
0.1806 | 93.0 | 837 | 0.6250 | 0.2467 |
0.1651 | 94.0 | 846 | 0.6517 | 0.2433 |
0.1833 | 95.0 | 855 | 0.6628 | 0.2389 |
0.1873 | 96.0 | 864 | 0.6582 | 0.2378 |
0.1672 | 97.0 | 873 | 0.6548 | 0.2389 |
0.1871 | 98.0 | 882 | 0.6655 | 0.24 |
0.2429 | 99.0 | 891 | 0.6695 | 0.24 |
0.1832 | 100.0 | 900 | 0.6700 | 0.2389 |
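The Wer column is word error rate, presumably computed over the model's decoded token sequences. A minimal sketch of the metric with the `evaluate` library follows; the token strings are hypothetical stand-ins, not samples from the dataset.

```python
import evaluate  # requires: pip install evaluate jiwer

wer = evaluate.load("wer")
predictions = ["KD SD HH HH", "KD KD SD"]  # hypothetical decoded sequences
references = ["KD SD HH", "KD KD SD"]      # hypothetical ground truth
# 1 insertion over 6 reference words, i.e. roughly 0.1667
print(wer.compute(predictions=predictions, references=references))
```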
### Framework versions
- Transformers 4.25.0.dev0
- Pytorch 1.8.1+cu111
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2
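The Transformers and Datasets builds above are dev versions, so an exact pip install may not be reproducible from PyPI; a quick way to compare a local environment against the listed versions:

```python
import transformers, datasets, tokenizers, torch

# Versions used for this run (dev builds may only be approximable from PyPI):
# Transformers 4.25.0.dev0, Pytorch 1.8.1+cu111, Datasets 2.7.1.dev0, Tokenizers 0.13.2
for mod in (transformers, torch, datasets, tokenizers):
    print(mod.__name__, mod.__version__)
```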