# wavlm-base_2

This model is a fine-tuned version of [microsoft/wavlm-base](https://huggingface.co/microsoft/wavlm-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0244
- Accuracy: 0.9966

## Model description

More information needed

## Intended uses & limitations

More information needed
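
The card does not document the task or labels. As a starting point, here is a minimal inference sketch, assuming the checkpoint was fine-tuned for audio classification (consistent with the accuracy metric reported above); the label names and the expected input domain are not documented here, and `"example.wav"` is a placeholder path.

```python
# Minimal inference sketch, assuming an audio-classification fine-tune.
from transformers import pipeline

classifier = pipeline("audio-classification", model="cloudwalkerw/wavlm-base_2")

# WavLM expects 16 kHz mono audio; "example.wav" is a placeholder path.
predictions = classifier("example.wav")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```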

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 2
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50.0
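
For reference, the settings above map onto `transformers.TrainingArguments` roughly as sketched below. This is not the original training script (which is not included in this card); the output directory is a placeholder, and the model, dataset, and data collator are omitted.

```python
# Sketch of the listed hyperparameters as transformers.TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wavlm-base_2",      # placeholder name
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=2,
    seed=0,
    gradient_accumulation_steps=4,  # 16 x 4 = 64 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50.0,
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```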

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4872 | 0.25 | 100 | 0.2180 | 0.8974 |
| 0.1571 | 0.5 | 200 | 0.2582 | 0.9334 |
| 0.0644 | 0.76 | 300 | 0.0244 | 0.9966 |
| 0.0553 | 1.01 | 400 | 0.1156 | 0.9928 |
| 0.1108 | 1.26 | 500 | 0.1576 | 0.9898 |
| 0.0849 | 1.51 | 600 | 0.0871 | 0.9947 |
| 0.0635 | 1.76 | 700 | 0.1088 | 0.9939 |
| 0.0504 | 2.02 | 800 | 0.4074 | 0.9790 |
| 0.1075 | 2.27 | 900 | 0.2955 | 0.9814 |
| 0.2387 | 2.52 | 1000 | 0.0651 | 0.9956 |
| 0.3052 | 2.77 | 1100 | 0.2379 | 0.8974 |
| 0.3336 | 3.02 | 1200 | 0.3527 | 0.8974 |
| 0.3322 | 3.28 | 1300 | 0.3307 | 0.8974 |
| 0.3201 | 3.53 | 1400 | 0.3405 | 0.8974 |
| 0.3406 | 3.78 | 1500 | 0.3335 | 0.8974 |
| 0.3475 | 4.03 | 1600 | 0.3341 | 0.8974 |
| 0.3312 | 4.28 | 1700 | 0.3361 | 0.8974 |
| 0.3367 | 4.54 | 1800 | 0.3310 | 0.8974 |
| 0.3284 | 4.79 | 1900 | 0.3339 | 0.8974 |
| 0.3267 | 5.04 | 2000 | 0.3350 | 0.8974 |
| 0.338 | 5.29 | 2100 | 0.3308 | 0.8974 |
| 0.3277 | 5.55 | 2200 | 0.3309 | 0.8974 |
| 0.3294 | 5.8 | 2300 | 0.3313 | 0.8974 |
| 0.3315 | 6.05 | 2400 | 0.3360 | 0.8974 |
| 0.3397 | 6.3 | 2500 | 0.3307 | 0.8974 |
| 0.3318 | 6.55 | 2600 | 0.3359 | 0.8974 |
| 0.3312 | 6.81 | 2700 | 0.3308 | 0.8974 |
| 0.3155 | 7.06 | 2800 | 0.3317 | 0.8974 |
| 0.3304 | 7.31 | 2900 | 0.3362 | 0.8974 |
| 0.338 | 7.56 | 3000 | 0.3342 | 0.8974 |
| 0.3241 | 7.81 | 3100 | 0.3310 | 0.8974 |
| 0.3325 | 8.07 | 3200 | 0.3326 | 0.8974 |
| 0.3202 | 8.32 | 3300 | 0.3345 | 0.8974 |
| 0.3315 | 8.57 | 3400 | 0.3335 | 0.8974 |
| 0.3288 | 8.82 | 3500 | 0.3312 | 0.8974 |
| 0.3371 | 9.07 | 3600 | 0.3401 | 0.8974 |
| 0.3409 | 9.33 | 3700 | 0.3330 | 0.8974 |
| 0.3236 | 9.58 | 3800 | 0.3330 | 0.8974 |
| 0.3224 | 9.83 | 3900 | 0.3321 | 0.8974 |
| 0.3439 | 10.08 | 4000 | 0.3326 | 0.8974 |
| 0.3382 | 10.33 | 4100 | 0.3310 | 0.8974 |
| 0.3307 | 10.59 | 4200 | 0.3382 | 0.8974 |
| 0.3231 | 10.84 | 4300 | 0.3325 | 0.8974 |
| 0.3095 | 11.09 | 4400 | 0.3348 | 0.8974 |
| 0.3442 | 11.34 | 4500 | 0.3327 | 0.8974 |
| 0.3269 | 11.59 | 4600 | 0.3326 | 0.8974 |
| 0.3323 | 11.85 | 4700 | 0.3308 | 0.8974 |
| 0.3313 | 12.1 | 4800 | 0.3308 | 0.8974 |
| 0.3283 | 12.35 | 4900 | 0.3314 | 0.8974 |
| 0.3331 | 12.6 | 5000 | 0.3307 | 0.8974 |
| 0.3317 | 12.85 | 5100 | 0.3344 | 0.8974 |
| 0.3283 | 13.11 | 5200 | 0.3320 | 0.8974 |
| 0.3263 | 13.36 | 5300 | 0.3311 | 0.8974 |
| 0.3421 | 13.61 | 5400 | 0.3307 | 0.8974 |
| 0.3164 | 13.86 | 5500 | 0.3318 | 0.8974 |
| 0.3315 | 14.11 | 5600 | 0.3335 | 0.8974 |
| 0.3415 | 14.37 | 5700 | 0.3315 | 0.8974 |
| 0.3325 | 14.62 | 5800 | 0.3307 | 0.8974 |
| 0.3264 | 14.87 | 5900 | 0.3330 | 0.8974 |
| 0.3223 | 15.12 | 6000 | 0.3307 | 0.8974 |
| 0.3289 | 15.37 | 6100 | 0.3329 | 0.8974 |
| 0.3353 | 15.63 | 6200 | 0.3311 | 0.8974 |
| 0.3246 | 15.88 | 6300 | 0.3311 | 0.8974 |
| 0.3425 | 16.13 | 6400 | 0.3307 | 0.8974 |
| 0.331 | 16.38 | 6500 | 0.3307 | 0.8974 |
| 0.3293 | 16.64 | 6600 | 0.3353 | 0.8974 |
| 0.3249 | 16.89 | 6700 | 0.3339 | 0.8974 |
| 0.3214 | 17.14 | 6800 | 0.3338 | 0.8974 |
| 0.3259 | 17.39 | 6900 | 0.3327 | 0.8974 |
| 0.3408 | 17.64 | 7000 | 0.3318 | 0.8974 |
| 0.3258 | 17.9 | 7100 | 0.3318 | 0.8974 |
| 0.3299 | 18.15 | 7200 | 0.3308 | 0.8974 |
| 0.327 | 18.4 | 7300 | 0.3371 | 0.8974 |
| 0.3317 | 18.65 | 7400 | 0.3308 | 0.8974 |
| 0.3291 | 18.9 | 7500 | 0.3310 | 0.8974 |
| 0.3263 | 19.16 | 7600 | 0.3325 | 0.8974 |
| 0.3223 | 19.41 | 7700 | 0.3346 | 0.8974 |
| 0.3403 | 19.66 | 7800 | 0.3316 | 0.8974 |
| 0.3265 | 19.91 | 7900 | 0.3309 | 0.8974 |
| 0.33 | 20.16 | 8000 | 0.3318 | 0.8974 |
| 0.3488 | 20.42 | 8100 | 0.3313 | 0.8974 |
| 0.3293 | 20.67 | 8200 | 0.3335 | 0.8974 |
| 0.3095 | 20.92 | 8300 | 0.3356 | 0.8974 |
| 0.3366 | 21.17 | 8400 | 0.3332 | 0.8974 |
| 0.317 | 21.42 | 8500 | 0.3338 | 0.8974 |
| 0.3299 | 21.68 | 8600 | 0.3308 | 0.8974 |
| 0.3434 | 21.93 | 8700 | 0.3310 | 0.8974 |
| 0.3208 | 22.18 | 8800 | 0.3309 | 0.8974 |
| 0.3351 | 22.43 | 8900 | 0.3324 | 0.8974 |
| 0.3301 | 22.68 | 9000 | 0.3308 | 0.8974 |
| 0.3196 | 22.94 | 9100 | 0.3330 | 0.8974 |
| 0.3339 | 23.19 | 9200 | 0.3333 | 0.8974 |
| 0.3249 | 23.44 | 9300 | 0.3308 | 0.8974 |
| 0.3247 | 23.69 | 9400 | 0.3338 | 0.8974 |
| 0.3369 | 23.94 | 9500 | 0.3313 | 0.8974 |
| 0.3291 | 24.2 | 9600 | 0.3320 | 0.8974 |
| 0.3307 | 24.45 | 9700 | 0.3309 | 0.8974 |
| 0.3328 | 24.7 | 9800 | 0.3307 | 0.8974 |
| 0.3277 | 24.95 | 9900 | 0.3342 | 0.8974 |
| 0.3278 | 25.2 | 10000 | 0.3310 | 0.8974 |
| 0.3197 | 25.46 | 10100 | 0.3349 | 0.8974 |
| 0.3273 | 25.71 | 10200 | 0.3321 | 0.8974 |
| 0.3345 | 25.96 | 10300 | 0.3312 | 0.8974 |
| 0.3351 | 26.21 | 10400 | 0.3325 | 0.8974 |
| 0.3144 | 26.47 | 10500 | 0.3346 | 0.8974 |
| 0.3361 | 26.72 | 10600 | 0.3311 | 0.8974 |
| 0.3334 | 26.97 | 10700 | 0.3307 | 0.8974 |
| 0.3287 | 27.22 | 10800 | 0.3373 | 0.8974 |
| 0.3374 | 27.47 | 10900 | 0.3307 | 0.8974 |
| 0.3302 | 27.73 | 11000 | 0.3307 | 0.8974 |
| 0.3245 | 27.98 | 11100 | 0.3315 | 0.8974 |
| 0.3353 | 28.23 | 11200 | 0.3335 | 0.8974 |
| 0.3191 | 28.48 | 11300 | 0.3336 | 0.8974 |
| 0.3226 | 28.73 | 11400 | 0.3308 | 0.8974 |
| 0.3384 | 28.99 | 11500 | 0.3322 | 0.8974 |
| 0.3368 | 29.24 | 11600 | 0.3337 | 0.8974 |
| 0.3224 | 29.49 | 11700 | 0.3332 | 0.8974 |
| 0.3224 | 29.74 | 11800 | 0.3318 | 0.8974 |
| 0.3363 | 29.99 | 11900 | 0.3310 | 0.8974 |
| 0.327 | 30.25 | 12000 | 0.3307 | 0.8974 |
| 0.3291 | 30.5 | 12100 | 0.3307 | 0.8974 |
| 0.3369 | 30.75 | 12200 | 0.3322 | 0.8974 |
| 0.3211 | 31.0 | 12300 | 0.3329 | 0.8974 |
| 0.329 | 31.25 | 12400 | 0.3321 | 0.8974 |
| 0.3206 | 31.51 | 12500 | 0.3309 | 0.8974 |
| 0.3339 | 31.76 | 12600 | 0.3332 | 0.8974 |
| 0.3323 | 32.01 | 12700 | 0.3316 | 0.8974 |
| 0.3273 | 32.26 | 12800 | 0.3323 | 0.8974 |
| 0.3362 | 32.51 | 12900 | 0.3307 | 0.8974 |
| 0.3387 | 32.77 | 13000 | 0.3309 | 0.8974 |
| 0.3173 | 33.02 | 13100 | 0.3311 | 0.8974 |
| 0.3291 | 33.27 | 13200 | 0.3309 | 0.8974 |
| 0.3316 | 33.52 | 13300 | 0.3315 | 0.8974 |
| 0.3366 | 33.77 | 13400 | 0.3332 | 0.8974 |
| 0.3115 | 34.03 | 13500 | 0.3383 | 0.8974 |
| 0.3275 | 34.28 | 13600 | 0.3324 | 0.8974 |
| 0.3373 | 34.53 | 13700 | 0.3315 | 0.8974 |
| 0.3247 | 34.78 | 13800 | 0.3313 | 0.8974 |
| 0.3349 | 35.03 | 13900 | 0.3325 | 0.8974 |
| 0.3223 | 35.29 | 14000 | 0.3312 | 0.8974 |
| 0.3321 | 35.54 | 14100 | 0.3308 | 0.8974 |
| 0.3304 | 35.79 | 14200 | 0.3316 | 0.8974 |
| 0.3262 | 36.04 | 14300 | 0.3320 | 0.8974 |
| 0.3239 | 36.29 | 14400 | 0.3317 | 0.8974 |
| 0.3325 | 36.55 | 14500 | 0.3308 | 0.8974 |
| 0.325 | 36.8 | 14600 | 0.3316 | 0.8974 |
| 0.3416 | 37.05 | 14700 | 0.3311 | 0.8974 |
| 0.3226 | 37.3 | 14800 | 0.3309 | 0.8974 |
| 0.3286 | 37.56 | 14900 | 0.3307 | 0.8974 |
| 0.3284 | 37.81 | 15000 | 0.3312 | 0.8974 |
| 0.3298 | 38.06 | 15100 | 0.3326 | 0.8974 |
| 0.3383 | 38.31 | 15200 | 0.3311 | 0.8974 |
| 0.3418 | 38.56 | 15300 | 0.3308 | 0.8974 |
| 0.3123 | 38.82 | 15400 | 0.3311 | 0.8974 |
| 0.3237 | 39.07 | 15500 | 0.3346 | 0.8974 |
| 0.3261 | 39.32 | 15600 | 0.3325 | 0.8974 |
| 0.3269 | 39.57 | 15700 | 0.3312 | 0.8974 |
| 0.3267 | 39.82 | 15800 | 0.3319 | 0.8974 |
| 0.3381 | 40.08 | 15900 | 0.3327 | 0.8974 |
| 0.3238 | 40.33 | 16000 | 0.3326 | 0.8974 |
| 0.3299 | 40.58 | 16100 | 0.3320 | 0.8974 |
| 0.3385 | 40.83 | 16200 | 0.3309 | 0.8974 |
| 0.3268 | 41.08 | 16300 | 0.3322 | 0.8974 |
| 0.3253 | 41.34 | 16400 | 0.3320 | 0.8974 |
| 0.3261 | 41.59 | 16500 | 0.3314 | 0.8974 |
| 0.3362 | 41.84 | 16600 | 0.3324 | 0.8974 |
| 0.3203 | 42.09 | 16700 | 0.3326 | 0.8974 |
| 0.325 | 42.34 | 16800 | 0.3323 | 0.8974 |
| 0.3172 | 42.6 | 16900 | 0.3326 | 0.8974 |
| 0.3361 | 42.85 | 17000 | 0.3308 | 0.8974 |
| 0.3432 | 43.1 | 17100 | 0.3310 | 0.8974 |
| 0.3396 | 43.35 | 17200 | 0.3313 | 0.8974 |
| 0.3163 | 43.6 | 17300 | 0.3328 | 0.8974 |
| 0.3353 | 43.86 | 17400 | 0.3318 | 0.8974 |
| 0.3299 | 44.11 | 17500 | 0.3317 | 0.8974 |
| 0.3213 | 44.36 | 17600 | 0.3319 | 0.8974 |
| 0.3253 | 44.61 | 17700 | 0.3329 | 0.8974 |
| 0.3391 | 44.86 | 17800 | 0.3322 | 0.8974 |
| 0.3179 | 45.12 | 17900 | 0.3330 | 0.8974 |
| 0.3348 | 45.37 | 18000 | 0.3321 | 0.8974 |
| 0.3116 | 45.62 | 18100 | 0.3326 | 0.8974 |
| 0.3334 | 45.87 | 18200 | 0.3322 | 0.8974 |
| 0.3401 | 46.12 | 18300 | 0.3315 | 0.8974 |
| 0.3381 | 46.38 | 18400 | 0.3311 | 0.8974 |
| 0.3154 | 46.63 | 18500 | 0.3327 | 0.8974 |
| 0.3348 | 46.88 | 18600 | 0.3322 | 0.8974 |
| 0.3285 | 47.13 | 18700 | 0.3325 | 0.8974 |
| 0.3256 | 47.39 | 18800 | 0.3329 | 0.8974 |
| 0.3389 | 47.64 | 18900 | 0.3325 | 0.8974 |
| 0.3288 | 47.89 | 19000 | 0.3327 | 0.8974 |
| 0.3172 | 48.14 | 19100 | 0.3327 | 0.8974 |
| 0.3211 | 48.39 | 19200 | 0.3325 | 0.8974 |
| 0.3348 | 48.65 | 19300 | 0.3325 | 0.8974 |
| 0.3327 | 48.9 | 19400 | 0.3326 | 0.8974 |
| 0.3341 | 49.15 | 19500 | 0.3326 | 0.8974 |
| 0.3344 | 49.4 | 19600 | 0.3325 | 0.8974 |
| 0.3207 | 49.65 | 19700 | 0.3326 | 0.8974 |
| 0.3299 | 49.91 | 19800 | 0.3326 | 0.8974 |
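
Note that the validation metrics plateau at a loss of roughly 0.33 and an accuracy of 0.8974 from about epoch 3 onward; the evaluation results reported at the top of this card appear to come from the best checkpoint at step 300 (epoch 0.76) rather than from the final epoch.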

### Framework versions

- Transformers 4.34.0.dev0
- Pytorch 2.0.0.post302
- Datasets 2.14.5
- Tokenizers 0.13.3