---
license: apache-2.0
library_name: peft
tags:
  - generated_from_trainer
metrics:
  - accuracy
base_model: distilbert-base-uncased
model-index:
  - name: myclassification
    results: []
---

# myclassification

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1432
- Accuracy: 0.9388

## Model description

More information needed

## Intended uses & limitations

More information needed
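
Until usage details are filled in, the sketch below shows one plausible way to run inference with this adapter, following the standard PEFT loading pattern: load the distilbert-base-uncased base model for sequence classification, then apply the LoRA-style adapter on top. The repo id `MaggieZhang/myclassification` is an assumption based on this card's name, and the label set is unknown, so the predicted index is not interpreted.

```python
import torch
from peft import PeftConfig, PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

adapter_id = "MaggieZhang/myclassification"  # hypothetical repo id for this card

# The adapter config records the base model (distilbert-base-uncased).
config = PeftConfig.from_pretrained(adapter_id)
base = AutoModelForSequenceClassification.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(base, adapter_id)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

inputs = tokenizer("An example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# The meaning of each class index depends on the (undocumented) training labels.
print(logits.argmax(dim=-1).item())
```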

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
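
With a linear scheduler and no recorded warmup, the learning rate presumably decays from 5e-05 at step 0 to 0 at the final step (62,500 steps = 100 epochs × 625 steps, per the results table below). A minimal sketch of that schedule, assuming zero warmup steps and the usual Transformers-style linear ramp/decay:

```python
def linear_lr(step, base_lr=5e-5, total_steps=62_500, warmup_steps=0):
    """Linear schedule in the style of get_linear_schedule_with_warmup:
    ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, halfway point, and end of training:
print(linear_lr(0), linear_lr(31_250), linear_lr(62_500))
```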

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.6881        | 1.0   | 625   | 0.5453          | 0.7528   |
| 0.5585        | 2.0   | 1250  | 0.4954          | 0.7574   |
| 0.5185        | 3.0   | 1875  | 0.4485          | 0.8018   |
| 0.4635        | 4.0   | 2500  | 0.4274          | 0.8236   |
| 0.4556        | 5.0   | 3125  | 0.4262          | 0.8264   |
| 0.431         | 6.0   | 3750  | 0.4520          | 0.8258   |
| 0.4422        | 7.0   | 4375  | 0.4324          | 0.829    |
| 0.4276        | 8.0   | 5000  | 0.3828          | 0.8342   |
| 0.4137        | 9.0   | 5625  | 0.4053          | 0.8306   |
| 0.4282        | 10.0  | 6250  | 0.3915          | 0.834    |
| 0.4131        | 11.0  | 6875  | 0.4001          | 0.8342   |
| 0.403         | 12.0  | 7500  | 0.3894          | 0.834    |
| 0.4098        | 13.0  | 8125  | 0.3739          | 0.8352   |
| 0.3976        | 14.0  | 8750  | 0.3936          | 0.8298   |
| 0.4015        | 15.0  | 9375  | 0.3794          | 0.836    |
| 0.3979        | 16.0  | 10000 | 0.3737          | 0.841    |
| 0.3894        | 17.0  | 10625 | 0.3610          | 0.8364   |
| 0.3884        | 18.0  | 11250 | 0.3530          | 0.8312   |
| 0.3852        | 19.0  | 11875 | 0.3564          | 0.8348   |
| 0.3806        | 20.0  | 12500 | 0.3507          | 0.842    |
| 0.3803        | 21.0  | 13125 | 0.3439          | 0.8392   |
| 0.3757        | 22.0  | 13750 | 0.3391          | 0.8386   |
| 0.37          | 23.0  | 14375 | 0.3244          | 0.8428   |
| 0.3781        | 24.0  | 15000 | 0.3200          | 0.8442   |
| 0.3662        | 25.0  | 15625 | 0.3418          | 0.8458   |
| 0.3515        | 26.0  | 16250 | 0.3043          | 0.8522   |
| 0.3615        | 27.0  | 16875 | 0.2973          | 0.8606   |
| 0.3532        | 28.0  | 17500 | 0.3105          | 0.8558   |
| 0.3498        | 29.0  | 18125 | 0.2971          | 0.8664   |
| 0.3564        | 30.0  | 18750 | 0.3051          | 0.8684   |
| 0.3469        | 31.0  | 19375 | 0.3050          | 0.8688   |
| 0.349         | 32.0  | 20000 | 0.2813          | 0.864    |
| 0.3294        | 33.0  | 20625 | 0.2898          | 0.8716   |
| 0.3371        | 34.0  | 21250 | 0.2921          | 0.8728   |
| 0.3254        | 35.0  | 21875 | 0.2812          | 0.8744   |
| 0.3382        | 36.0  | 22500 | 0.2816          | 0.8622   |
| 0.3402        | 37.0  | 23125 | 0.2905          | 0.873    |
| 0.3333        | 38.0  | 23750 | 0.2832          | 0.863    |
| 0.3084        | 39.0  | 24375 | 0.3017          | 0.8734   |
| 0.3421        | 40.0  | 25000 | 0.2876          | 0.8718   |
| 0.3113        | 41.0  | 25625 | 0.2759          | 0.8642   |
| 0.3223        | 42.0  | 26250 | 0.2814          | 0.8746   |
| 0.3154        | 43.0  | 26875 | 0.2691          | 0.8684   |
| 0.3185        | 44.0  | 27500 | 0.2780          | 0.8726   |
| 0.3074        | 45.0  | 28125 | 0.2596          | 0.88     |
| 0.3037        | 46.0  | 28750 | 0.2645          | 0.8822   |
| 0.3035        | 47.0  | 29375 | 0.2498          | 0.8848   |
| 0.3144        | 48.0  | 30000 | 0.2552          | 0.8742   |
| 0.3057        | 49.0  | 30625 | 0.2453          | 0.8876   |
| 0.2972        | 50.0  | 31250 | 0.2412          | 0.891    |
| 0.2962        | 51.0  | 31875 | 0.2394          | 0.8938   |
| 0.2931        | 52.0  | 32500 | 0.2502          | 0.8948   |
| 0.2908        | 53.0  | 33125 | 0.2398          | 0.8972   |
| 0.288         | 54.0  | 33750 | 0.2314          | 0.8972   |
| 0.2872        | 55.0  | 34375 | 0.2221          | 0.9016   |
| 0.2885        | 56.0  | 35000 | 0.2404          | 0.8932   |
| 0.2828        | 57.0  | 35625 | 0.2145          | 0.9046   |
| 0.2786        | 58.0  | 36250 | 0.2171          | 0.9038   |
| 0.267         | 59.0  | 36875 | 0.2191          | 0.9062   |
| 0.2689        | 60.0  | 37500 | 0.2012          | 0.9084   |
| 0.2716        | 61.0  | 38125 | 0.2061          | 0.9096   |
| 0.2707        | 62.0  | 38750 | 0.2156          | 0.912    |
| 0.275         | 63.0  | 39375 | 0.1997          | 0.911    |
| 0.2355        | 64.0  | 40000 | 0.1991          | 0.9128   |
| 0.2692        | 65.0  | 40625 | 0.1910          | 0.914    |
| 0.2591        | 66.0  | 41250 | 0.1833          | 0.9166   |
| 0.2694        | 67.0  | 41875 | 0.1838          | 0.9228   |
| 0.2762        | 68.0  | 42500 | 0.1776          | 0.9244   |
| 0.2596        | 69.0  | 43125 | 0.1820          | 0.924    |
| 0.2624        | 70.0  | 43750 | 0.1893          | 0.9218   |
| 0.2442        | 71.0  | 44375 | 0.1764          | 0.9234   |
| 0.2601        | 72.0  | 45000 | 0.1652          | 0.9292   |
| 0.2614        | 73.0  | 45625 | 0.1701          | 0.9232   |
| 0.2579        | 74.0  | 46250 | 0.1627          | 0.9308   |
| 0.2562        | 75.0  | 46875 | 0.1616          | 0.9306   |
| 0.244         | 76.0  | 47500 | 0.1630          | 0.9312   |
| 0.2368        | 77.0  | 48125 | 0.1616          | 0.9298   |
| 0.2619        | 78.0  | 48750 | 0.1658          | 0.93     |
| 0.2249        | 79.0  | 49375 | 0.1596          | 0.9316   |
| 0.254         | 80.0  | 50000 | 0.1525          | 0.9334   |
| 0.2467        | 81.0  | 50625 | 0.1596          | 0.9336   |
| 0.2311        | 82.0  | 51250 | 0.1577          | 0.932    |
| 0.2422        | 83.0  | 51875 | 0.1502          | 0.9346   |
| 0.2224        | 84.0  | 52500 | 0.1500          | 0.9358   |
| 0.2377        | 85.0  | 53125 | 0.1499          | 0.937    |
| 0.2442        | 86.0  | 53750 | 0.1498          | 0.9364   |
| 0.2285        | 87.0  | 54375 | 0.1506          | 0.9354   |
| 0.2361        | 88.0  | 55000 | 0.1479          | 0.9362   |
| 0.2416        | 89.0  | 55625 | 0.1461          | 0.9372   |
| 0.2315        | 90.0  | 56250 | 0.1462          | 0.9362   |
| 0.2282        | 91.0  | 56875 | 0.1471          | 0.9348   |
| 0.2293        | 92.0  | 57500 | 0.1479          | 0.9348   |
| 0.2246        | 93.0  | 58125 | 0.1484          | 0.9376   |
| 0.2568        | 94.0  | 58750 | 0.1434          | 0.9384   |
| 0.2356        | 95.0  | 59375 | 0.1454          | 0.9374   |
| 0.2357        | 96.0  | 60000 | 0.1432          | 0.9378   |
| 0.2301        | 97.0  | 60625 | 0.1421          | 0.9386   |
| 0.2321        | 98.0  | 61250 | 0.1425          | 0.9386   |
| 0.241         | 99.0  | 61875 | 0.1427          | 0.9392   |
| 0.2283        | 100.0 | 62500 | 0.1432          | 0.9388   |

## Framework versions

- PEFT 0.8.2
- Transformers 4.39.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2