vazish/distilbert-fine-tuned-autofill
---
library_name: transformers
license: apache-2.0
base_model: distilbert/distilbert-base-multilingual-cased
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
model-index:
  - name: fine-tuned-distilbert-autofill
    results: []
---

# fine-tuned-distilbert-autofill

This model is a fine-tuned version of distilbert/distilbert-base-multilingual-cased on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0516
- Precision: 0.9887
- Recall: 0.9876
- F1: 0.9878
- Confusion Matrix: [[93, 7, 0], [15, 43, 0], [11, 0, 2489]]
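The reported precision, recall, and F1 are consistent with support-weighted per-class averages over the confusion matrix (assuming rows are true labels and columns are predictions, as in the Hugging Face evaluation convention). A minimal sketch of that computation:

```python
# Sketch: recover the reported weighted-average metrics from the
# evaluation confusion matrix (rows = true labels, cols = predictions).
cm = [
    [93, 7, 0],
    [15, 43, 0],
    [11, 0, 2489],
]

n = len(cm)
support = [sum(row) for row in cm]                       # true examples per class
predicted = [sum(cm[i][j] for i in range(n)) for j in range(n)]
total = sum(support)

weighted_p = weighted_r = weighted_f1 = 0.0
for c in range(n):
    tp = cm[c][c]
    precision = tp / predicted[c] if predicted[c] else 0.0
    recall = tp / support[c] if support[c] else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    weight = support[c] / total
    weighted_p += weight * precision
    weighted_r += weight * recall
    weighted_f1 += weight * f1

print(f"precision={weighted_p:.4f} recall={weighted_r:.4f} f1={weighted_f1:.4f}")
# precision=0.9887 recall=0.9876 f1=0.9878 — matches the card
```

Note that the third class dominates the support (2500 of 2658 examples), so the weighted averages mask the much weaker second-class precision and recall visible in the matrix.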

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
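These settings map onto Hugging Face `TrainingArguments` roughly as follows. This is a hedged sketch: the output directory is a placeholder, not something stated in the card, and the dataset/model wiring is omitted.

```python
from transformers import TrainingArguments

# Sketch reconstructing the listed hyperparameters; output_dir is a
# placeholder (the card does not specify one).
args = TrainingArguments(
    output_dir="fine-tuned-distilbert-autofill",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```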

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Confusion Matrix                         |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:-----------------------------------------|
| No log        | 1.0   | 367  | 0.0597          | 0.9626    | 0.9733 | 0.9659 | [[100, 0, 0], [58, 0, 0], [13, 0, 2487]] |
| 0.1186        | 2.0   | 734  | 0.0664          | 0.9622    | 0.9722 | 0.9650 | [[100, 0, 0], [58, 0, 0], [16, 0, 2484]] |
| 0.1138        | 3.0   | 1101 | 0.0447          | 0.9873    | 0.9853 | 0.9851 | [[96, 4, 0], [25, 33, 0], [9, 1, 2490]]  |
| 0.1138        | 4.0   | 1468 | 0.0459          | 0.9870    | 0.9857 | 0.9858 | [[92, 8, 0], [20, 38, 0], [10, 0, 2490]] |
| 0.094         | 5.0   | 1835 | 0.0518          | 0.9872    | 0.9865 | 0.9867 | [[90, 10, 0], [16, 42, 0], [8, 2, 2490]] |
| 0.0725        | 6.0   | 2202 | 0.0606          | 0.9836    | 0.9808 | 0.9819 | [[91, 9, 0], [15, 43, 0], [11, 16, 2473]] |
| 0.0811        | 7.0   | 2569 | 0.0572          | 0.9864    | 0.9846 | 0.9849 | [[93, 7, 0], [19, 39, 0], [14, 1, 2485]] |
| 0.0811        | 8.0   | 2936 | 0.0610          | 0.9861    | 0.9846 | 0.9851 | [[89, 11, 0], [15, 43, 0], [15, 0, 2485]] |
| 0.0602        | 9.0   | 3303 | 0.0465          | 0.9885    | 0.9868 | 0.9869 | [[95, 5, 0], [19, 39, 0], [11, 0, 2489]] |
| 0.0457        | 10.0  | 3670 | 0.0516          | 0.9887    | 0.9876 | 0.9878 | [[93, 7, 0], [15, 43, 0], [11, 0, 2489]] |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1