---
license: mit
base_model: SCUT-DLVCLab/lilt-roberta-en-base
tags:
  - generated_from_trainer
model-index:
  - name: lilt-en-aadhaar-red
    results: []
---

# lilt-en-aadhaar-red

This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on a dataset that is not documented in this card. It achieves the following results on the evaluation set:

- Loss: 0.0287

Per-entity evaluation metrics:

| Label | Precision | Recall | F1 | Number |
|:--|--:|--:|--:|--:|
| Adhaar Number | 0.9743589743589743 | 0.9743589743589743 | 0.9743589743589743 | 39 |
| Ame | 1.0 | 1.0 | 1.0 | 23 |
| Ather Name | 1.0 | 1.0 | 1.0 | 2 |
| Ather Name Back | 1.0 | 1.0 | 1.0 | 19 |
| Ather Name Front Top | 0.9166666666666666 | 1.0 | 0.9565217391304348 | 11 |
| Ddress Back | 0.9512195121951219 | 0.9629629629629629 | 0.9570552147239264 | 81 |
| Ddress Front | 0.9615384615384616 | 0.9615384615384616 | 0.9615384615384616 | 52 |
| Ender | 0.9523809523809523 | 0.9523809523809523 | 0.9523809523809523 | 21 |
| Ob | 0.9545454545454546 | 1.0 | 0.9767441860465117 | 21 |
| Obile Number | 1.0 | 1.0 | 1.0 | 10 |
| Ther | 0.958974358974359 | 0.9689119170984456 | 0.9639175257731959 | 193 |

The entity names are shown exactly as reported by the evaluation; the leading letter of each label appears to have been stripped by the metric's IOB-prefix handling (e.g. "Ame" for "Name", "Ob" for "DOB", "Ther" for "Other").

Overall metrics:

- Overall Precision: 0.9623
- Overall Recall: 0.9725
- Overall F1: 0.9673
- Overall Accuracy: 0.9973
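
For orientation, the overall F1 is just the harmonic mean of the overall precision and recall; plugging in the rounded values above reproduces the reported 0.9673 up to rounding:

$$
F_1 = \frac{2 \cdot P \cdot R}{P + R} = \frac{2 \times 0.9623 \times 0.9725}{0.9623 + 0.9725} \approx 0.9674
$$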

## Model description

More information needed

## Intended uses & limitations

More information needed
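
The intended-use details have not been filled in yet. Purely as a hedged illustration of how a LiLT token-classification checkpoint like this one can be loaded for inference with Transformers: the repository id, the example words, and the bounding boxes below are assumptions, and word boxes must come from your own OCR step, normalized to a 0-1000 coordinate space.

```python
# Minimal inference sketch. Assumptions: the Hub id, example words, and boxes
# below are illustrative; LiLT expects word boxes normalized to 0-1000.
import torch
from transformers import AutoTokenizer, LiltForTokenClassification

model_id = "prashantloni/lilt-en-aadhaar-red"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id, add_prefix_space=True)
model = LiltForTokenClassification.from_pretrained(model_id)

# Hypothetical OCR output: one box per word, already normalized to 0-1000.
words = ["Name", ":", "John", "Doe"]
boxes = [[110, 90, 190, 120], [195, 90, 205, 120], [220, 90, 300, 120], [310, 90, 390, 120]]

encoding = tokenizer(words, is_split_into_words=True, truncation=True, return_tensors="pt")

# Expand word-level boxes to token level; special tokens get a dummy [0, 0, 0, 0] box.
word_ids = encoding.word_ids(batch_index=0)
encoding["bbox"] = torch.tensor(
    [[boxes[i] if i is not None else [0, 0, 0, 0] for i in word_ids]]
)

with torch.no_grad():
    logits = model(**encoding).logits

predicted = logits.argmax(-1).squeeze(0).tolist()
for word_id, label_id in zip(word_ids, predicted):
    if word_id is not None:
        print(words[word_id], "->", model.config.id2label[label_id])
```

Note that this prints one predicted label per subword token; aggregating subwords back to words (for example, keeping the first subword's label) is left to the caller.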

## Training and evaluation data

More information needed
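
The training and evaluation data are not documented. Purely as an illustration of the record layout that LiLT token-classification fine-tuning typically consumes, here is a sketch in which the field names, label set, and values are hypothetical and not taken from this model's actual data:

```python
# Hypothetical example record: words, 0-1000-normalized boxes, and per-word label ids.
# The label names below are invented for illustration only.
from datasets import ClassLabel, Dataset, Features, Sequence, Value

label_names = ["O", "B-AADHAAR_NUMBER", "I-AADHAAR_NUMBER", "B-NAME", "I-NAME"]

features = Features({
    "id": Value("string"),
    "words": Sequence(Value("string")),
    "bboxes": Sequence(Sequence(Value("int64"))),   # one [x0, y0, x1, y1] box per word
    "ner_tags": Sequence(ClassLabel(names=label_names)),
})

example = {
    "id": "0",
    "words": ["Name", ":", "John", "Doe"],
    "bboxes": [[110, 90, 190, 120], [195, 90, 205, 120], [220, 90, 300, 120], [310, 90, 390, 120]],
    "ner_tags": [0, 0, 3, 4],
}

dataset = Dataset.from_list([example], features=features)
print(dataset.features["ner_tags"].feature.names)
```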

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2500
- mixed_precision_training: Native AMP
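
For reference, a hedged sketch of the equivalent `TrainingArguments` (Transformers 4.40.x): `output_dir` and the evaluation cadence are assumptions inferred from the results table below, and the Adam betas/epsilon listed above are the Trainer defaults, so they need no explicit arguments.

```python
# Sketch of TrainingArguments matching the listed hyperparameters; output_dir and
# the evaluation settings are assumptions, not taken from this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lilt-en-aadhaar-red",   # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=2500,
    fp16=True,                          # "Native AMP" mixed precision
    evaluation_strategy="steps",        # assumed; the results below are logged every 200 steps
    eval_steps=200,
)
```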

### Training results

| Training Loss | Epoch | Step | Validation Loss | Adhaar Number | Ame | Ather Name | Ather Name Back | Ather Name Front Top | Ddress Back | Ddress Front | Ender | Ob | Obile Number | Ther | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-------------:|:---:|:----------:|:---------------:|:--------------------:|:-----------:|:------------:|:-----:|:---:|:------------:|:----:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.1651 | 10.0 | 200 | 0.0226 | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 39} | {'precision': 0.9130434782608695, 'recall': 0.9130434782608695, 'f1': 0.9130434782608695, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.926829268292683, 'recall': 0.9382716049382716, 'f1': 0.9325153374233128, 'number': 81} | {'precision': 0.9811320754716981, 'recall': 1.0, 'f1': 0.9904761904761905, 'number': 52} | {'precision': 0.9047619047619048, 'recall': 0.9047619047619048, 'f1': 0.9047619047619048, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9384615384615385, 'recall': 0.9481865284974094, 'f1': 0.9432989690721649, 'number': 193} | 0.9497 | 0.9597 | 0.9547 | 0.9962 |
| 0.004 | 20.0 | 400 | 0.0270 | {'precision': 0.9487179487179487, 'recall': 0.9487179487179487, 'f1': 0.9487179487179487, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.926829268292683, 'recall': 0.9382716049382716, 'f1': 0.9325153374233128, 'number': 81} | {'precision': 0.9615384615384616, 'recall': 0.9615384615384616, 'f1': 0.9615384615384616, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9090909090909091, 'recall': 0.9523809523809523, 'f1': 0.9302325581395349, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9333333333333333, 'recall': 0.9430051813471503, 'f1': 0.9381443298969072, 'number': 193} | 0.9454 | 0.9534 | 0.9494 | 0.9964 |
| 0.0016 | 30.0 | 600 | 0.0321 | {'precision': 0.925, 'recall': 0.9487179487179487, 'f1': 0.9367088607594937, 'number': 39} | {'precision': 0.9565217391304348, 'recall': 0.9565217391304348, 'f1': 0.9565217391304348, 'number': 23} | {'precision': 0.6666666666666666, 'recall': 1.0, 'f1': 0.8, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.9146341463414634, 'recall': 0.9259259259259259, 'f1': 0.9202453987730062, 'number': 81} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9282051282051282, 'recall': 0.9378238341968912, 'f1': 0.9329896907216495, 'number': 193} | 0.9414 | 0.9534 | 0.9474 | 0.9959 |
| 0.0013 | 40.0 | 800 | 0.0243 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.9390243902439024, 'recall': 0.9506172839506173, 'f1': 0.9447852760736196, 'number': 81} | {'precision': 0.9803921568627451, 'recall': 0.9615384615384616, 'f1': 0.970873786407767, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9487179487179487, 'recall': 0.9585492227979274, 'f1': 0.9536082474226804, 'number': 193} | 0.96 | 0.9661 | 0.9630 | 0.9973 |
| 0.0006 | 50.0 | 1000 | 0.0400 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 0.8947368421052632, 'f1': 0.9444444444444444, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.8902439024390244, 'recall': 0.9012345679012346, 'f1': 0.8957055214723927, 'number': 81} | {'precision': 0.9803921568627451, 'recall': 0.9615384615384616, 'f1': 0.970873786407767, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9384615384615385, 'recall': 0.9481865284974094, 'f1': 0.9432989690721649, 'number': 193} | 0.9471 | 0.9492 | 0.9481 | 0.9951 |
| 0.0003 | 60.0 | 1200 | 0.0323 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 0.9565217391304348, 'recall': 0.9565217391304348, 'f1': 0.9565217391304348, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 0.9166666666666666, 'recall': 1.0, 'f1': 0.9565217391304348, 'number': 11} | {'precision': 0.926829268292683, 'recall': 0.9382716049382716, 'f1': 0.9325153374233128, 'number': 81} | {'precision': 0.9423076923076923, 'recall': 0.9423076923076923, 'f1': 0.9423076923076923, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9384615384615385, 'recall': 0.9481865284974094, 'f1': 0.9432989690721649, 'number': 193} | 0.9455 | 0.9555 | 0.9505 | 0.9964 |
| 0.0005 | 70.0 | 1400 | 0.0287 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 0.9166666666666666, 'recall': 1.0, 'f1': 0.9565217391304348, 'number': 11} | {'precision': 0.9512195121951219, 'recall': 0.9629629629629629, 'f1': 0.9570552147239264, 'number': 81} | {'precision': 0.9615384615384616, 'recall': 0.9615384615384616, 'f1': 0.9615384615384616, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.958974358974359, 'recall': 0.9689119170984456, 'f1': 0.9639175257731959, 'number': 193} | 0.9623 | 0.9725 | 0.9673 | 0.9973 |
| 0.0004 | 80.0 | 1600 | 0.0417 | {'precision': 0.9487179487179487, 'recall': 0.9487179487179487, 'f1': 0.9487179487179487, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 0.9166666666666666, 'recall': 1.0, 'f1': 0.9565217391304348, 'number': 11} | {'precision': 0.9036144578313253, 'recall': 0.9259259259259259, 'f1': 0.9146341463414634, 'number': 81} | {'precision': 0.9607843137254902, 'recall': 0.9423076923076923, 'f1': 0.9514563106796117, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9285714285714286, 'recall': 0.9430051813471503, 'f1': 0.9357326478149101, 'number': 193} | 0.9393 | 0.9513 | 0.9453 | 0.9951 |
| 0.0001 | 90.0 | 1800 | 0.0362 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.9146341463414634, 'recall': 0.9259259259259259, 'f1': 0.9202453987730062, 'number': 81} | {'precision': 0.9803921568627451, 'recall': 0.9615384615384616, 'f1': 0.970873786407767, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9384615384615385, 'recall': 0.9481865284974094, 'f1': 0.9432989690721649, 'number': 193} | 0.9516 | 0.9576 | 0.9546 | 0.9964 |
| 0.0001 | 100.0 | 2000 | 0.0378 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.9146341463414634, 'recall': 0.9259259259259259, 'f1': 0.9202453987730062, 'number': 81} | {'precision': 0.9615384615384616, 'recall': 0.9615384615384616, 'f1': 0.9615384615384616, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9336734693877551, 'recall': 0.9481865284974094, 'f1': 0.9408740359897172, 'number': 193} | 0.9476 | 0.9576 | 0.9526 | 0.9962 |
| 0.0001 | 110.0 | 2200 | 0.0379 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 0.9565217391304348, 'recall': 0.9565217391304348, 'f1': 0.9565217391304348, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.9146341463414634, 'recall': 0.9259259259259259, 'f1': 0.9202453987730062, 'number': 81} | {'precision': 0.9615384615384616, 'recall': 0.9615384615384616, 'f1': 0.9615384615384616, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9285714285714286, 'recall': 0.9430051813471503, 'f1': 0.9357326478149101, 'number': 193} | 0.9434 | 0.9534 | 0.9484 | 0.9959 |
| 0.0001 | 120.0 | 2400 | 0.0361 | {'precision': 0.9743589743589743, 'recall': 0.9743589743589743, 'f1': 0.9743589743589743, 'number': 39} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 23} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 2} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 19} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 11} | {'precision': 0.9146341463414634, 'recall': 0.9259259259259259, 'f1': 0.9202453987730062, 'number': 81} | {'precision': 0.9615384615384616, 'recall': 0.9615384615384616, 'f1': 0.9615384615384616, 'number': 52} | {'precision': 0.9523809523809523, 'recall': 0.9523809523809523, 'f1': 0.9523809523809523, 'number': 21} | {'precision': 0.9545454545454546, 'recall': 1.0, 'f1': 0.9767441860465117, 'number': 21} | {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 10} | {'precision': 0.9336734693877551, 'recall': 0.9481865284974094, 'f1': 0.9408740359897172, 'number': 193} | 0.9476 | 0.9576 | 0.9526 | 0.9962 |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1