---
license: mit
tags:
  - generated_from_trainer
metrics:
  - f1
model-index:
  - name: >-
      fine-tuned-DatasetQAS-IDK-MRC-with-indobert-base-uncased-with-ITTL-without-freeze-LR-1e-05
    results: []
---

# fine-tuned-DatasetQAS-IDK-MRC-with-indobert-base-uncased-with-ITTL-without-freeze-LR-1e-05

This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on the IDK-MRC dataset. It achieves the following results on the evaluation set:

- Loss: 1.0860
- Exact Match: 64.7906
- F1: 70.2020
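The Exact Match and F1 numbers above are the standard SQuAD-style extractive-QA metrics: EM checks whether the normalized predicted answer string equals the gold answer exactly, while F1 measures token overlap between the two. A minimal sketch in plain Python (this is the common convention, not necessarily the exact evaluation script used for this model; the English-article normalization step in particular may differ for Indonesian text):

```python
import re
import string
from collections import Counter

def normalize(text: str) -> str:
    """Lowercase, drop punctuation and English articles, collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(pred: str, gold: str) -> float:
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(pred) == normalize(gold))

def f1_score(pred: str, gold: str) -> float:
    """Token-overlap F1 between normalized prediction and gold answer."""
    pred_toks, gold_toks = normalize(pred).split(), normalize(gold).split()
    common = Counter(pred_toks) & Counter(gold_toks)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_toks)
    recall = overlap / len(gold_toks)
    return 2 * precision * recall / (precision + recall)
```

Scores are averaged over the evaluation set (and multiplied by 100) to give figures such as the 64.79 EM / 70.20 F1 reported here.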

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
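Two quantities in the list above are derived rather than set directly: the total train batch size is the per-device batch size times the gradient-accumulation steps, and a `linear` scheduler (with no warmup) decays the learning rate from its initial value to zero over the total number of optimizer steps. A small sketch of that arithmetic (the 720-step total is taken from the last row of the training-results table; the trainer's actual step count may round slightly differently):

```python
learning_rate = 1e-05
train_batch_size = 8
gradient_accumulation_steps = 16

# Effective batch size per optimizer update: 8 examples x 16 accumulated steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

def linear_lr(step: int, total_steps: int = 720, base_lr: float = learning_rate) -> float:
    """Linear decay from base_lr to 0 with no warmup."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

For example, halfway through training (step 360) the learning rate has decayed to 5e-06.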

### Training results

| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1      |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|:-------:|
| 6.2255        | 0.49  | 36   | 2.4921          | 50.0        | 50.0    |
| 3.6958        | 0.98  | 72   | 1.9696          | 49.6073     | 49.8277 |
| 2.2068        | 1.48  | 108  | 1.8415          | 47.3822     | 48.9302 |
| 2.2068        | 1.97  | 144  | 1.7148          | 48.1675     | 51.1818 |
| 1.9768        | 2.46  | 180  | 1.5553          | 51.8325     | 56.0847 |
| 1.7318        | 2.95  | 216  | 1.4373          | 55.1047     | 59.8473 |
| 1.5469        | 3.45  | 252  | 1.2970          | 58.3770     | 63.3911 |
| 1.5469        | 3.94  | 288  | 1.2882          | 58.9005     | 64.0631 |
| 1.3771        | 4.44  | 324  | 1.2048          | 62.0419     | 66.6696 |
| 1.2296        | 4.93  | 360  | 1.1860          | 61.7801     | 66.8504 |
| 1.2296        | 5.42  | 396  | 1.1807          | 60.3403     | 65.5550 |
| 1.1715        | 5.91  | 432  | 1.1330          | 62.6963     | 67.5995 |
| 1.0833        | 6.41  | 468  | 1.1292          | 62.8272     | 67.7732 |
| 1.025         | 6.9   | 504  | 1.1256          | 63.3508     | 68.7945 |
| 1.025         | 7.4   | 540  | 1.0740          | 64.5288     | 69.8302 |
| 1.0033        | 7.89  | 576  | 1.0828          | 64.5288     | 69.8559 |
| 0.9603        | 8.38  | 612  | 1.0870          | 63.7435     | 69.1867 |
| 0.9603        | 8.87  | 648  | 1.0655          | 65.9686     | 70.8956 |
| 0.94          | 9.37  | 684  | 1.0717          | 65.3141     | 70.5016 |
| 0.9259        | 9.86  | 720  | 1.0860          | 64.7906     | 70.2020 |
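Note that the best validation scores do not occur at the final step: validation loss and F1 peak at step 648 and drift slightly by step 720. A quick way to pick the best checkpoint from logged results, shown here with the last three rows of the table copied into a list (an illustrative snippet, not part of the training script):

```python
# (step, validation_loss, exact_match, f1) for the final logged evaluations.
results = [
    (648, 1.0655, 65.9686, 70.8956),
    (684, 1.0717, 65.3141, 70.5016),
    (720, 1.0860, 64.7906, 70.2020),
]

# Select the checkpoint with the highest validation F1.
best = max(results, key=lambda row: row[3])  # -> (648, 1.0655, 65.9686, 70.8956)
```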

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2