---
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: BERTicSENTNEG4
  results: []
---

# BERTicSENTNEG4

This model is a fine-tuned version of [Tanor/BERTicSENTNEG4](https://huggingface.co/Tanor/BERTicSENTNEG4) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0842
- F1: 0.6275
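
The checkpoint can be loaded with the Hugging Face Transformers sequence-classification API. The snippet below is a minimal sketch, assuming the model is a single-label text classifier (suggested by the F1 metric); the actual label names are not documented in this card.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

# Assumption: Tanor/BERTicSENTNEG4 is a sequence-classification checkpoint;
# the label set is not documented in this card.
model_name = "Tanor/BERTicSENTNEG4"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Ovo je primer rečenice."))  # example Serbian sentence
```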

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 32
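
As a rough sketch, these settings correspond to a Hugging Face `TrainingArguments` configuration along the following lines; `output_dir` and the evaluation strategy are assumptions, since the card does not document them:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters reported above; output_dir and
# evaluation_strategy are assumptions, not values documented in this card.
training_args = TrainingArguments(
    output_dir="BERTicSENTNEG4",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size: 64 * 4 = 256
    num_train_epochs=32,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",     # assumed per-epoch evaluation, matching the results table
)
```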

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 53   | 0.0668          | 0.0    |
| No log        | 2.0   | 106  | 0.0432          | 0.6383 |
| No log        | 3.0   | 159  | 0.0400          | 0.5714 |
| No log        | 4.0   | 212  | 0.0458          | 0.5957 |
| No log        | 5.0   | 265  | 0.0444          | 0.5882 |
| No log        | 6.0   | 318  | 0.0556          | 0.5957 |
| No log        | 7.0   | 371  | 0.0566          | 0.5714 |
| No log        | 8.0   | 424  | 0.0587          | 0.5862 |
| No log        | 9.0   | 477  | 0.0566          | 0.5660 |
| 0.0389        | 10.0  | 530  | 0.0693          | 0.5455 |
| 0.0389        | 11.0  | 583  | 0.0612          | 0.6383 |
| 0.0389        | 12.0  | 636  | 0.0596          | 0.6    |
| 0.0389        | 13.0  | 689  | 0.0671          | 0.6038 |
| 0.0389        | 14.0  | 742  | 0.0740          | 0.5957 |
| 0.0389        | 15.0  | 795  | 0.0799          | 0.5778 |
| 0.0389        | 16.0  | 848  | 0.0702          | 0.5957 |
| 0.0389        | 17.0  | 901  | 0.0737          | 0.6087 |
| 0.0389        | 18.0  | 954  | 0.0674          | 0.5660 |
| 0.0053        | 19.0  | 1007 | 0.0725          | 0.5957 |
| 0.0053        | 20.0  | 1060 | 0.0738          | 0.6    |
| 0.0053        | 21.0  | 1113 | 0.0821          | 0.625  |
| 0.0053        | 22.0  | 1166 | 0.0737          | 0.6    |
| 0.0053        | 23.0  | 1219 | 0.0828          | 0.6122 |
| 0.0053        | 24.0  | 1272 | 0.0776          | 0.6182 |
| 0.0053        | 25.0  | 1325 | 0.0792          | 0.6182 |
| 0.0053        | 26.0  | 1378 | 0.0791          | 0.6275 |
| 0.0053        | 27.0  | 1431 | 0.0812          | 0.6275 |
| 0.0053        | 28.0  | 1484 | 0.0819          | 0.6038 |
| 0.0029        | 29.0  | 1537 | 0.0831          | 0.6275 |
| 0.0029        | 30.0  | 1590 | 0.0834          | 0.6275 |
| 0.0029        | 31.0  | 1643 | 0.0837          | 0.6275 |
| 0.0029        | 32.0  | 1696 | 0.0842          | 0.6275 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1
- Datasets 2.13.1
- Tokenizers 0.13.3
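
To approximate this environment, the listed versions can be pinned at install time, for example:

```bash
# Pins matching the framework versions reported above
pip install transformers==4.30.2 torch==2.0.1 datasets==2.13.1 tokenizers==0.13.3
```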