---
library_name: transformers
license: mit
base_model: roberta-base
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: roberta-base-downstream-build_rr
    results: []
---

# roberta-base-downstream-build_rr

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Precision: 0.1983
- Recall: 0.3587
- F1: 0.2554
- Micro-f1: 0.2554
- Accuracy: 0.9191
- Loss: 0.2640
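
Below is a minimal inference sketch. The card does not state the downstream task; the precision/recall/F1/accuracy metric set suggests a token- or sentence-classification head, and the sketch assumes token classification. The hub id `MHGanainy/roberta-base-downstream-build_rr` is inferred from the model name and is an assumption.

```python
# Minimal inference sketch. Assumptions not confirmed by this card:
# - the model carries a token-classification head
# - it is hosted under the hub id "MHGanainy/roberta-base-downstream-build_rr"
from transformers import pipeline

nlp = pipeline(
    "token-classification",
    model="MHGanainy/roberta-base-downstream-build_rr",  # hypothetical hub id
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)
print(nlp("The appellant filed the petition on 12 March 2019."))
```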

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20.0
- mixed_precision_training: Native AMP
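
These values map directly onto the `Trainer` API. A sketch, assuming the standard `transformers.TrainingArguments` interface; dataset loading, the model head, and the output directory name are not specified by this card:

```python
# Sketch of a TrainingArguments setup matching the hyperparameters above.
# Only values stated in the card are set; everything else keeps Trainer defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base-downstream-build_rr",  # assumed output directory
    learning_rate=3e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=1,
    lr_scheduler_type="linear",
    num_train_epochs=20.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
# optimizer, so no explicit optimizer configuration is needed.
```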

### Training results

| Training Loss | Epoch | Step | Precision | Recall | F1     | Micro-f1 | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:---------:|:------:|:------:|:--------:|:--------:|:---------------:|
| No log        | 1.0   | 62   | 0.0835    | 0.1152 | 0.0968 | 0.0968   | 0.8780   | 0.4226          |
| No log        | 2.0   | 124  | 0.1537    | 0.2696 | 0.1957 | 0.1957   | 0.8931   | 0.3475          |
| No log        | 3.0   | 186  | 0.1875    | 0.3391 | 0.2415 | 0.2415   | 0.9052   | 0.2912          |
| No log        | 4.0   | 248  | 0.1992    | 0.3304 | 0.2486 | 0.2486   | 0.9003   | 0.2991          |
| No log        | 5.0   | 310  | 0.1784    | 0.3870 | 0.2442 | 0.2442   | 0.9066   | 0.2833          |
| No log        | 6.0   | 372  | 0.2206    | 0.3543 | 0.2719 | 0.2719   | 0.9148   | 0.2642          |
| No log        | 7.0   | 434  | 0.2300    | 0.3630 | 0.2816 | 0.2816   | 0.9177   | 0.2584          |
| No log        | 8.0   | 496  | 0.2179    | 0.3696 | 0.2742 | 0.2742   | 0.9177   | 0.2523          |
| 0.4245        | 9.0   | 558  | 0.1921    | 0.3696 | 0.2528 | 0.2528   | 0.9167   | 0.2630          |
| 0.4245        | 10.0  | 620  | 0.1983    | 0.3587 | 0.2554 | 0.2554   | 0.9191   | 0.2640          |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
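
A quick sketch for checking that a local environment matches these versions; note that the `+cu121` suffix on PyTorch is a CUDA 12.1 build tag and appears in `torch.__version__`:

```python
# Compare installed package versions against those listed in this card.
import importlib

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu121",  # the +cu121 suffix is the CUDA 12.1 build tag
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
for name, want in expected.items():
    have = importlib.import_module(name).__version__
    status = "OK" if have == want else "differs"
    print(f"{name}: installed {have}, card used {want} ({status})")
```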