
left_as_train_context_roberta-large_20e

This model is a fine-tuned version of FacebookAI/roberta-large on an unknown dataset. It achieves the following results on the evaluation set (final epoch):

  • Loss: 3.0530
  • Val Accuracy: 0.7598
  • Val Precision Macro: 0.7129
  • Val Recall Macro: 0.7027
  • Val F1 Macro: 0.7066
  • Val Precision Weighted: 0.7605
  • Val Recall Weighted: 0.7598
  • Val F1 Weighted: 0.7595
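The task, dataset, and label set are not documented, so the snippet below is only a minimal inference sketch: it assumes the checkpoint exposes a standard sequence-classification head, and the input text and label lookup are placeholders.

```python
# Minimal inference sketch (assumption: sequence-classification head on roberta-large;
# the dataset and label mapping for this checkpoint are not documented).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "kghanlon/left_as_train_context_roberta-large_20e"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example input sentence.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
print(pred_id, model.config.id2label.get(pred_id, str(pred_id)))
```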

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
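The original training script is not published; the sketch below simply mirrors the hyperparameters listed above using the Transformers 4.38 TrainingArguments API, with dataset loading, model setup, and metric computation omitted.

```python
# Hedged sketch of TrainingArguments mirroring the listed hyperparameters
# (Transformers 4.38 API); not the author's original training configuration.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="left_as_train_context_roberta-large_20e",
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumption: the results table reports one eval per epoch
)
```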

Training results

| Training Loss | Epoch | Step  | Validation Loss | Val Accuracy | Val Precision Macro | Val Recall Macro | Val F1 Macro | Val Precision Weighted | Val Recall Weighted | Val F1 Weighted |
|---------------|-------|-------|-----------------|--------------|---------------------|------------------|--------------|------------------------|---------------------|-----------------|
| 0.4664        | 1.0   | 3630  | 0.6205          | 0.7544       | 0.7032              | 0.7108           | 0.7050       | 0.7625                 | 0.7544              | 0.7564          |
| 0.3597        | 2.0   | 7260  | 0.7307          | 0.7556       | 0.6982              | 0.7237           | 0.7093       | 0.7639                 | 0.7556              | 0.7587          |
| 0.2864        | 3.0   | 10890 | 0.8032          | 0.7509       | 0.6944              | 0.7157           | 0.7035       | 0.7605                 | 0.7509              | 0.7542          |
| 0.2149        | 4.0   | 14520 | 1.0851          | 0.7581       | 0.7066              | 0.7070           | 0.7061       | 0.7609                 | 0.7581              | 0.7588          |
| 0.182         | 5.0   | 18150 | 1.3747          | 0.7503       | 0.6907              | 0.7128           | 0.7004       | 0.7590                 | 0.7503              | 0.7535          |
| 0.1306        | 6.0   | 21780 | 1.7668          | 0.7444       | 0.7013              | 0.6941           | 0.6936       | 0.7534                 | 0.7444              | 0.7456          |
| 0.1116        | 7.0   | 25410 | 1.7892          | 0.7631       | 0.7199              | 0.6947           | 0.7046       | 0.7617                 | 0.7631              | 0.7612          |
| 0.0915        | 8.0   | 29040 | 2.0678          | 0.7565       | 0.7064              | 0.6918           | 0.6979       | 0.7551                 | 0.7565              | 0.7553          |
| 0.0696        | 9.0   | 32670 | 2.2576          | 0.7554       | 0.7103              | 0.6981           | 0.7019       | 0.7582                 | 0.7554              | 0.7553          |
| 0.0427        | 10.0  | 36300 | 2.2779          | 0.7588       | 0.7117              | 0.6998           | 0.7046       | 0.7589                 | 0.7588              | 0.7582          |
| 0.046         | 11.0  | 39930 | 2.4922          | 0.7580       | 0.7066              | 0.7004           | 0.7030       | 0.7581                 | 0.7580              | 0.7578          |
| 0.0242        | 12.0  | 43560 | 2.6629          | 0.7623       | 0.7150              | 0.7034           | 0.7085       | 0.7612                 | 0.7623              | 0.7615          |
| 0.0251        | 13.0  | 47190 | 2.7028          | 0.7527       | 0.7031              | 0.6977           | 0.6997       | 0.7538                 | 0.7527              | 0.7528          |
| 0.0214        | 14.0  | 50820 | 2.7458          | 0.7572       | 0.7104              | 0.7021           | 0.7046       | 0.7599                 | 0.7572              | 0.7574          |
| 0.0256        | 15.0  | 54450 | 2.7886          | 0.7552       | 0.7045              | 0.7036           | 0.7032       | 0.7582                 | 0.7552              | 0.7560          |
| 0.0134        | 16.0  | 58080 | 2.9100          | 0.7583       | 0.7077              | 0.7005           | 0.7036       | 0.7582                 | 0.7583              | 0.7580          |
| 0.0109        | 17.0  | 61710 | 2.8942          | 0.7599       | 0.7137              | 0.6963           | 0.7038       | 0.7580                 | 0.7599              | 0.7584          |
| 0.0087        | 18.0  | 65340 | 2.9562          | 0.7602       | 0.7146              | 0.7019           | 0.7072       | 0.7599                 | 0.7602              | 0.7595          |
| 0.0019        | 19.0  | 68970 | 3.0273          | 0.7589       | 0.7145              | 0.6999           | 0.7051       | 0.7602                 | 0.7589              | 0.7584          |
| 0.0043        | 20.0  | 72600 | 3.0530          | 0.7598       | 0.7129              | 0.7027           | 0.7066       | 0.7605                 | 0.7598              | 0.7595          |
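The macro- and weighted-averaged columns are consistent with standard scikit-learn averaging; a compute_metrics function along the following lines (an assumption, not the author's code) would produce the metrics reported above.

```python
# Possible compute_metrics implementation producing the reported columns
# (accuracy plus macro- and weighted-averaged precision/recall/F1).
# This is an assumption about how the metrics were computed, not verified code.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    p_mac, r_mac, f1_mac, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0)
    p_w, r_w, f1_w, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0)
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision_macro": p_mac, "recall_macro": r_mac, "f1_macro": f1_mac,
        "precision_weighted": p_w, "recall_weighted": r_w, "f1_weighted": f1_w,
    }
```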

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.1.2
  • Datasets 2.18.0
  • Tokenizers 0.15.2