---
license: mit
base_model: roberta-large
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - recall
  - f1
model-index:
  - name: lora-roberta-large-0927
    results: []
---

# lora-roberta-large-0927

This model is a LoRA fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unknown dataset. It achieves the following results on the evaluation set (note that the per-class metrics indicate it predicts the neutral class for every example; a loading sketch follows the list):

- Loss: 1.5356
- Accuracy: 0.4472
- Prec: 0.2000
- Recall: 0.4472
- F1: 0.2763
- B Acc (balanced accuracy): 0.1429
- Micro F1: 0.4472
- Prec Joy: 0.0
- Recall Joy: 0.0
- F1 Joy: 0.0
- Prec Anger: 0.0
- Recall Anger: 0.0
- F1 Anger: 0.0
- Prec Disgust: 0.0
- Recall Disgust: 0.0
- F1 Disgust: 0.0
- Prec Fear: 0.0
- Recall Fear: 0.0
- F1 Fear: 0.0
- Prec Neutral: 0.4472
- Recall Neutral: 1.0
- F1 Neutral: 0.6180
- Prec Sadness: 0.0
- Recall Sadness: 0.0
- F1 Sadness: 0.0
- Prec Surprise: 0.0
- Recall Surprise: 0.0
- F1 Surprise: 0.0
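
Since the repository name indicates a LoRA adapter, a minimal loading sketch with `peft` follows. This is an assumption-laden example, not the author's code: the repo id `anniew666/lora-roberta-large-0927`, the seven-way label set, and the label order are all inferred from the per-class metrics above.

```python
# Minimal inference sketch, assuming this repo hosts a PEFT/LoRA adapter
# for a 7-way emotion classifier on top of roberta-large.
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

base = AutoModelForSequenceClassification.from_pretrained("roberta-large", num_labels=7)
model = PeftModel.from_pretrained(base, "anniew666/lora-roberta-large-0927")  # assumed repo id
model.eval()

tokenizer = AutoTokenizer.from_pretrained("roberta-large")
labels = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]  # assumed order

inputs = tokenizer("I did not expect that at all!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(labels[logits.argmax(dim=-1).item()])  # given the metrics above, expect "neutral"
```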

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 25.0
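
For reference, the list above maps onto `transformers` `TrainingArguments` roughly as follows. This is a sketch of the reported configuration only: the output directory and all unlisted settings are illustrative, and a single device is assumed so that 32 × 4 accumulation steps gives the effective batch size of 128.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; everything else is left at defaults.
training_args = TrainingArguments(
    output_dir="lora-roberta-large-0927",  # illustrative
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective train batch on one device
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=25.0,
)
```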

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Prec | Recall | F1 | B Acc | Micro F1 | Prec Joy | Recall Joy | F1 Joy | Prec Anger | Recall Anger | F1 Anger | Prec Disgust | Recall Disgust | F1 Disgust | Prec Fear | Recall Fear | F1 Fear | Prec Neutral | Recall Neutral | F1 Neutral | Prec Sadness | Recall Sadness | F1 Sadness | Prec Surprise | Recall Surprise | F1 Surprise |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.8381 | 1.25 | 2092 | 1.5415 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4866 | 2.5 | 4184 | 1.5564 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4862 | 3.75 | 6276 | 1.5700 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4762 | 5.0 | 8368 | 1.5391 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4765 | 6.25 | 10460 | 1.5566 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4848 | 7.5 | 12552 | 1.5411 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4782 | 8.75 | 14644 | 1.5548 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4943 | 10.0 | 16736 | 1.6115 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4801 | 11.25 | 18828 | 1.5424 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4946 | 12.5 | 20920 | 1.5637 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4867 | 13.75 | 23012 | 1.5492 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4957 | 15.01 | 25104 | 1.5812 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4913 | 16.26 | 27196 | 1.5425 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5007 | 17.51 | 29288 | 1.5446 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4919 | 18.76 | 31380 | 1.5616 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4895 | 20.01 | 33472 | 1.5502 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4946 | 21.26 | 35564 | 1.5398 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4754 | 22.51 | 37656 | 1.5307 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4824 | 23.76 | 39748 | 1.5356 | 0.4472 | 0.2000 | 0.4472 | 0.2763 | 0.1429 | 0.4472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4472 | 1.0 | 0.6180 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
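
Every evaluation row above is identical apart from the losses, which is consistent with a classifier that collapsed to the majority class. As a sanity check, the headline numbers can be reproduced with scikit-learn; the class counts below are invented for illustration, chosen only so that neutral makes up 44.72% of the labels:

```python
import numpy as np
from sklearn.metrics import balanced_accuracy_score, f1_score

# Hypothetical eval set of 10,000 examples; only the neutral share (0.4472)
# comes from the card, the other class counts are made up.
y_true = np.array([4] * 4472 + [0] * 921 + [1] * 921 + [2] * 921 +
                  [3] * 921 + [5] * 922 + [6] * 922)  # 4 = neutral (assumed id)
y_pred = np.full_like(y_true, 4)  # model always predicts neutral

print(balanced_accuracy_score(y_true, y_pred))    # 1/7 ≈ 0.1429 -> "B Acc"
print(f1_score(y_true, y_pred, average="micro"))  # 0.4472 -> "Micro F1"
```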

### Framework versions

- Transformers 4.33.1
- PyTorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3