---
library_name: transformers
license: mit
base_model: FacebookAI/roberta-large
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: cohere_generated_abstracts_roberta
    results: []
---

# cohere_generated_abstracts_roberta

This model is a fine-tuned version of [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large); the training dataset is not identified in the card. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.0000
  • Accuracy: 1.0
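
The card does not include usage instructions, so the following is a minimal inference sketch, not the author's documented workflow. It assumes the checkpoint is published under the repo path shown on this page (`leonzhou286/raid_roberta`) and carries a standard sequence-classification head; the label meanings are whatever `config.id2label` holds, which this card does not document.

```python
# Minimal inference sketch. Assumptions (not stated in the card): the repo
# id is leonzhou286/raid_roberta and the head is a sequence classifier;
# inspect model.config.id2label before trusting the label names.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "leonzhou286/raid_roberta"  # repo path shown on the model page
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "An example abstract to classify."
inputs = tokenizer(text, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(pred, model.config.id2label.get(pred, str(pred)))
```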

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3
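
For reference, here is a hedged sketch of how these hyperparameters map onto `transformers.TrainingArguments`. The dataset loading is left as a placeholder because the card does not name the training data, and `num_labels=2` is an assumption (the card does not state the label set).

```python
# Sketch of reproducing the listed hyperparameters with the Trainer API.
# The datasets below are placeholders; the card does not name the real data.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "FacebookAI/roberta-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=2  # assumption: binary classification
)

args = TrainingArguments(
    output_dir="cohere_generated_abstracts_roberta",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    # The listed Adam betas/epsilon match the TrainingArguments defaults;
    # they are spelled out here only for clarity.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```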

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.0078        | 0.0838 | 100  | 0.0029          | 0.9996   |
| 0.0036        | 0.1676 | 200  | 0.0053          | 0.9992   |
| 0.0064        | 0.2515 | 300  | 0.0012          | 0.9999   |
| 0.002         | 0.3353 | 400  | 0.0028          | 0.9996   |
| 0.0019        | 0.4191 | 500  | 0.0009          | 0.9999   |
| 0.0014        | 0.5029 | 600  | 0.0026          | 0.9998   |
| 0.0003        | 0.5868 | 700  | 0.0012          | 0.9999   |
| 0.0017        | 0.6706 | 800  | 0.0000          | 1.0      |
| 0.0015        | 0.7544 | 900  | 0.0000          | 1.0      |
| 0.0019        | 0.8382 | 1000 | 0.0007          | 0.9999   |
| 0.0033        | 0.9220 | 1100 | 0.0048          | 0.9994   |
| 0.0013        | 1.0059 | 1200 | 0.0001          | 1.0      |
| 0.0032        | 1.0897 | 1300 | 0.0015          | 0.9998   |
| 0.0013        | 1.1735 | 1400 | 0.0000          | 1.0      |
| 0.0           | 1.2573 | 1500 | 0.0000          | 1.0      |
| 0.0           | 1.3412 | 1600 | 0.0000          | 1.0      |
| 0.0           | 1.4250 | 1700 | 0.0000          | 1.0      |
| 0.0003        | 1.5088 | 1800 | 0.0023          | 0.9996   |
| 0.0005        | 1.5926 | 1900 | 0.0000          | 1.0      |
| 0.0           | 1.6764 | 2000 | 0.0000          | 1.0      |
| 0.0           | 1.7603 | 2100 | 0.0000          | 1.0      |
| 0.0           | 1.8441 | 2200 | 0.0000          | 1.0      |
| 0.0           | 1.9279 | 2300 | 0.0000          | 1.0      |
| 0.0           | 2.0117 | 2400 | 0.0000          | 1.0      |
| 0.0           | 2.0956 | 2500 | 0.0000          | 1.0      |
| 0.0           | 2.1794 | 2600 | 0.0000          | 1.0      |
| 0.0           | 2.2632 | 2700 | 0.0000          | 1.0      |
| 0.0           | 2.3470 | 2800 | 0.0000          | 1.0      |
| 0.0           | 2.4308 | 2900 | 0.0000          | 1.0      |
| 0.0           | 2.5147 | 3000 | 0.0000          | 1.0      |
| 0.0           | 2.5985 | 3100 | 0.0000          | 1.0      |
| 0.0           | 2.6823 | 3200 | 0.0000          | 1.0      |
| 0.0           | 2.7661 | 3300 | 0.0000          | 1.0      |
| 0.0           | 2.8500 | 3400 | 0.0000          | 1.0      |
| 0.0           | 2.9338 | 3500 | 0.0000          | 1.0      |

### Framework versions

  • Transformers 4.45.2
  • PyTorch 2.5.0+cu124
  • Datasets 3.0.1
  • Tokenizers 0.20.1