
Pretrained checkpoint: roberta-large-mnli
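
Below is a minimal sketch, assuming the standard `transformers` API, of how the `roberta-large-mnli` checkpoint can be loaded as the starting point for fine-tuning; `num_labels=2` is a placeholder and should be set to the actual label count of the downstream task.

```python
# Minimal sketch (not the original training script): load the pretrained
# checkpoint and replace the 3-way MNLI classification head with a fresh one.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("roberta-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-large-mnli",
    num_labels=2,                   # placeholder: set to the task's label count
    ignore_mismatched_sizes=True,   # the MNLI head has 3 labels, so sizes differ
)
```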

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 24
  • eval_batch_size: 24
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3
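
A sketch of how the hyperparameters above might map onto `transformers` `TrainingArguments` and `Trainer`; the output path and datasets are placeholders, not details from the original run.

```python
from transformers import TrainingArguments, Trainer

# Hypothetical reconstruction of the configuration listed above.
training_args = TrainingArguments(
    output_dir="./results",              # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(
    model=model,                         # model loaded from roberta-large-mnli above
    args=training_args,
    train_dataset=train_dataset,         # placeholder: task-specific training set
    eval_dataset=eval_dataset,           # placeholder: held-out evaluation set
)
trainer.train()
```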

Training results

| Epoch | Train loss | Test loss | Subtask 3 F1 | Subtask 3 precision | Subtask 3 recall | Subtask 4 accuracy |
|-------|------------|-----------|--------------|---------------------|------------------|--------------------|
| 1 | 340.1608857823303 | 68.94318291614763 | 0.8756704046806436 | 0.8752436647173489 | 0.8760975609756098 | 0.8458536585365853 |
| 2 | 148.33983786634053 | 36.02450433204649 | 0.9217221135029354 | 0.9244357212953876 | 0.9190243902439025 | 0.8741463414634146 |
| 3 | 60.1067302722804 | 29.687325364822755 | 0.9230769230769231 | 0.9393939393939394 | 0.9073170731707317 | 0.8848780487804878 |
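
The card does not state how these metrics were computed; a common approach, shown here purely as an assumption, is to apply `scikit-learn`'s metric functions to the per-example predictions.

```python
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical evaluation helper: y_true and y_pred are lists of gold and
# predicted labels for the relevant subtask.
def score(y_true, y_pred):
    return {
        "f1": f1_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "accuracy": accuracy_score(y_true, y_pred),
    }
```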