---
base_model: mika5883/inverse_gec
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: inverse_gec_finetuned
    results: []
---

inverse_gec_finetuned

This model is a fine-tuned version of mika5883/inverse_gec on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics list):

  • Loss: 0.2517
  • Bleu: 59.3585
  • Gen Len: 16.2412
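
Since the card reports BLEU and generation length, the checkpoint is presumably a sequence-to-sequence model. The following is a minimal inference sketch under that assumption, loading it with AutoModelForSeq2SeqLM; the repository id shown is a placeholder for this model's actual Hub id:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repository id; replace with this model's actual Hub id.
model_id = "mika5883/inverse_gec_finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Rewrite an input sentence with the fine-tuned model.
inputs = tokenizer("This is an example sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```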

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent configuration is sketched after the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
  • mixed_precision_training: Native AMP
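
These settings map onto a standard Seq2SeqTrainingArguments configuration. The sketch below reconstructs it; output_dir, predict_with_generate, and the per-epoch evaluation strategy are assumptions (the last inferred from the results table), while the remaining values mirror the list above:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="inverse_gec_finetuned",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,                     # Native AMP mixed-precision training
    predict_with_generate=True,    # assumed, needed for BLEU / Gen Len metrics
    evaluation_strategy="epoch",   # assumed from the once-per-epoch results table
)
```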

Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 40   | 0.3337          | 59.0634 | 16.2428 |
| No log        | 2.0   | 80   | 0.2966          | 59.1606 | 16.2464 |
| No log        | 3.0   | 120  | 0.2708          | 59.229  | 16.2444 |
| No log        | 4.0   | 160  | 0.2561          | 59.3149 | 16.2408 |
| No log        | 5.0   | 200  | 0.2517          | 59.3585 | 16.2412 |

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.1.2
  • Datasets 2.18.0
  • Tokenizers 0.15.2
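
A local environment can be checked against these versions with a small sanity-check sketch; the expected version strings are taken from the list above:

```python
import transformers
import torch
import datasets
import tokenizers

# Versions reported on this card; warn if the local environment differs.
expected = {
    "transformers": ("4.39.3", transformers.__version__),
    "torch": ("2.1.2", torch.__version__),
    "datasets": ("2.18.0", datasets.__version__),
    "tokenizers": ("0.15.2", tokenizers.__version__),
}
for name, (wanted, installed) in expected.items():
    if not installed.startswith(wanted):
        print(f"Warning: {name} is {installed}, card was produced with {wanted}")
```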