---
library_name: transformers
license: apache-2.0
base_model: yhavinga/ul2-base-dutch
tags:
  - generated_from_trainer
model-index:
  - name: ul2-base-dutch-finetuned-oba-book-search
    results: []
---

# ul2-base-dutch-finetuned-oba-book-search

This model is a fine-tuned version of [yhavinga/ul2-base-dutch](https://huggingface.co/yhavinga/ul2-base-dutch) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 3.5040
- Top-5-accuracy: 0.0597

## Model description

More information needed

## Intended uses & limitations

More information needed
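
No inference example is published for this checkpoint. As a minimal sketch, the model should load through the standard `transformers` seq2seq API; the hub repo id and the prompt below are assumptions, not taken from this card:

```python
# Minimal inference sketch -- the repo id below is hypothetical, not confirmed by this card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "esahit/ul2-base-dutch-finetuned-oba-book-search"  # assumed hub path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# UL2 checkpoints are commonly prompted with a mode token such as "[NLU]";
# whether this fine-tune expects one is not documented here.
inputs = tokenizer("[NLU] zoekopdracht: een boek over de Tweede Wereldoorlog", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```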

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.3
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3
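
For reference, here is a sketch of how the listed settings map onto `Seq2SeqTrainingArguments`. The argument class and the output directory are assumptions; only the values in the list above come from this card, and the Adam values shown are the `Trainer` defaults:

```python
# Sketch of training arguments matching the hyperparameter list above.
# Only the listed values come from this card; the rest is assumed.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="ul2-base-dutch-finetuned-oba-book-search",  # assumed
    learning_rate=0.3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=3,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999), the Trainer default
    adam_beta2=0.999,
    adam_epsilon=1e-08,  # epsilon=1e-08, also the default
)
```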

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Top-5-accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------------:|
| 6.7559        | 0.0848 | 500   | 7.0741          | 0.0            |
| 7.3594        | 0.1696 | 1000  | 7.0888          | 0.0            |
| 7.4457        | 0.2544 | 1500  | 6.8574          | 0.0            |
| 7.6522        | 0.3392 | 2000  | 7.2824          | 0.0            |
| 7.4598        | 0.4239 | 2500  | 7.1592          | 0.0            |
| 7.4733        | 0.5087 | 3000  | 6.8309          | 0.0            |
| 7.1533        | 0.5935 | 3500  | 6.3314          | 0.0            |
| 7.1903        | 0.6783 | 4000  | 6.6715          | 0.0            |
| 12.2465       | 0.7631 | 4500  | 7.5477          | 0.0            |
| 7.0061        | 0.8479 | 5000  | 6.7576          | 0.0            |
| 6.7448        | 0.9327 | 5500  | 6.2698          | 0.0            |
| 6.4934        | 1.0175 | 6000  | 6.0520          | 0.0            |
| 6.7022        | 1.1023 | 6500  | 6.4743          | 0.0            |
| 6.6138        | 1.1870 | 7000  | 6.6552          | 0.0            |
| 6.1879        | 1.2718 | 7500  | 5.8394          | 0.0            |
| 6.3701        | 1.3566 | 8000  | 6.2708          | 0.0            |
| 6.0675        | 1.4414 | 8500  | 5.8804          | 0.0            |
| 5.9228        | 1.5262 | 9000  | 5.4786          | 0.0796         |
| 5.8256        | 1.6110 | 9500  | 5.8534          | 0.0            |
| 5.529         | 1.6958 | 10000 | 5.4673          | 0.0796         |
| 5.3783        | 1.7806 | 10500 | 5.1146          | 0.0            |
| 5.3029        | 1.8654 | 11000 | 5.1393          | 0.0            |
| 5.0497        | 1.9501 | 11500 | 4.8904          | 0.0            |
| 4.9395        | 2.0349 | 12000 | 4.7346          | 0.0            |
| 4.6926        | 2.1197 | 12500 | 4.6029          | 0.0            |
| 4.5387        | 2.2045 | 13000 | 4.3546          | 0.1393         |
| 4.3876        | 2.2893 | 13500 | 4.2308          | 0.0597         |
| 4.2131        | 2.3741 | 14000 | 4.1112          | 0.1990         |
| 4.0999        | 2.4589 | 14500 | 3.9334          | 0.0995         |
| 3.9525        | 2.5437 | 15000 | 3.8421          | 0.0            |
| 3.8629        | 2.6285 | 15500 | 3.7120          | 0.1592         |
| 3.7975        | 2.7132 | 16000 | 3.5973          | 0.0796         |
| 3.7205        | 2.7980 | 16500 | 3.5398          | 0.0796         |
| 3.6382        | 2.8828 | 17000 | 3.5131          | 0.2786         |
| 3.5967        | 2.9676 | 17500 | 3.5040          | 0.0597         |
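
The card does not document how Top-5-accuracy is computed. One plausible reading, sketched below with a hypothetical `compute_metrics` hook, is that a position counts as correct when the reference token appears among the model's five highest-scoring candidates; the logit and label shapes are assumptions, not taken from the actual training script:

```python
# Hypothetical top-5 accuracy for Trainer's compute_metrics hook.
# Assumes predictions are per-token logits of shape (batch, seq_len, vocab)
# and labels use -100 for padding, as Trainer's default data collators do.
import numpy as np

def compute_top5_accuracy(eval_pred):
    logits, labels = eval_pred  # EvalPrediction unpacks to (predictions, label_ids)
    top5 = np.argsort(logits, axis=-1)[..., -5:]     # five highest-scoring token ids
    mask = labels != -100                            # ignore padded label positions
    hits = (top5 == labels[..., None]).any(axis=-1)  # reference token among the top five?
    return {"top-5-accuracy": float(hits[mask].mean())}
```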

### Framework versions

- Transformers 4.44.2
- Pytorch 1.13.0+cu116
- Datasets 3.0.0
- Tokenizers 0.19.1