
This model is a fine-tuned version of pdelobelle/robbert-v2-dutch-base on the GroNLP/dutch-cola dataset. Only the acceptable and unacceptable examples were used for fine-tuning, i.e. the training data contains only sentences whose original annotation is 'None' or '*'.

It achieves an accuracy of 0.8055 on the evaluation set.
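
A minimal usage sketch with the transformers text-classification pipeline is shown below. It assumes the model is published as HylkeBr/robbert_dutch-cola; the example sentence is illustrative, and the label names returned depend on the id2label mapping in the model's config.

```python
from transformers import pipeline

# Load the fine-tuned acceptability classifier from the Hub.
classifier = pipeline("text-classification", model="HylkeBr/robbert_dutch-cola")

# Classify a Dutch sentence as acceptable or unacceptable.
print(classifier("Dit is een goede zin."))
```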

The following hyperparameters were used during training (a fine-tuning sketch using these values follows the list):

- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- num_epochs: 4
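
The sketch below reproduces this setup with the Trainer API. It is not the exact training script: the column names ("Sentence", "Acceptability", "Original annotation"), the split names, and the encoding of the 'None' mark are assumptions about the GroNLP/dutch-cola schema and may need adjusting.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("GroNLP/dutch-cola")

# Keep only sentences originally marked 'None' (acceptable) or '*' (unacceptable);
# the column name and the encoding of the 'None' mark are assumptions.
dataset = dataset.filter(
    lambda ex: ex["Original annotation"] in (None, "None", "*")
)
# Assumed label column holding the binary acceptability judgements.
dataset = dataset.rename_column("Acceptability", "labels")

tokenizer = AutoTokenizer.from_pretrained("pdelobelle/robbert-v2-dutch-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "pdelobelle/robbert-v2-dutch-base", num_labels=2
)

def tokenize(batch):
    return tokenizer(batch["Sentence"], truncation=True)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="robbert_dutch-cola",
    learning_rate=4e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=4,
    seed=42,
    evaluation_strategy="epoch",  # renamed to eval_strategy in recent transformers versions
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```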

| Epoch | Training Loss | Step | Validation Loss | Accuracy |
|:-----:|:-------------:|:----:|:---------------:|:--------:|
| 1.0   | 0.5078        | 1169 | 0.5111          | 0.7732   |
| 2.0   | 0.3076        | 2338 | 0.6988          | 0.7700   |
| 3.0   | 0.1901        | 3507 | 0.8459          | 0.8027   |
| 4.0   | 0.1198        | 4676 | 1.0620          | 0.8055   |