# bert2bert-model1.5
This model was trained from scratch on the id_liputan6 dataset (Liputan6, an Indonesian news summarization corpus). It achieves the following results on the evaluation set:
- Loss: 3.0718
- R1 Precision: 0.3803
- R1 Recall: 0.2608
- R1 Fmeasure: 0.3076
- R2 Precision: 0.1547
- R2 Recall: 0.1037
- R2 Fmeasure: 0.1233
- Rl Precision: 0.306
- Rl Recall: 0.2101
- Rl Fmeasure: 0.2477
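R1, R2, and Rl above are ROUGE-1, ROUGE-2, and ROUGE-L. To illustrate how the precision, recall, and F-measure columns relate, here is a minimal pure-Python ROUGE-1 sketch (the card's own numbers would typically come from a ROUGE library such as `rouge_score`, which also aggregates over the whole evaluation set):

```python
from collections import Counter

def rouge1(candidate: str, reference: str):
    """Unigram-overlap ROUGE-1: returns (precision, recall, f-measure)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

# Hypothetical Indonesian example sentences, for illustration only
p, r, f = rouge1("banjir melanda jakarta hari ini", "banjir besar melanda jakarta")
```

The F-measure is the harmonic mean of precision and recall, which is why each Fmeasure value above sits between its corresponding Precision and Recall.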
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
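With a linear scheduler and no warmup, the learning rate decays from its 5e-05 peak to zero over the full run of 5 epochs × 495 optimizer steps = 2,475 steps (matching the final step count in the results table). A minimal sketch of that decay, assuming zero warmup steps:

```python
PEAK_LR = 5e-05
TOTAL_STEPS = 5 * 495  # 5 epochs x 495 optimizer steps per epoch = 2475

def linear_lr(step: int, peak: float = PEAK_LR, total: int = TOTAL_STEPS) -> float:
    """Linearly decay the learning rate from `peak` at step 0 to 0 at `total`."""
    return peak * max(0.0, 1.0 - step / total)

# Learning rate at the start, midpoint, and end of training
lrs = [linear_lr(s) for s in (0, TOTAL_STEPS // 2, TOTAL_STEPS)]
```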
### Training results
| Training Loss | Epoch | Step | Validation Loss | R1 Precision | R1 Recall | R1 Fmeasure | R2 Precision | R2 Recall | R2 Fmeasure | Rl Precision | Rl Recall | Rl Fmeasure |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.7387 | 1.0 | 495 | 2.4137 | 0.3909 | 0.2652 | 0.3142 | 0.1624 | 0.1075 | 0.1285 | 0.3191 | 0.2167 | 0.2565 |
| 1.8313 | 2.0 | 990 | 2.4894 | 0.3921 | 0.2673 | 0.3159 | 0.1656 | 0.1101 | 0.1313 | 0.3209 | 0.2189 | 0.2586 |
| 1.2530 | 3.0 | 1485 | 2.6924 | 0.3841 | 0.2636 | 0.3108 | 0.1582 | 0.1060 | 0.1261 | 0.3106 | 0.2135 | 0.2515 |
| 0.8657 | 4.0 | 1980 | 2.9297 | 0.3818 | 0.2617 | 0.3087 | 0.1549 | 0.1037 | 0.1234 | 0.3082 | 0.2116 | 0.2494 |
| 0.6392 | 5.0 | 2475 | 3.0718 | 0.3803 | 0.2608 | 0.3076 | 0.1547 | 0.1037 | 0.1233 | 0.3060 | 0.2101 | 0.2477 |
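Note that validation loss bottoms out after epoch 1 (2.4137) and climbs thereafter while training loss keeps falling, a typical overfitting pattern, although the ROUGE scores stay fairly flat across epochs. A small sketch of selecting the best checkpoint by validation loss from the figures above:

```python
# (epoch, validation_loss) pairs taken from the training results
val_losses = [(1, 2.4137), (2, 2.4894), (3, 2.6924), (4, 2.9297), (5, 3.0718)]

# Pick the epoch whose checkpoint has the lowest validation loss
best_epoch, best_loss = min(val_losses, key=lambda pair: pair[1])
```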
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2