---
license: apache-2.0
base_model: bert-base-multilingual-uncased
tags:
- generated_from_trainer
metrics:
- recall
- accuracy
model-index:
- name: multibert1110_lrate2.5b16
  results: []
---
# multibert1110_lrate2.5b16

This model is a fine-tuned version of [bert-base-multilingual-uncased](https://huggingface.co/bert-base-multilingual-uncased) on an unknown dataset.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.5408
- Precision: 0.8751
- Recall: 0.8102
- F-measure: 0.8365
- Accuracy: 0.9131
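
The task is not documented, but the entity-level precision/recall/F-measure metrics suggest token classification (an NER-style task). Below is a minimal inference sketch under that assumption; the bare model id is a placeholder and needs the full Hub namespace:

```python
# A minimal usage sketch, assuming a token-classification (NER-style) head.
# "multibert1110_lrate2.5b16" is a placeholder; prepend the Hub namespace.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="multibert1110_lrate2.5b16",
    aggregation_strategy="simple",  # merge word pieces into word-level entities
)

print(ner("Barack Obama visited Paris in 2015."))
```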
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 14
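
These settings map one-to-one onto the Hugging Face Trainer API. A minimal sketch, assuming the standard `TrainingArguments` from Transformers 4.34; `output_dir` and `evaluation_strategy` are assumptions, the rest mirrors the list above:

```python
# Sketch of TrainingArguments reproducing the hyperparameters above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="multibert1110_lrate2.5b16",  # assumed output directory
    learning_rate=2.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=14,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed: the results table logs once per epoch
)
```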
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F-measure | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:---------:|:--------:|
| 0.6226        | 1.0   | 236  | 0.3957          | 0.8372    | 0.6871 | 0.6960    | 0.8714   |
| 0.3373        | 2.0   | 472  | 0.3830          | 0.8460    | 0.7204 | 0.7485    | 0.8810   |
| 0.2071        | 3.0   | 708  | 0.3464          | 0.8572    | 0.7790 | 0.7966    | 0.8985   |
| 0.1384        | 4.0   | 944  | 0.4046          | 0.8653    | 0.7863 | 0.8128    | 0.9041   |
| 0.0935        | 5.0   | 1180 | 0.4299          | 0.8559    | 0.7976 | 0.8209    | 0.9044   |
| 0.0708        | 6.0   | 1416 | 0.4899          | 0.8709    | 0.7972 | 0.8269    | 0.9096   |
| 0.0504        | 7.0   | 1652 | 0.4837          | 0.8578    | 0.8030 | 0.8254    | 0.9039   |
| 0.0361        | 8.0   | 1888 | 0.5098          | 0.8448    | 0.7970 | 0.8173    | 0.9056   |
| 0.0259        | 9.0   | 2124 | 0.5260          | 0.8622    | 0.7992 | 0.8241    | 0.9090   |
| 0.0214        | 10.0  | 2360 | 0.5394          | 0.8676    | 0.8051 | 0.8316    | 0.9107   |
| 0.0149        | 11.0  | 2596 | 0.5408          | 0.8751    | 0.8102 | 0.8365    | 0.9131   |
| 0.0095        | 12.0  | 2832 | 0.5725          | 0.8709    | 0.8056 | 0.8321    | 0.9115   |
| 0.0092        | 13.0  | 3068 | 0.5650          | 0.8658    | 0.8099 | 0.8326    | 0.9119   |
| 0.0073        | 14.0  | 3304 | 0.5734          | 0.8637    | 0.8101 | 0.8317    | 0.9122   |
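
The headline metrics at the top of this card match the epoch-11 row, which posts the best F-measure and accuracy even though validation loss keeps climbing after epoch 3, a sign of mild overfitting in the later epochs. The column set (entity-level precision/recall/F-measure plus accuracy) looks like seqeval output; below is a hedged sketch of a `compute_metrics` function that would produce such numbers. The label list and the metric key names are assumptions, not taken from this card.

```python
# Hedged sketch: entity-level metrics via evaluate's seqeval wrapper.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # placeholder label set; the real one is unknown

def compute_metrics(eval_pred):
    """Convert Trainer predictions into entity-level seqeval metrics."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # The Trainer marks padding/special tokens with -100; drop them.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f-measure": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```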
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1