# distilbert-base-multilingual-cased_regression_finetuned_mobile01_all
This model is a fine-tuned version of distilbert/distilbert-base-multilingual-cased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8877
- MSE: 0.8877
- MAE: 0.5973
- RMSE: 0.9422
- MAPE: inf
- R Squared: 0.5145
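
These are standard regression metrics. As a minimal sketch (not the original training script), they could be produced by a `compute_metrics` callback like the one below; the function name and shapes are assumptions. Note that MAPE divides by the reference value, so it evaluates to `inf` whenever any label is exactly zero, which is consistent with the value reported here.

```python
import numpy as np

def compute_metrics(eval_pred):
    """Hypothetical callback reproducing the regression metrics reported above."""
    predictions, labels = eval_pred
    predictions = np.asarray(predictions).squeeze(-1)  # single-output regression head
    labels = np.asarray(labels)
    errors = predictions - labels
    mse = np.mean(errors ** 2)
    mae = np.mean(np.abs(errors))
    rmse = np.sqrt(mse)
    # MAPE is undefined (inf) when a label is 0; suppress the divide warning.
    with np.errstate(divide="ignore", invalid="ignore"):
        mape = np.mean(np.abs(errors / labels))
    ss_res = np.sum(errors ** 2)
    ss_tot = np.sum((labels - labels.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    return {"mse": mse, "mae": mae, "rmse": rmse, "mape": mape, "r_squared": r_squared}
```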
## Model description
More information needed
## Intended uses & limitations
More information needed
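
No usage details are documented, but given the single-output regression head implied by the model name and metrics, inference would presumably follow the standard sequence-classification pattern. The sketch below is illustrative only: the model ID is a placeholder inferred from the card title (the real Hub path will include a namespace), and the input text is an arbitrary example.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder model ID taken from the card title; replace with the actual Hub path.
model_id = "distilbert-base-multilingual-cased_regression_finetuned_mobile01_all"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)  # regression head (num_labels=1) assumed

inputs = tokenizer("Example review text", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1).item()  # single scalar prediction
print(score)
```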
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 778
- num_epochs: 10
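
These settings map roughly onto the following `TrainingArguments` configuration (a hedged sketch, not the original script; argument names follow the Transformers 4.39 API, and anything not listed above, such as `output_dir`, is a placeholder).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-multilingual-cased_regression_finetuned_mobile01_all",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,                  # Adam with betas=(0.9, 0.999) and epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=778,
    num_train_epochs=10,
)
```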
### Training results
| Training Loss | Epoch | Step  | Validation Loss | MSE    | MAE    | RMSE   | MAPE | R Squared |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:----:|:---------:|
| 1.0906        | 1.0   | 7789  | 1.0449          | 1.0449 | 0.7032 | 1.0222 | inf  | 0.4285    |
| 0.9156        | 2.0   | 15578 | 0.9369          | 0.9369 | 0.6340 | 0.9679 | inf  | 0.4875    |
| 0.6858        | 3.0   | 23367 | 0.9153          | 0.9153 | 0.6189 | 0.9567 | inf  | 0.4993    |
| 1.0272        | 4.0   | 31156 | 0.8877          | 0.8877 | 0.5973 | 0.9422 | inf  | 0.5145    |
| 0.7273        | 5.0   | 38945 | 0.8928          | 0.8928 | 0.6004 | 0.9449 | inf  | 0.5117    |
| 0.8211        | 6.0   | 46734 | 0.8880          | 0.8880 | 0.5936 | 0.9423 | inf  | 0.5143    |
### Framework versions
- Transformers 4.39.3
- Pytorch 2.2.1
- Datasets 2.18.0
- Tokenizers 0.15.2