# bloom-560m-finetuned-unnatural-instructions
This model is a fine-tuned version of [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m). The card's dataset metadata is missing (auto-filled as "None"), but the model name indicates fine-tuning on the Unnatural Instructions dataset. It achieves the following results on the evaluation set:
- Loss: 2.7021
## Model description
More information needed
## Intended uses & limitations
More information needed
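The card gives no official usage guidance, but since the base model is a causal language model, a minimal inference sketch would look like the following. This assumes the standard `transformers` causal-LM API; the repository id and the prompt format are placeholders, not confirmed by the card.

```python
# Minimal inference sketch. Assumptions: standard transformers causal-LM API;
# the hub id below is a placeholder -- substitute the actual repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bloom-560m-finetuned-unnatural-instructions"  # placeholder hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Hypothetical instruction-style prompt; the actual training format is not documented.
prompt = "Instruction: Translate the following sentence to French.\nInput: Hello, world.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```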
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
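
For reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below. This is not the author's actual training script, and `output_dir` is a placeholder; the Adam betas and epsilon listed above are the library defaults.

```python
# Sketch of TrainingArguments matching the listed hyperparameters
# (transformers 4.25.1). Not the author's actual script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bloom-560m-finetuned-unnatural-instructions",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,  # native AMP mixed-precision training
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 -- the library defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```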
### Training results
| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.6937        | 0.32  | 1000  | 1.6739          |
| 1.5527        | 0.63  | 2000  | 1.5767          |
| 1.5305        | 0.95  | 3000  | 1.5221          |
| 1.1514        | 1.26  | 4000  | 1.5201          |
| 1.1564        | 1.58  | 5000  | 1.5042          |
| 1.1365        | 1.89  | 6000  | 1.4799          |
| 0.7729        | 2.21  | 7000  | 1.6496          |
| 0.7713        | 2.52  | 8000  | 1.5909          |
| 0.8063        | 2.84  | 9000  | 1.6073          |
| 0.4753        | 3.15  | 10000 | 1.9611          |
| 0.4719        | 3.47  | 11000 | 2.0177          |
| 0.4732        | 3.79  | 12000 | 2.0341          |
| 0.2747        | 4.1   | 13000 | 2.5669          |
| 0.2582        | 4.42  | 14000 | 2.6801          |
| 0.2751        | 4.73  | 15000 | 2.6907          |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2