# CodeLlama-7b-Instruct-hf_Fi__components_size_252_epochs_10_2024-06-21_09-35-27_3556547
This model is a fine-tuned version of codellama/CodeLlama-7b-Instruct-hf on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.9096
- Accuracy: 0.462
- Chrf: 0.297
- Bleu: 0.225
- Sacrebleu: 0.2
- Rouge1: 0.472
- Rouge2: 0.3
- Rougel: 0.459
- Rougelsum: 0.471
- Meteor: 0.505
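
For reference, a minimal sketch of loading this checkpoint for inference with `transformers`. The repo id is taken from the model name above; the prompt is a placeholder, since the actual task and dataset are not documented in this card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id assumed from the model name above.
model_id = "vdavidr/CodeLlama-7b-Instruct-hf_Fi__components_size_252_epochs_10_2024-06-21_09-35-27_3556547"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Placeholder prompt in the CodeLlama-Instruct [INST] format.
prompt = "[INST] Write a function that reverses a string. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```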
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 3407
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 4
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 252
- training_steps: 2520
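
As a rough sketch, these settings correspond to a `TrainingArguments` configuration along the following lines. The output directory and any arguments not listed above are assumptions; the actual training script is not included in this card:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="outputs",           # assumption; not documented above
    learning_rate=1e-3,
    per_device_train_batch_size=1,  # train_batch_size: 1
    per_device_eval_batch_size=1,   # eval_batch_size: 1
    seed=3407,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-6,
    lr_scheduler_type="linear",
    warmup_steps=252,
    max_steps=2520,                 # training_steps: 2520
)
# With 4 GPUs (distributed_type: multi-GPU, num_devices: 4), the effective
# train/eval batch size is 4, matching total_train_batch_size above.
```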
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Chrf | Bleu | Sacrebleu | Rouge1 | Rouge2 | Rougel | Rougelsum | Meteor |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.063 | 4.0 | 252 | 3.6864 | 0.457 | 0.044 | 0.0 | 0.0 | 0.044 | 0.0 | 0.03 | 0.03 | 0.138 |
| 0.0742 | 8.0 | 504 | 2.7260 | 0.474 | 0.104 | 0.036 | 0.0 | 0.148 | 0.009 | 0.126 | 0.143 | 0.24 |
| 0.0774 | 12.0 | 756 | 2.6054 | 0.461 | 0.159 | 0.099 | 0.1 | 0.315 | 0.149 | 0.306 | 0.308 | 0.325 |
| 0.7995 | 16.0 | 1008 | 2.4395 | 0.465 | 0.215 | 0.119 | 0.1 | 0.393 | 0.178 | 0.365 | 0.379 | 0.359 |
| 0.1761 | 20.0 | 1260 | 2.4190 | 0.482 | 0.249 | 0.164 | 0.2 | 0.356 | 0.194 | 0.34 | 0.355 | 0.39 |
| 0.4002 | 24.0 | 1512 | 2.1404 | 0.462 | 0.251 | 0.188 | 0.2 | 0.418 | 0.269 | 0.4 | 0.409 | 0.437 |
| 0.0254 | 28.0 | 1764 | 2.0202 | 0.46 | 0.295 | 0.192 | 0.2 | 0.484 | 0.308 | 0.461 | 0.478 | 0.463 |
| 0.1469 | 32.0 | 2016 | 1.9957 | 0.462 | 0.289 | 0.225 | 0.2 | 0.448 | 0.291 | 0.44 | 0.443 | 0.482 |
| 0.0346 | 36.0 | 2268 | 1.9562 | 0.46 | 0.293 | 0.2 | 0.2 | 0.474 | 0.278 | 0.452 | 0.471 | 0.491 |
| 0.0378 | 40.0 | 2520 | 1.9096 | 0.462 | 0.297 | 0.225 | 0.2 | 0.472 | 0.3 | 0.459 | 0.471 | 0.505 |
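
The metric names above follow the conventions of the `evaluate` library; a hedged sketch of computing the same metrics for a batch of predictions (the prediction and reference strings here are placeholders, not data from the actual evaluation set):

```python
import evaluate

# Placeholder data; the real evaluation set is not documented in this card.
predictions = ["def add(a, b):\n    return a + b"]
references = ["def add(a, b):\n    return a + b"]

bleu = evaluate.load("bleu").compute(predictions=predictions, references=[[r] for r in references])
chrf = evaluate.load("chrf").compute(predictions=predictions, references=[[r] for r in references])
sacrebleu = evaluate.load("sacrebleu").compute(predictions=predictions, references=[[r] for r in references])
rouge = evaluate.load("rouge").compute(predictions=predictions, references=references)
meteor = evaluate.load("meteor").compute(predictions=predictions, references=references)

print(bleu["bleu"], chrf["score"], sacrebleu["score"], rouge["rouge1"], meteor["meteor"])
```

Note that `chrf` and `sacrebleu` report scores on a 0-100 scale, so they may need rescaling to match the 0-1 values shown in the table.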
### Framework versions
- Transformers 4.37.0
- Pytorch 2.2.1+cu121
- Datasets 2.20.0
- Tokenizers 0.15.2