---
license: apache-2.0
library_name: peft
tags:
- unsloth
- generated_from_trainer
base_model: unsloth/llama-2-13b-bnb-4bit
model-index:
- name: llama_2_13b_Magiccoder_evol_10k_qlora_ortho
  results: []
---
# llama_2_13b_Magiccoder_evol_10k_qlora_ortho
This model is a fine-tuned version of [unsloth/llama-2-13b-bnb-4bit](https://huggingface.co/unsloth/llama-2-13b-bnb-4bit) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0950
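
Since this repository contains a PEFT (QLoRA) adapter rather than full model weights, it is loaded on top of the 4-bit base model. Below is a minimal loading sketch; the repo id `your-username/llama_2_13b_Magiccoder_evol_10k_qlora_ortho` is a hypothetical placeholder for wherever the adapter is actually hosted.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the pre-quantized 4-bit base model (requires bitsandbytes).
base = AutoModelForCausalLM.from_pretrained(
    "unsloth/llama-2-13b-bnb-4bit",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("unsloth/llama-2-13b-bnb-4bit")

# Attach the LoRA adapter on top of the base model.
# NOTE: the adapter repo id below is a placeholder, not a confirmed path.
model = PeftModel.from_pretrained(
    base, "your-username/llama_2_13b_Magiccoder_evol_10k_qlora_ortho"
)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```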
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 0.02
- num_epochs: 1
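
These settings map directly onto `transformers.TrainingArguments`. The sketch below is a reconstruction under stated assumptions, not the original training script: `output_dir` is a placeholder, and the fractional `warmup_steps` value of 0.02 is interpreted as `warmup_ratio`, since `TrainingArguments` expects an integer step count.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="llama_2_13b_Magiccoder_evol_10k_qlora_ortho",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 * 8 = total train batch size of 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,  # logged as "warmup_steps: 0.02"; a fractional value
                        # only makes sense as a ratio of total steps
    num_train_epochs=1,
    optim="adamw_torch",  # Adam with betas=(0.9, 0.999), epsilon=1e-08
)
```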
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.2067        | 0.0262 | 4    | 1.1823          |
| 1.1675        | 0.0523 | 8    | 1.1498          |
| 1.1004        | 0.0785 | 12   | 1.1349          |
| 1.0531        | 0.1047 | 16   | 1.1288          |
| 1.0946        | 0.1308 | 20   | 1.1246          |
| 1.0602        | 0.1570 | 24   | 1.1215          |
| 1.0636        | 0.1832 | 28   | 1.1175          |
| 1.1078        | 0.2093 | 32   | 1.1151          |
| 1.04          | 0.2355 | 36   | 1.1125          |
| 1.115         | 0.2617 | 40   | 1.1123          |
| 1.0994        | 0.2878 | 44   | 1.1102          |
| 1.1379        | 0.3140 | 48   | 1.1098          |
| 1.1145        | 0.3401 | 52   | 1.1064          |
| 1.0849        | 0.3663 | 56   | 1.1088          |
| 1.1317        | 0.3925 | 60   | 1.1087          |
| 1.134         | 0.4186 | 64   | 1.1056          |
| 1.0856        | 0.4448 | 68   | 1.1038          |
| 1.0972        | 0.4710 | 72   | 1.1004          |
| 1.044         | 0.4971 | 76   | 1.1005          |
| 1.1311        | 0.5233 | 80   | 1.1004          |
| 1.1474        | 0.5495 | 84   | 1.1002          |
| 1.0886        | 0.5756 | 88   | 1.0999          |
| 1.0372        | 0.6018 | 92   | 1.0973          |
| 1.0376        | 0.6280 | 96   | 1.0968          |
| 1.1006        | 0.6541 | 100  | 1.0965          |
| 1.09          | 0.6803 | 104  | 1.0964          |
| 1.0786        | 0.7065 | 108  | 1.0969          |
| 1.111         | 0.7326 | 112  | 1.0970          |
| 1.053         | 0.7588 | 116  | 1.0961          |
| 1.0764        | 0.7850 | 120  | 1.0948          |
| 1.0971        | 0.8111 | 124  | 1.0944          |
| 1.0572        | 0.8373 | 128  | 1.0948          |
| 0.999         | 0.8635 | 132  | 1.0949          |
| 1.1098        | 0.8896 | 136  | 1.0951          |
| 1.0215        | 0.9158 | 140  | 1.0951          |
| 1.0759        | 0.9419 | 144  | 1.0951          |
| 1.096         | 0.9681 | 148  | 1.0950          |
| 1.08          | 0.9943 | 152  | 1.0950          |
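
For intuition, the reported loss converts to perplexity via exp(loss), assuming it is the standard mean per-token cross-entropy of the causal-LM objective:

```python
# Perplexity from the final validation loss (assumes mean per-token
# cross-entropy, the usual causal-LM objective).
import math

val_loss = 1.0950
print(f"perplexity = exp({val_loss:.4f}) = {math.exp(val_loss):.2f}")  # ~2.99
```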
### Framework versions
- PEFT 0.7.1
- Transformers 4.40.2
- PyTorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1