# mixtral_no_robots_secondtry
This model is a fine-tuned version of mistralai/Mixtral-8x7B-v0.1. The fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:
- Loss: 0.9807
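
Assuming the reported loss is the usual token-level cross-entropy, this corresponds to an evaluation perplexity of roughly exp(0.9807) ≈ 2.67.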
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows this list):
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 8
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
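
For reference, here is a minimal sketch of how these values would map onto Hugging Face `TrainingArguments`. The `output_dir` is a placeholder, and gradient accumulation is assumed to be 1, since 1 sample per device across 8 GPUs already yields the total train batch size of 8:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="mixtral_no_robots_secondtry",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=1,   # 1 sample x 8 GPUs = total train batch 8
    per_device_eval_batch_size=8,    # 8 samples x 8 GPUs = total eval batch 64
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```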
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
1.0635 | 0.02 | 1000 | 1.1332 |
0.9311 | 0.03 | 2000 | 1.1109 |
0.9417 | 0.05 | 3000 | 1.0926 |
1.0411 | 0.06 | 4000 | 1.0809 |
0.9516 | 0.08 | 5000 | 1.0786 |
1.0107 | 0.09 | 6000 | 1.0726 |
1.0698 | 0.11 | 7000 | 1.0666 |
1.1083 | 0.13 | 8000 | 1.0638 |
0.9148 | 0.14 | 9000 | 1.0589 |
0.957 | 0.16 | 10000 | 1.0565 |
1.0063 | 0.17 | 11000 | 1.0531 |
0.9831 | 0.19 | 12000 | 1.0509 |
1.0826 | 0.2 | 13000 | 1.0490 |
0.9598 | 0.22 | 14000 | 1.0518 |
0.8066 | 0.23 | 15000 | 1.0453 |
0.8795 | 0.25 | 16000 | 1.0431 |
1.1402 | 0.27 | 17000 | 1.0442 |
1.0652 | 0.28 | 18000 | 1.0428 |
0.93 | 0.3 | 19000 | 1.0371 |
0.9727 | 0.31 | 20000 | 1.0344 |
1.0753 | 0.33 | 21000 | 1.0339 |
0.9498 | 0.34 | 22000 | 1.0303 |
0.6971 | 0.36 | 23000 | 1.0316 |
0.9259 | 0.38 | 24000 | 1.0298 |
1.0359 | 0.39 | 25000 | 1.0284 |
1.1883 | 0.41 | 26000 | 1.0273 |
0.8642 | 0.42 | 27000 | 1.0250 |
0.9147 | 0.44 | 28000 | 1.0226 |
0.7824 | 0.45 | 29000 | 1.0237 |
0.8319 | 0.47 | 30000 | 1.0219 |
0.9443 | 0.49 | 31000 | 1.0190 |
0.9103 | 0.5 | 32000 | 1.0166 |
0.8903 | 0.52 | 33000 | 1.0149 |
1.0509 | 0.53 | 34000 | 1.0148 |
1.0008 | 0.55 | 35000 | 1.0151 |
0.778 | 0.56 | 36000 | 1.0106 |
0.7957 | 0.58 | 37000 | 1.0090 |
0.8679 | 0.6 | 38000 | 1.0085 |
1.064 | 0.61 | 39000 | 1.0064 |
0.823 | 0.63 | 40000 | 1.0061 |
0.9117 | 0.64 | 41000 | 1.0047 |
0.8284 | 0.66 | 42000 | 1.0019 |
0.9345 | 0.67 | 43000 | 1.0012 |
0.9854 | 0.69 | 44000 | 1.0004 |
0.7631 | 0.7 | 45000 | 0.9989 |
0.7189 | 0.72 | 46000 | 0.9979 |
0.9386 | 0.74 | 47000 | 0.9952 |
1.011 | 0.75 | 48000 | 0.9943 |
0.9627 | 0.77 | 49000 | 0.9941 |
1.1317 | 0.78 | 50000 | 0.9923 |
1.0506 | 0.8 | 51000 | 0.9912 |
0.8596 | 0.81 | 52000 | 0.9894 |
0.9702 | 0.83 | 53000 | 0.9889 |
1.0198 | 0.85 | 54000 | 0.9875 |
1.1125 | 0.86 | 55000 | 0.9862 |
0.9356 | 0.88 | 56000 | 0.9862 |
0.7212 | 0.89 | 57000 | 0.9852 |
0.974 | 0.91 | 58000 | 0.9843 |
0.9369 | 0.92 | 59000 | 0.9829 |
0.938 | 0.94 | 60000 | 0.9826 |
0.8011 | 0.96 | 61000 | 0.9818 |
0.7937 | 0.97 | 62000 | 0.9811 |
0.9679 | 0.99 | 63000 | 0.9807 |
### Framework versions
- PEFT 0.7.1
- Transformers 4.36.0
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
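
Since the card lists a PEFT version, the checkpoint is presumably a PEFT (e.g. LoRA) adapter on top of the Mixtral base model. Under that assumption, here is a minimal loading sketch; the adapter repo id `your-username/mixtral_no_robots_secondtry` is a placeholder:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mixtral-8x7B-v0.1"
adapter_id = "your-username/mixtral_no_robots_secondtry"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.float16,  # Mixtral is large; half precision assumed
    device_map="auto",
)
model = PeftModel.from_pretrained(base, adapter_id)  # attach the adapter weights

prompt = "Write a short haiku about robots."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```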