lapp0 committed
Commit 4f7c44e
1 Parent(s): d6d6899

Training in progress, step 61875

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

  It achieves the following results on the evaluation set:
- - eval_enwikippl: 149.5442
- - eval_frwikippl: 28142.1230
- - eval_zhwikippl: 243104.3594
- - eval_tinystoriesppl: 11.2706
- - eval_loss: 7.4452
- - eval_runtime: 66.0052
- - eval_samples_per_second: 75.752
- - eval_steps_per_second: 9.469
+ - eval_enwikippl: 4707.3955
+ - eval_frwikippl: 38983.5547
+ - eval_zhwikippl: 53243.9219
+ - eval_tinystoriesppl: 1636.1367
+ - eval_loss: 5.6090
+ - eval_runtime: 33.6693
+ - eval_samples_per_second: 74.252
+ - eval_steps_per_second: 9.296

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -45,10 +45,10 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=raw_mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=raw_mse, layer_mapper=None, projector=None))
  - train_embeddings: True
- - learning_rate: 0.004
- - train_batch_size: 8
+ - learning_rate: 0.0004
+ - train_batch_size: 16
  - eval_batch_size: 8
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -56,37 +56,24 @@ The following hyperparameters were used during training:
  - num_epochs: 1.0

  ### Resource Usage
- Peak GPU Memory: 8.2666 GB
+ Peak GPU Memory: 16.2515 GB

  ### Eval-Phase Metrics
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 58069.8203 | 77442.5625 | 18.5372 | 65.9335 | 75.834 | 9.479 | 46072.8867 | 100550.5078 |
- | 3000 | 0.0485 | 145.3919 | 28068.8965 | 7.4450 | 66.1007 | 75.642 | 9.455 | 10.8792 | 239563.0469 |
- | 6000 | 0.0970 | 143.8738 | 27934.7852 | 7.4443 | 65.9587 | 75.805 | 9.476 | 10.7017 | 239818.8281 |
- | 9000 | 0.1455 | 149.5442 | 28142.1230 | 7.4452 | 66.0052 | 75.752 | 9.469 | 11.2706 | 243104.3594 |
- | 12000 | 0.1939 | 141.4481 | 28096.5879 | 7.4447 | 66.228 | 75.497 | 9.437 | 10.4616 | 242650.5938 |
- | 15000 | 0.2424 | 141.9365 | 27532.4258 | 7.4447 | 66.1198 | 75.62 | 9.453 | 10.5402 | 235884.5 |
- | 18000 | 0.2909 | 150.4271 | 28453.0332 | 7.4452 | 65.9158 | 75.854 | 9.482 | 11.3604 | 248415.0781 |
- | 21000 | 0.3394 | 148.4649 | 27337.2715 | 7.4450 | 65.7078 | 76.094 | 9.512 | 11.2888 | 229674.3125 |
- | 24000 | 0.3879 | 149.7760 | 28039.2520 | 7.4446 | 65.7891 | 76.0 | 9.5 | 11.2827 | 240716.3594 |
- | 27000 | 0.4364 | 141.8706 | 28049.1211 | 7.4454 | 65.831 | 75.952 | 9.494 | 10.5280 | 235255.9062 |
- | 30000 | 0.4848 | 144.7906 | 28084.7207 | 7.4449 | 65.9422 | 75.824 | 9.478 | 10.7119 | 240716.3594 |
- | 33000 | 0.5333 | 149.6832 | 28237.4258 | 7.4454 | 65.807 | 75.98 | 9.497 | 11.2524 | 244013.9531 |
- | 36000 | 0.5818 | 148.6030 | 27445.3125 | 7.4453 | 65.6651 | 76.144 | 9.518 | 11.2729 | 236893.5625 |
- | 39000 | 0.6303 | 142.9683 | 27676.2949 | 7.4447 | 65.6729 | 76.135 | 9.517 | 10.6589 | 235381.5781 |
- | 42000 | 0.6788 | 146.5510 | 27895.4648 | 7.4449 | 65.6881 | 76.117 | 9.515 | 10.9904 | 239690.7812 |
- | 45000 | 0.7273 | 149.2144 | 28023.4531 | 7.4454 | 65.9058 | 75.866 | 9.483 | 11.2701 | 240716.3594 |
- | 48000 | 0.7758 | 144.2086 | 28243.4043 | 7.4449 | 65.873 | 75.904 | 9.488 | 10.7022 | 244339.7188 |
- | 51000 | 0.8242 | 141.9915 | 27781.7559 | 7.4450 | 65.9256 | 75.843 | 9.48 | 10.5589 | 239563.0469 |
- | 54000 | 0.8727 | 145.6399 | 28219.5234 | 7.4451 | 65.6892 | 76.116 | 9.515 | 10.8693 | 238542.6094 |
- | 57000 | 0.9212 | 144.2365 | 27040.4609 | 7.4445 | 65.6838 | 76.122 | 9.515 | 10.8312 | 227175.6875 |
- | 60000 | 0.9697 | 144.3482 | 26979.5938 | 7.4447 | 65.623 | 76.193 | 9.524 | 10.9257 | 232138.6562 |
- | 61875 | 1.0 | 146.9147 | 28084.7207 | 7.4450 | 65.6082 | 76.21 | 9.526 | 10.9569 | 237146.5 |
+ | 0 | 0 | 27548.9473 | 81896.5703 | 7.1177 | 33.5606 | 74.492 | 9.326 | 15431.1592 | 64176.7344 |
+ | 2000 | 0.1293 | 4707.3955 | 39038.5078 | 5.6090 | 33.5798 | 74.45 | 9.321 | 1635.3256 | 53243.9219 |
+ | 4000 | 0.2586 | 4704.4829 | 38983.5547 | 5.6090 | 33.4351 | 74.772 | 9.361 | 1632.6235 | 53243.9219 |
+ | 6000 | 0.3879 | 4717.6201 | 38972.5898 | 5.6090 | 33.4912 | 74.646 | 9.346 | 1638.3024 | 53243.9219 |
+ | 8000 | 0.5172 | 4707.3955 | 38983.5547 | 5.6090 | 33.6693 | 74.252 | 9.296 | 1636.1367 | 53243.9219 |
+ | 10000 | 0.6465 | 4707.3955 | 38994.5625 | 5.6090 | 33.5614 | 74.49 | 9.326 | 1636.4075 | 53243.9219 |
+ | 12000 | 0.7757 | 4707.3955 | 39005.5352 | 5.6090 | 33.5763 | 74.457 | 9.322 | 1634.2444 | 53243.9219 |
+ | 14000 | 0.9050 | 4708.1274 | 38994.5625 | 5.6090 | 33.4805 | 74.67 | 9.349 | 1636.4075 | 53243.9219 |
+ | 15469 | 1.0 | 4704.4829 | 39027.5234 | 5.6090 | 33.5496 | 74.517 | 9.329 | 1632.6235 | 53243.9219 |

  ### Framework versions
  - Distily 0.2.0
  - Transformers 4.44.0
  - Pytorch 2.3.0
- - Datasets 2.20.0
+ - Datasets 2.21.0
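
The eval_enwikippl, eval_frwikippl, eval_zhwikippl, and eval_tinystoriesppl figures above are perplexities on the respective evaluation corpora (lower is better). As a reference point only, perplexity is conventionally the exponential of the mean next-token cross-entropy; the sketch below shows that standard definition, which may differ in detail from Distily's actual evaluation loop:

```python
import torch
import torch.nn.functional as F

# Reference sketch: perplexity = exp(mean next-token cross-entropy).
# Standard definition only; Distily's eval code is not shown in this commit.
def perplexity(logits: torch.Tensor, labels: torch.Tensor) -> float:
    # logits: (batch, seq_len, vocab_size); labels: (batch, seq_len)
    nll = F.cross_entropy(
        logits[:, :-1].reshape(-1, logits.size(-1)),  # positions predicting token t+1
        labels[:, 1:].reshape(-1),                    # shifted targets
    )
    return float(torch.exp(nll))
```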
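
The updated distillation_objective combines three loss components: a KL divergence on the logits (weight 1), an MSE on the hidden states (weight 10.0), and a raw MSE on the attention maps (weight 10.0), with no layer mapper or projector. The sketch below illustrates what such an objective computes in PyTorch. It is an assumption-laden illustration, not Distily's implementation: `raw_mse` is approximated by plain MSE (the actual normalization may differ), and `student_out`/`teacher_out` are hypothetical forward-pass outputs obtained with hidden states and attentions enabled.

```python
import torch
import torch.nn.functional as F

# Rough sketch of the combined distillation objective (NOT Distily's code).
def distillation_loss(student_out, teacher_out) -> torch.Tensor:
    # logits component (weight 1): KL(teacher || student) over the vocabulary
    kl = F.kl_div(
        F.log_softmax(student_out.logits, dim=-1),
        F.log_softmax(teacher_out.logits, dim=-1),
        log_target=True,
        reduction="batchmean",
    )
    # hidden-state component (weight 10.0, loss_fn=mse, no layer_mapper):
    # average the per-layer MSE over all hidden states
    hs = torch.stack([
        F.mse_loss(s, t)
        for s, t in zip(student_out.hidden_states, teacher_out.hidden_states)
    ]).mean()
    # attention component (weight 10.0, loss_fn=raw_mse, approximated here
    # by plain MSE over the attention probability maps)
    attn = torch.stack([
        F.mse_loss(s, t)
        for s, t in zip(student_out.attentions, teacher_out.attentions)
    ]).mean()
    return 1 * kl + 10.0 * hs + 10.0 * attn
```

With the listed hyperparameters, the matching optimizer setup would be along the lines of `torch.optim.Adam(student.parameters(), lr=4e-4, betas=(0.9, 0.999), eps=1e-8)`.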
logs/attn_loss_fn=mse, attn_weight=10.0, hidden_weight=10.0, hs_loss_fn=raw_mse, learning_rate=0.0004, warmup_ratio=0/events.out.tfevents.1723766581.b7d545513dcf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6551b035f6dc920a7b3f13aec9ccbb7d136971a4fad59f471bd17024436cbd84
+ size 6167
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=raw_mse, hs_weight=10.0, learning_rate=0.0004, warmup_ratio=0/events.out.tfevents.1723766901.b7d545513dcf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:191f333e8823ef8dcd78abe43a6692d30926b3913e5e694f1fa0081b64b192c0
+ size 16923354
logs/attn_loss_fn=raw_mse, attn_weight=10.0, hs_loss_fn=mse, hs_weight=10.0, learning_rate=0.0004/events.out.tfevents.1723766193.b7d545513dcf CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a39f0354e09a55bd32dea8da9497ba0398c5bc1ebafb21df79ba7e23e5f3c729
- size 307
+ oid sha256:be3390ab7d76842d7578dbd65c922b04dba3f6055eba39efd8e41e27219c4d1c
+ size 578
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9505c0f32ea5812d4a2fba0aa92c57148f2a4589c13fbdefc77bdb4fa6713906
+ oid sha256:9218a59ae4f858d40b4417133a24c66b9633847dc9c92574f7cbd3f72b848f35
  size 137033984
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:658966eef28ccc5ad224ee05199999e2c90265f455f817d1dd80e3722545273f
+ oid sha256:c94dfda9104b5cee2c69f29fd41ba84612c310dd645555e3e221c4ae07c87b0c
  size 1017948104
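
The binary files in this commit (the TensorBoard event logs, model.safetensors, and training_args.bin) are tracked with Git LFS, so each diff above shows only the three-line pointer file stored in the repository: a spec version, a sha256 object id, and the blob size in bytes. A minimal sketch of reading such a pointer, using a hypothetical helper:

```python
# Hypothetical helper for Git LFS pointer files like the ones above. The
# pointer is the only content in the git repo; the actual binary (e.g. the
# 137033984-byte model.safetensors) lives in LFS storage, addressed by oid.

def parse_lfs_pointer(text: str) -> dict:
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {
        "version": fields["version"],
        "oid_algorithm": algo,
        "oid": digest,
        "size_bytes": int(fields["size"]),
    }

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:9218a59ae4f858d40b4417133a24c66b9633847dc9c92574f7cbd3f72b848f35
size 137033984
"""
print(parse_lfs_pointer(pointer))
```

Fetching the actual blobs (for example via `git lfs pull`, or the huggingface_hub download utilities) replaces these pointers with the binaries they reference.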