lapp0 committed
Commit d81becf
1 Parent(s): 52e9b8e

End of training

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
  It achieves the following results on the evaluation set:
- - eval_enwikippl: 162.8458
- - eval_frwikippl: 38427.4805
- - eval_zhwikippl: 300303.4688
- - eval_tinystoriesppl: 11.0505
- - eval_loss: 8.0215
- - eval_runtime: 66.4346
- - eval_samples_per_second: 75.262
- - eval_steps_per_second: 9.408
+ - eval_enwikippl: 148.8680
+ - eval_frwikippl: 21987.7637
+ - eval_zhwikippl: 181662.0469
+ - eval_tinystoriesppl: 12.2941
+ - eval_loss: 25.4402
+ - eval_runtime: 66.3462
+ - eval_samples_per_second: 75.362
+ - eval_steps_per_second: 9.42
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -45,7 +45,7 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=cos, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=kl, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
  - train_embeddings: True
  - learning_rate: 0.004
  - train_batch_size: 8
@@ -56,34 +56,34 @@ The following hyperparameters were used during training:
  - num_epochs: 1.0
 
  ### Resource Usage
- Peak GPU Memory: 8.2677 GB
+ Peak GPU Memory: 8.2666 GB
 
  ### Eval-Phase Metrics
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 21397.4785 | 57946.0117 | 14.5303 | 66.4513 | 75.243 | 9.405 | 12321.8145 | 60955.8008 |
- | 3000 | 0.0485 | 162.4552 | 39038.5078 | 8.0218 | 66.4482 | 75.247 | 9.406 | 11.0391 | 307027.6562 |
- | 6000 | 0.0970 | 153.5114 | 39220.4258 | 8.0201 | 66.3712 | 75.334 | 9.417 | 10.0871 | 301748.9375 |
- | 9000 | 0.1455 | 162.8458 | 38427.4805 | 8.0215 | 66.4346 | 75.262 | 9.408 | 11.0505 | 300303.4688 |
- | 12000 | 0.1939 | 162.6252 | 39458.6953 | 8.0210 | 66.1823 | 75.549 | 9.444 | 10.9736 | 309660.0625 |
- | 15000 | 0.2424 | 163.4587 | 39866.5234 | 8.0206 | 66.2793 | 75.438 | 9.43 | 10.9968 | 317608.9688 |
- | 18000 | 0.2909 | 162.5244 | 39259.0820 | 8.0212 | 66.4555 | 75.238 | 9.405 | 10.9723 | 301427.0938 |
- | 21000 | 0.3394 | 163.7503 | 39860.8984 | 8.0212 | 66.3843 | 75.319 | 9.415 | 11.0890 | 308341.0625 |
- | 24000 | 0.3879 | 161.7958 | 39126.6172 | 8.0216 | 66.3302 | 75.38 | 9.423 | 10.9895 | 296482.0312 |
- | 27000 | 0.4364 | 163.1867 | 39303.3633 | 8.0212 | 66.2968 | 75.418 | 9.427 | 11.0542 | 302797.375 |
- | 30000 | 0.4848 | 153.6542 | 39408.6797 | 8.0203 | 66.0901 | 75.654 | 9.457 | 10.0771 | 302393.6562 |
- | 33000 | 0.5333 | 163.0667 | 39922.7188 | 8.0218 | 66.1778 | 75.554 | 9.444 | 10.9659 | 308999.875 |
- | 36000 | 0.5818 | 161.7519 | 38254.6758 | 8.0205 | 66.4691 | 75.223 | 9.403 | 10.9483 | 301910.125 |
- | 39000 | 0.6303 | 152.6517 | 39331.0703 | 8.0206 | 66.292 | 75.424 | 9.428 | 9.9620 | 302393.6562 |
- | 42000 | 0.6788 | 161.3577 | 38265.4375 | 8.0211 | 66.2169 | 75.509 | 9.439 | 11.0086 | 296798.5938 |
- | 45000 | 0.7273 | 152.9832 | 40238.8672 | 8.0213 | 66.3366 | 75.373 | 9.422 | 9.9365 | 316593.7812 |
- | 48000 | 0.7758 | 163.2057 | 40012.7852 | 8.0204 | 66.3233 | 75.388 | 9.424 | 10.9546 | 311815.375 |
- | 51000 | 0.8242 | 151.6968 | 39000.0664 | 8.0212 | 66.2188 | 75.507 | 9.438 | 9.9484 | 299823.0 |
- | 54000 | 0.8727 | 153.8090 | 39832.8125 | 8.0208 | 66.3113 | 75.402 | 9.425 | 10.0065 | 303363.5938 |
- | 57000 | 0.9212 | 159.2961 | 39759.9453 | 8.0196 | 66.1288 | 75.61 | 9.451 | 10.5878 | 314824.9062 |
- | 60000 | 0.9697 | 151.4502 | 39022.0156 | 8.0217 | 66.3518 | 75.356 | 9.419 | 9.9451 | 294904.2812 |
- | 61875 | 1.0 | 161.7269 | 38666.3906 | 8.0211 | 66.5523 | 75.129 | 9.391 | 11.0359 | 299024.3125 |
+ | 0 | 0 | 8167.3613 | 48488.5742 | 38.4688 | 65.8686 | 75.909 | 9.489 | 3345.3254 | 73944.1484 |
+ | 3000 | 0.0485 | 145.5666 | 22094.8828 | 25.4406 | 65.8501 | 75.93 | 9.491 | 11.9420 | 179685.7344 |
+ | 6000 | 0.0970 | 146.8635 | 22376.7676 | 25.4394 | 66.4135 | 75.286 | 9.411 | 11.9667 | 183024.1719 |
+ | 9000 | 0.1455 | 148.8680 | 21987.7637 | 25.4402 | 66.3462 | 75.362 | 9.42 | 12.2941 | 181662.0469 |
+ | 12000 | 0.1939 | 151.0636 | 22504.7676 | 25.4400 | 66.2246 | 75.501 | 9.438 | 12.5052 | 181759.0938 |
+ | 15000 | 0.2424 | 146.5339 | 22604.8535 | 25.4392 | 66.1192 | 75.621 | 9.453 | 11.8540 | 189888.4375 |
+ | 18000 | 0.2909 | 147.3192 | 22481.0215 | 25.4400 | 66.2457 | 75.477 | 9.435 | 12.0058 | 183905.2969 |
+ | 21000 | 0.3394 | 150.9525 | 22555.5625 | 25.4390 | 66.2661 | 75.453 | 9.432 | 12.4310 | 188575.7344 |
+ | 24000 | 0.3879 | 149.8920 | 22155.6523 | 25.4404 | 66.2363 | 75.487 | 9.436 | 12.4593 | 177493.9531 |
+ | 27000 | 0.4364 | 147.1653 | 22531.7402 | 25.4398 | 66.3823 | 75.321 | 9.415 | 11.9514 | 183905.2969 |
+ | 30000 | 0.4848 | 150.4855 | 22580.9805 | 25.4400 | 66.2281 | 75.497 | 9.437 | 12.4172 | 183513.1875 |
+ | 33000 | 0.5333 | 145.7359 | 22307.5195 | 25.4400 | 66.4448 | 75.25 | 9.406 | 11.9159 | 180165.8438 |
+ | 36000 | 0.5818 | 148.7297 | 22495.2617 | 25.4396 | 66.2715 | 75.447 | 9.431 | 12.1426 | 186574.0156 |
+ | 39000 | 0.6303 | 147.5820 | 22807.9492 | 25.4406 | 66.6342 | 75.037 | 9.38 | 11.9944 | 187372.1406 |
+ | 42000 | 0.6788 | 150.2292 | 22193.125 | 25.4402 | 66.5873 | 75.089 | 9.386 | 12.5202 | 182050.1875 |
+ | 45000 | 0.7273 | 146.7725 | 22207.2051 | 25.4400 | 66.1476 | 75.589 | 9.449 | 11.9890 | 181468.2812 |
+ | 48000 | 0.7758 | 146.3014 | 22194.6914 | 25.4398 | 66.4166 | 75.282 | 9.41 | 11.9746 | 177588.7812 |
+ | 51000 | 0.8242 | 148.6375 | 22533.3301 | 25.4402 | 66.2612 | 75.459 | 9.432 | 12.1471 | 186275.5156 |
+ | 54000 | 0.8727 | 147.6220 | 22394.1035 | 25.4404 | 66.4085 | 75.292 | 9.411 | 12.1140 | 185581.1406 |
+ | 57000 | 0.9212 | 148.8161 | 22679.8047 | 25.4400 | 66.3328 | 75.377 | 9.422 | 12.1230 | 187872.7812 |
+ | 60000 | 0.9697 | 146.8180 | 22345.2695 | 25.4392 | 66.5261 | 75.158 | 9.395 | 12.0317 | 181371.5625 |
+ | 61875 | 1.0 | 149.0526 | 22099.5410 | 25.4400 | 66.498 | 75.19 | 9.399 | 12.3048 | 181371.5625 |
 
  ### Framework versions
  - Distily 0.2.0
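
For readers unfamiliar with the `distillation_objective` string changed in this commit: it describes a weighted sum of three terms, KL divergence on the logits (weight 1), a hidden-state loss (weight 10.0, switched here from cosine to KL), and MSE on the attention maps (weight 10.0), with no layer mapper or projector. The snippet below is only an illustrative sketch in plain PyTorch, not Distily's actual implementation; the function name and the assumption that student and teacher expose matching hidden-state and attention tuples are ours.

```python
# Illustrative sketch only -- NOT Distily's implementation.
# Assumes both models were run with output_hidden_states=True and
# output_attentions=True, and have matching layer shapes
# (layer_mapper=None, projector=None in the objective above).
import torch.nn.functional as F

def combined_distillation_loss(student_out, teacher_out,
                               logits_weight=1.0, hs_weight=10.0, attn_weight=10.0):
    # Logits component (loss_fn=kl): KL divergence between student and
    # teacher next-token distributions.
    logits_loss = F.kl_div(
        F.log_softmax(student_out.logits, dim=-1),
        F.softmax(teacher_out.logits, dim=-1),
        reduction="batchmean",
    )

    # Hidden-state component (loss_fn=kl in this run; previously cos):
    # KL between per-position distributions over hidden units, averaged over layers.
    hs_loss = sum(
        F.kl_div(F.log_softmax(s, dim=-1), F.softmax(t, dim=-1), reduction="batchmean")
        for s, t in zip(student_out.hidden_states, teacher_out.hidden_states)
    ) / len(student_out.hidden_states)

    # Attention component (loss_fn=mse): mean squared error between attention maps.
    attn_loss = sum(
        F.mse_loss(s, t)
        for s, t in zip(student_out.attentions, teacher_out.attentions)
    ) / len(student_out.attentions)

    return logits_weight * logits_loss + hs_weight * hs_loss + attn_weight * attn_loss
```
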
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=kl, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/events.out.tfevents.1723827667.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:28531dfe2f26a3344850a505958d64116767b0443a5e9b00640e291088d0f056
+ size 312
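
On the `*ppl` metrics in the eval tables above: going by their names, enwikippl, frwikippl, zhwikippl, and tinystoriesppl appear to be the student's perplexities on English, French, and Chinese Wikipedia text and on TinyStories; the exact evaluation code lives in Distily. As a hedged reminder of the standard definition only, perplexity is the exponential of the mean per-token negative log-likelihood:

```python
import math

def perplexity(total_nll: float, total_tokens: int) -> float:
    # Standard definition: exp of the average per-token negative log-likelihood.
    return math.exp(total_nll / total_tokens)
```
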