lapp0 committed on
Commit b566046
1 Parent(s): a384cac

End of training

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
- - eval_enwikippl: 165.1131
- - eval_frwikippl: 53475.5117
- - eval_zhwikippl: 433274.5625
- - eval_tinystoriesppl: 9.8234
- - eval_loss: 1.2315
- - eval_runtime: 66.333
- - eval_samples_per_second: 75.377
- - eval_steps_per_second: 9.422
+ - eval_enwikippl: 162.8458
+ - eval_frwikippl: 38427.4805
+ - eval_zhwikippl: 300303.4688
+ - eval_tinystoriesppl: 11.0505
+ - eval_loss: 8.0215
+ - eval_runtime: 66.4346
+ - eval_samples_per_second: 75.262
+ - eval_steps_per_second: 9.408
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -45,7 +45,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=cos, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
 - train_embeddings: True
 - learning_rate: 0.004
 - train_batch_size: 8
@@ -53,7 +53,6 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: constant
- - lr_scheduler_warmup_ratio: 0.1
 - num_epochs: 1.0
 
 ### Resource Usage
@@ -63,28 +62,28 @@ Peak GPU Memory: 8.2677 GB
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 21397.4785 | 57946.0117 | 6.1625 | 66.2144 | 75.512 | 9.439 | 12321.8145 | 60955.8008 |
- | 3000 | 0.0485 | 163.4777 | 52357.5 | 1.2318 | 66.2157 | 75.511 | 9.439 | 9.7461 | 405426.5312 |
- | 6000 | 0.0970 | 164.5832 | 52313.2773 | 1.2313 | 66.4422 | 75.253 | 9.407 | 9.8299 | 399840.8438 |
- | 9000 | 0.1455 | 165.1131 | 53475.5117 | 1.2315 | 66.333 | 75.377 | 9.422 | 9.8234 | 433274.5625 |
- | 12000 | 0.1939 | 175.9144 | 53347.6094 | 1.2314 | 66.1602 | 75.574 | 9.447 | 10.8017 | 431314.25 |
- | 15000 | 0.2424 | 166.3006 | 53279.9844 | 1.2312 | 66.2525 | 75.469 | 9.434 | 9.8988 | 439327.75 |
- | 18000 | 0.2909 | 165.6127 | 53520.7148 | 1.2316 | 66.1729 | 75.56 | 9.445 | 9.8144 | 435128.4375 |
- | 21000 | 0.3394 | 164.5322 | 54035.7852 | 1.2317 | 66.1602 | 75.574 | 9.447 | 9.7405 | 440971.5312 |
- | 24000 | 0.3879 | 175.3294 | 52764.6719 | 1.2315 | 66.2724 | 75.446 | 9.431 | 10.8317 | 413399.9688 |
- | 27000 | 0.4364 | 176.5424 | 52542.1719 | 1.2318 | 66.1192 | 75.621 | 9.453 | 10.8923 | 411858.9688 |
- | 30000 | 0.4848 | 165.8760 | 53490.5586 | 1.2310 | 66.2275 | 75.497 | 9.437 | 9.8575 | 418728.4062 |
- | 33000 | 0.5333 | 165.8246 | 53823.1172 | 1.2314 | 66.2901 | 75.426 | 9.428 | 9.8718 | 418728.4062 |
- | 36000 | 0.5818 | 164.1630 | 51814.5781 | 1.2318 | 66.2021 | 75.526 | 9.441 | 9.7994 | 426621.9688 |
- | 39000 | 0.6303 | 176.7887 | 53868.6172 | 1.2316 | 66.2783 | 75.439 | 9.43 | 10.8604 | 428103.875 |
- | 42000 | 0.6788 | 176.1667 | 52483.0273 | 1.2317 | 66.1638 | 75.57 | 9.446 | 10.8981 | 430854.2188 |
- | 45000 | 0.7273 | 165.7860 | 54833.2031 | 1.2316 | 66.226 | 75.499 | 9.437 | 9.7437 | 433274.5625 |
- | 48000 | 0.7758 | 177.6536 | 54096.7305 | 1.2314 | 66.2463 | 75.476 | 9.434 | 10.8174 | 436174.1875 |
- | 51000 | 0.8242 | 164.2648 | 53400.2422 | 1.2316 | 66.365 | 75.341 | 9.418 | 9.7212 | 425257.9062 |
- | 54000 | 0.8727 | 177.6812 | 53792.7930 | 1.2314 | 66.3929 | 75.309 | 9.414 | 10.8147 | 419846.8125 |
- | 57000 | 0.9212 | 169.2639 | 54421.5391 | 1.2310 | 66.2287 | 75.496 | 9.437 | 10.0082 | 441678.1875 |
- | 60000 | 0.9697 | 163.0982 | 52077.9805 | 1.2316 | 66.2184 | 75.508 | 9.438 | 9.7240 | 401765.375 |
- | 61875 | 1.0 | 164.1884 | 52549.5898 | 1.2312 | 66.1617 | 75.572 | 9.447 | 9.8116 | 419846.8125 |
+ | 0 | 0 | 21397.4785 | 57946.0117 | 14.5303 | 66.4513 | 75.243 | 9.405 | 12321.8145 | 60955.8008 |
+ | 3000 | 0.0485 | 162.4552 | 39038.5078 | 8.0218 | 66.4482 | 75.247 | 9.406 | 11.0391 | 307027.6562 |
+ | 6000 | 0.0970 | 153.5114 | 39220.4258 | 8.0201 | 66.3712 | 75.334 | 9.417 | 10.0871 | 301748.9375 |
+ | 9000 | 0.1455 | 162.8458 | 38427.4805 | 8.0215 | 66.4346 | 75.262 | 9.408 | 11.0505 | 300303.4688 |
+ | 12000 | 0.1939 | 162.6252 | 39458.6953 | 8.0210 | 66.1823 | 75.549 | 9.444 | 10.9736 | 309660.0625 |
+ | 15000 | 0.2424 | 163.4587 | 39866.5234 | 8.0206 | 66.2793 | 75.438 | 9.43 | 10.9968 | 317608.9688 |
+ | 18000 | 0.2909 | 162.5244 | 39259.0820 | 8.0212 | 66.4555 | 75.238 | 9.405 | 10.9723 | 301427.0938 |
+ | 21000 | 0.3394 | 163.7503 | 39860.8984 | 8.0212 | 66.3843 | 75.319 | 9.415 | 11.0890 | 308341.0625 |
+ | 24000 | 0.3879 | 161.7958 | 39126.6172 | 8.0216 | 66.3302 | 75.38 | 9.423 | 10.9895 | 296482.0312 |
+ | 27000 | 0.4364 | 163.1867 | 39303.3633 | 8.0212 | 66.2968 | 75.418 | 9.427 | 11.0542 | 302797.375 |
+ | 30000 | 0.4848 | 153.6542 | 39408.6797 | 8.0203 | 66.0901 | 75.654 | 9.457 | 10.0771 | 302393.6562 |
+ | 33000 | 0.5333 | 163.0667 | 39922.7188 | 8.0218 | 66.1778 | 75.554 | 9.444 | 10.9659 | 308999.875 |
+ | 36000 | 0.5818 | 161.7519 | 38254.6758 | 8.0205 | 66.4691 | 75.223 | 9.403 | 10.9483 | 301910.125 |
+ | 39000 | 0.6303 | 152.6517 | 39331.0703 | 8.0206 | 66.292 | 75.424 | 9.428 | 9.9620 | 302393.6562 |
+ | 42000 | 0.6788 | 161.3577 | 38265.4375 | 8.0211 | 66.2169 | 75.509 | 9.439 | 11.0086 | 296798.5938 |
+ | 45000 | 0.7273 | 152.9832 | 40238.8672 | 8.0213 | 66.3366 | 75.373 | 9.422 | 9.9365 | 316593.7812 |
+ | 48000 | 0.7758 | 163.2057 | 40012.7852 | 8.0204 | 66.3233 | 75.388 | 9.424 | 10.9546 | 311815.375 |
+ | 51000 | 0.8242 | 151.6968 | 39000.0664 | 8.0212 | 66.2188 | 75.507 | 9.438 | 9.9484 | 299823.0 |
+ | 54000 | 0.8727 | 153.8090 | 39832.8125 | 8.0208 | 66.3113 | 75.402 | 9.425 | 10.0065 | 303363.5938 |
+ | 57000 | 0.9212 | 159.2961 | 39759.9453 | 8.0196 | 66.1288 | 75.61 | 9.451 | 10.5878 | 314824.9062 |
+ | 60000 | 0.9697 | 151.4502 | 39022.0156 | 8.0217 | 66.3518 | 75.356 | 9.419 | 9.9451 | 294904.2812 |
+ | 61875 | 1.0 | 161.7269 | 38666.3906 | 8.0211 | 66.5523 | 75.129 | 9.391 | 11.0359 | 299024.3125 |
 
 ### Framework versions
 - Distily 0.2.0
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/events.out.tfevents.1723816614.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0a1dddf40bf03b9f4dda52dd55e1463f0807bdb29e97f0f3b2ca5c3ab0b04e3f
+ size 312
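
For readers scanning the hyperparameter hunk above: the `distillation_objective` line describes a loss that sums a KL term on the logits (weight 1), a hidden-state term (weight 10.0), and an MSE term on the attentions (weight 10.0); this commit switches the hidden-state loss function from `mse` to `cos`. A minimal PyTorch sketch of such a weighted sum might look like the following (the function name, argument shapes, and reductions are illustrative assumptions, not Distily's actual implementation):

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits,
                      student_hs, teacher_hs,
                      student_attn, teacher_attn,
                      hs_weight=10.0, attn_weight=10.0):
    """Weighted sum of the three components named in distillation_objective.
    Illustrative sketch only; names and reductions are assumptions, not
    Distily's API."""
    # logits component: KL divergence between student and teacher distributions (weight 1)
    logits_loss = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
    # hidden-state component: cosine distance (hs_loss_fn=cos in this commit), weight 10.0
    hs_loss = (1.0 - F.cosine_similarity(student_hs, teacher_hs, dim=-1)).mean()
    # attention component: mean-squared error (attn_loss_fn=mse), weight 10.0
    attn_loss = F.mse_loss(student_attn, teacher_attn)
    return logits_loss + hs_weight * hs_loss + attn_weight * attn_loss


# Toy shapes for a quick check (batch=2, seq=4, vocab=8, hidden=16, heads=2)
s_logits, t_logits = torch.randn(2, 4, 8), torch.randn(2, 4, 8)
s_hs, t_hs = torch.randn(2, 4, 16), torch.randn(2, 4, 16)
s_attn, t_attn = torch.rand(2, 2, 4, 4), torch.rand(2, 2, 4, 4)
print(distillation_loss(s_logits, t_logits, s_hs, t_hs, s_attn, t_attn))
```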