lapp0 committed
Commit 52e9b8e
1 Parent(s): 56da803

Training in progress, step 61875

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
- - eval_enwikippl: 150.6954
- - eval_frwikippl: 20983.1934
- - eval_zhwikippl: 163274.0312
- - eval_tinystoriesppl: 13.6584
- - eval_loss: 2.1824
- - eval_runtime: 65.7475
- - eval_samples_per_second: 76.049
- - eval_steps_per_second: 9.506
+ - eval_enwikippl: 162.8458
+ - eval_frwikippl: 38427.4805
+ - eval_zhwikippl: 300303.4688
+ - eval_tinystoriesppl: 11.0505
+ - eval_loss: 8.0215
+ - eval_runtime: 66.4346
+ - eval_samples_per_second: 75.262
+ - eval_steps_per_second: 9.408
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -45,9 +45,9 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=cos, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
 - train_embeddings: True
- - learning_rate: 0.001
+ - learning_rate: 0.004
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
@@ -56,37 +56,37 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
- Peak GPU Memory: 8.2666 GB
+ Peak GPU Memory: 8.2677 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 15507.5488 | 57030.9961 | 5.9658 | 65.2151 | 76.669 | 9.584 | 7965.0278 | 102309.4219 |
- | 3000 | 0.0485 | 150.3223 | 21042.3887 | 2.1825 | 65.2954 | 76.575 | 9.572 | 13.5953 | 164147.5625 |
- | 6000 | 0.0970 | 150.5029 | 21042.3887 | 2.1824 | 65.675 | 76.132 | 9.517 | 13.6268 | 164498.2812 |
- | 9000 | 0.1455 | 150.6954 | 20983.1934 | 2.1824 | 65.7475 | 76.049 | 9.506 | 13.6584 | 163274.0312 |
- | 12000 | 0.1939 | 150.3514 | 20983.1934 | 2.1823 | 65.8055 | 75.981 | 9.498 | 13.6251 | 162882.4219 |
- | 15000 | 0.2424 | 150.6196 | 21042.3887 | 2.1824 | 65.3959 | 76.457 | 9.557 | 13.6482 | 164147.5625 |
- | 18000 | 0.2909 | 150.6487 | 21042.3887 | 2.1824 | 65.5594 | 76.267 | 9.533 | 13.6533 | 163797.5938 |
- | 21000 | 0.3394 | 150.3980 | 21042.3887 | 2.1824 | 65.4773 | 76.362 | 9.545 | 13.6234 | 163972.4844 |
- | 24000 | 0.3879 | 150.5495 | 21131.4961 | 2.1825 | 65.4348 | 76.412 | 9.551 | 13.6184 | 164849.75 |
- | 27000 | 0.4364 | 150.7538 | 21042.3887 | 2.1822 | 65.4127 | 76.438 | 9.555 | 13.6635 | 162925.9219 |
- | 30000 | 0.4848 | 150.6954 | 21042.3887 | 2.1825 | 65.4113 | 76.439 | 9.555 | 13.6510 | 164586.0 |
- | 33000 | 0.5333 | 151.0109 | 20983.1934 | 2.1823 | 65.6274 | 76.188 | 9.523 | 13.6832 | 163186.8594 |
- | 36000 | 0.5818 | 150.3514 | 21042.3887 | 2.1824 | 65.4107 | 76.44 | 9.555 | 13.6184 | 164586.0 |
- | 39000 | 0.6303 | 150.6020 | 20983.1934 | 2.1823 | 65.415 | 76.435 | 9.554 | 13.6550 | 163884.9375 |
- | 42000 | 0.6788 | 150.5495 | 21042.3887 | 2.1823 | 65.3696 | 76.488 | 9.561 | 13.6454 | 163186.8594 |
- | 45000 | 0.7273 | 150.3223 | 20995.0234 | 2.1824 | 65.7092 | 76.093 | 9.512 | 13.6257 | 163274.0312 |
- | 48000 | 0.7758 | 150.8706 | 21042.3887 | 2.1824 | 65.5511 | 76.276 | 9.535 | 13.6652 | 163186.8594 |
- | 51000 | 0.8242 | 150.8940 | 21006.8594 | 2.1823 | 65.6118 | 76.206 | 9.526 | 13.6719 | 163186.8594 |
- | 54000 | 0.8727 | 150.4738 | 20918.2773 | 2.1824 | 65.6557 | 76.155 | 9.519 | 13.6539 | 162925.9219 |
- | 57000 | 0.9212 | 150.4446 | 21042.3887 | 2.1824 | 65.3885 | 76.466 | 9.558 | 13.6257 | 163622.8906 |
- | 60000 | 0.9697 | 150.4097 | 20918.2773 | 2.1824 | 65.4087 | 76.442 | 9.555 | 13.6533 | 162795.4688 |
- | 61875 | 1.0 | 150.6896 | 21042.3887 | 2.1825 | 65.8705 | 75.907 | 9.488 | 13.6533 | 163972.4844 |
+ | 0 | 0 | 21397.4785 | 57946.0117 | 14.5303 | 66.4513 | 75.243 | 9.405 | 12321.8145 | 60955.8008 |
+ | 3000 | 0.0485 | 162.4552 | 39038.5078 | 8.0218 | 66.4482 | 75.247 | 9.406 | 11.0391 | 307027.6562 |
+ | 6000 | 0.0970 | 153.5114 | 39220.4258 | 8.0201 | 66.3712 | 75.334 | 9.417 | 10.0871 | 301748.9375 |
+ | 9000 | 0.1455 | 162.8458 | 38427.4805 | 8.0215 | 66.4346 | 75.262 | 9.408 | 11.0505 | 300303.4688 |
+ | 12000 | 0.1939 | 162.6252 | 39458.6953 | 8.0210 | 66.1823 | 75.549 | 9.444 | 10.9736 | 309660.0625 |
+ | 15000 | 0.2424 | 163.4587 | 39866.5234 | 8.0206 | 66.2793 | 75.438 | 9.43 | 10.9968 | 317608.9688 |
+ | 18000 | 0.2909 | 162.5244 | 39259.0820 | 8.0212 | 66.4555 | 75.238 | 9.405 | 10.9723 | 301427.0938 |
+ | 21000 | 0.3394 | 163.7503 | 39860.8984 | 8.0212 | 66.3843 | 75.319 | 9.415 | 11.0890 | 308341.0625 |
+ | 24000 | 0.3879 | 161.7958 | 39126.6172 | 8.0216 | 66.3302 | 75.38 | 9.423 | 10.9895 | 296482.0312 |
+ | 27000 | 0.4364 | 163.1867 | 39303.3633 | 8.0212 | 66.2968 | 75.418 | 9.427 | 11.0542 | 302797.375 |
+ | 30000 | 0.4848 | 153.6542 | 39408.6797 | 8.0203 | 66.0901 | 75.654 | 9.457 | 10.0771 | 302393.6562 |
+ | 33000 | 0.5333 | 163.0667 | 39922.7188 | 8.0218 | 66.1778 | 75.554 | 9.444 | 10.9659 | 308999.875 |
+ | 36000 | 0.5818 | 161.7519 | 38254.6758 | 8.0205 | 66.4691 | 75.223 | 9.403 | 10.9483 | 301910.125 |
+ | 39000 | 0.6303 | 152.6517 | 39331.0703 | 8.0206 | 66.292 | 75.424 | 9.428 | 9.9620 | 302393.6562 |
+ | 42000 | 0.6788 | 161.3577 | 38265.4375 | 8.0211 | 66.2169 | 75.509 | 9.439 | 11.0086 | 296798.5938 |
+ | 45000 | 0.7273 | 152.9832 | 40238.8672 | 8.0213 | 66.3366 | 75.373 | 9.422 | 9.9365 | 316593.7812 |
+ | 48000 | 0.7758 | 163.2057 | 40012.7852 | 8.0204 | 66.3233 | 75.388 | 9.424 | 10.9546 | 311815.375 |
+ | 51000 | 0.8242 | 151.6968 | 39000.0664 | 8.0212 | 66.2188 | 75.507 | 9.438 | 9.9484 | 299823.0 |
+ | 54000 | 0.8727 | 153.8090 | 39832.8125 | 8.0208 | 66.3113 | 75.402 | 9.425 | 10.0065 | 303363.5938 |
+ | 57000 | 0.9212 | 159.2961 | 39759.9453 | 8.0196 | 66.1288 | 75.61 | 9.451 | 10.5878 | 314824.9062 |
+ | 60000 | 0.9697 | 151.4502 | 39022.0156 | 8.0217 | 66.3518 | 75.356 | 9.419 | 9.9451 | 294904.2812 |
+ | 61875 | 1.0 | 161.7269 | 38666.3906 | 8.0211 | 66.5523 | 75.129 | 9.391 | 11.0359 | 299024.3125 |
 
 ### Framework versions
 - Distily 0.2.0
 - Transformers 4.44.0
 - Pytorch 2.3.0
- - Datasets 2.21.0
+ - Datasets 2.20.0
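For reference, the `distillation_objective` string above describes a three-term loss: KL divergence on the logits (weight 1), cosine distance on the hidden states (weight 10.0), and MSE on the attention maps (weight 10.0), with `layer_mapper=None` implying a one-to-one pairing of layers. The eval metrics ending in `ppl` are perplexities, i.e. the exponential of the mean per-token cross-entropy on the corresponding corpus. Below is a minimal PyTorch sketch of that combined loss; it illustrates what the configuration string describes, not Distily's actual implementation, and it assumes Hugging Face model outputs produced with `output_hidden_states=True` and `output_attentions=True`.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out):
    """Sketch of: 1 * kl(logits) + 10 * cos(hidden states) + 10 * mse(attentions)."""
    # Logits term (weight=1, loss_fn=kl): KL divergence from the teacher's
    # next-token distribution to the student's.
    kl = F.kl_div(
        F.log_softmax(student_out.logits, dim=-1),
        F.softmax(teacher_out.logits, dim=-1),
        reduction="batchmean",
    )

    # Hidden-state term (weight=10.0, loss_fn=cos): mean cosine distance,
    # paired layer by layer (layer_mapper=None).
    hs = torch.stack([
        (1.0 - F.cosine_similarity(s, t, dim=-1)).mean()
        for s, t in zip(student_out.hidden_states, teacher_out.hidden_states)
    ]).mean()

    # Attention term (weight=10.0, loss_fn=mse): mean squared error over the
    # attention maps, again paired layer by layer.
    attn = torch.stack([
        F.mse_loss(s, t)
        for s, t in zip(student_out.attentions, teacher_out.attentions)
    ]).mean()

    return 1.0 * kl + 10.0 * hs + 10.0 * attn
```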
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0.1/events.out.tfevents.1723816815.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8a668d4d5d9ee22e84ced534787863a8aec5572457902d33d954445df4140b72
+ size 1334060
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/completed.flag ADDED
File without changes
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/events.out.tfevents.1723816614.93d6cbb3ad53 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:0a1dddf40bf03b9f4dda52dd55e1463f0807bdb29e97f0f3b2ca5c3ab0b04e3f
- size 312
+ oid sha256:2ef3f46f295f2138f21bbe370ad4fe9cbe8bbe6d4574c93e06865808e9199cb7
+ size 588
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=kl, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/events.out.tfevents.1723817845.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5053e0141141db3cc44adebcb36ea98d3cf4d7d4f24602e95df8d6e88411bcbf
+ size 16923347
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=mse, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/completed.flag ADDED
File without changes
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=raw_mse, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/completed.flag ADDED
File without changes
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:75dd7216db29b00cea3be433dc8cf4e1ba5499f9209d64f6c54b00d74d88ebdc
+ oid sha256:e72efc3bcc17c16aac741b5ce5b2355af6267a89a64aac10bb5529c4b09e3ef9
 size 137033984
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:06603e22afc968d0e36b2e11bd6961be40ef09276a0f214f8780b1fe64eed6cd
+ oid sha256:40fc8813ba3d68350b2791e1ba88fedb4437a530af0495b54eb1ef81062c47c3
 size 1017948104
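Every binary artifact in this commit (`model.safetensors`, `training_args.bin`, the TensorBoard `events.out.tfevents.*` files) is tracked through Git LFS, so the repository stores only the three-line pointer stubs shown in the diffs above: the spec version, the SHA-256 of the file's content, and its size in bytes. A small sketch of how such a stub is derived from a file, using a hypothetical helper for illustration:

```python
import hashlib
from pathlib import Path

def lfs_pointer(path: str) -> str:
    """Build the Git LFS pointer stub: spec version, content SHA-256, byte size."""
    data = Path(path).read_bytes()
    oid = hashlib.sha256(data).hexdigest()  # the LFS object ID is the SHA-256 of the raw content
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(data)}\n"
    )

# For example, lfs_pointer("model.safetensors") would reproduce the stub
# recorded above (a 137033984-byte file in this commit).
```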