lapp0 committed
Commit 2a475a3
1 Parent(s): b566046

Training in progress, step 61875

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

  It achieves the following results on the evaluation set:
- - eval_enwikippl: 162.8458
- - eval_frwikippl: 38427.4805
- - eval_zhwikippl: 300303.4688
- - eval_tinystoriesppl: 11.0505
- - eval_loss: 8.0215
- - eval_runtime: 66.4346
- - eval_samples_per_second: 75.262
- - eval_steps_per_second: 9.408
+ - eval_enwikippl: 158.7294
+ - eval_frwikippl: 15434.1611
+ - eval_zhwikippl: 106089.8359
+ - eval_tinystoriesppl: 15.5930
+ - eval_loss: 2.3671
+ - eval_runtime: 65.5679
+ - eval_samples_per_second: 76.257
+ - eval_steps_per_second: 9.532

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -45,14 +45,15 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=cos, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
  - train_embeddings: True
- - learning_rate: 0.004
+ - learning_rate: 0.001
  - train_batch_size: 8
  - eval_batch_size: 8
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: constant
+ - lr_scheduler_warmup_ratio: 0.1
  - num_epochs: 1.0

  ### Resource Usage
@@ -62,31 +63,31 @@ Peak GPU Memory: 8.2677 GB
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 21397.4785 | 57946.0117 | 14.5303 | 66.4513 | 75.243 | 9.405 | 12321.8145 | 60955.8008 |
- | 3000 | 0.0485 | 162.4552 | 39038.5078 | 8.0218 | 66.4482 | 75.247 | 9.406 | 11.0391 | 307027.6562 |
- | 6000 | 0.0970 | 153.5114 | 39220.4258 | 8.0201 | 66.3712 | 75.334 | 9.417 | 10.0871 | 301748.9375 |
- | 9000 | 0.1455 | 162.8458 | 38427.4805 | 8.0215 | 66.4346 | 75.262 | 9.408 | 11.0505 | 300303.4688 |
- | 12000 | 0.1939 | 162.6252 | 39458.6953 | 8.0210 | 66.1823 | 75.549 | 9.444 | 10.9736 | 309660.0625 |
- | 15000 | 0.2424 | 163.4587 | 39866.5234 | 8.0206 | 66.2793 | 75.438 | 9.43 | 10.9968 | 317608.9688 |
- | 18000 | 0.2909 | 162.5244 | 39259.0820 | 8.0212 | 66.4555 | 75.238 | 9.405 | 10.9723 | 301427.0938 |
- | 21000 | 0.3394 | 163.7503 | 39860.8984 | 8.0212 | 66.3843 | 75.319 | 9.415 | 11.0890 | 308341.0625 |
- | 24000 | 0.3879 | 161.7958 | 39126.6172 | 8.0216 | 66.3302 | 75.38 | 9.423 | 10.9895 | 296482.0312 |
- | 27000 | 0.4364 | 163.1867 | 39303.3633 | 8.0212 | 66.2968 | 75.418 | 9.427 | 11.0542 | 302797.375 |
- | 30000 | 0.4848 | 153.6542 | 39408.6797 | 8.0203 | 66.0901 | 75.654 | 9.457 | 10.0771 | 302393.6562 |
- | 33000 | 0.5333 | 163.0667 | 39922.7188 | 8.0218 | 66.1778 | 75.554 | 9.444 | 10.9659 | 308999.875 |
- | 36000 | 0.5818 | 161.7519 | 38254.6758 | 8.0205 | 66.4691 | 75.223 | 9.403 | 10.9483 | 301910.125 |
- | 39000 | 0.6303 | 152.6517 | 39331.0703 | 8.0206 | 66.292 | 75.424 | 9.428 | 9.9620 | 302393.6562 |
- | 42000 | 0.6788 | 161.3577 | 38265.4375 | 8.0211 | 66.2169 | 75.509 | 9.439 | 11.0086 | 296798.5938 |
- | 45000 | 0.7273 | 152.9832 | 40238.8672 | 8.0213 | 66.3366 | 75.373 | 9.422 | 9.9365 | 316593.7812 |
- | 48000 | 0.7758 | 163.2057 | 40012.7852 | 8.0204 | 66.3233 | 75.388 | 9.424 | 10.9546 | 311815.375 |
- | 51000 | 0.8242 | 151.6968 | 39000.0664 | 8.0212 | 66.2188 | 75.507 | 9.438 | 9.9484 | 299823.0 |
- | 54000 | 0.8727 | 153.8090 | 39832.8125 | 8.0208 | 66.3113 | 75.402 | 9.425 | 10.0065 | 303363.5938 |
- | 57000 | 0.9212 | 159.2961 | 39759.9453 | 8.0196 | 66.1288 | 75.61 | 9.451 | 10.5878 | 314824.9062 |
- | 60000 | 0.9697 | 151.4502 | 39022.0156 | 8.0217 | 66.3518 | 75.356 | 9.419 | 9.9451 | 294904.2812 |
- | 61875 | 1.0 | 161.7269 | 38666.3906 | 8.0211 | 66.5523 | 75.129 | 9.391 | 11.0359 | 299024.3125 |
+ | 0 | 0 | 21397.4785 | 57946.0117 | 6.1625 | 65.3142 | 76.553 | 9.569 | 12321.8145 | 60955.8008 |
+ | 3000 | 0.0485 | 158.5205 | 15451.5547 | 2.3672 | 65.4452 | 76.4 | 9.55 | 15.5596 | 105976.6797 |
+ | 6000 | 0.0970 | 159.4627 | 15519.1768 | 2.3670 | 65.5893 | 76.232 | 9.529 | 15.6570 | 107916.8281 |
+ | 9000 | 0.1455 | 158.7294 | 15434.1611 | 2.3671 | 65.5679 | 76.257 | 9.532 | 15.5930 | 106089.8359 |
+ | 12000 | 0.1939 | 159.8214 | 15466.7988 | 2.3670 | 65.4602 | 76.382 | 9.548 | 15.6933 | 107457.1484 |
+ | 15000 | 0.2424 | 159.5122 | 15501.6934 | 2.3671 | 65.4126 | 76.438 | 9.555 | 15.6648 | 107916.8281 |
+ | 18000 | 0.2909 | 158.8771 | 15434.1611 | 2.3670 | 65.4041 | 76.448 | 9.556 | 15.5956 | 106033.1953 |
+ | 21000 | 0.3394 | 159.3146 | 15434.1611 | 2.3671 | 65.4872 | 76.351 | 9.544 | 15.6460 | 106089.8359 |
+ | 24000 | 0.3879 | 159.4504 | 15434.1611 | 2.3670 | 65.5249 | 76.307 | 9.538 | 15.6589 | 106089.8359 |
+ | 27000 | 0.4364 | 158.9386 | 15386.3984 | 2.3669 | 65.4767 | 76.363 | 9.545 | 15.6163 | 105581.5391 |
+ | 30000 | 0.4848 | 159.1728 | 15451.5547 | 2.3671 | 65.3648 | 76.494 | 9.562 | 15.6369 | 107342.5391 |
+ | 33000 | 0.5333 | 159.8709 | 15466.7988 | 2.3670 | 65.4363 | 76.41 | 9.551 | 15.6965 | 106942.4062 |
+ | 36000 | 0.5818 | 159.2097 | 15460.2656 | 2.3670 | 65.4686 | 76.373 | 9.547 | 15.6318 | 107629.3516 |
+ | 39000 | 0.6303 | 158.6066 | 15503.8809 | 2.3670 | 65.5342 | 76.296 | 9.537 | 15.5724 | 107744.2734 |
+ | 42000 | 0.6788 | 158.5205 | 15468.9824 | 2.3671 | 65.5105 | 76.324 | 9.54 | 15.5576 | 107399.8828 |
+ | 45000 | 0.7273 | 158.7909 | 15399.4043 | 2.3670 | 65.5316 | 76.299 | 9.537 | 15.6163 | 106089.8359 |
+ | 48000 | 0.7758 | 158.7909 | 15434.1611 | 2.3671 | 65.4706 | 76.37 | 9.546 | 15.6027 | 106373.1953 |
+ | 51000 | 0.8242 | 158.8033 | 15425.4648 | 2.3669 | 65.5734 | 76.25 | 9.531 | 15.6169 | 106089.8359 |
+ | 54000 | 0.8727 | 158.9263 | 15434.1611 | 2.3670 | 65.5021 | 76.333 | 9.542 | 15.6085 | 106486.7812 |
+ | 57000 | 0.9212 | 159.3887 | 15451.5547 | 2.3671 | 65.5842 | 76.238 | 9.53 | 15.6505 | 107342.5391 |
+ | 60000 | 0.9697 | 159.4874 | 15390.7422 | 2.3670 | 65.5517 | 76.276 | 9.534 | 15.6641 | 105581.5391 |
+ | 61875 | 1.0 | 159.6729 | 15492.9736 | 2.3671 | 65.3871 | 76.468 | 9.558 | 15.6926 | 107342.5391 |

  ### Framework versions
  - Distily 0.2.0
  - Transformers 4.44.0
  - Pytorch 2.3.0
- - Datasets 2.20.0
+ - Datasets 2.21.0
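The substantive change in this commit is the `distillation_objective`: the hidden-state loss switches from `cos` to `mse`, while the KL logits loss and MSE attention loss keep their settings; the learning rate also drops from 0.004 to 0.001 and a warmup ratio of 0.1 is added. As a rough illustration of how a weighted multi-component objective like this combines, here is a minimal PyTorch sketch; the function name and output layout are assumptions inferred from the config string above, not Distily's actual implementation:

```python
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out,
                      logits_weight=1.0, hs_weight=10.0, attn_weight=10.0):
    # Assumes transformers-style outputs with .logits, .hidden_states, .attentions
    # (run both models with output_hidden_states=True, output_attentions=True).

    # logits component: KL divergence between teacher and student token distributions
    logits_loss = F.kl_div(
        F.log_softmax(student_out.logits, dim=-1),
        F.softmax(teacher_out.logits, dim=-1),
        reduction="batchmean",
    )

    # hidden-state component: per-layer MSE, averaged over layers
    # (this commit switches the hs loss_fn from cos to mse)
    hs_loss = sum(
        F.mse_loss(s, t)
        for s, t in zip(student_out.hidden_states, teacher_out.hidden_states)
    ) / len(student_out.hidden_states)

    # attention component: per-layer MSE on attention maps, averaged over layers
    attn_loss = sum(
        F.mse_loss(s, t)
        for s, t in zip(student_out.attentions, teacher_out.attentions)
    ) / len(student_out.attentions)

    # weights mirror the config: logits weight 1, hs weight 10.0, attn weight 10.0
    return logits_weight * logits_loss + hs_weight * hs_loss + attn_weight * attn_loss
```

Because the objective itself changed, the eval_loss values here (≈2.37) are not directly comparable to the ≈8.02 of the previous card; the perplexity columns, measured on the same eval corpora in both versions, are the like-for-like comparison.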
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, learning_rate=0.001, warmup_ratio=0/completed.flag ADDED
File without changes
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, learning_rate=0.001, warmup_ratio=0/events.out.tfevents.1723813550.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e8d73bfda7a7864fd7c4d960019a029af5fa07ccd4025395a1733694934ef8e6
+ size 7032408
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=mse, hs_weight=10.0, learning_rate=0.001, warmup_ratio=0.1/events.out.tfevents.1723813356.5f530b1cf724 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:79725079d52ba85fa80b280c0868d8c79a8440c00ce1ba612c63af0f947d94c7
- size 312
+ oid sha256:1df3afc43756eebaccbd37b3592b09116f2c1763e5820af8f2c5c0f2e46e6452
+ size 588
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=mse, hs_weight=10.0, learning_rate=0.001, warmup_ratio=0/events.out.tfevents.1723817783.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d47fea461db143d2c64d31f12684dc7332747f06367883172dc72e1edd3d105
+ size 16923348
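The tfevents entries above (like the model and training-args diffs below) are git-lfs pointer files: a `version` line, the stored object's `oid sha256`, and its `size` in bytes stand in for the binary itself. Fetching the actual TensorBoard log goes through LFS; a minimal sketch using `huggingface_hub`, where the `repo_id` is a placeholder since the repository name is not shown in this diff:

```python
from huggingface_hub import hf_hub_download

# repo_id is a placeholder -- substitute the actual model repository.
# hf_hub_download resolves the LFS pointer and returns a local cache path.
path = hf_hub_download(
    repo_id="lapp0/<this-model>",
    filename="logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=cos, hs_weight=10.0, "
             "learning_rate=0.001, warmup_ratio=0/events.out.tfevents.1723813550.5f530b1cf724",
)
print(path)
```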
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=raw_mse, hs_weight=10.0, learning_rate=0.0001, warmup_ratio=0/completed.flag ADDED
File without changes
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=raw_mse, hs_weight=10.0, learning_rate=0.001, warmup_ratio=0/completed.flag ADDED
File without changes
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:1e70b8f8cdbd3dfa80135450fd849dc798769a33ae1dc5b193ef3658b30e881e
+ oid sha256:75dd7216db29b00cea3be433dc8cf4e1ba5499f9209d64f6c54b00d74d88ebdc
 size 137033984
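For model.safetensors, the oid changes while the size stays at 137033984 bytes: the weights were updated in place with the architecture and parameter count unchanged. Loading the distilled student then follows the usual transformers pattern; the repo id is again a placeholder:

```python
from transformers import AutoModelForCausalLM

# Placeholder repo id -- substitute the actual repository this card belongs to.
model = AutoModelForCausalLM.from_pretrained("lapp0/<this-model>")
print(sum(p.numel() for p in model.parameters()))  # parameter count unchanged by this commit
```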
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:2dcb263d55f6965884f8fa3afad966b9db28704ad230bb9ff93a7464658c2df1
+ oid sha256:06603e22afc968d0e36b2e11bd6961be40ef09276a0f214f8780b1fe64eed6cd
 size 1017948104
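training_args.bin is the serialized `TrainingArguments` object the transformers Trainer writes with `torch.save`; as with the model file, only its content hash changes here. A quick way to confirm the hyperparameters listed in the card, assuming a local copy of the file (attribute names follow `TrainingArguments`, and full unpickling is required, which Pytorch 2.3's `torch.load` allows):

```python
import torch

# Requires full unpickling (not weights_only), since this is a pickled
# TrainingArguments object rather than a tensor state dict.
args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate)      # 0.001 after this commit
print(args.lr_scheduler_type)  # constant
print(args.warmup_ratio)       # 0.1 (shown as lr_scheduler_warmup_ratio in the card)
```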