lapp0 committed
Commit
56da803
1 Parent(s): 2a475a3

End of training

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
  
  It achieves the following results on the evaluation set:
- - eval_enwikippl: 158.7294
- - eval_frwikippl: 15434.1611
- - eval_zhwikippl: 106089.8359
- - eval_tinystoriesppl: 15.5930
- - eval_loss: 2.3671
- - eval_runtime: 65.5679
- - eval_samples_per_second: 76.257
- - eval_steps_per_second: 9.532
+ - eval_enwikippl: 150.6954
+ - eval_frwikippl: 20983.1934
+ - eval_zhwikippl: 163274.0312
+ - eval_tinystoriesppl: 13.6584
+ - eval_loss: 2.1824
+ - eval_runtime: 65.7475
+ - eval_samples_per_second: 76.049
+ - eval_steps_per_second: 9.506
  
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -53,38 +53,37 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: constant
- - lr_scheduler_warmup_ratio: 0.1
  - num_epochs: 1.0
  
  ### Resource Usage
- Peak GPU Memory: 8.2677 GB
+ Peak GPU Memory: 8.2666 GB
  
  ### Eval-Phase Metrics
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 21397.4785 | 57946.0117 | 6.1625 | 65.3142 | 76.553 | 9.569 | 12321.8145 | 60955.8008 |
- | 3000 | 0.0485 | 158.5205 | 15451.5547 | 2.3672 | 65.4452 | 76.4 | 9.55 | 15.5596 | 105976.6797 |
- | 6000 | 0.0970 | 159.4627 | 15519.1768 | 2.3670 | 65.5893 | 76.232 | 9.529 | 15.6570 | 107916.8281 |
- | 9000 | 0.1455 | 158.7294 | 15434.1611 | 2.3671 | 65.5679 | 76.257 | 9.532 | 15.5930 | 106089.8359 |
- | 12000 | 0.1939 | 159.8214 | 15466.7988 | 2.3670 | 65.4602 | 76.382 | 9.548 | 15.6933 | 107457.1484 |
- | 15000 | 0.2424 | 159.5122 | 15501.6934 | 2.3671 | 65.4126 | 76.438 | 9.555 | 15.6648 | 107916.8281 |
- | 18000 | 0.2909 | 158.8771 | 15434.1611 | 2.3670 | 65.4041 | 76.448 | 9.556 | 15.5956 | 106033.1953 |
- | 21000 | 0.3394 | 159.3146 | 15434.1611 | 2.3671 | 65.4872 | 76.351 | 9.544 | 15.6460 | 106089.8359 |
- | 24000 | 0.3879 | 159.4504 | 15434.1611 | 2.3670 | 65.5249 | 76.307 | 9.538 | 15.6589 | 106089.8359 |
- | 27000 | 0.4364 | 158.9386 | 15386.3984 | 2.3669 | 65.4767 | 76.363 | 9.545 | 15.6163 | 105581.5391 |
- | 30000 | 0.4848 | 159.1728 | 15451.5547 | 2.3671 | 65.3648 | 76.494 | 9.562 | 15.6369 | 107342.5391 |
- | 33000 | 0.5333 | 159.8709 | 15466.7988 | 2.3670 | 65.4363 | 76.41 | 9.551 | 15.6965 | 106942.4062 |
- | 36000 | 0.5818 | 159.2097 | 15460.2656 | 2.3670 | 65.4686 | 76.373 | 9.547 | 15.6318 | 107629.3516 |
- | 39000 | 0.6303 | 158.6066 | 15503.8809 | 2.3670 | 65.5342 | 76.296 | 9.537 | 15.5724 | 107744.2734 |
- | 42000 | 0.6788 | 158.5205 | 15468.9824 | 2.3671 | 65.5105 | 76.324 | 9.54 | 15.5576 | 107399.8828 |
- | 45000 | 0.7273 | 158.7909 | 15399.4043 | 2.3670 | 65.5316 | 76.299 | 9.537 | 15.6163 | 106089.8359 |
- | 48000 | 0.7758 | 158.7909 | 15434.1611 | 2.3671 | 65.4706 | 76.37 | 9.546 | 15.6027 | 106373.1953 |
- | 51000 | 0.8242 | 158.8033 | 15425.4648 | 2.3669 | 65.5734 | 76.25 | 9.531 | 15.6169 | 106089.8359 |
- | 54000 | 0.8727 | 158.9263 | 15434.1611 | 2.3670 | 65.5021 | 76.333 | 9.542 | 15.6085 | 106486.7812 |
- | 57000 | 0.9212 | 159.3887 | 15451.5547 | 2.3671 | 65.5842 | 76.238 | 9.53 | 15.6505 | 107342.5391 |
- | 60000 | 0.9697 | 159.4874 | 15390.7422 | 2.3670 | 65.5517 | 76.276 | 9.534 | 15.6641 | 105581.5391 |
- | 61875 | 1.0 | 159.6729 | 15492.9736 | 2.3671 | 65.3871 | 76.468 | 9.558 | 15.6926 | 107342.5391 |
+ | 0 | 0 | 15507.5488 | 57030.9961 | 5.9658 | 65.2151 | 76.669 | 9.584 | 7965.0278 | 102309.4219 |
+ | 3000 | 0.0485 | 150.3223 | 21042.3887 | 2.1825 | 65.2954 | 76.575 | 9.572 | 13.5953 | 164147.5625 |
+ | 6000 | 0.0970 | 150.5029 | 21042.3887 | 2.1824 | 65.675 | 76.132 | 9.517 | 13.6268 | 164498.2812 |
+ | 9000 | 0.1455 | 150.6954 | 20983.1934 | 2.1824 | 65.7475 | 76.049 | 9.506 | 13.6584 | 163274.0312 |
+ | 12000 | 0.1939 | 150.3514 | 20983.1934 | 2.1823 | 65.8055 | 75.981 | 9.498 | 13.6251 | 162882.4219 |
+ | 15000 | 0.2424 | 150.6196 | 21042.3887 | 2.1824 | 65.3959 | 76.457 | 9.557 | 13.6482 | 164147.5625 |
+ | 18000 | 0.2909 | 150.6487 | 21042.3887 | 2.1824 | 65.5594 | 76.267 | 9.533 | 13.6533 | 163797.5938 |
+ | 21000 | 0.3394 | 150.3980 | 21042.3887 | 2.1824 | 65.4773 | 76.362 | 9.545 | 13.6234 | 163972.4844 |
+ | 24000 | 0.3879 | 150.5495 | 21131.4961 | 2.1825 | 65.4348 | 76.412 | 9.551 | 13.6184 | 164849.75 |
+ | 27000 | 0.4364 | 150.7538 | 21042.3887 | 2.1822 | 65.4127 | 76.438 | 9.555 | 13.6635 | 162925.9219 |
+ | 30000 | 0.4848 | 150.6954 | 21042.3887 | 2.1825 | 65.4113 | 76.439 | 9.555 | 13.6510 | 164586.0 |
+ | 33000 | 0.5333 | 151.0109 | 20983.1934 | 2.1823 | 65.6274 | 76.188 | 9.523 | 13.6832 | 163186.8594 |
+ | 36000 | 0.5818 | 150.3514 | 21042.3887 | 2.1824 | 65.4107 | 76.44 | 9.555 | 13.6184 | 164586.0 |
+ | 39000 | 0.6303 | 150.6020 | 20983.1934 | 2.1823 | 65.415 | 76.435 | 9.554 | 13.6550 | 163884.9375 |
+ | 42000 | 0.6788 | 150.5495 | 21042.3887 | 2.1823 | 65.3696 | 76.488 | 9.561 | 13.6454 | 163186.8594 |
+ | 45000 | 0.7273 | 150.3223 | 20995.0234 | 2.1824 | 65.7092 | 76.093 | 9.512 | 13.6257 | 163274.0312 |
+ | 48000 | 0.7758 | 150.8706 | 21042.3887 | 2.1824 | 65.5511 | 76.276 | 9.535 | 13.6652 | 163186.8594 |
+ | 51000 | 0.8242 | 150.8940 | 21006.8594 | 2.1823 | 65.6118 | 76.206 | 9.526 | 13.6719 | 163186.8594 |
+ | 54000 | 0.8727 | 150.4738 | 20918.2773 | 2.1824 | 65.6557 | 76.155 | 9.519 | 13.6539 | 162925.9219 |
+ | 57000 | 0.9212 | 150.4446 | 21042.3887 | 2.1824 | 65.3885 | 76.466 | 9.558 | 13.6257 | 163622.8906 |
+ | 60000 | 0.9697 | 150.4097 | 20918.2773 | 2.1824 | 65.4087 | 76.442 | 9.555 | 13.6533 | 162795.4688 |
+ | 61875 | 1.0 | 150.6896 | 21042.3887 | 2.1825 | 65.8705 | 75.907 | 9.488 | 13.6533 | 163972.4844 |
  
  ### Framework versions
  - Distily 0.2.0
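
For context on the hyperparameters hunk above, here is a minimal sketch (not the Distily source) of the equivalent optimizer and scheduler setup. The learning rate of 0.001 and warmup_ratio=0 are read from the logs directory name below; the `Linear` module is a hypothetical stand-in for the student model, which this diff does not construct.

```python
# Sketch of the training configuration listed in the hyperparameters hunk.
import torch
from transformers import get_constant_schedule

student = torch.nn.Linear(8, 8)  # hypothetical stand-in for the student model

optimizer = torch.optim.Adam(
    student.parameters(),
    lr=1e-3,             # learning_rate=0.001, per the logs path below
    betas=(0.9, 0.999),  # "Adam with betas=(0.9,0.999)"
    eps=1e-8,            # "epsilon=1e-08"
)

# "lr_scheduler_type: constant"; the diff also removes the
# lr_scheduler_warmup_ratio line (warmup_ratio=0 in the logs path),
# i.e. a constant schedule with no warmup.
scheduler = get_constant_schedule(optimizer)
```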
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=mse, hs_weight=10.0, learning_rate=0.001, warmup_ratio=0/events.out.tfevents.1723827430.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:290caa5fc136c4368d93e1f550a43ca0908e8207112bfdc8995dc1df842d9c99
+ size 312
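
The three added lines are a Git LFS pointer file rather than the TensorBoard log itself: `oid` is the SHA-256 of the actual file's contents and `size` is its byte count. As a hedged illustration (the local filename is hypothetical; the oid and size are copied from the pointer above), a downloaded copy could be checked like this:

```python
# Sketch: verify a downloaded LFS object against the pointer file above.
import hashlib
import os

path = "events.out.tfevents.1723827430.5f530b1cf724"  # hypothetical local path
expected_oid = "290caa5fc136c4368d93e1f550a43ca0908e8207112bfdc8995dc1df842d9c99"
expected_size = 312

assert os.path.getsize(path) == expected_size, "size mismatch"
with open(path, "rb") as f:
    actual_oid = hashlib.sha256(f.read()).hexdigest()
assert actual_oid == expected_oid, "sha256 mismatch"
```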