lapp0 committed
Commit babdc58
1 Parent(s): cd7ab5a

End of training

README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
- - eval_enwikippl: 198.4185
- - eval_frwikippl: 100815.4219
- - eval_zhwikippl: 1470416.875
- - eval_tinystoriesppl: 10.3978
- - eval_loss: 1.2095
- - eval_runtime: 6.4972
- - eval_samples_per_second: 76.957
- - eval_steps_per_second: 9.697
+ - eval_enwikippl: 13782.8369
+ - eval_frwikippl: 65787.1094
+ - eval_zhwikippl: 58004.7852
+ - eval_tinystoriesppl: 6904.7764
+ - eval_loss: 5.9830
+ - eval_runtime: 6.5013
+ - eval_samples_per_second: 76.908
+ - eval_steps_per_second: 9.69
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -56,23 +56,23 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
- Peak GPU Memory: 6.6047 GB
+ Peak GPU Memory: 6.6058 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
- | 0 | 0 | 11624.0693 | 60523.7891 | 5.8780 | 6.5064 | 76.847 | 9.683 | 4963.9985 | 59716.3516 |
- | 5000 | 0.1010 | 198.4185 | 100815.4219 | 1.2095 | 6.4972 | 76.957 | 9.697 | 10.3978 | 1470416.875 |
- | 10000 | 0.2020 | 197.7357 | 97196.9141 | 1.2093 | 6.4818 | 77.139 | 9.72 | 10.2892 | 1447451.75 |
- | 15000 | 0.3030 | 197.2080 | 98104.7109 | 1.2096 | 6.504 | 76.876 | 9.686 | 10.4405 | 1471986.875 |
- | 20000 | 0.4040 | 195.2773 | 97005.4062 | 1.2094 | 6.4752 | 77.218 | 9.729 | 10.1995 | 1422946.875 |
- | 25000 | 0.5051 | 196.9104 | 100058.5391 | 1.2099 | 6.4866 | 77.082 | 9.712 | 10.2727 | 1497336.25 |
- | 30000 | 0.6061 | 195.8986 | 94337.375 | 1.2099 | 6.4887 | 77.057 | 9.709 | 10.3785 | 1435147.0 |
- | 35000 | 0.7071 | 196.9180 | 96216.1797 | 1.2091 | 6.5251 | 76.627 | 9.655 | 10.3306 | 1444365.625 |
- | 40000 | 0.8081 | 197.7434 | 97635.9688 | 1.2091 | 6.5177 | 76.715 | 9.666 | 10.3669 | 1472771.75 |
- | 45000 | 0.9091 | 198.4416 | 98575.6953 | 1.2093 | 6.4898 | 77.044 | 9.708 | 10.3216 | 1521499.0 |
- | 49500 | 1.0 | 197.6898 | 97801.2031 | 1.2091 | 6.4817 | 77.141 | 9.72 | 10.3293 | 1486192.625 |
+ | 0 | 0 | 24265.9961 | 83952.2266 | 6.4532 | 6.5041 | 76.875 | 9.686 | 14059.6133 | 62337.3242 |
+ | 5000 | 0.1010 | 13782.8369 | 65787.1094 | 5.9830 | 6.5013 | 76.908 | 9.69 | 6904.7764 | 58004.7852 |
+ | 10000 | 0.2020 | 13782.8369 | 65676.0234 | 5.9770 | 6.4897 | 77.046 | 9.708 | 6916.2041 | 58066.7188 |
+ | 15000 | 0.3030 | 13804.2139 | 65639.0234 | 5.9770 | 6.4932 | 77.003 | 9.702 | 6925.3584 | 58066.7188 |
+ | 20000 | 0.4040 | 13812.7734 | 65639.0234 | 5.9770 | 6.511 | 76.793 | 9.676 | 6934.5249 | 58066.7188 |
+ | 25000 | 0.5051 | 13829.8955 | 65639.0234 | 5.9770 | 6.5 | 76.923 | 9.692 | 6944.8496 | 58097.6836 |
+ | 30000 | 0.6061 | 13834.1826 | 65639.0234 | 5.9765 | 6.5123 | 76.778 | 9.674 | 6949.4409 | 58128.7188 |
+ | 35000 | 0.7071 | 13834.1826 | 65639.0234 | 5.9765 | 6.4965 | 76.965 | 9.698 | 6952.8945 | 58159.7148 |
+ | 40000 | 0.8081 | 13842.7607 | 65639.0234 | 5.9765 | 6.5677 | 76.13 | 9.592 | 6957.4912 | 58159.7148 |
+ | 45000 | 0.9091 | 13851.3447 | 65639.0234 | 5.9770 | 6.5257 | 76.62 | 9.654 | 6957.4912 | 58159.7148 |
+ | 49500 | 1.0 | 13851.3447 | 65639.0234 | 5.9770 | 6.5191 | 76.698 | 9.664 | 6957.4912 | 58159.7148 |
 
 ### Framework versions
 - Distily 0.2.0
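
A note on the perplexity columns above: enwikippl, frwikippl, zhwikippl, and tinystoriesppl are per-corpus perplexities, and the loss column is presumably the distillation objective rather than the log of any perplexity column (exp(5.9830) ≈ 397, which matches none of them). By the usual convention, perplexity is the exponential of the mean token-level cross-entropy over a corpus. A minimal sketch of that convention, assuming `model` and `tokenizer` are a transformers causal LM and its tokenizer; Distily's actual evaluation loop may batch or stride differently:

```python
# Conventional per-corpus perplexity: exp of the mean token-level
# cross-entropy. Sketch only; not Distily's actual eval code.
import math
import torch

def corpus_perplexity(model, tokenizer, texts, device="cpu", max_length=1024):
    model.eval()
    total_nll, total_tokens = 0.0, 0
    with torch.no_grad():
        for text in texts:
            enc = tokenizer(text, return_tensors="pt",
                            truncation=True, max_length=max_length).to(device)
            # With labels=input_ids, the model returns the mean
            # cross-entropy over the shifted target tokens.
            out = model(**enc, labels=enc["input_ids"])
            n_predicted = enc["input_ids"].size(1) - 1
            total_nll += out.loss.item() * n_predicted
            total_tokens += n_predicted
    return math.exp(total_nll / total_tokens)
```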
logs/dropout=0, learning_rate=0.004, optim=sgd, warmup_ratio=0, weight_decay=0/events.out.tfevents.1723944312.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:af2a207847519aeb44160838a18197b7125bbb78f60909c98710b2fa86139c2c
+ size 312
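
The added file is a Git LFS pointer to the TensorBoard event log for this run; its path encodes the run's hyperparameters (dropout=0, learning_rate=0.004, optim=sgd, warmup_ratio=0, weight_decay=0). For orientation, a generic logit-distillation training step of the kind such a run performs, sketched in PyTorch; this illustrates the standard technique, not Distily's actual API or exact objective, and the temperature T is an assumed hyperparameter not present in this diff:

```python
# Generic logit distillation: forward KL between temperature-softened
# teacher and student token distributions. Illustration only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Flatten (batch, seq, vocab) -> (batch*seq, vocab) so "batchmean"
    # averages per token; rescale by T^2 to keep gradients comparable.
    s = F.log_softmax(student_logits / T, dim=-1).flatten(0, -2)
    t = F.softmax(teacher_logits / T, dim=-1).flatten(0, -2)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)

def train_step(student, teacher, batch, optimizer):
    # `batch` is assumed to be a dict of input tensors (input_ids, ...).
    with torch.no_grad():
        teacher_logits = teacher(**batch).logits  # teacher stays frozen
    loss = distillation_loss(student(**batch).logits, teacher_logits)
    loss.backward()
    optimizer.step()   # the log path indicates optim=sgd, learning_rate=0.004
    optimizer.zero_grad()
    return loss.item()
```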