lapp0 committed
Commit 9b8dc5c
1 Parent(s): b4d8340

End of training

README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
  It achieves the following results on the evaluation set:
- - eval_enwikippl: 929.2385
- - eval_frwikippl: 4940.1445
- - eval_zhwikippl: 14653.9951
- - eval_loss: 7394.7842
- - eval_runtime: 22.0822
- - eval_samples_per_second: 45.285
- - eval_steps_per_second: 11.321
+ - eval_enwikippl: 685.3206
+ - eval_frwikippl: 4166.5459
+ - eval_zhwikippl: 10016.8096
+ - eval_loss: 7038.3359
+ - eval_runtime: 21.4586
+ - eval_samples_per_second: 46.601
+ - eval_steps_per_second: 11.65
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -45,10 +45,10 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - distillation_objective: <distily.objectives.LegacyObjective object at 0x7f7eb07a3700>
+ - distillation_objective: <distily.objectives.LegacyObjective object at 0x7f7d9010c220>
  - train_embeddings: True
  - learning_rate: 4e-05
- - train_batch_size: 16
+ - train_batch_size: 8
  - eval_batch_size: 4
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -62,20 +62,32 @@ Peak GPU Memory: 15.7299 GB
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 30.2385 | 57.2728 | | | | | 18.1772 |
- | 0 | 0 | 57512.4023 | 57507.9609 | 331030.5312 | 21.9079 | 45.646 | 11.411 | 56939.4805 |
- | 500 | 0.0808 | 3373.4189 | 12600.2275 | 12434.4316 | 21.4223 | 46.68 | 11.67 | 43231.4297 |
- | 1000 | 0.1616 | 2185.7129 | 8084.5122 | 10273.4082 | 21.7859 | 45.901 | 11.475 | 31395.9453 |
- | 1500 | 0.2424 | 1793.0647 | 6660.5288 | 9647.7441 | 21.3865 | 46.759 | 11.69 | 23173.5977 |
- | 2000 | 0.3232 | 1548.9072 | 6412.6167 | 9225.7920 | 21.433 | 46.657 | 11.664 | 18881.2090 |
- | 2500 | 0.4040 | 1398.5220 | 5857.9961 | 8856.8965 | 21.3942 | 46.742 | 11.685 | 15137.3125 |
- | 3000 | 0.4848 | 1300.4033 | 5470.7554 | 8459.7764 | 21.5101 | 46.49 | 11.622 | 18107.0938 |
- | 3500 | 0.5656 | 1197.4879 | 5503.0571 | 8287.1680 | 21.4532 | 46.613 | 11.653 | 17759.8516 |
- | 4000 | 0.6464 | 1128.0508 | 5481.5659 | 8032.7041 | 21.4263 | 46.672 | 11.668 | 16724.0586 |
- | 4500 | 0.7272 | 1077.5973 | 5100.6572 | 7940.9922 | 21.4997 | 46.512 | 11.628 | 16846.2285 |
- | 5000 | 0.8080 | 1003.7661 | 5090.5928 | 7673.4722 | 22.0409 | 45.37 | 11.343 | 13699.2832 |
- | 5500 | 0.8888 | 983.7222 | 4890.5869 | 7606.2720 | 21.916 | 45.629 | 11.407 | 15087.8770 |
- | 6000 | 0.9696 | 936.5551 | 4860.5083 | 7409.6958 | 21.5036 | 46.504 | 11.626 | 13386.4326 |
- | 6188 | 1.0 | 929.2385 | 4940.1445 | 7394.7842 | 22.0822 | 45.285 | 11.321 | 14653.9951 |
+ | 0 | 0 | 55339.3672 | 57682.5742 | 331776.0 | 21.5939 | 46.309 | 11.577 | 57080.2930 |
+ | 500 | 0.0404 | 2539.2073 | 10701.1621 | 12591.8076 | 21.7416 | 45.995 | 11.499 | 52355.7031 |
+ | 1000 | 0.0808 | 1909.6178 | 6915.1094 | 10639.6162 | 21.6609 | 46.166 | 11.542 | 28701.2539 |
+ | 1500 | 0.1212 | 1563.3774 | 6141.7974 | 9791.8721 | 21.4972 | 46.518 | 11.629 | 22309.7324 |
+ | 2000 | 0.1616 | 1355.0397 | 6279.7275 | 9359.6162 | 21.5918 | 46.314 | 11.578 | 21985.8770 |
+ | 2500 | 0.2020 | 1227.5216 | 5683.0625 | 9057.0879 | 21.5112 | 46.487 | 11.622 | 18872.3887 |
+ | 3000 | 0.2424 | 1126.7594 | 5939.9272 | 8646.5283 | 21.3909 | 46.749 | 11.687 | 24154.6621 |
+ | 3500 | 0.2828 | 1050.4360 | 5464.5869 | 8420.1602 | 21.4858 | 46.542 | 11.636 | 17743.2598 |
+ | 4000 | 0.3232 | 995.7695 | 5204.1860 | 8302.2080 | 21.671 | 46.145 | 11.536 | 16421.9824 |
+ | 4500 | 0.3636 | 939.8518 | 4812.9336 | 8087.6162 | 21.6884 | 46.108 | 11.527 | 18503.0742 |
+ | 5000 | 0.4040 | 894.0764 | 5064.1064 | 7913.1841 | 21.5518 | 46.4 | 11.6 | 18140.9707 |
+ | 5500 | 0.4444 | 854.5688 | 4709.2144 | 7854.3359 | 21.4663 | 46.585 | 11.646 | 13195.6348 |
+ | 6000 | 0.4848 | 815.7767 | 4654.2524 | 7642.8481 | 21.5227 | 46.463 | 11.616 | 14954.4814 |
+ | 6500 | 0.5253 | 795.4309 | 4827.8882 | 7615.8398 | 21.7578 | 45.961 | 11.49 | 16576.2129 |
+ | 7000 | 0.5657 | 769.8770 | 4643.7627 | 7491.7759 | 21.5078 | 46.495 | 11.624 | 16412.1191 |
+ | 7500 | 0.6061 | 755.6475 | 4527.0581 | 7376.8638 | 21.6221 | 46.249 | 11.562 | 15845.7139 |
+ | 8000 | 0.6465 | 728.3958 | 4527.8569 | 7300.3838 | 21.828 | 45.813 | 11.453 | 17439.0645 |
+ | 8500 | 0.6869 | 725.3337 | 4182.8813 | 7177.5039 | 21.5139 | 46.482 | 11.62 | 14905.6289 |
+ | 9000 | 0.7273 | 693.5788 | 4218.2700 | 7109.6958 | 21.5247 | 46.458 | 11.615 | 13303.5693 |
+ | 9500 | 0.7677 | 685.3206 | 4166.5459 | 7038.3359 | 21.4586 | 46.601 | 11.65 | 10016.8096 |
+ | 10000 | 0.8081 | 681.6716 | 4036.2834 | 7004.8638 | 21.5314 | 46.444 | 11.611 | 8738.9414 |
+ | 10500 | 0.8485 | 656.0428 | 4098.0786 | 6922.6558 | 21.6176 | 46.259 | 11.565 | 9690.5127 |
+ | 11000 | 0.8889 | 655.4447 | 4287.2363 | 6842.6240 | 21.4458 | 46.629 | 11.657 | 13403.4229 |
+ | 11500 | 0.9293 | 635.5220 | 4204.7627 | 6807.0400 | 21.4021 | 46.724 | 11.681 | 10938.9814 |
+ | 12000 | 0.9697 | 633.3297 | 4234.2144 | 6815.6162 | 21.5151 | 46.479 | 11.62 | 10046.2852 |
+ | 12375 | 1.0 | 633.1452 | 4051.2502 | 6749.3442 | 21.4393 | 46.643 | 11.661 | 10211.2959 |
 
  ### Framework versions
  - Distily 0.2.0
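
A note on the change: this commit halves `train_batch_size` from 16 to 8, which doubles the number of optimizer steps in one epoch (6188 to 12375), and the reported eval metrics all improve. The `eval_*wikippl` values are student perplexities on English, French, and Chinese Wikipedia text (lower is better; the teacher reaches 30.2385 on enwikippl). The sketch below shows one common way to compute causal-LM perplexity with `transformers`; it is illustrative only, the model id is a placeholder for the student checkpoint in this repo, and it is not necessarily the exact evaluation Distily runs.

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: substitute this repo's distilled student checkpoint.
MODEL_ID = "gpt2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
model.eval()


def perplexity(text: str) -> float:
    """Return exp(mean next-token cross-entropy) over the tokenized text."""
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    with torch.no_grad():
        # Passing labels makes the model return the mean LM loss.
        loss = model(**enc, labels=enc["input_ids"]).loss
    return math.exp(loss.item())


print(perplexity("The quick brown fox jumps over the lazy dog."))
```

The optimizer entry in the hyperparameters maps directly onto `torch.optim.Adam`; continuing from the sketch above:

```python
# Mirrors the listed hyperparameters: lr=4e-05, betas=(0.9, 0.999), epsilon=1e-08.
optimizer = torch.optim.Adam(model.parameters(), lr=4e-05, betas=(0.9, 0.999), eps=1e-08)
```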
logs/per_device_train_batch_size=8/events.out.tfevents.1723363728.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5f5a80c19b7a2a924aea268a2f547437521dec0acbb11f552f2d1578df766d94
+ size 249
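
The added `events.out.tfevents.*` file is a TensorBoard event log tracked with Git LFS, so the diff shows only the three-line LFS pointer (spec version, content hash, byte size) rather than the log data itself. The pointer format is fixed by the git-lfs spec, so it parses trivially; the helper below is a made-up name for illustration, not part of any library:

```python
def parse_lfs_pointer(text: str) -> dict[str, str]:
    """Split a Git LFS pointer file into its space-separated key/value lines."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())


pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:5f5a80c19b7a2a924aea268a2f547437521dec0acbb11f552f2d1578df766d94\n"
    "size 249\n"
)
fields = parse_lfs_pointer(pointer)
print(fields["oid"], fields["size"])  # sha256:5f5a... 249
```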