End of training
README.md CHANGED

@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_loss:
-- eval_runtime:
-- eval_samples_per_second:
-- eval_steps_per_second: 11.
+- eval_enwikippl: 685.3206
+- eval_frwikippl: 4166.5459
+- eval_zhwikippl: 10016.8096
+- eval_loss: 7038.3359
+- eval_runtime: 21.4586
+- eval_samples_per_second: 46.601
+- eval_steps_per_second: 11.65

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.

@@ -45,10 +45,10 @@ More information needed
 ### Training hyperparameters

 The following hyperparameters were used during training:
-- distillation_objective: <distily.objectives.LegacyObjective object at
+- distillation_objective: <distily.objectives.LegacyObjective object at 0x7f7d9010c220>
 - train_embeddings: True
 - learning_rate: 4e-05
-- train_batch_size:
+- train_batch_size: 8
 - eval_batch_size: 4
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08

@@ -62,20 +62,32 @@ Peak GPU Memory: 15.7299 GB
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 30.2385 | 57.2728 | | | | | 18.1772 |
-| 0 | 0 |
-| 500 | 0.
-| 1000 | 0.
-| 1500 | 0.
-| 2000 | 0.
-| 2500 | 0.
-| 3000 | 0.
-| 3500 | 0.
-| 4000 | 0.
-| 4500 | 0.
-| 5000 | 0.
-| 5500 | 0.
-| 6000 | 0.
+| 0 | 0 | 55339.3672 | 57682.5742 | 331776.0 | 21.5939 | 46.309 | 11.577 | 57080.2930 |
+| 500 | 0.0404 | 2539.2073 | 10701.1621 | 12591.8076 | 21.7416 | 45.995 | 11.499 | 52355.7031 |
+| 1000 | 0.0808 | 1909.6178 | 6915.1094 | 10639.6162 | 21.6609 | 46.166 | 11.542 | 28701.2539 |
+| 1500 | 0.1212 | 1563.3774 | 6141.7974 | 9791.8721 | 21.4972 | 46.518 | 11.629 | 22309.7324 |
+| 2000 | 0.1616 | 1355.0397 | 6279.7275 | 9359.6162 | 21.5918 | 46.314 | 11.578 | 21985.8770 |
+| 2500 | 0.2020 | 1227.5216 | 5683.0625 | 9057.0879 | 21.5112 | 46.487 | 11.622 | 18872.3887 |
+| 3000 | 0.2424 | 1126.7594 | 5939.9272 | 8646.5283 | 21.3909 | 46.749 | 11.687 | 24154.6621 |
+| 3500 | 0.2828 | 1050.4360 | 5464.5869 | 8420.1602 | 21.4858 | 46.542 | 11.636 | 17743.2598 |
+| 4000 | 0.3232 | 995.7695 | 5204.1860 | 8302.2080 | 21.671 | 46.145 | 11.536 | 16421.9824 |
+| 4500 | 0.3636 | 939.8518 | 4812.9336 | 8087.6162 | 21.6884 | 46.108 | 11.527 | 18503.0742 |
+| 5000 | 0.4040 | 894.0764 | 5064.1064 | 7913.1841 | 21.5518 | 46.4 | 11.6 | 18140.9707 |
+| 5500 | 0.4444 | 854.5688 | 4709.2144 | 7854.3359 | 21.4663 | 46.585 | 11.646 | 13195.6348 |
+| 6000 | 0.4848 | 815.7767 | 4654.2524 | 7642.8481 | 21.5227 | 46.463 | 11.616 | 14954.4814 |
+| 6500 | 0.5253 | 795.4309 | 4827.8882 | 7615.8398 | 21.7578 | 45.961 | 11.49 | 16576.2129 |
+| 7000 | 0.5657 | 769.8770 | 4643.7627 | 7491.7759 | 21.5078 | 46.495 | 11.624 | 16412.1191 |
+| 7500 | 0.6061 | 755.6475 | 4527.0581 | 7376.8638 | 21.6221 | 46.249 | 11.562 | 15845.7139 |
+| 8000 | 0.6465 | 728.3958 | 4527.8569 | 7300.3838 | 21.828 | 45.813 | 11.453 | 17439.0645 |
+| 8500 | 0.6869 | 725.3337 | 4182.8813 | 7177.5039 | 21.5139 | 46.482 | 11.62 | 14905.6289 |
+| 9000 | 0.7273 | 693.5788 | 4218.2700 | 7109.6958 | 21.5247 | 46.458 | 11.615 | 13303.5693 |
+| 9500 | 0.7677 | 685.3206 | 4166.5459 | 7038.3359 | 21.4586 | 46.601 | 11.65 | 10016.8096 |
+| 10000 | 0.8081 | 681.6716 | 4036.2834 | 7004.8638 | 21.5314 | 46.444 | 11.611 | 8738.9414 |
+| 10500 | 0.8485 | 656.0428 | 4098.0786 | 6922.6558 | 21.6176 | 46.259 | 11.565 | 9690.5127 |
+| 11000 | 0.8889 | 655.4447 | 4287.2363 | 6842.6240 | 21.4458 | 46.629 | 11.657 | 13403.4229 |
+| 11500 | 0.9293 | 635.5220 | 4204.7627 | 6807.0400 | 21.4021 | 46.724 | 11.681 | 10938.9814 |
+| 12000 | 0.9697 | 633.3297 | 4234.2144 | 6815.6162 | 21.5151 | 46.479 | 11.62 | 10046.2852 |
+| 12375 | 1.0 | 633.1452 | 4051.2502 | 6749.3442 | 21.4393 | 46.643 | 11.661 | 10211.2959 |

 ### Framework versions
 - Distily 0.2.0
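The `eval_enwikippl`, `eval_frwikippl`, and `eval_zhwikippl` columns are perplexities measured on English, French, and Chinese Wikipedia text, with the **teacher eval** row giving the same metrics for the unmodified gpt2 teacher. As a minimal sketch only (the exact evaluation text, windowing, and batching used by Distily are not shown in this card), causal-LM perplexity is the exponential of the mean token cross-entropy:

```python
# Minimal perplexity sketch; placeholder text and model id, not Distily's eval pipeline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # the teacher; substitute the distilled student's repo id to score it
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

text = "A held-out Wikipedia passage would go here."  # placeholder evaluation text
enc = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    out = model(**enc, labels=enc["input_ids"])  # loss = mean cross-entropy per token

print(float(torch.exp(out.loss)))  # perplexity; lower is better
```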
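The `distillation_objective` hyperparameter is recorded only as a Python object repr (`distily.objectives.LegacyObjective`), so the card does not say what loss was minimized. For orientation, a generic logit-distillation objective (a sketch of the common Hinton-style formulation, not Distily's actual implementation) pushes temperature-softened student token distributions toward the teacher's with a KL divergence:

```python
# Generic knowledge-distillation loss sketch; NOT distily.objectives.LegacyObjective.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Soften both distributions, then match the student to the teacher with KL.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # batchmean reduction plus T^2 scaling keeps gradient magnitudes comparable
    # across temperatures (the usual convention).
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```

A training step would apply this to the logits produced by the student and the frozen teacher on the same batch, with the remaining hyperparameters above (Adam, learning rate 4e-05, batch size 8) plugging into an ordinary training loop.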
logs/per_device_train_batch_size=8/events.out.tfevents.1723363728.93d6cbb3ad53 ADDED

@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5f5a80c19b7a2a924aea268a2f547437521dec0acbb11f552f2d1578df766d94
+size 249
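The added `logs/per_device_train_batch_size=8/events.out.tfevents.*` file is committed as a Git LFS pointer (the `version` / `oid` / `size` lines above), so the actual TensorBoard event data must be fetched with `git lfs pull` before it can be read. One way to inspect it, assuming a standard TensorBoard installation (the scalar tag name below is an illustrative guess to be checked against `Tags()`, not confirmed by this card):

```python
# Sketch: read the logged training curves from the event file after `git lfs pull`.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("logs/per_device_train_batch_size=8")  # directory holding the tfevents file
ea.Reload()

print(ea.Tags()["scalars"])  # list the scalar series that were actually logged
# for event in ea.Scalars("eval_enwikippl"):  # example tag; verify it exists first
#     print(event.step, event.value)
```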