thevyasamit committed
Commit 09305b4 · 1 Parent(s): d32a88f

End of training

Files changed (3)
  1. README.md +88 -0
  2. generation_config.json +6 -0
  3. pytorch_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,88 @@
+ ---
+ license: apache-2.0
+ base_model: t5-base
+ tags:
+ - generated_from_trainer
+ metrics:
+ - rouge
+ model-index:
+ - name: t5-fine-tuned-with-25-yake-keywords
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # t5-fine-tuned-with-25-yake-keywords
+
+ This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.7255
+ - Rouge1: 25.5557
+ - Rouge2: 11.1446
+ - RougeL: 20.7482
+ - RougeLsum: 24.0749
+ - Gen Len: 19.0
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 25
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
+ |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
+ | 1.3097 | 1.0 | 604 | 1.3789 | 25.4996 | 11.1787 | 20.8197 | 23.9248 | 19.0 |
+ | 1.1951 | 2.0 | 1208 | 1.3779 | 25.3276 | 11.2651 | 20.6966 | 23.6966 | 19.0 |
+ | 1.1081 | 3.0 | 1812 | 1.3903 | 26.1128 | 11.8339 | 21.2543 | 24.5472 | 18.994 |
+ | 1.0272 | 4.0 | 2416 | 1.4042 | 26.0231 | 11.5574 | 21.152 | 24.3645 | 18.992 |
+ | 0.919 | 5.0 | 3020 | 1.4225 | 25.8646 | 11.6121 | 21.0205 | 24.3016 | 18.992 |
+ | 0.8643 | 6.0 | 3624 | 1.4410 | 25.9734 | 11.6221 | 21.0577 | 24.4231 | 18.99 |
+ | 0.8215 | 7.0 | 4228 | 1.4599 | 25.6546 | 11.2999 | 20.8896 | 24.2294 | 19.0 |
+ | 0.7931 | 8.0 | 4832 | 1.4926 | 25.0796 | 10.9369 | 20.3639 | 23.629 | 19.0 |
+ | 0.7664 | 9.0 | 5436 | 1.5090 | 25.4341 | 11.0028 | 20.6241 | 23.9038 | 19.0 |
+ | 0.7053 | 10.0 | 6040 | 1.5259 | 25.491 | 10.8959 | 20.5515 | 23.934 | 18.998 |
+ | 0.6725 | 11.0 | 6644 | 1.5481 | 25.3073 | 10.7089 | 20.4993 | 23.8286 | 19.0 |
+ | 0.6462 | 12.0 | 7248 | 1.5710 | 25.6276 | 11.0744 | 20.7313 | 24.1134 | 19.0 |
+ | 0.6275 | 13.0 | 7852 | 1.5884 | 25.8615 | 11.0844 | 20.9412 | 24.2996 | 19.0 |
+ | 0.5838 | 14.0 | 8456 | 1.6131 | 26.1201 | 11.4081 | 21.3173 | 24.6139 | 19.0 |
+ | 0.5682 | 15.0 | 9060 | 1.6259 | 25.7212 | 11.1367 | 20.8247 | 24.1398 | 19.0 |
+ | 0.5629 | 16.0 | 9664 | 1.6473 | 25.6506 | 11.2149 | 20.8629 | 24.1527 | 19.0 |
+ | 0.5446 | 17.0 | 10268 | 1.6645 | 25.4396 | 10.7189 | 20.4852 | 23.924 | 19.0 |
+ | 0.5108 | 18.0 | 10872 | 1.6716 | 25.7213 | 11.2197 | 20.858 | 24.3088 | 19.0 |
+ | 0.5358 | 19.0 | 11476 | 1.6882 | 25.7908 | 11.2416 | 20.9464 | 24.2852 | 19.0 |
+ | 0.4959 | 20.0 | 12080 | 1.7027 | 25.6556 | 11.2363 | 20.8495 | 24.1665 | 18.992 |
+ | 0.4942 | 21.0 | 12684 | 1.7131 | 25.6156 | 11.175 | 20.7688 | 24.1641 | 19.0 |
+ | 0.4833 | 22.0 | 13288 | 1.7178 | 25.7798 | 11.2421 | 20.896 | 24.2101 | 19.0 |
+ | 0.4702 | 23.0 | 13892 | 1.7227 | 25.7147 | 11.2161 | 20.9105 | 24.1549 | 19.0 |
+ | 0.4747 | 24.0 | 14496 | 1.7241 | 25.6314 | 11.1929 | 20.8068 | 24.1442 | 19.0 |
+ | 0.4691 | 25.0 | 15100 | 1.7255 | 25.5557 | 11.1446 | 20.7482 | 24.0749 | 19.0 |
+
+
+ ### Framework versions
+
+ - Transformers 4.34.0
+ - PyTorch 2.0.1+cu118
+ - Datasets 2.14.5
+ - Tokenizers 0.14.1
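
The hyperparameters in the card above correspond to a standard `Seq2SeqTrainer` run. The sketch below is a reconstruction from those listed values only, assuming `Seq2SeqTrainingArguments` with per-epoch evaluation (consistent with the results table); the actual training script, dataset, and preprocessing are not part of this commit, so they are left as placeholders.

```python
# Reconstruction of the listed hyperparameters; not the author's training script.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

args = Seq2SeqTrainingArguments(
    output_dir="t5-fine-tuned-with-25-yake-keywords",
    learning_rate=2e-5,              # learning_rate: 2e-05
    per_device_train_batch_size=2,   # train_batch_size: 2
    per_device_eval_batch_size=2,    # eval_batch_size: 2
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    evaluation_strategy="epoch",     # assumption, consistent with the per-epoch results table
    predict_with_generate=True,      # needed to compute ROUGE on generated summaries
)
# "Adam with betas=(0.9,0.999) and epsilon=1e-08" matches the Trainer's default optimizer settings.

# trainer = Seq2SeqTrainer(model=model, args=args, tokenizer=tokenizer,
#                          train_dataset=..., eval_dataset=..., compute_metrics=...)
# trainer.train()
```

For inference, a minimal sketch, assuming the checkpoint is published as `thevyasamit/t5-fine-tuned-with-25-yake-keywords` (committer namespace plus model name; the card does not state the repo id). How the 25 YAKE keywords are injected into the input is not documented here, so the plain-text prompt below is illustrative only.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

repo_id = "thevyasamit/t5-fine-tuned-with-25-yake-keywords"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

text = "Your source document goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

# max_new_tokens=20 is an illustrative cap; the card reports Gen Len = 19.0 at evaluation.
summary_ids = model.generate(**inputs, max_new_tokens=20, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```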
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "decoder_start_token_id": 0,
+   "eos_token_id": 1,
+   "pad_token_id": 0,
+   "transformers_version": "4.34.0"
+ }
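
The new generation_config.json only records the T5 special-token ids and the transformers version; when the checkpoint is loaded, these values become the defaults used by `generate()`. A quick way to inspect them, assuming the same repo id as above:

```python
from transformers import GenerationConfig

# Assumed repo id; a local path to a cloned copy of the repo works the same way.
gen_config = GenerationConfig.from_pretrained("thevyasamit/t5-fine-tuned-with-25-yake-keywords")

print(gen_config.decoder_start_token_id)  # 0, T5 uses <pad> as the decoder start token
print(gen_config.eos_token_id)            # 1, </s>
print(gen_config.pad_token_id)            # 0, <pad>
```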
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:186cc06efe208be6c929814e59a7eacda006eade01caeb3d87cae580af28f578
+ oid sha256:0cc649618996e6c47c5031291e9605be45e405ba2ecbaf3cb8604292083a83f3
  size 891702929
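
Only the Git LFS pointer for pytorch_model.bin changed: the size is identical and the sha256 oid differs, i.e. the weights were overwritten by this training run. A downloaded copy can be checked against the new pointer by hashing it locally; the file path below is a placeholder.

```python
import hashlib

# sha256 oid from the updated LFS pointer above
EXPECTED_OID = "0cc649618996e6c47c5031291e9605be45e405ba2ecbaf3cb8604292083a83f3"

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MiB chunks so the ~892 MB checkpoint is never fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

print(sha256_of("pytorch_model.bin") == EXPECTED_OID)  # placeholder local path
```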