---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: t5-small-finetuned-xsum_epoch4
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# t5-small-finetuned-xsum_epoch4

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the [XSum](https://huggingface.co/datasets/xsum) dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4245
- Rouge1: 29.5204
- Rouge2: 8.4931
- Rougel: 22.9705
- Rougelsum: 23.0872
- Gen Len: 18.8221
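
A minimal inference sketch with the `transformers` summarization pipeline. The repo id below is an assumption based on the committer and model name; T5 checkpoints expect the `summarize: ` prefix, which the pipeline normally adds for T5-family models via the model config:

```python
# Sketch only: assumes the checkpoint is published on the Hugging Face Hub
# under this repo id and that `transformers` (>= 4.20) is installed.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="lilouuch/t5-small-finetuned-xsum_epoch4",  # assumed repo id
)

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for 41 years."
)
# Gen Len above averages ~18.8 tokens, so short max_length values are typical.
summary = summarizer(article, max_length=60, min_length=5, do_sample=False)[0]["summary_text"]
print(summary)
```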

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
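
The list above maps onto `Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction, not the published training script; `output_dir` and `evaluation_strategy` are assumptions (the results table reports one evaluation per epoch):

```python
from transformers import Seq2SeqTrainingArguments

# Sketch reconstructing the reported hyperparameters; any argument not listed
# in the card (output_dir, evaluation_strategy) is a guess.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-xsum_epoch4",  # assumed
    evaluation_strategy="epoch",                   # assumed from the per-epoch eval table
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    # fp16=True,  # mixed_precision_training: Native AMP; enable on a CUDA GPU
)
```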

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 2.7175        | 1.0   | 7620  | 2.4899          | 28.585  | 7.7626 | 22.1314 | 22.2424   | 18.8174 |
| 2.6605        | 2.0   | 15240 | 2.4486          | 29.2362 | 8.2481 | 22.7049 | 22.8227   | 18.8273 |
| 2.6368        | 3.0   | 22860 | 2.4303          | 29.4228 | 8.4312 | 22.8991 | 23.0192   | 18.8262 |
| 2.6284        | 4.0   | 30480 | 2.4245          | 29.5204 | 8.4931 | 22.9705 | 23.0872   | 18.8221 |
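
The Rouge1 and Rouge2 columns are n-gram overlap F-measures, reported here scaled to 0-100. A minimal sketch of ROUGE-1 F1 on whitespace tokens (the `rouge_score` package used by the Trainer additionally applies stemming and bootstrap aggregation, so its numbers differ slightly):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each candidate unigram counts at most as often
    # as it appears in the reference.
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)
```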

### Framework versions

- Transformers 4.20.1
- Pytorch 1.11.0
- Datasets 2.1.0
- Tokenizers 0.12.1