silmi224 committed
Commit 1698d18
1 Parent(s): 2d3bfcb

Training complete

README.md ADDED
@@ -0,0 +1,96 @@
+ ---
+ base_model: silmi224/finetune-led-35000
+ tags:
+ - summarization
+ - generated_from_trainer
+ metrics:
+ - rouge
+ model-index:
+ - name: exp2-led-risalah_data_v1
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # exp2-led-risalah_data_v1
+
+ This model is a fine-tuned version of [silmi224/finetune-led-35000](https://huggingface.co/silmi224/finetune-led-35000) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.5716
+ - Rouge1: 19.6052
+ - Rouge2: 10.044
+ - Rougel: 14.481
+ - Rougelsum: 18.8794
+
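+ Based on the tags and base model above, this is an LED-style summarization checkpoint. The snippet below is a minimal inference sketch rather than an official example: the model id `silmi224/exp2-led-risalah_data_v1` is assumed from this repository name, and the input text is a placeholder.
+
+ ```python
+ from transformers import pipeline
+
+ # Load the fine-tuned LED checkpoint (model id assumed from this repository).
+ summarizer = pipeline("summarization", model="silmi224/exp2-led-risalah_data_v1")
+
+ long_document = "..."  # replace with the full text to be summarized
+
+ # Generation defaults (2-beam search, 40-128 token summaries) come from the
+ # repository's generation_config.json and can be overridden per call.
+ summary = summarizer(long_document, max_length=128, min_length=40)
+ print(summary[0]["summary_text"])
+ ```
+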
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 1e-05
+ - train_batch_size: 1
+ - eval_batch_size: 1
+ - seed: 42
+ - gradient_accumulation_steps: 8
+ - total_train_batch_size: 8
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 150
+ - num_epochs: 30
+ - mixed_precision_training: Native AMP
+
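+ For readers reproducing the setup, the list above maps roughly onto the `Seq2SeqTrainingArguments` sketched below. This is an illustrative reconstruction, not the original training script; the output directory is a placeholder and `predict_with_generate` is an assumption (summaries must be generated to compute ROUGE during evaluation).
+
+ ```python
+ from transformers import Seq2SeqTrainingArguments
+
+ # Illustrative mapping of the hyperparameters listed above.
+ training_args = Seq2SeqTrainingArguments(
+     output_dir="exp2-led-risalah_data_v1",  # placeholder
+     learning_rate=1e-5,
+     per_device_train_batch_size=1,
+     per_device_eval_batch_size=1,
+     gradient_accumulation_steps=8,          # effective train batch size: 1 x 8 = 8
+     warmup_steps=150,
+     lr_scheduler_type="linear",
+     num_train_epochs=30,
+     seed=42,
+     fp16=True,                              # "Native AMP" mixed precision
+     predict_with_generate=True,             # assumption: generate summaries for ROUGE
+ )
+ # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the default optimizer settings.
+ ```
+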
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
+ |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
+ | 3.3238 | 1.0 | 10 | 2.7939 | 8.2919 | 2.3936 | 6.4452 | 7.8858 |
+ | 3.0941 | 2.0 | 20 | 2.5393 | 8.884 | 2.3029 | 6.6763 | 8.2192 |
+ | 2.7749 | 3.0 | 30 | 2.2953 | 11.5346 | 3.61 | 7.9843 | 10.5072 |
+ | 2.5095 | 4.0 | 40 | 2.1271 | 13.3139 | 4.2267 | 9.3106 | 11.9731 |
+ | 2.3044 | 5.0 | 50 | 2.0140 | 14.8318 | 5.4652 | 10.4052 | 13.8404 |
+ | 2.1532 | 6.0 | 60 | 1.9280 | 15.6855 | 6.4587 | 10.7669 | 14.5093 |
+ | 2.032 | 7.0 | 70 | 1.8598 | 14.9367 | 5.7627 | 10.481 | 14.0571 |
+ | 1.9376 | 8.0 | 80 | 1.8049 | 14.8866 | 6.1106 | 10.0165 | 14.4686 |
+ | 1.8459 | 9.0 | 90 | 1.7491 | 13.6909 | 5.6398 | 9.2128 | 12.9399 |
+ | 1.7765 | 10.0 | 100 | 1.7213 | 16.7363 | 7.2146 | 11.2988 | 16.0402 |
+ | 1.704 | 11.0 | 110 | 1.6857 | 18.4687 | 8.7089 | 12.9138 | 17.8621 |
+ | 1.6542 | 12.0 | 120 | 1.6610 | 19.2238 | 8.9265 | 13.1614 | 17.6696 |
+ | 1.5957 | 13.0 | 130 | 1.6335 | 19.6057 | 9.8766 | 13.6908 | 18.6659 |
+ | 1.5413 | 14.0 | 140 | 1.6145 | 19.2875 | 9.7272 | 14.3241 | 17.7305 |
+ | 1.496 | 15.0 | 150 | 1.6232 | 18.1669 | 8.857 | 13.5735 | 17.1252 |
+ | 1.4535 | 16.0 | 160 | 1.6036 | 19.3501 | 10.1008 | 14.5871 | 18.4397 |
+ | 1.4204 | 17.0 | 170 | 1.5954 | 19.4201 | 10.3577 | 14.2019 | 18.4312 |
+ | 1.3829 | 18.0 | 180 | 1.5794 | 18.4944 | 9.4098 | 13.9 | 17.3891 |
+ | 1.3535 | 19.0 | 190 | 1.5814 | 19.9886 | 11.1416 | 15.0161 | 19.122 |
+ | 1.3328 | 20.0 | 200 | 1.5758 | 20.2011 | 10.5645 | 14.7218 | 19.1219 |
+ | 1.3063 | 21.0 | 210 | 1.5722 | 20.7308 | 10.834 | 15.3016 | 19.8805 |
+ | 1.2858 | 22.0 | 220 | 1.5745 | 19.648 | 10.77 | 14.0294 | 19.0395 |
+ | 1.2726 | 23.0 | 230 | 1.5651 | 20.4129 | 10.8196 | 15.0054 | 19.7253 |
+ | 1.2557 | 24.0 | 240 | 1.5709 | 18.6308 | 9.3525 | 13.8142 | 18.1621 |
+ | 1.2456 | 25.0 | 250 | 1.5659 | 19.6106 | 10.4499 | 14.4439 | 18.9271 |
+ | 1.233 | 26.0 | 260 | 1.5702 | 19.1583 | 9.7391 | 14.1738 | 18.5077 |
+ | 1.2267 | 27.0 | 270 | 1.5651 | 18.7654 | 9.8637 | 13.7809 | 18.2034 |
+ | 1.2203 | 28.0 | 280 | 1.5703 | 19.9698 | 10.4741 | 14.3559 | 19.389 |
+ | 1.2147 | 29.0 | 290 | 1.5739 | 19.9054 | 10.0052 | 14.6427 | 19.2278 |
+ | 1.2124 | 30.0 | 300 | 1.5716 | 19.6052 | 10.044 | 14.481 | 18.8794 |
+
+
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - Pytorch 2.1.2
+ - Datasets 2.19.2
+ - Tokenizers 0.19.1
generation_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "bos_token_id": 0,
+   "decoder_start_token_id": 2,
+   "early_stopping": true,
+   "eos_token_id": 2,
+   "length_penalty": 2.0,
+   "max_length": 128,
+   "min_length": 40,
+   "no_repeat_ngram_size": 3,
+   "num_beams": 2,
+   "pad_token_id": 1,
+   "transformers_version": "4.41.2",
+   "use_cache": false
+ }
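These are the default generation settings saved alongside the model (2-beam search, summaries of 40 to 128 tokens, length penalty 2.0, no repeated trigrams). As a rough illustration only, an equivalent explicit `generate` call would look like the sketch below; the model id is assumed from this repository and the input text is a placeholder.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "silmi224/exp2-led-risalah_data_v1"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("long document text ...", return_tensors="pt", truncation=True)

# Mirrors generation_config.json above: beam search with 2 beams,
# 40-128 token outputs, length penalty 2.0, no repeated trigrams.
summary_ids = model.generate(
    **inputs,
    num_beams=2,
    min_length=40,
    max_length=128,
    length_penalty=2.0,
    no_repeat_ngram_size=3,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```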
runs/Jul20_05-48-11_5504f68f459e/events.out.tfevents.1721454495.5504f68f459e.34.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b630dbdf51c12e653359eb0e2487170ac540b06db008574c6e6464f555358f7a
- size 25587
+ oid sha256:5ade5a9999103dfc25ac99d90bd3f278cb6d8d044977ea634bb1b0364649b6c8
+ size 26415
runs/Jul20_05-48-11_5504f68f459e/events.out.tfevents.1721469161.5504f68f459e.34.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f1fdcd4f3289e2d7e4de87bfaaf4affeb365fd9728bc19b42fcecef5dea239b4
+ size 562