satyanshu404 committed
Commit 282ffaa
1 Parent(s): 746ed9f

End of training
README.md CHANGED
@@ -15,10 +15,12 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 2.5934
+ - Loss: 2.6454
  - Actual score: 0.8766
- - Prediction score: 1.3535
- - Score difference: -0.4769
+ - Prediction score: 0.3383
+ - Score difference: 0.5383
+ - Map: 0.675
+ - Ndcg@10: 0.7065
 
  ## Model description
 
@@ -43,118 +45,68 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 100
+ - num_epochs: 50
  - mixed_precision_training: Native AMP
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Actual score | Prediction score | Score difference |
- |:-------------:|:-----:|:----:|:---------------:|:------------:|:---------------:|:----------------:|
- | No log | 1.0 | 15 | 3.6224 | 0.8766 | -0.4105 | 1.2871 |
- | No log | 2.0 | 30 | 3.5086 | 0.8766 | -0.2477 | 1.1243 |
- | No log | 3.0 | 45 | 3.3524 | 0.8766 | -0.3119 | 1.1886 |
- | No log | 4.0 | 60 | 3.2496 | 0.8766 | -0.1139 | 0.9905 |
- | No log | 5.0 | 75 | 3.1300 | 0.8766 | -0.3163 | 1.1929 |
- | No log | 6.0 | 90 | 3.0445 | 0.8766 | -0.4738 | 1.3504 |
- | No log | 7.0 | 105 | 2.9855 | 0.8766 | -0.5561 | 1.4327 |
- | No log | 8.0 | 120 | 2.9429 | 0.8766 | -0.6262 | 1.5028 |
- | No log | 9.0 | 135 | 2.9103 | 0.8766 | -0.4633 | 1.3399 |
- | No log | 10.0 | 150 | 2.8818 | 0.8766 | -0.5404 | 1.4170 |
- | No log | 11.0 | 165 | 2.8567 | 0.8766 | -0.7534 | 1.6300 |
- | No log | 12.0 | 180 | 2.8327 | 0.8766 | -0.7283 | 1.6049 |
- | No log | 13.0 | 195 | 2.8114 | 0.8766 | -0.5976 | 1.4742 |
- | No log | 14.0 | 210 | 2.7917 | 0.8766 | -0.7693 | 1.6460 |
- | No log | 15.0 | 225 | 2.7749 | 0.8766 | -0.5831 | 1.4597 |
- | No log | 16.0 | 240 | 2.7596 | 0.8766 | -0.5963 | 1.4729 |
- | No log | 17.0 | 255 | 2.7458 | 0.8766 | -0.5232 | 1.3998 |
- | No log | 18.0 | 270 | 2.7329 | 0.8766 | -0.1795 | 1.0562 |
- | No log | 19.0 | 285 | 2.7211 | 0.8766 | -0.2189 | 1.0955 |
- | No log | 20.0 | 300 | 2.7111 | 0.8766 | -0.3411 | 1.2177 |
- | No log | 21.0 | 315 | 2.7022 | 0.8766 | -0.3058 | 1.1824 |
- | No log | 22.0 | 330 | 2.6936 | 0.8766 | -0.3270 | 1.2036 |
- | No log | 23.0 | 345 | 2.6853 | 0.8766 | -0.1728 | 1.0494 |
- | No log | 24.0 | 360 | 2.6771 | 0.8766 | -0.2413 | 1.1179 |
- | No log | 25.0 | 375 | 2.6700 | 0.8766 | 0.0077 | 0.8689 |
- | No log | 26.0 | 390 | 2.6641 | 0.8766 | -0.0744 | 0.9510 |
- | No log | 27.0 | 405 | 2.6589 | 0.8766 | 0.0078 | 0.8689 |
- | No log | 28.0 | 420 | 2.6540 | 0.8766 | 0.0711 | 0.8055 |
- | No log | 29.0 | 435 | 2.6493 | 0.8766 | 0.2289 | 0.6477 |
- | No log | 30.0 | 450 | 2.6443 | 0.8766 | 0.1096 | 0.7670 |
- | No log | 31.0 | 465 | 2.6393 | 0.8766 | 0.1335 | 0.7431 |
- | No log | 32.0 | 480 | 2.6355 | 0.8766 | 0.3491 | 0.5275 |
- | No log | 33.0 | 495 | 2.6321 | 0.8766 | 0.4268 | 0.4498 |
- | 2.6272 | 34.0 | 510 | 2.6288 | 0.8766 | 0.3806 | 0.4960 |
- | 2.6272 | 35.0 | 525 | 2.6258 | 0.8766 | 0.8496 | 0.0271 |
- | 2.6272 | 36.0 | 540 | 2.6231 | 0.8766 | 0.6446 | 0.2321 |
- | 2.6272 | 37.0 | 555 | 2.6204 | 0.8766 | 0.6268 | 0.2498 |
- | 2.6272 | 38.0 | 570 | 2.6176 | 0.8766 | 0.8588 | 0.0178 |
- | 2.6272 | 39.0 | 585 | 2.6159 | 0.8766 | 0.9990 | -0.1224 |
- | 2.6272 | 40.0 | 600 | 2.6132 | 0.8766 | 1.0628 | -0.1862 |
- | 2.6272 | 41.0 | 615 | 2.6111 | 0.8766 | 0.9146 | -0.0380 |
- | 2.6272 | 42.0 | 630 | 2.6092 | 0.8766 | 1.0457 | -0.1691 |
- | 2.6272 | 43.0 | 645 | 2.6078 | 0.8766 | 0.9640 | -0.0874 |
- | 2.6272 | 44.0 | 660 | 2.6059 | 0.8766 | 1.0378 | -0.1612 |
- | 2.6272 | 45.0 | 675 | 2.6047 | 0.8766 | 1.0599 | -0.1833 |
- | 2.6272 | 46.0 | 690 | 2.6034 | 0.8766 | 1.1746 | -0.2980 |
- | 2.6272 | 47.0 | 705 | 2.6019 | 0.8766 | 1.1497 | -0.2730 |
- | 2.6272 | 48.0 | 720 | 2.6002 | 0.8766 | 1.2987 | -0.4221 |
- | 2.6272 | 49.0 | 735 | 2.5988 | 0.8766 | 1.2149 | -0.3383 |
- | 2.6272 | 50.0 | 750 | 2.5982 | 0.8766 | 1.2456 | -0.3690 |
- | 2.6272 | 51.0 | 765 | 2.5973 | 0.8766 | 1.2476 | -0.3709 |
- | 2.6272 | 52.0 | 780 | 2.5958 | 0.8766 | 1.2934 | -0.4168 |
- | 2.6272 | 53.0 | 795 | 2.5948 | 0.8766 | 1.2370 | -0.3604 |
- | 2.6272 | 54.0 | 810 | 2.5937 | 0.8766 | 1.2163 | -0.3397 |
- | 2.6272 | 55.0 | 825 | 2.5926 | 0.8766 | 1.2636 | -0.3869 |
- | 2.6272 | 56.0 | 840 | 2.5923 | 0.8766 | 1.3040 | -0.4273 |
- | 2.6272 | 57.0 | 855 | 2.5921 | 0.8766 | 1.3694 | -0.4928 |
- | 2.6272 | 58.0 | 870 | 2.5916 | 0.8766 | 1.1951 | -0.3185 |
- | 2.6272 | 59.0 | 885 | 2.5916 | 0.8766 | 1.3291 | -0.4525 |
- | 2.6272 | 60.0 | 900 | 2.5914 | 0.8766 | 1.3288 | -0.4521 |
- | 2.6272 | 61.0 | 915 | 2.5914 | 0.8766 | 1.3867 | -0.5101 |
- | 2.6272 | 62.0 | 930 | 2.5916 | 0.8766 | 1.4165 | -0.5399 |
- | 2.6272 | 63.0 | 945 | 2.5915 | 0.8766 | 1.4103 | -0.5337 |
- | 2.6272 | 64.0 | 960 | 2.5910 | 0.8766 | 1.3960 | -0.5194 |
- | 2.6272 | 65.0 | 975 | 2.5908 | 0.8766 | 1.3134 | -0.4368 |
- | 2.6272 | 66.0 | 990 | 2.5903 | 0.8766 | 1.3638 | -0.4872 |
- | 1.9897 | 67.0 | 1005 | 2.5900 | 0.8766 | 1.3875 | -0.5109 |
- | 1.9897 | 68.0 | 1020 | 2.5901 | 0.8766 | 1.2404 | -0.3637 |
- | 1.9897 | 69.0 | 1035 | 2.5900 | 0.8766 | 1.4162 | -0.5396 |
- | 1.9897 | 70.0 | 1050 | 2.5899 | 0.8766 | 1.4048 | -0.5281 |
- | 1.9897 | 71.0 | 1065 | 2.5900 | 0.8766 | 1.3967 | -0.5201 |
- | 1.9897 | 72.0 | 1080 | 2.5900 | 0.8766 | 1.4208 | -0.5442 |
- | 1.9897 | 73.0 | 1095 | 2.5903 | 0.8766 | 1.4418 | -0.5651 |
- | 1.9897 | 74.0 | 1110 | 2.5903 | 0.8766 | 1.4656 | -0.5890 |
- | 1.9897 | 75.0 | 1125 | 2.5905 | 0.8766 | 1.4504 | -0.5738 |
- | 1.9897 | 76.0 | 1140 | 2.5910 | 0.8766 | 1.3669 | -0.4903 |
- | 1.9897 | 77.0 | 1155 | 2.5912 | 0.8766 | 1.3362 | -0.4595 |
- | 1.9897 | 78.0 | 1170 | 2.5917 | 0.8766 | 1.3196 | -0.4430 |
- | 1.9897 | 79.0 | 1185 | 2.5918 | 0.8766 | 1.3537 | -0.4770 |
- | 1.9897 | 80.0 | 1200 | 2.5921 | 0.8766 | 1.3136 | -0.4370 |
- | 1.9897 | 81.0 | 1215 | 2.5923 | 0.8766 | 1.3806 | -0.5039 |
- | 1.9897 | 82.0 | 1230 | 2.5926 | 0.8766 | 1.3900 | -0.5134 |
- | 1.9897 | 83.0 | 1245 | 2.5924 | 0.8766 | 1.3907 | -0.5141 |
- | 1.9897 | 84.0 | 1260 | 2.5924 | 0.8766 | 1.3785 | -0.5019 |
- | 1.9897 | 85.0 | 1275 | 2.5926 | 0.8766 | 1.4009 | -0.5243 |
- | 1.9897 | 86.0 | 1290 | 2.5928 | 0.8766 | 1.4108 | -0.5342 |
- | 1.9897 | 87.0 | 1305 | 2.5929 | 0.8766 | 1.3947 | -0.5180 |
- | 1.9897 | 88.0 | 1320 | 2.5929 | 0.8766 | 1.3845 | -0.5078 |
- | 1.9897 | 89.0 | 1335 | 2.5928 | 0.8766 | 1.4045 | -0.5279 |
- | 1.9897 | 90.0 | 1350 | 2.5929 | 0.8766 | 1.3804 | -0.5038 |
- | 1.9897 | 91.0 | 1365 | 2.5931 | 0.8766 | 1.3962 | -0.5195 |
- | 1.9897 | 92.0 | 1380 | 2.5931 | 0.8766 | 1.3801 | -0.5034 |
- | 1.9897 | 93.0 | 1395 | 2.5932 | 0.8766 | 1.3664 | -0.4897 |
- | 1.9897 | 94.0 | 1410 | 2.5933 | 0.8766 | 1.3716 | -0.4950 |
- | 1.9897 | 95.0 | 1425 | 2.5933 | 0.8766 | 1.3935 | -0.5169 |
- | 1.9897 | 96.0 | 1440 | 2.5933 | 0.8766 | 1.3676 | -0.4910 |
- | 1.9897 | 97.0 | 1455 | 2.5934 | 0.8766 | 1.3914 | -0.5148 |
- | 1.9897 | 98.0 | 1470 | 2.5933 | 0.8766 | 1.3912 | -0.5146 |
- | 1.9897 | 99.0 | 1485 | 2.5934 | 0.8766 | 1.3930 | -0.5164 |
- | 1.7966 | 100.0 | 1500 | 2.5934 | 0.8766 | 1.3535 | -0.4769 |
+ | Training Loss | Epoch | Step | Validation Loss | Actual score | Prediction score | Score difference | Map | Ndcg@10 |
+ |:-------------:|:-----:|:----:|:---------------:|:------------:|:---------------:|:----------------:|:------:|:-------:|
+ | No log | 1.0 | 15 | 3.6562 | 0.8766 | -0.0289 | 0.9055 | 0.519 | 0.5699 |
+ | No log | 2.0 | 30 | 3.5539 | 0.8766 | 0.1581 | 0.7185 | 0.5457 | 0.5999 |
+ | No log | 3.0 | 45 | 3.3930 | 0.8766 | -0.0469 | 0.9235 | 0.654 | 0.7112 |
+ | No log | 4.0 | 60 | 3.2928 | 0.8766 | -0.1568 | 1.0334 | 0.5673 | 0.6259 |
+ | No log | 5.0 | 75 | 3.1723 | 0.8766 | -0.1002 | 0.9768 | 0.6457 | 0.6947 |
+ | No log | 6.0 | 90 | 3.0813 | 0.8766 | -0.1568 | 1.0334 | 0.589 | 0.6425 |
+ | No log | 7.0 | 105 | 3.0169 | 0.8766 | -0.6039 | 1.4805 | 0.659 | 0.7152 |
+ | No log | 8.0 | 120 | 2.9700 | 0.8766 | -0.7584 | 1.6350 | 0.6607 | 0.7159 |
+ | No log | 9.0 | 135 | 2.9340 | 0.8766 | -0.4008 | 1.2774 | 0.569 | 0.6225 |
+ | No log | 10.0 | 150 | 2.9044 | 0.8766 | -0.6995 | 1.5761 | 0.5923 | 0.6452 |
+ | No log | 11.0 | 165 | 2.8795 | 0.8766 | -0.4866 | 1.3632 | 0.5933 | 0.6460 |
+ | No log | 12.0 | 180 | 2.8558 | 0.8766 | -0.5519 | 1.4285 | 0.654 | 0.7112 |
+ | No log | 13.0 | 195 | 2.8351 | 0.8766 | -0.7601 | 1.6367 | 0.6940 | 0.7512 |
+ | No log | 14.0 | 210 | 2.8170 | 0.8766 | -0.7849 | 1.6616 | 0.6907 | 0.7538 |
+ | No log | 15.0 | 225 | 2.8016 | 0.8766 | -0.5879 | 1.4645 | 0.7140 | 0.7659 |
+ | No log | 16.0 | 240 | 2.7867 | 0.8766 | -0.6487 | 1.5254 | 0.7073 | 0.7659 |
+ | No log | 17.0 | 255 | 2.7737 | 0.8766 | -0.3421 | 1.2187 | 0.6923 | 0.7493 |
+ | No log | 18.0 | 270 | 2.7617 | 0.8766 | -0.1162 | 0.9928 | 0.6623 | 0.7219 |
+ | No log | 19.0 | 285 | 2.7502 | 0.8766 | -0.2738 | 1.1504 | 0.6023 | 0.6567 |
+ | No log | 20.0 | 300 | 2.7402 | 0.8766 | -0.5541 | 1.4307 | 0.6273 | 0.6859 |
+ | No log | 21.0 | 315 | 2.7312 | 0.8766 | -0.3386 | 1.2152 | 0.674 | 0.7259 |
+ | No log | 22.0 | 330 | 2.7228 | 0.8766 | -0.5500 | 1.4266 | 0.6940 | 0.7512 |
+ | No log | 23.0 | 345 | 2.7148 | 0.8766 | -0.2210 | 1.0976 | 0.6773 | 0.7338 |
+ | No log | 24.0 | 360 | 2.7074 | 0.8766 | -0.0863 | 0.9630 | 0.6773 | 0.7438 |
+ | No log | 25.0 | 375 | 2.7012 | 0.8766 | -0.1210 | 0.9976 | 0.6373 | 0.7038 |
+ | No log | 26.0 | 390 | 2.6955 | 0.8766 | -0.2872 | 1.1638 | 0.7217 | 0.7722 |
+ | No log | 27.0 | 405 | 2.6905 | 0.8766 | -0.1040 | 0.9806 | 0.7167 | 0.7682 |
+ | No log | 28.0 | 420 | 2.6859 | 0.8766 | -0.1951 | 1.0717 | 0.69 | 0.7329 |
+ | No log | 29.0 | 435 | 2.6815 | 0.8766 | -0.0243 | 0.9009 | 0.7133 | 0.7556 |
+ | No log | 30.0 | 450 | 2.6774 | 0.8766 | -0.1058 | 0.9824 | 0.6933 | 0.7356 |
+ | No log | 31.0 | 465 | 2.6732 | 0.8766 | -0.1431 | 1.0197 | 0.74 | 0.7803 |
+ | No log | 32.0 | 480 | 2.6697 | 0.8766 | -0.1289 | 1.0055 | 0.7 | 0.7403 |
+ | No log | 33.0 | 495 | 2.6668 | 0.8766 | -0.1033 | 0.9800 | 0.7367 | 0.7829 |
+ | 2.6871 | 34.0 | 510 | 2.6638 | 0.8766 | 0.0345 | 0.8422 | 0.7267 | 0.7756 |
+ | 2.6871 | 35.0 | 525 | 2.6613 | 0.8766 | 0.0778 | 0.7988 | 0.7557 | 0.8021 |
+ | 2.6871 | 36.0 | 540 | 2.6593 | 0.8766 | 0.0817 | 0.7950 | 0.7817 | 0.8269 |
+ | 2.6871 | 37.0 | 555 | 2.6572 | 0.8766 | 0.0656 | 0.8110 | 0.7517 | 0.7996 |
+ | 2.6871 | 38.0 | 570 | 2.6551 | 0.8766 | 0.1775 | 0.6991 | 0.7617 | 0.8069 |
+ | 2.6871 | 39.0 | 585 | 2.6535 | 0.8766 | 0.0677 | 0.8090 | 0.775 | 0.8169 |
+ | 2.6871 | 40.0 | 600 | 2.6520 | 0.8766 | 0.1447 | 0.7319 | 0.765 | 0.8043 |
+ | 2.6871 | 41.0 | 615 | 2.6508 | 0.8766 | 0.1812 | 0.6954 | 0.7017 | 0.7417 |
+ | 2.6871 | 42.0 | 630 | 2.6496 | 0.8766 | 0.2167 | 0.6600 | 0.6867 | 0.7303 |
+ | 2.6871 | 43.0 | 645 | 2.6488 | 0.8766 | 0.2700 | 0.6066 | 0.7267 | 0.7703 |
+ | 2.6871 | 44.0 | 660 | 2.6478 | 0.8766 | 0.3052 | 0.5714 | 0.6967 | 0.7377 |
+ | 2.6871 | 45.0 | 675 | 2.6472 | 0.8766 | 0.2416 | 0.6350 | 0.685 | 0.7191 |
+ | 2.6871 | 46.0 | 690 | 2.6466 | 0.8766 | 0.2533 | 0.6234 | 0.685 | 0.7191 |
+ | 2.6871 | 47.0 | 705 | 2.6461 | 0.8766 | 0.3144 | 0.5623 | 0.665 | 0.6991 |
+ | 2.6871 | 48.0 | 720 | 2.6458 | 0.8766 | 0.2416 | 0.6350 | 0.675 | 0.7065 |
+ | 2.6871 | 49.0 | 735 | 2.6455 | 0.8766 | 0.2159 | 0.6608 | 0.685 | 0.7191 |
+ | 2.6871 | 50.0 | 750 | 2.6454 | 0.8766 | 0.3383 | 0.5383 | 0.675 | 0.7065 |
 
 
  ### Framework versions
 
- - Transformers 4.35.0
+ - Transformers 4.35.2
  - Pytorch 2.1.0+cu118
- - Datasets 2.14.6
- - Tokenizers 0.14.1
+ - Datasets 2.15.0
+ - Tokenizers 0.15.0
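
Read together, the hyperparameter list and the per-epoch table above describe a standard `Seq2SeqTrainer` run. The following is a minimal sketch of that setup, not the author's actual training script: only the seed, Adam betas/epsilon, scheduler, epoch count, and AMP appear in this diff, so the learning rate, batch size, output path, and dataset below are placeholders.

```python
# Minimal sketch of the training configuration implied by the card above.
# Placeholders: learning_rate, batch size, output_dir, and the dataset are
# NOT shown in this diff.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base = "facebook/bart-large-cnn"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

args = Seq2SeqTrainingArguments(
    output_dir="bart-finetune",     # placeholder output path
    seed=42,                        # from the card
    num_train_epochs=50,            # from the card (reduced from 100 in this commit)
    lr_scheduler_type="linear",     # from the card
    adam_beta1=0.9,                 # Adam betas=(0.9,0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-08,             # epsilon=1e-08 from the card
    fp16=True,                      # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",    # implied by the per-epoch results table
    learning_rate=5e-5,             # placeholder: not shown in this hunk
    per_device_train_batch_size=8,  # placeholder: not shown in this hunk
)

# The card lists the dataset as "unknown", so the trainer call is left as an
# outline only:
# trainer = Seq2SeqTrainer(model=model, args=args,
#                          train_dataset=..., eval_dataset=...)
# trainer.train()
```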
generation_config.json CHANGED
@@ -11,5 +11,5 @@
    "no_repeat_ngram_size": 3,
    "num_beams": 4,
    "pad_token_id": 1,
-   "transformers_version": "4.35.0"
+   "transformers_version": "4.35.2"
  }
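
Only `transformers_version` changed in this file; the context lines (`num_beams: 4`, `no_repeat_ngram_size: 3`, `pad_token_id: 1`) are the checkpoint's default generation settings. A minimal inference sketch using those settings; the repo id is a placeholder, since it does not appear anywhere in this diff:

```python
# Minimal inference sketch. generation_config.json already supplies num_beams
# and no_repeat_ngram_size as defaults; they are repeated here only to make
# the settings explicit. repo_id is a placeholder.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

repo_id = "<user>/<this-model>"  # placeholder: not shown in the diff
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

inputs = tokenizer("Text to summarize ...", return_tensors="pt", truncation=True)
output_ids = model.generate(
    **inputs,
    num_beams=4,             # from generation_config.json
    no_repeat_ngram_size=3,  # from generation_config.json
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```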
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:48f430329963d1e8f790466689092b4bd0318443215ff7f8e1b61ea6b626ce82
+ oid sha256:7d5b8b8f5b37503807b820ae797ab8abb576a019cc0759be7e10f66d40344def
  size 1625422896
runs/Nov24_21-23-36_57d102f32f24/events.out.tfevents.1700861017.57d102f32f24.31434.2 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7ad117efce18583b5acf21ed7c9bfc58ae5a4f19643ac32e4ab53e7d965050f3
- size 23434
+ oid sha256:cdb33b2edb830788e0f784490ab803a4cd88bbd016c87fb533907f714a41bf5d
+ size 33036
runs/Nov24_21-23-36_57d102f32f24/events.out.tfevents.1700863951.57d102f32f24.31434.3 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4c0d1e8dbe3361b58c62bed04a1623e44ac2caf772c00bf16a34ad6d6b12b453
+ size 632
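
The three file diffs above are Git LFS pointer files (spec URL, sha256 `oid`, byte `size`), so the commit changes only the pointers, not inline binary content. A minimal sketch, assuming the `huggingface_hub` client, of fetching the weights exactly as of this commit; the repo id is again a placeholder:

```python
# Download the LFS-backed weights pinned to commit 282ffaa (shown above).
# repo_id is a placeholder: the diff does not name the repository.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="<user>/<this-model>",  # placeholder
    filename="model.safetensors",
    revision="282ffaa",             # the commit this page describes
)
print(path)  # local cache path to the 1625422896-byte safetensors file
```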