yihang7 committed on
Commit 37f2773
1 parent: 9ca818e

Model save

README.md ADDED
@@ -0,0 +1,157 @@
+ ---
+ license: apache-2.0
+ base_model: princeton-nlp/Sheared-LLaMA-1.3B
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: Sheared-LLaMA-1.3B-dpo-full-3-epoch-hydrox-safe
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # Sheared-LLaMA-1.3B-dpo-full-3-epoch-hydrox-safe
+
+ This model is a fine-tuned version of [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0041
+ - Rewards/chosen: 1.7270
+ - Rewards/rejected: -15.3712
+ - Rewards/accuracies: 0.9983
+ - Rewards/margins: 17.0982
+ - Logps/rejected: -656.3423
+ - Logps/chosen: -371.7201
+ - Logits/rejected: 2.3459
+ - Logits/chosen: 0.3641
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-07
+ - train_batch_size: 8
+ - eval_batch_size: 4
+ - seed: 42
+ - distributed_type: multi-GPU
+ - num_devices: 8
+ - total_train_batch_size: 64
+ - total_eval_batch_size: 32
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 3
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
+ |:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
+ | 0.6612 | 0.03 | 100 | 0.6499 | 0.0765 | -0.0151 | 0.8300 | 0.0916 | -502.7813 | -388.2253 | 3.2379 | 0.9032 |
+ | 0.4585 | 0.07 | 200 | 0.4458 | 0.5224 | -0.1242 | 0.9301 | 0.6466 | -503.8723 | -383.7663 | 3.2494 | 0.9081 |
+ | 0.2519 | 0.1 | 300 | 0.2540 | 1.2036 | -0.4814 | 0.9470 | 1.6851 | -507.4445 | -376.9535 | 3.2790 | 0.9127 |
+ | 0.17 | 0.14 | 400 | 0.1751 | 1.5794 | -1.0033 | 0.9562 | 2.5827 | -512.6629 | -373.1959 | 3.3007 | 0.9173 |
+ | 0.1179 | 0.17 | 500 | 0.1215 | 1.8423 | -2.0791 | 0.9588 | 3.9214 | -523.4217 | -370.5673 | 3.2925 | 0.9104 |
+ | 0.1032 | 0.2 | 600 | 0.1078 | 2.0902 | -2.7647 | 0.9596 | 4.8549 | -530.2773 | -368.0876 | 3.2574 | 0.9180 |
+ | 0.0614 | 0.24 | 700 | 0.0881 | 2.2830 | -3.4190 | 0.9638 | 5.7021 | -536.8207 | -366.1595 | 3.2243 | 0.9190 |
+ | 0.0666 | 0.27 | 800 | 0.0751 | 2.3690 | -4.0591 | 0.9689 | 6.4281 | -543.2214 | -365.2995 | 3.1788 | 0.9025 |
+ | 0.0706 | 0.31 | 900 | 0.0662 | 2.4002 | -4.5254 | 0.9722 | 6.9257 | -547.8843 | -364.9874 | 3.1624 | 0.9102 |
+ | 0.0711 | 0.34 | 1000 | 0.0577 | 2.4230 | -4.9179 | 0.9764 | 7.3409 | -551.8096 | -364.7598 | 3.1467 | 0.9093 |
+ | 0.0623 | 0.37 | 1100 | 0.0572 | 2.4840 | -5.3620 | 0.9773 | 7.8459 | -556.2499 | -364.1504 | 3.1186 | 0.9011 |
+ | 0.0443 | 0.41 | 1200 | 0.0526 | 2.4237 | -5.4784 | 0.9798 | 7.9021 | -557.4146 | -364.7530 | 3.1196 | 0.8961 |
+ | 0.0416 | 0.44 | 1300 | 0.0477 | 2.3874 | -6.2247 | 0.9823 | 8.6120 | -564.8768 | -365.1163 | 3.0683 | 0.8720 |
+ | 0.0365 | 0.48 | 1400 | 0.0448 | 2.2887 | -6.8360 | 0.9806 | 9.1246 | -570.9899 | -366.1031 | 3.0491 | 0.8667 |
+ | 0.0341 | 0.51 | 1500 | 0.0442 | 2.2795 | -6.9547 | 0.9848 | 9.2343 | -572.1777 | -366.1945 | 3.0299 | 0.8500 |
+ | 0.0406 | 0.54 | 1600 | 0.0414 | 2.0896 | -7.0003 | 0.9848 | 9.0899 | -572.6334 | -368.0941 | 3.0437 | 0.8442 |
+ | 0.0427 | 0.58 | 1700 | 0.0387 | 2.0380 | -7.1141 | 0.9857 | 9.1521 | -573.7712 | -368.6102 | 3.0458 | 0.8383 |
+ | 0.0225 | 0.61 | 1800 | 0.0421 | 2.2150 | -7.1052 | 0.9891 | 9.3203 | -573.6826 | -366.8395 | 3.0443 | 0.8362 |
+ | 0.0298 | 0.65 | 1900 | 0.0364 | 2.0854 | -7.7136 | 0.9882 | 9.7990 | -579.7668 | -368.1361 | 3.0306 | 0.8392 |
+ | 0.0255 | 0.68 | 2000 | 0.0353 | 2.1351 | -7.6852 | 0.9907 | 9.8203 | -579.4824 | -367.6387 | 3.0204 | 0.8292 |
+ | 0.019 | 0.71 | 2100 | 0.0296 | 2.1215 | -8.1790 | 0.9916 | 10.3005 | -584.4203 | -367.7745 | 3.0052 | 0.8412 |
+ | 0.0198 | 0.75 | 2200 | 0.0248 | 2.1218 | -8.4302 | 0.9907 | 10.5520 | -586.9324 | -367.7719 | 2.9878 | 0.8183 |
+ | 0.0192 | 0.78 | 2300 | 0.0238 | 2.0950 | -8.2588 | 0.9924 | 10.3538 | -585.2184 | -368.0402 | 2.9758 | 0.7942 |
+ | 0.0191 | 0.82 | 2400 | 0.0213 | 2.1701 | -8.6399 | 0.9941 | 10.8101 | -589.0295 | -367.2885 | 2.9719 | 0.8049 |
+ | 0.0215 | 0.85 | 2500 | 0.0224 | 2.1220 | -9.1960 | 0.9933 | 11.3180 | -594.5902 | -367.7695 | 2.9391 | 0.7799 |
+ | 0.0579 | 0.88 | 2600 | 0.0193 | 2.0368 | -9.3428 | 0.9933 | 11.3796 | -596.0587 | -368.6217 | 2.9297 | 0.7933 |
+ | 0.0163 | 0.92 | 2700 | 0.0180 | 1.9057 | -9.4956 | 0.9941 | 11.4013 | -597.5867 | -369.9328 | 2.9114 | 0.7628 |
+ | 0.019 | 0.95 | 2800 | 0.0194 | 1.9915 | -9.4265 | 0.9933 | 11.4179 | -596.8949 | -369.0752 | 2.9223 | 0.7736 |
+ | 0.0166 | 0.99 | 2900 | 0.0182 | 2.0770 | -9.1954 | 0.9958 | 11.2724 | -594.5848 | -368.2201 | 2.9186 | 0.7592 |
+ | 0.0121 | 1.02 | 3000 | 0.0180 | 1.9094 | -9.4964 | 0.9941 | 11.4059 | -597.5947 | -369.8957 | 2.8957 | 0.7557 |
+ | 0.011 | 1.05 | 3100 | 0.0150 | 2.0009 | -9.9345 | 0.9966 | 11.9354 | -601.9758 | -368.9812 | 2.8560 | 0.7294 |
+ | 0.0106 | 1.09 | 3200 | 0.0139 | 2.0861 | -9.6153 | 0.9966 | 11.7014 | -598.7830 | -368.1290 | 2.8565 | 0.7071 |
+ | 0.0095 | 1.12 | 3300 | 0.0134 | 1.9755 | -10.3936 | 0.9958 | 12.3691 | -606.5661 | -369.2344 | 2.8290 | 0.7083 |
+ | 0.0115 | 1.16 | 3400 | 0.0129 | 1.9719 | -10.3851 | 0.9949 | 12.3569 | -606.4811 | -369.2712 | 2.8212 | 0.7184 |
+ | 0.0152 | 1.19 | 3500 | 0.0124 | 2.0357 | -10.2131 | 0.9958 | 12.2488 | -604.7615 | -368.6329 | 2.8217 | 0.7140 |
+ | 0.01 | 1.22 | 3600 | 0.0116 | 2.0147 | -10.9243 | 0.9966 | 12.9390 | -611.8733 | -368.8428 | 2.7589 | 0.6517 |
+ | 0.0135 | 1.26 | 3700 | 0.0116 | 1.9527 | -10.8649 | 0.9966 | 12.8176 | -611.2795 | -369.4628 | 2.8017 | 0.7064 |
+ | 0.0078 | 1.29 | 3800 | 0.0112 | 1.7362 | -11.5598 | 0.9966 | 13.2960 | -618.2281 | -371.6281 | 2.7623 | 0.6879 |
+ | 0.0114 | 1.33 | 3900 | 0.0106 | 1.8313 | -11.3667 | 0.9983 | 13.1980 | -616.2969 | -370.6765 | 2.7616 | 0.6728 |
+ | 0.0077 | 1.36 | 4000 | 0.0101 | 1.9160 | -11.5484 | 0.9992 | 13.4645 | -618.1147 | -369.8296 | 2.7534 | 0.6694 |
+ | 0.0057 | 1.39 | 4100 | 0.0098 | 1.8898 | -11.3187 | 0.9983 | 13.2085 | -615.8172 | -370.0915 | 2.7553 | 0.6617 |
+ | 0.0056 | 1.43 | 4200 | 0.0091 | 2.0721 | -11.6748 | 0.9992 | 13.7469 | -619.3782 | -368.2689 | 2.7234 | 0.6265 |
+ | 0.006 | 1.46 | 4300 | 0.0088 | 1.8416 | -12.1884 | 0.9983 | 14.0300 | -624.5148 | -370.5739 | 2.7058 | 0.6225 |
+ | 0.0071 | 1.5 | 4400 | 0.0083 | 2.0151 | -11.7393 | 0.9983 | 13.7544 | -620.0233 | -368.8386 | 2.7124 | 0.6231 |
+ | 0.0101 | 1.53 | 4500 | 0.0083 | 2.0864 | -11.5153 | 0.9992 | 13.6016 | -617.7830 | -368.1264 | 2.7206 | 0.6407 |
+ | 0.0054 | 1.56 | 4600 | 0.0083 | 1.9930 | -11.3424 | 0.9975 | 13.3354 | -616.0542 | -369.0597 | 2.7246 | 0.6099 |
+ | 0.0116 | 1.6 | 4700 | 0.0080 | 1.9298 | -11.3167 | 0.9975 | 13.2464 | -615.7971 | -369.6923 | 2.7200 | 0.6008 |
+ | 0.0116 | 1.63 | 4800 | 0.0074 | 1.8809 | -11.4685 | 0.9975 | 13.3494 | -617.3154 | -370.1813 | 2.6917 | 0.5698 |
+ | 0.0087 | 1.67 | 4900 | 0.0073 | 1.8993 | -11.8845 | 0.9983 | 13.7838 | -621.4749 | -369.9968 | 2.6861 | 0.5798 |
+ | 0.0031 | 1.7 | 5000 | 0.0072 | 1.8755 | -12.3032 | 0.9975 | 14.1787 | -625.6624 | -370.2348 | 2.6435 | 0.5411 |
+ | 0.0115 | 1.73 | 5100 | 0.0076 | 1.9283 | -11.9068 | 0.9958 | 13.8351 | -621.6979 | -369.7066 | 2.6527 | 0.5393 |
+ | 0.0065 | 1.77 | 5200 | 0.0074 | 1.9870 | -11.9105 | 0.9949 | 13.8975 | -621.7357 | -369.1199 | 2.6790 | 0.5763 |
+ | 0.006 | 1.8 | 5300 | 0.0068 | 1.7994 | -12.4601 | 0.9958 | 14.2595 | -627.2310 | -370.9959 | 2.6264 | 0.5393 |
+ | 0.0076 | 1.84 | 5400 | 0.0064 | 2.0449 | -12.2057 | 0.9966 | 14.2506 | -624.6871 | -368.5407 | 2.6409 | 0.5465 |
+ | 0.0042 | 1.87 | 5500 | 0.0062 | 1.9941 | -12.4399 | 0.9983 | 14.4340 | -627.0295 | -369.0491 | 2.6332 | 0.5433 |
+ | 0.0079 | 1.9 | 5600 | 0.0061 | 1.9119 | -12.4000 | 0.9983 | 14.3118 | -626.6300 | -369.8711 | 2.6300 | 0.5377 |
+ | 0.0066 | 1.94 | 5700 | 0.0062 | 2.0544 | -12.1682 | 0.9983 | 14.2226 | -624.3120 | -368.4457 | 2.6248 | 0.5288 |
+ | 0.0071 | 1.97 | 5800 | 0.0061 | 2.0943 | -12.2702 | 0.9975 | 14.3645 | -625.3325 | -368.0468 | 2.6248 | 0.5422 |
+ | 0.0021 | 2.01 | 5900 | 0.0057 | 1.9195 | -12.9348 | 0.9983 | 14.8543 | -631.9785 | -369.7946 | 2.5712 | 0.5186 |
+ | 0.0029 | 2.04 | 6000 | 0.0057 | 1.8384 | -13.3904 | 0.9983 | 15.2288 | -636.5340 | -370.6057 | 2.5405 | 0.4960 |
+ | 0.0035 | 2.07 | 6100 | 0.0056 | 1.6150 | -14.2858 | 0.9975 | 15.9009 | -645.4886 | -372.8395 | 2.4718 | 0.4415 |
+ | 0.0053 | 2.11 | 6200 | 0.0053 | 1.8268 | -13.9429 | 0.9983 | 15.7696 | -642.0590 | -370.7222 | 2.4921 | 0.4576 |
+ | 0.0044 | 2.14 | 6300 | 0.0052 | 1.9443 | -13.8117 | 0.9975 | 15.7560 | -640.7470 | -369.5464 | 2.5079 | 0.4705 |
+ | 0.0026 | 2.18 | 6400 | 0.0053 | 2.0456 | -13.7455 | 0.9975 | 15.7911 | -640.0853 | -368.5343 | 2.5139 | 0.4823 |
+ | 0.0026 | 2.21 | 6500 | 0.0050 | 2.0028 | -13.6496 | 0.9983 | 15.6524 | -639.1260 | -368.9618 | 2.5135 | 0.4823 |
+ | 0.0029 | 2.24 | 6600 | 0.0050 | 1.8856 | -13.7926 | 0.9975 | 15.6782 | -640.5563 | -370.1337 | 2.4828 | 0.4459 |
+ | 0.0023 | 2.28 | 6700 | 0.0049 | 1.9422 | -14.0760 | 0.9983 | 16.0182 | -643.3903 | -369.5678 | 2.4698 | 0.4471 |
+ | 0.003 | 2.31 | 6800 | 0.0048 | 1.8633 | -14.4649 | 0.9983 | 16.3282 | -647.2790 | -370.3570 | 2.4646 | 0.4562 |
+ | 0.0058 | 2.35 | 6900 | 0.0049 | 1.8085 | -14.8512 | 0.9975 | 16.6597 | -651.1427 | -370.9051 | 2.4275 | 0.4292 |
+ | 0.0032 | 2.38 | 7000 | 0.0048 | 1.9006 | -14.6340 | 0.9983 | 16.5346 | -648.9703 | -369.9842 | 2.4387 | 0.4425 |
+ | 0.0018 | 2.41 | 7100 | 0.0047 | 1.8215 | -15.0376 | 0.9983 | 16.8592 | -653.0067 | -370.7746 | 2.4153 | 0.4296 |
+ | 0.001 | 2.45 | 7200 | 0.0046 | 1.8195 | -15.0112 | 0.9983 | 16.8307 | -652.7422 | -370.7950 | 2.4153 | 0.4248 |
+ | 0.0057 | 2.48 | 7300 | 0.0045 | 1.8920 | -14.4156 | 0.9983 | 16.3077 | -646.7868 | -370.0694 | 2.4336 | 0.4234 |
+ | 0.004 | 2.52 | 7400 | 0.0044 | 1.7826 | -14.6522 | 0.9983 | 16.4348 | -649.1526 | -371.1638 | 2.4101 | 0.4117 |
+ | 0.0025 | 2.55 | 7500 | 0.0044 | 1.8202 | -14.7043 | 0.9983 | 16.5245 | -649.6732 | -370.7875 | 2.4040 | 0.4069 |
+ | 0.0035 | 2.58 | 7600 | 0.0044 | 1.8712 | -14.7562 | 0.9983 | 16.6273 | -650.1921 | -370.2782 | 2.4019 | 0.4087 |
+ | 0.002 | 2.62 | 7700 | 0.0043 | 1.8406 | -14.8610 | 0.9983 | 16.7017 | -651.2407 | -370.5836 | 2.3996 | 0.4114 |
+ | 0.002 | 2.65 | 7800 | 0.0043 | 1.8042 | -15.0820 | 0.9992 | 16.8862 | -653.4503 | -370.9484 | 2.3936 | 0.4147 |
+ | 0.0046 | 2.69 | 7900 | 0.0042 | 1.8043 | -15.2990 | 0.9983 | 17.1033 | -655.6204 | -370.9472 | 2.3757 | 0.3993 |
+ | 0.0025 | 2.72 | 8000 | 0.0042 | 1.8289 | -15.3097 | 0.9983 | 17.1386 | -655.7274 | -370.7011 | 2.3634 | 0.3853 |
+ | 0.0023 | 2.75 | 8100 | 0.0041 | 1.7995 | -15.2380 | 0.9983 | 17.0375 | -655.0099 | -370.9947 | 2.3619 | 0.3779 |
+ | 0.0025 | 2.79 | 8200 | 0.0040 | 1.8013 | -15.2440 | 0.9983 | 17.0453 | -655.0703 | -370.9769 | 2.3668 | 0.3827 |
+ | 0.002 | 2.82 | 8300 | 0.0040 | 1.8040 | -15.2101 | 0.9983 | 17.0141 | -654.7317 | -370.9499 | 2.3660 | 0.3834 |
+ | 0.0023 | 2.86 | 8400 | 0.0040 | 1.7441 | -15.3132 | 0.9983 | 17.0572 | -655.7621 | -371.5493 | 2.3498 | 0.3680 |
+ | 0.002 | 2.89 | 8500 | 0.0040 | 1.7551 | -15.3278 | 0.9983 | 17.0828 | -655.9080 | -371.4393 | 2.3509 | 0.3714 |
+ | 0.004 | 2.92 | 8600 | 0.0040 | 1.7500 | -15.3290 | 0.9983 | 17.0790 | -655.9205 | -371.4897 | 2.3518 | 0.3701 |
+ | 0.0041 | 2.96 | 8700 | 0.0040 | 1.7294 | -15.3645 | 0.9983 | 17.0940 | -656.2756 | -371.6956 | 2.3478 | 0.3660 |
+ | 0.0029 | 2.99 | 8800 | 0.0040 | 1.7305 | -15.3609 | 0.9983 | 17.0914 | -656.2390 | -371.6845 | 2.3464 | 0.3647 |
+
+
+ ### Framework versions
+
+ - Transformers 4.35.0
+ - Pytorch 2.1.1+cu121
+ - Datasets 2.14.6
+ - Tokenizers 0.14.1
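Editor's note: the files in this commit form a standard `LlamaForCausalLM` checkpoint, so it loads with plain `transformers`. A minimal inference sketch follows; the repo id is inferred from the model name in the card above and is an assumption, not something stated in this commit.

```python
# Minimal inference sketch. The repo id below is assumed from the model name
# in README.md; it is not confirmed anywhere in this commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "yihang7/Sheared-LLaMA-1.3B-dpo-full-3-epoch-hydrox-safe"  # assumption

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

# tokenizer_config.json (later in this commit) ships a Zephyr-style chat
# template, so apply_chat_template is the intended way to format prompts.
messages = [{"role": "user", "content": "How should I store household bleach?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```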
all_results.json ADDED
@@ -0,0 +1,21 @@
+ {
+   "epoch": 3.0,
+   "eval_logits/chosen": 0.3641189634799957,
+   "eval_logits/rejected": 2.345853090286255,
+   "eval_logps/chosen": -371.7200927734375,
+   "eval_logps/rejected": -656.34228515625,
+   "eval_loss": 0.004054788034409285,
+   "eval_rewards/accuracies": 0.9983165264129639,
+   "eval_rewards/chosen": 1.7269809246063232,
+   "eval_rewards/margins": 17.098175048828125,
+   "eval_rewards/rejected": -15.371195793151855,
+   "eval_runtime": 270.3044,
+   "eval_samples": 9500,
+   "eval_samples_per_second": 35.146,
+   "eval_steps_per_second": 1.099,
+   "train_loss": 0.03725933715852631,
+   "train_runtime": 48940.1741,
+   "train_samples": 188284,
+   "train_samples_per_second": 11.542,
+   "train_steps_per_second": 0.18
+ }
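Editor's note on the `rewards/*` fields: in DPO the implicit reward of a completion is beta * (log p_policy(y|x) - log p_ref(y|x)), and the values above are batch means of those quantities. A hedged sketch of how they are typically derived; `beta` is a placeholder (0.1 is a common default), since the value actually used for this run is not recorded in this commit.

```python
import torch
import torch.nn.functional as F

def dpo_metrics(policy_chosen_logps: torch.Tensor,
                policy_rejected_logps: torch.Tensor,
                ref_chosen_logps: torch.Tensor,
                ref_rejected_logps: torch.Tensor,
                beta: float = 0.1):  # assumed default; not logged in this commit
    """Reproduce the rewards/* metrics above from per-sequence summed
    log-probabilities under the policy and the frozen reference model."""
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)        # rewards/chosen
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)  # rewards/rejected
    margins = chosen_rewards - rejected_rewards                             # rewards/margins
    accuracy = (chosen_rewards > rejected_rewards).float().mean()           # rewards/accuracies
    loss = -F.logsigmoid(margins).mean()                                    # the DPO objective
    return loss, chosen_rewards.mean(), rejected_rewards.mean(), accuracy
```

Read this way, the numbers say the policy stays near the reference on chosen completions (reward 1.73) while drifting far below it on rejected ones (reward -15.37), which is what drives the near-zero loss.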
config.json ADDED
@@ -0,0 +1,28 @@
+ {
+   "_name_or_path": "princeton-nlp/Sheared-LLaMA-1.3B",
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "attention_bias": false,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "hidden_act": "silu",
+   "hidden_size": 2048,
+   "initializer_range": 0.02,
+   "intermediate_size": 5504,
+   "max_position_embeddings": 4096,
+   "model_type": "llama",
+   "num_attention_heads": 16,
+   "num_hidden_layers": 24,
+   "num_key_value_heads": 16,
+   "pad_token_id": 0,
+   "pretraining_tp": 1,
+   "rms_norm_eps": 1e-05,
+   "rope_scaling": null,
+   "rope_theta": 10000.0,
+   "tie_word_embeddings": false,
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.35.0",
+   "use_cache": false,
+   "vocab_size": 32000
+ }
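Editor's note: these architecture fields account for the 2.69 GB `model.safetensors` shard below (bfloat16 stores 2 bytes per parameter). A back-of-the-envelope count, assuming the standard LLaMA layer layout:

```python
# Rough LLaMA parameter count from the config fields above.
h, ffn, n_layers, vocab = 2048, 5504, 24, 32000

attn = 4 * h * h                  # q, k, v, o projections (attention_bias: false)
mlp = 3 * h * ffn                 # gate, up, down projections
norms = 2 * h                     # two RMSNorm weight vectors per layer
per_layer = attn + mlp + norms

embed = vocab * h                 # input embeddings
lm_head = vocab * h               # separate output head (tie_word_embeddings: false)
total = n_layers * per_layer + embed + lm_head + h  # + final RMSNorm

print(f"~{total / 1e9:.2f}B params, ~{2 * total / 1e9:.2f} GB in bf16")
# ~1.35B params, ~2.69 GB -- consistent with model.safetensors at 2,690,871,976 bytes
```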
eval_results.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "epoch": 3.0,
+   "eval_logits/chosen": 0.3641189634799957,
+   "eval_logits/rejected": 2.345853090286255,
+   "eval_logps/chosen": -371.7200927734375,
+   "eval_logps/rejected": -656.34228515625,
+   "eval_loss": 0.004054788034409285,
+   "eval_rewards/accuracies": 0.9983165264129639,
+   "eval_rewards/chosen": 1.7269809246063232,
+   "eval_rewards/margins": 17.098175048828125,
+   "eval_rewards/rejected": -15.371195793151855,
+   "eval_runtime": 270.3044,
+   "eval_samples": 9500,
+   "eval_samples_per_second": 35.146,
+   "eval_steps_per_second": 1.099
+ }
generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "pad_token_id": 0,
+   "transformers_version": "4.35.0"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f1315a68e1f6e1aaeb4e8d4844b595c31d3337d5f19197949234f6665cb69749
+ size 2690871976
runs/Jan19_20-45-42_amaterasu/events.out.tfevents.1705697174.amaterasu.1852734.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:69ca6ccbc84f95adbb13afd161011152b74f7d135b9ddf3bbf1db5c3e7c6af20
+ size 629594
runs/Jan19_20-45-42_amaterasu/events.out.tfevents.1705746385.amaterasu.1852734.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5525ed1c906cf35659b79b5015793eb9d149ad79ee0824cbbec4de315758a92c
+ size 828
special_tokens_map.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "</s>",
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer_config.json ADDED
@@ -0,0 +1,38 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "chat_template": "{% for message in messages %}\n{% if message['role'] == 'user' %}\n{{ '<|user|>\n' + message['content'] + eos_token }}\n{% elif message['role'] == 'system' %}\n{{ '<|system|>\n' + message['content'] + eos_token }}\n{% elif message['role'] == 'assistant' %}\n{{ '<|assistant|>\n' + message['content'] + eos_token }}\n{% endif %}\n{% if loop.last and add_generation_prompt %}\n{{ '<|assistant|>' }}\n{% endif %}\n{% endfor %}",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "</s>",
+   "model_max_length": 2048,
+   "pad_token": "</s>",
+   "sp_model_kwargs": {},
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": false
+ }
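Editor's note: the `chat_template` above is the Zephyr-style `<|system|>` / `<|user|>` / `<|assistant|>` format, with `</s>` closing each turn. A small sketch of what it renders to at the string level (repo id again assumed from the model name):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained(
    "yihang7/Sheared-LLaMA-1.3B-dpo-full-3-epoch-hydrox-safe"  # assumed repo id
)

messages = [
    {"role": "system", "content": "You are a careful, safety-first assistant."},
    {"role": "user", "content": "How should I store household bleach?"},
]
print(tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
# Renders roughly as:
# <|system|>
# You are a careful, safety-first assistant.</s>
# <|user|>
# How should I store household bleach?</s>
# <|assistant|>
```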
train_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "epoch": 3.0,
+   "train_loss": 0.03725933715852631,
+   "train_runtime": 48940.1741,
+   "train_samples": 188284,
+   "train_samples_per_second": 11.542,
+   "train_steps_per_second": 0.18
+ }
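Editor's note: these throughput figures are internally consistent with the hyperparameters in README.md (total_train_batch_size 64, 3 epochs), which a couple of lines of arithmetic confirm:

```python
# Cross-check train_results.json against the README hyperparameters.
samples, epochs, global_batch = 188_284, 3.0, 64
runtime_s = 48_940.1741

steps = samples * epochs / global_batch
print(f"{steps:,.0f} optimizer steps")                  # ~8,826; the eval log ends at step 8800
print(f"{samples * epochs / runtime_s:.3f} samples/s")  # 11.542, as logged
print(f"{steps / runtime_s:.3f} steps/s")               # 0.180, as logged
print(f"{runtime_s / 3600:.1f} h wall clock")           # ~13.6 h
```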
trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:10b6a54ccfba09c48898620df7dcd86dd4eda2d89bd8366e2829ca0b60e2ec86
+ size 5752