---
library_name: transformers
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
model-index:
- name: BERiT_2.0
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# BERiT_2.0

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.3491
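
This is presumably a masked-language-modeling checkpoint (RoBERTa's pretraining objective); if the reported loss is the usual MLM cross-entropy, it corresponds to a perplexity of roughly exp(4.3491) ≈ 77. A minimal usage sketch follows, assuming the checkpoint is published under a Hub id like `your-username/BERiT_2.0` (a placeholder, not the confirmed path) with a fill-mask head:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub path of this checkpoint.
fill_mask = pipeline("fill-mask", model="your-username/BERiT_2.0")

# RoBERTa-derived tokenizers use <mask> as the mask token; the example
# sentence is arbitrary, since the training data is not documented above.
for pred in fill_mask("The scribe <mask> the scroll."):
    print(f"{pred['token_str']!r}: {pred['score']:.4f}")
```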

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0008131878854370431
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 75
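
A minimal sketch of how the settings above map onto `transformers.TrainingArguments`; the output directory and the 500-step evaluation cadence are inferred from the results table below, and anything not in the list above is an assumption rather than a recorded value:

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the listed hyperparameters. output_dir and
# the eval cadence are assumptions; only the values above were logged.
training_args = TrainingArguments(
    output_dir="BERiT_2.0",              # assumed
    learning_rate=0.0008131878854370431,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",                 # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=75,
    eval_strategy="steps",
    eval_steps=500,                      # matches the cadence in the table below
)
```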

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-------:|:-----:|:---------------:|
| 6.7675 | 0.3873 | 500 | 6.5915 |
| 6.5119 | 0.7746 | 1000 | 6.3912 |
| 6.378 | 1.1619 | 1500 | 6.3244 |
| 6.3103 | 1.5492 | 2000 | 6.2670 |
| 6.3097 | 1.9365 | 2500 | 6.2378 |
| 6.2799 | 2.3238 | 3000 | 6.2406 |
| 6.2717 | 2.7111 | 3500 | 6.2725 |
| 6.2694 | 3.0984 | 4000 | 6.2770 |
| 6.2595 | 3.4857 | 4500 | 6.2695 |
| 6.2386 | 3.8730 | 5000 | 6.2952 |
| 6.2393 | 4.2603 | 5500 | 6.2538 |
| 6.2442 | 4.6476 | 6000 | 6.2138 |
| 6.2322 | 5.0349 | 6500 | 6.2639 |
| 6.2028 | 5.4222 | 7000 | 6.3072 |
| 6.2306 | 5.8095 | 7500 | 6.2069 |
| 6.2027 | 6.1967 | 8000 | 6.2065 |
| 6.2 | 6.5840 | 8500 | 6.2331 |
| 6.1877 | 6.9713 | 9000 | 6.2548 |
| 6.181 | 7.3586 | 9500 | 6.2235 |
| 6.1608 | 7.7459 | 10000 | 6.2316 |
| 6.1933 | 8.1332 | 10500 | 6.2513 |
| 6.187 | 8.5205 | 11000 | 6.1930 |
| 6.1734 | 8.9078 | 11500 | 6.2105 |
| 6.167 | 9.2951 | 12000 | 6.2032 |
| 6.148 | 9.6824 | 12500 | 6.2129 |
| 6.1411 | 10.0697 | 13000 | 6.2228 |
| 6.131 | 10.4570 | 13500 | 6.2537 |
| 6.1321 | 10.8443 | 14000 | 6.2499 |
| 6.1412 | 11.2316 | 14500 | 6.2424 |
| 6.1389 | 11.6189 | 15000 | 6.1810 |
| 6.1265 | 12.0062 | 15500 | 6.1974 |
| 6.1345 | 12.3935 | 16000 | 6.2404 |
| 6.1072 | 12.7808 | 16500 | 6.1847 |
| 6.1213 | 13.1681 | 17000 | 6.1444 |
| 6.1016 | 13.5554 | 17500 | 6.1880 |
| 6.1306 | 13.9427 | 18000 | 6.1328 |
| 6.108 | 14.3300 | 18500 | 6.1761 |
| 6.1137 | 14.7173 | 19000 | 6.1504 |
| 6.0994 | 15.1046 | 19500 | 6.1977 |
| 6.1026 | 15.4919 | 20000 | 6.1830 |
| 6.0999 | 15.8792 | 20500 | 6.1811 |
| 6.0978 | 16.2665 | 21000 | 6.1263 |
| 6.0877 | 16.6538 | 21500 | 6.1645 |
| 6.0917 | 17.0411 | 22000 | 6.1742 |
| 6.0973 | 17.4284 | 22500 | 6.1636 |
| 6.1098 | 17.8156 | 23000 | 6.2017 |
| 6.0828 | 18.2029 | 23500 | 6.1007 |
| 6.0999 | 18.5902 | 24000 | 6.0879 |
| 6.0953 | 18.9775 | 24500 | 6.1521 |
| 6.079 | 19.3648 | 25000 | 6.1391 |
| 6.0682 | 19.7521 | 25500 | 6.0962 |
| 6.058 | 20.1394 | 26000 | 6.0719 |
| 6.0643 | 20.5267 | 26500 | 6.1114 |
| 6.0498 | 20.9140 | 27000 | 6.1111 |
| 6.0665 | 21.3013 | 27500 | 6.1200 |
| 6.0825 | 21.6886 | 28000 | 6.0961 |
| 6.0369 | 22.0759 | 28500 | 6.1578 |
| 6.0512 | 22.4632 | 29000 | 6.0876 |
| 6.026 | 22.8505 | 29500 | 6.1211 |
| 6.0558 | 23.2378 | 30000 | 6.0837 |
| 6.0466 | 23.6251 | 30500 | 6.0552 |
| 6.0202 | 24.0124 | 31000 | 6.0906 |
| 6.0019 | 24.3997 | 31500 | 6.0580 |
| 6.0352 | 24.7870 | 32000 | 6.0521 |
| 5.9983 | 25.1743 | 32500 | 6.0701 |
| 6.0367 | 25.5616 | 33000 | 6.0859 |
| 6.0183 | 25.9489 | 33500 | 6.1353 |
| 5.9726 | 26.3362 | 34000 | 6.0918 |
| 5.982 | 26.7235 | 34500 | 6.0434 |
| 6.0261 | 27.1108 | 35000 | 6.0038 |
| 5.9818 | 27.4981 | 35500 | 6.0328 |
| 5.9659 | 27.8854 | 36000 | 6.0672 |
| 5.9835 | 28.2727 | 36500 | 6.0334 |
| 5.98 | 28.6600 | 37000 | 6.0673 |
| 5.9756 | 29.0473 | 37500 | 5.9969 |
| 5.979 | 29.4345 | 38000 | 6.0067 |
| 5.9728 | 29.8218 | 38500 | 6.0297 |
| 5.9596 | 30.2091 | 39000 | 5.9682 |
| 5.9866 | 30.5964 | 39500 | 6.0026 |
| 5.975 | 30.9837 | 40000 | 5.9987 |
| 5.9678 | 31.3710 | 40500 | 5.9919 |
| 5.9676 | 31.7583 | 41000 | 5.9807 |
| 5.9294 | 32.1456 | 41500 | 5.9629 |
| 5.9465 | 32.5329 | 42000 | 5.9608 |
| 5.9554 | 32.9202 | 42500 | 5.9522 |
| 5.9042 | 33.3075 | 43000 | 5.9674 |
| 5.9359 | 33.6948 | 43500 | 5.9959 |
| 5.9339 | 34.0821 | 44000 | 5.9914 |
| 5.9215 | 34.4694 | 44500 | 5.9134 |
| 5.8901 | 34.8567 | 45000 | 5.9219 |
| 5.9134 | 35.2440 | 45500 | 5.9305 |
| 5.9086 | 35.6313 | 46000 | 5.9433 |
| 5.9051 | 36.0186 | 46500 | 5.8672 |
| 5.8991 | 36.4059 | 47000 | 5.8599 |
| 5.8789 | 36.7932 | 47500 | 5.8966 |
| 5.892 | 37.1805 | 48000 | 5.8956 |
| 5.8591 | 37.5678 | 48500 | 5.8597 |
| 5.8855 | 37.9551 | 49000 | 5.8776 |
| 5.856 | 38.3424 | 49500 | 5.9281 |
| 5.838 | 38.7297 | 50000 | 5.8091 |
| 5.8556 | 39.1170 | 50500 | 5.7789 |
| 5.8527 | 39.5043 | 51000 | 5.7454 |
| 5.8172 | 39.8916 | 51500 | 5.7894 |
| 5.8249 | 40.2789 | 52000 | 5.7938 |
| 5.809 | 40.6662 | 52500 | 5.7688 |
| 5.8152 | 41.0534 | 53000 | 5.7233 |
| 5.7961 | 41.4407 | 53500 | 5.6899 |
| 5.7767 | 41.8280 | 54000 | 5.7165 |
| 5.7582 | 42.2153 | 54500 | 5.7639 |
| 5.7893 | 42.6026 | 55000 | 5.6803 |
| 5.7365 | 42.9899 | 55500 | 5.6790 |
| 5.7365 | 43.3772 | 56000 | 5.6499 |
| 5.7569 | 43.7645 | 56500 | 5.6215 |
| 5.7321 | 44.1518 | 57000 | 5.6148 |
| 5.7186 | 44.5391 | 57500 | 5.5600 |
| 5.701 | 44.9264 | 58000 | 5.5373 |
| 5.7009 | 45.3137 | 58500 | 5.5664 |
| 5.7 | 45.7010 | 59000 | 5.5163 |
| 5.677 | 46.0883 | 59500 | 5.4210 |
| 5.6673 | 46.4756 | 60000 | 5.3903 |
| 5.6297 | 46.8629 | 60500 | 5.3785 |
| 5.6222 | 47.2502 | 61000 | 5.3162 |
| 5.6181 | 47.6375 | 61500 | 5.2644 |
| 5.5784 | 48.0248 | 62000 | 5.2543 |
| 5.5799 | 48.4121 | 62500 | 5.2034 |
| 5.5509 | 48.7994 | 63000 | 5.1793 |
| 5.5665 | 49.1867 | 63500 | 5.1611 |
| 5.5418 | 49.5740 | 64000 | 5.1162 |
| 5.5094 | 49.9613 | 64500 | 5.0998 |
| 5.4983 | 50.3486 | 65000 | 5.0847 |
| 5.488 | 50.7359 | 65500 | 5.0962 |
| 5.4842 | 51.1232 | 66000 | 5.0385 |
| 5.4456 | 51.5105 | 66500 | 5.0509 |
| 5.4167 | 51.8978 | 67000 | 4.9671 |
| 5.4094 | 52.2851 | 67500 | 4.9199 |
| 5.4044 | 52.6723 | 68000 | 4.9520 |
| 5.3853 | 53.0596 | 68500 | 4.9233 |
| 5.388 | 53.4469 | 69000 | 4.8602 |
| 5.3735 | 53.8342 | 69500 | 4.8504 |
| 5.3755 | 54.2215 | 70000 | 4.8019 |
| 5.3352 | 54.6088 | 70500 | 4.8239 |
| 5.3469 | 54.9961 | 71000 | 4.8391 |
| 5.3198 | 55.3834 | 71500 | 4.7593 |
| 5.2901 | 55.7707 | 72000 | 4.7801 |
| 5.2921 | 56.1580 | 72500 | 4.7699 |
| 5.2942 | 56.5453 | 73000 | 4.7290 |
| 5.2615 | 56.9326 | 73500 | 4.7631 |
| 5.2719 | 57.3199 | 74000 | 4.7217 |
| 5.2724 | 57.7072 | 74500 | 4.7176 |
| 5.256 | 58.0945 | 75000 | 4.6826 |
| 5.2318 | 58.4818 | 75500 | 4.6467 |
| 5.2152 | 58.8691 | 76000 | 4.6603 |
| 5.2301 | 59.2564 | 76500 | 4.6464 |
| 5.2057 | 59.6437 | 77000 | 4.7027 |
| 5.2073 | 60.0310 | 77500 | 4.6243 |
| 5.1845 | 60.4183 | 78000 | 4.6194 |
| 5.1919 | 60.8056 | 78500 | 4.5877 |
| 5.1805 | 61.1929 | 79000 | 4.5641 |
| 5.1723 | 61.5802 | 79500 | 4.5644 |
| 5.1805 | 61.9675 | 80000 | 4.5706 |
| 5.1738 | 62.3548 | 80500 | 4.5631 |
| 5.1586 | 62.7421 | 81000 | 4.5652 |
| 5.1666 | 63.1294 | 81500 | 4.5658 |
| 5.1412 | 63.5167 | 82000 | 4.5540 |
| 5.1336 | 63.9040 | 82500 | 4.4969 |
| 5.1561 | 64.2912 | 83000 | 4.5313 |
| 5.1223 | 64.6785 | 83500 | 4.5508 |
| 5.1426 | 65.0658 | 84000 | 4.6013 |
| 5.1124 | 65.4531 | 84500 | 4.4723 |
| 5.1187 | 65.8404 | 85000 | 4.5116 |
| 5.1162 | 66.2277 | 85500 | 4.4991 |
| 5.0953 | 66.6150 | 86000 | 4.4395 |
| 5.1166 | 67.0023 | 86500 | 4.4828 |
| 5.0869 | 67.3896 | 87000 | 4.4650 |
| 5.0936 | 67.7769 | 87500 | 4.4560 |
| 5.0963 | 68.1642 | 88000 | 4.4724 |
| 5.1117 | 68.5515 | 88500 | 4.4756 |
| 5.0679 | 68.9388 | 89000 | 4.4347 |
| 5.0803 | 69.3261 | 89500 | 4.4484 |
| 5.0786 | 69.7134 | 90000 | 4.4027 |
| 5.0483 | 70.1007 | 90500 | 4.3975 |
| 5.0678 | 70.4880 | 91000 | 4.4025 |
| 5.053 | 70.8753 | 91500 | 4.4227 |
| 5.068 | 71.2626 | 92000 | 4.4204 |
| 5.0643 | 71.6499 | 92500 | 4.3634 |
| 5.0812 | 72.0372 | 93000 | 4.4053 |
| 5.0543 | 72.4245 | 93500 | 4.4241 |
| 5.0564 | 72.8118 | 94000 | 4.3448 |
| 5.0486 | 73.1991 | 94500 | 4.4480 |
| 5.0508 | 73.5864 | 95000 | 4.3254 |
| 5.0298 | 73.9737 | 95500 | 4.4201 |
| 5.0585 | 74.3610 | 96000 | 4.4283 |
| 5.0454 | 74.7483 | 96500 | 4.3491 |

### Framework versions

- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
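
A quick way to check that a local environment matches these versions (a convenience sketch, not part of the original card):

```python
# Compare installed package versions against the ones this card reports.
import datasets, tokenizers, torch, transformers

expected = {
    transformers: "4.47.1",
    torch: "2.5.1+cu121",
    datasets: "3.2.0",
    tokenizers: "0.21.0",
}
for mod, version in expected.items():
    status = "OK" if mod.__version__ == version else "MISMATCH"
    print(f"{mod.__name__}: installed {mod.__version__}, card reports {version} -> {status}")
```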