# collapse_gemma-2-2b_hs2_accumulate_iter14_sftsd0

This model is a fine-tuned version of [google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) on an unknown dataset. It achieves the following results on the evaluation set (a quick perplexity check follows the list):
- Loss: 1.0965
- Num Input Tokens Seen: 73303720
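
For context, and assuming the reported loss is the usual mean token-level cross-entropy (in nats), the corresponding evaluation perplexity can be recovered with `exp(loss)`; a minimal check:

```python
import math

eval_loss = 1.0965  # reported evaluation loss (assumed mean token-level cross-entropy, in nats)
perplexity = math.exp(eval_loss)
print(f"eval perplexity ≈ {perplexity:.2f}")  # ≈ 2.99
```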
## Model description

More information needed
## Intended uses & limitations

More information needed
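
The card does not document usage. As a hedged starting point, the checkpoint named in this card (`RylanSchaeffer/collapse_gemma-2-2b_hs2_accumulate_iter14_sftsd0`) should load like any other causal LM under the pinned Transformers version, assuming the weights are published in standard `transformers` format:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository name as listed in this card; adjust if the weights live elsewhere.
model_id = "RylanSchaeffer/collapse_gemma-2-2b_hs2_accumulate_iter14_sftsd0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # Gemma-2 checkpoints are commonly run in bfloat16
    device_map="auto",           # requires `accelerate`; drop for plain CPU loading
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```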
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 8e-06
- train_batch_size: 8
- eval_batch_size: 16
- seed: 0
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 1
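
These values map onto `transformers.TrainingArguments` roughly as shown below. This is a sketch rather than the exact training script (the dataset, model loading, and `Trainer` wiring are not documented in this card), and the effective batch size of 128 follows from 8 per-device examples × 16 accumulation steps on a single device (or the equivalent multi-device split):

```python
from transformers import TrainingArguments

# Hypothetical output directory; only the listed hyperparameters are taken from the card.
training_args = TrainingArguments(
    output_dir="collapse_gemma-2-2b_hs2_accumulate_iter14_sftsd0",
    learning_rate=8e-6,
    per_device_train_batch_size=8,   # 8 x 16 accumulation steps = total train batch size 128
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    seed=0,
    num_train_epochs=1,
    lr_scheduler_type="constant_with_warmup",
    warmup_ratio=0.05,
    adam_beta1=0.9,                  # Adam with betas=(0.9, 0.999) and epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```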
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|
No log | 0 | 0 | 1.3909 | 0 |
1.7418 | 0.0037 | 5 | 1.3896 | 283016 |
1.7344 | 0.0075 | 10 | 1.3773 | 553560 |
1.5807 | 0.0112 | 15 | 1.3466 | 829520 |
1.6918 | 0.0149 | 20 | 1.2969 | 1104416 |
1.4616 | 0.0186 | 25 | 1.2554 | 1378088 |
1.3628 | 0.0224 | 30 | 1.2300 | 1647616 |
1.3335 | 0.0261 | 35 | 1.2012 | 1927080 |
1.1335 | 0.0298 | 40 | 1.2042 | 2196256 |
0.993 | 0.0336 | 45 | 1.2222 | 2465744 |
0.8682 | 0.0373 | 50 | 1.2417 | 2740328 |
0.7051 | 0.0410 | 55 | 1.2756 | 3011840 |
0.6471 | 0.0447 | 60 | 1.3154 | 3294808 |
0.5933 | 0.0485 | 65 | 1.3214 | 3575912 |
0.3248 | 0.0522 | 70 | 1.3279 | 3843608 |
0.3982 | 0.0559 | 75 | 1.2968 | 4117384 |
0.3271 | 0.0597 | 80 | 1.2363 | 4389488 |
0.3801 | 0.0634 | 85 | 1.2423 | 4663112 |
0.2857 | 0.0671 | 90 | 1.2348 | 4941696 |
0.2346 | 0.0709 | 95 | 1.2208 | 5214992 |
0.2019 | 0.0746 | 100 | 1.2133 | 5490840 |
0.2757 | 0.0783 | 105 | 1.2069 | 5761232 |
0.1878 | 0.0820 | 110 | 1.2012 | 6036672 |
0.2866 | 0.0858 | 115 | 1.2081 | 6311104 |
0.2213 | 0.0895 | 120 | 1.2049 | 6582192 |
0.2466 | 0.0932 | 125 | 1.2016 | 6851072 |
0.1662 | 0.0970 | 130 | 1.1964 | 7125120 |
0.1657 | 0.1007 | 135 | 1.1856 | 7398056 |
0.2092 | 0.1044 | 140 | 1.1845 | 7661624 |
0.1373 | 0.1081 | 145 | 1.1874 | 7933568 |
0.2012 | 0.1119 | 150 | 1.1816 | 8207720 |
0.227 | 0.1156 | 155 | 1.1815 | 8482984 |
0.2314 | 0.1193 | 160 | 1.1859 | 8764080 |
0.241 | 0.1231 | 165 | 1.1763 | 9041088 |
0.1635 | 0.1268 | 170 | 1.1760 | 9312824 |
0.1613 | 0.1305 | 175 | 1.1713 | 9591032 |
0.1428 | 0.1342 | 180 | 1.1705 | 9861688 |
0.2154 | 0.1380 | 185 | 1.1701 | 10130584 |
0.1592 | 0.1417 | 190 | 1.1695 | 10396288 |
0.224 | 0.1454 | 195 | 1.1652 | 10671776 |
0.1585 | 0.1492 | 200 | 1.1631 | 10941376 |
0.112 | 0.1529 | 205 | 1.1633 | 11209336 |
0.202 | 0.1566 | 210 | 1.1670 | 11482168 |
0.2199 | 0.1604 | 215 | 1.1605 | 11758048 |
0.1858 | 0.1641 | 220 | 1.1644 | 12027648 |
0.1426 | 0.1678 | 225 | 1.1603 | 12308112 |
0.1395 | 0.1715 | 230 | 1.1623 | 12582808 |
0.2408 | 0.1753 | 235 | 1.1684 | 12862536 |
0.1774 | 0.1790 | 240 | 1.1578 | 13142488 |
0.1936 | 0.1827 | 245 | 1.1580 | 13416304 |
0.1581 | 0.1865 | 250 | 1.1678 | 13690168 |
0.166 | 0.1902 | 255 | 1.1600 | 13965104 |
0.2145 | 0.1939 | 260 | 1.1581 | 14241336 |
0.1921 | 0.1976 | 265 | 1.1561 | 14506704 |
0.1838 | 0.2014 | 270 | 1.1566 | 14780912 |
0.1328 | 0.2051 | 275 | 1.1549 | 15057048 |
0.1783 | 0.2088 | 280 | 1.1550 | 15327984 |
0.1569 | 0.2126 | 285 | 1.1573 | 15605984 |
0.1855 | 0.2163 | 290 | 1.1516 | 15870808 |
0.1537 | 0.2200 | 295 | 1.1498 | 16142480 |
0.2167 | 0.2237 | 300 | 1.1557 | 16420472 |
0.1634 | 0.2275 | 305 | 1.1514 | 16697824 |
0.2691 | 0.2312 | 310 | 1.1516 | 16975496 |
0.1153 | 0.2349 | 315 | 1.1555 | 17249888 |
0.0874 | 0.2387 | 320 | 1.1531 | 17515048 |
0.0916 | 0.2424 | 325 | 1.1491 | 17790120 |
0.1629 | 0.2461 | 330 | 1.1481 | 18064400 |
0.1972 | 0.2498 | 335 | 1.1478 | 18343624 |
0.1616 | 0.2536 | 340 | 1.1473 | 18611304 |
0.1283 | 0.2573 | 345 | 1.1443 | 18884856 |
0.1194 | 0.2610 | 350 | 1.1449 | 19160440 |
0.111 | 0.2648 | 355 | 1.1470 | 19436800 |
0.1609 | 0.2685 | 360 | 1.1447 | 19706952 |
0.1914 | 0.2722 | 365 | 1.1412 | 19984392 |
0.1571 | 0.2760 | 370 | 1.1420 | 20255328 |
0.1173 | 0.2797 | 375 | 1.1424 | 20523272 |
0.1878 | 0.2834 | 380 | 1.1454 | 20798864 |
0.1569 | 0.2871 | 385 | 1.1439 | 21071416 |
0.108 | 0.2909 | 390 | 1.1384 | 21353968 |
0.1343 | 0.2946 | 395 | 1.1433 | 21626600 |
0.1259 | 0.2983 | 400 | 1.1469 | 21908408 |
0.1876 | 0.3021 | 405 | 1.1397 | 22180952 |
0.1621 | 0.3058 | 410 | 1.1363 | 22455504 |
0.1721 | 0.3095 | 415 | 1.1370 | 22728632 |
0.2429 | 0.3132 | 420 | 1.1358 | 23003424 |
0.1479 | 0.3170 | 425 | 1.1357 | 23280088 |
0.1218 | 0.3207 | 430 | 1.1355 | 23561256 |
0.171 | 0.3244 | 435 | 1.1324 | 23832176 |
0.1367 | 0.3282 | 440 | 1.1319 | 24107944 |
0.1099 | 0.3319 | 445 | 1.1356 | 24373464 |
0.1744 | 0.3356 | 450 | 1.1349 | 24648584 |
0.1486 | 0.3393 | 455 | 1.1336 | 24925920 |
0.1815 | 0.3431 | 460 | 1.1322 | 25200000 |
0.1402 | 0.3468 | 465 | 1.1302 | 25470952 |
0.1077 | 0.3505 | 470 | 1.1305 | 25749736 |
0.1256 | 0.3543 | 475 | 1.1305 | 26022968 |
0.1368 | 0.3580 | 480 | 1.1341 | 26290072 |
0.1321 | 0.3617 | 485 | 1.1306 | 26557864 |
0.0958 | 0.3655 | 490 | 1.1287 | 26826976 |
0.1143 | 0.3692 | 495 | 1.1321 | 27103304 |
0.1883 | 0.3729 | 500 | 1.1318 | 27377152 |
0.1728 | 0.3766 | 505 | 1.1284 | 27648216 |
0.1553 | 0.3804 | 510 | 1.1301 | 27920104 |
0.1078 | 0.3841 | 515 | 1.1292 | 28194304 |
0.1122 | 0.3878 | 520 | 1.1282 | 28460384 |
0.1518 | 0.3916 | 525 | 1.1265 | 28730120 |
0.1693 | 0.3953 | 530 | 1.1257 | 29013600 |
0.1261 | 0.3990 | 535 | 1.1276 | 29285648 |
0.1602 | 0.4027 | 540 | 1.1281 | 29560576 |
0.1976 | 0.4065 | 545 | 1.1239 | 29836608 |
0.1456 | 0.4102 | 550 | 1.1254 | 30110856 |
0.1291 | 0.4139 | 555 | 1.1283 | 30383200 |
0.1062 | 0.4177 | 560 | 1.1262 | 30654032 |
0.1443 | 0.4214 | 565 | 1.1252 | 30928624 |
0.159 | 0.4251 | 570 | 1.1237 | 31204376 |
0.1522 | 0.4288 | 575 | 1.1256 | 31473912 |
0.1325 | 0.4326 | 580 | 1.1257 | 31745176 |
0.1278 | 0.4363 | 585 | 1.1237 | 32019632 |
0.1503 | 0.4400 | 590 | 1.1223 | 32296064 |
0.1548 | 0.4438 | 595 | 1.1221 | 32564024 |
0.1548 | 0.4475 | 600 | 1.1231 | 32843904 |
0.0875 | 0.4512 | 605 | 1.1213 | 33115728 |
0.2076 | 0.4549 | 610 | 1.1228 | 33379992 |
0.145 | 0.4587 | 615 | 1.1275 | 33648856 |
0.1648 | 0.4624 | 620 | 1.1246 | 33921360 |
0.1347 | 0.4661 | 625 | 1.1196 | 34197528 |
0.1155 | 0.4699 | 630 | 1.1219 | 34474280 |
0.1555 | 0.4736 | 635 | 1.1243 | 34750568 |
0.2262 | 0.4773 | 640 | 1.1218 | 35027704 |
0.1241 | 0.4811 | 645 | 1.1164 | 35306024 |
0.1101 | 0.4848 | 650 | 1.1171 | 35585928 |
0.1566 | 0.4885 | 655 | 1.1211 | 35858232 |
0.1489 | 0.4922 | 660 | 1.1225 | 36137264 |
0.1649 | 0.4960 | 665 | 1.1209 | 36419544 |
0.1673 | 0.4997 | 670 | 1.1192 | 36694016 |
0.2157 | 0.5034 | 675 | 1.1203 | 36971624 |
0.1412 | 0.5072 | 680 | 1.1190 | 37243512 |
0.1234 | 0.5109 | 685 | 1.1167 | 37514848 |
0.1092 | 0.5146 | 690 | 1.1195 | 37793992 |
0.1482 | 0.5183 | 695 | 1.1173 | 38067336 |
0.1382 | 0.5221 | 700 | 1.1154 | 38337264 |
0.073 | 0.5258 | 705 | 1.1176 | 38611096 |
0.0883 | 0.5295 | 710 | 1.1179 | 38885272 |
0.1614 | 0.5333 | 715 | 1.1153 | 39164016 |
0.256 | 0.5370 | 720 | 1.1140 | 39441896 |
0.1286 | 0.5407 | 725 | 1.1153 | 39712960 |
0.1444 | 0.5444 | 730 | 1.1142 | 39988216 |
0.0978 | 0.5482 | 735 | 1.1126 | 40267432 |
0.207 | 0.5519 | 740 | 1.1147 | 40539640 |
0.142 | 0.5556 | 745 | 1.1141 | 40813632 |
0.1309 | 0.5594 | 750 | 1.1112 | 41090952 |
0.1359 | 0.5631 | 755 | 1.1110 | 41363768 |
0.2047 | 0.5668 | 760 | 1.1117 | 41637880 |
0.0918 | 0.5705 | 765 | 1.1113 | 41916320 |
0.1391 | 0.5743 | 770 | 1.1125 | 42182808 |
0.1413 | 0.5780 | 775 | 1.1121 | 42454312 |
0.1448 | 0.5817 | 780 | 1.1088 | 42727256 |
0.0924 | 0.5855 | 785 | 1.1123 | 42991592 |
0.0958 | 0.5892 | 790 | 1.1121 | 43263736 |
0.2293 | 0.5929 | 795 | 1.1092 | 43533520 |
0.1919 | 0.5967 | 800 | 1.1074 | 43810032 |
0.1968 | 0.6004 | 805 | 1.1104 | 44083072 |
0.1335 | 0.6041 | 810 | 1.1092 | 44358176 |
0.1386 | 0.6078 | 815 | 1.1073 | 44630776 |
0.1997 | 0.6116 | 820 | 1.1086 | 44907016 |
0.1504 | 0.6153 | 825 | 1.1098 | 45181896 |
0.1099 | 0.6190 | 830 | 1.1083 | 45459968 |
0.1573 | 0.6228 | 835 | 1.1059 | 45731720 |
0.1288 | 0.6265 | 840 | 1.1074 | 45999272 |
0.0939 | 0.6302 | 845 | 1.1095 | 46272896 |
0.1638 | 0.6339 | 850 | 1.1075 | 46544208 |
0.1544 | 0.6377 | 855 | 1.1072 | 46819456 |
0.1268 | 0.6414 | 860 | 1.1065 | 47093544 |
0.1984 | 0.6451 | 865 | 1.1071 | 47368296 |
0.0976 | 0.6489 | 870 | 1.1082 | 47638976 |
0.1267 | 0.6526 | 875 | 1.1072 | 47903216 |
0.1525 | 0.6563 | 880 | 1.1065 | 48174864 |
0.1437 | 0.6600 | 885 | 1.1068 | 48447112 |
0.1267 | 0.6638 | 890 | 1.1075 | 48722472 |
0.1772 | 0.6675 | 895 | 1.1066 | 49003736 |
0.1302 | 0.6712 | 900 | 1.1075 | 49277216 |
0.1853 | 0.6750 | 905 | 1.1063 | 49545896 |
0.1179 | 0.6787 | 910 | 1.1062 | 49817952 |
0.171 | 0.6824 | 915 | 1.1079 | 50095424 |
0.1804 | 0.6862 | 920 | 1.1065 | 50373240 |
0.1201 | 0.6899 | 925 | 1.1050 | 50640792 |
0.1739 | 0.6936 | 930 | 1.1055 | 50916680 |
0.1747 | 0.6973 | 935 | 1.1055 | 51191224 |
0.1357 | 0.7011 | 940 | 1.1074 | 51465120 |
0.1284 | 0.7048 | 945 | 1.1053 | 51739760 |
0.1788 | 0.7085 | 950 | 1.1043 | 52019336 |
0.1674 | 0.7123 | 955 | 1.1044 | 52292424 |
0.1234 | 0.7160 | 960 | 1.1051 | 52574464 |
0.1455 | 0.7197 | 965 | 1.1053 | 52858920 |
0.1842 | 0.7234 | 970 | 1.1039 | 53134392 |
0.1203 | 0.7272 | 975 | 1.1031 | 53398824 |
0.2038 | 0.7309 | 980 | 1.1038 | 53681752 |
0.089 | 0.7346 | 985 | 1.1039 | 53956512 |
0.1598 | 0.7384 | 990 | 1.1062 | 54231040 |
0.173 | 0.7421 | 995 | 1.1068 | 54497704 |
0.2212 | 0.7458 | 1000 | 1.1032 | 54772976 |
0.1095 | 0.7495 | 1005 | 1.1027 | 55046408 |
0.104 | 0.7533 | 1010 | 1.1036 | 55317296 |
0.169 | 0.7570 | 1015 | 1.1052 | 55592376 |
0.1172 | 0.7607 | 1020 | 1.1042 | 55862504 |
0.1173 | 0.7645 | 1025 | 1.1034 | 56135864 |
0.1468 | 0.7682 | 1030 | 1.1044 | 56406744 |
0.1661 | 0.7719 | 1035 | 1.1057 | 56679048 |
0.1231 | 0.7756 | 1040 | 1.1029 | 56947824 |
0.1461 | 0.7794 | 1045 | 1.1027 | 57214496 |
0.1456 | 0.7831 | 1050 | 1.1051 | 57483880 |
0.1537 | 0.7868 | 1055 | 1.1049 | 57751256 |
0.1324 | 0.7906 | 1060 | 1.1019 | 58017112 |
0.158 | 0.7943 | 1065 | 1.1018 | 58289176 |
0.1556 | 0.7980 | 1070 | 1.1033 | 58560144 |
0.0889 | 0.8018 | 1075 | 1.1036 | 58825040 |
0.0991 | 0.8055 | 1080 | 1.1044 | 59104120 |
0.1011 | 0.8092 | 1085 | 1.1035 | 59370400 |
0.1497 | 0.8129 | 1090 | 1.1025 | 59637008 |
0.1145 | 0.8167 | 1095 | 1.1021 | 59914392 |
0.1928 | 0.8204 | 1100 | 1.1047 | 60185320 |
0.1485 | 0.8241 | 1105 | 1.1043 | 60461024 |
0.117 | 0.8279 | 1110 | 1.1013 | 60739168 |
0.1275 | 0.8316 | 1115 | 1.1016 | 61007200 |
0.1062 | 0.8353 | 1120 | 1.1022 | 61276640 |
0.1414 | 0.8390 | 1125 | 1.1023 | 61547128 |
0.1873 | 0.8428 | 1130 | 1.1019 | 61821640 |
0.1528 | 0.8465 | 1135 | 1.1048 | 62095712 |
0.23 | 0.8502 | 1140 | 1.1050 | 62371136 |
0.0977 | 0.8540 | 1145 | 1.1007 | 62643344 |
0.141 | 0.8577 | 1150 | 1.1002 | 62911536 |
0.1885 | 0.8614 | 1155 | 1.1022 | 63181368 |
0.1442 | 0.8651 | 1160 | 1.1018 | 63460448 |
0.1554 | 0.8689 | 1165 | 1.0997 | 63730104 |
0.1222 | 0.8726 | 1170 | 1.1007 | 64000480 |
0.1444 | 0.8763 | 1175 | 1.1009 | 64273304 |
0.1472 | 0.8801 | 1180 | 1.0997 | 64552232 |
0.0976 | 0.8838 | 1185 | 1.0987 | 64826376 |
0.1177 | 0.8875 | 1190 | 1.1004 | 65109376 |
0.1193 | 0.8913 | 1195 | 1.1012 | 65384368 |
0.1753 | 0.8950 | 1200 | 1.1013 | 65653136 |
0.0992 | 0.8987 | 1205 | 1.1014 | 65929040 |
0.1479 | 0.9024 | 1210 | 1.1023 | 66204392 |
0.1088 | 0.9062 | 1215 | 1.1017 | 66479784 |
0.1608 | 0.9099 | 1220 | 1.0998 | 66757192 |
0.0723 | 0.9136 | 1225 | 1.1004 | 67035888 |
0.127 | 0.9174 | 1230 | 1.1008 | 67307640 |
0.1231 | 0.9211 | 1235 | 1.0990 | 67580056 |
0.1386 | 0.9248 | 1240 | 1.0987 | 67853608 |
0.1425 | 0.9285 | 1245 | 1.1002 | 68135272 |
0.1557 | 0.9323 | 1250 | 1.0989 | 68411824 |
0.0822 | 0.9360 | 1255 | 1.0985 | 68679576 |
0.1152 | 0.9397 | 1260 | 1.1000 | 68955072 |
0.2109 | 0.9435 | 1265 | 1.1004 | 69226304 |
0.0782 | 0.9472 | 1270 | 1.0988 | 69490936 |
0.1817 | 0.9509 | 1275 | 1.0970 | 69771680 |
0.1312 | 0.9546 | 1280 | 1.0970 | 70045152 |
0.1397 | 0.9584 | 1285 | 1.0981 | 70315248 |
0.1079 | 0.9621 | 1290 | 1.0990 | 70583760 |
0.0937 | 0.9658 | 1295 | 1.1004 | 70854824 |
0.119 | 0.9696 | 1300 | 1.0989 | 71131880 |
0.1107 | 0.9733 | 1305 | 1.0970 | 71409120 |
0.1593 | 0.9770 | 1310 | 1.0955 | 71676168 |
0.1217 | 0.9807 | 1315 | 1.0963 | 71948880 |
0.1601 | 0.9845 | 1320 | 1.0972 | 72222896 |
0.149 | 0.9882 | 1325 | 1.1001 | 72492056 |
0.2421 | 0.9919 | 1330 | 1.0995 | 72765872 |
0.1438 | 0.9957 | 1335 | 1.0971 | 73039664 |
0.1481 | 0.9994 | 1340 | 1.0965 | 73303720 |
### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1