---
base_model: sentence-transformers/all-MiniLM-L6-v2
datasets: []
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:1490
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Can you explain how to configure the credentials for authentication
to a remote MLflow tracking server in ZenML?
sentences:
- 'w_bucket=gs://my_bucket --provider=<YOUR_PROVIDER>You can pass other configurations
specific to the stack components as key-value arguments. If you don''t provide
a name, a random one is generated for you. For more information about how to work
use the CLI for this, please refer to the dedicated documentation section.
Authentication Methods
You need to configure the following credentials for authentication to a remote
MLflow tracking server:
tracking_uri: The URL pointing to the MLflow tracking server. If using an MLflow
Tracking Server managed by Databricks, then the value of this attribute should
be "databricks".
tracking_username: Username for authenticating with the MLflow tracking server.
tracking_password: Password for authenticating with the MLflow tracking server.
tracking_token (in place of tracking_username and tracking_password): Token for
authenticating with the MLflow tracking server.
tracking_insecure_tls (optional): Set to skip verifying the MLflow tracking server
SSL certificate.
databricks_host: The host of the Databricks workspace with the MLflow-managed
server to connect to. This is only required if the tracking_uri value is set to
"databricks". More information: Access the MLflow tracking server from outside
Databricks
Either tracking_token or tracking_username and tracking_password must be specified.
This option configures the credentials for the MLflow tracking service directly
as stack component attributes.
This is not recommended for production settings as the credentials won''t be stored
securely and will be clearly visible in the stack configuration.
# Register the MLflow experiment tracker
zenml experiment-tracker register mlflow_experiment_tracker --flavor=mlflow \
--tracking_uri=<URI> --tracking_token=<token>
# You can also register it like this:
# zenml experiment-tracker register mlflow_experiment_tracker --flavor=mlflow
\
# --tracking_uri=<URI> --tracking_username=<USERNAME> --tracking_password=<PASSWORD>
# Register and set a stack with the new experiment tracker'
- 'token_hex
token_hex(32)or:Copyopenssl rand -hex 32Important: If you configure encryption
for your SQL database secrets store, you should keep the ZENML_SECRETS_STORE_ENCRYPTION_KEY
value somewhere safe and secure, as it will always be required by the ZenML server
to decrypt the secrets in the database. If you lose the encryption key, you will
not be able to decrypt the secrets in the database and will have to reset them.
These configuration options are only relevant if you''re using the AWS Secrets
Manager as the secrets store backend.
ZENML_SECRETS_STORE_TYPE: Set this to aws in order to set this type of secret
store.
The AWS Secrets Store uses the ZenML AWS Service Connector under the hood to authenticate
with the AWS Secrets Manager API. This means that you can use any of the authentication
methods supported by the AWS Service Connector to authenticate with the AWS Secrets
Manager API.
"Version": "2012-10-17",
"Statement": [
"Sid": "ZenMLSecretsStore",
"Effect": "Allow",
"Action": [
"secretsmanager:CreateSecret",
"secretsmanager:GetSecretValue",
"secretsmanager:DescribeSecret",
"secretsmanager:PutSecretValue",
"secretsmanager:TagResource",
"secretsmanager:DeleteSecret"
],
"Resource": "arn:aws:secretsmanager:<AWS-region>:<AWS-account-id>:secret:zenml/*"
The following configuration options are supported:
ZENML_SECRETS_STORE_AUTH_METHOD: The AWS Service Connector authentication method
to use (e.g. secret-key or iam-role).
ZENML_SECRETS_STORE_AUTH_CONFIG: The AWS Service Connector configuration, in JSON
format (e.g. {"aws_access_key_id":"<aws-key-id>","aws_secret_access_key":"<aws-secret-key>","region":"<aws-region>"}).
Note: The remaining configuration options are deprecated and may be removed in
a future release. Instead, you should set the ZENML_SECRETS_STORE_AUTH_METHOD
and ZENML_SECRETS_STORE_AUTH_CONFIG variables to use the AWS Service Connector
authentication method.'
- 'tive Directory credentials or generic OIDC tokens.This authentication method
only requires a GCP workload identity external account JSON file that only contains
the configuration for the external account without any sensitive credentials.
It allows implementing a two layer authentication scheme that keeps the set of
permissions associated with implicit credentials down to the bare minimum and
grants permissions to the privilege-bearing GCP service account instead.
This authentication method can be used to authenticate to GCP services using credentials
from other cloud providers or identity providers. When used with workloads running
on AWS or Azure, it involves automatically picking up credentials from the AWS
IAM or Azure AD identity associated with the workload and using them to authenticate
to GCP services. This means that the result depends on the environment where the
ZenML server is deployed and is thus not fully reproducible.
When used with AWS or Azure implicit in-cloud authentication, this method may
constitute a security risk, because it can give users access to the identity (e.g.
AWS IAM role or Azure AD principal) implicitly associated with the environment
where the ZenML server is running. For this reason, all implicit authentication
methods are disabled by default and need to be explicitly enabled by setting the
ZENML_ENABLE_IMPLICIT_AUTH_METHODS environment variable or the helm chart enableImplicitAuthMethods
configuration option to true in the ZenML deployment.
By default, the GCP connector generates temporary OAuth 2.0 tokens from the external
account credentials and distributes them to clients. The tokens have a limited
lifetime of 1 hour. This behavior can be disabled by setting the generate_temporary_tokens
configuration option to False, in which case, the connector will distribute the
external account credentials JSON to clients instead (not recommended).'
- source_sentence: What is an example of a ZenML server YAML configuration file?
sentences:
- 'sing a type annotation.
Tuple vs multiple outputsIt is impossible for ZenML to detect whether you want
your step to have a single output artifact of type Tuple or multiple output artifacts
just by looking at the type annotation.
We use the following convention to differentiate between the two: When the return
statement is followed by a tuple literal (e.g. return 1, 2 or return (value_1,
value_2)) we treat it as a step with multiple outputs. All other cases are treated
as a step with a single output of type Tuple.
from zenml import step
from typing_extensions import Annotated
from typing import Tuple
# Single output artifact
@step
def my_step() -> Tuple[int, int]:
output_value = (0, 1)
return output_value
# Single output artifact with variable length
@step
def my_step(condition) -> Tuple[int, ...]:
if condition:
output_value = (0, 1)
else:
output_value = (0, 1, 2)
return output_value
# Single output artifact using the `Annotated` annotation
@step
def my_step() -> Annotated[Tuple[int, ...], "my_output"]:
return 0, 1
# Multiple output artifacts
@step
def my_step() -> Tuple[int, int]:
return 0, 1
# Not allowed: Variable length tuple annotation when using
# multiple output artifacts
@step
def my_step() -> Tuple[int, ...]:
return 0, 1
Step output names
By default, ZenML uses the output name output for single output steps and output_0,
output_1, ... for steps with multiple outputs. These output names are used to
display your outputs in the dashboard and fetch them after your pipeline is finished.
If you want to use custom output names for your steps, use the Annotated type
annotation:
from typing_extensions import Annotated # or `from typing import Annotated on
Python 3.9+
from typing import Tuple
from zenml import step
@step
def square_root(number: int) -> Annotated[float, "custom_output_name"]:
return number ** 0.5
@step
def divide(a: int, b: int) -> Tuple[
Annotated[int, "quotient"],
Annotated[int, "remainder"]
]:
return a // b, a % b'
- 'HyperAI Orchestrator
Orchestrating your pipelines to run on HyperAI.ai instances.
HyperAI is a cutting-edge cloud compute platform designed to make AI accessible
for everyone. The HyperAI orchestrator is an orchestrator flavor that allows you
to easily deploy your pipelines on HyperAI instances.
This component is only meant to be used within the context of a remote ZenML deployment
scenario. Usage with a local ZenML deployment may lead to unexpected behavior!
When to use it
You should use the HyperAI orchestrator if:
you''re looking for a managed solution for running your pipelines.
you''re a HyperAI customer.
Prerequisites
You will need to do the following to start using the HyperAI orchestrator:
Have a running HyperAI instance. It must be accessible from the internet (or at
least from the IP addresses of your ZenML users) and allow SSH key based access
(passwords are not supported).
Ensure that a recent version of Docker is installed. This version must include
Docker Compose, meaning that the command docker compose works.
Ensure that the appropriate NVIDIA Driver is installed on the HyperAI instance
(if not already installed by the HyperAI team).
Ensure that the NVIDIA Container Toolkit is installed and configured on the HyperAI
instance.
Note that it is possible to omit installing the NVIDIA Driver and NVIDIA Container
Toolkit. However, you will then be unable to use the GPU from within your ZenML
pipeline. Additionally, you will then need to disable GPU access within the container
when configuring the Orchestrator component, or the pipeline will not start correctly.
How it works'
- 'fied, or a string, in which case it must be a path# to a CA certificate bundle
to use or the CA bundle value itself
verify_ssl:
Here is an example of a ZenML server YAML configuration file:
url: https://ac8ef63af203226194a7725ee71d85a-7635928635.us-east-1.elb.amazonaws.com/zenml
verify_ssl: |
-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----
To disconnect from the current ZenML server and revert to using the local default
database, use the following command:
zenml disconnect
How does it work?
Here''s an architecture diagram that shows how the workflow looks like when you
do zenml deploy.
The deploy CLI makes use of a "recipe" inside the zenml-io/zenml repository to
deploy the server on the right cloud. Any configuration that you pass with the
CLI, is sent to the recipe as input variables.
PreviousDeploying ZenML
NextDeploy with Docker
Last updated 15 days ago'
- source_sentence: When should I update my service account name to ensure security?
sentences:
- 'y <SERVICE_ACCOUNT_NAME> update.
Important noticeEvery API key issued is a potential gateway to access your data,
secrets and infrastructure. It''s important to regularly rotate API keys and deactivate
or delete service accounts and API keys that are no longer needed.
PreviousConnect in with your User (interactive)
NextInteract with secrets
Last updated 15 days ago'
- 'Connect in with your User (interactive)
You can authenticate your clients with the ZenML Server using the ZenML CLI and
the web based login. This can be executed with the command:
zenml connect --url https://...
This command will start a series of steps to validate the device from where you
are connecting that will happen in your browser. You can choose whether to mark
your respective device as trusted or not. If you choose not to click Trust this
device, a 24-hour token will be issued for authentication services. Choosing to
trust the device will issue a 30-day token instead.
To see all devices you''ve permitted, use the following command:
zenml authorized-device list
Additionally, the following command allows you to more precisely inspect one of
these devices:
zenml authorized-device describe <DEVICE_ID>
For increased security, you can invalidate a token using the zenml device lock
command followed by the device ID. This helps provide an extra layer of security
and control over your devices.
zenml authorized-device lock <DEVICE_ID>
To keep things simple, we can summarize the steps:
Use the zenml connect --url command to start a device flow and connect to a zenml
server.
Choose whether to trust the device when prompted.
Check permitted devices with zenml devices list.
Invalidate a token with zenml device lock ....
Important notice
Using the ZenML CLI is a secure and comfortable way to interact with your ZenML
tenants. It''s important to always ensure that only trusted devices are used to
maintain security and privacy.
Don''t forget to manage your device trust levels regularly for optimal security.
Should you feel a device trust needs to be revoked, lock the device immediately.
Every token issued is a potential gateway to access your data, secrets and infrastructure.
PreviousConnect to a server
NextConnect with a Service Account
Last updated 19 days ago'
- '━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━┷━━━━━━━┷━━━━━━━━┛A lot more is hidden behind
a Service Connector Type than a name and a simple list of resource types. Before
using a Service Connector Type to configure a Service Connector, you probably
need to understand what it is, what it can offer and what are the supported authentication
methods and their requirements. All this can be accessed directly through the
CLI. Some examples are included here.
Showing information about the gcp Service Connector Type:
zenml service-connector describe-type gcp
Example Command Output
╔══════════════════════════════════════════════════════════════════════════════╗
║ 🔵 GCP Service Connector (connector type: gcp) ║
╚══════════════════════════════════════════════════════════════════════════════╝
Authentication methods:
🔒 implicit
🔒 user-account
🔒 service-account
🔒 oauth2-token
🔒 impersonation
Resource types:
🔵 gcp-generic
📦 gcs-bucket
🌀 kubernetes-cluster
🐳 docker-registry
Supports auto-configuration: True
Available locally: True
Available remotely: True
The ZenML GCP Service Connector facilitates the authentication and access to
managed GCP services and resources. These encompass a range of resources,
including GCS buckets, GCR container repositories and GKE clusters. The
connector provides support for various authentication methods, including GCP
user accounts, service accounts, short-lived OAuth 2.0 tokens and implicit
authentication.
To ensure heightened security measures, this connector always issues short-lived
OAuth 2.0 tokens to clients instead of long-lived credentials. Furthermore, it
includes automatic configuration and detection of credentials locally
configured through the GCP CLI.
This connector serves as a general means of accessing any GCP service by issuing
OAuth 2.0 credential objects to clients. Additionally, the connector can handle
specialized authentication for GCS, Docker and Kubernetes Python clients. It'
- source_sentence: Where can I find the instructions to clone the ZenML quickstart
repository and set up the stack?
sentences:
- 'into play when the component is ultimately in use.The design behind this interaction
lets us separate the configuration of the flavor from its implementation. This
way we can register flavors and components even when the major dependencies behind
their implementation are not installed in our local setting (assuming the CustomArtifactStoreFlavor
and the CustomArtifactStoreConfig are implemented in a different module/path than
the actual CustomArtifactStore).
Enabling Artifact Visualizations with Custom Artifact Stores
ZenML automatically saves visualizations for many common data types and allows
you to view these visualizations in the ZenML dashboard. Under the hood, this
works by saving the visualizations together with the artifacts in the artifact
store.
In order to load and display these visualizations, ZenML needs to be able to load
and access the corresponding artifact store. This means that your custom artifact
store needs to be configured in a way that allows authenticating to the back-end
without relying on the local environment, e.g., by embedding the authentication
credentials in the stack component configuration or by referencing a secret.
Furthermore, for deployed ZenML instances, you need to install the package dependencies
of your artifact store implementation in the environment where you have deployed
ZenML. See the Documentation on deploying ZenML with custom Docker images for
more information on how to do that.
PreviousAzure Blob Storage
NextContainer Registries
Last updated 19 days ago'
- 't_repository: str
user: Optional[str]
resources:cpu_count: Optional[PositiveFloat]
gpu_count: Optional[NonNegativeInt]
memory: Optional[ConstrainedStrValue]
step_operator: Optional[str]
success_hook_source:
attribute: Optional[str]
module: str
type: SourceType
train_model:
enable_artifact_metadata: Optional[bool]
enable_artifact_visualization: Optional[bool]
enable_cache: Optional[bool]
enable_step_logs: Optional[bool]
experiment_tracker: Optional[str]
extra: Mapping[str, Any]
failure_hook_source:
attribute: Optional[str]
module: str
type: SourceType
model:
audience: Optional[str]
description: Optional[str]
ethics: Optional[str]
license: Optional[str]
limitations: Optional[str]
name: str
save_models_to_registry: bool
suppress_class_validation_warnings: bool
tags: Optional[List[str]]
trade_offs: Optional[str]
use_cases: Optional[str]
version: Union[ModelStages, int, str, NoneType]
was_created_in_this_run: bool
name: Optional[str]
outputs: {}
parameters: {}
settings:
docker:
apt_packages: List[str]
build_context_root: Optional[str]
build_options: Mapping[str, Any]
copy_files: bool
copy_global_config: bool
dockerfile: Optional[str]
dockerignore: Optional[str]
environment: Mapping[str, Any]
install_stack_requirements: bool
parent_image: Optional[str]
python_package_installer: PythonPackageInstaller
replicate_local_python_environment: Union[List[str], PythonEnvironmentExportMethod,
NoneType]
required_hub_plugins: List[str]
required_integrations: List[str]
requirements: Union[NoneType, str, List[str]]
skip_build: bool
source_files: SourceFileMode
target_repository: str
user: Optional[str]
resources:
cpu_count: Optional[PositiveFloat]
gpu_count: Optional[NonNegativeInt]
memory: Optional[ConstrainedStrValue]
step_operator: Optional[str]
success_hook_source:
attribute: Optional[str]
module: str
type: SourceType'
- 'as the ZenML quickstart. You can clone it like so:git clone --depth 1 git@github.com:zenml-io/zenml.git
cd zenml/examples/quickstart
pip install -r requirements.txt
zenml init
To run a pipeline using the new stack:
Set the stack as active on your clientCopyzenml stack set a_new_local_stack
Run your pipeline code:Copypython run.py --training-pipeline
Keep this code handy as we''ll be using it in the next chapters!
PreviousDeploying ZenML
NextConnecting remote storage
Last updated 19 days ago'
- source_sentence: How do I register and connect an S3 artifact store in ZenML using
the interactive mode?
sentences:
- 'hich Resource Name to use in the interactive mode:zenml artifact-store register
s3-zenfiles --flavor s3 --path=s3://zenfiles
zenml service-connector list-resources --resource-type s3-bucket --resource-id
s3://zenfiles
zenml artifact-store connect s3-zenfiles --connector aws-multi-type
Example Command Output
$ zenml artifact-store register s3-zenfiles --flavor s3 --path=s3://zenfiles
Running with active workspace: ''default'' (global)
Running with active stack: ''default'' (global)
Successfully registered artifact_store `s3-zenfiles`.
$ zenml service-connector list-resources --resource-type s3-bucket --resource-id
zenfiles
The ''s3-bucket'' resource with name ''zenfiles'' can be accessed by service
connectors configured in your workspace:
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┓
┃ CONNECTOR ID │ CONNECTOR NAME │ CONNECTOR TYPE
│ RESOURCE TYPE │ RESOURCE NAMES ┃
┠──────────────────────────────────────┼──────────────────────┼────────────────┼───────────────┼────────────────┨
┃ 4a550c82-aa64-4a48-9c7f-d5e127d77a44 │ aws-multi-type │ 🔶 aws │
📦 s3-bucket │ s3://zenfiles ┃
┠──────────────────────────────────────┼──────────────────────┼────────────────┼───────────────┼────────────────┨
┃ 66c0922d-db84-4e2c-9044-c13ce1611613 │ aws-multi-instance │ 🔶 aws │
📦 s3-bucket │ s3://zenfiles ┃
┠──────────────────────────────────────┼──────────────────────┼────────────────┼───────────────┼────────────────┨
┃ 65c82e59-cba0-4a01-b8f6-d75e8a1d0f55 │ aws-single-instance │ 🔶 aws │
📦 s3-bucket │ s3://zenfiles ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┛
$ zenml artifact-store connect s3-zenfiles --connector aws-multi-type
Running with active workspace: ''default'' (global)
Running with active stack: ''default'' (global)
Successfully connected artifact store `s3-zenfiles` to the following resources:'
- '👣Step Operators
Executing individual steps in specialized environments.
The step operator enables the execution of individual pipeline steps in specialized
runtime environments that are optimized for certain workloads. These specialized
environments can give your steps access to resources like GPUs or distributed
processing frameworks like Spark.
Comparison to orchestrators: The orchestrator is a mandatory stack component that
is responsible for executing all steps of a pipeline in the correct order and
providing additional features such as scheduling pipeline runs. The step operator
on the other hand is used to only execute individual steps of the pipeline in
a separate environment in case the environment provided by the orchestrator is
not feasible.
When to use it
A step operator should be used if one or more steps of a pipeline require resources
that are not available in the runtime environments provided by the orchestrator.
An example would be a step that trains a computer vision model and requires a
GPU to run in a reasonable time, combined with a Kubeflow orchestrator running
on a Kubernetes cluster that does not contain any GPU nodes. In that case, it
makes sense to include a step operator like SageMaker, Vertex, or AzureML to execute
the training step with a GPU.
Step Operator Flavors
Step operators to execute steps on one of the big cloud providers are provided
by the following ZenML integrations:
Step Operator Flavor Integration Notes SageMaker sagemaker aws Uses SageMaker
to execute steps Vertex vertex gcp Uses Vertex AI to execute steps AzureML azureml
azure Uses AzureML to execute steps Spark spark spark Uses Spark on Kubernetes
to execute steps in a distributed manner Custom Implementation custom Extend the
step operator abstraction and provide your own implementation
If you would like to see the available flavors of step operators, you can use
the command:
zenml step-operator flavor list
How to use it'
- 'Azure Container Registry
Storing container images in Azure.
The Azure container registry is a container registry flavor that comes built-in
with ZenML and uses the Azure Container Registry to store container images.
When to use it
You should use the Azure container registry if:
one or more components of your stack need to pull or push container images.
you have access to Azure. If you''re not using Azure, take a look at the other
container registry flavors.
How to deploy it
Go here and choose a subscription, resource group, location, and registry name.
Then click on Review + Create and to create your container registry.
How to find the registry URI
The Azure container registry URI should have the following format:
<REGISTRY_NAME>.azurecr.io
# Examples:
zenmlregistry.azurecr.io
myregistry.azurecr.io
To figure out the URI for your registry:
Go to the Azure portal.
In the search bar, enter container registries and select the container registry
you want to use. If you don''t have any container registries yet, check out the
deployment section on how to create one.
Use the name of your registry to fill the template <REGISTRY_NAME>.azurecr.io
and get your URI.
How to use it
To use the Azure container registry, we need:
Docker installed and running.
The registry URI. Check out the previous section on the URI format and how to
get the URI for your registry.
We can then register the container registry and use it in our active stack:
zenml container-registry register <NAME> \
--flavor=azure \
--uri=<REGISTRY_URI>
# Add the container registry to the active stack
zenml stack update -c <NAME>
You also need to set up authentication required to log in to the container registry.
Authentication Methods'
model-index:
- name: zenml/finetuned-all-MiniLM-L6-v2
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 384
type: dim_384
metrics:
- type: cosine_accuracy@1
value: 0.3132530120481928
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6144578313253012
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7168674698795181
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7891566265060241
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3132530120481928
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.20481927710843373
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1433734939759036
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.0789156626506024
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3132530120481928
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6144578313253012
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7168674698795181
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7891566265060241
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5579120329651274
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.48292933639319197
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4907452723782479
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.2891566265060241
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6144578313253012
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7108433734939759
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7650602409638554
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.2891566265060241
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.20481927710843373
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.14216867469879516
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07650602409638553
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.2891566265060241
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.6144578313253012
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7108433734939759
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7650602409638554
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5394043126982406
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.46553595333715836
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4739275972429515
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.28313253012048195
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5481927710843374
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6506024096385542
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7168674698795181
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.28313253012048195
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1827309236947791
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1301204819277108
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07168674698795179
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.28313253012048195
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5481927710843374
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6506024096385542
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7168674698795181
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5067699591037801
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.43858529355517323
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.44791284428498435
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.24096385542168675
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.46987951807228917
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5843373493975904
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6807228915662651
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.24096385542168675
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1566265060240964
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11686746987951806
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06807228915662648
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.24096385542168675
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.46987951807228917
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5843373493975904
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6807228915662651
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.45307543718220417
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.3806679097341751
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.389050349953244
name: Cosine Map@100
---
# zenml/finetuned-all-MiniLM-L6-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) <!-- at revision 8b3219a92973c328a8e22fadcfa821b5dc75636a -->
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("zenml/finetuned-all-MiniLM-L6-v2")
# Run inference
sentences = [
'How do I register and connect an S3 artifact store in ZenML using the interactive mode?',
"hich Resource Name to use in the interactive mode:zenml artifact-store register s3-zenfiles --flavor s3 --path=s3://zenfiles\n\nzenml service-connector list-resources --resource-type s3-bucket --resource-id s3://zenfiles\n\nzenml artifact-store connect s3-zenfiles --connector aws-multi-type\n\nExample Command Output\n\n$ zenml artifact-store register s3-zenfiles --flavor s3 --path=s3://zenfiles\n\nRunning with active workspace: 'default' (global)\n\nRunning with active stack: 'default' (global)\n\nSuccessfully registered artifact_store `s3-zenfiles`.\n\n$ zenml service-connector list-resources --resource-type s3-bucket --resource-id zenfiles\n\nThe 's3-bucket' resource with name 'zenfiles' can be accessed by service connectors configured in your workspace:\n\n┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┓\n\n┃ CONNECTOR ID │ CONNECTOR NAME │ CONNECTOR TYPE │ RESOURCE TYPE │ RESOURCE NAMES ┃\n\n┠──────────────────────────────────────┼──────────────────────┼────────────────┼───────────────┼────────────────┨\n\n┃ 4a550c82-aa64-4a48-9c7f-d5e127d77a44 │ aws-multi-type │ 🔶 aws │ 📦 s3-bucket │ s3://zenfiles ┃\n\n┠──────────────────────────────────────┼──────────────────────┼────────────────┼───────────────┼────────────────┨\n\n┃ 66c0922d-db84-4e2c-9044-c13ce1611613 │ aws-multi-instance │ 🔶 aws │ 📦 s3-bucket │ s3://zenfiles ┃\n\n┠──────────────────────────────────────┼──────────────────────┼────────────────┼───────────────┼────────────────┨\n\n┃ 65c82e59-cba0-4a01-b8f6-d75e8a1d0f55 │ aws-single-instance │ 🔶 aws │ 📦 s3-bucket │ s3://zenfiles ┃\n\n┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┛\n\n$ zenml artifact-store connect s3-zenfiles --connector aws-multi-type\n\nRunning with active workspace: 'default' (global)\n\nRunning with active stack: 'default' (global)\n\nSuccessfully connected artifact store `s3-zenfiles` to the following resources:",
"Azure Container Registry\n\nStoring container images in Azure.\n\nThe Azure container registry is a container registry flavor that comes built-in with ZenML and uses the Azure Container Registry to store container images.\n\nWhen to use it\n\nYou should use the Azure container registry if:\n\none or more components of your stack need to pull or push container images.\n\nyou have access to Azure. If you're not using Azure, take a look at the other container registry flavors.\n\nHow to deploy it\n\nGo here and choose a subscription, resource group, location, and registry name. Then click on Review + Create and to create your container registry.\n\nHow to find the registry URI\n\nThe Azure container registry URI should have the following format:\n\n<REGISTRY_NAME>.azurecr.io\n\n# Examples:\n\nzenmlregistry.azurecr.io\n\nmyregistry.azurecr.io\n\nTo figure out the URI for your registry:\n\nGo to the Azure portal.\n\nIn the search bar, enter container registries and select the container registry you want to use. If you don't have any container registries yet, check out the deployment section on how to create one.\n\nUse the name of your registry to fill the template <REGISTRY_NAME>.azurecr.io and get your URI.\n\nHow to use it\n\nTo use the Azure container registry, we need:\n\nDocker installed and running.\n\nThe registry URI. Check out the previous section on the URI format and how to get the URI for your registry.\n\nWe can then register the container registry and use it in our active stack:\n\nzenml container-registry register <NAME> \\\n\n--flavor=azure \\\n\n--uri=<REGISTRY_URI>\n\n# Add the container registry to the active stack\n\nzenml stack update -c <NAME>\n\nYou also need to set up authentication required to log in to the container registry.\n\nAuthentication Methods",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
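Because this model was trained with `MatryoshkaLoss` and evaluated at 384, 256, 128, and 64 dimensions (see the metrics below), its embeddings can be truncated to a smaller dimensionality for faster retrieval at a modest accuracy cost. A minimal sketch, assuming sentence-transformers 2.7+ (which introduced the `truncate_dim` argument); the example sentences are illustrative:

```python
from sentence_transformers import SentenceTransformer

# Load the model with embeddings truncated to one of the Matryoshka
# dimensions reported in the evaluation section (384, 256, 128, or 64).
model = SentenceTransformer("zenml/finetuned-all-MiniLM-L6-v2", truncate_dim=128)

embeddings = model.encode([
    "How do I register and connect an S3 artifact store in ZenML?",
    "The step operator executes individual pipeline steps in specialized environments.",
])
print(embeddings.shape)  # (2, 128)

# Cosine similarity still works on the truncated vectors
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [2, 2]
```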
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_384`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.3133 |
| cosine_accuracy@3 | 0.6145 |
| cosine_accuracy@5 | 0.7169 |
| cosine_accuracy@10 | 0.7892 |
| cosine_precision@1 | 0.3133 |
| cosine_precision@3 | 0.2048 |
| cosine_precision@5 | 0.1434 |
| cosine_precision@10 | 0.0789 |
| cosine_recall@1 | 0.3133 |
| cosine_recall@3 | 0.6145 |
| cosine_recall@5 | 0.7169 |
| cosine_recall@10 | 0.7892 |
| cosine_ndcg@10 | 0.5579 |
| cosine_mrr@10 | 0.4829 |
| **cosine_map@100** | **0.4907** |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.2892 |
| cosine_accuracy@3 | 0.6145 |
| cosine_accuracy@5 | 0.7108 |
| cosine_accuracy@10 | 0.7651 |
| cosine_precision@1 | 0.2892 |
| cosine_precision@3 | 0.2048 |
| cosine_precision@5 | 0.1422 |
| cosine_precision@10 | 0.0765 |
| cosine_recall@1 | 0.2892 |
| cosine_recall@3 | 0.6145 |
| cosine_recall@5 | 0.7108 |
| cosine_recall@10 | 0.7651 |
| cosine_ndcg@10 | 0.5394 |
| cosine_mrr@10 | 0.4655 |
| **cosine_map@100** | **0.4739** |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.2831 |
| cosine_accuracy@3 | 0.5482 |
| cosine_accuracy@5 | 0.6506 |
| cosine_accuracy@10 | 0.7169 |
| cosine_precision@1 | 0.2831 |
| cosine_precision@3 | 0.1827 |
| cosine_precision@5 | 0.1301 |
| cosine_precision@10 | 0.0717 |
| cosine_recall@1 | 0.2831 |
| cosine_recall@3 | 0.5482 |
| cosine_recall@5 | 0.6506 |
| cosine_recall@10 | 0.7169 |
| cosine_ndcg@10 | 0.5068 |
| cosine_mrr@10 | 0.4386 |
| **cosine_map@100** | **0.4479** |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.241 |
| cosine_accuracy@3 | 0.4699 |
| cosine_accuracy@5 | 0.5843 |
| cosine_accuracy@10 | 0.6807 |
| cosine_precision@1 | 0.241 |
| cosine_precision@3 | 0.1566 |
| cosine_precision@5 | 0.1169 |
| cosine_precision@10 | 0.0681 |
| cosine_recall@1 | 0.241 |
| cosine_recall@3 | 0.4699 |
| cosine_recall@5 | 0.5843 |
| cosine_recall@10 | 0.6807 |
| cosine_ndcg@10 | 0.4531 |
| cosine_mrr@10 | 0.3807 |
| **cosine_map@100** | **0.3891** |
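The tables above come from the `InformationRetrievalEvaluator` linked in each subsection. A hedged sketch of how a comparable evaluation can be run on your own data — the queries, corpus, and relevance judgments below are illustrative placeholders, not the evaluation set used for this card:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("zenml/finetuned-all-MiniLM-L6-v2")

# Placeholder data: map query ids to query text, corpus ids to documents,
# and each query id to the set of relevant corpus ids.
queries = {"q1": "How do I register an S3 artifact store in ZenML?"}
corpus = {
    "d1": "zenml artifact-store register s3-zenfiles --flavor s3 --path=s3://zenfiles",
    "d2": "The Azure container registry stores container images in Azure.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_384",
)
results = evaluator(model)
# Reports accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
# (the return format varies across sentence-transformers versions).
print(results)
```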
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 1,490 training samples
* Columns: <code>positive</code> and <code>anchor</code>
* Approximate statistics based on the first 1000 samples:
| | positive | anchor |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 21.23 tokens</li><li>max: 49 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 237.64 tokens</li><li>max: 256 tokens</li></ul> |
* Samples:
| positive | anchor |
|:---------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>How can you leverage MLflow for tracking and visualizing experiment results in ZenML?</code> | <code>MLflow<br><br>Logging and visualizing experiments with MLflow.<br><br>The MLflow Experiment Tracker is an Experiment Tracker flavor provided with the MLflow ZenML integration that uses the MLflow tracking service to log and visualize information from your pipeline steps (e.g. models, parameters, metrics).<br><br>When would you want to use it?<br><br>MLflow Tracking is a very popular tool that you would normally use in the iterative ML experimentation phase to track and visualize experiment results. That doesn't mean that it cannot be repurposed to track and visualize the results produced by your automated pipeline runs, as you make the transition toward a more production-oriented workflow.<br><br>You should use the MLflow Experiment Tracker:<br><br>if you have already been using MLflow to track experiment results for your project and would like to continue doing so as you are incorporating MLOps workflows and best practices in your project through ZenML.<br><br>if you are looking for a more visually interactive way of navigating the results produced from your ZenML pipeline runs (e.g. models, metrics, datasets)<br><br>if you or your team already have a shared MLflow Tracking service deployed somewhere on-premise or in the cloud, and you would like to connect ZenML to it to share the artifacts and metrics logged by your pipelines<br><br>You should consider one of the other Experiment Tracker flavors if you have never worked with MLflow before and would rather use another experiment tracking tool that you are more familiar with.<br><br>How do you deploy it?<br><br>The MLflow Experiment Tracker flavor is provided by the MLflow ZenML integration, you need to install it on your local machine to be able to register an MLflow Experiment Tracker and add it to your stack:<br><br>zenml integration install mlflow -y<br><br>The MLflow Experiment Tracker can be configured to accommodate the following MLflow deployment scenarios:</code> |
| <code>What are the required integrations for running pipelines with a Docker-based orchestrator in ZenML?</code> | <code>ctivated by installing the respective integration:Integration Materializer Handled Data Types Storage Format bentoml BentoMaterializer bentoml.Bento .bento deepchecks DeepchecksResultMateriailzer deepchecks.CheckResult , deepchecks.SuiteResult .json evidently EvidentlyProfileMaterializer evidently.Profile .json great_expectations GreatExpectationsMaterializer great_expectations.ExpectationSuite , great_expectations.CheckpointResult .json huggingface HFDatasetMaterializer datasets.Dataset , datasets.DatasetDict Directory huggingface HFPTModelMaterializer transformers.PreTrainedModel Directory huggingface HFTFModelMaterializer transformers.TFPreTrainedModel Directory huggingface HFTokenizerMaterializer transformers.PreTrainedTokenizerBase Directory lightgbm LightGBMBoosterMaterializer lgbm.Booster .txt lightgbm LightGBMDatasetMaterializer lgbm.Dataset .binary neural_prophet NeuralProphetMaterializer NeuralProphet .pt pillow PillowImageMaterializer Pillow.Image .PNG polars PolarsMaterializer pl.DataFrame , pl.Series .parquet pycaret PyCaretMaterializer Any sklearn , xgboost , lightgbm or catboost model .pkl pytorch PyTorchDataLoaderMaterializer torch.Dataset , torch.DataLoader .pt pytorch PyTorchModuleMaterializer torch.Module .pt scipy SparseMaterializer scipy.spmatrix .npz spark SparkDataFrameMaterializer pyspark.DataFrame .parquet spark SparkModelMaterializer pyspark.Transformer pyspark.Estimator tensorflow KerasMaterializer tf.keras.Model Directory tensorflow TensorflowDatasetMaterializer tf.Dataset Directory whylogs WhylogsMaterializer whylogs.DatasetProfileView .pb xgboost XgboostBoosterMaterializer xgb.Booster .json xgboost XgboostDMatrixMaterializer xgb.DMatrix .binary<br><br>If you are running pipelines with a Docker-based orchestrator, you need to specify the corresponding integration as required_integrations in the DockerSettings of your pipeline in order to have the integration materializer available inside your Docker container. See the pipeline configuration documentation for more information.</code> |
| <code>What is the difference between the stack component settings at registration time and runtime for ZenML?</code> | <code>ettings to specify AzureML step operator settings.Difference between stack component settings at registration-time vs real-time<br><br>For stack-component-specific settings, you might be wondering what the difference is between these and the configuration passed in while doing zenml stack-component register <NAME> --config1=configvalue --config2=configvalue, etc. The answer is that the configuration passed in at registration time is static and fixed throughout all pipeline runs, while the settings can change.<br><br>A good example of this is the MLflow Experiment Tracker, where configuration which remains static such as the tracking_url is sent through at registration time, while runtime configuration such as the experiment_name (which might change every pipeline run) is sent through as runtime settings.<br><br>Even though settings can be overridden at runtime, you can also specify default values for settings while configuring a stack component. For example, you could set a default value for the nested setting of your MLflow experiment tracker: zenml experiment-tracker register <NAME> --flavor=mlflow --nested=True<br><br>This means that all pipelines that run using this experiment tracker use nested MLflow runs unless overridden by specifying settings for the pipeline at runtime.<br><br>Using the right key for Stack-component-specific settings<br><br>When specifying stack-component-specific settings, a key needs to be passed. This key should always correspond to the pattern: <COMPONENT_CATEGORY>.<COMPONENT_FLAVOR><br><br>For example, the SagemakerStepOperator supports passing in estimator_args. The way to specify this would be to use the key step_operator.sagemaker<br><br>@step(step_operator="nameofstepoperator", settings= {"step_operator.sagemaker": {"estimator_args": {"instance_type": "m7g.medium"}}})<br><br>def my_step():<br><br>...<br><br># Using the class<br><br>@step(step_operator="nameofstepoperator", settings= {"step_operator.sagemaker": SagemakerStepOperatorSettings(instance_type="m7g.medium")})<br><br>def my_step():<br><br>...<br><br>or in YAML:<br><br>steps:<br><br>my_step:</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
384,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
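In code, this configuration corresponds to wrapping `MultipleNegativesRankingLoss` (which treats the other in-batch pairs as negatives) in `MatryoshkaLoss`, so the ranking objective is applied at each truncated embedding size. A minimal sketch; the base-model path is a placeholder:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("path/to/base-model")  # placeholder base model

inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[384, 256, 128, 64],  # each dim is trained and usable at inference
    matryoshka_weights=[1, 1, 1, 1],      # equal weight per dimension
    n_dims_per_step=-1,                   # -1 = train on every listed dim each step
)
```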
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
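These non-default values map directly onto `SentenceTransformerTrainingArguments`. A minimal sketch; `output_dir` and `save_strategy` are assumptions not listed above (`load_best_model_at_end=True` requires the save strategy to match the eval strategy), and `bf16`/`tf32` assume an Ampere-or-newer GPU:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",             # assumption: any local checkpoint directory
    eval_strategy="epoch",
    save_strategy="epoch",           # assumption: must match eval_strategy for load_best_model_at_end
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # effective train batch size of 32 * 16 = 512
    learning_rate=2e-5,
    num_train_epochs=4,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,                       # assumes a GPU with bfloat16 support
    tf32=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts per batch (important for MNRL)
)
```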
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: True
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
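Putting the pieces together, training with these hyperparameters follows the standard `SentenceTransformerTrainer` pattern. A compressed sketch that reuses the `loss` and `args` objects from the sketches above; the dataset literal is a hypothetical stand-in for the 1,490 (positive, anchor) pairs:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer

# Hypothetical stand-in for the real two-column pairs dataset described above.
train_dataset = Dataset.from_dict({
    "positive": ["How can you leverage MLflow for tracking experiments in ZenML?"],
    "anchor": ["MLflow. Logging and visualizing experiments with MLflow. ..."],
})

model = SentenceTransformer("path/to/base-model")  # placeholder

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,                   # the SentenceTransformerTrainingArguments sketched above
    train_dataset=train_dataset,
    loss=loss,                   # the MatryoshkaLoss sketched above
)
trainer.train()
```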
### Training Logs
| Epoch | Step | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_384_cosine_map@100 | dim_64_cosine_map@100 |
|:-------:|:-----:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
| 0.6667 | 1 | 0.4153 | 0.4312 | 0.4460 | 0.3779 |
| **2.0** | **3** | **0.4465** | **0.4643** | **0.4824** | **0.3832** |
| 2.6667 | 4 | 0.4479 | 0.4739 | 0.4907 | 0.3891 |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.3.1+cu121
- Accelerate: 0.31.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->