All right gang, David Shapiro here. One of the questions I get the most on Patreon and LinkedIn and everywhere else is: how do I make a QA chatbot? How do I answer specific questions, especially in high-risk situations, and I mean law, medicine, mental health, all kinds of stuff. Of course ChatGPT is out and you can do some things with it, but it's bounded, right? They have their own guardrails. With the ChatGPT API, though, we can do our own stuff, and natural language interfaces are certainly the way of the future, or at least the way of right now, for how we're going to interact with data and computers.

So I ran a little poll, and a very clear majority of people want QA chatbots. Now, I was lying in bed last night thinking, I want to accelerate longevity and regenerative medicine, because that's one of the things I'm really looking forward to. I've got an old shoulder injury, I'm getting old, and getting old sucks. I also just saw that Sam Altman invested 180 million dollars in this space, so I thought, what if I combine the two? Here's what we're going to do: we're going to walk through the whole process of using the latest and greatest AI tools to make even more AI tools that help accelerate longevity and regenerative medicine, and we're going to do it with the ChatGPT API. This is going to be a series; it'll take a while to unpack, and I'm also going to do a bit of real-time editing with pausing and so on.

Here's where I'm starting. In past experiments I'd often collect the data off camera and then forget how I got it, so this time I'm documenting how I gathered the data in the first place. I come over to Bing AI, because Bing Chat searches the internet and I can tell it what I'm looking for. Speaking of which, we're up to ten messages per conversation now, so that's good; it'll be better once they get us back to unlimited conversations, but we're going in the right direction.

I asked it for sources of primary research and suggested NIH and arXiv. Of course it just echoed NIH and arXiv back, but it also added Nature. Basically, I'm going to gather some sources, download some PDFs or other articles, and use those as my data. Obviously I will probably not succeed in advancing science on my own, but what I can do is demonstrate how, and then someone else can take it and put it into a product. Speaking of which, I have occasionally had people reach out and say they incorporated my work into their startup or their business. Please let me know what you're actually using my work for: one, it's just nice to know, but two, it helps me understand where I'm adding value. Because my whole goal here is not to make a ton of money; I'm here to help bring about post-scarcity, the Singularity, and all that fun stuff.

So anyway, I tell Bing: I'm specifically trying to accelerate longevity and regenerative medicine research with AI, so I'm looking for papers that can be integrated into automated NLP pipelines. I think the kinds of papers that will be most helpful are likely to include specific techniques, proteins, enzymes, and therapies. Does that help narrow it down? If I tell Bing what I'm doing, hopefully it will have a better understanding of what I'm trying to achieve and give me more specific results.

It points me at nature.com for regenerative medicine, which looks pretty good: there's plenty of open-access material. To respect the data, I'm only going to download open-access papers, and it's great that Nature offers them. I'm going to bookmark this to the favorites bar, because that's good information. Bing replies, "Yes, that helps a lot, thank you for sharing," and surfaces "Artificial intelligence in longevity medicine," which discusses... interesting. Oh, this one is only a preview, so I think we'll need to specify that we want open-access material.

Bing asks if I want it to show me how to download any of these papers. No; some of them are not open access, and I'm not looking for AI-and-aging papers, since I'm doing the AI part myself. I'm looking for therapeutic techniques and candidates. Try again. Yep: "therapeutic techniques and candidates for longevity and regenerative medicine." All it's doing is distilling my description down into a search query, which is actually a super easy thing to do with prompt engineering: you just say, take this paragraph and convert it into a Google search query, or in this case a Bing search query.

All right: regenerative medicine, The Lancet, that's a good one, "Envisioning future trends in regenerative medicine." Some of these... "Accept all cookies"... where did it go? Okay. This one is paywalled, and here's one thing I'm really skeptical about, because all of this gatekeeping and paywalling is actually going to slow down research, especially in the age of AI. That being said, okay, there is some value added by these publishers. So: "Let's focus on open-access sources of regenerative medicine research. Can you find any more, aside from arXiv and Nature?" Because, for example, this Medical Xpress result is a no-name site to me; I don't even know what it is.
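That query-distillation trick can be sketched in a few lines. The prompt wording and the function name here are my own illustration of the technique, not what Bing actually runs internally:

```python
# Sketch of "take this paragraph and convert it into a search query".
# The prompt text is an assumption; any instruction-following completion
# endpoint could consume it.

def build_query_prompt(paragraph: str) -> str:
    """Wrap a verbose description in an instruction to distill it."""
    return (
        "Convert the following paragraph into a concise web search query.\n\n"
        f"Paragraph: {paragraph}\n\n"
        "Search query:"
    )

if __name__ == "__main__":
    paragraph = (
        "I'm trying to accelerate longevity and regenerative medicine "
        "research with AI, looking for papers on therapeutic techniques "
        "and candidates."
    )
    prompt = build_query_prompt(paragraph)
    print(prompt)
    # To actually run it, send `prompt` to a completion endpoint and use
    # the model's reply as the search string.
```

The whole technique is just that wrapper: the model does the distillation, and you pass its output straight into the search box.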
Yeah, this is just someone restating something else and pointing back at Nature anyway. Okay, so it looks like Nature's open-access content is going to be our best source so far, and we can also use arXiv. Actually, if we go looking... oh, it looks like we want bioRxiv. Oh man, this website is like from the 90s. [Laughter] Okay, this is going to be a pain. Ouch. All right: neuroscience, molecular biology, immunology, genomics, genetics, cell biology, cancer biology, biochemistry, synthetic biology (that's fun), structural biology. A lot of this is not going to be directly relevant, which is interesting. Okay, sure: the Open Journal of Regenerative Medicine, that looks promising, plus Stem Cell Research and Regenerative Medicine, and npj Regenerative Medicine. There we go. Let's bookmark the Open Journal of Regenerative Medicine to the favorites bar, along with the open-access journal Stem Cell Research and Regenerative Medicine. Hmm, I'm not seeing many papers; it looks like they only publish a couple of issues per year. I don't know, this one looks low quality. Okay, well, we've got a couple, so let me bookmark bioRxiv as well. So if we come back here, we've got bioRxiv, the Open Journal of Regenerative Medicine, and Nature: three sources of data.

What I'll do now is not make you watch me download everything, but you've seen how I went about finding this information. I'll download a whole mess of this stuff, save it, and then we'll start to slice and dice it. All right: pause, download stuff, we'll be right back.

Okay, and we're back. I downloaded about 81 papers. Obviously 81 research papers on a variety of topics is not a really coherent search strategy; this is just a proof of concept, because even this is an intractable amount of material. Sure, a professional scientist will easily skim through more than this while performing a literature review, but imagine you've got 2,000 or 10,000 of these to sift through. Imagine that your citations page, instead of having 30 to 100 entries, has 30,000; we can do that automatically with AI, and we can find sources quickly through chatbots.

So that was step one, downloading the material, which I just showed you. Step two is converting it. I have this handy-dandy public repository right here, called document scraping, and it has a convert-PDF script in it. What we're going to do next, and I'll show you, it's really simple, is convert PDF to text. It grabs every PDF in the folder and dumps a text version out to the next folder, breaking it up by page; I just insert the marker NEW PAGE between pages. It's not a big deal, and it gives us a nice, easy way of splitting things back apart later. So I'll show you what it does: cd into the chatgpt_regenerative_medicine folder and run python step01_convert. It goes through and converts the PDFs one by one, dropping them here, and you can see the information coming out. Obviously this is not going to capture the graphics, and that's fine, because we don't have multimodal models yet; apparently GPT-4 is going to be multimodal, but at least this should give us all the information we need in terms of text. This is running and very much taking its time, so I'll pause again and show you the end result. Also, we're only at the 12-minute mark, so we'll see how much further we can get. We'll be right back.

All right, we've now got 81 text files, and I always check my data. Here's a problem: in some of the files, the lines have no spaces between words. Some files do, some don't; it's really weird. I don't know if this is a problem with pdfplumber or with the underlying PDFs themselves. This one, on the other hand, looks like it's formatted fine. Either way, I'm not going to delete anything.
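For reference, the step-01 conversion described a moment ago can be sketched like this. The folder names are assumptions based on the video, pdfplumber is the library in question, and its extract_text() can return None for image-only pages, hence the fallback:

```python
import os

def join_pages(pages, demarcator="NEW PAGE"):
    # Join page texts with a blank-line-wrapped demarcator so a later
    # step can split the file back into individual pages.
    return ("\n\n" + demarcator + "\n\n").join(pages)

def pdf_to_text(pdf_path):
    import pdfplumber  # imported lazily so the helpers work without it
    with pdfplumber.open(pdf_path) as pdf:
        # extract_text() may return None for image-only pages
        return join_pages([(page.extract_text() or "") for page in pdf.pages])

def convert_folder(src="papers_pdf", dst="papers_text"):
    os.makedirs(dst, exist_ok=True)
    for name in os.listdir(src):
        if name.lower().endswith(".pdf"):
            text = pdf_to_text(os.path.join(src, name))
            out_path = os.path.join(dst, name[:-4] + ".txt")
            with open(out_path, "w", encoding="utf-8") as f:
                f.write(text)
```

The literal NEW PAGE string is a deliberately low-tech demarcator: it survives plain-text storage and makes the later split a one-liner.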
Even when the formatting is botched like that, as it is in some cases, we still want the information, and GPT-3 can often still read it. Let me grab a mangled passage and show you what I mean. We paste it into the Playground with the instruction "fix the formatting issue," set the completion length up to 500 tokens, zoom out a little, and you can see it reads the text and adds the spaces back in, which means it can cope with the word-boundary problems. So yes, GPT-3 can understand it; honestly, it has probably been trained on a lot of malformed data like this. It's fine.

Now let's convert these text files into something usable. One advantage of having the NEW PAGE marker in there is that we can open each file and split it back into individual pages of text. We're going to have a folder called papers_json, and now I'll ask ChatGPT: "Write a Python script that opens all text files from the folder papers_text, splits them into a list of strings with the demarcator NEW PAGE, then takes each page and gets an embedding by passing the string to a function called gpt3_embedding. Finally, save all of that into a JSON file, one JSON file for each original text file. The JSON should have elements such as the original file name and a list of pages, where each page has a number (which order it was), the original text, and the embedding."

Okay, that should be enough. "Sure, here's a Python script." Wow, it's fast; I guess nobody else is using it right now. Pages, embeddings, gpt3_embed(page) for page in pages... oh wow, okay, that's fun. Yep: the original file name, then for each page the number (i + 1), the text, and embedding.tolist(), built with enumerate(zip(pages, embeddings)). Dang, I think this is it. [Laughter] Its gpt3_embedding function is wrong, but that's fine. Let's copy this out and get rid of the parts we don't need. So: for file_name in os.listdir(dir_path), where dir_path is here. We actually want that loop in a function, so: "That's great, but let's make the for file_name loop a function, and then call it from an if __name__ == '__main__' block." Hmm, now it's defining a function nested inside another function, which doesn't make any sense... fine. No, wait, stop: "Let's remove the gpt3_embedding function from being nested inside another function. That is not PEP 8 approved." At least I don't think it is. "Sure, here's an updated..." Yep. I'm going to ignore its version of that function anyway, because I wrote my own elsewhere. Okay: for this in that, et cetera, there we go.

Oh, one more problem: "The output directory needs to be specified as papers_json." And see, this is why I don't like using Copilot: you don't have a dialogue, it's just guessing what you want. Here I can just look at the result and ask, "Okay, why is it not allowing me to pass papers_json? Why is the output path hard-coded and not parameterized? Does that make any sense to you?" [Laughter] I'm way too passive-aggressive with this thing; I apologize. "There was an error generating a response." No, do not regenerate. Here's the thing: when you hit Regenerate and it just continues, the answer gets broken into two pieces. So instead I'll come back, remove the saltiness, and just say "please fix." "Good catch," and it starts over. OpenAI, maybe make that behavior a setting, but honestly, if there's an error I would rather it just save what it had and pipe that back in. Anyway, there we go, much better.

Let's copy this and come back over here. We don't need that function; basically, I copied a few functions from another script. Here's my embedding function. Because we're using data from the internet, I always force the text to ASCII: encode to ASCII, then decode. That fixes Unicode errors, because GPT-3 often does not do well with some forms of Unicode. I still haven't figured out exactly why; maybe it's not even an issue anymore, but if it is, this handles it. Then: if the output folder doesn't exist, make it; for each file name in os.listdir, if it ends with .txt, split it into pages, embed them, and build the output, so we get all the pages. Excellent. I'm just going to trust that this works. The vector it returns is not a numpy array, I believe, so the embedding.tolist() call will probably break on what is already a plain list; that's fine. Then we zip the pages and embeddings; interesting, okay, we'll see if that's formatted correctly. For saving, it just derives the output file name from the input name, which is good, but I already have a save_json helper that takes a file path and a payload, and it specifies ensure_ascii=False, sort_keys=True, and an indent, so it formats everything nice and pretty; we'll use that instead.
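Putting those pieces together, here is roughly what the cleaned-up step-02 script looks like. The embedding call is a sketch against the OpenAI Python library as it existed at the time (the engine name is an assumption), and the embedding function is injectable so everything else can be exercised without an API key:

```python
import json
import os

def split_pages(text, demarcator="NEW PAGE"):
    # Undo the step-01 join: one string per page, blanks dropped.
    return [p.strip() for p in text.split(demarcator) if p.strip()]

def gpt3_embedding(content, engine="text-embedding-ada-002"):
    import openai  # assumes openai.api_key has been set elsewhere
    # Force ASCII to sidestep the Unicode issues mentioned above.
    content = content.encode("ascii", errors="ignore").decode()
    resp = openai.Embedding.create(input=[content], engine=engine)
    return resp["data"][0]["embedding"]

def build_payload(filename, pages, embed_fn=gpt3_embedding):
    # The JSON layout requested in the video: original file name plus a
    # numbered list of pages, each with its text and embedding.
    return {
        "original_filename": filename,
        "pages": [
            {"page_number": i + 1, "text": page, "embedding": embed_fn(page)}
            for i, page in enumerate(pages)
        ],
    }

def save_json(filepath, payload):
    with open(filepath, "w", encoding="utf-8") as f:
        json.dump(payload, f, ensure_ascii=False, sort_keys=True, indent=2)

def process_folder(src="papers_text", dst="papers_json", embed_fn=gpt3_embedding):
    os.makedirs(dst, exist_ok=True)
    for name in os.listdir(src):
        if not name.endswith(".txt"):
            continue
        with open(os.path.join(src, name), "r", encoding="utf-8") as f:
            pages = split_pages(f.read())
        save_json(os.path.join(dst, name[:-4] + ".json"),
                  build_payload(name, pages, embed_fn))
```

Keeping the original file name in every payload is what makes source citation possible later: each embedded page traces back to one PDF.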
So we comment the generated save out and just call save_json. What's the argument order? File path, then payload; so the file path is this one, and the payload is the output dict. All right, that should be good: the if __name__ == '__main__' block and so on. Next I want to add a little debug output so I can see what it's doing. It's going to be messy no matter what, but let's print "content to embed" followed by the content, then "vector" followed by the vector, so we can watch each embedding as it goes. Save it, come over here, and run python step02. Hey, look, we had an issue: a UnicodeDecodeError in text = f.read(). Yep, here we go. I know what the problem is, but I'm going to ask ChatGPT anyway: "Got an error, can you fix it?" There we go. It's pretty simple: you specify that the encoding is UTF-8 when opening the file, and now it knows better. All right, let's try that again... no, that should not go that fast. The fact that it's finishing that quickly and not printing any vectors means something is failing.
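That UnicodeDecodeError fix amounts to a single argument. A minimal sketch, where errors="ignore" is my own extra safety net rather than something added in the video:

```python
def read_text(path):
    # An explicit UTF-8 decode avoids the UnicodeDecodeError that the
    # platform-default encoding raised on these downloaded papers.
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        return f.read()
```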
It's barfing somewhere around the embedding call, so let's comment out the error handling, because it's probably returning None; I had that wrapper left over from another project where some of the calls failed. (Always keep Notepad handy.) Sure enough: "embedding": null. You can see it was not embedding, so let's delete those outputs. It worked, mostly. Let me close these superfluous windows and see why it's blowing up. Oh: "You need an API key." What do you mean I need an API key? Here, let me fix this; I'll be right back. Okay, I think I fixed it: I added the key loading here, then copied over my .gitignore and API key file, so we should be good. Let's give it one last try. Also, a time check: we're at almost 30 minutes, so this will probably be it for today, but we're off to a pretty good start if I do say so myself.

All right, let's try some embeddings. It's still... oh, nope, it's just that fast. Look at that! So let's take a look at the output. For each page we get the embedding, the page number, and the text, and we keep the original file name as well. Using that, we can trace every page back to the original PDF, because we're going to need to be able to cite our sources. Cool, this is tearing through it; let me make sure it hasn't blown up. It's still going, and you can see that basically we're getting one embedding for every single page.

Now, there's a lot more we'll need to do to make this usable, because some of these papers are 80 pages long, which is way too much for the model to read at once. Searching based on the embedding will get you close, but then what else do you do? There are a lot of problems to solve, but the fact of the matter is that this is going to give us basically a super-advanced, chat-based scientific search engine. And because I'm also working on cognitive architectures, we can have it do a lot of thinking for you in the background. That's going to be the real game changer: not just a chatbot that lets you search, but a chatbot you can ask scientific questions and it will go think about the problem for you. I'm going to let this finish, stop the recording, and we'll come back tomorrow for part two. Thanks for watching.
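As a teaser for where part two is headed: "searching based on the embedding" means embedding the question the same way and ranking pages by cosine similarity. A minimal sketch in plain Python; the page dicts follow the JSON layout above, and top_pages is a name I made up for illustration:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_pages(query_embedding, pages, k=3):
    """Rank page dicts (with 'text' and 'embedding' keys) against a query."""
    scored = [(cosine_similarity(query_embedding, p["embedding"]), p)
              for p in pages]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [page for _, page in scored[:k]]
```

In practice you would embed the user's question with the same gpt3_embedding function used for the pages, retrieve the top few pages, and hand those to the chat model as context for its answer.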