davidshapiro_youtube_transcripts / PostSingularity Predictions How will our lives corporations and nations adapt to AI revolution_transcript.csv
Morning, everybody. David Shapiro here with a video. So, my video about AGI was super popular. I suppose I should have anticipated that, making a bold declaration like "AGI within 18 months." With that being said, the ramp-up to AGI, to ASI (superintelligence), and then to the Singularity seems to be accelerating, especially if you go by the comments on r/singularity and a few other places. People are asking: is this happening? Are we actually approaching the wall of exponential takeoff? So, assuming that is the case, and that we are ramping up to the Singularity within the coming months or years, let's explore how the Singularity will actually unfold.
First, we need to define "Singularity." What do we mean when we say it? There are obviously a lot of ways to define this thing, and the simplest definition I could come up with is this: the Singularity is when AI becomes orders of magnitude more intelligent than all humans combined. Basically, if current trends in AI research and AI capability continue, that's where we're headed. GPT-4 is already as intelligent as many people, and more intelligent than some; GPT-5 is being trained; there are open-source versions; et cetera. So if you haven't been living under a rock, you're probably aware of the rapid ramp-up of AI, and it's not showing any signs of slowing down. If anything, it's accelerating, because we're no longer looking at AI advancements on a monthly basis; we're looking at them on a weekly basis. I was actually at a meetup recently and someone pointed that out. They said, "I think we're already in the Singularity, because we're measuring advancements week to week, and soon we might be measuring them day to day."
Okay, so let's break this down. What are some of the macroeconomic changes we can expect to see with the Singularity?

First, we have to talk about what remains scarce, because it's very easy to get caught up in the magical thinking that AI is going to change everything. Some things won't change, no matter how smart AI gets. If the Singularity happens, a few things are not really going to change that much. First is desirable and arable land: some places will remain deserts (hence this AI-generated image of a woman in a desert; we'll get to that later when we talk about fusion). Then there's fresh, potable water: most of the water on the planet is salt water. Again, we might be able to change that if we solve nuclear fusion and other energy sources. Then there are physical resources, such as minerals and other mined natural resources, which will also probably remain scarce no matter how intelligent AI becomes. This will be a critical constraint, and it could really drive up the value of some of these resources. As I mentioned, if we solve nuclear fusion we could desalinate water, irrigate deserts, and so on and so forth, but that could have unintended consequences: if you suddenly irrigate every desert on the planet, maybe those deserts actually form a critical component of our ecosystem. Lastly, if we solve spaceflight, we could probably start harvesting asteroids, and even other planets, for rare minerals, because there are trillions and trillions of dollars' worth of rare minerals out there in the solar system. So it's entirely possible that all of these scarcities will be solved at some point in the future, with the Singularity, but at least in the short term, these will remain scarce resources.
Now, on the flip side, from a macroeconomic perspective, what becomes abundant with the Singularity? The primary things are knowledge, information, and cognitive labor. By cognitive labor I mean thinking: knowledge work, service-industry jobs. So what happens if AI becomes orders of magnitude smarter than humans and we remain in control of it? (This all assumes the good ending, where we don't get wiped out.) Basically, it means human cognitive effort becomes irrelevant. Now, that sounds really awful, but one thing I realized while working on this is that, on an individual basis, most people's cognitive effort is already irrelevant. We're on a planet of eight billion people; chances are someone has already solved the problem you're working on, whether or not you realize it. Doing science and solving problems can be really difficult, and it's mostly a matter of right place, right time. That said, there are still unsolved problems out there, and the Singularity, with hyper-advanced AI, will probably just produce more of the behavior we're already seeing, because the collective wisdom of humanity already solves problems pretty quickly. The velocity of that problem-solving will probably go up, maybe a few degrees, maybe an order of magnitude; I'm not sure yet. But a hyperabundance of cognitive labor is probably not going to be as immediately and dramatically impactful as you might think. Look at Reddit, Twitter, and other social-media platforms that let you pose a problem, get answers, and move on very quickly. Basically, instead of asking "lazy Twitter" or Reddit, where people answer collectively, you'll ask the machines. And actually, with some of the people I'm working with on autonomous AI projects, one of the key things we're working on is figuring out how to get AIs to talk to each other autonomously in a way that is safe and transparent. This is where natural language comes in: you don't want AIs using their own coded language to talk to each other; you want them using human-readable natural language. Anyway, that's a topic for another video.
So let's move forward from those macroeconomic changes to technological breakthroughs. If we suddenly have a hyperabundance of cognitive effort, or cognitive labor, what kinds of technological problems can we imagine being solved?

First is high-energy physics: the stuff they're working on at CERN with the LHC. This includes nuclear fusion; it could even include antimatter research. Who knows, maybe time travel, maybe faster-than-light travel. Not really sure. But the first problem likely to be solved in high-energy physics is nuclear fusion. It's really difficult to anticipate what solving nuclear fusion will do, because fusion fuel is orders of magnitude more energy-dense than any fuel we burn today. When you have a hyperabundance of energy, a lot of other things suddenly become possible. For instance, you can then afford to desalinate as much water as you need; you can afford to run underground farms that are completely unbound from arable land. There's all kinds of stuff you can do once you unlock nuclear fusion. The knock-on effects are impossible to fully anticipate; we can come up with a couple of short-term ideas, but certainly in the long term, solving nuclear fusion solves so many other problems. It solves recycling, because then you can afford to melt down any material, no matter how energy-intensive it is, and reclaim all the lithium, the cobalt, the nickel, the platinum, the gold. Pretty much every mineral becomes accessible, no matter how difficult it is to isolate, because if you have many gigajoules of energy available at all times for practically free, it doesn't matter how much energy it costs to recycle a material. That's just another example.
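To put the energy claim in rough perspective, here is a back-of-the-envelope comparison of deuterium-tritium fusion fuel against coal. The figures are standard textbook approximations I'm supplying for illustration, not numbers from the video:

```python
# Rough energy-density comparison: D-T fusion fuel vs. chemical fuel.
# All figures are textbook approximations, supplied for illustration.

MEV_TO_J = 1.602e-13           # joules per MeV
ATOMIC_MASS_UNIT = 1.66e-27    # kg

# One D-T reaction releases ~17.6 MeV.
# Reactant mass: one deuteron (~2 u) plus one triton (~3 u).
energy_per_reaction = 17.6 * MEV_TO_J
mass_per_reaction = 5 * ATOMIC_MASS_UNIT

fusion_j_per_kg = energy_per_reaction / mass_per_reaction  # ~3.4e14 J/kg
coal_j_per_kg = 3.0e7                                      # ~30 MJ/kg

print(f"fusion fuel: {fusion_j_per_kg:.2e} J/kg")
print(f"coal       : {coal_j_per_kg:.2e} J/kg")
print(f"ratio      : {fusion_j_per_kg / coal_j_per_kg:.1e}x")
```

The ratio comes out around ten million to one against chemical fuel, which is why "practically free energy" follows if the engineering problem is ever solved.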
Another set of solved problems you can expect with a hyperabundance of cognitive labor covers disease, genetics, and aging. The human body, our genetics, our metabolism, is one of the most complex systems in existence. There are more than a hundred thousand known metabolic pathways in the human body alone, and they all interact; not only with each other, but with your genes, your epigenetics, your microflora, all kinds of stuff. It's a supremely complex system. But if you have a hyperabundance of intellect, you can create new tools, create new processes, and manage vast amounts of information, and so we might end up curing all disease and all aging, and untangling all of genetics, within a relatively short period of time after achieving the Singularity, or AGI, or whatever you want to call it.
And then finally, materials science. We're already seeing the beginning of this with AlphaFold. If you're not familiar with it, AlphaFold uses deep neural networks (Transformers) to model protein folding, which was an unsolved problem; now that it's solved, we can model the folding of essentially any protein. Take that to the next level: what if you could model not only all protein folding, but all protein interactions, all genetic interactions? Then take it one step further: you could model nanoparticles, carbon structures; you could predict how to build very advanced materials, which could revolutionize batteries and computer technology, for instance. I predict that the materials-science breakthroughs that result from AI mean that in five to ten years your phone could be more powerful than all the computers on Earth today. And I'm not really exaggerating when I say that, because the amount of potential computational power just in the atoms of a phone, say in a membrane or a three-dimensional wafer, is basically inconceivable. So anyway, it would not surprise me if we move up a Kardashev scale or two post-Singularity.
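For reference on that last remark, the Kardashev scale is usually made continuous with Carl Sagan's interpolation formula, K = (log10 P − 6) / 10, with P the civilization's power use in watts. A quick sketch (the ~2×10¹³ W figure for present-day humanity is an order-of-magnitude estimate I'm supplying, not from the video):

```python
import math

# Sagan's continuous version of the Kardashev scale:
#   K = (log10(P) - 6) / 10, with P in watts.
# Type I ~ 1e16 W, Type II ~ 1e26 W, Type III ~ 1e36 W.
def kardashev(power_watts: float) -> float:
    return (math.log10(power_watts) - 6) / 10

# Humanity's current power use is roughly 2e13 W (order of magnitude).
print(f"today   : K = {kardashev(2e13):.2f}")   # ~0.73
print(f"Type I  : K = {kardashev(1e16):.2f}")
print(f"Type II : K = {kardashev(1e26):.2f}")
```

So "moving up a scale or two" means commanding the full power output of a planet, or of a star; an enormous jump from where we sit today, at roughly K ≈ 0.7.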
Now, that being said, there are still some unsolved problems that pretty much no amount of intellectual labor on Earth could solve. For instance, some people in the comments have asked about the hard problem of consciousness. That may or may not be solvable by machines, period; it might be something we humans have to figure out for ourselves. The same goes for fundamental questions of existence and cosmology. Some of these things are not a matter of mathematically proving something and measuring it in the lab; some are a matter of interpretation, and some are a matter of subjective values, such as the meaning of life, and so on and so forth.
Now, one thing people imagine when we talk about transhumanism or posthumanism is that we will have some sort of transcendence event. I personally don't think the Singularity will result in a transcendent event where we all become like Q from Star Trek, or some final state where we become beings of energy. I also don't think mind uploading is a good idea. I know a lot of people think it's great, but we don't understand why we are conscious, and basically, I predict that if you try to upload your mind, you're just going to upload a copy of yourself, and then your body will be dead. Subjectively, you will have died, while a copy of you continues on forever. So I don't think mind uploading is a good idea, and if that's the case, then we will forever be locked in our organic bodies. Even if there are digital copies of us frolicking out in cyberspace, they're not going to be us, and they're going to have an entirely different set of constraints, because once a copy of you becomes a digital entity, it no longer has the same biological constraints. So we'd have this grand divergence of digital posthumans on one side and us organic meat bags on the other. That, to me, sounds like an unsolved problem that AI is not going to fix for us.
All right, moving on to social changes: jobs and occupations. As machines get more intelligent, the TL;DR is that most jobs are going to become irrelevant. I've talked with people about this; there are a lot of BS jobs out there that nobody really wants to do, but you do them because you need to eat and you need to pay for your house. So what we're going to have to do is recalibrate how we think about meaning, purpose, and success, and that may include shifting toward a greater emphasis on creativity, exploration, and self-improvement.
One idea that came from discussing this with ChatGPT is that, as a society, instead of focusing on conformance to one standard of education, we might instead focus on what makes everyone unique, which struck me as a really interesting new model of education. Imagine that you go to school and, instead of everyone taking the same classes, you get a broad variety of projects and experiments: things designed to figure out, one, what you really care about, and two, what really makes you stand out. Then everyone can have a very different educational focus. My first year of school was at a Montessori school, and I can imagine taking that approach to the next level. Anyway.
I know there are a lot of people who say, "Well, without a job we have no meaning." That is your neoliberal programming speaking. I, and other people who have made the transition to a different kind of occupation (my occupation is now YouTube and Patreon, which I find much more interesting and rewarding), live in a way that is much closer to lifestyles that have existed in the past. For instance, in ancient Greece, particularly in Sparta, Spartan citizens were not allowed to hold a trade. Their job was to be soldiers, hunters, and politicians, to participate in culture and society, not to be leatherworkers or anything else. Obviously, ancient Sparta ultimately didn't do so well. Ancient Athens did much better with a very similar model: a citizen class, a leisure class. Ditto for ancient Rome. So humans have adapted to these effectively post-scarcity worlds before, but instead of resting on the backs of subjugated classes of people, we will all enter a post-scarcity leisure class on the backs of AI. That's what I predict is going to happen.
Because honestly, most people want that anyway, and if we have the collective willpower to want it, who cares? I can hear some of you already complaining: "Corporations are never going to allow that to happen." I'm going to get to that in just a second; glad you asked. Okay, so if suddenly nobody's job really matters, what do we do then? One of the conversations I had at a meetup was, "Well, what if everyone just plays video games?" There's actually a reason video games are so popular. Video games can foster social connection (a lot of games are very social today). They're also challenging, which gives you a sense of competence, a sense of mastery. And finally, video games give you a lot more autonomy: you can be anyone you want in a video-game world. Those three things satisfy the three pillars of self-determination theory, autonomy, human connection, and competence, which is why so many people play video games.
So if you look at SDT, self-determination theory, and you take away the need for a job, AI suddenly gives us all a lot more autonomy and more opportunity for human connection; the only remaining piece is challenge. And what happens for a lot of people who retire or step away from conventional work is that we realize, "Oh wait, I can challenge myself in new ways." Those of you who watch my YouTube channel know I don't actually need to do all the coding experiments that I do, but I find it deeply satisfying to challenge myself to try to solve the problems out there. I'm not saying everyone is going to engage in that kind of problem-solving; some people are going to do martial arts or go climb mountains or whatever. But we humans love challenges. We need to feel competent, and we need a sense of mastery. In the Sam Altman interview, he pointed out that yes, AI has solved Go and chess and other games, but we still play chess; we just don't play against computers, because there's no point. There's no sense of mastery against something you're never going to beat.
So anyway, the long-term effect is that we're probably going to see new social structures emerge, or maybe even older social structures re-emerge. I particularly predict we're going to see more multi-generational homes and more tribal or village-style lifestyles re-emerge, because suddenly it's, "Okay, here's a dozen people I really like, and none of us has a job, so let's go form an eco-village out in the countryside," or maybe an urban co-living situation in the city. Who knows; just some speculation there.
Okay, so I promised we would address some of the elephants in the room. Let's unpack the risks and factors that feed into this rosy post-Singularity picture I've outlined.

The first is the development and control of AI. Many of you are probably aware of the letter circulating, signed by a whole bunch of people including Elon Musk and Max Tegmark, calling for a moratorium on the advancement of AI for at least six months while we take a breath and reassess. It is possible that if we continue at a breakneck pace and people do it wrong, we end up in some kind of dystopian or cataclysmic outcome. There are basically two primary failure modes here. One is that we lose control of the AI and it decides to kill us all. The other major failure mode is that we don't lose control of AI, but the wrong people get the powerful AI and use it to kill everyone else, or subjugate everyone else. Those are the two primary failure modes around AI development and control.
This has been explored in a lot of fiction, and frankly I'm kind of tired of it, so I'm not going to talk about it much more. The point is that 99% of people don't really want an AI apocalypse. Some people seem to actively wish for it, but I think that's a sense of nihilism leaking through; some people think it's inevitable, and there's a sort of fatalism about it. Again, I empathize with people like that. I echo Sam Altman's sentiment: yes, there are a lot of people who are afraid, and I'm not going to tell them they're wrong, or stupid, or that it's magical thinking. We are playing with fire. I just happen to be very sanguine about it, because I feel that, one, all the problems that exist are solvable, and two, they are solvable in the very near term.
Okay, another big risk is the distribution of benefits. This is one of the biggest things people worry about. One of the most common pushbacks is, "Do you honestly think corporations are going to allow everyone to live a luxurious lifestyle, or that the rich and powerful are going to allow everyone else to live like they do?" Well, first, I don't know that they'll have much of a choice. But second, the masses, you and I, the proletariat, don't want to live in a cyberpunk hell, and we have seen repeatedly through history what happens as people get hungrier and more desperate. The most recent instance was the Arab Spring, in which much of the Middle East, the Arab world, rose up, with economic conditions as the primary driving factor. Go back even further and you have the French Revolution. This kind of thing has happened time and time again, so I'm not particularly worried about it, because if push comes to shove, people are going to stand up and redistribute forcefully.
Now, I'm not advocating for civil war or anything; I don't even think it's going to come to that, because I follow Davos, the World Economic Forum, the UN, the IMF, the World Bank, all the halls of power, and the halls of power really are paying attention to this. Honestly, I think they're preparing for it. For instance, I suspect that the stimulus checks America issued during the pandemic were a pilot program to demonstrate that redistribution works: that it is fast, efficient, and fair. They ran the stimulus checks alongside the Paycheck Protection Program, the PPP loans, essentially a side-by-side test showing that the PPP loans were expensive and rife with corruption, while the stimulus checks went directly to the people who needed them and were spent by the individuals who needed them. So I tend to think the stimulus checks were a pilot program, a prototype, for UBI. And when you look at the landscape right now, with over 300,000 tech layoffs, and other kinds of workers already being laid off or notified of layoffs due to technologies like ChatGPT (my fiancée, who is a writer and in a lot of writing Discords, sees copywriters who are already getting laid off and losing work to AI), the AI layoffs are coming. So I think we're also going to see a lot of stimulus checks coming; it's just a matter of whether those stimulus checks become permanent, and I think they will.
the regulatory environment so this is,1396.38,5.1
where that letter that just came out,1398.84,4.98
um is is asking for regulation Sam,1401.48,4.14
Allman has asked for regulation Elon,1403.82,4.32
Musk has asked for regulation all kinds,1405.62,4.14
of people are asking for more regulation,1408.14,5.159
now the big problem here though is one,1409.76,5.46
there's no agreement on how to regulate,1413.299,3.541
these things and in the conversations,1415.22,5.1
I've had at meetups the question rapidly,1416.84,4.98
comes up how do you even enforce it,1420.32,3.859
right if all these models are getting,1421.82,4.5
faster and more efficient and you can,1424.179,4.661
run them on laptops now you can't put,1426.32,3.96
that Genie back in the model so does,1428.84,3.18
regulation even matter,1430.28,4.44
or if it does how,1432.02,4.139
so,1434.72,3.839
the big concern here with the regulatory,1436.159,3.961
environment at the federal and,1438.559,3.721
international level is existing power,1440.12,4.38
structures and the status quo so the,1442.28,3.899
wealthy and Powerful are going to want,1444.5,3.179
to remain the wealthiest and most,1446.179,3.721
powerful on the planet that's just how,1447.679,4.38
it is and how it has always been there,1449.9,4.139
have been reset events like you know the,1452.059,4.5
French Revolution American Revolution so,1454.039,3.901
on and so forth there have been reset,1456.559,3.301
events in history but they're generally,1457.94,4.38
violent and we want to avoid that so do,1459.86,4.319
the so do the powers that be also want,1462.32,4.56
to avoid that but the biggest problem in,1464.179,4.74
these conversations that I've had is,1466.88,4.5
that things are advancing so fast and,1468.919,4.561
the gerontocracy which is ruled by the,1471.38,3.12
elderly,1473.48,3.9
old folks generally don't get AI they,1474.5,4.919
don't understand how much is changing,1477.38,4.2
and why and what its impact is going to,1479.419,4.38
be and that honestly could be one of the,1481.58,5.52
biggest risks is you know us younger,1483.799,5.521
people we get it we see it coming,1487.1,3.78
even some of the people at the meetups,1489.32,3.359
that I talk to their children are,1490.88,6.06
already acclimating to an AI world and,1492.679,5.641
they're going to trust the AI more than,1496.94,2.219
people because it's like well,1498.32,2.94
politicians lie and yeah ChatGPT might,1499.159,3.661
get it wrong sometimes but it's not,1501.26,2.88
going to lie to you at least not like a,1502.82,3.239
politician will so we're in for,1504.14,4.38
some very interesting uh advancements in,1506.059,4.921
the regulatory front,1508.52,5.519
public perception and adaptation so,1510.98,5.22
there's a lot of FUD fear uncertainty,1514.039,4.861
and doubt uh denialism doomerism and,1516.2,3.959
then also lots of people saying oh,1518.9,3.779
that's still decades away it's not it's,1520.159,5.041
months and years away not decades,1522.679,5.401
um so another big problem is a lot,1525.2,4.44
of this uncertainty a lot of this,1528.08,3.12
denialism,1529.64,3.12
um there's various aspects,1531.2,3.839
of the denialism for instance,1532.76,3.96
some people think oh well AI's never,1535.039,3.301
going to be as smart as us or it's never,1536.72,2.76
going to be smarter than us and it's,1538.34,2.819
like I kind of think that it's already,1539.48,4.5
smarter than those people it just lacks,1541.159,3.961
autonomy,1543.98,4.02
um but you know that's my opinion and I,1545.12,4.159
know some of you disagree with it,1548.0,3.779
anyways this is another big risk is,1549.279,4.0
because a lot of people are sticking,1551.779,3.481
their head in the sand and then there's,1553.279,3.541
also comments around the world like,1555.26,2.94
someone was saying that I think in,1556.82,3.66
France people,1558.2,4.079
aren't even talking about it right,1560.48,4.079
all of this is happening so,1562.279,4.441
quickly and most people aren't even,1564.559,4.561
aware of it of course ChatGPT made the,1566.72,4.02
news but then people just kind of you,1569.12,3.78
know the world by and large collectively,1570.74,4.319
shrugged without understanding how fast,1572.9,4.379
this is ramping up so public perception,1575.059,4.921
and acclimating to this could also,1577.279,4.681
be a big barrier,1579.98,5.04
uh global cooperation and,1581.96,6.48
collaboration the big thing here is,1585.02,6.42
um what I call trauma politics so,1588.44,4.92
basically you look at people like Putin,1591.44,5.16
and Xi Jinping both of whom suffered a,1593.36,6.6
tremendous amount of trauma at the hands,1596.6,6.3
of their dystopian governments,1599.96,5.099
um and they basically are seeking power,1602.9,5.46
for the purpose of self-soothing,1605.059,5.821
um that's pretty much all there is to it,1608.36,5.4
but when when people who have a,1610.88,4.32
tremendous amount of trauma come into,1613.76,4.2
power they tend to have a more,1615.2,5.04
nihilistic worldview which,1617.96,5.16
then results in things like genocide,1620.24,5.039
mass incarceration surveillance states,1623.12,4.32
because they want control they want as,1625.279,3.9
much control and power as they can get,1627.44,4.32
and it's never enough,1629.179,3.181
um,1631.76,4.32
and so this nihilism also creates a,1632.36,6.6
self-fulfilling prophecy because they,1636.08,5.52
project their pain onto the world which,1638.96,4.38
causes more trauma look at the war in,1641.6,2.76
Ukraine,1643.34,2.339
um look at China's treatment of the,1644.36,3.419
uyghurs and then that creates a,1645.679,4.561
self-perpetuating loop of more trauma,1647.779,4.741
intergenerational trauma and,1650.24,5.76
so on and so forth and so in my opinion,1652.52,5.519
um this unaddressed uh basically,1656.0,5.1
intergenerational PTSD or nihilism is,1658.039,4.561
the greatest threat to humanity because,1661.1,3.3
these are the kinds of people who will,1662.6,4.14
look at these things AI and say oh,1664.4,3.72
that's the perfect weapon for control,1666.74,2.939
that's the perfect weapon for,1668.12,4.32
subjugation whereas healthy individuals,1669.679,5.041
look at AI and say maybe we don't do,1672.44,4.44
that,1674.72,5.4
um Singularity FAQs so there is a,1676.88,4.86
lot of kind of gotcha questions that,1680.12,3.36
come up I tried to capture some of the,1681.74,5.1
best ones what will happen to money post,1683.48,5.16
Singularity,1686.84,2.88
um some people think like oh,1688.64,3.24
cryptocurrency is the future or maybe we,1689.72,4.5
do away with money altogether,1691.88,4.26
well I've got some good news and some,1694.22,6.059
bad news the good news is that,1696.14,6.06
it is entirely possible that money will,1700.279,3.961
change monetary systems will change and,1702.2,4.2
financial policies will change however,1704.24,4.679
the concept of currency the concept of,1706.4,5.279
money is too useful and too helpful,1708.919,4.081
because,1711.679,4.141
um it is an abstract uh reserve of value,1713.0,4.98
and it is also a really good medium of,1715.82,3.12
Exchange,1717.98,3.72
and so you know whether that means that,1718.94,4.8
Bitcoin or other cryptocurrencies are,1721.7,5.16
gonna you know replace fiat,1723.74,4.919
currency I'm not really going to say one,1726.86,3.9
way or another but basically currency is,1728.659,4.441
here to stay in some form,1730.76,3.84
um personally I think that there's too,1733.1,3.9
many problems with cryptocurrency,1734.6,4.86
um namely that it is uh subject to,1737.0,4.38
manipulation because its value can,1739.46,2.819
change,1741.38,3.5
a lot right like the,1742.279,6.0
wild swings of value of Bitcoin and,1744.88,5.679
stuff basically proves that it is not a,1748.279,4.801
stable reserve of value and you know,1750.559,4.5
people have lost fortunes on it people,1753.08,4.02
have made fortunes on it too usually,1755.059,4.261
people with,1757.1,4.439
not the best intentions I shouldn't,1759.32,4.739
say usually but sometimes basically,1761.539,6.421
organized crime loves cryptocurrency,1764.059,5.34
uh what will happen to the human,1767.96,4.56
population now this one really kind of,1769.399,5.941
is interesting because there's,1772.52,4.62
a lot of debate over what is the actual,1775.34,3.3
carrying capacity of the planet some,1777.14,4.62
people say oh it's easily 50 billion,1778.64,6.18
um and no it's not simply no,1781.76,4.86
the carrying capacity of the planet is,1784.82,4.5
nowhere near 50 billion there is,1786.62,5.1
technically enough room physical room,1789.32,5.219
for 50 billion humans but when you look at,1791.72,4.8
the constraints of,1794.539,4.62
thermodynamics hydrological cycles the,1796.52,4.74
amount of arable land no,1799.159,5.52
now it is possible that the singularity,1801.26,6.12
with you know it resulting in nuclear,1804.679,4.801
fusion and stuff you could probably tip,1807.38,3.299
that a little bit further right,1809.48,3.24
especially if you can,1810.679,4.081
um synthesize more arable land or grow,1812.72,5.4
food underground or desalinate water you,1814.76,4.98
could probably boost the carrying,1818.12,3.96
capacity of the planet quite a bit 50,1819.74,4.26
billion still seems way out there for me,1822.08,4.14
but the biggest thing is,1824.0,3.779
not going to be those things like you,1826.22,4.559
know okay we overcome those,1827.779,5.041
energetic constraints it's still going,1830.779,4.02
to come down to like,1832.82,4.8
uh mostly management right,1834.799,5.401
sustainable management of the,1837.62,5.179
population because the thing is you know,1840.2,6.0
if logistics breaks down,1842.799,5.681
today we all starve pretty quickly right,1846.2,4.5
because we don't have locally sourced,1848.48,5.52
food our food and water you know,1850.7,6.0
requires a very stable,1854.0,4.799
um infrastructure in order to provide,1856.7,4.26
that and that only gets worse when you,1858.799,3.6
have like 50 billion people on the,1860.96,4.14
planet so you know sustainable and,1862.399,5.16
responsible management of necessary,1865.1,5.459
resources primarily food and water are,1867.559,5.941
going to be the key to what happens with,1870.559,5.281
the human population now in some of the,1873.5,4.26
discussions that I've had there's a few,1875.84,4.02
confounding factors here one thing that,1877.76,4.32
isn't mentioned on this slide is what,1879.86,4.08
happens if we solve aging,1882.08,4.8
because what happens with populations is,1883.94,6.06
as they become more gender equal women,1886.88,6.12
choose to have fewer children and so,1890.0,4.679
what if people are living longer but,1893.0,3.96
having fewer children I kind of predict,1894.679,3.48
that the population is going to,1896.96,2.88
stabilize there's always going to be,1898.159,4.801
some people who want children but at the,1899.84,5.819
same time right like if you don't,1902.96,4.56
actually really deeply want,1905.659,3.481
children you're probably not going to,1907.52,3.659
have them and then in a post-scarcity,1909.14,3.72
life like,1911.179,4.261
maybe you choose never to have children,1912.86,4.38
and again some people will choose to,1915.44,3.66
have children and even if you solve,1917.24,3.419
aging people will still die there are,1919.1,3.059
still going to be accidents right,1920.659,3.301
there are still going to be,1922.159,4.081
um maybe a handful of unsolved,1923.96,4.38
medical issues but primarily you're,1926.24,4.62
going to see accidents and also one of,1928.34,5.28
the conversations that came up was okay,1930.86,4.38
well if you can,1933.62,3.419
hypothetically live forever do you want,1935.24,5.1
to and many people suspect,1937.039,4.561
that you won't actually want to live,1940.34,2.699
forever you might choose to live for a,1941.6,3.12
few hundred years but then you might get,1943.039,3.421
tired of life and then you know quit,1944.72,4.14
taking the life extending medicine and,1946.46,4.38
allow yourself to die naturally who,1948.86,3.0
knows,1950.84,3.3
um but personally I kind of predict a,1951.86,5.58
population stabilization,1954.14,6.36
food so food has been a big thing,1957.44,6.0
um so on top of you know vertical,1960.5,4.919
farming or underground farming powered,1963.44,3.839
by nuclear fusion okay great we can eat,1965.419,4.74
whatever we want wherever we want I also,1967.279,5.721
suspect that biotechnology is going to,1970.159,5.88
really change our diet and what I mean,1973.0,5.98
by that is synthetic foods engineered,1976.039,6.901
foods and even hyper personalized diets,1978.98,7.199
so for instance by and large you might,1982.94,5.04
believe that dairy is bad for you,1986.179,4.021
because it's you know got,1987.98,5.76
you know saturated fat in it but when I,1990.2,6.0
added more dairy to my,1993.74,4.74
diet all my numbers got better because,1996.2,4.56
it's just in my genes it's in whatever,1998.48,4.38
and so I had to figure that out,2000.76,4.74
through trial and error dairy raises,2002.86,4.439
some people's cholesterol in my case it,2005.5,4.86
lowered it so the combination of,2007.299,5.341
engineered foods better bioinformatics,2010.36,3.9
and biotech,2012.64,4.68
um and things like mobile farms oh,2014.26,4.56
I actually saw an ad,2017.32,3.839
for it the first portable farms,2018.82,4.859
the shipping,2021.159,5.281
container farms are coming so that,2023.679,4.62
only ramps up and gets better,2026.44,4.079
over time so that means you go to the,2028.299,3.541
grocery store,2030.519,3.421
and everything that you could possibly,2031.84,3.78
want is there and it's fresh and it's,2033.94,4.38
local so you know some people,2035.62,4.14
are worried like oh well they're going,2038.32,2.579
to take our steaks they're going to take,2039.76,2.94
our Burgers I don't think so I think,2040.899,3.481
you're actually going to have many more,2042.7,3.12
options and they're going to be,2044.38,3.24
healthier options uh in a post,2045.82,3.779
Singularity world,2047.62,4.559
uh War so I did mention trauma politics,2049.599,5.401
and and geopolitics earlier,2052.179,5.281
obviously the absolute,2055.0,5.879
biggest risk here is an AI arms race,2057.46,6.78
um even Nations liberal democracies that,2060.879,5.881
are not run by deeply traumatized,2064.24,5.28
tyrants are still going to be engaged in,2066.76,5.94
some kind of AI arms race uh which is an,2069.52,4.56
unfortunate reality I'm not saying that,2072.7,3.0
that's a good thing I'm not passing,2074.08,2.94
moral judgment on it it's just an,2075.7,3.36
observation every time there's new,2077.02,4.379
technology it is integrated into the,2079.06,5.579
military apparatus I also,2081.399,4.2
don't think that we're going to end up,2084.639,2.821
with a one world government,2085.599,4.201
at least not anytime soon,2087.46,4.32
um and there's numerous reasons for this,2089.8,3.9
not the least of which is language,2091.78,4.68
barriers cultural differences,2093.7,4.32
and,2096.46,4.8
past grievances between cultures,2098.02,4.56
um you know it could take many many,2101.26,4.5
generations to heal those,2102.58,5.82
intercultural wounds before people even,2105.76,5.94
want to collaborate you look at,2108.4,5.28
the animosity between like China and,2111.7,4.44
Japan between Israel and Palestine,2113.68,5.64
between Iran and a bunch of other,2116.14,5.64
nations and so on and so forth it takes,2119.32,4.98
a lot of work to heal those wounds,2121.78,4.319
and there's a lot of resistance to,2124.3,4.62
healing those wounds and those wounds,2126.099,5.341
could continue to fester what I'm hoping,2128.92,5.699
is that AI actually helps us break the,2131.44,5.76
cycle of intergenerational trauma and so,2134.619,4.381
then within maybe you know two or three,2137.2,4.139
generations we're ready for a more,2139.0,5.28
peaceful Global community and again I,2141.339,4.201
still don't think that a global,2144.28,3.0
government is going to happen,2145.54,3.2
um just because like,2147.28,3.6
geographically speaking like it kind of,2148.74,3.64
makes sense to have,2150.88,3.36
the nation states and then the,2152.38,3.719
union model that makes the most sense,2154.24,3.66
right now like you know France is still,2156.099,3.361
France Great Britain is still Great,2157.9,3.06
Britain but they're part of the European,2159.46,2.7
Union right,2160.96,3.84
and over time I do suspect that those,2162.16,4.56
Continental sized unions will get,2164.8,4.26
stronger but not that they'll replace,2166.72,4.379
the local governments just like you know,2169.06,4.08
we have municipalities we have local city,2171.099,4.321
governments we have County we have state,2173.14,3.3
governments and we have Federal,2175.42,2.28
governments I think that we're just,2176.44,3.659
going to add a few tiers on top of that,2177.7,4.8
and eventually we will end up with,2180.099,4.681
global governance but again I think that,2182.5,3.48
it's probably at least two or three,2184.78,3.18
generations away minimum,2185.98,4.74
and then finally corporations so I did,2187.96,5.1
promise that I would address this,2190.72,4.74
um so some people and this includes,2193.06,4.68
myself I hope that corporations as we,2195.46,4.619
know them go away because corporations,2197.74,5.04
are intrinsically amoral and I don't,2200.079,4.861
mean immoral amoral and corporations,2202.78,5.28
their morality is only beholden to the,2204.94,5.34
investor right uh to the shareholders,2208.06,3.84
and the shareholders just want more,2210.28,3.059
value,2211.9,3.78
um whatever it costs right and,2213.339,4.681
corporations will always explore every,2215.68,4.08
little nook and cranny of what they can,2218.02,4.2
legally get away with,2219.76,5.64
um and that often results in bad,2222.22,5.3
things such as mistreatment of people,2225.4,4.8
environmental abuse and so on,2227.52,4.66
so because corporations are,2230.2,4.08
intrinsically amoral I hope that they go,2232.18,4.38
away but I don't think that they will,2234.28,4.799
um I tried to figure out how the,2236.56,4.2
singularity could result in this but,2239.079,3.0
the more I explore it the more I,2240.76,2.94
realized like no,2242.079,3.241
basically what's going to happen is that,2243.7,4.32
AI is going to allow corporations to,2245.32,4.32
produce more with less,2248.02,3.66
so productivity will continue to go up,2249.64,4.199
while head count goes down and I talked,2251.68,4.62
about this in my AI job apocalypse video,2253.839,4.681
a couple months ago basically what's,2256.3,4.86
going to happen is that uh you're going,2258.52,4.92
to see corporations replace as many of,2261.16,4.26
their workers as they can and so then,2263.44,3.48
you have,2265.42,3.54
um the ownership class,2266.92,4.439
whether it's shareholders CEOs whoever,2268.96,6.6
is going to have basically unmitigated,2271.359,6.601
um stock price growth because suddenly,2275.56,4.08
the greatest constraint and the most,2277.96,3.18
expensive aspect of running a,2279.64,4.439
corporation human labor is no longer a,2281.14,5.76
factor so we I think that we are at risk,2284.079,5.28
of seeing like the megacorp things that,2286.9,5.699
you see in like dystopian sci-fi I,2289.359,4.98
think that we probably are at risk of,2292.599,4.02
seeing you know multi-trillion dollar,2294.339,4.621
quadrillion dollar companies out there,2296.619,4.141
that have almost no employees that are,2298.96,4.2
all entirely run by shareholders and,2300.76,3.54
then AI,2303.16,3.06
uh so that is,2304.3,3.96
an interesting thing now as to whether,2306.22,4.02
or not they will allow the rest of us to,2308.26,4.079
live in certain ways I kind of think,2310.24,4.92
that they don't care right because as,2312.339,4.801
obscenely wealthy as corporations are,2315.16,4.62
going to be like,2317.14,5.1
it just doesn't make sense for,2319.78,4.319
them to expend any energy depriving,2322.24,3.78
everyone else,2324.099,4.081
um and so like let's just imagine that,2326.02,5.22
like Elon Musk takes SpaceX and uses it,2328.18,5.7
to start harvesting asteroids and SpaceX,2331.24,5.879
becomes a 20 trillion dollar company by,2333.88,5.76
harvesting iridium and Cobalt and,2337.119,5.941
platinum from asteroids great is Elon,2339.64,5.64
Musk going to personally say actually,2343.06,3.24
I think that everyone,2345.28,3.96
should live in slums and favelas around,2346.3,4.68
the world no he's not going to care,2349.24,3.66
he doesn't give a crap how everyone else,2350.98,3.84
lives as long as he's a trillionaire,2352.9,3.9
right and so when I think it through,2354.82,4.14
that way it's like it would take a lot,2356.8,4.559
of deliberate effort,2358.96,5.639
on behalf of corporations to,2361.359,5.641
deliberately deprive the rest of us of a,2364.599,3.661
better life so I don't think that's,2367.0,2.94
going to happen certainly it's something,2368.26,3.359
to be aware of because again,2369.94,4.08
corporations are intrinsically amoral,2371.619,5.941
which is one of the biggest risks to our,2374.02,5.28
standard of living in the future,2377.56,4.74
okay that's that thanks for watching um,2379.3,4.5
I hope you found this video,2382.3,4.38
enlightening and thought-provoking,2383.8,6.6
um yeah I know that uh there will,2386.68,5.1
probably be some disagreements in the,2390.4,3.3
comments um keep it civil or you get,2391.78,5.42
banned thanks bye,2393.7,3.5