davidshapiro_youtube_transcripts / Escaped Sapiens David Shapiro on AI alignment safety and the future of humanity_transcript.csv
text,start,duration
is there anything that makes humans,0.06,3.0
Irreplaceable,1.86,3.3
so are we getting into a stage where,3.06,4.38
there's nothing that humans can,5.16,4.8
do that AI won't be able to do,7.44,5.279
uh from a functional standpoint from a a,9.96,5.759
an objective standpoint I don't think so,12.719,4.98
um and that actually begs a very deep,15.719,3.421
philosophical and spiritual question,17.699,2.821
which is what is the point of living,19.14,4.5
what is the point of being a human,20.52,5.16
um and that uh is something that I've,23.64,4.559
done some work on I wrote a paper or,25.68,5.58
a short book called Post-Nihilism where,28.199,5.581
what I suspect is that we are barreling,31.26,4.92
towards uh what I call a nihilistic,33.78,3.54
crisis or actually we're in the middle,36.18,3.059
of a nihilistic crisis and it actually,37.32,3.68
started with the Industrial Revolution,39.239,4.921
if you look at a lot of poetry and,41.0,5.98
literature works of fiction during the,44.16,5.28
the rise of the Industrial Revolution a,46.98,4.079
lot of people had a lot of existential,49.44,3.299
anxiety about what was the point of,51.059,4.801
Being Human in an era of machines and,52.739,4.621
this kind of pops up every now and then,55.86,4.26
right same thing happened with computers,57.36,5.539
um with the Advent of you know uh,60.12,5.34
high-speed computers nuclear weapons so,62.899,5.021
on and so forth technological,65.46,3.839
advancements tend to give us some,67.92,2.66
existential angst,69.299,3.961
but to your question about like okay,70.58,5.38
what is the benefit of being a human in,73.26,4.62
a world where from a,75.96,3.78
productivity standpoint or an economic,77.88,3.48
standpoint machines can do everything,79.74,3.48
that we can do better faster and cheaper,81.36,4.98
what's the point and so that is where we,83.22,5.219
have to change our orientation towards,86.34,4.919
how we value our own life and our own,88.439,4.5
subjective experience so that's a deeply,91.259,3.661
deeply philosophical and religious,92.939,4.561
perspective or a question and it's,94.92,5.04
really interesting because depending on,97.5,4.2
someone's spiritual upbringing or,99.96,4.5
spiritual disposition the question lands,101.7,5.279
very differently because many uh,104.46,4.199
religious doctrines around the world,106.979,3.721
basically say that humans have a soul,108.659,4.861
and that sets us apart and so whether or,110.7,5.76
not that's true uh people have a model,113.52,5.279
for just saying my subjective experience,116.46,6.299
of being is very meaningful and,118.799,7.741
it is unique and so part of overcoming a,122.759,6.301
nihilistic crisis is we all have to face,126.54,4.62
that whether or not we believe in,129.06,4.2
souls or God or whatever,131.16,5.159
and we have to kind of go back to basics,133.26,5.76
and look at the subjective experience of,136.319,4.261
our own being and so back to your,139.02,3.42
question earlier about children I,140.58,3.42
suspect that children who grow up with,142.44,4.26
AI they will just intrinsically know oh,144.0,4.08
yeah my experience is different from,146.7,3.179
this machine and that's okay and that,148.08,3.299
they won't have any existential anxiety,149.879,4.261
about it I hope at least are,151.379,4.381
you hopeful for the future or do you,154.14,3.84
have this anxiety,155.76,5.58
um no I am uh I think I'm biologically,157.98,5.46
programmed to be optimistic I just I,161.34,3.899
can't be cynical,163.44,5.04
um and I I,165.239,5.881
part of that is that I've done a,168.48,4.08
tremendous amount of work to understand,171.12,3.6
what the dangers and risks are and I've,172.56,4.44
also tried to contribute to coming up,174.72,4.86
with a more optimistic outcome with the,177.0,4.379
machines so we all learned this from,179.58,3.42
watching Scooby-Doo,181.379,3.841
um the monsters are always humans right,183.0,4.26
there's no such thing as as an evil,185.22,3.9
monster out there the problem is always,187.26,4.92
humans and so this is a big,189.12,5.24
reason that I've done my work is because,192.18,4.62
you know it's not that a,194.36,4.36
machine is going to replace you and,196.8,3.359
that's a bad thing right we all,198.72,3.599
fantasize about like hey I want to you,200.159,3.72
know go live in the countryside and just,202.319,3.481
go fishing every day we all know what we,203.879,4.261
want to do if we don't have to work what,205.8,5.4
we are truly afraid of is not being able,208.14,5.22
to take care of ourselves it's that if the,211.2,4.14
machine takes our job we're gonna go,213.36,3.9
hungry we're gonna lose our home we're,215.34,4.02
gonna end up lonely and and whatever,217.26,4.8
that's the actual fear,219.36,4.799
um nobody actually wants to keep working,222.06,4.38
right nobody does like I remember one of,224.159,4.921
the advertisements for um for a like,226.44,5.1
health insurance here in America was you,229.08,3.9
get to keep the health insurance you,231.54,2.82
like nobody likes health,232.98,4.259
insurance it's a necessary evil right,234.36,6.959
jobs occupations are a necessary evil of,237.239,6.301
the economic environment that we're in,241.319,4.14
and the technological limitations that,243.54,3.96
we're in and so as these things progress,245.459,3.481
this is I'm basically just,247.5,3.599
unpacking why I'm optimistic as these,248.94,3.78
things progress I hope that we're all,251.099,3.36
going to be able to have kind of a Back,252.72,3.54
to Basics moment where it's like you,254.459,3.12
wake up one day and it's like how do you,256.26,2.58
actually want to live right if you want,257.579,2.94
to go fishing every day do it if you,258.84,3.54
want to focus on being an opera singer,260.519,4.381
go do that you know there's we all have,262.38,4.74
stuff that we want to do but that we,264.9,4.44
sacrifice for the sake of earning enough,267.12,4.2
money to take care of ourselves and that,269.34,5.34
is the reality for most of us today,271.32,4.8
one of the reasons why we have this,274.68,3.36
worry is because currently we live in,276.12,3.48
sort of a negotiated environment right,278.04,4.02
the success of labor movements was,279.6,4.92
because labor was needed when humans are,282.06,4.699
no longer needed,284.52,4.92
there's sort of a worry that,286.759,4.121
we're not going to have the opportunity,289.44,2.64
to go fishing,290.88,2.46
right we're going to have,292.08,3.3
nothing and I guess that's,293.34,3.48
the worry that you're pointing at,295.38,3.24
what do you think the first jobs are,296.82,3.9
that are going to go,298.62,3.66
well there's already been quite a few,300.72,2.64
layoffs,302.28,3.96
um various uh communities on Reddit or,303.36,4.98
private communities on Discord,306.24,4.56
um so for instance my uh fiancée uh,308.34,4.38
we're both writers but she's on a few uh,310.8,3.899
private writing communities,312.72,3.479
um copywriters have already been laid,314.699,4.201
off and replaced by AI,316.199,4.741
um uh marketing teams have been notified,318.9,3.6
that you know they've got a year until,320.94,2.88
they're all going to get laid off and,322.5,3.66
replaced by you know AI generated images,323.82,4.86
and AI generated emails,326.16,4.62
um so it's happening,328.68,5.04
um yeah that's where we're,330.78,6.24
at now I guess to your larger,333.72,5.28
point of you know if we're all,337.02,3.959
replaceable you know what's the,339.0,4.08
bottom line and the fact of the matter,340.979,4.381
is from a corporate perspective,343.08,3.899
from the perspective of neoliberalism,345.36,4.559
human labor is one of the most expensive,346.979,5.461
aspects of productivity and it's also,349.919,4.861
the biggest constraint you look at a,352.44,4.5
population decline in places like China,354.78,4.02
and Japan because China just crested,356.94,3.96
right so from here on out China's,358.8,3.54
population is going down for at least,360.9,3.239
the next Century Japan has been in,362.34,3.78
Decline for a couple decades now uh,364.139,4.201
ditto for Italy and a few other nations,366.12,4.799
so their labor force is Contracting,368.34,5.34
right and from an economic perspective,370.919,6.241
that's really bad for for for Nations so,373.68,6.0
AI hopefully will actually shore up,377.16,4.86
those uh labor markets and,379.68,5.7
actually replace lost human labor now,382.02,5.28
because humans are so expensive right,385.38,4.62
you can pay uh $20 a month for ChatGPT,387.3,4.56
and it can basically serve as an,390.0,3.84
executive assistant and personal coach,391.86,3.839
and every it can replace literally,393.84,3.479
thousands of dollars worth of Labor and,395.699,3.901
it costs $20 a month ChatGPT is,397.319,3.541
infinitely cheaper than most human,399.6,2.879
employees,400.86,2.76
um and that's only going to get better,402.479,3.06
right because either the model is going,403.62,4.019
to get more efficient and cheaper,405.539,3.481
um or it's going to get smarter and more,407.639,3.0
powerful and therefore more valuable or,409.02,5.16
both in all likelihood so one one of the,410.639,4.861
things that I predict,414.18,3.12
is that we are going to have a,415.5,4.56
post-labor market economy before too long,417.3,5.58
and in that respect uh basically,420.06,5.46
economic productivity will be decoupled,422.88,4.319
from Human labor,425.52,3.48
um and in that case you know you're,427.199,3.181
going to see quadrillion dollar,429.0,3.84
valuation uh for companies that have no,430.38,3.539
employees,432.84,3.0
and that might sound like that,433.919,4.081
could be an ingredient for a dystopian,435.84,5.04
world that nobody wants to live in,438.0,4.74
we'll get to like the regulation and,440.88,4.5
stuff of that later but from a,442.74,5.7
purely GDP perspective AI is,445.38,4.08
going to be the best thing that ever,448.44,4.86
happened to GDP uh to economics,449.46,6.54
because again it will decouple uh human,453.3,4.2
labor from the constraint and,456.0,2.94
there will still be a few constraints,457.5,4.62
natural resources Rare Minerals uh fresh,458.94,6.06
water arable land right,462.12,4.32
there's always going to be some,465.0,3.599
physical constraints but we're going to,466.44,4.56
remove human labor as one of the main uh,468.599,5.281
constraints to economics and that is,471.0,4.38
going to mandate kind of those things,473.88,2.759
like you said like if you want to go,475.38,3.78
fishing well how right if you don't have,476.639,4.921
any economic power if you don't have any,479.16,5.099
way to make a demand then that's a big,481.56,4.38
problem which is what we're going to,484.259,2.821
have to negotiate we're going to have to,485.94,2.58
negotiate a new social contract,487.08,2.88
basically,488.52,3.06
what do you think the impact is going to,489.96,3.9
be on births ultimately do you think,491.58,4.86
people are going to just start having AI,493.86,5.1
children because it's cheaper,496.44,4.08
you know that's a really difficult,498.96,3.9
question I could see it going either way,500.52,4.44
um there's plenty of books and,502.86,5.04
fiction out there and research papers,504.96,4.799
um people have predicted you know the,507.9,4.74
population uh explosion you know the,509.759,4.381
Earth will become uninhabitable because,512.64,2.94
we'll have billions and billions of,514.14,3.48
people that we can't feed other people,515.58,3.36
are worried that you know the population,517.62,3.359
is going to collapse,518.94,3.36
um and I actually had a pretty long,520.979,3.36
conversation about this just to kind of,522.3,4.26
clarify my own ideas uh again with chat,524.339,3.781
GPT,526.56,4.38
um and so there's a few driving factors,528.12,6.42
that cause uh birth rates to decline,530.94,5.78
um uh women entering the workforce,534.54,4.26
education and empowerment for women,536.72,5.1
access to birth control so it turns out,538.8,6.479
when a society advances and becomes a,541.82,6.16
little bit more uh sophisticated or,545.279,5.101
gains more access or some you know Gini,547.98,4.38
coefficient goes up whatever metrics you,550.38,4.68
use education goes up fertility rates go,552.36,3.539
down,555.06,2.7
some of that has to do with the choices,555.899,4.141
of Family Planning you know men and,557.76,3.6
women decide to have fewer children,560.04,3.6
women have more control over their own,561.36,3.479
fate,563.64,3.18
um and so fertility rates tend to go,564.839,4.021
down and this is a very very reliable,566.82,4.019
Trend globally,568.86,3.14
um you know,570.839,3.901
regardless of culture regardless of,572.0,5.14
other economic conditions as education,574.74,6.0
rates go up as uh women in the,577.14,5.639
workforce goes up fertility rates go,580.74,4.14
down this is a global thing with no,582.779,5.041
exceptions right so if you extrapolate,584.88,5.1
that out then you can probably make a,587.82,5.1
relatively safe assumption that as AI,589.98,5.58
spreads around the world and economics,592.92,4.44
and education and everything goes up,595.56,4.14
that fertility rates will continue to go,597.36,5.7
down around the whole world South Korea,599.7,4.92
I believe has the lowest fertility rate,603.06,4.74
on the planet at 0.8 births per woman,604.62,5.52
which is uh like,607.8,4.86
um just above a third of the,610.14,5.34
replacement rate so it's entirely,612.66,6.299
possible that under these trends that a,615.48,5.46
population collapse is actually the most,618.959,4.921
real danger that we face so well what do,620.94,5.1
you do about that one thing that I think,623.88,5.34
is going to happen is that AI will lead,626.04,6.479
to Medical breakthroughs and I suspect,629.22,6.36
that we are close if not already at uh,632.519,5.041
the place of what's called longevity,635.58,4.86
escape velocity which is that the,637.56,4.38
medical breakthroughs that happen every,640.44,3.54
year extend your life by more than a,641.94,4.62
year so basically,643.98,4.56
hypothetically if you're healthy enough,646.56,4.56
today if you're not about to die and you,648.54,5.1
have access to decent enough uh health,651.12,4.68
care then the compounding returns,653.64,4.56
of medical research and AI means that,655.8,3.96
you and I could live to be several,658.2,3.66
centuries old which means that the,659.76,3.9
population of the planet will stabilize,661.86,5.159
as birth rates continue to decline now,663.66,5.28
I do think that some people,667.019,4.56
will ultimately choose like AI,668.94,4.74
companions as they become more realistic,671.579,3.721
certainly a lot of people have seen,673.68,3.36
shows like Westworld,675.3,2.46
um you know one of my favorite,677.04,2.28
characters of all time is Data from Star,677.76,3.24
Trek and I would love to have Data as a,679.32,3.12
friend right,681.0,4.82
um so I absolutely suspect that,682.44,6.0
anthropomorphic machines will be part of,685.82,4.66
our Lives before too long,688.44,3.78
um whatever form they take you know,690.48,3.299
whether it's a robotic dog that never,692.22,4.38
dies or you know a walking talking,693.779,5.161
friend that is always there to hang out,696.6,4.739
or if it's a romantic partner like uh,698.94,3.839
you know in the movie Her,701.339,3.481
um with Joaquin Phoenix and Scarlett,702.779,3.781
Johansson there's lots of possibilities,704.82,4.32
uh for how life is going to be but like,706.56,4.26
I said I think one of the most reliable,709.14,4.02
durable Trends is fertility rates go,710.82,4.5
down so the question is will that be,713.16,3.9
offset by longevity,715.32,4.44
so in other words rather than sort of,717.06,5.16
the the dangerous Skynet that some,719.76,5.16
people Envision we might just get out,722.22,5.16
competed sexually uh into Extinction,724.92,4.8
something along those lines yeah that's,727.38,4.26
that's entirely possible especially when,729.72,3.66
you consider that um actually there was,731.64,4.199
a line from Terminator 2 it was when,733.38,4.98
Sarah Connor was watching uh you know,735.839,3.901
the Terminator Arnold Schwarzenegger,738.36,4.2
play with John and she realized that the,739.74,5.279
machine has infinite patience and will,742.56,4.5
always be there because John was his,745.019,4.26
mission and I realized that from a,747.06,3.899
philosophical standpoint one reading of,749.279,4.5
that is that the machine could be a,750.959,4.261
better parent than a human parent could,753.779,4.021
ever be because for a child from a,755.22,4.619
child's perspective they should be their,757.8,3.84
parents' primary mission but,759.839,3.721
that's never the case right parents are,761.64,3.78
humans too and they have their own needs,763.56,4.26
their own desires their own plans but,765.42,5.219
when you have a machine that is,767.82,4.68
designed so that you are its primary,770.639,3.781
Mission whether you're an adult or a,772.5,4.86
child like that could be like,774.42,5.599
from some perspectives a better outcome,777.36,4.5
obviously some people are probably,780.019,3.701
cringing which is understandable that's,781.86,3.659
a perfectly healthy reaction to the idea,783.72,3.9
of replacing children and parents with,785.519,4.081
machines but it's possible right,787.62,5.06
hypothetically possible,789.6,3.08