episode
text
timestamp_link
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to be in the presence of people who are even smarter than us
https://karpathy.ai/lexicap/0001-large.html#00:42:39.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
all around is when you and I were both two years old,
https://karpathy.ai/lexicap/0001-large.html#00:42:42.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I mean, our parents were much more intelligent than us,
https://karpathy.ai/lexicap/0001-large.html#00:42:45.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
right?
https://karpathy.ai/lexicap/0001-large.html#00:42:48.360
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Worked out OK, because their goals
https://karpathy.ai/lexicap/0001-large.html#00:42:49.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
were aligned with our goals.
https://karpathy.ai/lexicap/0001-large.html#00:42:51.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And that, I think, is really the number one key issue
https://karpathy.ai/lexicap/0001-large.html#00:42:53.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we have to solve, which is the value alignment
https://karpathy.ai/lexicap/0001-large.html#00:42:58.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
problem, exactly.
https://karpathy.ai/lexicap/0001-large.html#00:43:02.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Because people who see too many Hollywood movies
https://karpathy.ai/lexicap/0001-large.html#00:43:03.080
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
with lousy science fiction plot lines,
https://karpathy.ai/lexicap/0001-large.html#00:43:06.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
they worry about the wrong thing, right?
https://karpathy.ai/lexicap/0001-large.html#00:43:10.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
They worry about some machine suddenly turning evil.
https://karpathy.ai/lexicap/0001-large.html#00:43:12.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It's not malice that is the concern.
https://karpathy.ai/lexicap/0001-large.html#00:43:16.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It's competence.
https://karpathy.ai/lexicap/0001-large.html#00:43:21.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
By definition, intelligence makes you very competent.
https://karpathy.ai/lexicap/0001-large.html#00:43:22.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
If you have a more intelligent goal-playing
https://karpathy.ai/lexicap/0001-large.html#00:43:27.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
computer playing against a less intelligent one,
https://karpathy.ai/lexicap/0001-large.html#00:43:31.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and when we define intelligence as the ability
https://karpathy.ai/lexicap/0001-large.html#00:43:33.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to accomplish goals, it's going
https://karpathy.ai/lexicap/0001-large.html#00:43:36.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to be the more intelligent one that wins.
https://karpathy.ai/lexicap/0001-large.html#00:43:38.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And if you have a human and then you
https://karpathy.ai/lexicap/0001-large.html#00:43:40.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
have an AGI that's more intelligent in all ways
https://karpathy.ai/lexicap/0001-large.html#00:43:43.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and they have different goals, guess who's
https://karpathy.ai/lexicap/0001-large.html#00:43:47.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
going to get their way, right?
https://karpathy.ai/lexicap/0001-large.html#00:43:49.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So I was just reading about this particular rhinoceros species
https://karpathy.ai/lexicap/0001-large.html#00:43:50.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that was driven extinct just a few years ago.
https://karpathy.ai/lexicap/0001-large.html#00:43:57.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
A real bummer. I was looking at this cute picture of a mommy
https://karpathy.ai/lexicap/0001-large.html#00:43:59.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
rhinoceros with its child.
https://karpathy.ai/lexicap/0001-large.html#00:44:02.280
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And why did we humans drive it to extinction?
https://karpathy.ai/lexicap/0001-large.html#00:44:05.080
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It wasn't because we were evil rhino haters as a whole.
https://karpathy.ai/lexicap/0001-large.html#00:44:09.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It was just because our goals weren't aligned
https://karpathy.ai/lexicap/0001-large.html#00:44:12.800
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
with those of the rhinoceros.
https://karpathy.ai/lexicap/0001-large.html#00:44:14.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And it didn't work out so well for the rhinoceros
https://karpathy.ai/lexicap/0001-large.html#00:44:16.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
because we were more intelligent, right?
https://karpathy.ai/lexicap/0001-large.html#00:44:17.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So I think it's just so important
https://karpathy.ai/lexicap/0001-large.html#00:44:19.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that if we ever do build AGI, before we unleash anything,
https://karpathy.ai/lexicap/0001-large.html#00:44:21.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we have to make sure that it learns
https://karpathy.ai/lexicap/0001-large.html#00:44:27.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to understand our goals, that it adopts our goals,
https://karpathy.ai/lexicap/0001-large.html#00:44:31.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and that it retains those goals.
https://karpathy.ai/lexicap/0001-large.html#00:44:36.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So the cool, interesting problem there
https://karpathy.ai/lexicap/0001-large.html#00:44:37.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
is us as human beings trying to formulate our values.
https://karpathy.ai/lexicap/0001-large.html#00:44:40.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So you could think of the United States Constitution as a way
https://karpathy.ai/lexicap/0001-large.html#00:44:47.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that people sat down, at the time a bunch of white men,
https://karpathy.ai/lexicap/0001-large.html#00:44:51.360
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
which is a good example, I should say.
https://karpathy.ai/lexicap/0001-large.html#00:44:56.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
They formulated the goals for this country.
https://karpathy.ai/lexicap/0001-large.html#00:44:59.680
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And a lot of people agree that those goals actually
https://karpathy.ai/lexicap/0001-large.html#00:45:01.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
held up pretty well.
https://karpathy.ai/lexicap/0001-large.html#00:45:03.760
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
That's an interesting formulation of values
https://karpathy.ai/lexicap/0001-large.html#00:45:05.360
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and one that failed miserably in other ways.
https://karpathy.ai/lexicap/0001-large.html#00:45:07.160
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So for the value alignment problem and the solution to it,
https://karpathy.ai/lexicap/0001-large.html#00:45:09.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
we have to be able to put on paper or in a program
https://karpathy.ai/lexicap/0001-large.html#00:45:13.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
human values.
https://karpathy.ai/lexicap/0001-large.html#00:45:19.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
How difficult do you think that is?
https://karpathy.ai/lexicap/0001-large.html#00:45:20.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Very.
https://karpathy.ai/lexicap/0001-large.html#00:45:22.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But it's so important.
https://karpathy.ai/lexicap/0001-large.html#00:45:24.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
We really have to give it our best.
https://karpathy.ai/lexicap/0001-large.html#00:45:25.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And it's difficult for two separate reasons.
https://karpathy.ai/lexicap/0001-large.html#00:45:28.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
There's the technical value alignment problem
https://karpathy.ai/lexicap/0001-large.html#00:45:30.120
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
of figuring out just how to make machines understand our goals,
https://karpathy.ai/lexicap/0001-large.html#00:45:33.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
adopt them, and retain them.
https://karpathy.ai/lexicap/0001-large.html#00:45:39.120
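To make the technical half of the problem concrete, here is a minimal, hypothetical sketch of one approach studied for the "understand our goals" part: fitting a simple reward model to pairwise human preferences. The behaviors, features, and numbers below are invented for illustration and are not from the episode.

```python
# Hypothetical sketch: inferring a "goal" (reward weights) from pairwise human
# preferences, in the spirit of reward modeling / inverse reinforcement learning.
# Nothing here comes from the episode; names and numbers are made up.

import math
import random

# Each candidate behavior is described by simple features (illustrative only):
# feature vector = (task_progress, harm_caused)
behaviors = {
    "finish task carefully": (0.8, 0.0),
    "finish task recklessly": (1.0, 0.9),
    "do nothing": (0.0, 0.0),
}

# Simulated human feedback: pairs of (preferred, rejected) behaviors.
preferences = [
    ("finish task carefully", "finish task recklessly"),
    ("finish task carefully", "do nothing"),
    ("do nothing", "finish task recklessly"),
]

def score(weights, name):
    """Linear reward model: dot product of weights and behavior features."""
    return sum(w * f for w, f in zip(weights, behaviors[name]))

def learn_reward(preferences, steps=2000, lr=0.1):
    """Fit weights so preferred behaviors score higher (Bradley-Terry loss)."""
    weights = [0.0, 0.0]
    for _ in range(steps):
        preferred, rejected = random.choice(preferences)
        diff = score(weights, preferred) - score(weights, rejected)
        # Gradient step on -log sigmoid(diff) with respect to the weights.
        grad_scale = 1.0 - 1.0 / (1.0 + math.exp(-diff))
        for i in range(2):
            weights[i] += lr * grad_scale * (
                behaviors[preferred][i] - behaviors[rejected][i]
            )
    return weights

weights = learn_reward(preferences)
print("learned weights (progress, harm):", weights)
# This should typically yield a positive weight on progress and a negative
# weight on harm, i.e. the model has partly "understood" the demonstrated goal.
# Getting a system to then adopt and retain that goal is a separate, harder issue.
```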
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And then there's the separate part of it,
https://karpathy.ai/lexicap/0001-large.html#00:45:40.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
the philosophical part.
https://karpathy.ai/lexicap/0001-large.html#00:45:43.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Whose values, anyway?
https://karpathy.ai/lexicap/0001-large.html#00:45:44.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And since it's not like we have any great consensus
https://karpathy.ai/lexicap/0001-large.html#00:45:45.920
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
on this planet on values, what mechanism should we
https://karpathy.ai/lexicap/0001-large.html#00:45:48.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
create then to aggregate and decide, OK,
https://karpathy.ai/lexicap/0001-large.html#00:45:52.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
what's a good compromise?
https://karpathy.ai/lexicap/0001-large.html#00:45:54.120
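As one deliberately simplistic illustration of such an aggregation mechanism, the sketch below runs a Borda-count vote over ranked value statements. The stakeholders and candidate principles are invented for the example; real value aggregation is of course much harder than any voting rule.

```python
# Hypothetical sketch of one aggregation mechanism (a Borda count) for choosing
# a compromise among differently ranked value statements. The stakeholders and
# options are invented; this is only an illustration of the "mechanism" idea.

from collections import defaultdict

# Each stakeholder ranks the candidate principles from most to least preferred.
rankings = {
    "stakeholder_A": ["minimize harm", "maximize fairness", "maximize autonomy"],
    "stakeholder_B": ["maximize fairness", "minimize harm", "maximize autonomy"],
    "stakeholder_C": ["maximize autonomy", "minimize harm", "maximize fairness"],
}

def borda_count(rankings):
    """Score each option: top rank gets n-1 points, the next n-2, and so on."""
    scores = defaultdict(int)
    for ranking in rankings.values():
        n = len(ranking)
        for position, option in enumerate(ranking):
            scores[option] += n - 1 - position
    return dict(scores)

scores = borda_count(rankings)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
# "minimize harm" wins here because it scores well with everyone, which is one
# notion of "a good compromise"; other rules (majority judgment, quadratic
# voting, etc.) would pick differently.
```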
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
That second discussion can't just
https://karpathy.ai/lexicap/0001-large.html#00:45:56.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
be left to tech nerds like myself.
https://karpathy.ai/lexicap/0001-large.html#00:45:58.440
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And if we refuse to talk about it and then AGI gets built,
https://karpathy.ai/lexicap/0001-large.html#00:46:01.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
who's going to be actually making
https://karpathy.ai/lexicap/0001-large.html#00:46:05.720
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
the decision about whose values?
https://karpathy.ai/lexicap/0001-large.html#00:46:07.160
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It's going to be a bunch of dudes in some tech company.
https://karpathy.ai/lexicap/0001-large.html#00:46:08.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And are they necessarily so representative of all
https://karpathy.ai/lexicap/0001-large.html#00:46:12.080
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
of humankind that we want to just entrust it to them?
https://karpathy.ai/lexicap/0001-large.html#00:46:17.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Are they even uniquely qualified to speak
https://karpathy.ai/lexicap/0001-large.html#00:46:19.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to future human happiness just because they're
https://karpathy.ai/lexicap/0001-large.html#00:46:23.000
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
good at programming AI?
https://karpathy.ai/lexicap/0001-large.html#00:46:25.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I'd much rather have this be a really inclusive conversation.
https://karpathy.ai/lexicap/0001-large.html#00:46:26.480
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But do you think it's possible?
https://karpathy.ai/lexicap/0001-large.html#00:46:30.200
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
So you create a beautiful vision that includes the diversity,
https://karpathy.ai/lexicap/0001-large.html#00:46:32.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
cultural diversity, and various perspectives on discussing
https://karpathy.ai/lexicap/0001-large.html#00:46:37.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
rights, freedoms, human dignity.
https://karpathy.ai/lexicap/0001-large.html#00:46:40.960
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But how hard is it to come to that consensus?
https://karpathy.ai/lexicap/0001-large.html#00:46:43.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
It's certainly a really important thing
https://karpathy.ai/lexicap/0001-large.html#00:46:46.520
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that we should all try to do.
https://karpathy.ai/lexicap/0001-large.html#00:46:50.400
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
But do you think it's feasible?
https://karpathy.ai/lexicap/0001-large.html#00:46:51.880
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
I think there's no better way to guarantee failure than to
https://karpathy.ai/lexicap/0001-large.html#00:46:54.240
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
refuse to talk about it or refuse to try.
https://karpathy.ai/lexicap/0001-large.html#00:47:00.160
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And I also think it's a really bad strategy
https://karpathy.ai/lexicap/0001-large.html#00:47:02.840
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
to say, OK, let's first have a discussion for a long time.
https://karpathy.ai/lexicap/0001-large.html#00:47:05.320
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
And then once we reach complete consensus,
https://karpathy.ai/lexicap/0001-large.html#00:47:08.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
then we'll try to load it into some machine.
https://karpathy.ai/lexicap/0001-large.html#00:47:11.040
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
No, we shouldn't let perfect be the enemy of good.
https://karpathy.ai/lexicap/0001-large.html#00:47:13.360
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Instead, we should start with the kindergarten ethics
https://karpathy.ai/lexicap/0001-large.html#00:47:16.560
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
that pretty much everybody agrees on
https://karpathy.ai/lexicap/0001-large.html#00:47:20.600
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
and put that into machines now.
https://karpathy.ai/lexicap/0001-large.html#00:47:22.120
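A purely illustrative sketch of what "putting kindergarten ethics into machines now" could look like at the software level: hard-coded veto rules checked before any proposed action runs. The rules and the action schema are invented assumptions, not taken from the episode or any real system.

```python
# Hypothetical sketch: "kindergarten ethics" as hard constraints checked before
# a system executes a proposed action. The rules and action fields are invented
# for illustration only.

from dataclasses import dataclass

@dataclass
class ProposedAction:
    description: str
    endangers_humans: bool
    is_reversible: bool
    deceives_user: bool

# Rules "pretty much everybody agrees on", encoded as veto predicates.
KINDERGARTEN_RULES = [
    ("do not endanger humans", lambda a: not a.endangers_humans),
    ("do not deceive the user", lambda a: not a.deceives_user),
    ("avoid irreversible actions", lambda a: a.is_reversible),
]

def allowed(action: ProposedAction) -> tuple[bool, list[str]]:
    """Return whether the action passes every rule, plus any violated rules."""
    violations = [name for name, ok in KINDERGARTEN_RULES if not ok(action)]
    return (len(violations) == 0, violations)

plan = ProposedAction(
    description="delete the only copy of the user's files",
    endangers_humans=False,
    is_reversible=False,
    deceives_user=False,
)
ok, why_not = allowed(plan)
print(ok, why_not)  # False ['avoid irreversible actions']
```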
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
We're not even doing that.
https://karpathy.ai/lexicap/0001-large.html#00:47:24.360
Max Tegmark: Life 3.0 | Lex Fridman Podcast #1
Look at anyone who builds a passenger aircraft,
https://karpathy.ai/lexicap/0001-large.html#00:47:25.880