Why fascism is so tempting — and how your data could power it | Yuval Noah Harari

https://youtu.be/xHHb7R3kx40

Entire Transcript

Hello, everyone. It’s a bit funny, because I did write that humans will become digital, but I didn’t think it would happen so fast, and that it would happen to me. But here I am, as a digital avatar, and here you are, so let’s start.
And let’s start with a question. How many fascists are there in the audience today?

(Laughter)

Well, it’s a bit difficult to say, because we’ve forgotten what fascism is. People now use the term “fascist” as a kind of general-purpose abuse. Or they confuse fascism with nationalism. So let’s take a few minutes to clarify what fascism actually is, and how it is different from nationalism.
The milder forms of nationalism have been among the most benevolent of human creations. Nations are communities of millions of strangers who don’t really know each other. For example, I don’t know the eight million people who share my Israeli citizenship. But thanks to nationalism, we can all care about one another and cooperate effectively. This is very good. Some people, like John Lennon, imagine that without nationalism, the world would be a peaceful paradise. But far more likely, without nationalism, we would have been living in tribal chaos. If you look today at the most prosperous and peaceful countries in the world, countries like Sweden and Switzerland and Japan, you will see that they have a very strong sense of nationalism. In contrast, countries that lack a strong sense of nationalism, like Congo and Somalia and Afghanistan, tend to be violent and poor.
So what is fascism, and how is it different from nationalism? Well, nationalism tells me that my nation is unique, and that I have special obligations towards my nation. Fascism, in contrast, tells me that my nation is supreme, and that I have exclusive obligations towards it. I don’t need to care about anybody or anything other than my nation.

Usually, of course, people have many identities and loyalties to different groups. For example, I can be a good patriot, loyal to my country, and at the same time, be loyal to my family, my neighborhood, my profession, humankind as a whole, truth and beauty. Of course, when I have different identities and loyalties, it sometimes creates conflicts and complications. But, well, who ever told you that life was easy? Life is complicated. Deal with it.
Fascism is what happens when people try to ignore the complications and to make life too easy for themselves. Fascism denies all identities except the national identity and insists that I have obligations only towards my nation. If my nation demands that I sacrifice my family, then I will sacrifice my family. If the nation demands that I kill millions of people, then I will kill millions of people. And if my nation demands that I betray truth and beauty, then I should betray truth and beauty.

For example, how does a fascist evaluate art? How does a fascist decide whether a movie is a good movie or a bad movie? Well, it’s very, very, very simple. There is really just one yardstick: if the movie serves the interests of the nation, it’s a good movie; if the movie doesn’t serve the interests of the nation, it’s a bad movie. That’s it. Similarly, how does a fascist decide what to teach kids in school? Again, it’s very simple. There is just one yardstick: you teach the kids whatever serves the interests of the nation. The truth doesn’t matter at all.
Now, the horrors of the Second World War and of the Holocaust remind us of the terrible consequences of this way of thinking. But usually, when we talk about the ills of fascism, we do so in an ineffective way, because we tend to depict fascism as a hideous monster, without really explaining what was so seductive about it. It’s a bit like these Hollywood movies that depict the bad guys — Voldemort or Sauron or Darth Vader — as ugly and mean and cruel. They’re cruel even to their own supporters. When I see these movies, I never understand — why would anybody be tempted to follow a disgusting creep like Voldemort?

The problem with evil is that in real life, evil doesn’t necessarily look ugly. It can look very beautiful. This is something that Christianity knew very well, which is why in Christian art, as opposed to Hollywood, Satan is usually depicted as a gorgeous hunk. This is why it’s so difficult to resist the temptations of Satan, and why it is also difficult to resist the temptations of fascism.
Fascism makes people see themselves as belonging to the most beautiful and most important thing in the world — the nation. And then people think, “Well, they taught us that fascism is ugly. But when I look in the mirror, I see something very beautiful, so I can’t be a fascist, right?” Wrong. That’s the problem with fascism. When you look in the fascist mirror, you see yourself as far more beautiful than you really are. In the 1930s, when Germans looked in the fascist mirror, they saw Germany as the most beautiful thing in the world. If today, Russians look in the fascist mirror, they will see Russia as the most beautiful thing in the world. And if Israelis look in the fascist mirror, they will see Israel as the most beautiful thing in the world.
This does not mean that we are now facing a rerun of the 1930s. Fascism and dictatorships might come back, but they will come back in a new form, a form which is much more relevant to the new technological realities of the 21st century.

In ancient times, land was the most important asset in the world. Politics, therefore, was the struggle to control land. And dictatorship meant that all the land was owned by a single ruler or by a small oligarchy. In the modern age, machines became more important than land. Politics became the struggle to control the machines. And dictatorship meant that too many of the machines became concentrated in the hands of the government or of a small elite. Now data is replacing both land and machines as the most important asset. Politics becomes the struggle to control the flows of data. And dictatorship now means that too much data is being concentrated in the hands of the government or of a small elite.
The greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient than democracies. In the 20th century, democracy and capitalism defeated fascism and communism because democracy was better at processing data and making decisions. Given 20th-century technology, it was simply inefficient to try and concentrate too much data and too much power in one place. But it is not a law of nature that centralized data processing is always less efficient than distributed data processing. With the rise of artificial intelligence and machine learning, it might become feasible to process enormous amounts of information very efficiently in one place, to take all the decisions in one place, and then centralized data processing will be more efficient than distributed data processing. And then the main handicap of authoritarian regimes in the 20th century — their attempt to concentrate all the information in one place — will become their greatest advantage.
Another technological danger that threatens the future of democracy is the merger of information technology with biotechnology, which might result in the creation of algorithms that know me better than I know myself. And once you have such algorithms, an external system, like the government, cannot just predict my decisions, it can also manipulate my feelings, my emotions. A dictator may not be able to provide me with good health care, but he will be able to make me love him and to make me hate the opposition. Democracy will find it difficult to survive such a development because, in the end, democracy is not based on human rationality; it’s based on human feelings. During elections and referendums, you’re not being asked, “What do you think?” You’re actually being asked, “How do you feel?” And if somebody can manipulate your emotions effectively, democracy will become an emotional puppet show.
So what can we do to prevent the return of fascism and the rise of new dictatorships? The number one question that we face is: Who controls the data? If you are an engineer, then find ways to prevent too much data from being concentrated in too few hands. And find ways to make sure that distributed data processing is at least as efficient as centralized data processing. This will be the best safeguard for democracy.
As for the rest of us who are not engineers, the number one question facing us is how not to allow ourselves to be manipulated by those who control the data. The enemies of liberal democracy have a method. They hack our feelings. Not our emails, not our bank accounts — they hack our feelings of fear and hate and vanity, and then use these feelings to polarize and destroy democracy from within. This is actually a method that Silicon Valley pioneered in order to sell us products. But now, the enemies of democracy are using this very method to sell us fear and hate and vanity. They cannot create these feelings out of nothing. So they get to know our own preexisting weaknesses, and then use them against us. And it is therefore the responsibility of all of us to get to know our weaknesses and make sure that they do not become a weapon in the hands of the enemies of democracy.
Getting to know our own weaknesses will also help us to avoid the trap of the fascist mirror. As we explained earlier, fascism exploits our vanity. It makes us see ourselves as far more beautiful than we really are. This is the seduction. But if you really know yourself, you will not fall for this kind of flattery. If somebody puts a mirror in front of your eyes that hides all your ugly bits and makes you see yourself as far more beautiful and far more important than you really are, just break that mirror.

Thank you.

(Applause)
Chris Anderson: Yuval, thank you. Goodness me. It’s so nice to see you again. So, if I understand you right, you’re alerting us to two big dangers here. One is the possible resurgence of a seductive form of fascism, but close to that, dictatorships that may not exactly be fascistic, but control all the data. I wonder if there’s a third concern that some people here have already expressed, which is where, not governments, but big corporations control all our data. What do you call that, and how worried should we be about that?

Yuval Noah Harari: Well, in the end, there isn’t such a big difference between the corporations and the governments, because, as I said, the question is: Who controls the data? This is the real government. Whether you call it a corporation or a government — if it’s a corporation and it really controls the data, this is our real government. So the difference is more apparent than real.
CA: But somehow, at least with corporations, you can imagine market mechanisms where they can be taken down. I mean, if consumers just decide that the company is no longer operating in their interest, it does open the door to another market. It seems easier to imagine that than, say, citizens rising up and taking down a government that is in control of everything.

YNH: Well, we are not there yet, but again, if a corporation really knows you better than you know yourself — or at least can manipulate your own deepest emotions and desires, and you won’t even realize — you will think this is your authentic self. So in theory, yes, in theory, you can rise against a corporation, just as, in theory, you can rise against a dictatorship. But in practice, it is extremely difficult.
CA: So in “Homo Deus,” you argue that this would be the century when humans kind of became gods, either through development of artificial intelligence or through genetic engineering. Has this prospect of political-system shift, or collapse, impacted your view on that possibility?

YNH: Well, I think it makes it even more likely, and more likely that it will happen faster, because in times of crisis, people are willing to take risks that they wouldn’t otherwise take. And people are willing to try all kinds of high-risk, high-gain technologies. So these kinds of crises might serve the same function as the two world wars in the 20th century. The two world wars greatly accelerated the development of new and dangerous technologies. And the same thing might happen in the 21st century. I mean, you need to be a little crazy to run too fast, let’s say, with genetic engineering. But now you have more and more crazy people in charge of different countries in the world, so the chances are getting higher, not lower.
CA: So, putting it all together, Yuval, you’ve got this unique vision. Roll the clock forward 30 years. What’s your guess — does humanity just somehow scrape through, look back and say, “Wow, that was a close thing. We did it!” Or not?

YNH: So far, we’ve managed to overcome all the previous crises. And especially if you look at liberal democracy and you think things are bad now, just remember how much worse things looked in 1938 or in 1968. So this is really nothing, this is just a small crisis. But you can never know, because, as a historian, I know that you should never underestimate human stupidity.

(Laughter) (Applause)

It is one of the most powerful forces that shape history.
CA: Yuval, it’s been an absolute delight to have you with us. Thank you for making the virtual trip. Have a great evening there in Tel Aviv. Yuval Harari!

YNH: Thank you very much.

(Applause)