Cause and Effect on a Global Scale
Released Wednesday, 3rd July 2024

Episode Transcript
6:00

bomber gets to Kokura, there's briefly cloud cover. And

6:02

they don't want to accidentally drop the bomb somewhere

6:04

that's not the city, because of course that would

6:06

not have the same effect. So they

6:08

decide to go to the secondary target, which

6:10

is Nagasaki. They literally do a loop to

6:12

see, hey, maybe it clears up. Yes. It

6:16

doesn't. Yeah. And on

6:18

to Nagasaki. Exactly. They actually, I think, do loops

6:20

until they're running low on fuel and they're starting

6:22

to think, okay, we're not going to make it

6:24

to the secondary target. So they finally pull the

6:26

plug on Kokura, drop the bomb on Nagasaki. So

6:28

hundreds of thousands of people live or die in

6:30

these cities based on a 19 year

6:32

old vacation and a cloud. And the point

6:34

that I think is important to realize here

6:36

is that if you were modeling this,

6:38

if you were trying to say, how is the US government

6:41

going to determine where to drop the atomic bomb? You

6:43

would not put in your model the vacation

6:45

histories of American government officials or cloud

6:48

cover. You would come up with these very

6:50

obvious big things like where are the places

6:52

that have strategic importance or propaganda value? And

6:54

if you did that, you probably would put

6:56

Kyoto on top of the list and you

6:58

get the wrong answer and you wouldn't get

7:00

the wrong answer because you were stupid. You'd

7:02

get the wrong answer because sometimes things that

7:04

don't seem to be important actually end up

7:06

being the most important factor in an outcome.

7:09

And the Japanese actually have an expression Kokura's

7:11

luck. Tell us what that means to the

7:13

Japanese. Yeah. I think this is a very

7:15

useful thing to think about. It's Kokura's luck

7:17

refers to when you unknowingly escape disaster. So

7:20

it was a long time before the US

7:22

government acknowledged that they were planning to drop

7:24

the bomb on Kokura. So hundreds

7:26

of thousands of people in that city had no

7:28

idea there was an airplane over them that but

7:30

for a cloud would have incinerated the entire city

7:33

and killed most of them. And so

7:35

I think this is the kind of thing where

7:37

one of the ideas that is central to the

7:39

argument in Fluke is that these sorts of things,

7:41

this Kokura's luck is happening to us all the

7:43

time. We're completely oblivious to the

7:45

diversions in our lives and our societies,

7:48

the alternative possible history, simply

7:50

because we can only experience one reality. And what

7:52

we do is we then stitch a narrative back

7:55

where it's A to B. This makes complete sense.

7:57

Here are the five reasons why this happened. I

8:00

think this is a way that we end up

8:02

deluding ourselves into a neater and tidier version of

8:04

the real world. So you describe why we

8:06

can't know what matters most,

8:08

because we can't see the alternative universes.

8:11

I love this quote: We ignore

8:14

the invisible pivots the

8:16

moments that we will never realize were

8:18

consequential, the near misses and near hits

8:20

that are unknown to us because we

8:22

have never seen and will never see

8:24

our alternative possible lives.

8:26

That's really very

8:29

chilling to know that we're just walking through

8:31

life, unaware that, hey,

8:33

atomic bomb over our head, better

8:36

hope the clouds don't clear up. Yeah, I have this

8:38

saying that I refer to a lot in the book

8:40

which is that we control nothing but we influence everything.

8:42

And this is, when you think about this in our

8:44

own lives, I think this is

8:46

something where you realize that there are these diversions happening

8:49

constantly. There's a film in the 1990s

8:51

with Gwyneth Paltrow called Sliding Doors. And

8:53

it has this idea and I sort of riff

8:55

on that with this concept I coined called the

8:57

snooze button effect, where you imagine

9:00

that, you know, it's Tuesday morning. You're a little

9:02

bit groggy, you wake up, the snooze button beckons to

9:04

you. You slap it and you get delayed by

9:06

five minutes. You imagine now your life rewinds

9:08

by 30 seconds and you say no,

9:11

I won't hit the snooze button I'll get out of bed now.

9:13

I think that has changed your life. Now

9:15

the question is how much has it changed your life and

9:17

under some short time scales, maybe things sort of get ironed

9:19

out in the end, but you're gonna have different conversations

9:22

that day. You're gonna talk to different people, you might get

9:24

in a car accident in some days, right? I mean, these

9:26

are the kinds of things that we sort of are

9:28

oblivious to and I think when you think about them

9:31

with social change, it's happening all the time, too. I

9:33

mean, there's just so many ways that

9:35

the world could have unfolded differently but for a few

9:37

small changes. I mean, you know, you think about even

9:39

like 9/11. We

9:41

think about all the variables that go into 9/11, one of them

9:43

that people don't talk about was the weather. It

9:45

was an incredibly blue, blue-sky day. Yes. Yeah,

9:48

and if you had a,

9:51

you know, very, very cloudy day or storm, some of the

9:53

planes wouldn't have taken off on time. They might have had

9:55

a chance to foil some of the plots or if you

9:57

had had a different slate of passengers on Flight 93. So

10:00

if it had gone September 10th or September 12th,

10:03

maybe those passengers don't take down the plane, maybe

10:05

the White House or the Capitol is destroyed, and

10:08

then the world's different. I mean, can you

10:10

imagine how it would change America or geopolitics

10:12

if there was no White House anymore? So

10:14

I think these are the kinds of things

10:16

where you just imagine that there's this straight

10:19

line of cause and effect. And of course, when

10:21

we experience the world, we then explain it. But

10:24

these small changes could really reshape the future.

10:26

Some of them are going to be more

10:29

consequential, like the Kyoto story. Others

10:31

are going to be a little bit less consequential,

10:33

at least on human timescales. But the

10:35

point is, we can't know. And I think that's something that

10:37

is bewildering to think about. So can

10:40

we actually identify cause and effect?

10:42

We tell ourselves stories. We

10:45

have not only narrative fallacy in

10:47

everything we do, because we love

10:49

a good plot line, but there's

10:51

also hindsight bias where we imagine,

10:53

oh, I knew this was coming all

10:55

along. And

10:57

can we really, truly know the

10:59

impact of how A

11:02

leads to B, or

11:04

how something that we think is completely

11:06

meaningless actually has deep

11:08

significance? Yeah, so I very much

11:11

subscribe to this view that all models are wrong, but

11:13

some are useful. George Box. Yes, exactly. But

11:15

I think that one of the things that has been lost on us is

11:17

I think there's so much of the world that runs on

11:19

models that we sometimes forget

11:22

that they are extremely simplified abstractions of

11:24

reality and that we actually don't understand

11:26

how the causation works. And

11:28

I think that creates hubris that's dangerous. So when

11:30

you think about why the atomic bomb ended up

11:32

getting dropped on Hiroshima, there are an infinite number

11:34

of causes. And there are things that we would

11:36

not think about. Geological

11:39

forces forging uranium millions of years ago is part

11:41

of that story. Einstein being born

11:43

is part of that story. The Battle of

11:45

Midway pivoting on a fluke event where the

11:47

US wins because they just happened to stumble

11:49

upon the Japanese fleet at the right moment.

11:52

If any of these things have been different, there's an

11:54

almost infinite number of them where little tweak would have

11:56

been different, a different outcome would have happened. Now, for

11:59

the useful navigation of society, we have to simplify

12:01

reality, because we can't build a model that has

12:03

900,000 variables. So

12:06

what you instead do is you sort of

12:08

say, OK, this is a crude version of

12:10

reality. And I think one of

12:12

the things that is really useful about some models,

12:14

like Google Maps, for example, we know

12:17

that's not the world. We know the map is not

12:19

the territory. You look at Google Maps, and you're not

12:21

like, oh, well, I imagine that that's what the real

12:23

world looks like. It's a clear abstraction. I think when

12:25

we start to get into forecasting and

12:28

other modeling of social change, I

12:30

think we lose sight of the fact that

12:32

we have a Google Maps distortion, and that

12:34

we're actually looking at something that is potentially

12:36

useful to navigate, but is very, very different

12:38

from the real world. Really

12:41

interesting. So let's talk

12:43

about the way different

12:45

schools of thought perceive

12:49

and manage these philosophical

12:51

differences. You point out

12:54

Eastern and Western thinking have

12:57

a very different set of precepts

13:00

because of just the

13:02

nature of each society. In

13:05

the Bible, in Genesis, God proclaims, let us

13:07

make man in our image after

13:09

our likeness, and

13:11

let them have dominion over the fishes, the fowl, the

13:14

cattle, et cetera. Eastern culture takes

13:16

a whole lot more of

13:18

a collectivist approach where you're part

13:20

of a group, not you

13:23

were made in God's image. Tell

13:25

us a little bit about how this schism

13:28

developed, and what

13:30

is the relationship of chaos theory

13:32

to each? Yeah, so this

13:34

is a speculative theory, but it's a

13:36

theory that suggests that the reason why

13:38

Eastern cultures have much more relational concepts

13:40

of interconnectivity between humans and the rest

13:42

of the world and

13:44

human society as well is

13:46

derived from the differences, or proximity

13:48

rather, that humans have to primates,

13:51

for example, in their own cultures.

13:54

So there's lots of monkey gods and so on, and

13:56

there's also, of course, lots of monkeys in many of

13:58

these cultures that are developing. And the

14:00

idea, the hypothesis, is

14:03

that this meant that people could not

14:05

avoid the commonality that we have with

14:07

the rest of the world, right? Whereas

14:09

if you think about biblical societies, if you look

14:11

at animals and you see camels, you

14:14

think like, hey, we are super different. We are separate

14:16

from the rest of the world. So

14:18

the argument is that over the

14:20

long stretch of civilization, that this

14:22

created a slightly different mentality that

14:24

then manifests in what's called relational

14:26

versus atomistic thinking. And

14:28

Western society is atomistic thinking on steroids, which

14:31

is to say, you know, I

14:33

mean, the American dream is very atomistic, individualist.

14:35

It's like, you know, if you just want

14:37

to succeed, then you have to do everything.

14:40

Whereas the relational concepts are much more

14:42

about the interconnections that people have. And

14:45

so I think that also tells you how you think about

14:47

society, right? Social change is either driven by individuals or it's

14:49

driven by systems. And I think

14:51

that there is a way in which Western

14:54

culture, I think, can learn to actually

14:56

appreciate some of the complexity of social

14:58

change more with a healthy

15:00

increased dose of relational thinking. And

15:03

you kind of bring the Eastern

15:05

and Western philosophies together, where you

15:07

discuss the overview effect. And

15:10

it really begins with the

15:12

United States, Western society

15:14

sends astronauts to the moon,

15:16

sends astronauts around the

15:19

earth. And these astronauts are

15:21

chosen, often,

15:23

out of the military, out of the

15:25

Air Force. They're pilots, they're logical,

15:28

they're unfeeling, they're supposed to be

15:32

essentially soldiers. And

15:34

yet all of them have this impact

15:36

when they see the blue green earth

15:38

in its entirety from space. They

15:41

all describe it as being

15:43

overwhelmed by a life

15:46

shattering epiphany on the interconnection

15:49

of everything. That doesn't

15:51

sound very Western. That sounds more like an Eastern

15:53

philosophy. But this has been time and time again,

15:55

lots of astronauts have had this. Yeah, there's, you

15:57

know, it's funny because there's been like 9,500 generations

15:59

of modern humans and 9,497 of them have not

16:01

seen the earth,

16:08

right? So, when

16:10

people do see the earth, they have this profound

16:12

epiphany. And as you say, they were worried about

16:14

sending up philosophers and poets because they

16:16

figured they'd be overwhelmed by the sort of existential awe

16:19

and like, you know, forget to hit the right buttons

16:21

or whatever. So they picked these people who are supposed

16:23

to be robots effectively in their personality. And all of

16:25

them still have this incredible sort of

16:27

epiphany about the interconnection of the world because you

16:29

look at the single planet and you think, okay,

16:32

this is one structure. This is not something where

16:34

I'm this distinct bit. You're like, this is all

16:36

together, right? Now I think what's

16:38

really striking about that is that those

16:40

world views do shape your thinking around social

16:42

change. And I think when you

16:44

start to think that you are in control rather than

16:47

an agent of influence, you have a different worldview. When

16:49

you start to think that you're individual rather than relational,

16:51

you have a different worldview. And all these

16:53

things feed into the ways that we set up models that we sort of

16:56

interact with our conceptions of social

16:58

change and so on. And also the degree

17:00

to which we have hubris that we can control things. And

17:03

I think this is where the danger comes in, right? It's not

17:05

that you shouldn't model. It's not that you shouldn't have abstractions of

17:07

systems. It's that when you start to

17:09

get hubristic about it and think you have top-down

17:11

individualist control, you start to get

17:14

overconfident in ways that you try to tame

17:16

something that I think is untamable. And

17:18

this is where we get shocks more often because you

17:20

try to impose this sort of control on

17:23

a system that is so complex that

17:25

it resists control. And so

17:27

there's some of these things where I think the

17:29

insights, the philosophy behind this, it's

17:31

sort of lurking there invisibly where

17:33

no one says this when they build a model, but

17:36

it's obviously shaping the way they think about it

17:38

and their sort of assumptions before they go into

17:40

trying to determine how to navigate risk and uncertainty.

17:43

Along those lines, you have a great quote in

17:46

the book. God may have created

17:48

the clock, but it was Newton's laws

17:50

that kept it ticking. So

17:52

how do you resolve

17:55

that inherent tension between

17:57

big, driving

18:00

things or random elements affecting

18:03

it, or is there no

18:05

resolving them, they both matter? Yeah,

18:08

so I think it's a question of time scales. And

18:11

I think one of the big problems, and this

18:13

is something that I, you know, it's such a nuanced concept

18:15

that it's sometimes difficult to explain, but I

18:17

think there's a really important point about whether

18:20

ideas that happen for a long

18:22

time seem to be validated by what goes

18:25

on, the patterns that we see, right? Whether

18:27

you can actually falsify a theory when you're talking

18:29

about social change. So my favorite example of this

18:32

is the Arab Spring. In political science, my

18:34

own realm, there is a lot of stuff

18:36

written in sort of 2008, 2009, even

18:39

into 2010 that says, here's

18:42

why Middle Eastern dictatorships are extremely resilient. And there's

18:44

all this data showing this, the longevity, et cetera,

18:46

et cetera. And then like within six months of

18:48

some of these books coming out, you know, all

18:50

of them are on fire. I mean, I saw

18:52

a political risk map when I was in grad

18:54

school where like every single country that was on

18:56

fire was green on the political risk map from

18:58

the previous year, right? Now, there's

19:00

two ways of thinking about that. The first

19:02

way is to say the theory has been

19:04

falsified. They were wrong, right? The second

19:07

way of thinking about it is, hold on, maybe

19:09

the world changed. Maybe the patterns of cause and

19:11

effect have actually shifted, right? And I

19:13

think this is something that people don't appreciate that much,

19:15

is they assume that the patterns of the past are

19:17

going to be predictive of the patterns of the future.

19:19

I mean, David Hume came up with this idea hundreds

19:21

of years ago, but it is something that I think

19:24

is particularly important for our world, because

19:26

the patterns of the past being indicative

19:28

of the patterns of the future has never

19:30

before been as flawed of an assumption

19:32

because our world is changing faster than

19:34

ever before. So I think one of

19:36

the issues that we have is when we think about these sort

19:38

of clockwork models, where we say, oh yes, you know, these are

19:40

the ways that things have worked in the past, our

19:43

world is very, very different year to year. And that

19:45

didn't used to happen. I mean, I was talking before

19:47

about these, you know, 9,500 generations of humans. If

19:51

you think about the sort of entirety

19:53

of human history as a 24 hour day, 23

19:56

hours and, like, 10 minutes is

19:59

the hunter-gatherer period. And then you

20:01

get into farming, which is another 30 minutes, and

20:03

then you got a few minutes for the Industrial

20:06

Revolution. And you get to the information age, which

20:08

we're in now, which is 11 seconds in this

20:10

one day clock. And I think the

20:12

point that's important here is that if

20:14

we base almost all of our decision making

20:16

and almost all of our models on causal

20:19

inference from past patterns of behavior, but

20:21

the world is changing year to year, then

20:24

the assumptions we're making are becoming more

20:26

and more short-lived. And I think that's

20:28

where we're embedding risk into our thinking

20:30

because we have no other way

20:32

of inferring cause and effect other than past patterns. There's no

20:35

alternative. That's what Hume says. He's like, this is the only

20:37

way we can understand the world. This is to look at

20:39

what happened in the past. We can't look into the future.

20:42

So I think this is something that I do

20:44

worry about when I see a lot of decision

20:46

making built on this sort of mentality

20:48

of the clockwork model that like, oh yes, well, it's

20:50

just gonna keep ticking along. And there's

20:52

a lot of very smart thinkers who have thought about black swans

20:54

and so on. I just think that we've made a system where

20:57

the black swans are actually gonna be more

20:59

frequent. I think we've designed a system that

21:02

is more prone to systemic risk than before.
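
The 24-hour-day analogy above comes from simple proportions. A minimal sketch of that arithmetic in Python, using round-number era boundaries that are assumptions for illustration (roughly 300,000 years of modern humans, farming about 12,000 years ago, the Industrial Revolution about 250 years ago, an information age of about 35 years), not figures taken from the book:

```python
# Map eras of human history onto a single 24-hour day.
# All era boundaries below are rough, round-number assumptions for illustration only.
HUMAN_HISTORY_YEARS = 300_000          # assumed span of modern humans
ERA_STARTS_YEARS_AGO = {
    "hunter-gatherer": 300_000,        # start of the "day"
    "farming": 12_000,                 # assumed start of agriculture
    "industrial": 250,                 # assumed start of the Industrial Revolution
    "information age": 35,             # assumed start of the information age
}
SECONDS_PER_DAY = 24 * 60 * 60

def to_clock(seconds: float) -> str:
    """Format a number of seconds as h:mm:ss on the 24-hour clock."""
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

# Each era lasts from its start until the next era's start (or the present).
names = list(ERA_STARTS_YEARS_AGO)
for era, nxt in zip(names, names[1:] + [None]):
    start = ERA_STARTS_YEARS_AGO[era]
    end = ERA_STARTS_YEARS_AGO[nxt] if nxt else 0
    share = (start - end) / HUMAN_HISTORY_YEARS
    print(f"{era:>15}: {to_clock(share * SECONDS_PER_DAY)} of the day")
```

Under those assumed dates the hunter-gatherer era fills about 23 hours of the day, farming a bit under an hour, the industrial era about a minute, and the information age roughly the last ten seconds, the same order of magnitude as the figures quoted in the conversation.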

21:04

Especially given not only does information move faster

21:06

than ever, but we're more

21:09

interconnected, we're more related, and

21:11

it becomes increasingly difficult if

21:13

not impossible to figure out

21:15

what are the unanticipated results,

21:18

consequences, side effects

21:20

of anything that we do. Yeah,

21:23

and this is one of those things where

21:25

I think there's some pretty good examples from

21:27

history of when somebody tries to control a

21:29

system that is uncontrollable and it

21:31

backfires catastrophically. And my favorite example is,

21:33

I shouldn't say favorite, it's a horrible

21:35

tragedy, but the best illustration of this

21:38

is Mao has this idea in communist China. He

21:40

has this idea, he says, we're gonna

21:42

eradicate disease, and the way we're gonna do this is

21:45

massive four pests campaign, so we're gonna kill all these

21:47

pests. So he basically tells

21:49

everyone to go out and kill

21:51

all these various things that potentially are vectors

21:54

of disease. And what it

21:56

ultimately does, it leads to one of the worst

21:58

famines in human history because they've... disrupted the ecosystem

22:00

and they figure, oh, as long as we just

22:02

get rid of these pests, it will be fine.

22:04

What they actually have done is they've made it

22:06

so the crops fail. And

22:09

so this is the kind of stuff where I

22:11

think it's the parable that warns us of

22:14

assuming that simply because we

22:16

have either had some success in the past or

22:18

because our model seems to guide us in this way

22:21

that we can therefore insert ourselves into a system

22:23

and not worry about the unintended consequences. I think

22:25

that's the kind of thing where a lot of

22:27

the people who are the doomers in AI are

22:29

talking about this. There are some things where when

22:32

you have AI-based decision making, the

22:35

training data is the past. So there

22:37

are some things that I think are

22:39

getting worse in this front. And we

22:41

are also, as you said,

22:43

the interconnectivity. One

22:45

of my favorite examples of this is the Suez

22:47

Canal boat, the infamous Suez Canal boat. You

22:50

have a gust of wind that hits a boat

22:52

and twists it sideways. It gets lodged in the

22:54

canal. And the best estimate I've seen is

22:56

that it created $54 billion of economic

22:58

damage. And they said it was something like 0.2% to 0.4%

23:00

of global GDP could have been

23:02

wiped off by this one

23:04

boat. Now, the question is, is there ever

23:06

another moment in human history where one boat

23:09

could do that? And I think the answer

23:11

is quite clearly no. So

23:13

maybe the one that brought the plague. But

23:16

I mean, this is the kind of stuff where

23:18

I think one of the lessons that I think

23:20

is important is that there's a trade-off very

23:23

often between optimization and resilience. And

23:26

I think we're told all the time

23:28

efficiency and optimization are the guiding

23:31

principles of so many of our systems. But

23:34

they come at a cost. They do create less resilience.

23:36

And I think there are some things where the

23:39

long-term planning that we can do is to put a

23:41

little bit more into resilience and a little bit less

23:43

into optimization. It will cost us money in the short

23:45

term, but it will probably save us a hell of

23:47

a lot of money in the long term. Really, really

23:50

interesting.

24:21

So I found the book fascinating, and

24:23

I really enjoyed where you

24:25

go down the evolutionary biology

24:28

rabbit hole, starting with convergence,

24:31

which is the 'everything happens for

24:33

a reason' school of evolutionary

24:36

biology. Contingency is,

24:38

the G-rated

24:40

version is, the 'stuff happens' theory. Explain

24:43

the difference between the two. Yes,

24:45

I think that evolutionary biology has a

24:47

lot to teach us about understanding change. It's

24:49

a historical science and they're trying to understand

24:51

you know the origin story of species and

24:54

they're thinking about cause and effect just as people

24:56

in economics and politics are as well. And

24:59

so these two ideas, they're very simple

25:01

to understand with two examples. The first example

25:03

of contingency is the asteroid that wipes out

25:05

the dinosaurs. Now, this asteroid,

25:07

which was, by the way, produced by

25:09

an oscillation in a place called the Oort

25:11

cloud, in the distant reaches of space. The absolute

25:14

outer ring of assorted

25:18

detritus that surrounds the entire solar

25:20

system beyond Pluto. Yeah. So this

25:22

oscillation flings the space rock

25:25

towards Earth, and it hits

25:27

in the most destructive way possible. It hits in

25:29

the ocean in a way that brings up a

25:31

lot of toxic gas and effectively incinerates

25:33

the dinosaurs because the surface temperature went up to

25:35

about the same level as a broiled chicken. I

25:38

mean, it was deadly, right?

25:40

Now, the reason this is important is because if it

25:42

had hit a slightly different place on the Earth, the

25:45

dinosaurs probably wouldn't have died out. And let me just

25:47

point out, and you mentioned this in the book, it's

25:49

not like it hits a different continent: five

25:52

seconds earlier, five seconds later, it

25:55

completely misses that sulfur-rich

25:58

site in the

26:00

Yucatan Peninsula. Yeah, so

26:02

I mean, this is the kind of stuff

26:04

where you think about it and it is

26:07

very unsettling because you can imagine everything that

26:09

humans have done. I mean,

26:11

you have a second's difference in this asteroid, there's

26:13

no humans. Because the extinction of the dinosaurs is

26:15

what led to the rise of mammals and eventually

26:17

the evolution of us. And so

26:20

this is contingency. It's where this small

26:22

change could radically reshape the future. Now,

26:24

convergence is the alternative hypothesis and they

26:26

both exist, right? The sort of order

26:28

and disorder. And convergence

26:30

says, okay, yeah, there's a lot of noise. There's

26:32

a lot of fluctuations and flukes. But

26:35

eventually things that work win, right? So my

26:37

favorite example of this is that if you

26:39

look at, if you were to take out

26:41

a human eye and you were

26:43

to look at it and you were to compare

26:45

it next to an octopus's eye, they're actually extremely

26:47

similar, which is bizarre because there's about 600 million

26:49

years of separate evolutionary

26:52

pathways for the two branches of

26:54

life. And the reason this

26:56

happened isn't because, you know, we

26:58

just got super lucky. It's because evolution came up

27:00

with a strategy by random experimentation

27:02

that simply worked. It made the species

27:04

navigate the world effectively long enough to

27:07

survive to have offspring, which is the

27:09

engine of evolution, right? So this

27:11

is the kind of stuff where, yeah, there was like

27:13

a lot of very profound differences. I mean, we do

27:15

not look like octopus, thank goodness. But it's something where,

27:17

as a result of that, the eye is basically the

27:20

same. And so the question

27:22

here, I think, is can

27:24

we apply these frameworks to our own change, right?

27:26

In our own societies. And so what I try

27:28

to say is, okay, there's some stuff that is

27:30

ordered. There's lots of regularity. There's lots of patterns

27:32

in our lives. That's the convergence stuff.

27:34

At some point, you know, you go on

27:36

the highway, there might be an accident sometimes,

27:38

but like most of the time, you know,

27:40

the cars drive around the same speed. They

27:43

have space between them. It's about the same distance,

27:45

right? And like, there's all these patterns, but every

27:47

so often there's a car accident and that's contingency,

27:50

right? So this is the kind of stuff where

27:52

what I say is that the way that

27:54

social change happens and also our lives unfold

27:56

is what I call contingent convergence. It's not

27:59

the most beautiful phrase, but it's, I

28:01

think very accurate in saying, okay, so there's

28:03

these contingencies that change the path you're on.

28:06

And then once you're on that path, the sort of

28:09

forces of order do constrain the outcomes

28:11

that are possible. They say, look, this

28:13

stuff's going to work, that stuff's not

28:16

going to work. And the sort of

28:18

survivors bias produces the stuff that does

28:20

work. So I think this is a

28:22

useful framework that I'm borrowing from evolutionary

28:24

biology to help us better understand social

28:26

change. So before I get to contingent

28:29

convergence, I want to stay

28:31

with the difference between contingency, which

28:34

is the meteor killing the dinosaurs

28:36

and allowing mammals to

28:38

rise, and convergence. A

28:40

couple of other examples that

28:43

you give in the book of convergence. Crab

28:46

like bodies, keep evolving

28:48

time and again, there are

28:51

five separate instances;

28:53

that shape somehow seems to provide a useful,

28:55

adaptive way of navigating the world. Yeah. So

28:57

this is, I mean, this is one of

28:59

those things where evolutionary biologists joke about that

29:01

and they always say, you know, eventually we're

29:03

going to have pincers, like, we're all

29:05

going to end up as crabs because like

29:07

evolution, you know. And some of them

29:09

say, if there is a God,

29:11

he really likes crabs. And this is actually,

29:13

you know, I actually heard that about beetles,

29:15

but there's actually a word for

29:18

this. Carcinization is the process of

29:20

evolving towards a crab-like shape.

29:23

Similarly, flight. I never thought about this

29:25

until I read it in the book,

29:27

flight evolved four separate times. It's insects,

29:30

it's bats, it's birds, and it's pterosaurs.

29:32

That that's amazing. Yeah. I mean, this

29:35

is the stuff where, you know, evolution

29:37

is the, it's a

29:39

really powerful lesson of the value

29:41

of undirected experimentation because every strange thing that

29:44

we see around us, every, you know, organism,

29:46

every plant, et cetera, is just the byproduct

29:48

of this undirected experimentation, navigating uncertainty, right? I

29:51

mean, that the world is changing all the

29:53

time. There's different concentrations of oxygen. They sometimes

29:55

have to be in the ocean, sometimes have

29:57

to be on land. And the, this

30:00

sort of diverse array of life is

30:02

just undirected experimentation. But the thing is

30:05

that these forces do end

30:07

up constraining the possibilities. Now, when

30:09

we talk about carcinization, there's a really interesting thing that I

30:11

don't go into much depth in the book, but it's called

30:13

the Burgess Shale up in Canada and the Canadian Rockies. And

30:16

it's basically like this fossilized

30:19

museum of all these really wild body

30:21

plans that used to exist hundreds of

30:23

millions of years ago before a mass

30:25

extinction event. And what happened

30:27

is they all got obliterated, so you can't

30:29

have any sort of convergence from those body

30:31

plans because they don't exist anymore. Whereas

30:33

the ones that survived, all of us are

30:36

derived from them, right? So the contingency is

30:38

like, okay, which body plans exist? Which sort

30:40

of ways could you set up life with

30:42

spines or not spines, whatever it is. And

30:45

then once you have that contingent event where

30:47

there's the extinction, within that there's

30:49

this sort of constrained evolution that is, okay,

30:52

well, when this happens, the animal dies. So

30:54

it doesn't exist very long. And when

30:56

this happens, the animal survives, so it does

30:58

exist. And this is where carcinization, you need

31:00

to have a term because the crabs

31:03

are very much survivors. And

31:05

it turns out that unless you were on the other

31:07

side of the planet from where the

31:10

meteor hit, if you're

31:12

a burrower, if you get underground, you

31:14

could survive those fires in that heat

31:18

and then come out and continue

31:20

the evolutionary process. Yeah, I mean, this is

31:22

the thing, I find this really fascinating to

31:24

think about, but also unsettling is that, all

31:28

the life that exists now is basically offspring

31:30

of either something that could dig when

31:32

the asteroid hit or that lived in

31:34

the ocean. And that's it, right? Because

31:36

everything else died. Now, the really

31:39

strange thing to think about as well is that I

31:41

told the story about my great grandfather's first wife

31:43

and then there's this murder and so on, but

31:45

you keep tracing these things back, right? So my

31:48

great grandfather's ancestors had to meet in just the

31:50

right way. And their great grandfather, they had to

31:52

meet. But you go back then six million years,

31:54

this chimpanzee-like creature had to meet another chimpanzee-like creature

31:57

and the two of them mating is part of

31:59

the story of human existence. You go back further,

32:01

you know, there's a worm-like creature hundreds of millions

32:03

of years ago. It dies, we probably don't exist.

32:06

Or my favorite example I think in the book

32:08

is, this is a finding from modern science

32:11

about a year ago, was they

32:13

found out that the reason why mammals

32:15

don't lay eggs, right, why we don't

32:17

have eggs and we instead have live

32:19

births, is they believed based

32:21

on genetic testing that a single shrew-like

32:23

creature got infected by a virus a

32:25

hundred million years ago which caused a

32:27

mutation which led to placenta and the

32:30

rise of mammals. And you think

32:32

of, I mean, to me that is just so

32:34

utterly bizarre to imagine that our existence, like everything

32:36

in humans, you know, ancient Rome, all this stuff,

32:38

you know, Donald Trump, whatever it is, all of

32:41

it is completely contingent on a shrew-like creature a

32:43

hundred million years ago getting sick. It's

32:45

like when you think about this stuff, I think

32:47

evolutionary biology tells us, you know, they have encountered black

32:49

swans throughout hundreds of millions of years. It's basically

32:51

the origin story of complex

32:54

life. So let's talk about one

32:56

of those black swans and the

32:58

specific concept of contingent

33:01

convergence. I love

33:03

the example you use of the

33:05

long-term evolution experiment using

33:07

E. coli, 12 identical

33:10

flasks of E.

33:13

coli in

33:15

separate

33:17

but identical environments, and run

33:20

10 million years worth of

33:22

human evolution through it. What's

33:24

the results of that? Yeah, this one, this one,

33:26

making E. coli sexy in a book is pretty

33:29

hard, I must say, but I think this is

33:31

such a powerful lesson for change. So I had

33:33

to include it. I flew out to Michigan State

33:35

to meet with the people running the long-term evolution

33:37

experiment and the simple idea they had, the genius

33:40

idea was they said, let's see

33:42

what happens if we take 12 identical

33:44

populations of E. coli. So they're genetically

33:46

identical. We put them in

33:48

12 flasks and we just evolve them

33:50

for decades, right? And because E. coli

33:53

life cycles are so short, it's basically

33:55

the equivalent of millions of years of

33:57

human evolution. Like multiple lifespans a day,

33:59

generations. Exactly. So it's

34:01

like, it's the equivalent of it. If you went

34:03

through like great, great, great grandparents each day, right?

34:06

Now the beauty of this experiment is they controlled

34:08

everything. So there's nothing in these flasks except

34:10

for a glucose and citrate

34:13

mix because the glucose is food

34:15

for the E. coli and the citrate is like a

34:17

stabilizer. Okay? Now what happens they

34:19

figure, okay, let's test contingency or convergence

34:22

and for like the first 15 years or so

34:24

the experiment The lesson was okay.

34:26

It's it's convergence because all 12 of the

34:28

lines were evolving in slightly different ways. There's

34:30

noise, right? There's little differences. The genome is

34:32

not the same, but they're

34:34

basically all getting fitter at eating

34:37

glucose so they're getting better at surviving and

34:40

then one day a researcher comes in and one of

34:42

the flasks is cloudy, and this is not supposed to

34:44

be the way it is. It looks like a little

34:46

bit of milk has been dropped into it instead of

34:48

this really clear substance that the other 11

34:50

are. So they sort of think,

34:52

oh, this is a mistake, and they throw it out. They

34:55

restart, because they've frozen the E. coli; they freeze

34:57

it, like, the equivalent of every 500

34:59

years. Yeah. So they could

35:02

reset the clock anytime. For all 12 flasks?

35:04

Yes, they're all frozen. They have this sort

35:06

of fossil record. They can restart at any point

35:08

So they restart the experiment in this flask just

35:10

backing up a little bit and about

35:12

two weeks later, I think it is, or something like that,

35:15

the flask turns cloudy again. And they're like, okay, this was

35:18

not an accident, there's something going on here. So they

35:20

actually paid to sequence the genome, very expensive at the

35:22

time, a lot cheaper today. But

35:24

they paid to sequence it, and

35:26

the amazing finding, this is the thing, when I read

35:28

this I was like this is a central way of

35:31

capturing my idea, is that

35:33

when they looked at the genome there

35:35

were four totally random mutations that

35:37

did not matter at all for

35:39

the survivability of the E.

35:41

coli, that proceeded in just

35:44

the right chain that when the fifth

35:46

mutation happened all of the sudden

35:48

that population could now eat the citrate, which was

35:50

not supposed to happen right? It was supposed to only

35:52

eat the glucose; the citrate was there as a

35:54

stabilizer. But as a result

35:56

of this, they became way more fit, way more

35:58

survivable than the other population because they

36:00

could eat something the others couldn't, right? And

36:03

what happened then is that since then, and this has

36:05

now been going on for 20 plus years or so,

36:08

since then, the citrate population has an advantage

36:10

over all of the other 11, and none

36:13

of the others have developed that mutation because

36:15

it's sort of like a house of cards.

36:17

You had to have these exact four accidents

36:19

in exactly the right order. If they'd changed the

36:21

order, it wouldn't have happened. And then they had

36:24

to finally, on top of those four accidents, they had

36:26

to have the fifth accident, which gives them the ability to eat

36:28

citrate. And so this is the

36:30

idea of contingent convergence, right? It's like for

36:32

that population that evolved the ability to eat

36:34

citrate, that one mutation

36:37

has changed everything forever. It will never go

36:39

back to eating glucose the same way as

36:41

the others. But for

36:43

the others that didn't develop that change, they

36:46

are all still evolving in relatively predictable ways.

36:48

So I think this is the capturing

36:51

of the sort of paradox of our

36:53

lives is that we exist somewhere between

36:55

order and disorder. Complete disorder would destroy

36:57

humans, we couldn't exist and our societies

36:59

couldn't function. Complete order also wouldn't work

37:01

because there'd be no change, there'd be

37:03

no innovation and so on. And

37:06

so I think this is where contingent convergence

37:08

really shines, but I will admit that trying

37:10

to do a sound bite version of the

37:12

long-term evolution experiment is something that in writing

37:14

the book was probably the

37:16

greatest challenge of making something about bacteria interesting.
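
To see the shape of that argument in miniature, here is a toy simulation of contingent convergence. It is not the real long-term evolution experiment data or its genetics; the chain length, mutation probability, generation count, and fitness values are invented for illustration. Every flask slowly converges (steady small gains at eating glucose), but only a flask that happens to collect a specific five-step chain of otherwise neutral mutations gets the big, lasting citrate jump:

```python
import random

# Toy model of "contingent convergence" (NOT the real LTEE data or genetics).
# Every flask slowly converges (small steady gains on glucose), but a big,
# lasting fitness jump (citrate use) only arrives after a specific chain of
# otherwise-neutral mutations. All numbers below are invented for illustration.
CHAIN_LENGTH = 5       # hypothetical potentiating mutations plus the final one
P_STEP = 4e-5          # assumed per-generation chance of the next mutation in the chain
GENERATIONS = 60_000
REPLICATES = 12        # twelve flasks

random.seed(1)

for flask in range(1, REPLICATES + 1):
    steps = 0
    fitness = 1.0
    for _ in range(GENERATIONS):
        fitness += 1e-6                          # convergence: steady small gains
        if steps < CHAIN_LENGTH and random.random() < P_STEP:
            steps += 1                           # contingency: next link in the chain
    if steps == CHAIN_LENGTH:
        fitness *= 2.0                           # citrate unlocked: lasting advantage
    label = "citrate+" if steps == CHAIN_LENGTH else "glucose only"
    print(f"flask {flask:2d}: {label:12s} fitness = {fitness:.3f}")
```

Re-running with different seeds typically leaves most flasks on glucose only, with an occasional lucky flask pulling permanently ahead, which is the contingency-then-lock-in pattern described above.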

37:18

But it's really fascinating because if you stop

37:20

and think about that, first of all, the

37:22

genius of doing this over 20 years when

37:25

you have no idea what the outcome is and

37:28

hey, maybe we're wasting our lives and our career doing this,

37:30

number one. But number two, you come

37:32

in and you see that it's cloudy. Is

37:35

it, I'm assuming it's cloudy because

37:37

they're reproducing in greater numbers, they're

37:39

processing the citrate, a whole bunch

37:41

of different stuff is going on

37:44

than the other 11 environments.

37:46

And one has to imagine that

37:49

if this wasn't taking place in an experiment,

37:51

but this was a big

37:54

natural scenario, the

37:57

citrate-consuming E. coli would eventually take over

37:59

the population because they have twice as much

38:02

food available or more than just the plain

38:04

old glucose eating E. coli. Yeah, and this

38:06

is, I mean, what I was talking to,

38:08

so one of the researchers named Richard Lenski,

38:11

the other one, Zach Blount, and I was

38:13

talking to them about this. And they said,

38:15

look, we tried to control everything. We tried

38:17

to control every single, you know, you pipette

38:20

the exact same amount of solution into the

38:22

beakers each day and so on. But

38:24

what they said was that, you know, well, what

38:27

if one day, you know, when we

38:29

were washing the flask, just a

38:31

tiny microscopic amount of soap stayed

38:33

on there, right? That could affect the evolution.

38:36

And so there's, I mean, even, even in this experiment,

38:38

there's contingency they couldn't control, which is, I mean, it's

38:41

the most controlled evolutionary experiment that's ever been done. But

38:43

it's still like, you know, these little tiny bits, if

38:46

you just have, you know, a microscopic bit of soap,

38:48

well, that's going to kill some of the bacteria and

38:50

then the evolutionary pathway is going to be slightly changed.

38:52

And I think this is the stuff where, you know,

38:55

had they been a different researcher, had a grant run

38:57

out, they might've just said, okay, we've solved it.

38:59

It's all convergence because they could have

39:01

shut down the experiment after 15 years. So there's just

39:04

all these things that are like layered on top of

39:06

each other. And I think, you know, a lot of

39:08

scientists, especially in the world of evolutionary biology, understand

39:10

that this is something that we really

39:13

have to take seriously. And I

39:15

think the way that we are

39:17

set up in human society is to ignore

39:19

the contingency, because those are not useful things

39:21

to think about. They're the noise, they're the

39:23

aberrations, they're the outliers, you know, you delete

39:25

them from the data, whatever. And I think

39:27

this is the kind of stuff where the

39:29

lesson here is that those are actually central

39:32

to the question of how change happens. I

39:34

love this quote from the book, I

39:36

began to wonder whether the

39:39

history of humanity is just

39:41

an endless but futile struggle

39:43

to impose order, certainty and

39:45

rationality onto a world defined

39:47

by disorder, chance and chaos.

39:50

Yeah, I mean, I think this is where I became

39:53

a bit of a disillusioned social scientist, to be honest,

39:55

was that I think that the

39:57

way that I was taught to present change

39:59

to people was to come

40:01

up with a really elegant model, you know, a

40:03

really beautiful equation, and that

40:05

has statistical significance and has like the smallest

40:08

number of variables possible to explain the entire

40:10

world. And the reason that I

40:12

ended up, you know, having that mentality that

40:14

I think we're trying to cram complexity into these

40:16

neat and tidy sort of straitjacket models is

40:18

because my PhD dissertation

40:21

and so on, I was looking at the origin story

40:23

of coups and civil wars, that

40:25

was part of my research. And

40:27

these are black swan events. I mean, you know,

40:29

there's only a few coup attempts that happen every

40:31

year, and they're so hard to

40:34

predict. I mean, because, you know, one of

40:36

the coup plots that I studied was

40:38

where this guy, you know, who's a sort of

40:40

mid-level officer in the army, just on a whim,

40:43

decides to try to overthrow the government. And

40:45

he's got like 50 guys in his command, this is in 1997

40:47

in Zambia, and

40:51

his plan is to kidnap the army commander and

40:53

force the army commander to announce the coup on the radio.

40:55

It's not a stupid plan, it's actually, it probably would have

40:57

worked. But the group of soldiers that

40:59

were dispatched to the house, I interviewed some

41:02

of them when I went to Zambia, and

41:04

they said, look, you know, we ran

41:06

in, the army commander's in his pajamas, he runs out the back

41:08

because he sees these soldiers coming to kidnap him, and

41:10

he climbs up the compound wall, and

41:13

you know, it's like in a film where like they

41:15

grab his pant leg, he's pulling up, they're pulling down,

41:17

and they just, he slips through their fingers. And

41:20

he then goes to the government HQ

41:22

and announces that there's a coup plot

41:24

underway, and so the soldiers go to

41:26

the radio station, they capture the coup

41:28

ringleader, who's at this point literally hiding

41:30

in a trash can, okay? Three

41:33

hours after the coup plot has been hatched. Now,

41:35

the problem is I was reading all this stuff

41:37

about like Zambia's democracy, and it was, oh, Zambia's

41:39

a resilient democracy, it's one of the beacons of

41:41

African democracy in the 1990s. And

41:44

I'm trying to reconcile this with the fact that in

41:47

my own research, I'm finding this story where the soldier says

41:49

like, yeah, I think if I was like one second faster,

41:52

I probably would have gotten the government

41:54

overthrown. And on top of this, the other contingency

41:57

was they didn't chase him. And I said, why didn't you

41:59

chase him? They said, well, the

42:01

army commander's wife was really attractive and we

42:03

wanted to talk to her and also we

42:05

opened the fridge and there's

42:07

Namibian import beer in the fridge and

42:10

we hadn't had Namibian beer for a long time. So we said,

42:12

you know, screw this, we're going to, we're going to drink some

42:14

beer and talk to the wife. And I'm

42:16

thinking, you know, like, like, how do I put this

42:18

in my model? Like, you know, I mean, like, like

42:20

what is my quantitative analysis going to show me about

42:22

this? And I think that's the stuff where

42:24

those little pivot points and

42:27

studying really rare events that are highly consequential

42:29

makes you think differently about the nature of

42:31

social change. And I would go to these like political

42:33

science conferences and I was just like,

42:35

I don't, I don't believe this is how the world works.

42:37

I think there are times where these can be useful models,

42:40

but I don't think we're capturing reality accurately. And that's

42:42

where, you know, some of the origin story professionally of

42:44

the book comes from. You

42:47

have to build in attractive women

42:49

and imported beer into your models

42:51

or, more accurately,

42:53

just completely random events. There's

42:55

a research note in the

42:57

book from an

43:00

evolutionary biologist. 78% of

43:02

new species were triggered by

43:04

a single event, typically

43:08

a random mistake or genetic error.

43:10

Yeah. My favorite, my favorite example of this

43:13

is something called the bottleneck effect. And it's

43:15

actually, I think it's actually an important idea

43:17

for economics as well. So I'll start with

43:19

the, the biology, the bottleneck is where a

43:21

population arbitrarily gets reduced to a very small

43:23

number. And the number of people in

43:25

that population could be, you know, it could be 10, it could

43:27

be a hundred, whatever it is, but who

43:30

those 10 or a hundred people are

43:32

really, really matters. So there's

43:34

one island, for example, where half the

43:36

population has asthma because it was populated

43:38

initially by this bottleneck of a very small number

43:40

of people who disproportionately had more asthma than the

43:42

rest of the population. There's

43:44

elephant seals, for example, who got whittled down

43:47

through hunting and so on to something like,

43:49

I think it was 50 breeding pairs or

43:51

something like that. But which exact seals lived

43:54

or died completely changed the trajectory of

43:56

that species. Now I sort of say

43:59

this because human society has had bottlenecks

44:01

at various times. We don't know exactly how small

44:03

they've been, but the hypothesis is

44:05

perhaps that it may have been as few

44:07

as a few thousand humans at one point.

44:10

And which humans were in that group, that

44:12

determined everything for who's alive now, right? So

44:14

if you swap out, you know, one person

44:16

for a different person, you've changed the trajectory

44:19

of the species. Now I think this is

44:21

also true when you think about economics, you

44:23

think about innovation. Every so

44:25

often shocks go through industries and they whittle

44:27

down the competition. And who survives in that

44:30

moment is potentially somewhat arbitrary. It could be

44:32

based on some pressures, it could be a

44:34

smart CEO, but the sort

44:36

of survivors in that bottleneck then will dictate how

44:38

the industry might unfold in the future. I mean,

44:41

you know, Apple has this outsized effect on the

44:43

tech industry, but you know, maybe the timing's a

44:45

little bit different and Apple dies. I mean, it's

44:47

not implausible. Hey, but for Microsoft giving them a

44:50

loan in what was it, 98? But

44:53

for the antitrust case, which

44:55

gave Microsoft an incentive to have

44:58

another survivable operating system, who

45:00

knows? Yeah. And so this, you know, when

45:02

you think about, I think bottlenecks are a

45:04

useful way of thinking about this partly because

45:06

they affect trajectories very, very profoundly, but

45:09

also because they can be arbitrary. And I think this

45:11

is something where what we do

45:13

in human society is we write history

45:15

backwards. So we look at who is successful

45:17

and we say, I mean, hindsight bias, you know,

45:20

many people, I'm sure, have talked to you about

45:22

this, but it's very important to underline that, like,

45:24

when these arbitrary things happen, if

45:26

you then infer causality, that's a neat and

45:28

tidy story, you actually are learning

45:30

exactly the wrong lesson. I mean, the

45:33

reason these particular elephant seals survived is

45:35

probably arbitrary. It just happened to depend

45:37

on who the people who are poaching

45:39

them, you know, happened

45:41

to stumble upon. And then,

45:43

of course, the evolutionary history of that animal is completely changed.
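
The bottleneck effect described here is easy to see in a toy genetic-drift model. A small sketch, assuming simple Wright-Fisher-style resampling of a neutral trait's frequency; the population sizes, generation counts, and starting frequency are made-up values for illustration, not data from the book:

```python
import random

# Toy bottleneck effect: which individuals happen to make it through an
# arbitrary population crash can permanently shift a neutral trait's frequency.
# All sizes and frequencies below are assumptions for illustration.
random.seed(0)

def drift(freq: float, pop_size: int, generations: int) -> float:
    """Wright-Fisher-style drift: resample the trait frequency each generation."""
    for _ in range(generations):
        carriers = sum(random.random() < freq for _ in range(pop_size))
        freq = carriers / pop_size
    return freq

START_FREQ = 0.10                    # assumed frequency of some neutral trait
for trial in range(5):
    f = drift(START_FREQ, pop_size=10_000, generations=20)   # large population
    f = drift(f, pop_size=20, generations=5)                 # arbitrary bottleneck
    f = drift(f, pop_size=10_000, generations=20)            # recovery
    print(f"trial {trial}: post-bottleneck frequency = {f:.2f}")
```

Across trials the large-population phases barely move the frequency, while the 20-individual bottleneck can push it far above or below the starting 10% purely by chance, the same 'who happened to survive' arbitrariness as the elephant-seal and island-asthma examples.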

45:46

So I think that lesson is that, you

45:48

know, sometimes when bottlenecks happen, it reshapes the

45:50

trajectory of the future, but it also is

45:53

inescapably arbitrary at times. And

45:56

we don't like that. I mean, the entire world

45:58

of self-help and the entire world of sort

46:00

of business advice is, oh,

46:03

these people were successful, here's how you replicate it.

46:05

And the replication is always just do what they

46:07

did, right? But I mean, of course, the world's

46:09

different now. I mean, if you do what they

46:12

did, you're just making something that's not truly innovative.

46:14

Right, you can't invent an iPhone today. Exactly. So

46:17

it's fascinating when you talk about

46:19

bottlenecks, I read a book some

46:21

years ago called Last Ape Standing,

46:23

and it talks about all the

46:26

various proto-human

46:28

species, from Cro-Magnon to

46:30

Neanderthal to Homo sapiens.

46:33

And the theory is that in

46:36

the last Ice Age, maybe

46:38

it's 20 or 40,000 years ago, we

46:41

were down to a few thousand

46:43

humans. And but

46:46

for the Ice Age ending when

46:48

it did, another year, again,

46:51

we may not be having this conversation. There may be

46:54

no humans around. Yeah, I mean, this is the, this

46:57

is the stuff also where I think that the sort of

46:59

predictable patterns that people try to impose on the world are

47:02

also subject to whims of timing,

47:04

right? And your example

47:07

is completely apt, and I think it's a very important

47:09

one. And I think it also speaks to the question

47:11

when you say when the Ice Age ends, right? The

47:13

timing issue is so important. Now,

47:15

one of my examples of this that I

47:17

think is so fascinating is you

47:20

think about like our daily lives. And

47:22

our daily lives are basically set up

47:24

in groups of seven. We've got a

47:26

seven day week. Why is that? So

47:28

I start looking into this. And effectively

47:30

what happens is there's this period in

47:33

ancient Rome where they have

47:35

this superstition that says the planets are

47:37

really important for being auspicious and so

47:39

on. And they can see, because

47:41

they don't have telescopes, five planets with

47:43

the naked eye and the sun and the moon. You

47:46

add them up, that's seven. They set up a

47:48

seven day week because of that. That's why we

47:50

divide our lives in seven. And it's because of

47:52

this thing that I also talk about in Fluke,

47:55

which is this concept of lock-in, where an arbitrary

47:57

thing can happen. And then sometimes it persists, sometimes it doesn't,

48:00

and that's often very random. So my

48:02

So my other example of this is that everything we write, everything we say, is derived from English being locked in when the printing press was invented. If the printing press had been invented six decades earlier or six decades later, there'd be a different language, because the language was in flux, and all of a sudden it became really important to have a standardized system. So a lot of people used to write the word had as H-A-D-D-E. Now, that was expensive, because they figured, okay, we've got to typeset this with a bunch of letters; why don't we just do H-A-D? Boom, all of a sudden the language changes. There's a series of things that happen really, really quickly, but they basically produce modern English.

48:35

And so I think about this sort of concept, the arbitrary experimentation and superstition of the Romans then getting locked in, and the empire sort of sets it up and then it spreads and all that. And then you think, okay, why do we have a five-day working week? I mean, it's partly tied to the superstition about the auspicious nature of the visible planets, which themselves are an arbitrary byproduct of how our eyes evolved. So, I mean, just about everything you think about has got these sort of tentacles where things could have been slightly different, and then our lives would be radically changed.

49:02

One of the things that's so fascinating with us as narrative storytellers, right? We think about, okay, we've had spoken language for tens of thousands of years, maybe a hundred thousand years, and we think about cuneiform and the written language going back to the Egyptians and the Greeks, but that's history, and 99% of the people who lived during that period were illiterate. In fact, species-wide literacy, which we arguably still don't have but are closer to, is only about a century old. For about a hundred years, people, meaning most people, could read and write; but go back beyond that century and the vast majority of people either couldn't read, couldn't write, or never went to school. They had to get up and work the land. They didn't have time to mess around with this silly stuff.

49:59

Yeah, you know, I think there are a lot of things where we are blinded to the fact that we have lives that are unlike any humans who have come before us, right? And I think there are some really big superstructure events related to this that really do affect our lives. So my favorite way of thinking about it is that every human who came before the modern period, meaning at least before the last 200 years or so, experienced uncertainty in their day-to-day life. There was almost no regularity, no patterns in their day-to-day life. They didn't know where their next meal would come from. They didn't know whether they would get eaten by an animal; the crops might fail, and so on. But they had what I call global stability, which is to say the parents and the children lived in the same kind of world: you're a hunter-gatherer, your kid's a hunter-gatherer, and this means that the parents teach the kids how to use technology. There's basically regularity from generation to generation for thousands of years.

50:50

Yeah, we have flipped that, right? So what we have is local stability and global instability. We have extreme regularity, like no human has ever experienced before, where we can know almost to the minute when something we order off the internet is going to arrive at our house, and we can go to Starbucks anywhere in the world and have the same drink, and it's going to taste basically the same. And we're really angry if somebody messes up an order, because that expectation of regularity is so high.

51:16

But we have global instability. I mean, I grew up in a world where the internet didn't really exist for ordinary people, and now it's impossible to live without it. You think about the ways that children teach parents how to use technology; that's never been possible before. And on top of this, you have the rise of AI, where the world's going to profoundly change in a very short period of time. There has never been a generation of our species where not just the global dynamics have changed generation to generation, but within generations: we're going to live in a world where the way that we understand and navigate systems and our lives is going to change multiple times in one lifetime.

51:56

And you think about hunter-gatherers: the average human generation is about 26.9 years over the long stretch of our species, and you can go 27 years over and over and over, and it's pretty much the same world, for pretty much the entirety of our species, until, I would say, the last maybe 100 years or so. And that's the thing: the more you think about this, the more of these examples you find. I mean, one of them is jet lag. I flew in from London, and there have only been three generations of people who could ever move fast enough to knock out their biology in a way that gives them jet lag. So there are just a million things that we experience as routine that no humans before us have ever been able to experience.

52:34

You could never outrun your circadian rhythm until you could travel at a few hundred miles an hour and go from country to country. You couldn't even change time zones until, what is it, 75 years ago?

52:48

Yeah, I mean, there's an amazing map. I don't know the exact name of it; it's an isochrone map or something like that. But it's a map of London from 100-plus years ago, and it shows the world based on how long it takes you to get anywhere. And you see that Western Europe is the closest, and it's like five-plus days or whatever, right? Now, somebody made an updated version of that map a couple of years ago, and the furthest reach you can go is like 36-plus hours, whereas on the old map it was like three-plus months. And that's the stuff as well where we've just sped up the world so much, and I think this has embedded a lot of the dynamics where flukes and sort of chance events become more common.

53:29

36 hours, I think you can get to the moon in 36 hours. That's right, I mean, it's true. That's how much it's changed.

53:36

The Bloomberg Sustainable Business Summit returns to Singapore on July 31st for solutions-driven discussions on unlocking growth in times of increased ESG scrutiny and competition. Join global business leaders and investors to drive innovation and scale best practices in sustainable business and finance. Learn more at bloomberglive.com/Sustainable Biz Singapore. That's Bloomberg Live.

56:00

So that's one of those things where if a different set of people had been in the room with Cameron, then maybe they don't hold the referendum, and then that's a very different world we live in.

56:06

So you're over in the UK, looking at the United States as a political scientist. The election of Donald Trump in 2016, by 40 or 50,000 votes in a handful of swing states, fascinating question: was that a random contingency, or was the convergence and the arc of history moving towards a populist in the United States?

56:34

Yes, so there are sort of precursor factors that Trump tacked into, and this is the convergence, right? This is the stuff that's the trends. I do think there are some pretty big contingencies around Trump. I mean, there's one hypothesis, and I can't confirm it because I don't know Donald Trump's thinking, but there's speculation by people who are close to him that the moment he decided he would definitely run in the 2016 race was in 2011, when there was the White House Correspondents' Dinner and he was publicly humiliated by Barack Obama with a joke that basically said something to the effect of: I really sympathize with you, Donald, because I couldn't handle the hard choices that you have to make on Celebrity Apprentice, whereas I have to make the easy choices in the Situation Room. And everyone's sort of laughing at Donald Trump and so on. And the question is, if the joke writer had not come up with that idea, or Obama had said, let's just can that joke, does Trump run? I mean, that's question one.

57:23

Then there are the questions around the election, right? And this is something where, without going into too much detail, the reopening of the FBI investigation happens because of a congressman in New York and his inability to sort of control himself. Sending naked genital pictures to underage women. Thank you for saying it for me. So this is the thing where this causes the reopening of the FBI investigation. Did this cause a shift in votes in those three critical states? I don't know, but possibly, right? Could be.

57:50

And on top of that, you have one of the things that I do talk about in the book. I have a chapter called The Lottery of Earth, and this is the strangest example of US politics uncertainty ...

1:22:00

... can be a really wonderful thing, and you just have to sometimes accept it and then navigate based on the understanding that there is radical uncertainty that we can't eliminate, and that is where some of the best flukes in life come from.

1:22:15

Really very fascinating. Thank you, Brian, for being so generous with your time. We have been speaking with Brian Klaas, Professor of Global Politics at University College London and author of the new book Fluke: Chance, Chaos, and Why Everything We Do Matters.

If you enjoy this conversation, well, be sure and check out any of the 500 previous discussions we've had over the past 10 years. You can find those at iTunes, Spotify, YouTube, wherever you find your favorite podcasts. Check out my new podcast, At the Money: once a week, a quick discussion with an expert on a subject that matters to investors. You can find those in the Masters in Business feed. Sign up for my daily reading list at ritholtz.com. Follow me on Twitter at ritholtz. Follow the full family of Bloomberg podcasts at Podcasts. I would be remiss if I did not thank the crack team that puts these conversations together each week: Kaylee Lapara is my audio engineer, Atika Valbrun is my project manager, Sean Russo is my researcher, and Anna Luke is my producer. I'm Barry Ritholtz. You've been listening to Masters in Business on Bloomberg Radio.

1:23:41

The Bloomberg Sustainable Business Summit returns to Singapore on July 31st for solutions-driven discussions on unlocking growth in times of increased ESG scrutiny and competition. Join global business leaders and investors to drive innovation and scale best practices in sustainable business and finance.
