The math problem that could break the internet

Released Wednesday, 28th September 2022
 2 people rated this episode

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

From Radio Lab and WNYC

0:03

Studios comes an exploration. I don't

0:06

see this thing. Into the mystery of

0:08

our planet. I never dreamed that it would happen

0:10

with real creatures. We were just going,

0:12

how did you do that? And occasional songs.

0:17

How is the song?

0:19

Terrestrials. We

0:22

are Terrestrials. Listen on

0:24

radio lab for kids wherever you get podcasts.

0:28

This

0:29

week's episode of unexplainable is brought to you

0:31

by Samsung. Are you looking

0:33

to supercharge your productivity? Samsung's

0:36

new Galaxy Z Fold four

0:38

offers a world of possibilities with an expansive

0:41

foldable screen. It's

0:43

more than just a phone. Z Fold four

0:45

is a full on multitasking powerhouse

0:48

that lets you do more than ever before, right

0:50

from the palm of your hand. You can fold

0:53

the large immersive screen to watch your favorite

0:55

content while using your phone to take notes,

0:57

or to take the perfect selfie all

0:59

hands free. Visit samsung

1:02

dot com to learn more about Galaxy

1:04

z Fold four.

1:07

Hey,

1:07

Brian. Hey, Meredith. I want I

1:09

want you to imagine something with me,

1:11

if you will.

1:12

Okay. Alright. So

1:13

imagine one day. You wake up.

1:16

Okay. And the Internet is broken.

1:18

Ugh.

1:20

So hackers are getting into

1:22

your bank accounts, your Twitter

1:24

accounts, your work email. Mhmm. Instead

1:27

of loading banner ads, your

1:29

computer would just like start loading viruses,

1:31

Okay. Great.

1:33

Great. So they have all of my money,

1:36

my identity -- Yep. -- all of

1:38

that.

1:38

My secrets. Exactly. Is

1:41

this the day I just put down the phone

1:43

and then walk away. Just

1:45

walk away. Yeah. I'm just gonna walk away.

1:47

Finally, fulfill your destiny as a mountain

1:49

man.

1:49

This is not working. Life on the Internet.

1:53

What you just described sounds so

1:55

broken. I don't

1:58

know if there's a tech support that could

1:59

fix it.

2:00

Right. So this doomsday scenario,

2:03

this is the Internet without encryption.

2:07

What

2:07

is encryption?

2:09

Yes. Largely math.

2:11

Okay. Encryption is

2:13

this cloak that wraps

2:15

your private information. And so

2:17

that anybody that's seeing that information,

2:20

it just looks like random static. It

2:22

just looks like gibberish. And

2:24

so encryption is the thing that's really

2:26

protecting your private information

2:29

as it travels through the web.

2:32

So you don't see my Social Security

2:34

number. You see this cloak of gibberish.

2:36

Right. If somebody intercepted that

2:38

and tried to read that information, all

2:40

they would see was

2:41

randomness. So it sounds like

2:44

I use encryption all

2:46

the time. All the time.

2:48

Mhmm.

2:49

Our everyday lives on the Internet,

2:52

they're built on an elaborate and

2:54

largely invisible system

2:56

of encryption. Google says that

2:58

ninety five percent of the traffic that comes through

3:00

their site is encrypted

3:01

in some way.

3:03

because there's this whole system of certificates

3:05

and digital signatures that are all

3:07

based on encryption that

3:10

tell you that when you go

3:12

to Amazon, that that's actually Amazon.

3:16

So encryption is

3:18

just at the heart of building trust

3:20

on the Internet. So I know the websites

3:22

I'm going to have not been

3:25

intercepted by somebody --

3:27

Right. -- and can send my private information

3:29

to people just knowing it's for their eyes

3:31

only. Exactly. Exactly. So

3:35

I like all this encryption as

3:38

you've described it to me. It sounds

3:40

nice. Right. You get to do a lot of cool

3:42

stuff with it. Is it

3:44

is it in danger?

3:46

So that's what I would love to tell

3:48

you about on today's show. Okay.

3:50

How the Internet was built on encryption,

3:53

and how it might come tumbling down.

4:11

When

4:11

I started working in cryptography, almost

4:14

all my colleagues told me I was crazy, and

4:16

they were they were right.

4:18

Marty Hellman is a professor

4:19

at Stanford University. He's been

4:21

there for, God, over fifty years.

4:24

Fifty

4:24

years ago, computers were

4:26

these massive plastic boxes

4:28

with these itty-bitty little screens, and

4:31

they were getting more and more intertwined

4:33

with our money.

4:35

ATMs were cutting

4:37

edge technology in nineteen sixty nine,

4:40

and Nasdaq, the world's first electronic

4:43

stock

4:43

market, opened in nineteen seventy

4:45

one. I remember saying I could foresee

4:47

the day when you

4:49

might buy a loaf of bread with an electronic

4:51

funds transfer. I couldn't say a debit card

4:53

because we didn't have them. It

4:55

was a new age. It was a new

4:57

relationship

4:58

to what money was and what it represented.

5:01

Nixon took America off the gold standard

5:03

in this time too, nineteen seventy one.

5:06

So money was getting more abstract, more

5:08

electronic.

5:10

Safes that protected physical

5:12

bills and gold bars were the

5:14

security of the past.

5:16

We needed a safe to protect the information

5:18

of money.

5:19

the electronic communications that

5:22

were quickly becoming more and more relevant.

5:24

And I said, what happens if someone...

5:27

Maybe they can't steal billions of dollars, but they

5:29

just crashed the system so nobody knows how

5:31

much money they've got in their bank account. What

5:33

are they, what happens then? And

5:35

so I saw the need for encryption. Marty

5:37

was on a quest to bring digital

5:39

encryption to the masses.

5:42

Encryption that could be used by

5:44

the public used commercially to

5:46

protect the electronic messages that were starting

5:48

to send money back and forth. But

5:51

Marty had a problem because at

5:53

this

5:53

point, encryption was

5:55

dominated by the government. Almost

5:58

no one outside the military

5:59

even really knew how it worked.

6:03

Any

6:03

research into the underlying principles

6:05

of encryption

6:06

was automatically classified

6:08

and considered

6:10

a potential threat to national security.

6:13

Agencies like the NSA had

6:15

top secret encryption departments, hoovering

6:18

up all the best mathematicians in

6:19

the country. But

6:21

if you wanted to study encryption out

6:23

in the open, it was a lonely place

6:25

to be.

6:27

The field was almost nonexistent.

6:30

Most of it was in classified literature.

6:32

I'd go to information theory conferences and

6:35

there would often be people with name tags that

6:37

said, let's

6:38

see, what was it? Everyone who said Department

6:40

of Defense was NSA, and everyone

6:42

who said US government was with the CIA.

6:45

It wasn't hard to figure out who was who.

6:47

Marty's friends, they all warned him

6:49

off. They told him he had no

6:51

chance going up against a juggernaut,

6:53

like the NSA. Friends told

6:55

me how can you hope to discover anything

6:58

that NSA doesn't already know. They

7:00

have a decades

7:00

head start. And I said, I don't care.

7:03

What they know is not available for commercial

7:04

use. If I develop it, it is.

7:06

In

7:07

order to bring encryption to the public,

7:09

Marty needed to reinvent it, basically,

7:12

in the light of day, and

7:14

he needed a team. Whit

7:16

Diffie showed up on his doorstep in

7:18

the fall of nineteen seventy four, and

7:21

he was an itinerant cryptographer, the way

7:23

he described himself. By

7:24

the time he showed up at Marty's doorstep,

7:27

this guy, Whit Diffie...

7:29

He had forged his own path in

7:31

academia. I'm not a good

7:33

student. Whit had spent years

7:35

going to universities and libraries

7:37

and cutting edge laboratories, trying

7:40

to piece together any unclassified

7:43

information that he could find on cryptography.

7:45

and

7:46

he kept hitting the same dead ends

7:48

that Marty was hitting. Until

7:50

in nineteen seventy four,

7:52

the head of the cryptography lab at IBM

7:54

told Whit

7:55

he said, I can't tell you much. We're

7:57

under a secrecy order here, but

7:59

you ought

7:59

to go look up my friend Marty

8:02

Hellman when you're back at Stanford. he

8:04

subsequently wished he hadn't sent

8:06

that because Marty and I became

8:08

a big pain in his ample tush.

8:11

Whit and Marty hit it off right away.

8:13

our interaction, in many ways,

8:16

ran the opposite of a normal

8:18

the

8:19

student, a graduate student professor

8:21

relationship. I I describe it as,

8:23

you know, I think possibly I'm more

8:25

imaginative than he is, certainly he's

8:28

as smart as I am. I really liked working

8:30

with him. But he didn't like anyone telling him

8:32

what to do.

8:33

Marty and Whit were totally seduced

8:35

by cryptography. I

8:37

sometimes joke that there's a muse, just

8:39

like there's a muse of poetry, there's a muse of

8:41

cryptography, and she whispered in my

8:43

ear. She probably whispered

8:45

in a lot of other people's ears who just

8:47

wrote it off as a crazy dream.

8:49

and

8:49

they got to work. Sort of haphazardly

8:52

putting together all the bits and pieces

8:54

they had gathered from the muses.

8:58

Whit had been obsessing for years

9:00

about how to use cryptography

9:02

to communicate remotely in

9:04

a digital world. We

9:06

were moving into a world where

9:08

people would have intimate friendships with

9:10

people they never met in person. And

9:13

the cryptography was the only thing that

9:15

would give you any sort of privacy.

9:17

And to Whit, this

9:18

presented two clear problems.

9:21

I

9:21

had had these two problems in the back of

9:23

my mind. You know,

9:24

one for ten years and one for five.

9:28

On a hot back burner. So

9:30

imagine that you and I wanted to

9:32

privately share information without

9:35

ever meeting in person. We

9:37

could set up a safe for us to put

9:39

letters in and

9:40

no one else could read them. Those letters,

9:42

they would be private. But

9:44

we'd both need keys to open

9:47

the safe door and we couldn't

9:49

share those keys without exchanging

9:51

them in person. This

9:52

was Whit's first problem. How

9:55

do you share a secret key

9:57

remotely? And

9:59

if we're just, you know, sending keys

10:01

around, how do you make

10:02

sure that you're sending them to the right person

10:05

without ever meeting them

10:07

in person?

10:09

Verifying identities remotely. This

10:11

was Whit's second problem. and

10:14

I was trying to combine those two problems.

10:17

And at some point, I

10:19

realized that

10:21

that must be possible.

10:24

One afternoon in nineteen seventy

10:26

six, Whit was noodling on these problems,

10:29

and he had a breakthrough. I

10:31

understood I had discovered something important.

10:34

And I went downstairs to get myself a

10:36

Coca Cola and almost forgot

10:39

it. And

10:39

on walking up the stairs, I fortunately

10:42

remembered it again. And then I walked

10:44

downhill to Martin's house

10:46

to explain it to him. So

10:48

thinking back to that problem of

10:50

you and I trying to share messages without

10:52

ever meeting in person, Whit's

10:55

idea was something like, what

10:57

if the safe had a mail slot in

10:59

it?

10:59

That way, you could come by

11:01

any time and drop off your letters

11:03

in the safe, but

11:05

you wouldn't need keys. And

11:07

then I could come by later, open

11:09

the safe with my keys and read your letters.

11:11

So encrypting, putting

11:14

the information in the safe, is

11:16

a different step than

11:17

decrypting, taking the information out.

11:20

Whit's idea was to split

11:22

the encryption and the decryption.

11:27

This

11:27

also solves the second problem

11:29

of identity because you know

11:31

that that's my safe and my mail

11:33

slot. I'm

11:34

the only one with the keys to the safe,

11:36

and so I'm the only one that can open

11:39

the door and take the information out.

11:42

Having the keys is a way

11:44

of proving my identity.

11:46

Of course, you'd need your

11:48

own

11:48

safe and your own mail slot where

11:50

I could come by any time and drop off

11:52

letters for you. But then we

11:54

would have a secure way to

11:56

exchange

11:57

information.

11:59

The bottom line

11:59

is if we both have safes and

12:02

we both have our own protected personal

12:04

keys, we can trust that we're

12:06

talking to each other, and trust

12:08

that we're talking privately without

12:11

ever having to meet in person.

12:17

It

12:17

was a stunningly elegant idea,

12:19

something

12:19

they called public key cryptography.

12:22

Whit

12:22

had come up with the idea of

12:25

public key cryptography, but no way

12:27

to do it.

12:27

Now Marty and Whit had to figure

12:30

out how to build these safes.

12:32

The

12:32

trick was, of course, they couldn't build them out

12:34

of iron and steel. They needed

12:37

to build them out of math.

12:39

In

12:39

cryptography, the safe isn't

12:41

a physical object. It's

12:42

like a mathematical cloak

12:45

covering up private information with

12:47

random static,

12:49

transforming

12:51

understandable and usable information

12:53

into

12:55

incomprehensible

12:57

useless garbage. But it isn't

12:59

just about locking up information

13:01

under random static.

13:03

You

13:03

also have to be able to easily

13:06

unlock that randomness with a key

13:08

and

13:08

turn it back into readable,

13:11

usable information.
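That lock-and-unlock pattern can be sketched with the simplest possible toy cipher, a repeating-key XOR. This is an illustration of encrypt-then-decrypt only, not a secure design and not any scheme named in the episode:

```python
# Toy "safe": XOR-ing a message with a secret key turns it into static,
# and XOR-ing again with the same key restores the original.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte (key repeats as needed).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"my secret key"                      # hypothetical key, for illustration
message = b"my social security number"

ciphertext = xor_cipher(message, key)       # looks like random static
recovered = xor_cipher(ciphertext, key)     # the same key unlocks it

print(recovered == message)
```

The same function both locks and unlocks because XOR is its own inverse; without the key, the ciphertext is just noise to an eavesdropper (though real systems use far stronger constructions).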

13:13

Marty and Whit wanted to find the simplest

13:16

system that could fit that pattern. and

13:18

so they looked into a type of math problem

13:21

called one way

13:22

functions.

13:26

One

13:26

way functions are math problems

13:28

that are designed to be easy to solve

13:30

but take a lot of time and energy

13:33

to reverse. Like,

13:35

seven

13:36

times thirteen, I could do

13:38

seventy plus twenty-one, ninety-one. I think it's ninety

13:40

one. I could do that in my head in a few seconds.

13:42

If you gave me ninety-one and asked me

13:45

to factor it into two primes, it takes

13:47

longer. So

13:48

multiplying is easy and factoring

13:50

is hard.

13:51

But if you have one of the factors

13:53

already, then you can easily get the other

13:55

one. So that's

13:56

the secret key. But not every one

13:59

way function can be made

13:59

into a cryptographic system.

14:02

Not all one way functions are good

14:04

at making encryptions, but

14:06

all encryptions have a one way

14:08

function at their heart. And

14:10

for this all to work, these one way

14:13

functions need to be super hard

14:15

to solve without a key.

14:17

So tough that it's not even

14:19

worth a hacker's time to try.
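Marty's 7 × 13 example can be sketched directly. Toy numbers only; real systems rely on products of primes hundreds of digits long, where this kind of search is hopeless:

```python
# A one-way function in miniature: multiplying two primes is instant,
# but recovering them from the product takes a search.

def multiply(p, q):
    # The "easy" direction: a single multiplication.
    return p * q

def factor(n):
    # The "hard" direction: trial division hunts for a factor.
    # Fast for 91; infeasible for the enormous numbers used in practice.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(multiply(7, 13))   # 91
print(factor(91))        # (7, 13)
```

And as the episode notes, knowing one factor is the shortcut: given 7, a single division recovers 13, which is why holding a factor acts like holding the key.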

14:24

One

14:24

night, it was probably one a.m.

14:27

Marty was at his desk with

14:28

a pencil and paper racking his

14:31

brain trying to figure out a way to bring Whit's

14:33

idea of public key cryptography to

14:35

life. I

14:35

was playing and I tried

14:37

a new permutation on what's secret,

14:39

what's public, what's private, and all of a sudden

14:41

it came out. After

14:42

a few months of work. Marty

14:44

and Whit published their findings. They

14:47

put together everything that they had been thinking

14:49

about,

14:50

the safes, the public key cryptography,

14:52

the one way functions, and

14:55

the first line that they wrote:

14:57

We stand today on the brink of

14:59

a revolution in cryptography.

15:02

It

15:02

was probably Whit. That

15:04

sounds more like Whit than me.

15:05

Whit is not above

15:08

grandstanding, and he's often right.

15:10

I

15:10

think I got that one right.

15:13

For

15:13

the first time in history, there was

15:16

research that could make encryption available

15:18

on a commercial scale. And

15:20

the open research community was thrilled.

15:23

But

15:23

the NSA had a whole other reaction.

15:26

The

15:26

NSA was not happy,

15:29

because they had lost their

15:30

monopoly on cryptography. There

15:32

was actually a fight. They, loosely speaking,

15:34

maybe more than loosely speaking, wanted to throw me

15:36

in jail.

15:37

Marty and Whit's work threatened the

15:39

whole way that

15:40

the NSA did business.

15:42

If

15:42

all this cryptography research was

15:44

out in the open, then

15:45

more foreign governments could encrypt

15:48

their information. and

15:49

that made the NSA's job much

15:52

harder. I was telling foreign

15:54

entities how to protect their secrets. I was trying

15:56

to tell American entities how to protect

15:58

theirs, but there's no way to do

15:59

one without the other. An

16:01

NSA employee

16:02

wrote a letter to the journal that

16:04

published their work and

16:05

accused them of breaking the law.

16:08

specifically

16:08

the international traffic

16:10

in

16:10

arms regulations.

16:12

It's against the law, obviously, to export

16:15

a jet fighter plane, right, without an export

16:17

license. It's also against the law

16:19

to export the plans for how to make that fighter

16:22

because

16:22

that could be used to make it.

16:25

And the ITAR, the international traffic

16:27

and arms regulations, defines anything cryptographic

16:30

as an implement of war, And

16:32

so by publishing in international

16:35

journals how to design good cryptographic

16:37

systems, we were

16:39

exporting technical specifications

16:41

on implements of war without an export license.

16:44

Marty

16:44

immediately brought this accusation

16:46

to the General Counsel at Stanford University.

16:49

It's

16:49

unconstitutional because

16:52

it would be a violation of freedom of the press and

16:54

freedom of speech. That was his legal

16:56

opinion. But he also warned me, and

16:58

I'll never forget this. If I was

17:00

prosecuted, Stanford would defend me.

17:03

But if I was found

17:04

guilty and all appeals were exhausted,

17:06

they couldn't go to jail for me.

17:08

Whit and Marty continued their fight

17:10

for

17:10

robust accessible encryption.

17:13

And

17:13

Marty came to see himself as

17:15

a security officer for the public.

17:17

No one was representing the public and the

17:19

public needed

17:20

protecting and the group that you'd expect

17:22

to protect them, the part of the government that should be

17:24

doing that wasn't doing it. So

17:26

I realized that's the role I had assumed.

17:28

The

17:28

reaction from the NSA sparked

17:31

a nationwide debate about the government's

17:33

threat to open publication. And

17:35

who had the right to access tools

17:37

of privacy? Articles

17:39

came out in Science and in

17:41

the New York Times. The

17:42

media was all on our side. I mean, the Times, for

17:44

example, because this is freedom of the press, and remember

17:47

it was right after Watergate. My wife

17:49

was really happy when this became

17:51

big news because she said, up to that

17:53

point, if something happened to me, nobody would really

17:55

know what had happened. Whereas now, if

17:57

you're a public figure and suddenly

17:59

you

17:59

have an accident, there would be questions hopefully.

18:02

Remember,

18:02

I'm also pissing off, not just NSA,

18:05

but their foreign equivalents. And

18:07

I had other friends who worked in the community who

18:09

told me that, yes, my life was in danger.

18:11

So who knows?

18:12

People told me, though, to watch my

18:14

ass. I never worried about

18:16

it. Various people have told me NSA

18:19

threatened them and things like that. It was

18:21

never more than rude to me.

18:23

Eventually,

18:24

the NSA backed off.

18:26

They

18:26

never pressed charges against Marty and

18:28

Whit. And

18:29

over the years, the NSA stopped

18:32

trying to classify all cryptographic research.

18:35

They

18:35

came to agree with Marty and Whit.

18:38

and

18:38

saw that everyone can benefit from

18:40

encryption. American

18:41

secrets of great commercial importance

18:44

themselves

18:46

have national security importance.

18:46

Instead

18:46

of classifying all cryptography

18:49

research from the start,

18:51

Whit told me that the NSA began to

18:53

scout talent from early drafts

18:55

of scientific

18:56

journals.

18:57

So they were very good at

18:59

observing papers and

19:01

approaching people informally and saying,

19:03

you know, some combination of

19:06

would you please not publish this? And, you know,

19:08

maybe you'd like to get a clearance and come to some

19:10

of our meetings. We work on interesting problems.

19:13

And

19:13

today,

19:14

nearly fifty years later,

19:17

public key encryption is a fundamental

19:19

building block of the Internet. and

19:21

of our daily lives.

19:24

How many of you have surfed

19:26

the Internet? How many of you have bought something with a

19:28

credit card on the Internet? How many of you do electronic

19:30

banking? You're using cryptography. You just

19:32

don't realize it because it's integrated,

19:34

automatic, and transparent, which is the way it

19:37

should be. Today, we're well over the

19:39

brink. We no longer stand on the

19:41

brink of a revolution in cryptography. It's happened.

19:46

Whit and Marty, they saw a

19:48

vision of a future that they

19:50

helped create. And

19:52

all that encryption that we use every

19:54

day, it depends on those

19:56

one way functions, those mathematical

19:59

locks. The

20:01

problem is math is always

20:03

changing and evolving.

20:06

Today, multiplying large prime

20:08

numbers may be a good one way function.

20:11

It's easy to solve, but takes lots

20:13

of time and effort to reverse without a key.

20:17

But

20:17

tomorrow. Tomorrow,

20:18

somebody might figure out a

20:20

new way to factor numbers. A

20:23

new way that's much, much more

20:25

efficient. And

20:26

then that asymmetry it

20:29

disappears. And the lock is easy

20:31

to get into without the key.
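As one hedged illustration of what "a new way to factor" looks like: Fermat's centuries-old factorization trick already beats naive trial division whenever a number's two prime factors happen to be close together, a small-scale reminder that "hard to reverse" only holds until someone finds a better idea:

```python
# Fermat's factorization method: look for n = a^2 - b^2 = (a - b)(a + b).
# When the two prime factors are close together, this succeeds after only
# a few steps, far faster than dividing by every small number in turn.

import math

def fermat_factor(n):
    a = math.isqrt(n)
    if a * a < n:
        a += 1                    # start at ceil(sqrt(n))
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:           # found a perfect square
            return a - b, a + b
        a += 1

print(fermat_factor(5959))        # (59, 101)
```

Here 5959 = 80² - 21², so the method finishes after just three steps; real cryptographic primes are chosen, in part, to dodge known shortcuts like this one.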

20:33

After the break, is

20:35

it possible to

20:36

future proof encryption? And

20:39

how answering that question

20:41

might break the Internet.

20:51

This is advertiser content brought to

20:53

you by AbbVie. We're

20:55

always trying to understand how do we

20:57

stop disease. Right? Disease comes in.

21:00

We try to stop it. That's how we practice

21:02

medicine. An interesting thought

21:04

would be, what if we actually didn't

21:07

get disease? What

21:09

if we prevented disease?

21:12

Hi. My name is Howard Jacob.

21:14

I'm a vice president at AbbVie. I

21:17

head the genomic research center, and I

21:19

also head our data integration across

21:21

R and D. Better medicine starts with

21:23

better information.

21:25

So Howard and AbbVie have been assembling

21:27

a collection of

21:27

more than one million human genomes to advance

21:29

our understanding of disease. It's

21:32

a large amount of data. Your DNA

21:34

extended end to end from all of your

21:36

hundred trillion cells would go to the

21:38

sun and back six hundred and sixty

21:41

six point five times.

21:43

All of that genetic information will have

21:45

major

21:45

implications for the future of healthcare, helping

21:48

doctors tailor treatment to every patient's

21:50

needs. For many genes, there are

21:52

tests where you can actually say, oh, this is

21:54

the dose you should take. This is when you

21:56

should take it and this is why you're taking it.

21:59

It opens the door

21:59

for prescribing medications differently

22:02

and at different doses for each individual. Their

22:05

work is already saving lives, but genetic

22:08

researchers are far from satisfied. I

22:10

just know that we can do more. I know

22:13

we can enable doctors to practice medicine

22:15

better. I know we can enable patients

22:17

to live healthier lives. I know we can

22:19

develop better therapies and we just have

22:21

to do it. To learn more about

22:23

how AbbVie is shaping the future of medicine,

22:25

visit here now dot ABBIE

22:30

Any

22:35

job at Amazon can turn into a high paying

22:37

career. For employees looking for

22:39

security today, Amazon offers good

22:41

pay and health care on day one. New

22:44

parents also get up to twenty weeks of fully

22:46

paid leave, so they have the flexibility to

22:48

spend time with family while making a living.

22:51

For opportunity tomorrow, Amazon is investing

22:53

one point two billion dollars into free technical

22:56

training that's available to over seven hundred

22:58

and fifty thousand associates across the

23:00

country. With these programs hourly

23:02

employees have the opportunity to move into

23:04

higher paying, highly skilled jobs in

23:06

fields like software engineering and robotics.

23:09

Learn more at about amazon dot com.

23:21

What encryption does Tesla use? Curve

23:24

25519 most

23:27

secure discrete

23:29

log parameter. There

23:31

is not anymore. Our

23:33

network just blew the part. Unexplainable.

23:35

We're back. I'm

23:38

Meredith and...

23:39

You're Meredith. I'm

23:42

Brian. Yep. Okay. Because of encryption,

23:45

I know you are you because

23:47

we're on an encrypted... I

23:49

think we're on an encrypted channel. Yeah.

23:51

And there's not just some, you

23:54

know, kind of deep

23:55

fake man in the middle giving

23:58

me fake Meredith. This has all been

23:59

an elaborate ruse.

24:01

So we've like built up

24:03

this pretty secure Internet. Like,

24:05

I feel pretty good on the Internet. I

24:07

don't think about it. Yeah. I see, like,

24:09

the little lock icon in my web

24:11

browser and I know this

24:13

is encrypted. So

24:16

is this the Internet we might have

24:18

forever? Are

24:20

we cool?

24:21

So one way functions at the

24:23

heart of the Internet security today, like,

24:25

are hard to

24:27

break. Mhmm. But that's

24:29

really subject to changing technology. Oh,

24:32

so the simple

24:34

question here is these

24:35

one way functions, kind of math problems,

24:38

easy to do, hard to reverse. These

24:40

are the locks on the Internet. These

24:43

locks can be picked. Is that what you're

24:45

saying?

24:45

Oh, definitely. One example that comes

24:48

to mind is there's a really common one way

24:50

function based on multiplication

24:51

and factoring.

24:53

and the invention of quantum

24:56

computers which are on

24:58

the horizon, a twinkle in a lot of

25:00

researchers' eyes, the

25:02

way that those computers are built would

25:04

actually make factoring

25:07

as a one way function totally obsolete.

25:09

Okay. And so, like, there's a lot

25:11

of work and research in cryptography

25:13

right now, looking into these one

25:16

way functions and making them quote unquote,

25:18

quantum safe. It's a little

25:20

bit like an upgrade to your security system

25:22

or your operating system on the computer,

25:25

like, people are seeing new

25:27

technologies, new types of computing

25:29

power on the horizon and trying

25:31

to

25:32

add patches or figure out

25:34

what we would need to change in order

25:36

to keep that safe.

25:37

Yeah. There's a bit of a cat and

25:39

mouse thing here. You see new tech coming,

25:41

you try to beef

25:43

up the locks, but then, you know,

25:46

I'm sure even newer tech

25:48

can come, and then we'll need even

25:50

new locks.

25:51

Yeah. But that's an example of a

25:53

threat that people can see coming.

25:55

Mhmm. But, like, what if there's

25:57

a threat that you can't? Okay.

25:59

So, can all locks be broken?

26:02

Is it possible to not

26:04

do this cat and mouse game of like seeing

26:07

new tech and building a new lock? Yeah.

26:08

So that's the main question that's been

26:11

motivating the cryptographer Rafael

26:13

Pass.

26:13

My name is Rafael Pass. I'm

26:16

a professor of computer science.

26:18

He basically believes

26:20

that, like, cryptography's math is

26:22

magic. There's a bunch of just

26:25

beautiful and seemingly contradictory concepts

26:28

in cryptography. Things that just seem impossible

26:30

at first and then

26:33

using cryptography,

26:35

the impossible becomes possible.

26:37

It's like finding magic in in mathematics.

26:40

but it's real, it's actually true.

26:43

So this math wizard, this

26:45

this sorcerer of math, what

26:48

is his question when it comes to these math locks?

26:51

Is there

26:52

a perfect lock?

26:54

That would be nice. So

26:55

this is very theoretical. This is,

26:57

like, the perfect lock

26:59

as a concept. Yeah. Before we build

27:02

it, we have to, like, know if this concept can

27:04

even exist.

27:05

Exactly. Exactly. So the the

27:07

way that Rafael puts it is, like,

27:09

does a true one way function

27:12

exist? Mhmm. So right

27:14

now, we've been, like, talking about one way functions

27:16

as, like, easy to solve, but

27:18

hard to reverse. but

27:20

that hard is a moving target.

27:22

It's subject to the technology and the knowledge

27:24

that we have. But what

27:26

if there was a true one way function that

27:29

was easy to solve, but impossible

27:31

to reverse. And then I

27:33

can buy things on the Internet forever

27:36

and that's it.

27:38

That's what I want.

27:39

Yeah. No matter what fancy quantum

27:41

computers are coming down the road. Okay.

27:43

This would be mathematically impossible

27:46

to reverse.

27:47

So how does he figure

27:49

out if this type of lock

27:51

even can exist? Like, how does he

27:53

even figure out if

27:55

it's even possible? Right. So

27:57

he's

27:58

looking for a unifying

27:59

theory across

28:02

all one way functions.

28:04

So

28:04

we'd try to see whether there exists

28:06

some kind of, like,

28:07

mother problem or master problem

28:10

that

28:10

can tell us whether one way functions actually

28:12

do exist or not. Tell

28:14

me, has he has he figured this out?

28:16

So a few years ago, in twenty

28:18

twenty or so. You're

28:21

gonna, like, tell me a whole story. Yes.

28:24

Basically,

28:24

yes. He found he found

28:27

maybe not exactly the answer, but

28:30

he found a very very promising

28:32

lead.

28:32

So

28:34

a few years ago, Rafael and his

28:37

grad student, Yanyi, they were digging

28:39

into this unsolved

28:41

problem of computer science.

28:43

So, like, totally different field than

28:45

cryptography. Mhmm. And they were looking

28:47

at this problem. It's called Kolmogorov

28:50

complexity. Okay. Okay.

28:52

Kolmo Kolmogorov complexity.

28:55

Yep.

28:55

And so this is like a

28:57

famous unsolved

28:59

problem

29:02

in computer science, a

29:03

problem that's been studied

29:05

at least since the nineteen sixties.

29:07

And

29:08

it has to do with the nature of

29:10

randomness.

29:11

Randomness is so critical in cryptography.

29:14

That's the the walls of the

29:16

safe that you're locking the information behind. You're

29:18

like transforming usable

29:20

information into gibberish

29:21

to do that. Gibberish

29:24

is randomness. Right? Yeah. Because

29:26

we don't have, like, a physical safe here.

29:28

Like, our information on the Internet is

29:30

cloaked in randomness.
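
One way to picture that cloak, as an illustration rather than the exact scheme any given site uses, is the one-time pad: XOR the message with an equally long random key and the result is indistinguishable from noise, yet anyone holding the key can undo it exactly:

```python
import secrets

message = b"my credit card number"
key = secrets.token_bytes(len(message))  # fresh, truly random key

# XOR each message byte with a key byte: the output reads as gibberish.
ciphertext = bytes(m ^ k for m, k in zip(message, key))

# XOR with the same key flips every bit back: the gibberish was a cloak
# over the information, not a destruction of it.
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message
```

The catch, and the reason the internet doesn't just run on one-time pads, is that the key must be as long as the message and shared in advance; the schemes built on one-way functions exist to work around that.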

29:33

Instead of seeing my credit card number

29:35

passing through the cyber space,

29:37

you see

29:39

some things that just look like gibberish.

29:41

And

29:43

that randomness, that gibberish, is

29:47

at the heart of this complexity

29:49

problem. Yeah. And

29:51

so basically, the problem to

29:53

solve is, like, can you write a computer

29:55

program that can analyze

29:59

randomness? I think

30:01

this is a deeply philosophical question. I guess

30:03

we're looking at something in nature and we're trying

30:05

to see, was this just random or is

30:07

there something interesting going on?
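
Kolmogorov complexity asks for the length of the shortest program that produces a given string: structured strings have short descriptions, truly random ones don't. The quantity itself is uncomputable in general, but compression gives a rough, computable feel for the intuition. A sketch using zlib purely as a stand-in:

```python
import random
import zlib

patterned = b"ab" * 500  # 1,000 bytes generated by an obvious rule
random.seed(0)           # seeded so the run is repeatable
noisy = bytes(random.randrange(256) for _ in range(1000))  # 1,000 "random" bytes

# Compressed size is a crude proxy for "length of the shortest description":
# the patterned string shrinks to almost nothing, the noisy one barely at all.
print(len(zlib.compress(patterned)))
print(len(zlib.compress(noisy)))
```

(The result the episode is circling, per the Quanta article cited in the credits, is Yanyi Liu and Rafael Pass's proof that one-way functions exist if and only if a time-bounded version of Kolmogorov complexity is hard on average.)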

30:09

So

30:10

the solution to

30:12

this complexity problem, it's

30:14

not just like some math problem on

30:16

a chalkboard somewhere where all you have

30:18

to do is like beautiful

30:21

minded.

30:21

Oh, x equals three. You've cured

30:23

cancer. You're right.

30:24

Exactly. There's not like a solution

30:26

like that. The solution to this problem would

30:29

actually be a computer

30:31

program that could analyze

30:33

the randomness of any given information.

30:36

And

30:36

that itself is a tool. Like, if you

30:38

had the tool, you could truly see through

30:41

randomness and see if something was

30:43

truly random or if there was like a signal,

30:45

some information

30:46

buried in it, like this

30:48

computer program

30:50

could basically see through

30:53

the safe walls of any encryption

30:55

scheme. Yeah.

30:56

because if our information is protected

30:58

by randomness and if you can see through

31:00

randomness, you can probably fetch

31:03

out that information. Exactly.

31:05

If you solve the complexity problem, what

31:08

happens?

31:08

If you solve the complexity problem,

31:11

then

31:13

one way functions, true one way

31:15

functions cannot exist, and

31:17

not only that,

31:18

everything that we've built on

31:21

potential one way functions

31:22

is instantly

31:24

broken.

31:25

You've broken all candidate one way

31:27

functions, all encryption schemes,

31:30

all digital signatures, everything can be broken.

31:36

So

31:36

solving this problem could

31:38

give you the power to break the

31:40

Internet?

31:41

Yeah.

31:43

It would... the

31:45

way that it's been described to me is that it

31:47

would instantly break all

31:49

encryption.

31:50

But

31:52

there

31:53

may not be an answer to this

31:55

complexity problem. It

31:57

might just not be solvable.

31:59

Oh.

31:59

At all. Like, we don't know

32:02

if there's an answer.

32:03

What happens if we just

32:05

can't solve it?

32:07

So if

32:08

this complexity problem is difficult

32:11

to solve, then

32:12

Rafael claims that he has a

32:14

very clear blueprint of

32:16

how to build the perfect lock, something

32:18

that's provably

32:19

secure. Okay. So just to acknowledge something

32:22

here today. You've taken us

32:24

on a journey. And

32:26

each step of the journey has an

32:28

unanswered question. So, like, as we

32:30

proceed further into

32:32

darkness here in this story,

32:34

because it gets a little heady here. So

32:36

first off, you have this idea

32:39

of one way functions. Right. We don't

32:41

know if there's a perfect lock out there. Mhmm.

32:43

And the answer to the question, is there

32:45

a perfect lock out there? Mhmm.

32:48

Hinges on the answer to

32:50

another unanswered question. And -- Totally.

32:53

-- yes. Complexity problem. Mhmm.

32:55

So there's, like, two great big questions

32:57

here, one leading to the other. Exactly.

32:59

Well, Rafael has told us there's a road

33:02

to the answer. Right. And if

33:04

we work really hard on this math problem,

33:07

we might get an answer to the

33:09

question of

33:11

is there a

33:12

perfect lock out there?

33:13

Yes. But

33:16

this is kinda high stakes because either

33:19

we might get perfect locks out of this or

33:21

we might realize that all locks will fail.

33:24

Yes.

33:26

That would mean that communication on the internet

33:28

would never be able to be secure. That

33:32

would be pretty bad.

33:34

Do you

33:34

think that's going to happen?

33:36

I don't think so. I hope not.

33:39

Is that worth it? Is that worth pursuing

33:41

this path for, like, that dream of perfection?

33:43

Like, we might find ruin. And

33:46

is that worth it here? I would

33:48

maybe stay away from this complexity problem

33:50

because I don't wanna break the Internet.

33:55

Yeah. I mean, I feel like,

33:57

to me, this feels

33:59

very similar

33:59

to nuclear physics.

34:02

And like the study of

34:04

that through the thirties and the forties opened

34:07

the door to weapons of tremendous

34:09

power, the atomic bomb. So that

34:12

would be, like, the path that would lead us to breaking

34:14

all encryption on the Internet. Yeah. But

34:16

it also gave us really fundamental

34:19

answers about, like, the

34:21

nature of matter in our universe.

34:23

Right? Mhmm. That then led to

34:26

tremendous tools in medicine

34:28

and agriculture and carbon

34:31

free energy. Mhmm. The pursuit

34:33

of knowledge and the pursuit particularly of

34:35

these, like, very fundamental truths.

34:38

They have powerful and dramatic consequences.

34:41

Mhmm. Yes. There could be

34:43

this

34:43

world of the, like, atomic

34:46

bomb for -- Mhmm. -- encryption, but

34:48

it could also lead us

34:51

to a whole new era of

34:53

encryption and lead us to tools

34:56

that we don't even know what they are

34:57

yet. One way functions are

35:00

great, they're awesome, but they're

35:02

not everything we want from cryptography, where

35:04

you have much loftier goals.

35:07

So it would be awesome to also

35:09

achieve these more advanced cryptographic

35:12

tools

35:13

using some

35:14

complexity like this. But it

35:16

also strikes me there's a counterpoint here

35:18

in that, yes, I'm usually extremely optimistic

35:21

about an unanswered question. But

35:23

here, I'm realizing that sometimes

35:26

looking

35:26

into an unanswered question could

35:29

lead you to dangerous things too.

35:31

Yeah.

35:31

I mean, I think when

35:33

you look for fundamental truths,

35:36

the

35:36

consequences of that are just inherently

35:39

bigger. If you're looking for something

35:41

that connects

35:42

all locks, then

35:44

if you find a flaw, that's a flaw in

35:46

all locks. It's

35:47

just, like, the nature of the question.

35:49

It's a little scary. It's a

35:51

little scary for sure. On the other

35:53

hand,

35:54

is it less scary to live in a world

35:56

where

35:57

you have

35:59

reasonable security in

36:02

the locks that you have and, like, reasonable

36:04

faith that they haven't already been broken? Yeah.

36:07

All of these things, all of

36:09

the cryptography that we've been talking about,

36:11

they're

36:12

they're tools, they're ways

36:14

for us to share information, they're

36:16

ways for us to build our

36:18

lives and our relationships

36:21

on the Internet

36:23

remotely.

36:23

Like

36:25

we're having this conversation remotely, if,

36:27

like, the development of the encryption that

36:30

we have today allowed us to do this,

36:32

then what could we use

36:35

the tools of tomorrow to build

36:37

in the future? Like, there's also an inherently

36:39

optimistic view. There's

36:41

still even, like, for now. Right? You

36:44

can still

36:45

run into some shady stuff on the Internet.

36:47

People can still steal things.

36:50

Yeah. Yeah. I mean, like, crucially, encryption

36:53

is really about

36:54

protecting information as it

36:56

travels through the Internet. But, like,

36:58

there's still data breaches all the time.

37:01

Like, once it gets to a destination, like,

37:03

who knows how your information is getting stored

37:05

on some, like, company server?

37:06

And it strikes me that even if you

37:09

have a perfect lock and perfect encryption,

37:12

you could still give away the password or

37:14

the keys to that? Potentially. Someone could

37:16

do it. Click on some suspicious

37:17

link. There's, like, ten new bread

37:19

recipes.

37:20

And they asked for my

37:22

Apple password. I'm like sure why not.

37:35

This

37:35

episode was reported and

37:37

produced by Meredith Hodnot, with

37:39

help from Byrd Pinkerton. It

37:42

was edited by Katherine Wells and

37:44

Brian Resnick with help from Noam Hasenfeld

37:47

and Jillian Weinberger. Scoring

37:49

by Meredith and Noam. Efim

37:51

did a little something something too, but, you know, mainly

37:54

Meredith and Noam. Mixing and sound design

37:56

by me, Efim Shapiro, and

37:58

fact checking by

37:59

Zoë Mullick. Mandy Nguyen

38:01

is off to adventure in the Great North.

38:04

Cristian Ayala found his way home.

38:07

Special thanks to Russell Brandom and Erica

38:09

Klarreich. If you want to learn more

38:12

about one way functions and complexity,

38:14

check out Erica's article "Researchers

38:16

Identify 'Master Problem'

38:18

Underlying All Cryptography" in

38:21

Quanta Magazine. If you have thoughts

38:23

about this episode or ideas for the show,

38:25

please email us. We're unexplainable at

38:28

vox dot com. We'd also

38:30

love it if you wrote us a review or a rating.

38:33

Unexplainable is part of the Vox Media

38:35

podcast network, and we'll be back

38:38

next week.
