Despite OpenAI Chaos, Wall Street Is Still Betting Big on AI

Released Wednesday, 22nd November 2023

Episode Transcript


0:00

It's a UNIX

0:04

system. I

0:07

know this. It's

0:12

got all the files

0:14

of the whole park.

0:19

It tells you everything. Sir, he's uploading the virus. Eagle

0:21

One, the package is being delivered. What are you guys going to do for Thanksgiving? I am

0:23

going to be in the hospital.

0:28

I am already

0:30

at my in-laws. We're

0:33

splitting the cooking up three ways, so

0:35

I'll be doing that. Starting

0:37

as soon as I am done with work

0:39

today. What about you? We've

0:43

got a friend coming over. We

0:46

live pretty far away from... My

0:48

family's in Texas. I'm in South Carolina. Right.

0:51

We make the trip for Christmas,

0:53

but not for Thanksgiving. So we have friends

0:55

come over. Classic. We're

0:59

doing a big meal here. Cool. That's

1:01

great. What about you, Emily? I'm

1:04

making the trek out to Long Island tomorrow

1:06

morning because

1:08

I refuse to take the Long Island Railroad

1:10

the Wednesday night before Thanksgiving.

1:12

I took

1:16

it once when I was going to

1:18

my in-laws for Thanksgiving, and there were

1:21

kids that seemed like they were

1:23

on the precipice of throwing up from drinking

1:25

so much on the train. Hell yeah, there were. I

1:28

was like really close. I could not get farther.

1:31

I was so close to them, and I was like,

1:33

this is just going to end so horribly.

1:35

Look, I've seen it all on the

1:37

Long Island Railroad. It's a real source of magic, whatever that kind of magic

1:39

is, whether

1:43

it be like dark magic or, you know, good magic. Who's

1:46

to say?

1:46

I think that's

1:48

definitely right. Our mileage may vary. Yeah.

1:52

It's funny. I didn't think about the need to

1:54

get really messed up before

1:56

you go home for Thanksgiving when you're like 22.

2:00

Seems like it's a rite

2:02

of passage for certain people heading

2:04

east from Brooklyn. Well, I mean, I

2:06

think it's also like, you know, there's

2:10

the Thanksgiving tradition in many a suburban

2:12

hometown of, you know, going to the local

2:15

bar that you were not old enough to go to when you were in high

2:17

school, then see all the people.

2:19

Not something that I have ever partaken

2:22

in or have any desire to do, but... Oh,

2:25

that I've done. Some of the, like,

2:27

one version of that is also going to the city

2:30

from Long Island, getting super trashed

2:32

at whatever shitty bars that you and your friends would get

2:34

trashed at when you were in high school.

2:37

Part of the thing too is that you can drink

2:39

an open container on the Long Island Railroad,

2:42

but there are a couple of days every year that you can't. That

2:44

is a big part of it, I would say.

2:47

Yeah,

2:47

they used to sell, like, they used to have, like, little, like,

2:49

wine carts on the platform so you could

2:51

get, like, a little can of wine. But

2:54

the big thing is you get beer

2:57

in a giant plastic cup with a straw.

3:00

Like, it's like a giant, like, soda cup that you would get at,

3:02

like, McDonald's or whatever.

3:04

Right. That is, like, a pinnacle. You

3:06

can't do that all the time? No, you

3:09

can't do that on, I think, the

3:11

day before Thanksgiving and

3:13

St. Patrick's Day. Well, there

3:15

we go. I forget exactly what

3:17

the deal

3:19

is, but needless to say, the way

3:22

that you get around that in high school is by buying a Gatorade

3:25

at the local, you know, train station

3:27

bodega and putting whatever amount

3:29

of shitty plastic, you know, jug vodka

3:32

you can find in there. Right.

3:33

Work around. And I'm not even,

3:36

you know, I'm personally not saying

3:37

this from experience, I'm not even thinking about

3:39

that, but I wouldn't get it

3:41

firsthand. Underage drinkers

3:44

are nothing if not innovative. Yeah, really. The

3:47

southern version of that

3:49

is go to Sonic and get a Route 44

3:52

sized drink, just

3:54

like this massive 44-ounce soda,

3:57

and then you pour out some of it and

4:00

put your preferred spirit

4:03

in there.

4:04

I've heard that Sonic's ice is very good. Oh,

4:06

it's amazing. It's like a very specific ice cube

4:08

shape. Yeah,

4:11

it's like these tiny little

4:14

balls. I don't know how to explain it, but it's

4:16

perfect. And everyone

4:18

should try Sonic once in their life. It's

4:20

one of America's finest foods. Sonic is

4:23

popular. You know what, I think I'm

4:25

gonna have Sonic for lunch tomorrow pre-Thanksgiving.

4:28

God bless. It's

4:32

one of the only places here that's open 24-7 and

4:35

like every day, so. I

4:38

feel like it's legendary to people who

4:40

have not had it. I've never had it and

4:42

I've been like, it's on my bucket list.

4:44

I don't know if it's worthy of that, but it definitely

4:47

is. No, it's gonna be like, it'll

4:49

be like when In-N-Out Burger started coming into

4:51

Texas and all the people from

4:53

California loved it. Everyone that

4:55

was here that already had Sonic and Whataburger were

4:57

like, this is not, what is this with

4:59

this California shit? Get this out of here.

5:02

I went to a DQ for the first time

5:04

when I was in Washington State

5:06

over the summer. Surprisingly,

5:09

it was like the DQ in the Walmart parking

5:11

lot in Kurt Cobain's hometown. Hell

5:14

yeah. Which is just like various levels of like

5:16

suburban ennui that I'm like, wow.

5:18

You really understand them now.

5:21

Yes. The lyrics, they make more sense.

5:25

Dairy Queen is legitimate. Let's not

5:27

pretend it's not. It's great.

5:29

I mean, I had some very good chicken fingers.

5:33

Like literally no complaints.

5:35

Ice cream like 75 different ways. There's

5:37

nothing wrong with that. Exactly. So

5:41

we're actually here to talk about some

5:44

of your reporting Maxwell. Can

5:47

you introduce yourself

5:49

and then we'll get into it a little bit.

5:52

Sure, I am Maxwell Strachan.

5:55

I am a reporter at Motherboard where I cover

5:57

technology, focused on sort of financial

6:00

issues. So

6:02

we're having this this lovely Thanksgiving

6:04

conversation the day before Thanksgiving it is November

6:07

22nd at this moment

6:09

right now at 11 a.m. Who

6:12

is in charge of OpenAI?

6:15

That is a

6:17

very good question and I

6:20

think the fact that I can't give you a

6:22

super hard answer

6:25

is maybe evidence of

6:27

how chaotic things have been.

6:29

I mean de facto the

6:31

answer is Sam Altman once again.

6:35

He's agreed to come back and join

6:37

the board. I don't know if the paperwork is signed

6:40

so you know technically for a

6:42

few more minutes it could still be the Twitch CEO

6:44

Emmett Shear who was CEO for

6:47

I don't know a couple hours

6:49

or something like that. So

6:52

but yeah I mean for all intents and purposes

6:55

Sam is back in

6:57

charge after a pretty crazy

6:59

five days or so.

7:02

What the fuck is going on? What happened

7:05

and, I know, something I've been thinking about,

7:07

Emily: I know that I have to stop cursing at least in the

7:10

first 10 minutes when

7:12

we move to YouTube. So I'm

7:14

bad at that too so we'll

7:16

have a swear jar, a virtual swear jar. We need

7:18

a cyber swear jar. But what

7:21

happened five days ago? Like, what is going

7:24

on over there? Sure

7:25

so the long

7:27

and short of it is that

7:31

the board which is

7:33

a very very particularly

7:35

small board that is

7:38

a strange collection of people,

7:40

to be honest

7:42

was basically at odds with one another

7:44

for a long time over

7:46

some sort of philosophical differences

7:49

seems like maybe some personal differences

7:52

and some

7:55

one faction that's more academic

7:57

more doomer-y about AI, sort

7:59

of formed on

8:01

one side and Sam

8:04

Altman and Greg Brockman who's

8:06

kind of, you know, one of his

8:08

allies, underlings I

8:10

guess you could say (he was the president), on

8:13

the other side and the

8:15

board made a move I think there were outside

8:18

of Sam and Greg just four people

8:21

that made this decision only

8:23

four people that caused all of this news

8:26

came together and decided

8:28

that they would push Sam out, demote

8:32

Greg and Greg later

8:34

quit so we can get into

8:36

the reasons of why they decided

8:38

to do that and what the philosophical differences

8:41

were but that's the thing that sort of precipitated

8:44

everything that's happened since then.

8:46

Let's back up a little bit, I know we're kind

8:48

of jumping around on our map here, but I

8:52

want to put OpenAI the company

8:54

into like some sort of corporate context for

8:56

people before we start dissecting them. This

8:59

is kind of like if

9:01

you didn't know much about the company other than

9:04

like its financials it's kind of a fairy

9:06

tale startup right like

9:08

what's their valuation at, like, how long have they been around,

9:12

how much money are they making?

9:14

Sure

9:16

I

9:18

mean yes in lots of ways they are a fairy

9:20

tale startup story

9:22

I mean they are valued at something like 80

9:24

billion dollars I think maybe

9:28

looking for had been looking for money that would

9:30

value them at 90 you know they were

9:33

I think valued at 30 billion a year

9:35

ago so that's you know the 3x

9:38

that you're looking for if you're a startup

9:40

in a single year they had this

9:42

charismatic founder Sam Altman

9:44

who's one of the most well-connected people in

9:46

Silicon Valley with the head of Y Combinator

9:49

which is you know really influential

9:52

startup accelerator in the

9:55

in the Bay Area and yeah

9:58

and most importantly, they had

10:00

two products that had really,

10:02

I mean, inarguably taken

10:04

the world by storm in ChatGPT,

10:08

which you'd be hard pressed to find someone

10:10

who doesn't know what that is at this point,

10:13

and DALL-E, which was the text

10:16

to image generator that took the

10:18

world by storm, I think, before

10:20

ChatGPT. So I mean, all

10:23

of those things combined. You

10:26

have the profile of a generational

10:30

startup, and a lot of people had talked about

10:32

Altman regardless

10:34

of what they thought about him as being

10:37

the Mark Zuckerberg of the 2020s, or the

10:39

Bill Gates, or the Jeff Bezos. I

10:44

mean, that was kind of the sense that people had,

10:46

this person was going to define

10:49

the world of AI and be the public face

10:51

of it in the same way that Zuckerberg

10:54

became the face of social media. Now,

10:58

there are some key differences between

11:00

open AI and your typical startup

11:04

that really played a key role

11:06

in what we've seen happen over the last

11:08

five days. And

11:11

I can go into that history or not, but

11:14

the long and short of it is that it started as

11:16

a nonprofit. It was not

11:18

actually started as a sort of for-profit,

11:21

typical, you know, raised venture capital,

11:24

prepared an IPO company. It

11:26

was started as a nonprofit organization

11:28

that sort of transitioned and transformed

11:31

in a lot of ways into a more traditional

11:34

powerhouse startup. And that caused

11:36

a lot of the corporate weirdness

11:39

that we've seen

11:41

today.

11:42

Yeah, it's like I feel

11:44

very silly because I keep getting Sam

11:46

Altman and Sam Bankman-Fried mixed up. And

11:48

it's not just because their first names are both Sam,

11:51

but there's some effective altruism

11:53

kind of running throughout this whole thing. And that's kind

11:55

of part of the reason why

11:58

it seems to me, from what I've been

12:00

reading of your work and other people saying,

12:03

there's a lot of like the philosophical

12:05

differences between effective

12:07

altruism and then also effective

12:11

accelerationism.

12:13

Yeah,

12:14

effective altruism just keeps popping

12:16

its head into every major

12:21

catastrophe occurring in the technology

12:23

sector somehow. But

12:25

the way that effective altruism is popping

12:27

its head in here is very different from the way

12:29

that it popped its head into the FTX thing.

12:32

So at FTX, Sam

12:34

Bankman-Fried was a hardline

12:36

believer in effective altruism. And

12:39

as part of their philosophy, it was

12:41

make as much money at all costs so

12:44

that you can then donate it or

12:46

use it to save the world

12:50

through X, Y, or Z. And so

12:53

in that philosophical school

12:56

of thought, it didn't

12:58

matter how the money was made. It didn't matter

13:00

if it was made unethically as long

13:02

as it was accumulated and then used to

13:05

save the world. Effective

13:07

altruism has this other aspect, though, which

13:09

is a sort of doomerism related to

13:12

artificial intelligence, a real fear

13:14

that AI could end the

13:18

world, the robots take over, the terminator

13:20

kills everyone, that sort of thing.

13:22

Now,

13:23

Sam Altman is not an effective altruist,

13:27

but two of the members of the board were,

13:31

or at least are heavily affiliated

13:33

with it just to cover our bases. And

13:36

with those two people, it was

13:38

kind of surprising that they had come

13:41

to have such influence

13:44

at the company. They had

13:46

no real firm association outside

13:48

of that. One of them was an academic,

13:51

not really in the corporate

13:53

world at all. Maybe,

13:56

as a result of its nonprofit

13:58

origins, they became really,

14:00

really worried that Sam Altman,

14:02

Greg, everyone that kind of stood with him, was

14:05

moving too fast, breaking things,

14:07

blah, blah, blah. And so those two

14:09

factions, Sam on one side and this

14:11

woman, Helen Toner on the other side, started

14:14

kind of going at each other. Helen

14:16

even wrote a paper in

14:19

which she criticized OpenAI. It was a very

14:21

academic paper and celebrated

14:24

another AI startup.

14:27

And Sam was like, what the hell? And

14:30

she was like, well, it's an academic paper.

14:32

And he was like, everything you say matters

14:34

to this company. And so they were kind

14:36

of butting heads with each other. And increasingly,

14:39

someone in the middle was the

14:42

company's chief scientist, Ilya Sutskever,

14:44

who worked with Sam,

14:47

but also was really a big

14:50

time doomer. And so he kind

14:52

of found himself in the middle. And he consequently

14:55

also was the person who kind of invited

14:59

Sam onto the Zoom call or the Google Hangout

15:02

and said, we've got some news

15:04

to share with you. And you

15:06

know what happened next. They're not using Microsoft

15:08

Teams to do that. That feels very not

15:11

aligned with the brand. I

15:13

think when the reporting came out, I

15:16

cannot imagine Microsoft was super happy

15:18

about that. But yeah, Google Hangout. I'm

15:20

sorry, does anyone like Teams? No one

15:23

likes Teams. No.

15:24

No one likes Teams. Karen

15:27

uses it all the time. And it does not look good. I'm

15:33

sure even the Microsoft employees are using

15:35

it reluctantly, let's be honest. Microsoft

15:37

is important to this story. This is an aside before we

15:40

dig into some of the other wild things.

15:44

They injected, what, $13 billion? They

15:46

had the $13 billion investment?

15:48

$13 billion. So as I've

15:51

said, I know there's like a million different aspects

15:53

of this story. And that's why everyone's so confused.

15:56

But just to

15:58

try and put the Microsoft thing in context. So,

16:01

OpenAI was a nonprofit.

16:04

Elon Musk was associated. Elon

16:07

Musk decides he's

16:09

bailing.

16:10

They go, oh, we need money.

16:13

It requires a lot of money to build a complex

16:15

AI system, it turns out, of course. And

16:18

so they go, let's kind of reformat

16:20

this, create this for-profit arm. We

16:22

can use it to get investment.

16:24

Microsoft goes,

16:26

great. We would

16:29

love to give you, first, a billion dollars

16:31

and, ultimately, $13 billion. So

16:35

they own, technically, 49% of the company.

16:38

But unlike a typical

16:40

corporate structure,

16:42

an investor that owns 49% of generic Corporation X

16:48

has a lot of sway in the decision-making

16:50

of that company, because the board

16:53

is beholden to the shareholders so

16:55

the shareholders can exert a lot of influence.

16:58

OpenAI actually is not like

17:00

that.

17:01

The board is not,

17:05

its mandate is not to please

17:07

shareholders. It's to make

17:10

sure, effectively, that AI is developed

17:12

in a safe way and doesn't take over

17:14

the world and that Terminator doesn't come to pass.

17:17

And so, as a result, Microsoft

17:18

had invested all this money and

17:23

did not have any legal

17:25

effective power at the company. And

17:27

so when they found out that –

17:30

yeah, go ahead. So I just want to make sure that I understand.

17:33

So there's a nonprofit with

17:36

the board, and that nonprofit

17:38

is in control of the for-profit

17:40

company that Microsoft has invested in. Exactly.

17:45

And that board is very small. And that board is very

17:47

small. And those two entities are separate

17:49

enough that if you are Microsoft

17:52

and you invest the billions into

17:54

the for-profit thing, you don't

17:56

have much control over it.

17:59

Yeah. And I mean –

17:59

And until Friday, there had been

18:02

a pretty good argument that

18:04

they were creating a better version of a corporation,

18:07

or at least a better

18:10

version of a business,

18:12

one in which we can take all this money from

18:15

Microsoft so that we can build all these

18:17

fancy toys, but we're not

18:19

at the end of the day, just

18:21

like a shareholder capitalist function.

18:24

Well, we're

18:26

not just beholden to shareholders. We

18:29

in fact have a higher mission. We're going

18:31

to create a responsible AI,

18:33

a better AI. And as a result, we've

18:35

created a better system that looked great

18:38

until Friday, when obviously

18:41

shit hit the fan. It's funny

18:44

as when all of this originally went down, I

18:47

joked to myself in my own head, because I remember

18:50

maybe like a week ago, a

18:52

hot Sam Altman quote dropped where he said

18:55

that, AGI is

18:57

going to be a magic intelligence in the sky.

19:00

I was like, oh, it sounds like they're trying to build God or something.

19:03

I wonder if that's why they fired him.

19:05

It sounds like something

19:08

like that may actually be at the root

19:10

of this.

19:12

I think the root

19:14

conflict is very similar to that.

19:18

Yes.

19:19

Sam, well, one way to think about it would

19:21

be Sam Altman

19:24

and I'm generalizing here, but

19:26

if you'll let me, Sam Altman is

19:28

the Silicon Valley move fast and break

19:30

things guy at the company. And

19:32

then you have the more conservative

19:35

members of the board and this guy, Ilya

19:38

Sutskever, who was the chief scientist,

19:40

and they are the, we could destroy

19:43

the world. Everything could end

19:45

as a result of what we are building right

19:47

now. And so those two factions

19:50

were constantly kind of at each other's

19:52

throat. I don't know which

19:54

is right, but I could say

19:56

for sure that that was creating a lot of conflict

19:59

at the

19:59

company's highest levels.

20:02

What? I... okay.

20:05

Just the whole thing is so strange.

20:07

It's just like that was that was really what I learned

20:10

watching all because I was

20:13

not paying attention to the corporate structure of

20:15

OpenAI. You know, I was interested in ChatGPT

20:18

and AI and their effects on society

20:20

and my job, obviously, but I

20:22

didn't realize how

20:24

weird it was at the top

20:27

of this thing. Will

20:29

you tell me, so we know who the chief scientist

20:31

is, who are the other people on the board? I'll

20:33

tell you who they were two days ago and I'll

20:36

tell you who they are now. How does that sound?

20:38

This is I love this. This is great. Yeah,

20:42

because I assume that when you lose

20:44

a fight like this, you

20:46

maybe don't get to stick around, right?

20:49

Yes. So

20:52

let's say a year ago, there were nine

20:54

people on the board.

20:56

Three of them left.

20:57

Some of them were like Reid Hoffman,

20:59

but blah, blah, we don't have to get bogged

21:02

down in that.

21:03

Then there were six:

21:05

Sam Altman,

21:06

Greg Brockman, move fast and

21:08

break things.

21:09

Then you had the four others that voted

21:12

against them in the end. One was

21:14

Ilya Sutskever, the chief scientist,

21:16

a little bit doomer but loves

21:19

AI. Then you had, weirdly, for

21:21

reasons I still can't quite understand, Adam

21:24

D'Angelo, who's the Quora

21:26

CEO. And then you had two other

21:28

people, Tasha McCauley

21:31

and Helen Toner. These are the effective altruist

21:33

types. One of them is weirdly married

21:36

to Joseph Gordon-Levitt. Just

21:39

as an aside. Wait, wait, wait, wait, really?

21:42

That is true. Yeah. Okay, cool. Cool,

21:44

cool, cool. So that's just giving you a little bit

21:46

of flavor. So anyway, that

21:49

was the six you had Greg and Sam

21:51

on one side versus these four others.

21:54

The four, you know, kicked them out,

21:57

well, kicked Sam out, Greg leaves, and

21:59

blah, blah, blah, infighting, infighting, in-

22:01

fighting. It's all blown

22:03

up through this deal that got cut basically

22:06

last night. And now you have only

22:08

three people on the board as of today.

22:11

The Quora guy remains. And

22:13

then you have two kind of weird characters

22:16

come in. One is Bret Taylor, who used

22:18

to run the Twitter

22:20

board and he shoved the deal down

22:22

Musk's throat. He's kind of

22:24

like your kind of classic old school

22:27

Silicon Valley type, former CTO

22:29

of Facebook. And then for reasons

22:31

I really, really, really cannot

22:33

understand. Larry Summers,

22:35

who is just like a strange

22:38

character that continues to pop up in

22:40

the American story, former

22:43

Treasury Secretary, former Clinton

22:45

guy, former Obama guy, former president

22:48

of Harvard, somehow weirdly

22:50

involved in seemingly every major crisis

22:52

in American history. So you've only got

22:54

three people there now. Now, supposedly they're going

22:58

to grow it, but who knows? So I

23:00

mean, this is a huge blow

23:03

up overnight. The structure isn't changing.

23:05

Structure is not changing as far as I

23:07

know. You know, by the time you publish this podcast,

23:10

maybe it will. It is 11:30 a.m. on November 22nd. Again, it is 11:30 a.m. on

23:12

November 22nd.

23:19

It's

23:21

hard to say anything. You know, there's

23:23

lots that we don't know, but I do know that

23:25

the effective altruists, and

23:28

they are also, you know, I don't want to just boil

23:30

them down to that alone. They are, you

23:33

know, also entrepreneurs

23:35

and researchers in their own rights. And I do think there's

23:37

validity to being concerned about AI,

23:40

but they are gone.

23:41

They've decided to cut a deal.

23:44

And as part of that deal, Sam can come

23:46

back, but he can't be on the board and

23:48

he has to undergo an investigation

23:51

into the vague claims that they had made about

23:53

his communication issues. So there's all

23:55

sorts of weird deal making happening. I

23:57

mean, personally, I think they really.

24:00

screwed it up and blew

24:02

their hand. But, you know, that's just me.

24:05

Yeah. Talk about the, because that was the,

24:08

the board issued a statement. Maybe

24:10

that's the wrong way to say it. The board said that

24:13

the reason they fired him was that he had weird

24:15

communication issues. Right. Yeah.

24:18

That's what they said. And then everyone was like,

24:20

well, what does that mean? Did he

24:22

pull a Sam Bankman-Fried?

24:24

Did he lie to

24:26

the board? Did he, uh, I don't

24:28

know. You,

24:31

there was all sorts of speculation about what

24:34

it was, publicly, and at the company,

24:37

they were like, what did he do? What did

24:39

he do that caused you to do this? And,

24:41

you know, ironically,

24:44

the accusation

24:46

they leveled about him was that he wasn't good at

24:48

communicating. But ever since

24:50

they put out that statement, their communication

24:53

has been ridiculously bad, both

24:55

in

24:56

the company, just saying,

24:58

uh, what it wasn't: it wasn't a

25:00

safety issue, it wasn't financial malfeasance,

25:02

but never saying what it was. And they also,

25:05

and this is my personal opinion,

25:07

absolutely screwed

25:09

it up publicly. Because once they

25:11

put out that statement, they basically went

25:14

silent and Sam was able to drive

25:16

the conversation through the media. And

25:18

then like, we still don't know what their

25:20

argument was. And that was like a huge,

25:23

huge misstep. But to me, it seems like

25:25

they're kind of more academically focused

25:27

people who are a little bit out of their league in

25:29

this sort of succession-type

25:32

story. But

25:33

yeah, they never said anything. And they

25:35

should, and in my opinion, they should have. I

25:37

mean, even if they had to leak it through the press, because

25:39

like, one, I want to know.

25:42

And two, if that was it, that's,

25:45

you know, that's pretty weak to just say

25:47

that someone has communication issues and leave it at

25:49

that. And

25:53

I'm not really a Sam Altman defender.

25:55

Like, I don't believe that this guy is God.

25:58

I think he's like a deal maker, you know,

26:00

a deal maker who's not even like

26:03

the genius behind all

26:05

of the technological breakthroughs, but

26:07

like they really screwed the

26:10

pooch is what I think personally.

26:13

And I mean, after, you

26:15

know, as this was all going on, there was that letter

26:17

from all of those

26:19

OpenAI employees who were basically saying,

26:22

bring Sam back, this is ridiculous. And

26:25

I remember, you know, this is this all

26:27

happened so fast. So please do correct me

26:29

if I am bungling anything.

26:31

If you get anything, yes.

26:35

And if you get anything wrong about this story,

26:37

it's completely fine. If I get anything wrong

26:39

about the story, maybe a little bit less fine,

26:41

but still, I think socially acceptable because

26:44

the amount of stuff that's come out about this over

26:46

the last five days... five days, it feels like five

26:48

years. Yeah, I feel like as

26:51

a

26:51

total aside, I feel like, you know, as

26:53

we were getting to record like, you know, getting

26:55

ready to record this morning, I feel

26:57

like I woke up, saw that Sam Altman

27:00

was once again named the CEO

27:02

of OpenAI, or the head of the board, or whatever.

27:04

And I was like, you know, I guess

27:08

we've done a lot of there's been a lot of stuff

27:10

happening, but none of it matters now.

27:12

And we literally just spent

27:15

a week going around and

27:18

getting back to the start. Yeah,

27:20

I mean, that there's a whole

27:22

moral somewhere in there, although I'm not sure what

27:24

it is. But to answer your earlier question,

27:26

yeah, I mean, the employees were

27:28

completely behind Sam and I think that was actually

27:31

a really important reason that

27:34

the members of the board ended up

27:36

cutting a deal to bring him back because,

27:38

you know,

27:39

they claimed, though, that they're all about safety. And

27:41

I believe that they really are the

27:43

members of the board that were really concerned about

27:45

it. But then what happened

27:48

was you had 700, I think it was 700 of 770 employees saying, if

27:51

you don't bring them back

27:54

and you don't resign, we're just going to quit

27:56

and go elsewhere. And then you have a real

27:58

problem on your hand that I think

27:59

think actually is arguably an even

28:02

worse safety issue, which is that you have

28:05

this incredibly influential technology

28:08

with literally no one running

28:10

it who had created it. And that,

28:12

you know, we know is a problem. I mean, you can look

28:15

at like what happened to Twitter after they got

28:17

rid of everyone. It became

28:19

a disaster. The idea that

28:22

something similar could happen to ChatGPT,

28:24

which people are just putting in like

28:26

untold amounts of information to every

28:28

day was really scary. So

28:31

I think that, you know, it's

28:33

not how I would exert my

28:36

influence as an employee, but you know, they

28:38

wanted to do it to bring back their boss and they were

28:40

successful.

28:42

All right, cyber listeners, we're gonna pause there for a break.

28:44

We'll be right back after this. Green

28:47

Chef is the number one meal kit for eating

28:49

clean with dinners that work for you, not

28:51

the other way around. No matter

28:53

what's going on, my wife and I would like to come

28:55

together at the end of the day to cook a fresh meal. And

28:58

I love Green Chef because it handles all

29:00

the shopping and planning for us. There's

29:02

no last minute runs to the grocery store or fussing

29:05

over what we're in the mood for. Green

29:07

Chef, it has handled all of

29:09

that. All we need to do is follow the simple

29:11

instructions and a healthy meal is ready in 25 minutes

29:13

or less. Are

29:16

you looking to stock up on functional snacks

29:18

and clean beverages to energize you through the holidays?

29:22

Well, shop Green Chef's new green bundles,

29:24

which are available at Green Market. It's one-stop

29:26

shop for nutritious grab and go breakfasts.

29:29

They've got vegan options, brunch kits, wholesome

29:32

lunches, ready to eat snacks, veggie sides,

29:34

and more. You can easily add those to

29:36

your weekly order. So feel your best

29:39

this November with seasonal recipes featuring

29:41

certified organic fruits and vegetables, organic

29:43

cage-free eggs, and sustainably sourced

29:46

seafood. Cyber listeners

29:48

can get a great deal. For Green Chef's best deal

29:50

of the year, get $250 off

29:52

with code Cyber250 at GreenChef.com

29:55

slash Cyber250. For

29:58

Green Chef's best deal of the year. Get $250

30:01

off with code Cyber250 at

30:04

GreenChef.com slash Cyber250. Green

30:07

Chef, the number one meal kit for eating

30:10

well. Hey

30:12

there cyber listeners, Matthew here. This episode

30:15

was brought to you by Delete.me. A

30:17

few years ago I did some reporting that made

30:19

a few people pretty angry. I got a few

30:22

death threats and I started to worry about just

30:24

how much of my private information was available in

30:26

the public. We've got a safety team

30:28

here at Vice that helps journalists navigate these kinds

30:30

of situations. But the single best thing

30:32

that the security team did was sign

30:34

me up for Delete.me. Delete.me

30:37

is a service that will help you take control of all

30:39

the bits of information about you and your family that end

30:42

up online. Your name, your address,

30:44

the names of your family, even your shopping habits

30:46

are scraped and cataloged by data brokers.

30:49

Delete.me will help you find that information and,

30:52

when it can, remove it. Now,

30:54

not everyone is in danger of being doxxed, but

30:57

we're all getting more spam in our inbox

30:59

and on our phones. Everyone is in danger

31:01

from identity theft scams and phishing attacks. Everyone

31:04

has more private information squirreled away on the hard

31:06

drives of data brokers than they'd probably like.

31:09

Delete.me helps you fight back against the scammers, spammers,

31:12

and other malevolent forces online. After

31:14

you sign up for Delete.me and tell it what information

31:17

to look for, its scouts go to work rooting

31:19

out your personal information from websites and

31:21

data brokers. Then it gets removed.

31:24

And if it can't get the information removed,

31:26

it lets you know who has it and what exactly

31:29

they have. Delete.me even

31:31

generates a custom report for you that lets

31:33

you see how exposed you are and

31:35

new tools like email masking reduce inbox

31:37

clutter better than a typical spam

31:39

filter. Using Delete.me

31:42

is like having a partner that's looking out for you. We're

31:44

all living on the internet more than ever, putting

31:46

more of ourselves out there, but that

31:48

exposure creates risk. Delete.me

31:51

is a partner that helps you navigate that risk. It

31:53

certainly helped me feel safe after

31:55

I started getting death threats and

31:58

I've been really excited to be signed up with them for the

32:01

next year. My new report

32:03

is coming in soon. They've already removed me from

32:05

like 70 odd lists in the past few months

32:08

and I'm excited to see how many more they can get

32:11

me off of. So, take control

32:13

of your data by signing up for Delete.me.

32:16

Now, get 20% off your Delete.me plan when

32:18

you go to joindeleteme.com slash

32:20

cyber and use promo code cyber.

32:22

The only way to get 20% off is to

32:24

go to joindeleteme.com

32:27

slash cyber and enter promo code

32:29

cyber at checkout. That's joindeleteme.com

32:32

slash cyber, promo code cyber.

32:35

AI might be the most important new computer technology

32:38

ever. It's storming every industry

32:40

and literally billions of dollars are being invested.

32:43

So, buckle up. The

32:45

problem is that AI needs a lot of speed and

32:47

processing power. So, how do you

32:49

compete without costs spiraling out

32:51

of control? It's time to upgrade

32:53

to the next generation of the cloud, Oracle Cloud

32:56

Infrastructure, or OCI. OCI

32:59

is a single platform for your infrastructure, database,

33:01

application development and AI needs.

33:04

OCI has four to eight times the bandwidth

33:07

of other clouds, offers one consistent

33:09

price instead of variable regional

33:11

pricing and of course, nobody does

33:13

data better than Oracle. So,

33:15

now you can train your AI models at twice the speed

33:18

and less than half the cost of other clouds. If

33:21

you want to do more and spend less like Uber,

33:23

8x8 and Databricks Mosaic, take

33:25

a free test drive of OCI at oracle.com

33:29

slash cyber. That's oracle.com

33:31

slash cyber, oracle.com

33:33

slash cyber. All

33:35

right, cyber listeners, we are

33:38

once again talking about OpenAI.

33:41

So, for a little while in the

33:45

liminal space that we all lived

33:47

in, and may still in fact be living in, Microsoft

33:51

was going to scoop up everybody?

33:55

I guess so, yeah. I mean, I

33:57

think, you know, like I said, Microsoft

33:59

had no legal power as a shareholder

34:02

in this situation, but they did try

34:04

as hard as they could to exert control over

34:07

it. So, you know, Microsoft

34:09

CEO was really

34:11

pissed off that he didn't get any

34:14

notice about what was happening on Friday. And

34:18

then he was trying to get Sam

34:20

back into the organization on Sunday.

34:22

And then when they went instead with the Twitch guy,

34:25

they, he was like, well, I'm just

34:27

gonna hire them all myself then. Basically,

34:31

he was like,

34:32

if you work at OpenAI and

34:34

you wanna come work for Microsoft, come

34:37

on over. And so I think that he

34:39

was also, I mean, in retrospect,

34:41

definitely, trying to exert pressure

34:43

on OpenAI to say, bring this

34:46

guy back, or we're literally gonna take

34:48

this thing down. I

34:51

wanna highlight some of the wilder characters

34:53

in this before we put

34:56

a bow on where we are at the moment. So

35:01

for a hot minute, Emmett Shear was

35:04

gonna be in charge. Who's Emmett

35:06

Shear?

35:07

Who's Emmett Shear? Who's

35:09

Emmett Shear?

35:11

What's weird about Emmett Shear, that's the Twitch

35:13

guy? I

35:16

don't know. According

35:18

to Twitter, a lot of things. Right after

35:20

he got the job, a lot

35:22

of his old posts resurfaced, and a

35:25

lot of them were, how can we

35:27

say, of a questionable nature,

35:32

about some of his theories on questions

35:35

about sexuality. I'll leave

35:37

you the listeners to

35:39

look that up themselves. Probably

35:42

shows that they did not have

35:44

a huge amount of vetting in the two hours

35:47

before they named him. But yeah,

35:49

I think the main thing about

35:51

Emmett was

35:54

that he was a doomer. He

35:57

had compared AI to

35:59

a fusion

36:02

bomb, and he said, you know,

36:04

OpenAI is

36:06

moving at

36:07

level 10, and he'd

36:13

rather it be moving at level 1 or 2,

36:16

and so I think

36:19

that if you look at the board, it's really

36:21

interesting to me.

36:22

I'm not going to be mean here, but moving

36:24

along. So we got Mira Murati in

36:26

and she did a commendable job for

36:28

about 24 hours. It

36:34

seems to me that there was this kind of...

36:37

There's one thing here, and this

36:39

is more fundamental. The

36:41

big going concern,

36:42

the hot ticket

36:45

in tech,

36:47

was run by people

36:50

who thought that they were building a

36:52

nuke and maybe shouldn't do it. It's

36:55

a very strange

36:57

situation. I still

36:59

have a little bit of trouble wrapping

37:02

my head around, and it's

37:04

made me think,

37:06

should the

37:07

people... It's a failure

37:09

of self-regulation, I think, if

37:12

nothing else, you know.

37:14

Should the people within the company

37:17

be responsible for putting

37:19

the brakes on a potentially

37:22

world-changing,

37:24

world-altering, world-destroying technology?

37:27

I think the evidence from this weekend

37:30

says we shouldn't at least on

37:32

our own trust them to be responsible

37:35

for that.

37:36

Anyone who appears as a character in a Reddit

37:40

atheist-aligned Harry Potter fanfic

37:43

should not be in charge of anything. Oh my

37:45

God, how could I forget the fanfic? Don't do this

37:47

to me. I'm sorry,

37:50

you know about this, right?

37:51

I very much know about this. This

37:54

is going to be a very, very brief overview,

37:57

which is because I am...

37:59

Harry Potter

38:01

and the Methods of Rationality is a fan fiction

38:04

that is basically Harry Potter in

38:06

name only and that is like this huge

38:08

Harry Potter fan fiction that Emmett Shear

38:10

is a named character in. 404 Media

38:14

did an article about it last night that I haven't read

38:16

yet because I try to do what

38:19

I can to preserve my sanity

38:21

as much as I can and

38:23

knowing things about the Harry Potter fan fiction

38:26

world drama, I've

38:28

sworn off that.

38:30

So it's written by a guy,

38:33

Harry Potter and the Methods of Rationality,

38:35

it's written by a guy who's like the founder

38:37

of Less Wrong and is one of these

38:40

big Reddit atheists extreme rationality

38:42

guys who

38:45

had an op-ed in Time

38:48

maybe the middle of last year about

38:51

why AI is going to lead to the nuclear

38:53

apocalypse. He's

38:56

extremely doomer-y. He's

38:59

one of the guys that thinks like we're

39:02

going to hook up AI to like

39:05

3D printers and the 3D printers are going to make

39:08

diseases that will kill all of humanity

39:10

because why would AI not do that? It's

39:13

the rational thing for AI to do. And

39:16

so like Harry Potter and the methods of rationality

39:18

is like this big fanfic opus

39:21

that he wrote that's

39:24

like a way to rational pill Harry

39:27

Potter fans. The whole thing is so

39:29

weird. Is this something I have to read or that

39:31

I can avoid for the rest of my life?

39:34

I don't think anybody in the year 2023

39:37

needs to read a Harry Potter fanfic.

39:38

I think we're good. Harry Potter's

39:41

time is well past us. No

39:43

offense to

39:44

my friends who still do that but let's

39:47

keep it 100 here.

39:48

Read

39:51

rational Harry Potter? Nobody

39:54

wants that. Go

39:56

read My Immortal instead. Exactly.

39:58

Thank you.

40:01

So

40:03

the other

40:04

person I just want to have

40:06

some words said about is the chief scientist.

40:10

There's some stuff written in

40:13

the Atlantic about him

40:15

and how he's kind of, we've already said

40:17

he's a weirdo, but

40:19

he would do things like lead chance

40:21

at the company. What

40:24

is, what's, what? Ilya

40:27

Sutskever, probably the most interesting

40:29

person in this whole tale

40:32

of corporate intrigue. I

40:35

guess the main thing that I would say is

40:37

that he is well regarded as a

40:39

complete genius. He is like

40:42

an AI deep learning genius,

40:45

one of the most well respected people in

40:47

the field in the pure nuts and bolts of

40:49

deep learning. He came

40:51

from the academic world. He did a little

40:53

bit of Google stuff. Musk personally

40:56

recruited him over to

40:58

OpenAI,

40:59

but he also is super

41:01

into the sort of doomer

41:04

safety side. And so, you know, he,

41:06

he, I think, created

41:09

a sort of, I'm trying

41:11

to think about how to even describe this.

41:15

He basically burned

41:18

something in effigy that was supposed to

41:20

represent, you know,

41:22

a bad AI and he was supposed to

41:25

be burning away all of like these bad

41:27

AIs. He

41:29

made a totem. Yeah, he made a totem to burn

41:31

in effigy as like a religious ritual

41:33

to burn away the bad

41:36

AI. Exactly. And I think the other,

41:38

there was sort of almost a spiritual component

41:41

to his belief. So on the one

41:43

hand, you had this really hard line belief

41:46

in the power of deep learning and

41:48

AI. And on the other hand, he had this

41:50

sort of spiritual side that

41:52

said, you know, we have to remain

41:54

cognizant of the fact that this can destroy

41:57

the world. And like we were talking about earlier.

41:59

It's a confusing dichotomy

42:02

that he is so obsessed with building

42:04

it and simultaneously, you know,

42:07

like, absolutely terrified

42:09

of what he could be building. But

42:12

yeah, I mean, he, as a result of that,

42:14

became increasingly concerned about

42:17

Altman. He eventually went

42:19

to the board with his concerns. He,

42:21

you know, was worried that Altman was, you

42:24

know, heading down a path that, you know,

42:26

was

42:28

the antithesis of what OpenAI should

42:31

represent. He was the one who called

42:33

Altman into the board meeting to

42:36

get him fired. If he

42:38

didn't do the firing himself, he very

42:40

well might have. And then, bizarrely,

42:44

he switched and called for Altman

42:46

to return, probably maybe

42:49

because of pressure from the other employees.

42:51

But he's just been all over the place. I

42:53

mean, philosophically, he's building

42:56

the thing and terrified of it. He,

42:58

you know, wanted Altman fired.

43:01

And then three days later was saying what a

43:03

horrible mistake it was. He might

43:06

well represent OpenAI better than

43:08

anyone else because he has

43:10

the tools, but he also seems

43:13

to not quite understand

43:16

the corporate world and how to make

43:18

moves inside large, complex

43:20

organizations. I love

43:23

watching dudes struggle

43:26

with the idea of God and

43:28

how they're unable to resolve it. And then like what happens

43:31

after? Like so

43:33

many people, like, get

43:36

real into atheism, and then things

43:38

can get really strange afterwards. Right.

43:42

It's like we've got the tools to build what I would

43:44

think of as a classical image

43:47

of, like, God, this thing

43:49

in the sky that watches over us. We should

43:52

do that. That sounds like a great idea. Right.

43:55

Say some of the AI scientists.

43:58

I guess the thinking is like someone's

44:00

gonna build it, we should be the ones who

44:02

build it since we're like the

44:04

responsible ones but

44:07

I think that might be a good

44:09

theory that might not have realistic

44:12

applications. Yeah,

44:14

you're spending too much time talking to your word calculator

44:16

buddy. Right, exactly. You're

44:20

doing some projection there. Yeah, right.

44:23

Let's

44:25

expand this out away from OpenAI if we

44:27

can. You also wrote

44:29

this great piece before this kind

44:32

of about Wall Street's love affair

44:34

with the idea of AI in general. You know,

44:36

we're at this era where we're post crypto crash

44:39

where there's kind of this big I would say

44:42

like drive to find

44:44

what the next tech bubble is

44:46

going to be and I would say we're firmly in the middle

44:48

of the AI tech bubble, right?

44:51

Definitely. I think you know when

44:53

Wall Street sees something new, they see

44:56

opportunity and

44:58

you know, I think for them

45:00

a technological breakthrough

45:03

might mean, you know, some innovation

45:05

that leads to a productivity increase

45:07

that leads to more profits, which means

45:09

that they should invest in the company. That's

45:12

one side of it. What I was interested

45:14

in is how they're using, or

45:16

planning to use, AI within the banks

45:19

themselves. So I mean Jamie

45:21

Dimon,

45:22

CEO of the biggest

45:24

bank by assets in the US, you know

45:26

he said that it's going to be critical

45:28

to his company's future success

45:31

and you know they're already

45:33

toying with ways to you know summarize

45:37

legal documents, sift through

45:41

earnings reports, you know sift

45:44

through news reports. That

45:47

stuff I think is really interesting. There's cyber

45:49

security, cyber

45:51

security elements, blah blah blah. What I'm

45:53

really interested in is this

45:55

kind of fringe that is testing

45:58

using AI to make investment

46:00

decisions, you know, like, let's

46:03

entrust the AI to

46:05

tell us to invest in company X,

46:07

Y, and Z, and how much

46:09

to invest and then to switch over to company

46:12

B. That is a really

46:14

big change that I think is pretty interesting.

46:17

And while it's on the fringe, I

46:19

suspect it's on the fringe in the way that 15

46:22

years from now, it will be part of the

46:24

mainstream. Yes. I mean, Wall

46:26

Street embraces this kind of thing

46:28

pretty readily most of the time, right? Like they were

46:31

a big reason that

46:33

we've got some

46:35

of the internet infrastructure we

46:37

do in the country is so that they can

46:39

make lightning fast trades from coast to coast.

46:42

Yeah, I mean, Wall

46:45

Street is always looking for

46:47

an edge. I mean, that's the whole

46:50

game. Financial

46:53

innovation essentially is looking

46:55

for a way to repackage

46:57

old things to make

46:59

them look new, or, you know,

47:01

to figure out a way that they can be slightly

47:04

better than the trading firm across the

47:06

street. So, yeah,

47:08

they're always looking for an edge and computers

47:11

have increasingly been a part of that.

47:12

That's why we had high frequency

47:15

trading become so popular, you

47:17

know, over the last

47:19

century. It allowed

47:21

them to kind of

47:23

trade faster, get an edge, and

47:26

occasionally create a flash crash that

47:28

would take away billions in shareholder

47:30

value in a couple seconds. Is

47:32

it possible that, so like,

47:35

can you get into kind of the nuts and bolts of what they're

47:37

talking about actually doing with the AI? Yeah,

47:39

so there's a lot of different stuff.

47:41

I mean, you know, on

47:43

a more basic level of what's already

47:46

happening, Vanguard, big

47:48

financial institution is

47:51

using AI to generate retirement

47:53

portfolios. Morgan Stanley

47:56

has sort of a ChatGPT assistant

47:58

for financial advisors,

47:59

to help them kind of interact

48:02

with clients.

48:04

JPMorgan Chase has

48:07

created a patent for something called

48:09

IndexGPT. Well, supposedly that's

48:11

going to help traders decide

48:13

where to invest. And then

48:15

there's people on, like I said, you know, there's

48:18

people whose entire investment

48:20

thesis is based on the development

48:23

of an AI that tells them where to invest.

48:26

And then there's also people who are trying to use large

48:28

language models, ChatGPT

48:30

type things to help

48:33

traders say, like, help

48:36

me follow 50 stocks 24-7, help me

48:38

understand the

48:41

rule, the financial rule, through

48:44

which I can make bet X, Y, and Z. So

48:46

they're trying all sorts of things and

48:51

to varying degrees of success, I

48:53

think. So far, I think, you

48:55

know, AI has not been

48:57

able to beat the market, let's say.
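
[A note to make that concrete: below is a minimal, illustrative Python sketch of the "LLM as trading assistant" workflow Maxwell describes, following a watchlist of stocks and summarizing the news on each, rather than anything from the episode itself. It assumes the OpenAI Python SDK (openai>=1.0) with an API key in the OPENAI_API_KEY environment variable; fetch_headlines() is a hypothetical stand-in for whatever market-data feed a real firm would use, and the model name is just a placeholder.]

    # Minimal sketch: an LLM "assistant" that watches a stock list and
    # summarizes headlines for a human trader. Illustrative only.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def fetch_headlines(ticker: str) -> list[str]:
        # Hypothetical placeholder; a real system would call a news or
        # market-data API here.
        return [f"{ticker} reports quarterly earnings",
                f"Analysts weigh in on {ticker}"]

    def summarize_watchlist(tickers: list[str]) -> dict[str, str]:
        summaries = {}
        for ticker in tickers:
            headlines = "\n".join(fetch_headlines(ticker))
            response = client.chat.completions.create(
                model="gpt-4",  # placeholder model name
                messages=[
                    {"role": "system",
                     "content": "You summarize stock news for a trader. Be brief and neutral."},
                    {"role": "user",
                     "content": f"Summarize today's headlines for {ticker}:\n{headlines}"},
                ],
            )
            summaries[ticker] = response.choices[0].message.content
        return summaries

    if __name__ == "__main__":
        # A human still makes the trading decisions; as noted above,
        # LLM output can be confidently wrong.
        for ticker, summary in summarize_watchlist(["AAPL", "MSFT"]).items():
            print(ticker, "->", summary)

[The point of the sketch is the shape of the workflow, summarize-and-flag for a human, as opposed to the autonomous buy/sell decision-making discussed, and flagged as riskier, later in the conversation.]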

48:59

Is this any different from, say,

49:02

bots that have already been deployed? I

49:04

mean, we've had those for years, right? Sam Bankman-Fried

49:07

was using one, I think,

49:09

both at FTX

49:12

and Jane Street. Yeah, it's

49:14

a good question. I think it is

49:16

different. Definitely the use of large language

49:18

models to try and make decisions, financial

49:21

decisions, as a sort of assistant,

49:24

is a different sort of innovation

49:26

that we haven't seen before. It comes with concerns

49:29

because a

49:29

lot of large language models tend to

49:31

be wrong a lot of the time.

49:34

So that could be a concern.

49:38

I think also, you know,

49:40

the idea of using complex, you know,

49:43

generative AI to

49:45

create a portfolio is a

49:48

different sort of thing than just using a bot.

49:51

I mean, Betterment is like a robo-advisor

49:53

and they kind of have algorithms

49:56

or models that they create and then allow

49:58

it to kind of do its thing.

50:00

That is different from having a

50:02

sort of quote unquote smart, generative

50:05

AI that's sort of making the decisions on

50:08

its own, learning, you

50:10

know, and developing a thesis as it

50:12

goes.

50:13

That is different. It's,

50:15

you know, potentially creates

50:18

more autonomy by the AI,

50:21

which could be of concern

50:24

if it makes a incorrect

50:26

decision and invests, you know, $10

50:29

trillion into, I don't

50:31

know, bookstores

50:33

or something like that, you know? And then,

50:36

you know, everyone's like, wait, why have

50:38

we over invested in bookstores?

50:40

Creates a bubble, crash. I think that's

50:43

the concern.

50:44

I'm sure Michael Lewis will write a wonderful

50:46

book. I'm sure that he will. Hopefully

50:48

he

50:51

finds someone who isn't committing a

50:54

massive fraud to

50:57

profile this time. Well, he would profile

51:00

the AI and really humanize it. Sure,

51:02

sure. In a very touching

51:04

manner, for sure.

51:06

I just want a Social Network of

51:09

this entire year's, you know, tech

51:11

rise and downfall.

51:12

Oh my God. When are we getting that

51:15

is my question. I promise

51:17

you that there are 45

51:19

agents in Hollywood right

51:21

now fighting for, you know, 10 scripts

51:23

or something like that. Yeah. Yeah.

51:26

Whichever one can sign Trent Reznor to

51:28

do the score, just let me know when that's there.

51:30

Exactly. Exactly. I would love Joseph

51:32

Gordon-Levitt as Sam Altman.

51:36

It needs to be an inside job.

51:37

Yeah. It's the, they

51:40

hire Aaron Sorkin to write their

51:42

version of what happened. The

51:46

movie I want to see is

51:49

Sam Bankman-Fried versus the

51:51

Binance CEO, you know, like

51:54

kind of like they were rivals

51:56

and it seemed like the Binance CEO

51:58

like basically took down,

51:59

or started, Sam Bankman-Fried's

52:02

free fall and then he fell this

52:04

week.

52:04

He fell as well. I

52:07

know we made fun of the Michael

52:09

Lewis book, but I actually do think there's

52:12

a lot of good stuff in there and a lot of the finance,

52:14

a lot of the stuff between them, he

52:17

captures their interpersonal rivalry

52:19

pretty well and it's pretty fascinating.

52:22

Also, nice segue. I

52:25

know how to segue. What can I say? Yeah,

52:29

what's going on here? We're a year

52:31

out from the crypto crash. We're

52:34

deep in the aftermath. CZ

52:36

was one of the last guys standing, right? Yeah,

52:41

CZ, Changpeng Zhao,

52:44

is the CEO of Binance, which

52:46

is the world's largest crypto exchange.

52:50

I think Sam Bankman-Fried

52:52

and him, they were the big rivals,

52:54

the big two of the industry. I

52:58

think, ironically,

53:00

Sam fell first. I think there were always

53:02

a few more questions about CZ.

53:05

He lives in the UAE. They

53:08

really had no company headquarters.

53:11

There were all sorts of sketchy seeming

53:13

things that were happening. But

53:15

after Sam Bankman-Fried fell, he

53:18

was like kind of the last man standing in a

53:20

lot of ways.

53:22

But

53:22

as these things go, the

53:26

US federal agencies were

53:30

well underway in a long

53:32

investigation into all sorts of stuff

53:36

that, well, I guess we can say now, was

53:38

completely illegal. This

53:41

week, he kind of surprisingly

53:44

said that he would step down, plead

53:46

guilty to a criminal charge. And

53:48

so he's out at Binance. The

53:51

company is going to pay a $4 billion

53:54

fine

53:55

related to charges like

53:57

allowing

53:59

Americans to trade with Iranians and Russians

54:02

and things like that, which is against

54:04

the law for all sorts of reasons. And

54:07

yeah, but Binance will continue under

54:10

a new CEO. I

54:12

think that was a big part of the deal. But

54:15

yeah, I mean,

54:17

it's a fall from grace and it's almost so

54:19

much faster and less dramatic that

54:22

I think it's going to go a little bit more under the radar

54:24

than Sam's. CZ

54:27

was never as big like a personality

54:29

or a character as Sam was, right? Sam

54:32

was hanging out with football

54:34

stars and was very seen and

54:37

was trying to be the face of

54:39

this thing in a way that CZ wasn't. I

54:41

think Sam was trying to be the face for like the

54:44

general person who didn't know what crypto was.

54:47

Your mom's face of crypto is

54:49

what Sam was. CZ,

54:53

I think, instead has sort of like,

54:55

I mean, in the crypto space,

54:58

he was like a god, you know, they were

55:00

like bowed down to him. They just thought he

55:02

was an absolute genius, smarter

55:05

than everyone else, you know, always seemed

55:10

to be a step ahead of the US, kind of laughing

55:13

at the federal agencies, pretending

55:15

to be in compliance and then saying,

55:17

you know, no KYC behind

55:19

the scenes. I mean, he

55:22

was, you know, he kind of represented

55:24

a little bit of like the thumbing of the nose at,

55:27

you know, people who thought that, you know, the

55:29

rules should apply to them. And

55:32

you know, like their whole mantra is

55:34

to

55:35

create a freer

55:36

flow of money. And it seems like

55:38

they definitely did that because the

55:41

money was flowing pretty freely between

55:45

or at least through them and some terrorist organizations

55:48

that came out.

55:49

Yeah, it turns out you can't just invent securities

55:51

out of whole cloth and then say they're not

55:53

securities and shouldn't be regulated like

55:55

securities. Yeah, I mean, that I think

55:58

plays more into the

55:59

SEC investigation that's ongoing,

56:02

so we have not seen the last of it. But

56:04

it seems like he's going

56:07

to go into the background. He's saying

56:09

that he's going to

56:10

take some time off, maybe start investing

56:12

a little. Maybe I know the funniest

56:15

thing was that he was like, and I might make

56:17

myself available for mentoring if

56:19

people are interested.

56:21

I don't know if that's a good idea

56:23

or

56:23

not. I don't know. I

56:25

mean, it's a pretty crazy thing to say like,

56:27

hey, I just stepped down from my

56:30

organization and pled guilty. My

56:33

company had to pay a $4 billion fine. I'm

56:36

basically a disgrace now. And

56:39

I'm available if you want to chat

56:41

about how to start your startup. Just

56:43

put time on my calendar. Yeah, exactly.

56:47

You learn more from failure than you do from success.

56:51

That's definitely true. And in

56:53

that

56:53

way, I guess he has a lot to tell people. Would

56:56

you rather talk to Napoleon right at

56:58

the height of his power, or do you want to interview

57:02

Napoleon when he's in exile on the island? Yeah,

57:05

I will say I'm joking

57:07

a little bit. I'm obviously saying these things in jest.

57:10

I don't really feel the need to be super kind

57:12

to CZ, but he did

57:14

say, at a minimum, I'll tell

57:17

you what not to do. If

57:20

you're my mentee, I'll tell you what mistakes

57:23

to avoid.

57:24

Good times. All

57:26

right, Maxwell, I think that about covers

57:28

this week's wild news. Thank you

57:31

so much for coming on to cyber. Everyone

57:33

enjoy your Thanksgiving. Have a good Thanksgiving,

57:35

guys.

57:37

Bye, everyone. Bye. Bye.


59:59

actors, authors, chefs,

1:00:02

musicians, and more about how the food

1:00:05

and the culinary traditions of their youth

1:00:07

shape their lives in interesting and

1:00:10

sometimes surprising ways.
