ScarJo vs. ChatGPT + Neuralink’s First Patient Opens Up + Microsoft’s A.I. PCs

Released Friday, 24th May 2024

Episode Transcript


0:00

Looking for a new ride? Shop for your next car from

0:02

the comfort of your home, 100% online with Carvana. Carvana

0:06

has a massive inventory, including thousands of cars

0:08

under $20,000, so finding

0:10

a car that fits your budget and

0:12

lifestyle is hassle-free. Carvana makes car financing

0:14

hassle-free. Get pre-qualified in minutes by answering

0:16

a few quick questions. Carvana also offers

0:19

customizable terms and payments as low as

0:21

$0 down, because payment

0:23

plans should fit your plans. Visit carvana.com or

0:25

download the app today to find your next

0:27

car without ever leaving the comfort of home

0:30

or anywhere else. Terms and conditions may

0:32

apply. I had an interesting experience recently.

0:34

I went into a float tank for the first time

0:36

Have you ever been in one of these? I

0:38

have seen them in sci-fi movies. Was

0:41

this like a bacta tank from Star Wars where

0:43

you're healed from your injuries? No,

0:46

this is like a trendy new thing

0:48

in the Bay Area where you basically

0:50

go into these like pods. Imagine if

0:53

like Apple designed a coffin. It's like

0:55

a shiny white pod the size of

0:57

your body, and you

0:59

go in and it's filled with

1:01

a couple inches of very

1:03

salty water. So you

1:06

basically just lie there and you float

1:08

for an hour inside this pod, and

1:10

it's supposed to be relaxing. And was

1:13

it relaxing? Sort

1:16

of. I mean, it's definitely relaxing to be

1:18

like floating gently in some like warm salty

1:20

water, but it is a little

1:22

claustrophobic because you can make it totally dark in

1:24

there and do sort of sensory deprivation. But

1:27

the thing that actually made it less relaxing

1:29

for me was that you have music that

1:31

you can choose from, but the selection is

1:33

not good. It's like cheesy

1:35

yoga music. It's like pan flute. It's like,

1:38

you know, chimes. It's not what I wanted

1:40

to be listening to down there. Which

1:42

was espresso by Sabrina Carpenter. That's

1:45

what I want to be listening to. Kevin,

1:47

let me ask you this. Have you ever

1:49

heard of Taking a Bath? Because you're able

1:51

to get many of the benefits of driving

1:53

somewhere to float in two inches of water,

1:56

and you can choose your own music. Might want

1:58

to look into that. That's true. You know

2:00

you and I should go into a pod

2:02

together and just start talking about tech news for

2:04

an hour. You know what

2:07

they call that? A podcast. I'm

2:13

Kevin Roose. I'm a tech columnist at The New York Times.

2:15

I'm Casey Newton from Platformer, and this is Hard Fork. OpenAI

2:19

wanted Scarlett Johansson to be the voice

2:21

of ChatGPT, but then something got

2:23

lost in translation. Then Noland Arbaugh,

2:25

the first person to get

2:27

Elon Musk's Neuralink implanted in his

2:29

brain, joins us to talk about how

2:31

a brain-computer interface changes, well...

2:49

Kevin just as it seemed like

2:52

things were starting to settle after some wild

2:54

AI Demos last week

2:56

a shocking statement from one of the world's

2:59

most popular actresses has made us reconsider Everything

3:02

open AI has been telling us about its voice

3:04

assistant. Yeah, this is one of the craziest tech stories

3:06

of the year. I've been totally obsessed with every twist

3:08

and turn. I'm very excited to talk

3:10

with you about it today. Now, did you ever think we

3:12

would have a literal Avenger fighting back against the relentless march

3:14

of AI? Is that sort of

3:16

what this story is about? So

3:18

last week we talked here about the

3:21

announcement from OpenAI about their new

3:23

GPT-4o model. Yeah, which was

3:25

most striking for this very flirty voice

3:28

assistant that they used in the demos

3:30

they showed us. Kevin, remind us what

3:32

was so striking about that demo. So

3:34

the voice that they demoed, it was

3:37

this sort of lilting female voice. It

3:39

was a little flirty as you said

3:41

it sort of varied its register, kind of

3:44

giggled at its own jokes. It

3:46

was very lifelike and realistic,

3:48

and basically immediately as

3:50

the demo is going out people

3:52

start making comparisons to the movie

3:54

Her, and to Scarlett Johansson's character

3:56

in that movie, Samantha. And

3:58

like the company itself made that

4:01

comparison, invited that comparison. Yes, Kevin.

4:03

And people are actually calling this

4:05

the greatest act of cultural appropriation

4:07

since Scarlett Johansson was cast in

4:09

Ghost in the Shell. You

4:12

went there. That's right. So

4:14

on Sunday, OpenAI posts to its

4:16

website this mysterious blog post titled

4:19

How the Voices for ChatGPT

4:21

Were Chosen. And in the

4:23

blog post, it says, quote, We believe

4:25

that AI voices should not deliberately mimic

4:27

a celebrity's distinctive voice. Sky's voice is

4:29

not an imitation of Scarlett Johansson, but belongs

4:32

to a different professional actress using her own

4:34

natural speaking voice. Kevin, when you saw that

4:36

blog post go up, did you have any

4:38

idea what was going on? No,

4:40

but it was one of those things where it's

4:43

like, my "Sky is not an imitation

4:45

of Scarlett Johansson" t-shirt is

4:47

raising a lot of questions already answered by

4:49

my "Sky is not based on Scarlett Johansson"

4:51

t-shirt. It was like, if

4:54

you're saying this, clearly something is happening in

4:56

the background; you did not just decide to put

4:58

this out of nowhere. And it was

5:00

just a sign that things were going to get a

5:02

little weird. Yeah, absolutely. You know, this blog post went

5:04

up very late Pacific time on Sunday. And to me,

5:06

it was a sign that this was going to be

5:09

a rough night, which is the title of a 2017

5:11

film starring Scarlett Johansson. Okay, so

5:14

on Monday morning, things start to

5:16

become a little more clear when

5:18

OpenAI pulls Sky's voice from the

5:21

app. And Joanne Jang,

5:23

who is the model behavior lead at

5:25

OpenAI talks to the Verge and says,

5:27

quote, we've been in conversation with Scar

5:29

Jo's team. Pretty familiar there, Joey,

5:32

because there seems to be some confusion. We

5:34

want to take the feedback seriously and hear

5:36

out the concerns. And she

5:38

further suggested that maybe people hear similarities

5:40

because there are so few convincing female

5:43

voice assistants around. Does that seem convincing

5:45

to you, Kevin? No, of

5:47

course not. Because what? Well, you know, to

5:49

me, it just seemed like Scarlett was really

5:51

trying to get under the skin of OpenAI,

5:53

which, you know, under the skin is a

5:55

2013 film starring Scarlett Johansson. Oh, boy. All

5:57

right. So on Monday night, Scarlett Johansson

5:59

herself released a statement. And this is

6:01

the moment, yes, where sort of

6:03

the whole world stops. And she really lays

6:05

out quite a narrative that I think we

6:08

should walk through. Yes. So according to Scarlett, which

6:10

is what I call her, Sam Altman had approached her

6:12

in September 2023 about hiring

6:14

her to voice ChatGPT, saying

6:16

that it would be good for everyone to

6:19

see tech and creatives working together. And Kevin,

6:21

you remember that September was when they rolled

6:23

out voices to ChatGPT

6:25

in the app, right?

6:28

So around that same time, it seems, Sam

6:30

has this idea. And, you know, to

6:32

me, it seems...

6:35

It just

6:38

seems clear

6:40

that when

6:42

Sam

6:44

approached her, he wanted the prestige of

6:46

having her voice in the app, the

6:48

prestige of being associated

6:50

with the movie

6:52

Her, the aura

6:54

of it.

6:56

At the time, she declines, for whatever

6:58

reason, and then she writes in her

7:00

statement, nine months later, presumably referring to

7:02

last week: "My friends, family and the

7:04

general public all noted how much the

7:06

newest system named Sky sounded like me."

7:08

And how does she feel about it?

7:10

Well, she said, quote, when I heard

7:12

the released demo, I was shocked, angered, and

7:15

in disbelief that Mr. Altman would pursue a

7:17

voice that sounded so eerily similar to mine

7:19

that my closest friends and news outlets could

7:22

not tell the difference. Mr. Altman even insinuated

7:24

that the similarity was intentional, tweeting a single

7:26

word: "her." And this really threw me for

7:28

a loop. She says that two days

7:31

before the demo, Altman had reached out to

7:33

her agent asking her to reconsider, but then

7:35

OpenAI rolled out the demo with

7:37

Sky before she could respond. So isn't that

7:40

wild? Isn't that crazy? So I mean,

7:42

like, yeah, we'll get more into what it means

7:44

later, but this is the part where I'm just

7:46

like, oh, they screwed this whole thing up so

7:48

badly. Yes. And also, like, if there was a chance

7:50

that a deal could still work out, why wouldn't you,

7:52

you know, wait for the answer? And I think the answer

7:55

to that, by the way, is that they wanted to sort

7:57

of upstage Google before its developer conference. But to

8:02

her lawyer sent two letters to Altman at OpenAI

8:04

asking for a detailed accounting of the process that

8:07

created the voice. And I think that is

8:09

probably what led to the blog post that went

8:11

up on Sunday night. And she closed with

8:13

a call to action. And I don't know if

8:15

we want to play the Star Spangled Banner underneath

8:17

this. I do think it would sound nice. I'll

8:20

just sort of read the quote in a time

8:22

when we are all grappling with deepfakes and

8:24

the protection of our own likeness, our own work,

8:26

our own identities. I believe these are questions

8:28

that deserve absolute clarity. I look forward to

8:30

resolution in the form of transparency

8:32

and the passage of appropriate legislation

8:35

to help ensure that individual rights

8:37

are protected. So after

8:39

that, Kevin, in the middle of the week,

8:42

OpenAI puts out a statement which they

8:44

attribute to Sam Altman. And it

8:46

says, quote, The voice of Sky is

8:48

not Scarlett Johansson, and it was never

8:50

intended to resemble hers. We cast the

8:52

voice actor behind Sky's voice before any

8:54

outreach to Ms. Johansson. And out of respect

8:57

for Ms. Johansson, we have paused using

8:59

Sky's voice in our products.

9:02

So that was the statement in the middle

9:04

of the week. And I have spent the

9:06

last few days, Kevin, trying to figure out

9:08

what in the Vicky Cristina Barcelona is going

9:10

on here. What did you find? Well, I

9:12

found first of all, that is a Scarlett

9:14

Johansson movie. I got that.

9:17

All right. All right. So on Thursday,

9:19

I had a chance to ask OpenAI

9:21

some questions. And my first question was,

9:23

who exactly at this company knew what

9:25

the heck was going on? Okay. And

9:27

what I was told was this, the

9:30

voice team decided they wanted to

9:32

record five voices for ChatGPT.

9:35

But after that, they decided, hey, it

9:37

would be cool if we could get

9:39

Scarlett Johansson. And as part

9:41

of that, Sam Altman was sent out

9:43

on a mission to get Scarlett Johansson.

9:46

And according to them, that is when

9:48

he reached out to her in

9:51

September. And to sort of

9:53

bolster this timeline, they did a couple things. They

9:56

showed me a job posting from May of last

9:58

year where they advertised for actors for

10:00

these roles and I saw the job

10:02

posting. It did not mention Scarlett Johansson.

10:04

It did not mention Her or any

10:07

other movies. They played... They did not

10:09

say only Black Widows may apply. No,

10:11

it didn't say that. Okay. They

10:16

then played for me a clip

10:18

from Sky's Audition where

10:21

she talks about, you know, walking around with

10:23

her toddler and basically

10:25

just gives you the impression of,

10:27

no, this is a real voice. This is

10:29

not a composite of other people's voices, which

10:31

is like one conspiracy theory that was sort

10:34

of floating around this week. And then finally,

10:36

they showed me a video clip of

10:39

the actor in the recording booth while

10:41

they were doing this recording. Now, this video clip

10:44

was very short. It was heavily pixelated and it

10:46

was taken from so far away that I couldn't

10:48

even tell where the human was supposed to be

10:50

until the second time I watched it. So

10:53

I wouldn't say that that clip alone is giving me

10:55

a lot of confidence in the narrative here, but I

10:57

have seen some sort of video suggesting that at some

10:59

point, a human being was saying

11:01

something into a microphone. Okay. So

11:04

let me just repeat all this back to you

11:06

and you tell me if I have the timeline

11:08

and the version of events, right? So OpenAI is

11:10

saying that they did not initially plan to have

11:12

a voice of Scarlett Johansson or

11:15

even one inspired by Scarlett Johansson as part

11:17

of this chat GPT voice release, but that

11:19

they later sort of came up with the

11:21

idea, well, maybe we should have

11:23

this sixth voice and maybe if Scarlett Johansson

11:25

will say yes, then we can get her

11:28

in as the sixth voice. That obviously never

11:30

happened, but they are basically saying this

11:32

was all never intended to

11:35

mimic the voice of Scarlett Johansson.

11:37

Any resemblance to people living

11:39

or dead named Scarlett Johansson is purely coincidental.

11:41

Is that basically what they are telling you?

11:43

That is what they're telling me. And

11:46

How do you feel about that narrative?

11:48

So Yeah, I guess I buy the

11:50

narrow version of events that OpenAI is

11:52

claiming happened here and I Also have

11:54

listened to clips of Sky and listened

11:56

to clips of Scarlett Johansson and they

11:58

don't sound totally identical to me.

12:00

So it is totally plausible that they

12:03

had this other voice actor play

12:05

this role. But there are two

12:07

things that aren't quite adding up for

12:09

me. One of them is like, okay,

12:11

say you didn't mean to cast

12:13

a Scarlett Johansson sound-alike. Why, then,

12:15

spend so much time around the launch

12:17

of this new voice sort of making

12:19

people feel like they were listening to

12:21

Samantha from Her, to sort of directly connect

12:23

the release of this product to this

12:25

movie and this actress? Why do that

12:27

if, you know, it's going to

12:29

get you in trouble? And then the

12:31

second thing is, OpenAI itself

12:34

has said in the past that they

12:36

do not want their synthetic voices to

12:38

sort of mimic public figures. In fact, there

12:40

was actually a statement that they put

12:42

out on March 29th earlier this

12:44

year, in a blog post that OpenAI

12:46

wrote called Navigating the Challenges

12:48

and Opportunities of Synthetic Voices. And one

12:50

of the things they say in this

12:52

blog post is that there should be

12:54

a quote "no-go voice list" that detects

12:57

and prevents the creation of voices that

12:59

are too similar to prominent figures. I'll

13:01

be very interested, if there is

13:03

litigation around this issue, whether in

13:05

discovery they find evidence

13:07

that OpenAI employees were sort of

13:09

talking about how similar this voice sounded

13:12

to Scarlett Johansson, and whether or not that

13:14

violated OpenAI's own policy

13:16

about not creating synthetic voices that were

13:18

too close to the voices of prominent

13:20

people. So let's talk about why this

13:22

matters, because I can understand you might be

13:24

listening to this and saying, this seems like kind

13:27

of a small thing, right? It's just

13:29

a voice. Hey, if superintelligence is

13:31

coming soon, is a voice really what

13:33

we should be worried about? I

13:35

think it's important for a couple of reasons,

13:37

and the first one is

13:39

that the creative community is already deeply

13:42

skeptical of AI, right? Last year

13:44

we had the SAG-AFTRA strikes, and

13:46

this was a core piece of the

13:48

fear there, which is that they

13:50

worried that companies would steal an actor's

13:52

voice and image, use it without their

13:54

permission, and eventually either drive

13:56

them out of a job or just

13:59

drive their wages way down. So

14:01

during that strike, actors were able to win

14:03

concessions on this point. And now here you

14:05

have a different case where Scarlett Johansson wakes

14:07

up one day after saying no to this

14:09

company. And now the sound alike voice is

14:11

the voice of ChatGPT. Yeah. So

14:14

Kevin, first of all, is what open AI

14:16

did here legal? It

14:19

may be, it may not be. And it's

14:21

a little hard to tell. But I can

14:23

imagine that there's going to be some litigation

14:25

or maybe a settlement. I mean, Scarlett Johansson

14:27

has said that she has lawyered up. And

14:29

there are actually two legal cases

14:31

that people are sort of using to say

14:33

about this case involving Scarlett Johansson and open

14:35

AI that actually the facts would be highly

14:37

in her favor if she did decide to

14:39

litigate. Now, do you want to tell us

14:41

about one? So one of them is a

14:43

case from 1988

14:46

called Waits v. Frito-Lay.

14:48

This is a case where the Frito-Lay

14:50

corporation, makers of fine snacks, decided

14:53

that they wanted to make a commercial for

14:55

a new flavor of Doritos. And

14:58

they really wanted Tom Waits to sing

15:00

in it. But Tom Waits is

15:02

sort of famously anti-commercial, like he just

15:04

didn't want to have his songs used to

15:06

endorse products. It was like against his values.

15:09

So they went out and they paid $300 to

15:11

a Tom Waits impersonator. Basically,

15:13

a guy who's in a band who

15:16

sounds exactly like Tom Waits, and

15:18

they put the impersonator in the

15:20

Doritos commercial. They really

15:22

said Waits, Waits, don't tell me that you don't want

15:24

to be in this commercial. Yeah. Yeah.

15:27

So then the commercial comes out, and Tom Waits's friends

15:29

start calling him and saying, Hey, I thought you were

15:31

against commercials. Why are you all of a sudden endorsing

15:33

Doritos? He gets mad, he

15:35

sues, and he sues not for copyright

15:38

violations, because you can't copyright a sort

15:40

of way of singing or talking, but

15:42

for false endorsement. And a

15:44

jury awards him $2.6 million. He's

15:47

basically the heir to the Cool

15:49

Ranch Dorito fortune. Congratulations. The

15:52

other one is, around the same time, Bette Midler

15:54

had a very similar thing happen to her, involving

15:57

the Ford Motor Company, which went to

16:00

her and said, hey, could we use

16:02

your song, your voice, in a commercial for

16:04

our new Mercury Sable? Be the wind beneath

16:06

our wings, Bette Midler? Exactly. So she says

16:08

no, I don't really want to endorse products.

16:11

So instead they go and they hire one

16:13

of her former backup singers and basically instruct the

16:15

backup singer to sound as close to Bette

16:17

Midler as possible. She takes them to court.

16:19

She wins four hundred thousand dollars in damages.

16:22

So I think Scarlett Johansson has, and

16:24

I would never say an

16:26

airtight case here, because there's no such thing,

16:28

but she has a very strong case here.

16:30

But even setting aside the legality

16:33

of it, Kevin, I'm curious to get your

16:35

thoughts on what this does for public perception.

16:37

Where are we right now in how

16:39

average people are thinking about AI, what role

16:41

it might play in their life, and

16:43

whether it might threaten them in some way?

16:45

Do you think this is the most

16:47

damaging thing to come out of this particular

16:49

episode? It's not, actually. You know, I'm

16:51

sure they will, you know, figure

16:53

out a way to sort of make

16:55

things right with Scarlett Johansson, or they'll

16:57

go to court. But I think the broader

16:59

damage here is to the public trust

17:01

in OpenAI. This is a company that

17:03

has said, hey, we are building something that

17:05

will eventually become an artificial general intelligence. We're doing

17:08

this for the good of humanity, and we want

17:10

you to trust us on that. And I think

17:12

they got the benefit of the doubt for a

17:14

while because they were releasing things that were cool

17:17

and useful to people. ChatGPT, you know,

17:19

was sort of a moment where a lot of people said,

17:21

okay, maybe they are out

17:23

ahead of the pack here, and maybe

17:25

we're okay with that. Then I think we

17:28

saw things start to decay a little bit with

17:30

each successive release, and the sort

17:32

of overall vibe being that this was actually

17:34

not sort of the nonprofit research lab

17:37

it had been started as, but was actually

17:39

something more like a very traditional tech company.

17:41

And so I just think we've seen a

17:43

gradual erosion of that trust from the public

17:46

in OpenAI, and I do ultimately

17:48

think that hurts them long term. I

17:50

agree with you. I think this has been

17:52

a really bad month for the perception

17:55

of tech amongst average people. I think this is

17:57

a moment where we have seen tech companies

17:59

get really greedy, and greedy at

18:01

the expense of working people. And so

18:03

like as May is coming to a

18:06

close, on one side of the ledger,

18:08

you have Scarlett Johansson and an entire

18:10

creative class of workers rallying around her.

18:12

And on the other hand, you have

18:14

OpenAI's sound-alike voice, Google AI Overviews eating

18:16

the web and the Apple hydraulic press

18:19

from the commercial like crushing everyone into

18:21

a fine pulp, right? So I think

18:23

the tech industry needs a better story

18:25

to tell here than we're coming for

18:27

your voice and there's nothing you can

18:30

do about it. Totally. So there's a

18:32

second thing that I wanna talk about

18:34

though, Kevin, which is the implications of

18:36

this story for OpenAI because I think

18:38

it recontextualizes one of last year's biggest

18:41

stories, which was Sam Altman temporarily getting

18:43

bounced out of the company. So can

18:45

you just remind us what happened in

18:47

November to Sam? Yeah, so he

18:49

was fired in a surprise

18:52

move by a number of members

18:54

of the nonprofit board that governs

18:56

OpenAI, along with Ilya Sutskever,

18:59

who was the chief scientist at the time.

19:01

And by way of explaining why they

19:03

were firing him, they made these kind

19:05

of vague statements about how Sam had

19:07

not been quote, consistently candid. And

19:09

just basically implied that he was sort of

19:11

a slippery person who was

19:13

telling different things to different people and

19:16

who they had sort of lost faith

19:18

in. But they never sort

19:20

of gave many concrete examples

19:22

of that. And so I think it

19:24

was tough for people to understand why

19:27

the board would make such a sudden and

19:29

important decision in sort

19:32

of the dead of night without consulting

19:34

anyone. And so Sam sort

19:36

of had the trust and the faith

19:38

of OpenAI's employees. And so they

19:40

rallied around him. Remember, they were all

19:42

gonna briefly go work at Microsoft.

19:45

And then the board members end up

19:48

being sort of pushed off the board. And

19:51

Sam is brought back as the CEO.

19:53

It's a wild story, but I think

19:56

a very interesting thought experiment for me over

19:58

the past week has been. What if

20:00

the board coup had happened now? What if the

20:02

board had waited to make its move on Sam

20:04

until now, when I think, I think

20:07

it's fair to say he would not get the same benefit

20:09

of the doubt from the employees or the investors in OpenAI

20:11

that he has today. I think that's true. Now, this is

20:13

a company that is valued at what? 80 to $90 billion.

20:17

I think the employees who are working

20:19

there want to see the equity that

20:21

they have in that company realize and

20:23

I think that there's a very good

20:25

chance that if what you just

20:28

laid out happened, those employees

20:30

would still support Sam. That said, you're

20:32

right. We would have a good example

20:34

of why this board might be a

20:37

little bit concerned. I know at least

20:39

for me, ever since, you know,

20:41

as we did our own reporting, we talked to people

20:43

involved in that situation. I have

20:45

thought since November, there might be another

20:47

shoe to drop here, right? We may

20:50

eventually learn what the board was so

20:52

concerned about. And I feel like this

20:54

week for the first time, we actually

20:56

know now, like it is this sort

20:58

of thing. Yeah, but I actually, as

21:01

strange as it's gonna sound, I don't think

21:03

this Scarlett Johansson voice thing is actually the

21:05

worst thing that has happened to OpenAI over

21:07

the past couple of weeks. What do you

21:09

think it was? So there was this other

21:11

story that came out just recently about these

21:13

employee agreements at OpenAI. And this

21:15

came to light after Ilya

21:17

Sutskever and Jan Leike, who were the heads

21:20

of the company's super alignment team, both

21:22

announced they were leaving the company. And

21:25

people started looking into the paperwork

21:27

that OpenAI employees have to sign

21:30

when they leave the company. And recently,

21:32

Kelsey Piper at Vox reported that there

21:34

was a really unusual provision in this

21:36

exit paperwork that basically said that if

21:39

OpenAI people left the company and then

21:41

spilled the beans or said something or

21:43

disclosed something about the company or disparaged

21:45

it in any way publicly, they

21:48

could not only break this contract,

21:50

but they could have their vested equity clawed

21:52

back, which we should explain why that's such

21:54

a big deal. So normally, you go work

21:56

at a tech company, you get stock options,

21:58

and some of those stock options vest over

22:00

time. And this is traditionally how tech

22:02

employees make a lot of money. Their

22:05

stock options vest, they sell them, they

22:07

get the money. So

22:09

it is not unusual when you leave a

22:11

tech company to have your unvested equity forfeited.

22:14

What is extremely unusual, and actually I've never heard

22:16

of this happening before in the tech industry, is

22:18

for a company to say, we can actually take

22:20

back your vested equity if you've left the company

22:23

and you disparage it publicly. So this was something

22:25

that a lot of OpenAI former

22:27

employees had been terrified of. It's a reason why we

22:29

haven't seen a lot of former OpenAI employees

22:31

speaking out. And when it

22:33

became public, a lot of people in the

22:35

AI industry said, this is crazy. We

22:38

have not seen this at any other companies.

22:40

They are trying to silence former employees for

22:42

speaking out. And then you saw actually Sam

22:44

Altman make this statement about it,

22:47

where he basically said, I didn't know about this. I

22:49

didn't know this was part of our paperwork. I've been

22:51

trying to get Scarlett Johansson to do the voice of

22:53

ChatGPT, I didn't have time for this. So

22:56

he said, we've never clawed back anyone's vested equity.

23:00

Basically he said, this is one of the

23:02

few times I've been genuinely embarrassed running OpenAI.

23:04

I did not know this was happening and

23:06

I should have and we'll fix it. So

23:08

the reason I think this is actually a bigger

23:11

deal than the Scarlett Johansson thing, despite it not

23:13

getting nearly as much attention, is because OpenAI is

23:15

in a talent war, right? They are constantly

23:17

trying to pick off the best AI

23:19

researchers from all of the biggest companies

23:21

in Silicon Valley. It's a very talent

23:23

heavy business and talent dependent

23:25

business. And if they start losing

23:27

people who say, this company seems like they're slippery

23:29

and I can't trust them, I think that is

23:31

an existential threat to them in the long term.

23:34

Well, look, if

23:36

you're wondering why you've never heard me give an interview

23:38

where I talk about what it's like to work with

23:40

Kevin, I have signed something very similar. But

23:43

you're right, it is very unusual, particularly for

23:46

a company with Open in its name. And

23:48

I agree this was another black eye for

23:50

the company this week. It also of course

23:52

came on the heels of their entire super

23:54

alignment team being dissolved, which we discussed last

23:56

week. So there's just kind of a lot

23:59

of swirling drama around that company. Now

24:01

we should also say while all that's going on,

24:03

OpenAI's business is doing great. I don't want to

24:05

pretend that it's not. Altman was on stage at

24:07

a Microsoft Developers Conference this week, which we'll talk

24:10

about in a little bit here.

24:12

There has been some reporting that Apple and

24:14

OpenAI are going to announce a big partnership

24:16

next month at Apple's own developer conference. And

24:18

finally, there was reporting this week that on

24:21

the day that the GPT-4o

24:25

model was announced, OpenAI's revenue shot up more

24:27

than 20%, according to

24:29

third party estimates. So clearly, OpenAI

24:31

is doing great. But

24:33

the sorts of things that

24:35

we've seen this week have given me some

24:38

pause. And I wonder if they've given you

24:40

some pause as you think of what is

24:42

the future of this company? Yes, I think

24:44

the correct way to phrase what I've been

24:46

feeling this week would be vibe shift. I

24:48

think there has been a big vibe shift

24:51

around AI when it comes to the creative

24:53

community, but especially with OpenAI as it relates

24:55

to the sort of trustworthiness of what they're

24:57

building. And I've talked to

24:59

people who say, I basically gave this company the

25:01

benefit of the doubt, I gave Sam the benefit

25:03

of the doubt, they seem to be

25:05

saying a lot of the right things. And now they're just

25:07

kind of like, I don't know, man. And I also think

25:10

it's like I was thinking about this

25:12

sort of idea of the

25:15

Silicon Valley builder's mindset of

25:17

like, ask for forgiveness,

25:20

not permission. And I think that's been

25:22

the way that a lot of successful

25:24

companies have been built in Silicon Valley.

25:26

Uber, Facebook, to some extent was

25:28

a story of asking for forgiveness, not permission.

25:31

And I think that that works with most technologies.

25:34

But I think with AI, it's a little bit

25:36

different. Yeah, absolutely.

25:39

It's also that he asked for permission,

25:42

not for forgiveness, he did it backward.

25:44

He asked for permission, didn't get

25:46

permission, and then,

25:48

well, actually,

25:50

he didn't even ask for forgiveness. He said

25:53

I would never ask for forgiveness because there

25:55

was nothing to apologize for in the first

25:57

place because this voice isn't based on Scarlett

25:59

Johansson. It is the wildest, sort of, you

26:01

know, hardest to pin down narrative. It

26:03

is. I mean, look, the thing that this brings to

26:05

mind for me, Kevin, is that you and I both

26:08

covered the decline in public perception around Facebook, right? And

26:11

Facebook once seemed like this silly little toy. Nobody

26:13

paid too much attention. Then after the 2016 election,

26:15

everybody is like, wait, is this secretly a mind

26:17

control device that is, you know, making all of

26:19

our teenagers insane and everything else? And

26:21

obviously, it's way too soon to say that something like

26:23

that is happening to OpenAI. But

26:26

I'm telling you, this is how it starts, right? I

26:29

think more and more people are becoming convinced every

26:31

day that whatever AI is going to be in

26:33

the short to medium term is not going to

26:35

be a good bargain for them. And they're not

26:37

going to give OpenAI the benefit of the doubt.

26:39

And that means that OpenAI, I think, needs to

26:41

be really careful in how it makes its next

26:44

several decisions around this kind of stuff. So,

26:46

look, I'm glad the Hollywood A-list is paying attention.

26:49

If you are worried about what a company like

26:51

OpenAI might do with your voice or your job,

26:53

you're in good company, which is a 2004 movie,

26:56

starring Scarlett. Oh, good. I was

26:58

worried there was one Scarlett Johansson movie we weren't going

27:00

to mention. When

27:03

we come back, Neuralink's first ever patient

27:05

joins us to talk about how the

27:08

technology is changing his life. Looking

27:26

for a new ride? Shop for your next car from the

27:28

comfort of your home, 100% online with Carvana. Carvana

27:32

has a massive inventory, including thousands of cars

27:34

under $20,000. So, finding

27:36

a car that fits your budget and

27:39

lifestyle is hassle-free. Carvana makes car financing

27:41

hassle-free. Get pre-qualified in minutes by answering

27:43

a few quick questions. Carvana also offers

27:45

customizable terms and payments as low as

27:48

$0 down because payment plans should fit

27:50

your plans. Visit carvana.com or download

27:52

the app today to find your next car

27:54

without ever leaving the comfort of home or

27:56

anywhere else. Terms and conditions may

27:58

apply. Hi, I'm Meaghan

28:00

Looram, the director of photography at The New

28:03

York Times. A photograph can

28:05

do a lot of different things. It can

28:07

connect us. It can bring us to places

28:09

we've never been before. It can

28:11

capture a story in a universal visual

28:13

language. But one thing that all these

28:16

photographs have in common is that, you

28:18

know, they don't just come out of

28:20

the ether. We spend a lot of

28:22

time anticipating news stories, working with the

28:24

best photographers across the globe. These

28:26

are photographers who have spent years

28:28

mastering their technical craft, developing

28:30

their skills as visual chroniclers of

28:32

our world. You know, getting certified

28:34

as a scuba diver and learning

28:36

how to shoot underwater to document

28:38

climate change or tremendous cardiovascular training

28:40

in order to ski on the

28:42

slopes next to Olympic athletes. This

28:44

is an effort that takes tons

28:46

of time and consideration and resources.

28:48

All of this is possible only

28:50

because of New York Times subscribers.

28:52

If you're not a subscriber yet,

28:54

you can become one at nytimes.com

28:57

slash subscribe. So,

29:01

Casey, as you know, one of the

29:03

technologies that is fascinating to me right

29:05

now is the Brain Computer Interface or

29:07

BCI. That's right. In fact, it's so

29:09

fascinating to me that I forced you

29:11

to try one a few weeks

29:13

ago and I think it's safe to say it was not

29:15

all that impressive. No, I was willing to go through with

29:17

it because I've always wanted to see if we could detect

29:20

any brain activity for you. But afterwards

29:22

I thought, I can't do this anymore.

29:25

So, just to back up, Brain Computer

29:27

Interfaces are a type of technology that

29:29

allows you to basically control a computer

29:31

directly with your brain. And this has

29:33

been something that obviously Science Fiction has

29:35

been talking about and that scientists have

29:38

been working on for decades. The company

29:40

that is perhaps best known in this

29:42

space is Neuralink, Elon Musk's Brain Computer

29:44

Interface Company. And I've just been

29:46

fascinated by this whole area. I

29:49

made you try one. It was what's called a

29:51

non-invasive brain computer interface, which means it's not inside

29:53

your literal head. It's like a headband that you

29:55

wear. And it doesn't work that well. The technology

29:57

was not all that impressive. We didn't end up...

30:00

airing the segment where we tried this thing on because

30:02

it just wasn't very good. But

30:04

with Neuralink, the implant, the brain computer

30:06

interface goes literally inside your skull on

30:08

your brain and it allows you to

30:11

control a cursor with your mind. So

30:13

I think we should tell people a

30:15

little bit kind of about like what

30:18

this thing is and what it looks

30:20

like because there are these threads that

30:22

have electrodes on them that penetrate into

30:24

the brain and those electrodes

30:27

read signals which then get translated

30:29

through the Neuralink device and that's where

30:31

I've lost the plot. Can you pick it up from there?

30:34

Yeah, so it basically translates your electrical activity

30:36

in your brain into commands to control something

30:38

on the outside of your body like a

30:40

computer or something like that. And

30:43

for years now, tech companies have

30:45

been looking at using BCIs to

30:47

help people who have debilitating conditions

30:50

like a spinal cord injury or

30:52

a stroke or some limitation on

30:54

their mobility. But also a

30:56

lot of people in Silicon Valley just talk about

30:58

this as a potential next step in computing altogether.
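To make the "translating brain signals into commands" idea Kevin describes concrete: a minimal toy sketch of the basic approach many BCI systems use, a linear decoder mapping per-channel firing rates to a 2D cursor velocity. The channel count and weights here are invented for illustration; this is not Neuralink's actual pipeline.

```python
import numpy as np

# Toy sketch of one BCI decoding step: map per-channel firing rates to
# a 2D cursor velocity with a linear decoder. Illustrative only; the
# channel count and weights are invented, not any real device's values.
rng = np.random.default_rng(0)

n_channels = 8                         # electrodes being read out
W = rng.normal(size=(2, n_channels))   # decoder weights: (vx, vy) per channel
b = np.zeros(2)                        # bias term

def decode_velocity(spike_rates):
    """Turn a vector of per-channel firing rates (Hz) into (vx, vy)."""
    return W @ np.asarray(spike_rates) + b

# One decoding tick: eight channels' firing rates in, a 2D velocity out.
velocity = decode_velocity([12.0, 3.5, 0.0, 8.2, 1.1, 0.4, 6.6, 2.3])
print(velocity.shape)  # (2,)
```

In real systems the weights are fit from calibration data rather than drawn at random, and the decoded velocity is filtered and smoothed before it moves a cursor.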

31:01

Like in the future, some or

31:03

all of us will have these kind of brain computer

31:05

interfaces. And I've even

31:07

heard people in the AI world say, this is

31:10

the way we are going to stay ahead of

31:12

the robots as the AIs get smarter is that

31:14

we are going to implant computers in our head

31:16

that will basically increase our own cognitive capacity. That

31:19

technology does not exist. It is mostly just an

31:21

idea from science fiction. We have no idea whether

31:23

that would work or not. But this

31:25

is a big important technology that a lot of

31:27

people in tech are excited about. And

31:29

just over the last few months, we've actually seen one

31:32

of the clearest views yet into how this might

31:34

work in a real human. Yeah. And

31:36

while you've just spun out some really fantastical

31:38

sci fi scenarios, Kevin, what appeals to me

31:41

about this story is that it is a

31:43

case of technology helping one person who had

31:45

something really terrible happen to him. So I

31:47

want to say up front, very few people

31:50

have been as critical of Elon Musk as

31:52

I think the two of us have been

31:54

on this show in particular. And

31:58

as hard for me as it is to set aside. my

32:00

personal feelings particularly about what he did to Twitter.

32:03

I truly am so inspired by how

32:05

this technology is helping one person and

32:07

I think it is absolutely worth understanding

32:10

what is this thing that they built

32:12

and how has it changed the life

32:14

of at least one individual. Yeah so

32:17

today we have a really special opportunity

32:19

to talk to the one person on

32:21

earth who has actually gotten the Neuralink

32:23

brain computer interface implanted in his skull.

32:26

This man is named Nolan Arbaugh and

32:28

he now has a computer system about

32:30

the size of a coin and a bunch of

32:32

threads with electrodes that connect to his brain that

32:35

allow him to do things like move a

32:37

cursor around on a computer with

32:39

just his mind and this is a

32:41

big deal for Nolan because for the

32:44

last eight years he's been paralyzed from

32:46

the shoulders down he had a freak

32:48

accident eight years ago where he suffered

32:50

a severe spinal cord injury and so

32:52

he has been a quadriplegic for the

32:54

last eight years and he volunteered to

32:57

be patient number one in this Neuralink

32:59

experience. So today we're going to talk to

33:01

Nolan about what having this Neuralink device in his

33:03

brain has been like how it's changed his life

33:05

and why he volunteered to risk his body on

33:08

this unproven new technology. Let's bring him in.

33:15

Nolan Arbaugh welcome to Hard Fork. Nice to

33:17

be here thanks for having me. Yeah where

33:19

are we catching you right now? Just describe

33:22

where you are. I'm

33:24

in my house in my

33:26

bed so if you hear any weird noises

33:28

in the background it's my bed I have

33:31

an air mattress so it's kind of blowing air through the

33:33

whole thing all the time. Got it. Where's

33:35

home for you? Yeah I'm in Yuma, Arizona. Cool.

33:39

So it's been a crazy last

33:41

few weeks for you back in

33:43

January Elon Musk announced that the

33:45

first human patient had been successfully

33:48

implanted with a Neuralink device but

33:50

he didn't say the name. It's

33:52

only really in the last two months that your

33:54

name has become public, and just the last week

33:57

or so that you've started to talk more broadly

33:59

about your experience. What has it been

34:01

like to serve as the literal face of

34:03

this technology? It's been all

34:06

right. I'm just here trying to

34:08

get all the information out to as

34:10

many people as possible. I think

34:12

it's amazing technology. I think what's

34:14

going on in my life and

34:16

what I foresee the future

34:18

will hold is worth bringing the

34:20

whole world along with. It's

34:23

been cool, man. I'm enjoying it so

34:25

far. Yeah, I

34:27

just want to go back to

34:29

sort of before you got this Neuralink

34:32

device implanted in your head. What

34:34

compelled you to register to participate

34:36

in this extremely, you know, new and

34:39

untested experiment? Yeah. I

34:41

mean I didn't really know much about it.

34:43

My buddy called me up one day and

34:45

kind of gave me the five-minute rundown.

34:47

I wasn't expecting anything to come from it

34:49

so I put a few jokes on my application,

34:51

you know. I just figured I would

34:53

never hear back, and then once I did

34:55

start hearing back, I had to think about

34:57

it a little bit more seriously. I had very serious

34:59

and candid conversations with my parents, my friends.

35:02

And ultimately, when it

35:04

came down to it and I was selected,

35:06

I decided that I just wanted to

35:08

help. I knew that I wanted to help make it

35:11

safer for everyone after me and I knew that I

35:13

wanted to, you know, try to make

35:15

a difference in the world, something I've

35:17

been trying to do, something I've been looking

35:19

for for years, and this seemed like

35:22

the perfect opportunity. Do you

35:24

remember any of the jokes you

35:26

put on your application? Yeah, I

35:28

said I wanted an Iron Man

35:30

suit. I said that I wouldn't

35:33

mind being uploaded in the Matrix.

35:37

You know, that seems like a good segue.

35:39

One of the truths

35:41

that I have learned in

35:43

reporting about Tech over the past decade

35:45

is that you never want to try

35:47

version 1.0, right? It's risky

35:49

to try the first version of anything

35:51

because the bugs are still being worked

35:53

out usually, there's some rough edges.

35:55

But you are literally being asked to

35:58

try version 1.0, not just of a

36:00

new gadget but of something that is going

36:02

to go inside your skull. So was that

36:04

part of your process of thinking through this?

36:07

No one has ever had one of these put in their brains

36:09

before. Maybe I want to let someone else

36:11

be the first person. Yeah,

36:14

it crossed my mind. Something that my

36:16

buddy and I, the buddy that called

36:18

me on the phone, we talked about

36:20

at length was, you know, this is the worst

36:22

version of it that's ever going to be in a human. Maybe

36:26

someone else should go first. And

36:28

I'll get a better version later on down the road. Or maybe I

36:30

don't do it at all and wait for it to be on

36:32

the market to the public and then I get an even

36:34

better version. But ultimately I figured

36:37

that if anyone's going to do it, then

36:39

I should. I have a pretty

36:41

solid foundation with my faith in God. And

36:44

I just felt like I've thought my

36:46

entire, you know, accident. I'm glad that it

36:48

happened to me and no one else I

36:50

know because it's just a very hard thing

36:52

to experience, you know, being a quadriplegic. And

36:55

I wouldn't ever want any of my friends to have to

36:57

go through this. So it's just a mindset that I've had

36:59

forever. And with the Neuralink,

37:02

it was the same thing. Like if anything were

37:04

to go wrong, I would feel terrible if I

37:06

passed up to wait for a better version and

37:08

something went wrong to someone else. So I

37:10

knew that it had to be me. Your

37:13

parents are your primary caregivers. What were

37:15

your conversations with like them about doing

37:17

this? Yeah, they were really

37:19

hard for a while. I

37:21

mean, as a quadriplegic, the only real

37:24

thing that I have left is my mind,

37:26

is my brain, is my personality. And

37:29

it's hard to let someone go in

37:32

and kind of rummage around up there,

37:34

especially with something that's never been tested

37:36

in a human. So one

37:38

of the things that I mentioned

37:40

to them was that if I had

37:42

any sort of brain deficiency afterward, if I was

37:45

mentally handicapped in any way, that I didn't want

37:47

them to take care of

37:49

me anymore, that I wanted them to put me in

37:51

some sort of home. Because taking care of a quadriplegic

37:53

is hard enough, but taking care of a quadriplegic with

37:55

a traumatic brain injury is something that I would never

37:57

want my parents to do. So I made them agree

38:00

to that. Yeah. Were you

38:02

nervous like the night before the

38:04

operation to install your Neuralink

38:06

device? What did you find yourself

38:08

thinking about as you went to

38:11

bed? Yeah, I wasn't nervous at all. I

38:13

was just excited. My buddy and

38:15

I were sitting around making jokes. I

38:18

don't know hanging out night before I just wanted to

38:20

get it over with honestly. What kind of jokes were

38:22

they making? We

38:24

were planning on releasing

38:27

some like cyborg related jokes, thinking

38:29

of things that only I would

38:31

be allowed to say. Just random

38:33

turns of

38:35

phrase too, like, oh, blew my mind, picking

38:37

my brain, things like that. Yeah, just all

38:43

sorts. I'm curious if you

38:45

had a thought as you were heading

38:47

into this surgery of what the

38:49

first things you wanted to do would

38:52

be once you had a working brain

38:54

computer interface. Yeah, I

38:56

mean, I'm a big gamer. I wanted to play

38:58

games. That was one of the big things I

39:00

wanted to do. I also wanted to be able

39:02

to read. I mean, there's nothing like being able

39:04

to hold a book again, and the smell and

39:06

the feel of a paperback book. It's one of

39:09

my favorite things in the world and something that

39:11

I've missed for a long time. I can't do

39:13

that. And so

39:15

the next best thing is just being

39:17

able to read in general. I've

39:20

had to listen to audiobooks for the

39:22

last eight years, because I haven't been able to sit

39:25

in the same position to read a book. And

39:27

I had no way of turning pages. I can

39:29

do it sort of on a Kindle. But

39:32

I was listening to audiobooks and I don't

39:34

really like listening to audiobooks, to be honest.

39:37

Sometimes the narration is terrible. I don't

39:39

want to throw shade, but I remember

39:41

reading, like, an Eragon audiobook

39:43

and the voice that the narrator

39:46

had for the dragon made me turn it off

39:48

immediately. This is quite possibly the worst thing I've

39:50

ever heard in

39:54

my life. It was Gilbert Gottfried voicing the

39:56

dragon. No, that would have been amazing. I

39:58

would have listened to that. That on

40:00

repeat, honestly. No,

40:03

but just something as simple as that. Being able to

40:06

lie in my bed and read a book. There's

40:09

just something about it. So I was really

40:11

looking forward to that kind of stuff. So

40:13

how was the actual surgery? Was it long?

40:15

What was the recovery process like? Can you

40:17

tell us just about the actual implant? Yeah,

40:19

it was super, super quick. We

40:22

got to the hospital at five a.m. I think

40:24

I was scheduled for surgery at seven. There was

40:26

a lot of just getting me ready, getting me

40:28

in bed. The surgery was supposed to last between

40:31

four to six hours. They were

40:33

expecting there to be hang ups. They were expecting,

40:35

say, the needle on the robot to break.

40:38

They brought, I think, 20 iterations of that

40:40

needle in case it broke and then they

40:42

had to stop and replace it. And the

40:44

needle didn't break once. Just everything

40:46

performed above and beyond what they

40:48

expected. And so the surgery took

40:50

under two hours. And

40:52

then I was out

40:55

of surgery and they prescribed some pain

40:57

pills. I didn't take a single one. I don't

40:59

know, it was just so easy. The

41:01

worst part was I wasn't able to shower

41:03

for the first few days because my incision

41:06

needed to heal. But outside of that, the

41:08

recovery process was so easy. I didn't feel

41:10

any pain at all. Wow. And

41:13

when you woke up from your surgery, what

41:16

was actually different? Did you

41:18

feel different? And

41:20

then how long was it before you actually

41:22

got to turn on and use the Neuralink

41:24

device itself? Yeah, I mean,

41:27

I had a gnarly scar with some staples

41:29

in it. I mean, that was pretty freaking

41:31

sick. Like I was super happy about that.

41:33

I got some cool pictures. But

41:35

yeah, I think within, I don't

41:38

know, like an hour or two after my

41:40

surgery, they came in and connected me to

41:43

like a little tablet they had. What

41:45

do you mean connected you? Like literally plugged

41:47

something into your head? Or like what, is

41:49

it Bluetooth? It's a Bluetooth connection, yeah. So

41:51

they just wake up the implant with a

41:54

little coil, like a charging coil, almost like

41:56

the same thing that you put your phone

41:58

down on a mat to charge. It's

42:00

very similar. You just hold something over my head and it

42:02

wakes up. It's how you charge it as well

42:04

I put that in like a hat and I wear a hat and it

42:06

charges and

42:08

so they woke it up, they connected that

42:11

to a tablet. And on

42:13

the screen they just showed a bunch of the

42:15

channels. The channels are each electrode in

42:17

my brain and those electrodes are picking

42:19

up like neuron firing and

42:23

So they showed me say like eight channels and I

42:25

got to see like live like

42:27

real-time, the neurons firing

42:29

in my brain and everyone just kind

42:31

of freaked out in the hospital room.

42:33

Everyone started cheering, they were clapping, which

42:35

was totally unnecessary. It

42:38

was so awkward. But yeah,

42:40

it was really, really cool. Right.

42:43

I'm so curious, Nolan, about the actual experience

42:45

of using this Neuralink device because one of

42:47

the things that it allows you to do is

42:49

to control A cursor on a

42:51

screen as if you were like using

42:53

a mouse just by thinking but I've

42:55

never known like you know When I'm

42:57

using a computer, I'm not thinking I'm going

42:59

to place my cursor here. I'm going to

43:02

click this button. It's just

43:04

like it's a much more kind

43:06

of fast-twitch, sort of unconscious response. I'm

43:09

curious when you're trying to control a cursor

43:11

on a screen How intently do you have

43:13

to think about it in order for the

43:15

cursor to actually react? I

43:17

mean at first I wasn't very good

43:20

at it I was doing

43:22

what we call, like, attempted movements. Attempted

43:24

movements are basically, you know like

43:26

I said all the signals in my brain

43:28

are still firing so the threads are implanted in my motor

43:30

cortex and so when

43:32

I attempt to move my hand, those

43:35

signals are firing, the implant's picking that up,

43:37

and an algorithm is basically learning what I'm

43:40

trying to do and after doing it a

43:42

certain amount of times it'll translate

43:44

that into cursor control in some way or

43:46

another, and then it'll keep learning. As

43:49

I was like a week in maybe two

43:51

weeks in, I just thought to move the

43:53

cursor in one direction And

43:56

it moved it it Blew

43:58

my mind, like, it was wild.
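The "algorithm is basically learning" loop Nolan describes can be sketched as a toy calibration routine: during calibration the intended direction is known (the user is asked to attempt a specific movement), so the decoder's weights are nudged until its output matches. Everything here, including the channel count, signal pattern, and learning rate, is invented for illustration and is not Neuralink's actual algorithm.

```python
import numpy as np

# Toy sketch of closed-loop decoder calibration: the user repeatedly
# attempts a known movement ("move right"), and a least-mean-squares
# update nudges the weights until the decoder reproduces it.
rng = np.random.default_rng(1)
n_channels = 8
pattern = rng.random(n_channels) * 10   # firing pattern for "move right"
W = np.zeros((2, n_channels))           # start with an untrained decoder
lr = 0.001                              # learning rate for the update

for _ in range(2000):
    rates = pattern + rng.normal(scale=0.5, size=n_channels)  # noisy trial
    target = np.array([1.0, 0.0])       # user is attempting "move right"
    err = target - W @ rates            # how far off the decoder's guess is
    W += lr * np.outer(err, rates)      # least-mean-squares weight update

print(np.round(W @ pattern, 2))         # decodes close to [1, 0]
```

The same idea extends to many directions and keeps running during normal use, which is one way a decoder can "keep learning" as the user practices.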

44:01

But then over time, it

44:03

just becomes second nature. It's not like I'm

44:05

thinking like cursor come over here and I'm

44:07

waiting for it to get there or anything.

44:09

It's just it's very, very, like I said,

44:11

intuitive. And how, like, fine-grained

44:13

is the control you have over a cursor now? Is

44:16

it like roughly equivalent to like what it used to

44:18

be like when you were using a mouse or is

44:20

there still a gap there? I

44:22

would say it's very similar.

44:25

I'm not as quick with the cursor as a

44:27

lot of other people, but I think that gap

44:30

can be made up. I think that

44:32

just comes with a bit more practice

44:34

and also just a bit more tweaking on

44:37

the software side. Like this is still very

44:39

early days, a few months in and we're

44:41

already where we're at, which is amazing. I

44:43

think by the end of my time

44:46

in this study, whenever that will be, I'll

44:48

be better than most people with a cursor.

44:50

Wow. I mean, one of the details that I

44:52

just love from the initial reporting

44:55

on your story is that after you

44:57

got your neural link, you played eight

44:59

straight hours of the video game Civilization

45:01

6. And I just love that because

45:04

it's like I imagine that,

45:06

you know, you had doctors years ago thinking that

45:09

in theory, if this ever, you know, this

45:11

brain computer interface thing ever worked, it would

45:13

allow people to, you know, do more types

45:15

of creative labor and and be more productive

45:17

at work. And you get it and just

45:20

immediately start gaming, which I think shows that you just

45:22

have the heart of a true gamer. Yeah,

45:24

yeah, I, I keep telling them that I

45:26

keep saying, you know, I'm just so unproductive

45:29

with this thing, like, by giving me more

45:31

things to do, like, I would much rather

45:33

be doing work. And they're like, no, just

45:35

do what you want to do. That's what

45:37

we want is to make you able to

45:40

play games to go surf the web and

45:42

do things that you want. It's not about,

45:44

you know, what other people believe you should

45:46

be doing or anything. It's just whatever makes

45:48

your life better. And waste your life like

45:51

the rest of us. I mean, it makes so

45:53

much sense, though, because it's like for for what,

46:00

eight years you had been deprived of being

46:02

able to just like use your hands to

46:04

play games. I love to play video games.

46:06

Like I play video games myself every week.

46:08

And I guarantee you that if I had

46:10

been in your shoes, I absolutely would have

46:12

been playing Civilization when I got one, too.

46:14

Yeah, yeah. Now, now is it also true

46:16

that you played Mario Kart? I

46:18

did. I did very early on, maybe a couple

46:20

weeks in, they hooked me up to a switch.

46:23

And that was very like hands on. They

46:26

were real time tweaking things right now we're

46:28

working on, you know, giving me that capability

46:30

on my own. So any day I

46:32

want I could just hop in. I

46:34

think that's pretty close. That's amazing. Now,

46:36

I want to know like how different

46:39

this is from other assistive technologies that

46:41

have come before it because we've had

46:43

things like eye tracking for computer control

46:45

before for people who have lost mobility.

46:47

So have you tried any other ways

46:49

of controlling computers before

46:52

this? And so how does this

46:54

implant stack up to other things that people have

46:56

been using to do similar things in

46:58

the past? Yeah, I've tried it all

47:00

from, you know, the first few

47:02

weeks, first few months I was in the

47:05

hospital after my accident, they had me trying

47:07

everything. And sure things have gotten better since

47:09

then, but they're just not even in the

47:11

same league as Neuralink. Eye trackers are

47:13

just not as good. A lot of

47:15

it has to do with, you know, being

47:18

centered on the screen, making sure that

47:20

your levels don't change. I

47:22

have really bad spasms. I'm very spastic. So

47:24

if I move at all, like my body

47:26

spasms to the right and I'm off center,

47:28

then the eye tracker doesn't really work anymore.

47:31

I've tried other things like a quad stick. And

47:33

there was a video of a guy, a quadriplegic who

47:36

was using a quad stick to play

47:38

things like Fortnite and stuff. I tried,

47:40

I tried playing like one

47:42

of the Call of Duty games that came out after

47:44

my accident. And I hopped on and tried the campaign.

47:46

And I think they're storming like the beaches of

47:48

Normandy. And I didn't even make it off the

47:50

beach. Unfortunately,

47:52

that was the case for so many people on that beach.

47:55

It's a very real, real case. That

47:59

happens to me too. I don't play team shooters

48:01

because I just get my butt kicked

48:03

by 11-year-olds so I'm sympathetic.

48:05

What's your excuse, Rif? Nolan,

48:08

I want to ask you about thread

48:10

retraction because this is something that has

48:12

happened since your surgery. A lot of

48:14

the threads that connect the

48:16

Neuralink device to your brain actually

48:19

started retracting. I saw a figure

48:21

that 85% of them had retracted

48:23

and this was potentially endangering your

48:26

ability to use this device. Talk

48:29

with us about this and what the first signs

48:31

you experienced were that something wasn't right.

48:34

Yeah, a few weeks in, I just

48:36

started losing control of the cursor is what it

48:38

comes down to. It would start

48:40

drifting on me. I would want it to go

48:42

right and it would go left. I

48:45

could not get it to go down, things like that.

48:48

It just became impossible to use. About

48:51

a week later, so this was about three weeks

48:53

in, about a week later they told me that

48:56

they had seen some evidence of thread retraction but

48:58

I think they had only found out a day

49:00

before. They kept me in the loop the whole

49:02

time. They actually took a

49:04

scan of your brain and said it looks like we

49:06

can see that the threads have retracted? No, no. Brain

49:09

scans won't even show

49:11

the threads. What they can

49:13

do is look at the electrodes

49:17

over time and see

49:19

which electrodes on the threads

49:21

are sending signals and which ones are

49:23

sending strong signals or weak signals. They

49:26

can really tell which electrodes are

49:28

still in my brain. Right

49:30

now it's about 15% that are

49:32

still actively sending strong signals

49:34

in my brain. Do

49:37

they have any theories about why these threads have come

49:39

loose? Yeah, it has to do

49:41

with how much the human brain moves. Apparently,

49:43

they had thought that everything they

49:46

had read, all the surgeons they had

49:48

talked to, said that

49:50

the brain moves about one millimeter and

49:52

then when they implanted everything in my

49:54

brain, they found that my brain moves

49:56

actually three millimeters. So on a

49:58

scale of 3x what they were expecting. So

50:02

obviously this is not like fixing a computer or

50:04

an iPhone or something where you can just like

50:06

open it up and fix it like the opening

50:09

up wouldn't involve opening up your head and your

50:11

skull and and doing brain surgery

50:13

on you again. So

50:15

how did they go about trying to fix this?

50:18

Yeah, I offered to let them go in, take

50:21

out the implant and put in

50:23

a new one I was like if it's gonna

50:25

get me back to peak performance and that's what

50:27

I want if it's gonna help me stay in

50:29

the study I offered

50:31

that and they said no we're gonna

50:34

like stop take a step back and

50:37

see if we can fix this on

50:39

the software side which is ultimately what

50:41

ended up happening. They just tweaked the

50:43

way that they were recording signals from

50:46

the threads and from the electrodes

50:49

and that ended up working. There were

50:51

a couple different ways they were recording the

50:54

neuron spikes in my brain. I mean

50:56

there's a ton of information coming from

50:58

the neurons at all times and they

51:00

were trying to interpret those

51:02

spikes in a certain way and they

51:05

had found that the way that we started was

51:07

the most efficient but then once all

51:09

the threads started retracting they needed to rethink that

51:11

and so they switched to a different way of

51:13

recording those signals and they found that that was

51:16

actually much better. And so

51:18

like where are you at now? Like

51:20

Neuralink has said that your performance with

51:22

the device is now better than it

51:24

was before all of this happened. Does

51:26

that continue to be the case? Yeah

51:28

yeah absolutely. I'm still getting

51:30

better too. I'm curious like

51:32

some of the people that I've talked to

51:35

in the tech community believe that BCIs are

51:37

going to be just a major mainstream technology

51:39

in the future not just for people with

51:41

disabilities but for basically anyone. I mean that

51:44

you know the next big platform shift may

51:46

not be people putting computers on their head

51:48

like with VR and maybe people putting computers

51:50

in their head and eventually we will all

51:53

be walking around with these brain implants. Based

51:56

on your experience do you think that is a

51:58

plausible future here? Yeah, I don't see why not.

52:01

I think they're safe. I think the

52:04

possibilities are endless with this technology. I

52:06

mean, we're just scratching the surface. I

52:09

don't know what kind of things people

52:11

are going to be able to

52:13

do with this in 10 years. I don't

52:15

think anyone really knows. There are

52:17

applications that we see can be

52:19

useful like helping cure paralysis or

52:22

different motor diseases, helping

52:24

cure blindness. But once we start getting all

52:26

of that, then that begs a

52:29

lot of other questions like, if we can do this, why

52:32

can't we do even more? And with the

52:34

AI revolution that we're in right now, how

52:37

can this be applied to all of these

52:39

pieces of hardware in our brain? I just

52:41

think we're in for an explosion,

52:43

like exponential growth in

52:46

this field, especially now that Neuralink has come

52:48

out with this. It's going to bring all

52:50

the other BCIs up and it's going

52:52

to push Neuralink to get even better. It's going

52:54

to be like a new space race, but in

52:56

the brain. You know,

53:00

I'm curious to get a sense of

53:03

what this has just been like for, you know,

53:05

you were talking earlier, I was really moved when

53:07

you were talking about just craving the experience of

53:09

holding a book in your hands again, something that

53:11

I take for granted. And now

53:13

presumably you've been able to read books,

53:15

you've been able to play games. What

53:18

is this done for you emotionally

53:20

to kind of get access to

53:22

some of those things that you had been missing

53:24

out on? I

53:27

mean, it's hard to even

53:29

put into words. Just

53:33

this amount of independence

53:35

that I've been given, it

53:38

changed my life for the

53:40

last few months. It's changed my

53:43

parents' lives. Little

53:46

things, I mean, very, very little things have

53:48

made huge differences. Like when I was able

53:50

to get a drink of water on my

53:52

own in the middle of the night because

53:54

I got like a little bottle

53:56

that stretched across my bed and allowed

53:58

me to have drinks in the

54:01

night, that relieved about 90%

54:04

of sleepless nights that my parents had.

54:07

And now with Neuralink, it's even

54:09

more than that. I'm able to do a

54:12

lot more on my own than I was ever able

54:14

to do in the last eight years. I

54:16

don't have to wake anyone up in my family to

54:18

come help me in the middle of the night. I

54:20

don't have to feel guilty if at 2 a.m. I

54:23

want to connect and read or listen to an

54:25

audio book or play a game or just go

54:27

on and check my social media or text someone

54:30

back. I don't have to feel guilty about trying

54:32

to wake someone up. There's

54:35

just so much that I'm grateful for being able

54:37

to do this. And ultimately I want to use

54:39

it to help people, find

54:41

some way to help people, and I'm

54:43

on the path to doing that. And that's what

54:45

I've wanted since I was a kid, just to

54:48

find some way to help people. And after my

54:50

accident, I didn't think that was ever gonna happen.

54:52

I knew that I could still speak, but I

54:54

mean, who would want to listen to me speak

54:57

about nothing? I had no life experience to give

54:59

them. I guess now it's a bit different. But

55:02

yeah, I don't know. It's been such a

55:05

huge blessing to me, honestly. Yeah. I'm

55:08

curious, Nolan, what have your conversations with Elon

55:10

Musk been like? I

55:13

haven't had many. I talked to

55:15

him on FaceTime right before surgery. Was

55:18

like, hey, thanks for choosing me to

55:20

do this. I'm really excited,

55:22

really thankful, really blessed. And he

55:24

was like, yeah, this has been great. Really looking

55:26

forward to it, making a huge step. And

55:29

I said, let's rock and roll. He's like,

55:31

let's do it. And that was it. And

55:33

then after surgery, I spoke to him in

55:35

person. He came to the hospital. I

55:37

was still pretty drugged up on anesthesia, and

55:40

I couldn't get his sweet bomber jacket out

55:42

of my mind. I was just

55:44

lying in my bed thinking the whole time, don't

55:46

mention this bomber jacket. Don't mention the bomber jacket.

55:49

But it was cool. I think we have very

55:52

similar ideals about

55:54

what this can do for

55:56

humanity and where we

55:58

can go from here. And... just our drive

56:00

to help people in that way. I

56:03

think it's amazing that someone of

56:05

his caliber has stepped up and

56:07

stepped into this role

56:10

for helping people like me. I

56:12

mean, I never thought anything

56:15

like this would ever happen to me or

56:17

to people like me. And to have such

56:19

a high profile figure say, you know, I'll

56:22

take that on and I'll fight that. It's just

56:24

amazing. I'm curious what you

56:26

make of the promises that people like

56:28

Elon have made for how BCIs could

56:31

improve in our lifetimes. He did get

56:33

a little criticism a few years ago

56:35

for some statements he made at a

56:37

Neuralink presentation where he suggested that these

56:40

BCIs could eventually allow blind people to

56:42

see or give people with spinal cord

56:44

injuries, like the use of their full

56:47

bodies back. A lot of health experts

56:49

were very skeptical and they basically said,

56:51

it's irresponsible to say this stuff given that

56:53

the science is just nowhere near there yet. And

56:55

I'm curious how you feel when you hear the

56:57

kinds of lofty promises about what this technology may

56:59

be able to do someday. Do you get excited

57:01

or do you say like, hey, wait a minute,

57:04

let's like stick to what the science is capable

57:06

of now? No,

57:08

I'm super excited about it. It gives

57:10

me and people like me something to hope for.

57:13

Once you take away hope, that's the end

57:15

for most people. And for him

57:17

to promise something like that, even

57:20

if it never comes about, it's just the

57:22

fact that he's trying and he

57:24

sees it as a possibility. I

57:27

take that kind of passion to heart.

57:30

I don't agree with people who say that it's

57:33

irresponsible. I think it's a

57:35

reality from my perspective that it's going

57:37

to happen probably in my lifetime. And

57:39

if I'm irresponsible for saying that, then

57:41

like I'm sorry, but it

57:43

gives me something to look forward to and gives me

57:45

something to strive for and to work towards. And

57:48

maybe I fall short of that, but

57:50

I'll be damned if I don't give it my all. Yeah,

57:53

I mean, what's very clear to me about you, and on

57:55

is that you just have like a much

57:57

higher risk tolerance than I do. I

58:00

get nervous to go to the doctor to

58:02

get like some little you know thing and

58:04

here you are saying, I will volunteer, I

58:06

will step up to be patient number one

58:09

for this potentially very severe use of technology

58:11

So yeah, my hats off to you for

58:13

just being willing to put your hand up

58:15

for it. No, for real. Yeah. Thanks,

58:18

man. All right, no, well great

58:20

to talk to you. Thank you so much for your time.

58:22

Thanks so much for coming on. Thanks for having me guys.

58:24

I really appreciate it. After

58:29

our interview we reached out to Neuralink to

58:31

confirm some of what Nolan shared with us

58:33

about his surgery But we didn't

58:35

hear back from them. You can read more about

58:38

his experience on their website. When

58:40

I come back, we'll talk to my colleague Karen

58:43

Weiss about Microsoft's big AI announcements. Looking

58:55

for a new ride? Shop

58:58

for your next car from the comfort of

59:00

your home 100% online with Carvana Carvana

59:03

has a massive inventory including thousands

59:05

of cars under $20,000 So finding

59:07

a car that fits your budget

59:09

and lifestyle is hassle-free Carvana makes

59:11

car financing hassle-free Get pre-qualified

59:13

in minutes by answering a few quick questions Carvana

59:16

also offers customizable terms and payments as low

59:18

as zero dollars down because payment plans should

59:20

fit your plans Visit

59:23

carvana.com or download the app today to find

59:25

your next car without ever leaving the comfort

59:27

of home or anywhere else. Terms

59:29

and conditions may apply. Well,

59:32

Casey, it is the most exciting time of

59:34

the year in the tech industry, which is

59:36

developer conference season. That's right for a lot

59:38

of people Kevin this time of year is

59:40

about dads and grads for us. It's about

59:42

APIs. So

59:44

last week we talked about Google's I/O

59:46

developer conference and everything they showed

59:48

off and this week Microsoft had its

59:51

big annual developer conference called Build. You

59:53

did not go in person. Did you?

59:55

I did not. And candidly, while

59:57

I read some coverage of this I

59:59

want to learn so much more because

1:00:01

I only had so much time left

1:00:03

over after I finished researching the filmography

1:00:05

of Scarlett Johansson. Right. So Microsoft obviously

1:00:07

is also very excited about AI. They

1:00:09

have been building out a lot of

1:00:11

their own AI tools and products and

1:00:13

services. And this week at

1:00:16

Build, they actually demoed some new hardware

1:00:18

that they are making that is sort of built

1:00:20

around AI. Now, isn't putting AI

1:00:23

directly into the computer how Skynet began in

1:00:25

the Terminator films? I'm not sure.

1:00:28

It's been a while since I watched those movies. I

1:00:30

think Microsoft gets less coverage by a tech journal.

1:00:32

It's like you and me then it deserves in

1:00:34

part because a lot of what they do is

1:00:36

like boring enterprise software stuff. But they

1:00:38

are the biggest company in the world and

1:00:40

they have been investing in AI significantly over

1:00:42

the past few years. And I would say

1:00:44

between their stake in open AI and all

1:00:46

of their own AI projects, they're just a

1:00:48

major, major player in this world. Yeah, it

1:00:51

is true. I probably don't pay as much

1:00:53

attention to Microsoft as I should. And it

1:00:55

is for a somewhat selfish reason, which is

1:00:57

I just use Macs. And so

1:00:59

sometimes it feels like this stuff just is not as

1:01:01

relevant to my life. But let's face it, for

1:01:03

most of the working world, they are doing their

1:01:06

work on a PC. And so if Microsoft says

1:01:08

we're putting AI in it, then we should be

1:01:10

paying attention. Yeah. So to

1:01:12

talk about this, we're going to bring on my

1:01:14

colleague Karen Weiss, who covers Microsoft for the New

1:01:16

York Times. She went to Build and she's going

1:01:18

to tell us all about what Microsoft announced. Karen

1:01:23

Weiss, welcome

1:01:26

to Hard Fork. Happy

1:01:28

to join you guys. So

1:01:32

Karen, you actually went up to Microsoft's

1:01:34

headquarters earlier this week for their annual

1:01:36

Build conference. So just set the scene

1:01:38

for us a little bit. Like what

1:01:40

was it like? How did it compare

1:01:43

to previous experiences? Well, I

1:01:45

think what was unique was on Monday,

1:01:47

they tried to really hype this AI

1:01:49

PC announcement and they did

1:01:51

something on the campus like the big headquarters

1:01:54

in Redmond just outside of Seattle. And

1:01:56

you had to be in person there. They weren't live streaming it

1:01:58

and Satya was going to give a keynote

1:02:00

at it. So it was trying to definitely build

1:02:03

attention. And so, you know, they tried, it was like a lot

1:02:05

of hoopla. There's a lot of music

1:02:07

in the background and stuff like that.

1:02:09

And they tried to recreate the magic,

1:02:12

if you will, when they launched the

1:02:14

Bing, Sydney chatbot that was

1:02:16

not spoken of about a year and whatever

1:02:18

ago. Yeah, that all

1:02:21

turned out great. So, Karen, one

1:02:23

of the things that Microsoft announced are

1:02:25

these things called Co-Pilot Plus PCs, which

1:02:28

as I understand it, are basically a personal

1:02:31

computer, a Windows PC that is basically

1:02:33

built to run AI and

1:02:36

that runs it very fast and that is

1:02:38

sort of all wrapped around the

1:02:40

capabilities of these AI models. So what did

1:02:42

they actually announce and sort of how is

1:02:44

it different from the Windows

1:02:46

PC that people use today?

1:02:49

Yeah, the main thing is that these

1:02:51

PCs have a bunch of AI

1:02:53

models, AI systems locally on the

1:02:55

computer. And they can run

1:02:57

different AI tools or

1:03:00

models because they have this

1:03:02

new type of processor, essentially. It's called

1:03:04

a, bear with me, an NPU, a

1:03:06

Neural Processing Unit. And it

1:03:08

is very quick with really little drain on

1:03:10

your battery because the problem you have is

1:03:12

when you run AI, it's like super

1:03:14

intensive, right? It's running all these calculations

1:03:17

constantly. And so this is a whole

1:03:19

new generation of chip. Right. So

1:03:21

I guess I'm just struggling to understand what it

1:03:24

means for AI to be able to run

1:03:26

locally because I also have run large

1:03:28

language models locally on my hard drive

1:03:30

before. And is that what

1:03:32

they're saying? Is that if you wanna

1:03:34

run something like a chatbot, it'll just

1:03:36

be much faster to do or are

1:03:38

there actually capabilities that these will have?

1:03:41

Is it gonna be so deeply woven

1:03:43

into the operating system that people's

1:03:45

experience of their computer will actually change

1:03:47

somehow? Right, so they are hoping

1:03:49

the latter. You'll be shocked to hear. But

1:03:51

basically the idea is because there are these

1:03:53

things that are run locally, they can have

1:03:55

access to information that is only stored locally

1:03:58

and they can be faster and more interactive

1:04:00

in a way where, you know, even right now, when

1:04:02

you go into a chatbot, you ask your question, right?

1:04:05

And it like takes a second and it goes boop, boop, boop,

1:04:07

boop, boop, and each little word kind of comes out one at

1:04:09

a time. This would speed that all

1:04:11

up as well. And so they're hoping that by putting

1:04:13

these different, they have said there were more than 40

1:04:15

models that can run, can kind of

1:04:17

come pre-installed, so to speak, on

1:04:19

the laptops. And their hope is that

1:04:22

developers now start playing with those and build tools

1:04:24

off of it. And they

1:04:26

liken it to the iPhone when the iPhone

1:04:28

said, oh, wait, here's a, you know, GPS,

1:04:30

here's an altitude meter, like whatever

1:04:33

kind of tools are in the hardware, people start

1:04:35

building it and then you get Uber

1:04:37

or whatever type of system that then made the

1:04:39

iPhone this kind of like key platform for people

1:04:41

to build off of. And so

1:04:43

I think there's a reason why they announced this before

1:04:45

their big event for developers is because they're trying to

1:04:47

say to developers, build for this, there's a

1:04:50

future here. Now I think

1:04:52

speed matters a lot. I do think that

1:04:54

that can really change the way that this

1:04:56

stuff gets used. And I can absolutely see

1:04:58

this becoming a developer platform. But

1:05:00

at the same time, Karen, I am reminded of

1:05:03

the last time that Kevin and I went up

1:05:05

to Redmond for an event. And we

1:05:07

were told Bing has AI now and AI

1:05:09

is going to change everything. And

1:05:11

I think we were optimistic that maybe that would

1:05:14

be the case. And then you fast forward to

1:05:16

today and Google is still by far the most

1:05:18

dominant search engine in the market. So I wonder

1:05:20

as you're hearing this presentation telling

1:05:22

us that AI chips inside PCs

1:05:25

are going to be a next

1:05:27

generation kind of PC that is truly going

1:05:30

to change everything. I wonder, what

1:05:32

did you think of that? I think

1:05:34

the difference between this and Bing

1:05:36

is Windows. So Windows is this

1:05:39

ubiquitous operating system that Microsoft controls.

1:05:41

So when they announced this, it

1:05:43

wasn't just them trying to move up

1:05:45

from a tiny market share. They

1:05:47

have this enormous market share. And they had all

1:05:49

the biggest laptop makers in the world there showing

1:05:52

off versions of these devices. So I think they

1:05:54

have a power or an influence on the PC

1:05:56

in a way that they don't necessarily in search

1:05:58

or with Bing. That said, they

1:06:00

have to prove the utility of these things. And

1:06:02

part of why running the models locally

1:06:04

is important is it's cheap, right? You pay for

1:06:07

the processor upfront when you buy the laptop. But

1:06:09

every time you ping it, you're not spending a

1:06:11

penny. Whereas if you're inferencing things in the cloud,

1:06:13

if you're running things off the cloud, it

1:06:15

becomes really expensive for a developer to think, oh, can I

1:06:17

even afford to offer this product

1:06:20

to people? Now, if they can just do it locally, it might

1:06:22

not be quite as good. Like the language

1:06:24

models that can run on a laptop now are not

1:06:26

as good as GPT-4. But they

1:06:28

say about as good as GPT-3.5,

1:06:31

which was what ChatGPT was initially

1:06:33

launched with. So you can

1:06:35

get some good enough uses is the theory.

1:06:38

But again, they have to find the kind

1:06:40

of key use cases that will show it.

1:06:42

I was a little surprised. The examples they

1:06:44

released with were not... They

1:06:47

were clearly trying to demonstrate certain capabilities.

1:06:49

Like what? What did they

1:06:51

demonstrate? Some were a little more

1:06:53

speculative and futuristic, saying this is the kind

1:06:55

of thing you would be able to do.

1:06:57

So like one was a dad

1:07:00

speaking with a voice

1:07:02

chatbot, essentially, asking for help on how

1:07:04

to solve the particular Minecraft situation. And

1:07:06

the voice thing, you can come in

1:07:08

and then say, oh, you

1:07:11

need these materials to build this new thing

1:07:13

in Minecraft. And therefore, oh, hurry, this thing

1:07:15

just popped up. Go run over there and

1:07:17

hide in this basement thing. And

1:07:19

because gaming is real time, you

1:07:21

can't do that if you're pinging in the cloud. And

1:07:24

so this was an example of a way where you could

1:07:26

bring in general AI and assistance into

1:07:28

like a real live moment, essentially.

1:07:30

But that was one of the

1:07:32

more futuristic products. It's like not launching

1:07:34

with that. Right. I

1:07:36

want to ask about this other feature

1:07:38

that they announced that I think got

1:07:41

a lot of attention, which is something

1:07:43

called Recall. What is

1:07:45

Recall? They

1:07:47

liken it to a photographic

1:07:49

memory. And it actually

1:07:51

kind of makes sense because what it does

1:07:53

is it basically builds a history of everything

1:07:55

you have looked at on your laptop. And

1:07:58

it literally takes constant screenshots of your

1:08:00

screen, stores those locally, and

1:08:02

then you can ask it, oh

1:08:04

where was that thing that Kevin sent me that

1:08:06

was like kind of weird and he laughed when

1:08:09

he told me about it. And

1:08:11

since you and I are of course chatting on Teams, Kevin,

1:08:13

on my podcast, it would be like, oh

1:08:15

that we can pull that up here because the

1:08:17

transcript says that he laughed in this moment and

1:08:20

you can kind of scrub back and forth

1:08:22

in time trying to visually

1:08:24

look for what you were searching

1:08:27

for. So basically if you've always wondered what

1:08:29

it would be like to have an FBI agent

1:08:31

living inside your computer, you can now have

1:08:33

that. Perfect. So

1:08:36

there's an app called Rewind that has been doing

1:08:39

something that sounds very similar to this,

1:08:41

but just so I'm clear, this is

1:08:43

taking screenshots, it is storing them

1:08:45

on your machine and then it is allowing

1:08:48

you to use generative AI to sort of

1:08:50

search back through your previous encounters

1:08:53

with your computer and say what was that

1:08:55

restaurant menu I was looking at last Tuesday

1:08:57

or whatever. But I'm curious, Karen, what you

1:08:59

think the target audience for this, who is

1:09:01

actually going to use this

1:09:04

feature and what examples did Microsoft give of how

1:09:06

it might be useful? They gave

1:09:08

some of kind of personal uses. They

1:09:10

had this very funny example of this

1:09:12

woman trying to get a dress for

1:09:14

her grandma and they talked about how

1:09:16

she had searched things online, she had

1:09:18

chatted in Discord with her abuela which

1:09:20

I thought was very funny because my

1:09:23

grandma used to AOL instant message me but I'm not sure

1:09:25

how many grandmas are on Discord. But

1:09:27

the idea was that there's this digital record

1:09:29

and you can go back and be like,

1:09:31

oh, what was that sparkly one? And

1:09:34

you can use language like that and because

1:09:36

it has visual intelligence, it can

1:09:38

go back and look for the sparkly blue dress you

1:09:41

had looked for and then, oh, didn't abuela say that

1:09:43

she actually really liked a pantsuit instead

1:09:45

and then you can kind of find the pantsuit. So

1:09:47

that's kind of a personal one. I was recently shopping

1:09:49

for jeans and part of me is like, oh, I

1:09:51

could see that. There was the jeans that I liked,

1:09:53

but also I'm scared if I were to type in

1:09:55

like, like jeans I was just looking

1:09:57

at. How many wide-leg jeans would show up?

1:10:00

It's kind of mortifying to think about how many wide-legs

1:10:02

you need to find to find a good pair. Totally.

1:10:05

So, you know, I can see all those

1:10:07

use cases. At the same time, all of

1:10:09

us are journalists. We often talk to people

1:10:11

off the record confidentially. We have sensitive information

1:10:13

on our laptops. We're not alone in that.

1:10:15

I think most people working jobs

1:10:18

have some sort of confidential information that is

1:10:20

on their computer. So I hear everything that

1:10:22

you've just described, Karen, and I think, absolutely

1:10:24

not. I tried Rewind for a while. I

1:10:26

found it terrifying and deleted it from my

1:10:29

phone. What is your thinking right

1:10:31

now about whether a lot of people are

1:10:33

going to be willing to invite this level

1:10:35

of surveillance onto their devices? Yeah, I mean,

1:10:37

the default is definitely to

1:10:39

take all of it. You can go back in and

1:10:42

manually delete certain days. You can have

1:10:44

it opt out of certain applications. But

1:10:46

like if you're going to opt out of the

1:10:48

web search, like that's one of the main uses

1:10:50

for it. So it's a big question that

1:10:52

it's defaulting to being there. And one of the

1:10:55

things when I was kind of researching before the

1:10:57

event was a lot of people

1:10:59

haven't interacted with chatbots, that haven't used

1:11:01

ChatGPT, but they get a basic Windows

1:11:03

laptop from their work. And there's a big

1:11:05

old button that says co-pilot on it. And

1:11:07

these tools will be there. And for many

1:11:10

people, it might be the first time that

1:11:12

they really have exposure to it. Also, I'm

1:11:14

not sure people will understand the technology behind

1:11:16

it. And that is literally taking a picture

1:11:18

of everything that you're doing. And we should

1:11:20

say, like, you know, Microsoft

1:11:23

has said that all of these screenshots

1:11:25

stay on the device itself. They are

1:11:27

not being sent to Microsoft. Microsoft has

1:11:29

also said, you know, they're not doing

1:11:32

any kind of like content moderation on

1:11:34

them. So like, if you've been looking

1:11:36

at your bank account information on your PC, they're not

1:11:38

gonna like scrub that from the screenshots, but that it

1:11:41

will all stay locally on the device. And so only

1:11:43

the person whose device that is can access that. But

1:11:45

I think there are a lot of questions that

1:11:47

people will have, especially, you know,

1:11:49

if this is maybe a corporate

1:11:51

issued computer, does my employer then

1:11:53

have the ability to go back and look at

1:11:56

screenshots of every time I've ever used this computer?

1:11:58

Of course they will. Once this gets normalized,

1:12:00

I can imagine employers handing

1:12:02

you your shiny new co-pilot

1:12:05

plus PC and saying,

1:12:07

you have to leave this recall feature

1:12:09

on. And then if we ever have

1:12:11

a disciplinary issue with you or we

1:12:14

are just suspicious about you for whatever

1:12:16

reason, we can review everything you have

1:12:18

ever done on your corporate issued laptop.

1:12:20

That seems like a nightmare dystopia to

1:12:23

me. And they can search for it

1:12:25

easily. Yeah. Right. Your

1:12:29

only chance to survive that is just

1:12:31

that they're using Bing search, which doesn't work

1:12:33

half the time. Well, Karen, that brings me

1:12:36

to one of my other questions about all

1:12:38

of this stuff that Microsoft announced this week.

1:12:40

It's like, who is it for? Because I

1:12:43

think traditionally Microsoft, a big part of their

1:12:45

business is selling to businesses, its enterprise customers,

1:12:47

its large companies that already run a lot

1:12:49

of Windows PCs that already use Outlook and

1:12:52

Teams and all the other Microsoft products. And

1:12:54

they can just kind of keep adding to

1:12:56

that bundle. So do you see these AI

1:12:58

PCs and all the features that are on

1:13:01

them as being aimed at businesses or are they

1:13:03

really making a consumer play here? You know,

1:13:05

I initially thought beforehand that it would

1:13:07

be more of the businesses; businesses buy

1:13:10

literally the majority of laptops now. When was

1:13:12

the last time you guys refreshed your PC?

1:13:14

Mine was in 2011, I believe. You know,

1:13:17

you're eligible every three years at the New York Times.

1:13:19

Oh no, no, my personal one. When was the last

1:13:21

time you bought a personal one? Your PC is 13

1:13:23

years old, Karen. What are you using? Are you

1:13:26

running Windows 98? XP? This is the MacBook and it's

1:13:28

lasted. And, like, for

1:13:32

the internet, right? Everything's on the internet that

1:13:34

I do now. I'm not like ripping my

1:13:36

CDs anymore and taking my music from one

1:13:38

thing to another. You're not mining Bitcoin? I'm

1:13:41

not. I was surprised though. They

1:13:44

definitely sort of benchmarked it and kind

1:13:46

of compared it to the MacBook Airs. So

1:13:48

they're smaller, they are lighter,

1:13:50

they say they start at $1,000. So

1:13:53

they're not like Chromebook prices, but they're

1:13:55

also not a MacBook Pro super

1:13:58

heavy thing. Karen,

1:14:00

I'm curious how this positions

1:14:02

Microsoft against Apple. Apple's having

1:14:04

its own developer conference next

1:14:06

month, and there's a

1:14:08

lot of speculation that Microsoft's announcements at

1:14:11

this event this week were designed to

1:14:13

take some attention away from anything Apple might

1:14:16

announce in just a few weeks. Apple is

1:14:18

also expected to do a lot around generative

1:14:20

AI in its own hardware

1:14:22

products. So what is the competition between

1:14:25

Microsoft and Apple like right now? These

1:14:28

chips, these new NPUs, do

1:14:31

propel the performance of these laptops

1:14:35

basically in the realm of Apple now.

1:14:37

So it's in the same class because

1:14:39

they've changed over this chip architecture

1:14:42

or style essentially now to

1:14:44

a model that actually Apple had been pursuing. So

1:14:47

the question is, will people make the jump from

1:14:49

one ecosystem to another? Will their

1:14:51

employers make a jump from one ecosystem to

1:14:53

another? All of that stuff is I think

1:14:56

I don't know the answer to that. But they are clearly

1:14:58

trying to show that they are

1:15:00

building even their hardware with this

1:15:02

AI first mentality. And again,

1:15:04

if people want that, if they can

1:15:06

demonstrate the utility of that, that's the

1:15:08

question about all of this. Yeah,

1:15:11

I just think so much of this

1:15:13

is going to depend on how Microsoft

1:15:16

implements all this stuff and honestly how

1:15:18

annoying they are about it. So

1:15:20

I have a Windows PC. It's my daily

1:15:22

sort of PC that I use. And

1:15:25

I would say most of the time I really like

1:15:27

it, but then there are just these times when

1:15:29

it's very clear that Microsoft is just getting a

1:15:31

little bit greedy and they just start popping things

1:15:33

up or putting things in weird places. Like the

1:15:35

other day, I was using my computer and I

1:15:38

got a Skype news alert. Have you seen these

1:15:40

yet? I've seen somebody

1:15:42

posting these on threads. And it's so funny.

1:15:44

It's so funny. I was just minding my

1:15:46

own business, doing email. Up comes this little

1:15:49

notification that says, Skype says the

1:15:51

US economy added this many jobs last month. And

1:15:53

I'm like, why is Skype talking to me? Did

1:15:55

I ask Skype to talk to me? No, I

1:15:57

did not. But that is just a classic case of

1:15:59

Microsoft sort of trying to juice engagement by doing something

1:16:01

that I think a lot of people would feel

1:16:03

is very annoying. So I can see these AI PCs

1:16:06

being very useful and I want to try one, but

1:16:08

I think if they can't resist sort of, I don't

1:16:10

know, just trying to nudge you into using it

1:16:12

more and more or in different ways, I think

1:16:14

that is going to turn a lot of people off.

1:16:16

I mean, I think a lot of AI things

1:16:18

are going to be nudging you more and more

1:16:20

because they want discoverability, that's the phrase you

1:16:22

hear a lot, about the features that these

1:16:24

things can do. You know, this is a huge

1:16:26

Alexa problem, right? People know Alexa can

1:16:28

do the timer or whatever, but she can

1:16:32

do more and they want you to do more,

1:16:34

but like they got to push it to you.

1:16:36

Otherwise, how do you know? So I think that's

1:16:38

a very hard urge to resist, even though I

1:16:40

completely understand and agree. Yeah, I mean, Kevin, I'm

1:16:42

like you, I would like to give one of

1:16:44

these things a try, see what it can do,

1:16:46

see how fast this AI is once you get

1:16:48

it on the device. But I do continue to

1:16:50

have trust issues with Microsoft. This is a company

1:16:53

that just last month started testing ads in the

1:16:55

Windows 11 start menu. So every

1:16:57

time you go to like, look at the

1:16:59

programs on your computer now, you might just

1:17:01

have to see an ad. And I

1:17:03

don't know the more AI is on my computer,

1:17:05

the less I'm excited about a company that is

1:17:07

looking to shut ads into different parts of the

1:17:09

interface. You know, imagine you're going to, you

1:17:11

know, shop for jeans for Abuela

1:17:13

and then, oh, you're looking for that?

1:17:15

Well, here's an ad for that now, Karen.

1:17:18

Right. So there's just a lot of stuff

1:17:20

in here that sort of has my eyebrows

1:17:22

arched. Karen, did you

1:17:24

actually get to try any of this stuff? Did you get

1:17:26

to get your hands on one of their new AI PCs?

1:17:28

Yeah, they had like a demo station set

1:17:31

up. And so like, it's populated with like

1:17:33

all the demo data. And so the recall

1:17:35

example was hard to know what it would

1:17:37

be like when you had a real body

1:17:40

of your own data. Would it feel super

1:17:42

creepy? Would it be really useful? Would it

1:17:44

be not useful because it returns so much information

1:17:46

that you can't actually scrub through it all? I

1:17:48

understand the problem they're trying to solve with that.

1:17:51

Like, I think we all have the zillion tabs

1:17:53

open that you keep open just so

1:17:55

you don't forget about it. So I understand that

1:17:57

like impetus behind it, but it was hard to get a

1:17:59

sense of... for that for me of

1:18:01

like would I personally use this and

1:18:04

like it essentially because it was just filled

1:18:06

with all this dummy content essentially. Right.

1:18:08

You know, Karen, dummy content was actually

1:18:10

the original title for this podcast. Some

1:18:14

people think it still could be. Nailed

1:18:17

it. All right, Karen Wise, thanks for

1:18:19

coming. Thanks guys. Thank

1:18:31

you. Looking

1:18:39

for a new ride? Shop for your

1:18:41

next car from the comfort of your

1:18:43

home. One hundred percent online with Carvana.

1:18:45

Carvana has a massive inventory, including thousands

1:18:47

of cars under twenty thousand dollars. So

1:18:49

finding a car that fits your budget

1:18:51

and lifestyle is hassle free. Carvana makes

1:18:53

car financing hassle free. Get pre-qualified in

1:18:55

minutes by answering a few quick questions.

1:18:57

Carvana also offers customizable terms and payments

1:18:59

as low as zero dollars down because

1:19:01

payment plans should fit your plans. Visit

1:19:03

carvana.com or download the app today to

1:19:05

find your next car without ever leaving

1:19:07

the comfort of home or anywhere else.

1:19:10

Terms and conditions may apply. Hard

1:19:14

Fork is produced by Whitney Jones and Rachel

1:19:16

Cohn. We're edited by Jen Poyant. We're fact

1:19:18

checked by Caitlin Love. Today's show

1:19:20

was engineered by Chris Wood. Original

1:19:23

music by Marion Lozano, Rowan Niemisto

1:19:25

and Dan Powell. Our

1:19:27

audience editor is Nell Gallogly. Video

1:19:29

production by Ryan Manning and Dylan Bergison. If

1:19:32

you haven't checked out our YouTube channel, you

1:19:34

can find it at youtube.com/hardfork. Special

1:19:37

thanks to Paula Szuchman, Pui-Wing Tam,

1:19:39

Kate LoPresti and Jeffrey Miranda. As

1:19:41

always, you can email us at hardfork@nytimes.com.

1:19:44

What's

1:20:14

up sandwich heads? Today on Steve-O's Sandwich Reviews we've

1:20:16

got the tips and tricks to the best sandwich

1:20:18

order and it all starts with this little guy

1:20:20

right here, Pepsi Zero Sugar. Marshall's

1:20:22

a pastrami craving a cubano. Yeah, sounds

1:20:24

delicious but boom! Add the crisp, refreshing

1:20:26

taste of Pepsi Zero Sugar and cue

1:20:28

the fireworks. Lunch, dinner or late night

1:20:30

it'll be a sandwich worth celebrating. Trust

1:20:32

me, your boy's eaten a lot of

1:20:35

sandwiches in his day and the one

1:20:37

thing I can say with absolute fact,

1:20:40

every bite is better with Pepsi.
