'Battle for your brain': What the rise of brain-computer interface technology means for you

Released Wednesday, 27th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:03

Wearable tech, your Fitbit, smartwatch, and the

0:05

like, they can already do things

0:08

like measure your heart rate or

0:10

how well you're sleeping just based

0:12

on how you're moving or

0:14

signals through your skin. So

0:17

what do you think the next frontier

0:19

might be in wearable tech? The

0:21

next new thing devices can monitor

0:23

and measure. Just

0:26

think about it. What

0:28

do you think? I

0:31

use my earbuds every day because I

0:33

want to know how my brain changes

0:35

based on all of the things that

0:37

I do because my brain is changing

0:39

all the time. It's the most sophisticated

0:41

learning apparatus that we have. So

0:44

I use my earbuds as a way to

0:46

understand what's happening to my brain as

0:48

I play with my daughter, hang out with my

0:50

cat, listen to music, work,

0:54

and it's really interesting. I learn a lot about

0:56

myself. I learn a lot about what

0:59

makes me happy and

1:01

perform better and when I'm

1:03

really stressed, what impact that has

1:05

on me. This

1:08

is Tan Le, co-founder and

1:10

CEO of Emotiv, one of

1:12

a new crop of companies

1:14

that sees great potential in

1:16

BCI, or Brain Computer Interface

1:18

Technology. Lee believes

1:20

the possibilities for such tech

1:23

are endless, helping the elderly

1:25

experiencing cognitive decline, empowering the

1:27

disabled community to perform actions

1:29

simply through thinking, even helping

1:31

you understand yourself better, how

1:34

to be happier or

1:36

more efficient. Lee

1:38

says Brain Computer Interface Tech will

1:40

one day be able to do

1:43

all of these things through major

1:45

advances in miniaturized electroencephalography technology, or

1:47

EEG, which can read signals from

1:50

the human brain and send them

1:52

to amplifiers, which in her company's

1:54

case, are in those earbuds. It's

1:57

giving you a feedback on your

1:59

computer. So if I click on the icon

2:01

to see what's going on in my brain

2:03

at the moment, I can see what's happening

2:05

in my brain. And then I can also

2:07

see a report over the course of the

2:10

day, when during the day my brain was

2:12

in an optimal state. And then I

2:14

can correlate that with what I was

2:16

doing at that time. So when I

2:18

look back on my afternoon on Sunday,

2:20

I knew exactly what I was doing. So I

2:22

knew why that was different to

2:25

the barrage of the meetings

2:27

I had on Friday afternoon, which

2:29

caused my brain to be a much more intense

2:32

state. And so that allows me to

2:34

change my day a little bit, carve

2:37

out more time for focused work, so

2:39

that I can actually work more optimally.
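As a concrete illustration of the kind of report Le describes, here is a minimal sketch in Python. It assumes nothing about Emotiv's actual software: the data, field names, and scoring are invented for illustration. Given timestamped brain-state scores and a simple activity log, it averages the score per activity, which is the basic correlation she walks through.

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical inputs: (timestamp, focus score in [0, 1]) samples from a
# headset, plus an activity log of (start, end, label) entries. These names
# and values are invented for illustration, not Emotiv's data format.
samples = [
    (datetime(2024, 3, 24, 14, 0), 0.82),
    (datetime(2024, 3, 24, 14, 30), 0.78),
    (datetime(2024, 3, 24, 16, 0), 0.41),
]
activities = [
    (datetime(2024, 3, 24, 13, 30), datetime(2024, 3, 24, 15, 0), "focused work"),
    (datetime(2024, 3, 24, 15, 30), datetime(2024, 3, 24, 17, 0), "meetings"),
]

# Average the focus score over each logged activity.
totals = defaultdict(lambda: [0.0, 0])
for ts, score in samples:
    for start, end, label in activities:
        if start <= ts < end:
            totals[label][0] += score
            totals[label][1] += 1

for label, (total, n) in totals.items():
    print(f"{label}: mean focus {total / n:.2f} over {n} sample(s)")
```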

2:43

Well, Tan Le isn't the only one

2:45

who thinks this is utterly fascinating. Her

2:48

three-year-old daughter sees her at her desk, wearing

2:50

her earbuds and checking in on her state

2:53

of mind. She said, Mommy, I want to

2:55

see. And I said, this is Mommy's brain.

2:57

And she said, I want to see

2:59

my brain. And I said, you're

3:01

too little. And

3:03

so it doesn't fit her, but she's so

3:05

intrigued by it. Currently,

3:08

Emotiv's earbuds are available only

3:10

on their website. Lee

3:12

says she hopes that one day they'll

3:14

be available in stores for widespread use

3:16

in the consumer market. But for

3:18

now, her main clients are not

3:20

consumers, they're employers. One

3:23

of our clients is JLL. JLL is

3:26

a large real estate organization. And JLL

3:28

came to us saying that, you know, the

3:30

future of work is changing rapidly. How

3:33

can we design our workplaces better

3:35

so that we can make

3:38

sure that when people are at work, they're getting

3:40

what they want from the work environment?

3:42

So in that case, we will

3:44

invite volunteers within the organization to

3:46

sign up for a research study,

3:48

where they will wear a device

3:50

for a certain period of time.

3:52

And what we do is we

3:55

capture brain data from those experiences

3:57

in order to try and map

3:59

out. Now, what is the relationship

4:01

between an environment that's conducive

4:03

to teamwork and collaboration versus

4:06

something that doesn't actually

4:08

achieve those desired outcomes? By

4:12

the way, JLL is also known as

4:15

Jones Lang LaSalle Inc., one

4:17

of the largest real estate companies in the world,

4:19

ranked 185th on the Fortune 500, $20 billion in

4:21

revenue last year, and

4:26

100,000 employees worldwide. Some

4:29

of whom have been asked to participate in the kind

4:32

of research study Lee mentioned. So

4:35

what happens to the data those employees' brains

4:37

are pumping out into Emotiv's earbuds?

4:40

What's really important about Emotiv is that

4:42

fundamentally we do not believe in how

4:46

companies have transacted with data in the past. We're

4:49

a company that was born about 10 years ago,

4:51

and so we've seen a lot of the changes

4:54

in the public's view of

4:56

how data is mined for

4:58

corporate advantage without the informed

5:01

consent of the users and

5:03

participants. And so we conduct ourselves

5:05

in a very thoughtful and ethical manner

5:07

in regards to data. The users need

5:09

to have control of when

5:12

they collect data, how data is

5:14

shared, and in fact, we don't

5:16

sell or share your data with

5:18

anyone without your explicit consent. Well,

5:22

this is On Point. I'm Meghna

5:24

Chakrabarti, and that was Tan Lee,

5:26

co-founder and CEO of the neuro-technology

5:29

firm Emotiv, one of a new

5:31

group of companies that's rapidly advancing

5:33

the possibilities of brain-computer-interface technology. Well,

5:36

my guest today says the positive

5:39

possibilities of such tech are exciting

5:41

and essential, but it's naive to

5:43

think that power to read brainwaves

5:46

will be used exclusively for good

5:48

because the potential for exploitation is

5:50

just too great, both

5:52

by corporations and governments.

5:55

So she says now, as

5:58

brain-computer interface technology is starting

6:00

to enter our lives and our minds,

6:03

now is the time to establish new

6:05

rules, to defend the right to think

6:07

freely, and to keep our minds

6:10

our own private property.

6:13

Well, that comes from Nita Farahany, professor

6:16

of law and philosophy at Duke University.

6:19

Her new book is The Battle for Your

6:21

Brain: Defending the Right to Think Freely in

6:23

the Age of Neurotechnology. Professor

6:26

Farahany, welcome back to On Point. Thank

6:29

you. It's great to be here. So

6:31

I would like you to take

6:33

us back to the first moment

6:35

you realized that this revolution

6:37

in tech was coming. You write

6:39

about a 2018 summit at the

6:43

Wharton School in Pennsylvania. What

6:46

happened there? So I

6:48

had been studying neurotechnology and

6:50

even consumer neurotechnology for quite

6:52

a few years. But

6:54

at that summit, early

6:57

on in the summit, Josh Duyan stood

6:59

up. He was one of

7:01

the people at a company that was a

7:03

startup called CTRL-labs. And

7:06

he was showcasing this new device where

7:08

they were taking electrodes and putting

7:11

them into what looks like an

7:13

everyday watch. And

7:15

he held up his hands and he said, you know, wouldn't

7:18

it be great if instead of

7:20

having the kind of clumsy output

7:23

that we have, that is these

7:25

hands, these like sledgehammer-like devices, we

7:28

could interact much more seamlessly with all

7:30

of the rest of our technology with

7:33

a device like the one on my wrist. Or

7:35

if we wanted to type, we

7:37

could type by thinking about typing

7:39

rather than by having to pound

7:41

away on a keyboard and how

7:43

we've gone backwards in time typing

7:45

on phones with our two thumbs.

7:49

But he was showcasing with something altogether

7:51

different than anything I had seen before

7:53

because while I had played with and

7:55

seen these devices in the past, they

7:57

hadn't really solved the form factor. They

7:59

were still... electrodes that

8:01

you would have to wear across your forehead and

8:03

a headband that was both

8:05

silly looking and uncomfortable, but

8:07

the applications were also much more limited.

8:10

They were limited to things like meditation

8:12

or, you know, personal gaming devices that

8:14

you might play. This idea that you

8:17

could take and make brain

8:19

wearable devices integrated into our everyday

8:21

devices to power our

8:23

interaction with all of the rest of our

8:26

technologies, that was the moment when I realized

8:28

all the things that I'd been thinking about

8:30

and worrying about for quite a long time, suddenly

8:33

were going to come true. And I

8:35

was convinced, given the form factor, that

8:37

it would just make sense for Apple

8:39

to acquire CTRL-labs. But,

8:42

you know, I was floored when a

8:44

year later it was Meta who acquired

8:46

them instead. I thought that was the

8:48

pivotal acquisition and I then

8:50

I was like, okay, it is time to get writing this book.

8:53

A.K.A. Facebook,

8:55

A.K.A. Mark Zuckerberg.

8:58

Exactly. Exactly. I was like, if Mark

9:01

Zuckerberg is investing in this technology, I

9:03

mean, the things I was worried about,

9:05

they are going to come true. And

9:08

it also just made it so clear to me

9:10

that this is a mainstream movement. This is the

9:12

next big thing. It's not

9:15

a niche application for people who are interested

9:17

at home and, you know, trying to quantify

9:20

and see their own brains. This

9:22

was going to become the way in which

9:24

we interact with the rest of our technology by

9:26

using our brains and our thoughts

9:29

as the way we interact with everything around

9:31

us. That was a revolutionary moment

9:33

and that acquisition was both

9:35

terrifying but also a call

9:38

to action to me to get writing and to

9:40

get this message out. Okay. So, but your

9:42

view on the brain computer interface technology is

9:44

quite nuanced. I mean, you don't see it as a

9:46

universal bad. So we're going to talk about its potentials,

9:50

the complex potential in

9:53

a minute here. But it doesn't it

9:55

make sense though that this would be

9:57

sort of the next front.

10:00

I mean, you call it the last fortress, that

10:03

technology hasn't yet fully overwhelmed.

10:06

But the brain is very much how

10:08

we, in a sense, what happens in

10:10

the brain is how we define ourselves as human

10:12

beings. So it is

10:15

what generates all our thoughts, feelings,

10:17

actions. So it would seem very

10:19

logical that technology would want

10:21

to understand,

10:24

harness and maximize what it can do with

10:26

that. Absolutely right. So first of

10:28

all, you're right. My view is

10:30

nuanced, and my view is nuanced because I

10:32

believe that this technology is the

10:34

next step for humans in ways that can

10:36

be deeply empowering. And I also think the

10:39

fact that our brains have remained this black box

10:41

and mysterious even to us, that we can only

10:44

access through self-reflection in ways

10:46

that aren't objective, you

10:49

know, that's not good for addressing

10:51

any of the major causes of

10:54

human suffering, such as neurological disease

10:56

and disorder and mental illness, or

10:58

even just understanding ourselves. So

11:01

of course, it makes sense that this

11:03

is where the next step in both

11:06

self-quantification, but also the thing that

11:08

we as humans would be pushing

11:10

for, which is access to and

11:12

understanding our own brains would

11:14

be happening. It also just makes sense

11:16

that we have all of these clumsy

11:19

interfaces between us and other technology and

11:21

the ability to be able to much

11:24

more seamlessly interact with other technology would

11:26

be deeply appealing to other companies. But

11:29

I'm also a skeptic on

11:31

motivations and, you

11:34

know, both I think my own cultural heritage,

11:36

but just the work that I

11:38

do as an ethicist and

11:40

a law professor, it's

11:43

always made me look at, okay, but what

11:45

are the complex set of motivations that bring

11:48

these different organizations to the table? Yeah, that

11:50

makes... that is what makes you our favorite kind of

11:52

guest, Professor Farahany. So we'll

11:55

talk a lot more about the positives,

11:57

the negatives, and really most importantly, what

11:59

kind of questions you say we should

12:01

be asking ourselves now as a society,

12:03

as this technology comes at us at

12:05

full pace. So, Nita

12:07

Farahany, stand by for just a moment. We'll be

12:10

right back. This is On Point. Support

12:22

for the On Point podcast comes from

12:25

Indeed. We're driven by the search for

12:27

better, but when it comes to hiring,

12:29

the best way to search for a

12:31

candidate isn't to search at all. Don't

12:33

search, match with Indeed. Ditch the busy

12:35

work and use Indeed for scheduling, screening

12:37

and messaging so you can connect with

12:39

candidates faster. And listeners will

12:41

get a $75 sponsored

12:43

job credit to get your

12:46

jobs more visibility at indeed.com/on

12:48

point. That's indeed.com/on point. Terms

12:51

and conditions apply. Need to

12:53

hire? You need Indeed. This

12:56

is On Point. I'm Meghna Chakrabarti, and Nita

12:58

Farahany joins us today. She's a professor

13:00

of law and philosophy at Duke

13:02

University, and she's out with a

13:04

new book titled The Battle for Your Brain:

13:06

Defending the Right to Think Freely in the

13:08

Age of Neurotechnology. Professor

13:11

Farahany, you've actually worn some of

13:13

these devices that currently exist

13:15

yourself. Can you tell us a little bit about

13:18

what it is that you wore, how it worked and what it felt

13:20

like to have it? Sure. So

13:23

I have been, I guess, toying around in

13:25

some ways with many of these devices, but

13:27

also using them personally for

13:30

some applications. So the earliest of

13:32

these devices were kind of

13:34

hard plastic devices that

13:36

would go across your forehead

13:38

and some of them tuck

13:40

behind your ears or fit

13:42

tightly across your scalp. And

13:45

the idea was to make contact with

13:47

dry electrodes to your forehead or to

13:50

different parts of your scalp that could

13:52

allow the electrical activity in your brain to

13:54

be picked up and then interact with, through

13:57

Bluetooth, some kind of application on your phone.

14:00

And the more recent devices, as

14:02

you described, and as the conversation

14:04

with Tan Le made clear that

14:07

I also have had access to are electrodes

14:09

that are embedded inside of earbuds. And these

14:11

feel just like the normal earbuds you would

14:14

use to make a phone call or listen

14:16

to music or, you know, do a Zoom

14:18

call or headphones where the soft cups that

14:20

go around your ears have electrodes inside of

14:23

them as well. So you can't detect them.

14:25

They're just like the rest of the technology

14:27

that you would wear or one

14:29

of these watches that has a sensor

14:31

inside. I've used them primarily both

14:34

to test them out but also for

14:36

meditation. So I'm not

14:38

great at self-meditation, being able

14:40

to both keep my focus and ability to

14:43

stay in that state but also like,

14:45

am I doing it right? And

14:48

so what's neat about these devices

14:50

is the interaction with an application

14:53

lets you get real-time what's called

14:55

neurofeedback. So if I get my

14:57

brain states into a way

14:59

that brings down my stress levels and

15:02

shows that I'm in this kind of meditative

15:04

state, you have signatures in your brain that

15:06

can be detected and decoded that suggest that,

15:09

then you get something like chirping

15:11

birds or, you know, some

15:13

other kind of audible feedback. And

15:16

that's been really helpful for me. I'm a chronic migraineur

15:19

and high stress levels can really

15:21

trigger a migraine for me. And

15:24

using these as sort of a preventive

15:26

tool, something where even if I just

15:28

spend a few minutes of bringing my

15:31

stress levels down and remaining in a

15:33

meditative state, for me have been really

15:35

helpful in limiting the frequency and the

15:37

severity of my migraines.
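To make the feedback loop she describes concrete, here is a minimal sketch in Python. The `read_stress_score` function is a stand-in for whatever estimate a headset's software actually exposes (an assumption, not a real API); the point is only the loop: reward the calm state with an audible cue, stay silent otherwise.

```python
import random
import time

def read_stress_score() -> float:
    """Stand-in for a stress estimate from a headset's software.

    Real devices derive a score like this from EEG signals; here we
    simulate one so the loop is runnable.
    """
    return random.uniform(0.0, 1.0)

def neurofeedback_loop(threshold: float = 0.4, rounds: int = 10) -> None:
    # Whenever the estimated stress drops below the threshold, emit the
    # reward cue (printed here in place of the chirping-birds audio).
    for _ in range(rounds):
        stress = read_stress_score()
        if stress < threshold:
            print(f"stress {stress:.2f} -> chirp (calm state, keep it up)")
        else:
            print(f"stress {stress:.2f} -> silence (try to relax)")
        time.sleep(0.5)

neurofeedback_loop()
```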

15:40

You know, it occurs to me

15:42

that there are then there's so many

15:44

potential applications, positive applications, right,

15:46

for this technology. Like, you know, I've suffered from

15:49

depression for most of my life and I think

15:51

it would be kind of amazing to have

15:54

a device that would give me some sort

15:56

of feedback to say, you know, your

15:58

brain patterns right now. are indicating,

16:01

I don't know, some sort of negative

16:03

feedback loop that's going to deepen

16:05

your depression or something, anything like that. No,

16:07

you're right. You're right. So first,

16:09

I'm so sorry that you've grappled with that, but

16:12

I mean, but that's, you know, you're one

16:14

of many millions of people who are grappling

16:16

with different effects of

16:18

the brain, whether it's migraines or

16:20

depression or people who suffer from

16:22

epilepsy, for example, and need an

16:24

early warning of having an epileptic

16:26

seizure. These devices can be

16:28

quite powerful. In fact, I talk about some of

16:30

those in the book from using

16:33

both feedback, but also neurostimulation,

16:35

which has been transformative for

16:37

some people with depression or people

16:40

who have ADHD, for example, there are a

16:42

lot of studies that show that using

16:44

neurofeedback, especially in children, over

16:47

a number of weeks can

16:50

actually be more powerful than drugs

16:52

or drugs alone and certainly have

16:55

far fewer side effects or somebody

16:57

who has epileptic seizures, like a

17:00

very close family friend of ours died of

17:02

an epileptic seizure without

17:04

early warning. She was alone

17:07

at the time. She vomited from the

17:09

epileptic seizure and then died from,

17:11

they believe, choking on her own vomit. If

17:13

she had had a one hour, you know,

17:15

in advance early warning of having that seizure,

17:17

she could have gotten herself to safety. She

17:19

could have, you know, made sure that she

17:21

took just in time medication. You

17:24

know, there's so much good

17:26

that could come from being able

17:28

to track our own brains and

17:30

improve them, enhance them, use neurofeedback.

17:34

Our own daughter, our eight year

17:36

old, while she doesn't use one

17:38

of these neurological devices, uses biofeedback

17:41

through a heart rate monitor, which

17:43

has been gamified. She can play

17:45

games which get harder when her

17:47

stress levels and heart rate increase.

17:50

And then the way that she

17:52

wins the game is by being

17:54

able to self control, by emotionally

17:56

regulate and learning those skills at

17:58

a young age, I think, are... powerful and

18:00

important. So I'm definitely not, you

18:03

know, a Luddite when it comes to this technology.

18:05

I think it's both coming, but it also has

18:08

a lot of promise for

18:10

humanity if done right, if implemented with the

18:12

right safeguards, if used in

18:14

ways that benefit individuals, I think

18:16

it can be incredibly transformative. That

18:18

poor word, if it carries so much

18:20

weight on its shoulders. It does. And unfortunately,

18:22

I, you know, I have to say

18:25

that because I, you know, I

18:27

am somebody who is deeply optimistic and

18:29

I want the good of this technology

18:31

for humanity, but, but I see the

18:34

risks and, and I see the risks,

18:37

you know, especially in this domain, because there

18:39

is really nothing more sensitive and fundamental than

18:41

what it means to be human than having

18:44

that space of inner monologue of

18:46

private thought of being able to

18:49

entertain and turn over ideas in

18:51

your own mind without fear of it being misused

18:53

by other people, accessed by other people,

18:56

commodified by companies, interfered with

18:58

by governments and, and the

19:00

potential of connecting

19:02

our brains to technology makes

19:05

all of those risks a possibility. So

19:08

just as a side thought, there's

19:10

the technology in and of itself,

19:12

the hardware, then there's the, you

19:14

know, the means by which we

19:17

can interpret it, right? The kind

19:19

of feedback it, the machines generate,

19:21

but how much confidence at this

19:23

moment do you have about the interim

19:25

phase, like the analysis of what

19:28

the brain, those EEG signals are

19:30

sending? Do we actually know and understand

19:33

how to read what the signals

19:35

are? Yeah, it's a great question. So,

19:38

you know, a lot of people ask me how

19:40

good are the devices? And my

19:43

answer to that is it depends on

19:45

what you're using them for. You know,

19:47

can it decode your literal thoughts? You

19:49

know, the true inner monologue that you're

19:51

having? No, both the

19:54

technology itself, like the electrodes, the

19:56

sensors, the hardware, have

19:59

improved over the past decade,

20:01

but there's still some noise and interference

20:03

and different people may have

20:05

them applied in different ways that aren't quite

20:08

the right fit to pick up exactly the

20:10

right signal, and there can be interference from

20:12

muscle twitches or eye blinks or other devices

20:14

in your environment because it's electrical activity that

20:17

it's picking up. And then

20:19

the software, the AI, I think

20:21

everybody knows that AI has been

20:23

having just exponential growth in its

20:26

capabilities. And what

20:28

we're picking up here from the brain

20:30

through these devices, what they're detecting

20:33

really is patterns of data, and those

20:35

patterns of data increasingly can be interpreted

20:38

and decoded with the sophistication of the

20:40

algorithms at play. So I think depending

20:42

on what we're talking about, it can

20:45

be very accurate and very good for

20:47

basic brain states like attention and boredom

20:49

and cognitive decline and stress, and are

20:52

you happy or are you sad? It

20:55

can be very accurate for probing

20:57

the brain for information through particular

20:59

signals of recognition in the brain, but

21:02

unless it's

21:04

implanted neurotechnology, there's not very

21:07

good real-time decoding

21:09

of speech, for example, even though that

21:11

is coming in many ways, and in

21:13

some ways we can talk about even

21:16

your intention to type or to communicate

21:18

or send a text message can be

21:20

decoded with this technology.
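To give a concrete sense of the "patterns of data" involved, here is a minimal sketch in Python with NumPy. The signal is simulated and the bands are the conventional EEG ranges; nothing here is any vendor's actual pipeline. It computes band-power features of the kind real decoders feed into trained models, plus one crude heuristic, the beta-to-alpha ratio, often read as a proxy for arousal or mental effort.

```python
import numpy as np

FS = 256  # sampling rate in Hz (a typical consumer-EEG value; an assumption)

# Simulate one second of a single EEG channel: a 10 Hz (alpha) rhythm, a
# weaker 20 Hz (beta) rhythm, and noise. A real pipeline would read this
# from the device instead.
t = np.arange(FS) / FS
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
signal += 0.3 * np.random.randn(FS)

# Power spectrum via FFT, then average power within conventional EEG bands.
freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
power = np.abs(np.fft.rfft(signal)) ** 2
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
features = {
    name: power[(freqs >= lo) & (freqs < hi)].mean()
    for name, (lo, hi) in bands.items()
}

print(features)
# A rising beta/alpha ratio is one common, crude proxy for mental effort;
# real decoders feed features like these into trained classifiers rather
# than relying on a single threshold.
print("beta/alpha ratio:", features["beta"] / features["alpha"])
```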

21:23

Intention. Okay. Well, intention, right? I'm going to say that because

21:25

there's like you thinking in your

21:27

mind and having a kind of moment

21:30

of self-reflection, and then you intending to

21:32

type something, which is speech

21:34

that you mean to go from your

21:36

mind out into the rest of the world. And

21:39

that has different representation in the brain. It's

21:41

easier to decode speech you

21:43

intend to communicate than that

21:45

inner monologue. Yeah. So this is

21:48

where we get into Minority Report territory, but we're going to hold that

21:50

thought, if I

21:52

can use that pun here for a

21:54

moment. Because now what I'd

21:56

like to do is sort of push

21:58

into the possible. The futures

22:00

that you think through. In

22:02

the book, The Battle for Your

22:04

Brain. Because we'll get to governments

22:06

in a few minutes. But I

22:08

think the most immediate place of

22:10

change we might see was hinted

22:13

at by Tan Le at the beginning

22:15

of our show. Because, yes, workplaces

22:17

would be very, very interested,

22:19

or are very, very interested, in

22:22

whatever means are already at hand

22:24

to make

22:26

workers work better, more efficient, what have you.

22:28

So if you don't mind, I will read

22:30

a little bit of a scenario that you

22:33

imagine here at the beginning of the book

22:35

and then you can sort of talk us

22:37

through the rest of it. So this is

22:41

what Nita Farahany says we might

22:43

be closer to than we think. It goes

22:45

like this: You're in the zone. You can't

22:47

even believe how productive you've been. Your memo

22:49

is finished and sent, your inbox is under

22:52

control, and you're feeling sharper than you have in

22:54

a decade. Sensing your joy, your playlist shifts

22:56

to your favorite song, sending chills up your

22:58

spine. As the music begins to play, you

23:00

glance at the program running in the background

23:02

on your computer screen and notice a now

23:05

familiar sight that appears whenever you are

23:07

overloaded with pleasure: your alpha brain

23:09

wave activity decreasing in the right

23:11

central and right temporal regions of

23:13

your brain. You mentally move the

23:15

cursor to the left and scroll

23:17

through your brain data over the

23:19

past few hours. You can see

23:21

your stress levels rising as the

23:23

deadline to finish your memo approached, causing

23:25

your beta brain wave activity to

23:27

peak right before an alert popped

23:29

up telling you to take a

23:31

brain break. But what's that unusual

23:33

change in your brain activity when you

23:35

sleep? It started earlier in the month.

23:38

You compose a text to your doctor in

23:40

your mind and send it with a

23:43

mental flick of your cursor: Could

23:45

you take a quick look at my

23:47

brain data? Anything to worry about? So

23:49

what happens next in your imagined

23:52

scenario here? So,

23:55

from there, there are

23:57

a number of different pieces, from the employer

24:01

looking at the brain data and sending

24:03

a message to the employees saying, congratulations

24:05

on your brain metrics over the past

24:07

quarter. And you know,

24:09

you've earned another performance bonus. You're

24:11

excited about that. You still have your earbuds

24:14

in, not thinking about all of the data

24:16

that you're giving to your employer as you

24:18

go home, jamming to

24:20

the music and having forgotten that brainwave

24:22

data is being collected at the

24:24

same time. And then you come to the office the

24:26

next day and a somber cloud has fallen over the

24:28

office and you discover that

24:30

the government has subpoenaed all of

24:33

the brainwave data, along with all

24:35

of the other information about employees

24:37

because they're looking for co-conspirators for

24:39

a crime. And

24:41

you know, and it's funny that, that scenario,

24:45

my, my brilliant editor at St. Martin's Press,

24:47

he invited me to write a

24:49

scenario that could really put it all

24:51

together in kind of

24:53

one easy to understand narrative.

24:56

You know, what's the full spectrum of this

24:58

from the promise, which is

25:00

your ability to do things like hone

25:02

your own focus and attention and track

25:05

your own brain activity and bring down your

25:07

own stress levels and have real time feedback

25:09

about when you're suffering from

25:11

cognitive overload to the risks

25:13

and the ways in which employers

25:15

are already using this technology where you

25:19

know, it's dystopian in what I describe

25:21

it as, I believe, of

25:24

having your brain be part of the

25:26

performance metrics. There's so much happening in

25:28

the workplace right now on productivity scoring

25:30

and you know, the, I

25:32

think over-surveillance of employees in ways

25:34

that really are not helping morale or

25:37

the dignity of work. And this, these

25:41

brain metrics are already being used by

25:44

companies, and increasingly so. And

25:46

then the frightening possibility, which we've already

25:48

seen with other kinds of personal health

25:51

data, whether it's Fitbit data or heart

25:53

rate data, which has been

25:55

subpoenaed by law enforcement and used in

25:57

criminal cases. And the idea that once

26:00

you open up your brain passively

26:02

thinking that you're using it to track your

26:04

own attention, that all is

26:06

fair game and can give

26:09

a lot of insights. The example that I

26:11

use in that scenario is that they're looking

26:13

for synchronization of brain activity between different workers.

26:15

It turns out when you're working with people,

26:17

you have higher degrees of synchronization

26:19

in your brainwave patterns and you

26:21

could actually use that to figure

26:24

out who's collaborating, who's developing a

26:26

union, who's working together, who you

26:28

wouldn't expect to see those patterns

26:30

of synchronization. So, as I start

26:32

to imagine all of that and

26:35

all of those scenarios are

26:37

possible with existing research

26:39

and existing technology, I

26:43

think it makes clear what the kind of

26:45

dystopian possibilities are of surveillance

26:47

of the brain.
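The research literature measures brain-to-brain synchrony in several ways; one of the simplest, sketched below in Python with NumPy on simulated signals (purely illustrative, not the method of any study Farahany cites), is the correlation between two people's EEG-derived time series, with higher correlation read, crudely, as more synchronized activity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated EEG-derived time series for three workers. A and B share a
# common component, as if engaged in the same task; C is independent.
common = rng.standard_normal(1000)
worker_a = common + 0.5 * rng.standard_normal(1000)
worker_b = common + 0.5 * rng.standard_normal(1000)
worker_c = rng.standard_normal(1000)

def synchrony(x: np.ndarray, y: np.ndarray) -> float:
    """Pearson correlation as a crude brain-to-brain synchrony score."""
    return float(np.corrcoef(x, y)[0, 1])

print("A-B synchrony:", round(synchrony(worker_a, worker_b), 2))  # high
print("A-C synchrony:", round(synchrony(worker_a, worker_c), 2))  # near zero
```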

26:49

Well, you even talk about in this imagined scenario

26:52

that maybe you might find a

26:54

coworker attractive and that would be

26:57

recorded. Your brain activity

26:59

of course. Because you can pick up those.

27:01

You can tell amorous feelings. These

27:05

inward and deeply held feelings

27:08

are not things you would want to reveal. I'll

27:10

tell you a funny story, Meghna, which is my

27:12

eight year old has her

27:14

first crush. She will be embarrassed that I'm sharing

27:16

this with you. With the world. With

27:19

the world. With the world. But

27:22

her friend apparently

27:24

has a crush on her as well. It's the

27:27

most mortifying thing possible to them that they

27:29

both have a crush. It's

27:31

the thing that everybody is teasing them about

27:33

even though we think it's darling and wonderful.

27:38

But that, imagine when you were

27:40

a child and you have these

27:42

early crushes which are so incredibly

27:44

formative. You don't want anybody

27:47

else to know and you're in a classroom required

27:49

to wear neural interface brain

27:52

wearables to track your attention and focus which

27:55

can pick up so much more information including

27:58

these kinds of amorous feelings. Those

28:00

are things you should be able to keep to

28:02

yourself. Those are things that other people shouldn't have

28:04

access to. Those are things that are so formative

28:06

to self-identity. And so when

28:08

I talk about these ideas of mental

28:11

privacy and the importance of this last

28:13

bastion of freedom, this last fortress, I

28:16

think it's the most important fortress. It's the one

28:18

that's most formative to who you are as a

28:20

human being. Yeah. You know, about

28:22

the workplace then, it seems like there's

28:25

two major sets of issues here.

28:27

One is A, how this technology

28:29

can have an impact on workers,

28:31

both positively and negatively.

28:34

But B, in terms of the economy

28:37

that we all function in, this all

28:39

sounds like surveillance capitalism potentially on

28:42

mega, mega steroids. And

28:45

so we've got about a minute and

28:47

a half before the next break, Professor Farahany. Can

28:50

you just walk us through a couple of

28:53

the major questions there where you think we should

28:55

be asking ourselves as a society right now when

28:57

it comes to commercial or

28:59

capitalism's use of this technology? Right.

29:02

I think the first and most important is, is

29:05

there any justified use of

29:07

brain metrics by employers? And I

29:09

outlined an example in the book

29:11

of commercial drivers who are already

29:13

having their brain activity monitored for

29:15

fatigue levels. And if

29:17

you were just measuring that, the only thing

29:20

that you were extracting through the algorithms and

29:22

brainwave data was whether or not a commercial

29:24

pilot or truck driver was sleepy or awake

29:26

and it was more precise than other kinds

29:29

of information, maybe those are circumstances

29:31

in which we might think it's a good

29:33

use of the technology. But

29:35

when you're using it to track attention

29:38

when periods of mind wandering are

29:40

punished rather than celebrated as the

29:42

most important moments of insights, when

29:44

you're introducing a more kind of

29:46

global surveillance of even what a

29:48

person is thinking and feeling, I

29:51

think that can be so undermining

29:53

for just the abilities for

29:55

humans to flourish, to feel

29:57

like they have trust in the

29:59

workplace. Who wants to continue to think

30:01

freely so that those are the kind

30:04

of worries that I have in that

30:06

context. So again, the need for

30:09

guardrails, legal and ethical guardrails, around

30:11

this. We're going to explore

30:13

more about what that means regarding,

30:15

you know, what governments might

30:17

be interested in when it comes

30:19

to being able to use

30:22

technology to understand what's going on in your

30:24

brain. So Professor Farahany, stay

30:26

with us for another minute. We'll be

30:28

right back. This is On Point.

30:32

I'm

30:42

Kathleen Goldhar, and I'm the host of

30:44

a new podcast, Crime Story. Every

30:47

week we bring you a different crime

30:49

told by the storyteller who knows it

30:51

best. The one with missiles can

30:53

be fail. Got another witness was murdered

30:55

who couldn't sugar coat the story? I

30:57

was getting calls from Cosby's

30:59

attorney threatening to sue every day. Every

31:01

crime in one way or another is

31:03

a reflection of who we are as

31:05

a people, as a city, as a

31:07

country. You find us wherever you get

31:09

your podcast. This

31:12

is On Point. I'm Meghna Chakrabarti. Today

31:14

we are talking with Professor Nita Farahany.

31:16

Her new book is The Battle for Your Brain:

31:19

Defending the Right to Think Freely in the

31:21

Age of Neurotechnology. And

31:23

just before the break, Professor Farahany had

31:25

talked about, you know, how there's a world

31:27

in which kids in school

31:29

classrooms might put on headbands

31:31

and teachers would be

31:33

able to measure how much they're focusing

31:36

or how much they're able to concentrate

31:38

on a given assignment. Well, she

31:40

wasn't just making that up because

31:42

that world actually exists. The Wall

31:44

Street Journal recently visited a classroom

31:46

just a few hours outside of

31:48

Shanghai to see how both AI

31:50

and brain computer interface technology is

31:52

being used in Chinese classrooms.

31:56

For this history class, as it begins, students put

31:58

on a brain-wave-sensing headband. The device is

32:00

made in China and

32:02

has three electrodes, two behind

32:05

the ears and one on the

32:07

forehead. These sensors pick up electrical signals

32:09

sent by neurons in the brain. The

32:12

neural data is then sent in real time

32:14

to the teacher's computer. The

32:16

student is moving back to the room. The

32:19

teacher can quickly find out who's

32:21

paying attention and who's not. The

32:24

student should get a special

32:26

notification.

32:32

A report is

32:34

then generated that shows how well the

32:36

class was paying attention. It even

32:39

details each student's concentration level

32:41

at 10 minute intervals. It's

32:44

then sent to a chat group for parents. Well,

32:48

that's from a Wall Street Journal documentary about

32:50

AI in China. And the journal's

32:52

reporters noted that it's not entirely clear

32:54

what the headbands are measuring or if

32:56

they're accurate, but you can

32:58

see the potential and the use

33:01

and purpose of this kind of

33:03

technology. Now, combine that with what

33:05

we already know about China's well-established

33:08

surveillance state that is carefully observing

33:10

its citizens. Here

33:13

in this Shanghai surveillance center,

33:15

no resident goes unwatched. Hundreds

33:19

of millions of cameras are installed

33:21

all over China. We

33:26

have algorithms that automatically recognize

33:28

certain behaviors. If someone is

33:31

wearing a mask, for example, we immediately

33:33

detect this wrongdoing. So

33:36

that's a little bit about China's

33:38

established surveillance state from the documentary

33:40

wing of the German broadcaster, DW.

33:43

Joining us now is Margaret Kosal. She's

33:46

an assistant professor and teaches international affairs

33:48

at the Georgia Institute of Technology. She's

33:51

currently on leave to the Savannah River National

33:53

Laboratory. Professor Kosal, welcome to you.

33:57

Thank you, Meghna. I'm very happy

33:59

to be here to talk

34:01

about emerging technology and international security. So

34:04

I started my work

34:07

as a PhD chemist and

34:10

having experience in the high-tech startup

34:12

field before I got to this

34:14

work. And by the way, I'm

34:16

an associate professor, not

34:19

an assistant professor. Oh, you know what? In fact,

34:21

I had the word associate written on my page,

34:23

but somehow my brain and mouth said

34:25

assistant. My apologies. If I'd been wearing the

34:27

right kind of device, maybe I wouldn't have

34:29

made that mistake. But

34:32

so go ahead, tell us a little bit more. And

34:34

China is an easy place for us to

34:37

focus because, again, they already have such an

34:39

established surveillance state. Do

34:41

we know how the Chinese government is viewing

34:44

the potential of this kind

34:46

of brain-computer interface technology? So there

34:50

is a whole wealth

34:53

of information to unpack there in your

34:55

question. And I do have to

34:57

start out by saying that while I am

34:59

currently a professor of international affairs at Georgia

35:02

Tech, I have to

35:04

be sure to convey that my views

35:06

do not necessarily reflect the positions of

35:08

the Department of Energy, Department of Defense,

35:11

or any other organization that I've been

35:13

affiliated with. Back to

35:15

your question. So some

35:17

of the empirical and quantitative work

35:20

that I've done has looked at

35:22

this question of likely

35:24

adoption of brain-computer interfaces, as

35:26

well as other neuro-technologies,

35:29

particularly by China in comparison

35:31

to the United States. So

35:35

first of all, understanding the

35:38

inner workings of the PRC

35:41

often is very difficult, but

35:43

China has been quite

35:47

articulate on some of its

35:49

aspects of where they're going with

35:51

what they call the China

35:53

Brain Project.

35:56

And particularly, they have

35:58

this articulated vision

36:01

of what they call one

36:03

body, two wings. So

36:05

that's for building the core and

36:07

developing the applications. And

36:10

some of this is

36:12

intentionally going to effective

36:14

approaches for diagnosis, intervention

36:16

of brain disorders, and

36:19

some of it is intentionally

36:21

going to more security implications

36:24

in terms of the types

36:26

of technologies. Okay,

36:29

excuse me. Pardon

36:31

me, but let's focus on that second

36:33

part because that's really where we wanted

36:35

to take the conversation. Can you just

36:37

give me what the maybe top concern

36:40

is amongst the national security

36:42

establishment here regarding China's view

36:44

of the potential of brain-computer

36:47

interface technology? So

36:50

there hasn't been articulated a

36:52

specific concern here within the

36:54

United States with respect

36:56

to any details. It

36:59

is in the broader concerns about

37:01

Chinese technology, Chinese

37:06

acquisition and the ability for them to

37:08

challenge us. One

37:10

of the concerns that in my

37:12

work we have articulated is

37:14

that because of the likelihood,

37:17

looking at these different factors

37:19

and studying them, not just

37:22

the technology but understanding how

37:25

different technology gets deployed,

37:27

it is more

37:29

likely for these

37:32

kind of technologies,

37:34

in particular BCIs,

37:36

to be deployed

37:39

and adopted first

37:41

in the PRC. Okay,

37:45

so now help me understand something. Does

37:48

China have this phrase

37:50

of the sixth domain? I've seen that floating

37:52

around. What does that mean? So

37:55

that's in reference to the war fighting

37:57

domain. So we have war fighting

37:59

domains in the United States

38:02

too. So the sixth war

38:04

fighting domain in China,

38:07

in the People's Liberation Army,

38:10

is the cognitive domain. And

38:12

the cognitive domain can be

38:14

split up into a number

38:16

of different pieces. The

38:19

biggest piece is things like information

38:21

warfare, which that can be

38:23

everything from misuse,

38:27

mischaracterization, disinformation, via

38:29

traditional propaganda to

38:33

use of the internet. But

38:35

it also can be things

38:37

that are targeting neuro-technologies, targeting

38:39

the brain, targeting the ability

38:42

to undermine the self. Wow.

38:45

Okay. So Associate Professor

38:47

Margaret Kosal teaches international

38:49

affairs at Georgia Tech and currently on

38:52

leave to the Savannah River National Laboratory.

38:54

Thank you so much for joining us

38:56

today. You're most welcome. Okay.

38:59

So Professor Farahany, just

39:01

give me your quick thoughts about what the China

39:03

example tells us we need to be thinking about.

39:07

So I think a couple of things. One is, we can

39:09

think about it from a national security perspective

39:12

in the United States. So the

39:14

Biden administration in late

39:16

December 2021 sanctioned

39:19

a number of Chinese companies for

39:22

creating so-called or purported brain control

39:24

weapons. And on this kind of

39:26

idea of the

39:28

cognitive domain, there

39:30

is both influence campaigns. This is what we're

39:32

worried about, for example, with TikTok and

39:35

shaping views and minds, but also

39:37

picking up biometric data and precise

39:39

profiles on American citizens. But

39:42

also this anxiety about a

39:44

kind of arms race and brain computer interface. And

39:47

that could be everything from the development

39:49

of kind of super soldiers. So there's

39:51

been a lot of talk about that.

39:53

There's even a conference that was just

39:55

recently held by the Commerce Department here

39:57

in the US with all of the

39:59

major implanted BCI manufacturers

40:01

about whether there should be export controls to

40:04

prevent China from using our technology

40:06

and what could be a race

40:09

for capabilities within the military. But

40:12

then beyond the kind of, you know,

40:14

domain of influence and military use, there's

40:17

been these anxieties around the

40:19

creation of weapons that could

40:21

disable or disorient minds. And

40:24

while the Havana syndrome, you

40:27

know, cases have largely been dismissed

40:29

by the intelligence community at this

40:31

point as being fueled

40:34

by or funded by foreign adversaries,

40:36

there's still a lot of worry

40:38

about that kind of focus of

40:40

developing kinds of technologies, whether it's

40:42

electromagnetic or microwave technologies that

40:44

could be aimed at human

40:46

brains and minds. And so, you

40:48

know, I think we need to worry about it from

40:51

a national security perspective. And then we

40:53

also need to learn from and worry

40:55

about government use

40:58

of the technology in surveillance

41:00

or in interference with freedom of

41:02

thought. And so I worry about

41:04

it not just from a, you know,

41:06

U.S. versus China perspective, but

41:08

what their example of the surveillance state

41:10

shows us of interfering with what

41:13

I think is the most important, again, aspect

41:15

of what it means to have human flourishing,

41:17

the ability to think freely. And whether

41:20

or not the technology works, if you're

41:22

required, whether in the workplace or in

41:25

everyday life, to wear

41:27

brain-computer interface technology that could

41:29

be intercepted by the government, the

41:31

informational asymmetry is usually so

41:34

powerful that you might be

41:36

afraid to even think bad

41:38

thoughts or dissident thoughts. So,

41:41

you know, I think their example teaches

41:43

us a lot about the risks of

41:45

the technology. That's right. And not

41:47

just in the national security context, just in our

41:49

own lives based on what governments

41:52

can do to their own people anywhere

41:55

in the world. I mean, look, we're already living in a world

41:57

where people are afraid to say certain things. I mean, I can

41:59

make... very much see a next

42:01

step being afraid to even think them.

42:04

So in the last few minutes of the conversation,

42:06

Professor Farahany, I mean the real purpose of your

42:09

book is, as you say, to get

42:11

us to start thinking about a new

42:13

aspect of freedom that we

42:16

need to incorporate into social

42:18

norms, into ethical guidelines, into

42:21

our legal structure. And

42:23

you call it cognitive liberty,

42:25

which includes mental privacy, freedom

42:28

of thought, and self-determination.

42:31

So tell us more about how you conceive of this

42:34

notion of cognitive liberty. Thank

42:36

you. That I think is part of

42:39

what gives me the optimism and the hope that we

42:41

were talking about in the beginning of the conversation. You

42:44

know, we're at the forefront of this

42:46

transformational moment with this technology, which really

42:48

is going to become much more ubiquitous

42:51

and part of our everyday lives. And

42:54

the question is, are we going to give up

42:56

our rights, our mental privacy, our

42:58

freedom of thought just as easily as

43:00

we've given up all of the rest

43:02

of our privacy in exchange for the

43:04

convenience of typing or swiping with our

43:06

minds? And I think we

43:08

at this moment, at this juncture, at this

43:11

earlier stage, have a choice to make to

43:13

change the terms of service and put it

43:15

in favor of individuals. I see

43:17

cognitive liberty as an update to our conception

43:20

of liberty, but in the digital age. It's

43:23

a concept that applies well beyond

43:25

just neurotechnology. It applies to how

43:27

we think about social media and

43:29

addictive technologies and neuromarketing and weapons

43:32

that are being designed to attack

43:34

the brain. And how

43:36

I think of it is both as a

43:38

legal but also a cultural and social norm.

43:41

As a legal norm, it would invite

43:43

us and require us to update our

43:45

international human rights to recognize

43:48

this right to cognitive liberty as

43:50

a civil and political right, which would

43:53

direct us to update three existing rights,

43:55

our right to privacy to explicitly include

43:57

the right to mental privacy; freedom

44:00

of thought to apply more broadly than just

44:02

religious freedom and belief, but to include a

44:04

right not to have our thoughts used against

44:06

us and not to be punished for our

44:09

thoughts and not to have our thoughts manipulated.

44:12

And self-determination, updating that

44:15

from a concept of what's really been

44:17

understood as a political and collective right

44:19

to an individual right to access our

44:21

own brains to be able to enhance

44:24

or change them and to determine how

44:26

we want to shape our own mental

44:28

experiences. Well,

44:33

I want to continue living in a world

44:35

where my last truly safe and protected

44:37

space is inside my own mind, right?

44:39

It's our final retreat, right? Yes,

44:41

yes. It is our last fortress. And it's

44:44

one I think we can't

44:46

afford to quietly let go. I

44:48

think it's so urgent that people

44:50

join the conversation and the call

44:52

to action now because it

44:54

will be too late to claw it back later. But

44:57

it isn't too late now to

44:59

really define the way in which this

45:01

technology will be integrated into

45:03

society and how our relationships with others will

45:06

be when it comes to the most precious

45:08

thing we have, which is our minds, our

45:10

ability to think freely. We have

45:12

one minute left, Professor Farahany. And there's something you

45:14

teased us with a little earlier that I'd love to

45:17

sort of close with. And that

45:19

is your own family background, your cultural background,

45:21

and how this plays into how

45:24

you're thinking about these technologies. So

45:27

I'm Iranian American. My parents

45:29

left Iran really a decade

45:32

before the revolution, but always intended to

45:34

go back. Weren't able to

45:36

as the political unrest occurred. But all of

45:38

my first cousins, all of my aunts and

45:40

uncles still live in Iran. And

45:42

I've grown up in a world in

45:44

which I understand and see people who

45:46

are afraid to speak freely, family

45:49

members who are afraid to tell

45:51

us what's happening for fear of

45:53

being persecuted. And that world, those

45:55

conversations, I think attune me to the

45:57

ways in which technology can be mixed use.

46:00

the ways in which surveillance can interfere

46:02

with people's ability to rise up, defend

46:05

their own freedom, defend their own rights. It's

46:07

a world I don't want us to unveil

46:10

through a kind of Orwellian future

46:13

of neurotechnology, but to ensure we

46:15

safeguard our right to think freely. Right,

46:17

but you also talk about how in the Iran example

46:19

you see the power of technology to mobilize people for

46:21

change. Yeah, yeah. All the Iranian women, for

46:23

example. Yes, and the Twitter,

46:25

you know, the use of Twitter during the

46:28

Green Revolution. There is hope and

46:30

there is peril, and we get to decide which

46:32

one we decide to champion in life. Well,

46:35

Nita Farahany, professor of law and philosophy

46:37

at Duke University, the book is

46:39

The Battle for Your Brain: Defending the Right

46:41

to Think Freely in the Age of Neurotechnology.

46:44

We have an excerpt of it at

46:47

onpointradio.org. Thank you so much for joining

46:49

us. Thank you. I'm Meghna

46:51

Chakrabarti. This is On Point. Thank

46:58

you.
