New York Times Journalists Jennifer Valentino-DeVries and Michael H. Keller on "A Marketplace of Girl Influencers Managed by Moms and Stalked by Men"

Released Wednesday, 10th April 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

Hi, everyone. I'm Brené Brown, and this is Unlocking

0:02

Us. Okay,

0:09

y'all, this is our fourth episode in

0:11

a series that I am calling Living

0:13

Beyond Human Scale, the possibilities, the cost,

0:15

and the role of community. And

0:18

this is going to be a really unique series

0:21

because it crosses over between Unlocking Us and the

0:23

Dare to Lead podcast. We're talking

0:25

about everything from mental health and social

0:27

media to how

0:29

do we get ready to work

0:32

with and not for

0:34

artificial intelligence. I think there

0:36

are a lot of possibilities for innovation and

0:38

really great change. And

0:41

we're getting pressed to live

0:43

really beyond how we are socially,

0:45

biologically, cognitively, and spiritually wired. This

0:49

conversation is tough today. I'm

0:53

talking with Jennifer Valentino-DeVries

0:55

and Michael Keller. They

0:58

are both award-winning journalists for

1:00

The New York Times, and

1:02

they wrote an article. It

1:05

appeared in The New York Times on the 22nd of February. The

1:09

title was A Marketplace of Girl

1:12

Influencers Managed by Moms

1:15

and Stalked by Men. And

1:18

I've been really interested in the last

1:20

year and a half, maybe two years,

1:22

about the

1:25

influencer economy. And

1:28

oh my God,

1:30

it's so nuts how

1:35

susceptible we are, how pissed

1:37

off we get. I

1:39

even found myself coming

1:42

across clips of me and the context of the

1:44

clips were cut off. And I thought, oh my

1:46

God, what is happening on

1:48

social media where everybody's got an idea and

1:50

a belief and everyone's selling you shit all

1:52

the time? It just made me crazy. So

1:56

we reached out to Jennifer and

1:59

Michael. again, the New

2:01

York Times reporters and asked if they would

2:03

talk to us about the investigation that led

2:05

to this article, what they learned, specifically

2:09

about these girls. These are basically young

2:12

girls, I mean young, like elementary school through

2:14

middle school, you know, not old enough to have

2:18

accounts on social media platforms their

2:20

accounts are managed by their moms, and they're

2:25

dealing with a ton of sexualized

2:28

comments from men. The

2:31

mothers are and parents are reacting

2:33

in a variety of ways From

2:35

oh my god, how did this get started? How do I

2:37

get out? to, well, that's what it takes to earn a

2:40

dollar the influencers

2:42

are you know, some of them are making

2:44

money Some of them are doing it for

2:46

apparel deals and it's

2:48

one small narrow Niche

2:52

part of the influencer economy, but

2:55

I think there's lessons there to

2:57

learn for all of us so

3:00

let me tell you a little bit about Jennifer and Michael

3:04

then we'll jump in. Support

3:14

for this episode comes from Viator. Experiences

3:17

are what people love the most about travel.

3:19

That's why Viator has over 300,000 bookable experiences

3:24

So there's always something for everyone

3:26

They offer everything from simple tours

3:28

to extreme adventures plus Viator's travel

3:30

experiences have millions of real traveler

3:32

reviews So you have the information

3:34

you need to book the best

3:36

activities for your trip Download

3:38

the Viator app now and use code Viator

3:41

10 for 10% off your first booking in

3:43

the app. One app, over 300,000

3:46

travel experiences you'll remember. Do more

3:48

with Viator. Apple

3:52

card is the perfect cashback rewards credit

3:54

card You earn up to 3% daily

3:57

cash on every purchase every day. That's

3:59

three percent on your favorite products

4:01

at Apple, two percent on all other Apple

4:03

card with Apple pay purchases, and

4:06

one percent on anything you buy with

4:08

your titanium Apple card or virtual card

4:10

number. Visit apple.co/card calculator

4:12

to see how much you

4:14

can earn. Apple card

4:16

issued by Goldman Sachs bank USA, Salt

4:18

Lake city branch, subject to credit approval,

4:20

terms apply. Jennifer

4:25

is a reporter on the investigative team

4:27

at the New York times where she

4:29

uses data analysis to explore complex subjects.

4:31

Her work frequently examines the far reaching

4:33

effects of technology on society

4:36

such as the spread of propaganda online

4:39

and the legal questions posed by digital

4:41

surveillance. She specializes in

4:43

collecting and analyzing data and using

4:45

it in combination with traditional reporting to

4:47

tell stories. She's been a

4:49

journalist for 20 years, first at the Houston

4:51

Chronicle and then, for a decade, the Wall Street

4:53

journal, where she covered technology, privacy, and computer

4:55

security. She joined the New York times in 2018. In

4:57

2022. She

5:00

was part of a team that won the Pulitzer

5:02

prize in national reporting for

5:04

coverage of systemic failures in

5:07

American policing that led

5:09

to avoidable deaths. She grew up in

5:11

Texas. She graduated from UT Austin where

5:13

she developed a love for journalism and reporting while she

5:16

was working for the student paper. She has

5:18

a master's degree from the school of public

5:20

and international affairs at Princeton. We'll

5:22

include a link to the primary article we're

5:24

discussing and links to where you

5:26

can find Jennifer's byline on different

5:29

stories. Michael is also a

5:31

New York times reporter who combines

5:33

traditional reporting with computer programming, often

5:36

investigating how technology affects society and

5:38

young people. He also worked

5:40

on the team that won the Pulitzer

5:42

prize. Michael

5:44

really gravitates towards topics that highlight people's

5:47

personal stories within the context of larger

5:49

national and international issues. Michael's

5:51

been a journalist for 15 years, has covered issues

5:53

ranging from politics to the environment. I

5:56

think you're going to find

5:58

this conversation. Disturbing,

6:02

frustrating, challenging, I

6:05

don't know if

6:07

it's provocative, but I will

6:09

say that there is a

6:11

very explicit conversation about the

6:13

over-sexualization of little girls, about

6:16

the predatory men who stalk them on

6:18

Instagram. So if you're sensitive to these topics,

6:21

I would maybe

6:24

skip this podcast, read

6:26

the transcript, whatever you can

6:28

do to take care of yourself. It's a tough

6:30

topic, but we need to talk about it because

6:32

it's happening all over, and I

6:34

don't think the platforms are doing very

6:37

much, if anything, to stop it. Let's jump

6:39

into the conversation. Jen

6:45

and Michael, thank you so much for joining us

6:47

on Unlocking Us. I appreciate y'all being

6:49

here. Thanks for having us. Yeah, thank

6:51

you so much. So we're

6:54

going to talk to Jen and

6:56

Michael, both New York

6:58

Times journalists on the investigative team. An article

7:00

was published on February 22nd,

7:03

the title, A Marketplace of Girl

7:05

Influencers Managed by Moms and Stalked

7:07

by Men. I

7:09

have been studying and looking at

7:11

influencer culture for the last two

7:13

years, and this

7:16

article, everyone

7:19

I know that's read it, including my team that helped me

7:21

prep, it just kind of took their breath away. And

7:24

I needed to examine some of my responses to it that I

7:26

have not liked, and so we can talk about that in

7:29

a minute. I'd like to start where

7:31

we always start, which is kind of tell us your

7:33

story. So Jen, would you like to go first and

7:35

just tell us how, about

7:37

the journey that landed you here? Sure.

7:40

Well, I was born

7:42

and raised in Texas, like

7:45

you actually, in San Antonio, and I

7:47

went to college at

7:49

the University of Texas at Austin

7:51

and eventually went to grad

7:53

school and studied public policy

7:55

and economics. And,

7:58

you know, I started reporting. on

8:01

technology sort of by happenstance. And

8:03

it was the reporting job that

8:05

was available during the great recession.

8:08

And I've been covering a lot of tech

8:11

issues. Over the course of my

8:13

career. And Michael

8:15

and I worked together on the

8:17

investigations team and we both have

8:19

covered technology a lot and also

8:21

use a lot of data in

8:24

our reporting. So we do similar

8:26

things but bring somewhat different approaches

8:28

to our reporting. So we enjoy

8:30

working together. Yeah,

8:32

I think that quantitative background I could definitely see it

8:35

in this article. Even as a social scientist,

8:37

I kept going back to the methodology and I

8:39

kept thinking, wow, somebody here

8:41

has got a background in data science.

8:43

So that makes sense to me. Michael,

8:45

how did you end up here? I

8:48

did not know I wanted

8:50

to go into journalism. I grew up in

8:52

Los Angeles and went

8:54

to college on the other side of the country

8:56

at Georgetown and studied comparative

8:59

literature and psychology, which

9:01

I similarly got started in

9:04

journalism kind of post-recession and

9:06

graduated really right when the recession was

9:09

going on. And so it

9:11

was not evident how those two majors

9:13

were gonna lead to a career in

9:15

anything. But a lot of my coursework

9:19

was around in psychology, people with

9:22

conditions like Alzheimer's or degenerative disorders. And

9:24

so it was a lot of interviewing

9:26

and writing case studies. I

9:30

thought the comp lit work

9:32

was also kind of about analyzing stories

9:34

from multiple perspectives. And when I was

9:36

thinking about what I wanted to do,

9:39

journalism just seemed really interesting and kind

9:41

of inadvertently had been

9:43

getting a lot of the practice that translated

9:46

into these kind of nuanced, complicated

9:48

stories. Around that

9:50

time, it was also when data visualization

9:53

and data journalism were starting up, and

9:55

that seemed really interesting to me. I

9:58

didn't have any kind of computer. programming background,

10:00

but it was fun and I

10:03

liked to do design. I did that for my

10:05

college paper. And that

10:07

was kind of what you could get a job

10:09

doing in that tough time, although it's still a

10:11

tough time in the industry. And

10:15

got attracted to reporting on technology companies, kind

10:17

of lent itself better to doing those types

10:19

of investigations, although I've covered a lot of

10:21

things beyond technology companies. And

10:24

yes, I think Jen and I both come at it from a perspective

10:27

of what's going to dive into really

10:30

kind of human centered focused stories, things

10:32

that matter because they're affecting people's lives.

10:36

But how can we build

10:38

a foundation or get into that by collecting

10:40

a lot of data, seeing what's going on?

10:43

Most stories, I think, operate

10:46

on this kind of low and

10:48

high altitude, or you want to zoom

10:50

in and see what's happening to one

10:52

individual and really get the reader to care about

10:54

their story. But then you also

10:56

need to zoom out and say, you know, there's also thousands

10:59

of others like this person. And so I think those

11:01

two skill sets, I think, go together really well. You

11:04

definitely see it in this article. It's interesting

11:06

because I think the subject matter is

11:09

so hard in places in this article

11:11

that I found myself kind of tapping

11:13

out of it and looking at it

11:15

more analytically, like, wow, this is the

11:17

best of when qualitative and quantitative

11:19

research comes together because I

11:23

understood the scope and breadth, and then I

11:25

also felt deeply the story. So

11:28

it's a really incredibly reported

11:30

investigation. So congratulations to

11:32

both of you. Thanks. So

11:36

for this investigation, Jen and Michael

11:39

analyzed 2.1 million

11:41

Instagram posts, monitored months

11:43

of online chats of professed

11:46

pedophiles and interviewed over 100

11:48

people, including parents and children.

11:53

And I consider myself to

11:55

be, like, pretty tapped in. I did

11:57

not know this phenomenon existed. Explain

12:01

to me this idea of primarily

12:04

mothers running young

12:06

young girls accounts, primarily

12:10

on Instagram, for

12:13

compensation. And

12:17

how did we get here? I

12:20

didn't know until we started looking

12:22

into this. And what

12:25

led us to this

12:27

story, I think, is

12:30

helpful in understanding the genesis of

12:32

what we did and then we can get

12:34

a bit into how we as a society

12:36

got here. So we've got two stories. How

12:39

we as journalists arrived at this and how just

12:42

overall we've gotten to this point. Michael

12:44

had actually been reporting on the

12:47

spread of child sex abuse material

12:49

online prior to the pandemic even.

12:52

And I was interested in

12:54

expanding on his work and

12:57

so spoke with a long-term source

13:00

of mine who was at Stanford

13:02

about the issues with child

13:05

sex abuse content, which some people

13:07

refer to as child pornography, but

13:09

it's not really pornography in the

13:11

consensual way that most people think of,

13:13

obviously. So then

13:16

that source said some really interesting

13:18

things. They're an expert in online

13:20

security and safety and they said

13:22

that child sex abuse

13:24

material is the worst thing really

13:26

that can happen online. And

13:29

so because it's so devastating, it's clearly important

13:31

for us to pay a lot of attention

13:33

to it, but that there

13:36

are other issues online

13:38

that are also worth paying

13:40

attention to that are not

13:42

quite as devastating, but they

13:45

are worth trying to solve

13:47

and exploring because even though

13:50

they aren't as damaging as

13:53

actual child sex abuse, they are

13:55

likely to affect far more

13:57

people, and that among

14:01

his concerns was the effect

14:04

of social media and influencer

14:06

culture on girls and young

14:10

women and kind of their

14:12

psychological space and their view

14:14

of themselves and their behavior

14:17

in relating to the world. And

14:20

so I was curious and I

14:22

started looking up tween influencer, tween

14:24

model, you know, on different social

14:26

media platforms. And these accounts

14:29

showed up. This was prior to

14:31

the pandemic, events intervened in our

14:33

ability to report on it. But

14:36

those accounts were around even then

14:38

and they were run by the

14:40

parents. Usually it said

14:43

mom run or mama managed. And

14:46

the girls were very often in

14:48

high heels or skimpy clothing

14:51

or even just midriff-baring

14:53

clothing and short shorts, bathing suits,

14:55

leotards, that kind of thing. And clearly

14:57

the followers were male. And

15:00

sometimes were saying either creepy things

15:03

or even if it wasn't outright

15:05

sexualized, they would be leaving heart

15:07

eye emojis, which I

15:09

did not think was particularly

15:11

appropriate for an adult male to leave

15:13

on a child's photo. And

15:16

so that was how Michael and

15:18

I got to this point. I

15:22

think we've gotten

15:24

here as a society through

15:27

several ways that we have explored and we're

15:29

continuing to explore. But I think a big

15:31

part of it is just this

15:34

influencer economy and how

15:36

people see on social media

15:40

that this is a

15:42

viable career path. It

15:44

is something that kids are

15:47

being encouraged to do, not

15:49

even necessarily by their parents, but by their

15:51

peers and by people they see online. And

15:54

even for people who aren't wanting

15:56

to be influencers, the idea of...

16:00

benefits, even those that aren't monetary,

16:02

that you get from being

16:04

online, the sense of approval that

16:07

you get is really powerful and

16:09

I think I know Michael

16:11

has some other ideas There are a lot

16:13

of factors that are coming together that have

16:15

created this moment just

16:18

add to that echo everything that Jen said

16:20

the barrier to entry like we've

16:22

seen in a lot of industries

16:24

has come way down and and

16:28

For most the parents that we spoke to they

16:30

explained their journey in terms

16:34

like well, we're already doing dance and

16:36

gymnastics so we may

16:38

as well become an ambassador for a

16:40

leotard company that we like and It's

16:43

just very easy to set up an

16:46

account and get followers, you know

16:48

in the language of

16:50

Silicon Valley It's the frictionless

16:52

experience and I think in

16:55

our reporting not just on this topic, but you know a

16:57

lot of technology Investigations

16:59

we work on we explore kind

17:01

of the unintended

17:04

or unnoticed

17:07

Consequences of those kind of increasingly

17:09

frictionless experiences What are the harms

17:12

that get introduced when it becomes

17:14

easier to grow at scale for

17:16

anyone to grow at scale? Including

17:19

children. So I think it's all

17:21

a part of these larger dynamics where

17:23

it previously, you know,

17:25

we've we've always had

17:27

kind of a phenomenon of pageants

17:30

and stage parents and This

17:33

just makes it both much

17:35

easier for more people to do and

17:38

you have the culture telling you Yes

17:40

This is a viable way

17:42

for your child to get a career later on

17:45

and not just that if she's

17:47

not doing it She's gonna be missing out on

17:49

opportunities. We heard that over and over

17:51

again from parents. I Was

17:53

really surprised in one of the reader

17:55

choice comments on the article Someone

17:58

wrote a mother wrote that

18:01

I think their child was involved in dance or

18:03

gymnastics and the mother wouldn't

18:05

allow the child to have a YouTube channel or

18:08

an Instapage and how devastating

18:10

that was for the child because all

18:12

of their peers had kind of

18:15

these influencer pages and channels

18:18

and they were talking about a very young child.

18:20

I mean I pulled the stat from your investigation.

18:24

Nearly one in three preteens

18:26

list influencing as a career goal and 11%

18:28

of those born in Gen Z between

18:32

1997 and 2012 described themselves as influencers. Yeah. Like

18:38

this is a new career category. Yeah.

18:42

You know I think something that a

18:44

lot of the parents we interviewed expressed

18:46

to us was that this

18:48

was really something that their child

18:50

wanted to do, that their child

18:52

was driving and they felt

18:55

that they were supporting their child

18:57

in wanting to do this. I

19:01

mean I have to say

19:03

that like the

19:05

moral outrage that I experienced I'll just I'll

19:07

own my own stuff when I read it

19:10

and kind of like so

19:13

mothers are running accounts it seems to be

19:15

far, I just want to make sure that

19:17

I'm using that language specifically, much more

19:20

often mothers than fathers. Is that the case?

19:22

Yes. Yeah. So

19:24

primarily mothers are running an account with

19:27

young girls as young as elementary

19:30

school, right? Correct. And

19:34

I mean there's a

19:36

callout box here: parents are the driving

19:38

force behind these accounts some even offer

19:40

the sale of photos, exclusive chat

19:42

sessions and some are even

19:44

selling the girls' worn leotards to

19:47

mostly unknown male followers. True

19:49

or not true? Correct. Yes

19:51

that's true. Okay

19:54

well I mean holy

19:57

shit like okay so I have

19:59

to say that that my first reaction was

20:01

so terrible. Because

20:06

my first reaction was these mothers

20:09

are the worst human beings alive. The

20:11

second group of people I took to task

20:13

were the platforms. And

20:16

I almost felt like when I pulled back,

20:18

I was just assuming, of course the men

20:20

aren't gonna be held accountable, because this

20:22

is just gonna be, this is just,

20:25

this is reality. Right. Do

20:27

you know what I'm saying? Yeah, no, of

20:29

course. I think that these are all questions

20:31

that we ourselves grappled

20:35

with. And really?

20:37

Yeah, well, I mean, we

20:39

and our editors, and I

20:41

mean, honestly, I feel that

20:43

those are natural reactions. I,

20:48

however, and I think I responded

20:51

to some reader comments on the story too,

20:54

trying to kind of express though,

20:56

that I think it's important not to let

20:58

them off the hook. Right. To

21:01

just assume that men are gonna

21:03

be horrible, and it's the responsibility

21:05

of everybody else to work around

21:07

that. There are ways in

21:09

which that is true, especially when a

21:11

parent has a responsibility to

21:15

protect their child, and honestly, the platform has

21:17

a responsibility to if they're saying

21:19

that it's safe for

21:21

young people to go on here with

21:24

parental supervision if they're under 13,

21:27

or they say that it's safe for anybody

21:29

13 or above, you don't even have to

21:32

have a parent managed account, the

21:34

platforms have a responsibility

21:37

to their consumer base

21:39

to ensure that safety, if

21:42

that's something that they are promising. That

21:45

aside though, the fact

21:47

that the men are

21:50

doing this should not be dismissed

21:52

or taken for granted. And

21:55

I think that there

21:57

are a lot of questions. for

22:00

us about how this kind of

22:02

behavior is expected and also how

22:04

technology has made it, Michael

22:07

was talking about making things more

22:09

seamless, while it has also made this

22:14

kind of consumption of

22:17

children's images or this sort of

22:19

treatment of women who are online

22:22

also seamless. And we

22:25

monitored a number

22:27

of chats on Telegram

22:29

groups. What is

22:31

Telegram before you go? Let me stop you because

22:33

I had not heard of that. I'm glad I

22:35

haven't heard of it. I think I'm glad. But

22:38

what is it? It is

22:40

an encrypted platform that is more for

22:45

personal use. It's a bit like WhatsApp.

22:47

I think more people are probably familiar

22:49

with WhatsApp. Okay. It's encrypted though. And

22:51

so it can be harder to find

22:54

groups. It's not like you can just

22:56

go and search the way

22:58

you can on Instagram. And it's harder

23:00

for regulators or law enforcement or whoever

23:03

to get a handle on what's going

23:05

on there sometimes. So

23:07

you're on this platform and you're kind

23:09

of doing a content analysis and investigating

23:11

inside this platform. Right. We were

23:14

just monitoring some of these chats

23:16

that we had heard about during

23:18

the course of our reporting as

23:20

places where these men gathered to

23:23

discuss these child influencers

23:26

who were on Instagram. And

23:28

they would talk about how great

23:31

it was. They would use

23:33

this to encourage each other

23:36

to justify their behavior because

23:38

it was on Instagram. And

23:40

so I think it's

23:42

clearly not a healthy thing

23:45

to have that kind of

23:47

behavior be encouraged and normalized.

23:51

It was fascinating in the reporting process

23:53

of being able to talk

23:55

with parents and hear them grappling with

23:57

it in real time. I think in One

24:00

of the first interviews we did, we weren't

24:03

sure how they were going to go. Are parents going to

24:05

open up to us about this? We're

24:08

having kind of a pretty normal conversation like, tell us how'd

24:10

you get started? And it was like a timeline,

24:13

and we end up asking, you know, what

24:15

do you think other parents should know

24:17

about this? And the answer was something like,

24:20

this is the worst thing you could do. And

24:23

it was kind of threw us for a loop. We're

24:25

like, wait a minute, wait, back up a second. Do

24:27

you think it's terrible, but you've told us how you're

24:29

doing it? Explain this

24:32

conflict. And a

24:34

lot of parents explain that, not just this one mom, but it

24:36

was this, they felt like they were in a bind. And

24:39

they often use the term of,

24:42

this is a digital resume for

24:44

our child and we kind of have to do

24:46

this. And there was a real spectrum.

24:49

There were people that were getting

24:51

really tangible things out of it. I think

24:53

that that family was

24:55

getting legitimate dance

24:58

opportunities to go at

25:00

shows and things that were real and their

25:02

daughter was really enjoying doing. Other

25:05

parents though were just kind of doing

25:07

it because their daughter enjoyed getting free

25:09

clothing and toys or fashion

25:12

accessories. In a lot of

25:14

cases, they

25:16

paid to be a part of these

25:18

ambassadorships or they got discounts. They

25:21

paid by buying the apparel. And

25:24

then kind of at the far end of the spectrum and some of the

25:26

things that you cited from the piece, some

25:29

were selling photos and used clothing.

25:32

There was definitely a big range. And

25:35

we had thought that perhaps there

25:37

was more of a financial incentive for

25:39

more people, but for a

25:42

significant number, there wasn't this idea of, oh

25:44

yeah, we're making tens of thousands or even

25:47

thousands of dollars. And that's why we're doing

25:49

it. It was a lot more kind

25:51

of nuanced and we had to kind of talk with

25:53

each family individually of kind of, what

25:55

is the motivation? What are you doing this? One

25:58

thing that we also really heard a lot. in how

26:00

they grappled with this question was most, pretty

26:03

much everyone I think thought that what

26:07

they were doing was safe or how they were

26:09

going about it was safe. Almost

26:11

everyone brought up the safety concerns

26:13

and that there were creepy men

26:16

that would message them or try and contact

26:18

them other ways, but they thought

26:20

that they had put boundaries on it, such as,

26:22

well, my daughter isn't

26:24

the one using Instagram, I'm the

26:26

one running the account. And

26:28

so for us, we thought that's really interesting, how do we make

26:31

sense of this? Is that safe? Does

26:33

that put enough distance between

26:35

the child and these men online? And one

26:37

of the really surprising things was we found that

26:41

even in those scenarios where the

26:43

daughter is not on Instagram themselves,

26:48

there were men online who would

26:50

try and do a form

26:52

of blackmail, although blackmail is not quite the right

26:54

word, they would reach

26:56

out to the

26:59

family's school, the daughter's school, and accuse

27:02

them of

27:04

producing explicit imagery, even

27:06

though it may not have been the case, and

27:08

cause all these real world issues

27:10

for them, or they would show

27:13

up at their home and

27:15

leave quote unquote gifts. So

27:17

our process of grappling with was hearing what

27:19

people were saying and then

27:21

going and testing that and saying, okay, is

27:24

that true? What is safe? And there

27:26

were a lot of real world harms

27:29

that were happening, and it was very

27:31

difficult to say, oh yes, you can do XYZ

27:33

and have your child be quote unquote on

27:36

social media in a safe way. It really

27:38

was a lot more dangerous and darker than

27:40

we thought. Tell me if

27:42

this is true in my read. It seems

27:44

like there's a real kind of scale of

27:47

what people are doing. Not every mom that's running

27:50

a kid's influencer page is selling used clothing, right?

27:52

There's a real scale, right? But

27:56

it seems like for young girls who are featured

28:00

on these

28:02

influencer accounts, especially where

28:05

we're talking about apparel,

28:09

men are typically an

28:12

issue. Is that the case? Is

28:14

there anyone who has figured out, like, hey, this

28:16

is a wholesome, creepy

28:18

guy-free influencer

28:20

project. We've never seen it. We've never had a

28:22

problem. We don't have one person in our

28:24

comments that's bothering us. No,

28:27

I would say that your assessment there

28:29

is spot on. You

28:31

were talking about your feeling of moral

28:33

outrage earlier. A lot of people felt

28:35

that way. And it is important to

28:37

keep in mind that not all of

28:39

these parents are selling used leotards. Right.

28:41

The vast majority, I should say, are

28:43

not doing things like that.

28:46

The vast majority are not selling

28:48

private images or images in string

28:50

bikinis. Those are definitely a factor.

28:52

And I think it's important to

28:55

highlight that. But most

28:58

of the parents are trying to keep

29:00

men off, but

29:02

they are still getting a lot

29:04

of men trying to comment and

29:06

follow like their

29:08

posts. And although

29:11

we didn't attempt to quantify these,

29:13

even if you have child accounts

29:15

that are the sort where

29:17

it's like the kids are unboxing

29:19

toys and that kind of thing,

29:22

they also get

29:24

creepy men following them.

29:27

It's not perhaps to the

29:29

degree, like the accounts

29:32

that we were focused on were

29:34

the ones that were only of

29:37

children. We had criteria

29:40

that they had to meet to be

29:42

counted in our methodology here.

29:44

So, you know, they had to

29:46

have multiple pictures of the child

29:48

in form fitting or revealing clothing.

29:50

And that could range from a string

29:53

bikini to yoga

29:55

pants, like yoga leggings,

29:58

and a cropped kind

30:00

of bra yoga top. The

30:03

ones we were studying were most likely to

30:05

attract men, but I think that any

30:08

child with a public account is

30:10

bound to, from some

30:12

time to another, get a

30:14

creepy male follower. Michael, I'm seeing you

30:17

shaking your head yes. Yeah,

30:19

I think how you started this interview

30:21

is often how we start them. We just asked

30:23

parents, tell us how you got

30:25

into this and let them tell

30:27

a chronology. We didn't go in

30:30

saying, tell us about all of your safety concerns

30:32

or tell us about all the problems that you're

30:34

having with the platform that would be

30:36

a bit kind of like leading the witness type.

30:38

So we just say, tell us your story. And

30:41

I think every single parent brought up these

30:43

kind of creepy men, comment, safety issues on

30:45

their own. I have

30:47

to say that there's a part of

30:49

me that believes at the very best,

30:51

while I know that not everyone is

30:53

doing the extreme things, it

30:56

still feels like, for me, a

31:01

range of commodifying

31:03

children. Yeah,

31:06

and we wanted to look at that and bring

31:09

a bit of data

31:11

or kind of test that creatively.

31:15

And so one of the things that we did was we

31:19

had this universe of 5,000

31:21

accounts that Jen

31:23

and I and a few other colleagues

31:25

spent weeks manually going

31:27

through. We had kind of an automated crawler

31:30

that collected some that matched the

31:33

criteria that we talked about where it was like

31:35

the form fitting, revealing clothing and listed that the

31:37

parent was managing it. And then once

31:39

we had that 5,000, that's where we got

31:41

those 2.1 million posts. And

31:45

so we took those images and

31:48

fed them through two different

31:50

image classifiers. So these are

31:52

AI type machine learning systems

31:54

that Google and Microsoft offer,

31:56

just a part of their

31:58

cloud computing services. And

32:01

you give it an image and it

32:03

gives you back an answer whether it thinks it's, quote unquote, racy or not. And so we had a very large sample we could feed these images into and compare it to the number of likes and comments and follower count from that profile. And we did see a correlation; there was a spectrum. The ones that were posting a larger percentage of these racier photos did get higher engagement.
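[Illustrative aside: the analysis described above can be sketched in a few lines of Python. The sketch below assumes the Google Cloud Vision SafeSearch annotation, whose response includes a "racy" likelihood, as a stand-in for the classifiers mentioned, plus a hypothetical posts.csv listing account, image path, and like count. The reporters' actual pipeline, second classifier, and data are not public, so this is an assumption-laden illustration, not their method.]

# Minimal sketch under the assumptions above; not the reporters' pipeline.
# posts.csv is hypothetical, with columns: account,image_path,likes
import pandas as pd
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def racy_score(path: str) -> int:
    """Return the 'racy' likelihood for one image (0 = UNKNOWN .. 5 = VERY_LIKELY)."""
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    return int(annotation.racy)

posts = pd.read_csv("posts.csv")                      # hypothetical input file
posts["racy"] = posts["image_path"].map(racy_score)   # score every post image

# Per account: share of posts rated LIKELY or VERY_LIKELY racy vs. average likes,
# then a simple correlation across accounts, mirroring the comparison described.
per_account = posts.groupby("account").agg(
    racy_share=("racy", lambda s: (s >= 4).mean()),
    avg_likes=("likes", "mean"),
)
print(per_account["racy_share"].corr(per_account["avg_likes"]))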

32:37

There was a group where we heard this repeatedly from parents: the first thing they do in the morning and the last thing they do at night is go through comments and try to block accounts, block creepy men. On the other end of the spectrum, it seemed like there were some that were able to fend them off to a larger degree, and that did bear out in the data. The data demonstrated to us that, yes, if you were running one of these accounts and really did spend a lot of time every day blocking men, the reach of your account was limited. And what we saw in this data was, as the accounts got larger, they tended to have more male followers, and this supported what we were hearing in the interviews, which was that if you spent a lot of time blocking men, your reach was limited; you hadn't grown your followers. And then

33:40

at some point, their images would go viral, and parents would become overwhelmed by the number of people following their child, and they couldn't keep up with this amount of blocking. Or, on the other side of the coin, they might just decide that they wanted their child's account to have a lot of followers.

34:04

And so if you want that, you sort of have to accept these male followers, and that's another decision point. You of course have the decision to go online at all, you have the decision to represent certain types of clothing or not, and you also have the decision to accept followers or not that are male, or really only restrict them to other parents and kids who are in the world that your child is a part of, whether that's dance or gymnastics or something else. This is the decision that some of the parents were willing to make because they wanted their child's account to grow, or they just felt that they didn't have time to keep up with it and still wanted to keep the account up anyway.

34:58

Some people just said, oh yeah, that's just a number on the internet, we don't really have an emotional attachment to it. We keep a distance to it. We don't get too involved. It got too overwhelming, or they just had other goals in mind.

35:14

We talked already about how parents are grappling with it, but there were other actors in this ecosystem who were totally fine with opening the floodgates. We spoke with one very small clothing company in Florida who said, yeah, you know, I do get creepy followers, but I need to have a large following if my business is going to be successful. So as long as they're not really being super explicit on there, being polite and not overly sexual, I keep them on there because I need the numbers. So we're trying to explore this kind of how did we end up here, and I think those types of incentives are really at

35:56

play. This

36:05

message comes from Apple Card. Earn up to three percent daily cash back on every purchase, every day. Then grow it at 4.50 percent annual percentage yield when you open a savings account with Apple Card. Visit apple.co/card calculator to see how much you can earn. Apple Card subject to credit approval. Savings available to Apple Card owners, subject to eligibility. Savings accounts provided by Goldman Sachs Bank USA, Member FDIC. Terms apply.

36:39

Last fall, the National Association of Realtors released data showing that housing affordability had fallen to its lowest point since the 1980s. For reasons of the economy, there just aren't a lot of home sales going on, and there's really kind of a standoff right now between buyers and sellers. That standoff leads to another kind of standoff. Listen to Today, Explained's two-part series on how it might not be you, it's our housing reality, and how the American Dream is in your childhood bedroom. Series sponsored by Mint Mobile. Today, Explained, in your feed every weekday.

37:48

These topics are discussed in this podcast as part of a series that I'm calling Living Beyond Human Scale: the cost and the possibilities. And this is definitely a problem of living beyond human scale, of parenting beyond human scale. As pissed off as I got, really, I'm not sure a single mother that I read about, I'm not convinced that they would do this in real life, that they would dress their children in this outfit and have them parade around actual men that are gawking and yelling lewd things. I think this is a beyond-human-scale problem, unfortunately, where it's pretty clear that the psychological and self-worth ramifications are very IRL. You know, and so I have to ask this question, because it came up a lot, and in research we would call it an ecological fallacy if I assign it to the wrong thing. There's a lot of mention from the parents, when they're talking in this article, about dance, cheer, and gymnastics. Is the commonality not the type of parent but the tight-fitting clothing? Do you see what I'm asking, like why is it hovering around

38:58

this. A big

39:00

commonality is the type of clothing, yes. That is what is most likely to attract these men who are interested in children in a sexualized way; the more sexualized the clothing is, the more they are gonna be interested in those images. But it also seems that there are links, actually, between that and the activities in which that kind of clothing predominates. So we

39:35

did have parents of child actors, child models. They are definitely part of our universe, and I do think that those types of pages attract adult men as well, particularly if the child is modeling in an adultified way or just made to look more grown up. But gymnastics, competitive cheerleading, competitive dance, these kinds of activities, for whatever reason, and I would actually add some child pageants, we had some pageant parents as well, they seem more likely to be sexualizing these girls, and also the outfits are sexualized. We did not have a lot of, say, child swimmers who are wearing tight-fitting clothing, but for whatever reason they're not modeling, and I wonder, does that mean...

This is a question and I don't know the answer, and I know that you probably don't have the data to support this answer empirically, but one hypothesis of mine would be: as a former swimmer, we were never in makeup. In some sports, it has been normalized to have a full face of makeup at five or six years old, whereas a sixteen-year-old swimmer out of the pool has got green hair and no makeup, and down the line, you know, it's not an appearance part of the sport norm, the athletic norm, I

41:01

think in some ways. That makes

41:03

a lot of sense. I think

41:05

I was really interested in the

41:08

concept of the sexualization of girls' activities. Even if you think about girls' volleyball, or beach volleyball in particular, right? Yeah, the women's version of a lot of sports has much tighter-fitting, skimpier clothing, and I think that it is a cultural phenomenon worth interrogating. But the point you raise about these particular activities: not only are you wearing those types of clothing, but it's common to have a full face of makeup, and the design of the clothing seems very important. There's a lot of stylization that goes on there, and so I think there's an interesting intersection that's happening here. We didn't report out all the details of this, but it would be a fascinating academic study; I think it's really ripe for people

42:03

to take a look at. It's interesting too, I think, because, I mean, I've seen it, I know, I swam, and I know a couple of other sports. There are influencers in a lot of sports, whether or not there's an over-sexualization. You're talking about apparel, so there seems also to be a tie between the apparel companies that sell into dance and cheer and those that are looking for child influencers. Micro-influencers, we know, are economically very, very powerful. Yeah, but a swimsuit company, a Speedo, I would imagine, is looking for a badass, probably a UT Austin swimmer who broke a record, as an influencer. They're looking differently at apparel and performance.

42:48

Does that make sense? It

42:51

does, yeah, and I think, I get it, it's not something that we really looked at. I don't feel like I can provide any evidence as an answer to that. I think these are all super interesting theories, and it sounds like they could have some validity. We just don't know based on our reporting.

And I don't know either. For caveat, just so listeners know, these are hypotheses that I'm drawing, and I'm asking away, and they are hypotheses, not findings. Michael, what would you add?

43:20

We did pose this question to some choreographers and, like, dancers. We didn't really do a deep dive into this, but it did come up, and some of them were equally concerned, where they would say, yeah, we get some very young girls who come in here, and just the way that they move, the way they work, is just not age-appropriate for them. And I think one of the challenging things with this story is it's a very visual story, but we didn't want to be also publishing photos of the children; that was a very difficult question of how do we manage that, and we're losing things that we saw that were just going to be impossible to describe in a print piece, but the website does include the photos. But beyond just

44:02

the apparel, there was something

44:04

very kind of adultified and sexualized in

44:06

many of just their facial expressions and

44:09

poses, even beyond just dance poses in general or

44:11

a lot of stretching and things

44:14

like that, but just there

44:16

was something even that went a step further

44:18

in a lot of this. That seems like

44:20

some different behavior that you're

44:22

seeing from the adult world cast

44:24

down into this younger age group.

44:27

I think also plays into this dynamic of

44:29

not just doing dance, but doing

44:32

dance in a particular way in communication

44:34

with other parts of our culture that

44:36

are more adultified. Yeah, I mean,

44:38

this is where we also get into social learning theory.

44:40

This is where we get into, wow,

44:43

this pouty sexualized post

44:46

got twice the likes of

44:49

me laughing like a

44:51

13-year-old normally would. And

44:53

then this incredible hardwired

44:55

DNA central part of

44:58

social learning that says, I'm

45:00

more loved, there's more belonging, and

45:03

I'm more valued. Yeah, absolutely.

45:05

Yeah. You see these

45:07

poses from adult influencers,

45:10

adults on social media, adult

45:13

women. You think about

45:15

the Kardashians or a

45:17

whole host of

45:19

women who have made

45:22

names for themselves online. Yeah. And

45:24

it is this very pouty, winking

45:27

over the shoulder kind of

45:29

thing, very adultified. And

45:31

I think

45:34

if you just think of that, it's

45:38

maybe a disturbing thought, but you think of

45:40

that just translated down to like a nine-year-old

45:42

child. Yeah. That's what we're talking about. And

45:47

when you're that age, you aren't

45:49

gonna associate that with sex or

45:52

understand exactly, no, what the

45:54

subtext is, you just

45:56

see that this is what adults are doing online.

45:58

This is what older girls, young

46:00

women who you might look up to

46:03

in some way are doing

46:05

online. So when somebody asks you to do

46:07

a post like that or take a picture

46:09

like that, you might not even know what

46:12

it means when people say a creepy old man or

46:15

what exactly the implications

46:17

of that are. You

46:19

just see that a lot of people are liking you,

46:22

basically. Your

46:32

brain needs support and new Ollie

46:34

Brainy Chews are a delightful way

46:36

to take care of your cognitive health.

46:38

Made with scientifically backed ingredients like Thai

46:41

ginger, L-theanine, and caffeine, Brainy Chews support

46:43

healthy brain function and help you find

46:45

your focus, stay chill, or get energized.

46:47

Be kind to your mind and get

46:50

these nootropic chews at olly.com. That's

46:52

o-l-l-y.com. These statements have not

46:54

been evaluated by the Food and Drug

46:56

Administration. This product is not intended to

46:59

diagnose, treat, cure, or prevent any disease.

47:01

Hey there! Did you know Kroger always

47:03

gives you savings and rewards on top

47:05

of our lower than low prices? And

47:07

when you download the Kroger app, you'll

47:09

enjoy over $500 in savings

47:11

every week with digital coupons. And don't forget

47:13

FuelPoints to help you save up to $1

47:16

per gallon at the pump. Want

47:18

to save even more? With a Boost membership,

47:20

you'll get double FuelPoints and free delivery! Shop

47:22

and save big at Kroger today. Kroger,

47:25

fresh for everyone. Savings may vary by

47:27

state. Restrictions apply. See site for details.

47:33

Yeah, when you take

47:35

that all in, I

47:38

might be, like, pro-France on the ban.

47:40

Like I might be pro-it's illegal to

47:42

post a photo of your child on

47:44

social media if your child is a

47:46

minor. Like I might be for that.

47:49

I'm curious, do you

47:51

have thoughts or do you not want to weigh in? So

47:55

I'm not here

47:57

to advocate one policy over

47:59

another? I think our role

48:01

is to explain to people what is

48:03

happening, what the heck is going on.

48:05

Yeah, yeah. So they can consider

48:08

their options. I

48:10

have some thoughts about approaches that

48:13

are interesting. In our reporting

48:15

process, we looked

48:17

at some of these laws that are

48:19

coming up in Europe and there are

48:22

also laws in a couple of states

48:24

that have been passed or are being

48:26

discussed that don't go as

48:28

far as to say that you can't post

48:30

your child online but are at least trying

48:33

to make some

48:35

inroads to bring influencing

48:38

more like the family vlogging

48:40

type of thing in line

48:43

with laws

48:45

that are related to child actors. So

48:48

ensuring that kids get some

48:51

of the money and taking that

48:53

approach. And I think these are really

48:56

interesting approaches for policy makers

48:58

to look at. Those are the

49:00

right conversations we should be having. I

49:04

would say, I agree with Jen, we don't come

49:07

up with solutions, which solutions are hard, I'll

49:09

say. Yeah. But

49:11

I do push back on the idea

49:13

that the current form of

49:15

technology and social media and

49:18

the internet is

49:20

the only form or is the

49:22

inevitable form that these

49:24

systems have to take. And

49:27

I think if you think of it as, oh, this is

49:29

just what the internet is, social media is,

49:31

you're kind of forced to have the choice that you said of,

49:34

do I just do it or not do it? We

49:36

can have more imagination about

49:38

what safe systems are.

49:41

The conversation that came about a lot

49:44

in the mid 2010s around

49:46

privacy and surveillance in

49:49

the wake of Edward Snowden was that

49:51

there wasn't just a magic bullet to make

49:54

a system more privacy

49:56

friendly or more secure. You

49:59

had to take these principles. of privacy by design.

50:02

You had to build a system and at

50:04

every step think about what are the privacy

50:06

and personal data implications. The

50:08

equivalent in this space is safety by

50:10

design. And it took a

50:13

long time even for these privacy by design ideas

50:16

to get into the software and internet

50:18

space. I think safety by design is even

50:20

further back than that. Oh yeah. But

50:22

what would it be like if we built a

50:24

system that was just as fun, just

50:26

as cool, you could still share images, connect

50:28

with people that you care about, keep

50:31

up with the conversation. But

50:33

if every one of those features had

50:36

a sense of how does this

50:38

affect privacy, we could not have

50:40

this kind of false choice of either opt in,

50:42

opt out, be on the internet, be

50:45

completely off the internet. We push the

50:47

boundaries of what we think is possible for the

50:49

values that we want. I

50:51

mean, this is definitely every single person I've

50:53

talked to in this series so far. William

50:55

Brady on moral outrage and algorithms. S.

50:58

Craig Watkins on AI scaling

51:00

injustice or combating it depending on who's at

51:02

the table when we develop it. I mean, basically

51:05

what you're both saying is exactly what I'm hearing

51:07

is that there's a false

51:09

dichotomy about all in or all

51:11

out. We can reimagine it. I'm

51:14

just wondering, just personally,

51:16

this is my opinion, Brené, me,

51:18

my opinion, is it's

51:21

going to require moral

51:23

imagination that's going to

51:25

bump up against commerce. My

51:28

money is on our ability to morally

51:31

imagine. My money is

51:33

not on the people that have

51:35

control giving away commerce in

51:38

favor of safety right now. So I hope

51:41

that happens. In general, the

51:43

influencer economy, I've almost gotten

51:45

to this point in my life where everybody on there

51:47

is a grifter. I'm like, just like, I have two

51:50

buttons. One that says like, sit

51:52

the fuck down and the other one says shut the

51:54

fuck up. Like, I mean, those are my social media

51:56

buttons. Am I allowed to, I'm

51:58

allowed, I mean, I'm saying. Yeah, it's

52:00

my podcast. This is your podcast. Yeah,

52:03

I'll get an E for that, but

52:05

that's okay. It's worth it. But I do have

52:07

these two buttons when I'm listening because when

52:10

my mom was first, and you'll relate to this,

52:12

Michael, because of your studies, probably when my mom

52:14

was first diagnosed with dementia and it was rapid

52:17

onset and was deteriorating fast,

52:20

and I would scroll through

52:22

like Instagram and it would

52:24

be like a person in a white coat

52:26

that was so reputable. We have figured out

52:28

how to reverse dementia and it's like I

52:31

had dementia and now I had a blueberry

52:33

a day mixed with some quinoa and now

52:35

I'll call my sisters and I'm like, I'm

52:37

making mom a daily quinoa blueberry shake. And

52:41

even right now, we're

52:43

in the path of totality. This

52:45

will have already passed. Hopefully

52:48

for the preppers who I'm getting ready to talk

52:50

about, I'll still be alive. But literally, I was

52:52

just reading an article about I have

52:55

some family members that are like prepping for the,

52:57

like they're going to have spam like at Y2K

52:59

and then they sent me a link

53:02

to the TikTok and then I dig into

53:04

this TikTok and this asshole is

53:06

a battery and flashlight salesman. And

53:10

I am a smart, critical

53:12

thinking, media literate person, but

53:15

when I'm sad because I just visited my mom,

53:18

I'm making those smoothies. Great.

53:21

What is happening? Yeah,

53:23

I think one of

53:25

my big takeaways from reporting on

53:27

this story is that

53:30

social media has just broken everyone's

53:32

brain. I think you're

53:36

talking about a couple of

53:38

facets, you know, misinformation, bad

53:41

healthcare information. Vulnerability. Yeah.

53:44

Vulnerability, just

53:47

the drive to get

53:49

what we now know

53:52

are these dopamine hits

53:54

from social media interaction,

53:56

the drive in

53:58

this economy

54:01

to attain what is

54:03

perceived as some

54:06

sort of measure of

54:08

success and possibility of a

54:10

career that is stable and

54:13

generates additional income,

54:15

there are a lot of

54:18

different drivers behind

54:20

what we were exploring

54:22

with these parent-run

54:24

accounts that you are

54:27

seeing elsewhere in social media. I think

54:29

this is just the story of here

54:31

are some terrible parents who

54:34

are putting their kids online. And

54:36

I understand the judgment and I

54:38

think there are parents who have been in

54:40

fact arrested that we talk about in the

54:43

story. But I

54:45

think it's really important for us to

54:47

talk about how these are

54:49

changes that are affecting all of

54:51

us. Yes. And

54:53

even if you're not putting your kids

54:56

online, you're not making these particular decisions,

54:59

you can feel some of these changes

55:01

being wrought on our society overall. And

55:04

so I am hopeful

55:06

that readers would not simply say these

55:08

parents are terrible, we're just going to

55:10

dismiss this, none of this could ever

55:12

affect me or my kids or

55:14

in any way. Because

55:17

at some level, I think it is affecting

55:19

all of us and I think that's one

55:22

reason we wanted to talk about it. And

55:24

you also mentioned you don't think that we're

55:27

going to be able to stop this

55:29

drive for commerce. And

55:31

that's a concern I have too. I

55:33

think that Michael's point

55:36

about trying to reimagine things was

55:38

great. And I

55:40

completely agree with it. I

55:43

have no earthly idea how

55:45

we can get not just

55:47

these people to stop influencing,

55:50

but the platforms

55:52

to stop seeing as their

55:54

reason for being, having people

55:57

spend more time on their platform. I don't

55:59

know how we get around that and

56:01

that is also what's driving

56:03

a lot of this. Yeah, no, I agree.

56:05

Michael, thoughts? Hit us

56:08

with some positive moral imagination,

56:10

hopefulness. Hit us. Well,

56:13

this isn't a positive thought, but I think just to

56:15

go piggyback on what I was saying, I

56:17

think just how things got broken, I just

56:19

really think a lot about...I'll get

56:21

to the positive part, I promise. Let's

56:24

go. I think a lot about how

56:26

we've lost the normal

56:29

signals of trust and authority that

56:31

we learn in the offline world.

56:34

And this applies to social media, but

56:36

also just to sites like Amazon too,

56:38

where if you were going

56:40

to get confronted with a

56:42

bad salesman or a bad product,

56:45

a harmful product, you would have to

56:47

walk into a store, you'd

56:49

be able to see, okay, have they

56:51

invested in this store? Do they have

56:53

salespeople? You have to do all this

56:55

stuff, which again, the tech world would

56:57

call unnecessary. But

57:00

it came with a lot of signals of authority and

57:02

signals of trust. Now when you

57:04

see something online, it's all in the same

57:06

kind of shiny box that

57:09

looks really nice and undifferentiated from

57:11

everything else. I think that's just

57:13

a big factor of why things

57:15

get broken. We just don't have these ways

57:18

to sift out good from bad

57:20

and that applies both to products that you

57:22

can get online that aren't good or kind

57:24

of gray market stuff and also to people.

57:27

You get these messages and they have

57:29

this avatar and they're in this very

57:31

nicely designed interface on your phone that's

57:33

extremely personal. So I think that's kind of a

57:35

bit of the mechanism for what

57:38

you've lost. The

57:41

one positive thing of just

57:43

kind of thinking how

57:45

things may be able to change, although there is

57:47

a caveat, a lot of the

57:50

safety conversation does remind me of the privacy

57:52

conversation from 10 years ago. And

57:55

at that time, it was

57:57

just accepted wisdom that no

57:59

one cared about privacy. No one

58:01

values privacy. And you

58:03

could never get the platforms who are

58:06

built around advertising to

58:08

have that as a value. Apple

58:10

has since come out and made privacy a

58:13

selling point and have decided that like actually

58:15

we do want to push that. The

58:17

cynical counter argument is

58:20

that if you sell privacy and

58:22

encryption, then it gives you a

58:24

free pass for moderation. So I'm not going

58:27

to say that there are

58:29

not also some self

58:31

interested things that could be at play. But

58:34

it is interesting to see over kind of the span

58:36

of 10 years how some of these things that are

58:38

very easily dismissed as oh, that'll never happen

58:41

do fully come around because the culture starts

58:43

to value that. That's not super

58:45

optimistic. I think there's still a lot of cynicism

58:47

you could put onto that as to why that

58:49

change came about. But I

58:52

still do think that I try and

58:54

find some optimism that the way the world is

58:56

now is not the way it will be in

58:58

the future. I think it kind of

59:00

goes to why I think we do this work

59:02

as we try and put information out there. One

59:05

of our colleagues once said that journalists aren't

59:07

cynics. It would be cynical to think that

59:09

nothing will change. We're actually optimists

59:12

because we think that by

59:14

exposing things by shedding light on them, it

59:16

does cause the world to change in hopefully

59:18

ways for the better. I think

59:21

that's true. And I think one of the things

59:23

that was so compelling about this article was

59:27

the mix of data, the

59:30

stories that brought it home, and then

59:32

also the ability to understand the methodology

59:34

and the scope. And

59:36

I will say if you think

59:38

about privacy, Snowden era privacy and

59:40

later, it's what Craig

59:43

Watkins said. There's going to have

59:45

to be policy intervention. Right. It's

59:48

not so much how much can

59:50

you put on the individual? I think that's also

59:52

a question of framing: companies love to say,

59:54

well, this is really all on you to sort

59:57

this out. And there's another way

59:59

of thinking about it. And yeah, the developments in what

1:00:01

Europe is doing is super interesting from

1:00:04

a policy perspective that you can nudge

1:00:06

it from the supply side, from the

1:00:08

company side. Yeah. The other two

1:00:10

things I think are interesting when we talk about your work

1:00:12

in the intersection of the earlier podcast

1:00:14

in this series is William Brady, who

1:00:16

was at Yale and now is at

1:00:18

Kellogg Northwestern, talked about algorithmic honesty. So

1:00:20

on everything that you see, it tells you

1:00:23

why you've seen it. And

1:00:25

then Craig Watkins talking about

1:00:27

how it's indefensible now

1:00:30

to build algorithms

1:00:32

and AI that don't

1:00:34

have ethicists, people with

1:00:36

lived experience, humanists, safety

1:00:39

people, privacy people at the

1:00:41

table, that engineers and mathematicians,

1:00:43

computational scientists, this cannot be

1:00:45

their domain anymore. It

1:00:47

requires a mix of approaches.

1:00:50

And we've had, as

1:00:53

you saw with privacy, as Michael was

1:00:55

saying, there were policy changes, particularly in

1:00:57

Europe, California, and so forth. And it's

1:01:00

slow. Michael is so optimistic.

1:01:02

I am actually, I feel like so

1:01:05

much, I feel like great now. It's

1:01:07

like teeny tiny bit of optimism, right?

1:01:09

Me too. It's not my normal

1:01:11

role that I've had. No, it's not. And that's

1:01:14

why I'm... Jen's like,

1:01:16

who are you? Yeah. Yeah.

1:01:19

Yeah. So I'm feeling like, oh, wow,

1:01:21

we are actually making a difference. It's

1:01:24

just that it's slow and the pace of

1:01:26

technological change is so rapid that sometimes I

1:01:28

think it probably feels that we are making

1:01:30

no difference whatsoever. But if you look at

1:01:32

it over a span of decades, we could

1:01:34

maybe... If

1:01:37

you look at it over a span of decades, we could maybe catch

1:01:39

up. I've closed all my

1:01:41

calendar apps and yet it is still

1:01:43

doing notifications. I feel like

1:01:45

that's the universe dinging a little bell, like in

1:01:47

the movie when someone gets his wings. Okay.

1:01:50

Rapid fire. I'm going to switch between who goes

1:01:52

first and second. Are you ready? Okay.

1:01:56

Yes. Okay. Jen, you're called to be

1:01:58

really brave, but your fear is real. You can feel it in the back

1:02:00

of your throat. You have to be brave. What's the

1:02:02

first thing you do? Just

1:02:04

take a deep breath. Yeah.

1:02:07

Michael, first thing you do. You got to be

1:02:09

brave. You can feel the fear. Probably

1:02:14

just remain silent like I just did.

1:02:16

Yeah, I was like, we're, he's modeling

1:02:18

it. You're glad it's, yeah. Yeah, it's right here.

1:02:20

Okay. Jen, last TV show you

1:02:22

binged and loved. Oh gosh,

1:02:25

we haven't finished it yet, but

1:02:27

we've been watching an

1:02:29

old season of Survivor, which I had

1:02:31

never watched before because I hate reality

1:02:33

TV and I'm really

1:02:35

weirdly into it. This is just my

1:02:37

family has been into really old reality

1:02:39

TV lately. We also watched the Amazing

1:02:41

Race. So that I'll just go with

1:02:43

that. I like it. Michael. I

1:02:46

was super late, but like this

1:02:48

month I watched the first season

1:02:50

of Love is Blind. I thought

1:02:53

it was fascinating. It was really,

1:02:55

really interesting. I'm

1:02:57

now looking forward to

1:03:00

doing the internet archeology of finding

1:03:02

all the past discourse and

1:03:05

how it aged, but I thought it was

1:03:07

a very in-depth analysis of

1:03:09

human behavior. So just

1:03:12

for a split second, we're like, no, these are

1:03:14

not investigative journalists. These are like but

1:03:17

then within a second, we're

1:03:19

reminded that they're nerds. Okay.

1:03:21

Favorite movie, Jen. Something

1:03:24

you wouldn't turn off. Well, something you wouldn't turn off.

1:03:29

I never turn off or

1:03:31

big Star Wars. Oh, yeah. Also

1:03:33

the Shawshank Redemption, just because you

1:03:36

can't ever turn it off. You

1:03:39

can't turn it off. Even though I don't know

1:03:41

that either of those are my favorite. I don't know that I

1:03:44

would say they were my favorite movies, but I wouldn't turn them

1:03:46

off. If I asked somebody who knew you really well, what's your

1:03:48

favorite movie? What would they say? They would say I

1:03:51

don't really watch movies. Got it. Okay.

1:03:53

I'm going to go with Shawshank Redemption folks.

1:03:56

Okay, Michael, you got to have a favorite movie. I

1:03:58

love LA story. Which is

1:04:00

an old Steve Martin movie from like 1992. Yes.

1:04:02

That is just I find it It's

1:04:06

just so so funny and touching

1:04:08

and kind of absurd and it's

1:04:10

very weird And I

1:04:12

think probably most of its jokes have

1:04:15

not aged poorly which I think for many old

1:04:17

movies is hard. Yeah, so I think it

1:04:19

still is is still good. I think it's a

1:04:21

weird one and it reminds me of LA so

1:04:23

I like it Yeah, Michael

1:04:25

as the LA not resident, but

1:04:28

He carries the vibe with him. Yeah Never

1:04:31

former resident of LA is much

1:04:33

better with movie resident. Oh, yeah Well,

1:04:37

you do kind of boost LA. Yeah,

1:04:39

I mean LA story

1:04:41

says so much about you. I'm gonna have to think

1:04:43

about it for a long time. Okay Favorite

1:04:46

meal, Jen. Oh. Can

1:04:49

I say just like favorite food instead?

1:04:51

Yes. Okay Oreos and milk. It's not

1:04:54

a meal That's

1:04:56

so wholesome And

1:04:58

it can be a meal. I'm for it. Okay I

1:05:01

think I've had it as a meal before wait.

1:05:03

Do you dip or do you eat and swish? Oh

1:05:06

Definitely dip it dip until I get certain

1:05:08

amount of time like the cookie actually has

1:05:10

to be soft How many do

1:05:13

you lose? What percentage do you lose

1:05:15

it? It's a low percentage. I tried to

1:05:17

time it so that they're not like falling

1:05:19

off Okay. Yeah, I lose half.

1:05:21

Oh Half. Oh, that's

1:05:23

that's a lot Yeah,

1:05:25

it's not good Michael favorite food

1:05:28

or meal. I have

1:05:30

another LA one there's a slice

1:05:33

of chocolate cream pie from this place in

1:05:35

LA called the Apple Pan Which

1:05:38

is a very old I mean ancient for

1:05:40

LA it's from the 40s Which is before

1:05:42

time started and they have a wonderful

1:05:44

slice of chocolate cream pie with fries

1:05:48

Very controversial, but if you kind of salt the

1:05:50

fries and then dip them

1:05:52

in the whipped cream It's a

1:05:54

sweet salty hot cold combination. You may

1:05:56

get a lot of looks but just

1:05:59

stay the course Oh wait, the

1:06:01

universe loves it. Yeah. And

1:06:04

I'm from Texas where everyone here dips

1:06:06

their Wendy's french fries into frosties.

1:06:08

Oh, I was gonna say. Yes, into

1:06:10

the chocolate frosties. Yes, I do that

1:06:12

too. Great. I'm not the only one.

1:06:15

Completely. I didn't know that was a Texas thing.

1:06:17

Totally. So this I'm really interested in

1:06:19

this question. Michael, you can go first. What's on your nightstand? Like

1:06:22

an assortment of books and a lamp. That's

1:06:26

pretty much it. That's it. Jen?

1:06:28

I have a small nightstand and I'm in

1:06:31

my, I have books, a couple notebooks that

1:06:34

I use for work, Kleenex

1:06:36

box and lamp.

1:06:38

It's not very interesting. I have a

1:06:40

very small nightstand though also. It's

1:06:43

not fitting a lot. I guess most nightstands

1:06:45

should be small. I think they're mostly

1:06:47

small. I have like a leaning tower of Pisa of

1:06:49

books. Okay. And I always try to go

1:06:51

for the fiction but I usually end up just reading nonfiction. Yeah,

1:06:54

I have a Kindle actually now that I'm looking at

1:06:56

it. Oh my god, do you really? I

1:07:00

can't do it. You can't do the Kindle? Well,

1:07:03

I for a long time thought as

1:07:05

a technology reporter, I am often a

1:07:07

Luddite because I'm covering privacy

1:07:09

intrusions and whatnot. But you

1:07:12

know, we're in New York and carrying

1:07:14

a large book on my, in my

1:07:16

purse just to read on the subway

1:07:19

was just too much.

1:07:21

So the Kindle is really space

1:07:23

efficient. I appreciate that part of it. Yeah.

1:07:26

It just doesn't smell like a book. Okay. Michael,

1:07:29

what's one thing you're grateful for right now? It's

1:07:32

about to not be freezing cold. And it's

1:07:34

a pretty, pretty

1:07:37

pedestrian answer. But this,

1:07:39

I can see the sun came out and it's been raining here for

1:07:41

three days and I think that'll be nice. And

1:07:43

I have a couple of days off. So I'm grateful for that. I

1:07:45

love that. I wish you sun

1:07:48

and good days off. Jen, what are you

1:07:50

grateful for right now? Well,

1:07:53

I don't like to talk about my family very much.

1:07:55

So I'm not going to be too specific about it.

1:07:58

I'm grateful for my family. I know that's a. Pedestrian

1:08:00

answer as well. Y'all hold yourself to very

1:08:02

high answers. I think your answers are really

1:08:04

good Yeah, like I must be

1:08:06

used to interviewing really fancy people

1:08:08

because like Sun and some time off and

1:08:11

family I mean that just pretty much does

1:08:13

it. That's it. I'm

1:08:15

grateful for y'all being on. And I'm really... It

1:08:18

was a hard piece to read but a really important

1:08:20

piece. And Jen, to pick up one point you made, I

1:08:23

hope... We'll

1:08:25

send everyone to it. I hope you read it I

1:08:28

hope for those of you listening you'll read this

1:08:30

piece if you haven't already read it It was

1:08:32

kind of wildfire across the communities in my life

1:08:35

But I saw my own struggles and myself

1:08:37

in it. It's really easy

1:08:40

to demonize people But

1:08:42

I don't know a person who

1:08:45

doesn't believe that Social

1:08:47

media is kind of a dangerous addiction and of

1:08:49

every of the 100% of the people

1:08:52

in my life who agree that's true, none

1:08:54

of us have stopped using it. So be

1:08:56

careful throwing your phone into glass houses, right?

1:08:58

Like same Yeah, thank

1:09:00

you all so much for being on Unlocking Us. I appreciate it. Thank you

1:09:02

for the work you're doing. Tough

1:09:12

topic. I mean, I will say that I

1:09:14

really appreciate the thoughtfulness with which Jen and

1:09:16

Michael really approached this difficult topic Their

1:09:19

decision to not use photos. I mean go

1:09:21

check out the article in The New York Times

1:09:23

again The link will be on brenébrown.com. They

1:09:26

just handled this in such a way, with

1:09:28

such integrity. You can learn

1:09:30

all about the episodes the show notes all

1:09:32

the links on brenébrown.com We'll

1:09:35

have comments open on that page. You'll find

1:09:37

more links to read work from Jen and

1:09:39

Michael We'll have a transcript up

1:09:41

within three to five days of the episode going live

1:09:44

Also, we are starting to based

1:09:47

on really interesting demand in

1:09:49

addition to our monthly newsletter We're doing

1:09:51

a weekly digest about the podcast and

1:09:53

the stories that we're covering and looking

1:09:56

at what I'm listening to, watching, what I

1:09:58

think is interesting going on in the world. So feel

1:10:00

free to join up for that if you'd like another

1:10:02

thought provoking email in your box

1:10:05

every week. That's

1:10:08

it. Stay awkward, brave, and kind. And I'll

1:10:11

be interested to read your comments on this because I Know

1:10:14

I'm having a knee-jerk reaction, but kind of if you're

1:10:17

under 18 I wonder what kind of problems

1:10:19

it would solve to say no minors

1:10:22

on social media Something

1:10:24

not good is happening And

1:10:27

I'm curious about it staying curious. Okay. Thanks

1:10:29

y'all.

1:10:38

You know, this is produced by Brené Brown

1:10:40

Education and Research Group. The music is by

1:10:42

Carrie Rodriguez and Gina Chavez Get

1:10:45

new episodes as soon as they're published by

1:10:47

following Unlocking Us on your favorite podcast

1:10:49

app We are part

1:10:51

of the Vox Media podcast network

1:10:54

discover more award-winning shows at podcasts

1:10:56

dot voxmedia dot com. Apple

1:11:01

card is the perfect cashback rewards credit

1:11:04

card You earn up to 3% daily

1:11:06

cash on every purchase every day That's

1:11:09

3% on your favorite products at Apple 2%

1:11:12

on all other Apple card with Apple pay

1:11:14

purchases and 1% on

1:11:16

anything you buy with your titanium Apple

1:11:18

card or virtual card number Visit

1:11:21

Apple.co/card calculator to see how

1:11:23

much you can earn. Apple

1:11:25

card issued by Goldman Sachs Bank USA

1:11:27

Salt Lake City Branch, subject to credit

1:11:29

approval. Terms apply.
