What’s the State of Misinformation? with Renée DiResta

Released Wednesday, 22nd May 2024
Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

They always say trust your gut, but

0:02

one time my gut told me to bleach

0:04

my eyebrows. And that was fashionable,

0:06

but not widely well received. While probiotics

0:08

can't help you with most of your

0:11

gut decisions, it can give your gut

0:13

a little bit of support. And

0:15

Ritual has your back. They made

0:17

a 3-in-1 supplement with

0:19

clinically studied prebiotics, probiotics, and

0:22

a postbiotic to support a

0:24

balanced gut microbiome. Did

0:26

you know daily disturbances like poor diets,

0:28

stress, travel, the use of certain medications,

0:30

and plenty of other factors can throw

0:33

off your gut microbiome? Oh no! Enter

0:36

Ritual. Their Synbiotic+ has

0:38

been a gorgeous tool. There's no

0:40

more shame in your gut game.

0:42

Synbiotic+ and Ritual are here

0:44

to celebrate, not hide, your insides.

0:47

Get 25% off your first month for

0:49

a limited time at ritual.com/curious.

0:51

Start ritual or add

0:53

Synbiotic+ to your

0:55

subscription today. That's ritual.com/curious

0:57

for 25% off.

1:00

Hey, it's Jonathan Van Ness. Americans

1:02

United for Separation of Church and

1:05

State defends your freedom to live

1:07

as yourself and believe as you choose,

1:09

so long as you don't harm others.

1:12

Core freedoms like abortion rights,

1:14

marriage equality, public education, and

1:16

even American democracy itself rest

1:19

upon the wall of separation between

1:21

church and state. Christian nationalists

1:24

are attacking these freedoms seeking to

1:26

force us all to live by

1:28

their narrow beliefs. Americans

1:30

United is fighting back. Freedom

1:33

without favor and equality without

1:35

exception. Learn more

1:38

about AU at

1:40

au.org/curious. Hey

1:47

curious people, I'm Jonathan Van Ness and

1:49

welcome back to Getting Curious. Well,

1:53

last time I checked it is 2024, which

1:55

is an election year. We've covered this quite

1:57

a bit on the podcast so far. But,

2:01

queens, I don't know if you remember our episode

2:03

with Nadia Brashier from a few years ago on

2:05

misinformation and disinformation, but the

2:07

misinformation and disinformation continues to

2:09

grow, continues to spiral. And

2:13

I'm really concerned about social media

2:15

algorithms and what roles they play

2:17

in spreading misinformation. And I think

2:20

really throwing our hands up and not

2:23

engaging and just getting frustrated and walking

2:25

away from the process, I

2:27

don't think is what's going to happen. But

2:29

I'm also curious about, has

2:31

misinformation ever been this pervasive?

2:34

So to talk about that, we're going to

2:36

be bringing in Renee DiResta. And

2:39

we're asking misinformation, disinformation, and

2:41

propaganda. What's the difference and how does it

2:43

affect us? And make sure to stick around

2:45

to the end of the episode where we'll

2:47

reflect on what we learned and if we

2:50

answered the question. So stick

2:52

around for that conversation. In the meantime,

2:54

my heart, and I know everyone

2:56

else's heart, is heavy and

2:59

feeling powerless in such

3:01

a gigantic system that is just, we're

3:03

just seeing disarray everywhere. I need to do

3:06

something to just feel

3:09

like I can do something to help the

3:12

suffering in Gaza. So I did

3:14

a little research and I looked into the World Central

3:16

Kitchen, which was established in 2023. But

3:19

I've been reading up on them. They've been doing a lot of

3:21

really good work in Gaza, helping to make sure that people are

3:23

fed. I've also been doing research into

3:25

Save the Children. They have been around for a little

3:27

longer since 1953. That's actually a lot longer since

3:29

2023. But you get what I'm saying.

3:32

They're doing really important, essential work in Gaza. So

3:35

just before we get into that, I wanted to share

3:37

those two resources. As we support

3:39

a ceasefire now, and we

3:41

also want peace in the region, it's really

3:43

important that we say that there's no space

3:46

for antisemitism in this conversation.

3:49

There's no place for Islamophobia in this

3:51

conversation. And we know that people are

3:53

not their governments. And we know that

3:55

governments' actions are not their people. That

3:58

was just on my heart. And I felt that... I

4:00

needed to share those two resources with

4:02

you guys this week. Okay,

4:04

let's get to our guest bio. Renee

4:07

DiResta is the research manager at

4:09

the Stanford Internet Observatory, where she

4:11

investigates the spread of malign narratives

4:14

across social networks and assists policymakers

4:16

in understanding and responding to the

4:18

problem. She has advised

4:21

Congress, the State Department, and

4:23

other academic, civic, and business

4:25

organizations and has studied disinformation

4:27

and computational propaganda in the

4:29

context of pseudoscience, conspiracies, terrorism,

4:31

and state-sponsored information warfare.

4:34

Her latest book, Invisible Rulers: The People

4:36

Who Turn Lies Into Reality, is out

4:39

on June 11th. So

4:41

we need your help, Renee. Also, how are you? Are you

4:43

thriving today? I'm good. I'm wonderful

4:45

today. You guys can't see this

4:47

unless you just so happen to see our

4:49

social content on this episode before listening. Renee's

4:51

got a really gorge orange headband today that

4:53

I'm living for. Sidebar: I just think it's

4:55

really cute. I also really want a headband

4:57

today. So just

5:02

to go over a little cliff notes

5:04

on misinformation and disinformation just to get

5:06

us up to speed. Can you tell

5:08

me the difference between misinformation and disinformation

5:10

again? Yeah, so the way most people use it:

5:13

misinformation to me is something that's

5:15

wrong, but unintentionally wrong. Disinformation is

5:18

something that is deliberately either wrong

5:21

or it's put out by somebody who's

5:23

not what they seem. So with

5:25

misinformation, you're trying to use it to

5:27

mean something that is actually false, something

5:29

that can be disproven, something that actually

5:31

isn't true. Whereas disinformation,

5:33

sometimes what you're talking about is like

5:35

a campaign to make people believe a

5:37

thing. It's much more related to propaganda.

5:39

It's the idea that you're trying to

5:41

convince an audience of something. And

5:44

you might be doing that with false

5:46

facts or you might just be doing

5:48

it with fake accounts or you're using a lot

5:50

of bots, you're trying to boost something inauthentically.

5:53

So that's how we tend to divide those things. Misinformation,

5:56

accidental, disinformation on purpose. So

5:58

how can we spot that misinformation and

6:00

disinformation just when we're, like,

6:02

where does it exist and where do

6:04

we need to be aware that we

6:07

could be consuming it? Yeah. So misinformation

6:09

can be about anything, right? It's just somebody

6:11

gets something wrong. They read something,

6:13

they misinterpret it. There's a

6:15

scientific study with complicated statistics. They don't

6:18

really understand what they're reading, so they

6:20

make a claim about it. I

6:23

think that misinformation, you know, people usually share

6:25

it because they genuinely care and they really

6:27

believe they want to help their communities. What

6:29

a lot of people use the term for,

6:31

though, in this, like, kind of especially on

6:33

social media is when they're talking about an

6:36

opinion that they don't like, right? Or content

6:38

that's, like, rage bait. So we're using the

6:40

word, but we're not really using it in

6:42

the way that, you know, we're not really

6:44

using it to describe the thing that people

6:46

are actually upset about most of the time.

6:48

So I think the one thing that, you know, when you're

6:51

asking a question like, how do you spot it, a

6:53

lot of the time you kind

6:56

of believe it because it kind

6:59

of reinforces your pre-existing ideas, right?

7:01

You don't like a particular group

7:03

of people. You hear a claim

7:05

about that particular group of people, maybe a political party

7:08

that you don't like, and you're kind of inclined to

7:10

believe it. You

7:12

find yourself, like, you know, you see something

7:14

and you're, like, outraged, and so then you go

7:16

and you share it yourself. That's the kind of dynamic that

7:18

tends to happen with these things. And

7:20

that's why I think misinformation, when you're talking about

7:23

something that's true or false, it's a good time to

7:25

use the word. Otherwise, what you're really talking about

7:27

is propaganda and rage bait. So

7:30

I think really what's gotten me

7:32

so much is it being

7:35

queer and being genderqueer on

7:37

top of my sexuality is,

7:41

I think I've always seen and

7:43

noticed disinformation, especially when it comes

7:45

to our community, but it

7:47

does feel like there's just a higher amount of this

7:53

type of rage baiting sort

7:55

of content. And I

7:57

wonder if it's because... I've

8:00

hypothesized this on Getting Curious several times, but it's

8:02

so, you know, excuse me, but we've got to

8:04

do it. It's like, I think

8:06

that it's because negative content speaks to

8:09

like our fear and our brain. It's

8:11

like your survival instinct. It's like that

8:13

unknown thing or the really rage inducing thing,

8:15

that is like the snake, that is the

8:17

hippo that's going to like trample you, like

8:20

that is the thing that's going to kill

8:22

you. So you really notice it.

8:24

Whereas when it's like something that's like, you

8:26

know, a little fuzzy bunny,

8:29

like a happy story about like

8:31

someone doing something nice or accomplishing

8:33

something that doesn't have the same

8:35

amount of energy because it's not

8:37

like a threat to anyone. So

8:39

by playing off of that idea

8:42

of threat to whatever is

8:44

really good at getting people to click

8:46

and engage because people are afraid.

8:49

Do you think that that's true? I think that that's

8:52

true. I think one way that this really happens

8:54

is, you know, we were talking about identity,

8:56

right? So we've kind

8:58

of hit a point, particularly on social

9:00

media, where people engage around things that

9:02

they feel like reinforce their identity. And

9:04

so much of that is

9:06

culture or politics. And so

9:09

regardless of what particular opinion you may

9:11

hold, what we see and

9:14

social science research kind of reinforces

9:16

this, this sort of geeky academic

9:18

stuff, basically says that people see

9:20

arguments against their political beliefs as

9:22

attacks on them personally also. So it puts

9:24

them into a mindset of, oh, I have

9:26

to fight with these people, right? I have

9:28

to, you know, I have to defeat these

9:30

people, these, these sort of like these enemies.

9:32

And so what you start to see is

9:34

people are all like, they're, they're

9:36

kind of like waiting for it. They

9:38

even go looking for it because it's like, well,

9:41

of course, this is what I do on the internet, right? I

9:43

fight with my political enemies. This is what the internet is for. This

9:45

is what Twitter is for. This is what TikTok is for. I'm going to

9:47

own my enemies today. And so you get people

9:49

who go there looking for that, you know,

9:51

looking for that content, basically just finding ways

9:53

to fight. And in addition to the ordinary

9:56

people that do this, the sort of crowds

9:58

of people, you have the influencers. And

10:00

the influencers are particularly good at it. And

10:03

that's because there's incentives for them. This is,

10:05

you know, they're making money off of it.

10:07

Their engagement goes up when they, you know,

10:09

when they do these things that rile up

10:12

their base. And so again, it's, you know,

10:14

I'm describing it in general terms because it's

10:16

not a thing that's unique to one particular

10:18

issue or one particular group or even one

10:21

particular country. It's just the incentives of social

10:23

media make it so that people engage like

10:25

this. And it really does intersect with what

10:27

you're describing, right, that sort of psychological response

10:30

to I am being attacked.

10:33

There is something I should be afraid of here. And

10:35

that kind of fear response gets people off the

10:38

fence and, you know, and kind of into the

10:40

melee. Is there a link or like, what's

10:42

the link between misinformation and then like polarization

10:44

across a whole broad group of people? And

10:46

the only other thing I was gonna say

10:49

about that too is, is that

10:51

like, we also just did this fun episode on like local

10:53

press. And one really interesting thing in that episode

10:56

that didn't occur to me at all, is

10:58

that like as local newspapers have dwindled, that

11:01

gave us practice in smaller groups of people, like

11:03

coming to a common understanding of facts. And she

11:05

used this example of like recycling. She was like,

11:07

if you ever wanna see a small town like

11:10

fucking fight each other tooth and nail,

11:12

talk about like redoing their recycling bins

11:14

and you will see people like just

11:17

get so pissed like about

11:19

something that seems like not like

11:21

as huge but it's like really intense.

11:24

But it gives people the practice on

11:26

like compromising and like coming to a

11:28

consensus. And we just don't, we have

11:30

way less practice coming to a common

11:32

consensus now, not only because of local

11:34

papers, but I think maybe because of misinformation

11:37

and like polarization, because at least papers had

11:39

to kind of back up what they were

11:42

saying like a little bit or be like,

11:44

this is an opinion, not fact. But

11:46

what is the link between misinformation and polarization for

11:48

like a community or like a whole country? Yeah,

11:50

there's a lot of, that's one of the things

11:53

that's constantly debated in academia, it's like, you

11:55

know, niche fights. And that's because it's really hard

11:57

to say like this piece

11:59

of information caused that belief, right?

12:01

Your beliefs are shaped, your identity is

12:03

shaped by who you engage with, what

12:05

you read, who you spend time with.

12:08

Used to be much more geographically based, like you know,

12:10

right? You have the people you're gonna fight about your recycling

12:12

bins with, live like next door. You

12:15

can be a real asshole in those debates

12:17

if you want to, but then you have

12:19

to see that guy every day, right? So

12:21

there is like, I think, a grace that

12:23

we, you know, that we afford

12:25

our neighbors, perhaps, when there's sort of a physical

12:27

proximity to us, and we don't have to do

12:29

that online, because there are just some random anon,

12:31

right? You know, somebody on the other end of

12:34

an avatar, and we don't have to care how

12:36

they feel about what we say or how we

12:38

act, because again, we're there to have the fight,

12:40

right? That's the incentives, like the norms are really

12:42

now around like, I am going to

12:44

go to the internet, like, to brawl, right, not I'm

12:46

gonna go to the internet to learn something

12:48

new and make a new friend or neighbor

12:50

or whatever. So I

12:53

think that with the polarization stuff, it's

12:55

really hard to say like, misinformation leads

12:57

to polarization. I think one of the

12:59

issues though is when

13:01

you have that very factionalized world, and

13:03

then you have people who, they

13:06

really tend to start to trust the media

13:08

that speaks to them because it seems like

13:10

them, right? And that's different than local news,

13:12

which had to speak to all of the

13:15

people in the geographical region, right?

13:17

Whereas now, again, if I am starting a Substack,

13:19

say, and I wanna reach a particular group of

13:21

people, the people who are gonna pay me are

13:24

the people who I developed

13:26

a sense of trust with. And one way I can

13:28

do that is by saying like, I'm just like you,

13:30

right? As a member of

13:32

group X, let me tell you about how we should

13:34

think about this thing. And one thing that is

13:36

really interesting is when

13:39

you see actual proper disinformation

13:41

campaigns. And when I say

13:43

that, I'm usually referring to something where there is

13:45

like, a state actor involved, like let's

13:47

say Russia, China, Iran. One

13:50

of the things that happens is they pretend

13:52

to be members of the

13:54

community. That's how they're speaking to you. As

13:56

a fellow black person, we shouldn't vote for Hillary

13:59

Clinton. It's not "you shouldn't," right? It's

14:01

"we shouldn't." As a member of the shared

14:03

identity, we should not do this thing. And

14:05

so you see that reinforcement that entrenchment and

14:08

then they and then they create the other

14:10

right as American

14:12

veterans we are not receiving our

14:14

benefits. Why are we having more

14:16

immigrants come in? Right. And so there's that

14:19

there's that connection where first and foremost what

14:21

I'm emphasizing is that we're in this together

14:23

We're a shared group, you know, we

14:25

have these shared beliefs and then also those other

14:27

people over there, you know They're

14:30

taking something away from us. And

14:32

so there's that that's the sort of thing where again when

14:35

you see actual literal state

14:37

propagandists doing that work, that's

14:39

the strategy that they're running and it's because it

14:41

is the ability to say like you

14:44

trust me, I'm like you, those other

14:46

people are over there and this

14:48

is, you know... Let me tell

14:50

you about how the world is we can't compromise with

14:52

them because they're always trying to take something from us

14:55

And you do see that again and like hyper

14:57

partisan political influencers do the same thing. Of course,

15:00

they're not, you know They're not fake. I'm not

15:02

saying in any way that they're quote inauthentic but

15:05

the sort of... the

15:07

accounts that are pretending to be something that they're

15:09

not are using that same type of rhetoric because

15:11

it's very effective So are we just totally

15:13

fucked like social media going to fuck

15:15

us to hell or like how can

15:17

we be effective with our information? One

15:20

of the things I think about is like, you

15:22

know, I have a very dry analytical way

15:24

of communicating, right? It's just you know, here's

15:26

my, here are the facts as I understand them.

15:28

I like to write. I hate

15:30

being on video, you know, the thought of making

15:32

TikToks, like, gives me anxiety. But one

15:35

of the reasons why I wrote the book, in part, was

15:37

like, I feel like influencers have a

15:40

role to play here, right? Like

15:42

they set the norms. This

15:44

is how you engage. This is how you criticize

15:46

somebody, this is how you talk about somebody, right? Like

15:48

you have a lot of people who are who

15:50

really look up to people who have massive Followings

15:52

of huge audiences because they see it as like

15:54

here's a person who's kind of like me has

15:56

a lot of the same opinions as me, and

15:59

here's how they behave And it becomes

16:01

in a sense, like, this

16:03

is the avatar for what it means to

16:05

be a good progressive or a good, you know,

16:07

fill in the blank, conservative or whatever. And

16:11

so that idea of norm setting is

16:13

something I think a lot about. I just

16:15

don't feel like I personally have the power

16:18

to do anything about that. The

16:21

only thing I can do is decide how am I

16:23

going to engage? And

16:26

I do feel like I have kind of hit a

16:28

point where when I see something outrageous about, you

16:30

know, a person or a politician I don't like, like, I actually do

16:33

go try to like find more articles about it

16:35

before I just hit the reshare button at this

16:37

point, like the kind of like, pause before you post

16:39

or whatever they say in media

16:41

literacy these days. But

16:44

there is that question

16:46

of like, how do you establish

16:48

norms within your community? Again, it

16:50

used to be so much geographical.

16:53

And then now, back around 2015,

16:55

the idea that, you know, you

16:59

got attention and quote-unquote won by

17:01

like owning your enemies became sort of

17:03

norm for engaging on the internet. And

17:05

people grew massive followings because they were

17:07

very good on, you know, good at

17:09

dunking on people. Like that was their,

17:12

that was their whole MO. So I

17:14

think that kind of shift does have to happen from, you know,

17:17

from people who begin to realize that

17:19

like, you know, I think a lot

17:21

of influencers in particular start off thinking like, Oh, I'm

17:23

just posting my opinions to my friends. They

17:26

don't see it as like a source of power. Right?

17:28

But it is, it is a source of power. And

17:30

to think about how you use that power is something I wish we

17:32

had more of. If

17:39

you're like me, the threat of fascism is

17:41

weighing on you this year. But

17:44

even when the F word is uttered, way

17:46

too few of us are considering the full

17:48

scope of the danger, let alone how to

17:50

really stop it. the

18:00

understanding and urgency we need to defeat

18:02

it. And she is joined by great

18:04

guests to discuss the threat of civil

18:06

war, attacks on abortion rights and trans

18:08

rights, Trump and the theocrats, Project 2025,

18:10

efforts to erase history and critical

18:13

thinking, and much more. Check

18:15

out recent episodes featuring Kathleen

18:17

Belew, Jeff Sharlet, Sarah Posner,

18:20

Wajahat Ali, Dahlia Lithwick, and

18:22

many more. Subscribe to the

18:24

Refuse Fascism podcast on your

18:26

listening platform of choice or

18:28

go to refusefascism.org/podcast. Darling,

18:31

I was on a vacation recently

18:33

and stayed at an Airbnb. And

18:36

then I realized that while I

18:38

was away, my empty house

18:40

could be making money, honey.

18:43

If you're someone like me that is

18:45

busy and not home all the time,

18:47

your home could be an Airbnb. And

18:50

it's actually pretty simple to get started. Even

18:52

if you don't have a whole house, you

18:54

could start with just a spare room. Personally,

18:56

I really enjoy staying at Airbnb. I really

18:58

do. I love a good Airbnb. Who is

19:01

that? Come back, British you. And

19:03

it really is a great way to, like, support the local

19:05

economy and support local people. So Airbnb

19:07

is fabulous. And I know I was doing my British voice

19:09

earlier, but we love Airbnb. So

19:11

think about what you could do with some extra

19:13

cash, whether you're looking to treat yourself to something

19:16

nice like a shopping spree or a spa day

19:18

or start a whole side hustle. Airbnb can help

19:20

you be that person. Your

19:23

home might be worth more than you think. Find

19:26

out how much at

19:28

airbnb.com/host. Is

19:36

the reality of what's playing out on TikTok at

19:38

all the reality that's playing out on the ground

19:40

in the Middle East or even here in the

19:42

United States, is there any way for us to

19:44

know? That is I think that

19:46

is really the question, right? And

19:49

a lot of the stuff that Stanford Internet

19:52

Observatory where I work, what we

19:54

try to understand we're looking at particularly

19:57

breaking crises.

20:00

So I remember on October 7th, immediately

20:03

after it happened, we had all the Telegram channels

20:05

up. And this is my

20:07

job, we look at public data, the Telegram channels

20:09

up, we were looking at what was happening on

20:11

X. And what you would see is

20:14

content that would land on the Telegram

20:17

channels, you couldn't verify

20:19

it really in the moment, but

20:21

people would go and they would take it and they

20:23

would move it over to X

20:25

instantly. Because here is a

20:28

sensational or horrific image, boom, we're going to put

20:30

it over here. And for

20:32

some of these, like, I would do reverse

20:34

image search or something, and I'm like, okay,

20:36

that came out of Syria like five years

20:39

ago, right? But that person who has just

20:41

shared that image is influential, has a large

20:43

following, and it's already gone viral, right? Because

20:45

everybody who's seen it is now outraged and

20:47

horrified also, they've clicked the share button, they've

20:49

participated in that process, and like, boom, we're off

20:52

to the races. Maybe you're going

20:54

to get some community notes that will eventually clarify

20:56

that, no, this is an image from Syria, but

20:58

that's going to happen after it has, like, two

21:00

million views, you know? And

21:02

we know from Nadia Brashier's episode that when you

21:05

get like, when you first learn something, like your

21:07

brain always is going to think that that's like

21:09

the default right thing, even if it was wrong.

21:12

Right, or you're not even going to see the

21:14

fact check. And this is, you know,

21:16

Israel Gaza was a particularly emotionally, like,

21:18

you know, horrifying thing for many, many

21:20

people. And then the response similarly was

21:22

an emotionally horrifying thing for many people.

21:24

And so they did come to feel

21:26

really invested in it, right? And so

21:28

what can you do if you're like,

21:30

you know, sitting here in the United

21:32

States of America? Well,

21:35

maybe your form of activism is boosting the things that

21:37

you think show your enemies in the worst light or

21:39

show the plight of your side or different ways you

21:41

can do that. But

21:44

what you have to work with is

21:46

not something that you have in any

21:48

way kind of personally verified. And

21:50

so when you have generative AI that

21:53

can produce an image that looks quite

21:55

plausibly like a building blown up

21:57

in a in a conflict zone, right? And

22:00

you can't reverse image search that because it

22:02

is a unique image. It's not going to

22:04

show you that this is from Syria five

22:06

years ago, because it literally doesn't exist, right?

22:08

It's a world. It's a reality, right? It's

22:11

a thing that looks highly plausible. And

22:14

so you can go and share it. And it's

22:16

very hard to figure out if it's true or not.

22:18

And so we're putting all of this onus on people,

22:20

even when I say like, you know, think before you

22:23

share, right? But that

22:25

does assume that you have so much time

22:27

in which you're going to actually go and

22:29

take the time to do that, that you

22:31

have tools that make that possible. And

22:35

most people really don't. And then the

22:37

other flip side that we saw with

22:39

that ability to kind of create unreality

22:41

is that unfortunately, you also saw people

22:43

denying reality that was quite real, right?

22:45

So there were, for example, images

22:47

of babies, right, that were released

22:50

by the Israeli government in the

22:53

days immediately following October

22:55

7. And they

22:57

were real. But interestingly, like, you

22:59

know, right wing influencer Ben Shapiro

23:01

tweeted them, and he

23:03

got tagged by other right wing

23:06

influencers who were like, no, no, no, I

23:08

ran this through an AI checker, and it

23:10

tells me that it's AI generated. And then

23:12

all of a sudden, you have this whole

23:15

debate about whether like, you know, did the

23:17

Israeli government fabricate images of dead babies and

23:19

post them on X? Did

23:22

Israel supporter Ben Shapiro, you know,

23:25

run a propaganda campaign? Was it the whole

23:27

like beheaded babies thing? Yeah. And

23:30

also that was very much caught up

23:32

in the beheaded baby story, which was

23:34

such a like macabre thing to be

23:36

talking about. But there was a journalist

23:38

who reported a story saying that

23:40

a person who was recovering

23:43

bodies after the October 7

23:45

massacre said that there were 40 beheaded

23:47

babies and 40 is a very, very,

23:49

very specific number. You will anchor to it

23:51

you will. And it's also a

23:53

very specific thing to search for. So

23:55

now all of a sudden, when you're searching for 40 beheaded

23:58

babies, that phrase this

24:00

is an opportune time for anybody who

24:02

has produced content related to that very,

24:04

very, very specific phrase is going to

24:07

pop up. And again, what gradually comes

24:09

out is there were some

24:12

babies that had been murdered in various

24:14

ways. Right. And so you wind up in this,

24:17

you know, this world where people are debating, is

24:19

it 40 or not 40? Is the image real?

24:21

Is the image not real? You know, it

24:25

was, it was like one of the most bizarre

24:27

media inquiries that I've ever gotten was, you know,

24:29

was like people wanting to talk about like that

24:32

story and what it meant. And I, you know,

24:34

that's what happens at like the Stanford,

24:36

like it is the Stanford Internet Observatory.

24:38

Like you go to Stanford. Is that

24:41

where it is? It is at Stanford.

24:43

Yes. It's like

24:45

you go there and it's basically just like

24:47

a huge newsroom of like academics,

24:50

like verifying stories.

24:53

No, we are not fact checkers. That's the

24:55

funny thing. The media reaches out to

24:57

us periodically to try to understand. But

24:59

like we are adamant that we are not

25:01

fact checkers. It is not our job to tell

25:04

you what is true or false. But

25:06

what we can tell you a lot

25:08

of the time is where an image

25:10

first appeared. Right. And then what you'll

25:12

see is somebody like BBC verify, for

25:14

example, or the New York Times has

25:17

a verification team. Bellingcat has a verification

25:19

team. There are these people, I'll

25:21

use BBC as an example because we, you know, we

25:23

would occasionally talk to them about, you know, anti-Ukraine

25:26

videos on TikTok and what kind of

25:28

networks were behind those. So

25:30

what you would see is like, here's

25:33

where we first see this content. Here

25:35

is how it's moved across the Internet.

25:37

Here is how this influencer wound up

25:39

with it. So you can think about

25:41

it a little bit more like kind

25:43

of forensically tracing how something went viral.

25:45

Why did that thing come to be

25:47

in your feed? That's the kind of

25:49

work that when we are doing work

25:51

on disinformation, oftentimes that's what we're doing.

25:53

So we're not making a value judgment

25:55

about this is true or this is

25:57

false. We rely on fact checkers and

26:00

people who are doing that kind of authentication

26:02

themselves. But what we can do is say,

26:04

here is how this became a

26:06

trending topic on Twitter. Here is how this

26:08

debate about the 40 dead babies unfolded across

26:10

the various political factions on the internet. Have

26:13

we seen or has there been

26:15

any like generative AI that's been proven

26:17

like on online

26:19

or on social as related to the Gaza

26:22

war? Yeah, there's

26:24

definitely there's stuff out there. You

26:26

can kind of Google for

26:28

it. Media

26:31

does write these stories. Sometimes past a point,

26:33

we don't. We're a very small team. We don't

26:35

have the bandwidth to continue to do the

26:38

research project on a particular ongoing conflict. Usually

26:40

at some point, there are teams that are

26:42

devoted to conflict research that will come

26:44

in and do it. You know, we've worked

26:46

quite a bit on Ukraine in

26:49

the sort of early days of February

26:51

2022. That's not that's sort

26:54

of a little bit less

26:56

of a thing that we actively focus on now. There's so many

26:58

other people to do it. But

27:00

yeah, your question maybe I just

27:03

trailed off. No, you did. You totally did. That's

27:05

just like so interesting. So you guys are like,

27:07

so a news org will come to

27:09

you guys to like try. They're like, we have

27:11

this story like we need to figure out like

27:13

where these images came from. And so you can

27:15

kind of dissect like how a story played out.

27:18

And like, yeah, yeah, that's fascinating.

27:20

Yeah, so that's what we try to

27:23

be. And again, public data and quantitative

27:25

analysis. So that's what we focus on. Interest.

27:28

Let's go back to the book.

27:30

Invisible Rulers. So who

27:34

are our modern day propagandists? It's

27:36

me. Who else are the invisible

27:38

rulers? So the term invisible rulers came

27:41

from Edward Bernays, who is sometimes called the

27:43

father of modern propaganda. And he wrote

27:45

this book. So he was

27:47

he worked on kind of selling World War I to

27:49

the American public. Right. And

27:53

the word propaganda back in the 1920s was not

27:56

yet a pejorative. Right. So he's making

27:58

this argument, and the phrase he uses, the

28:00

sentence he says is like, there are invisible

28:02

rulers who control the destinies of millions. And

28:05

then he talks about how we are

28:07

governed by, and our

28:09

tastes are formed by, our ideas are suggested

28:11

by, men we've never even heard of. So

28:14

you think that your opinion is

28:16

being steered maybe by the media,

28:18

by the politician. But what's really

28:20

happening is there are these incredibly

28:22

powerful people who are opinion formers

28:24

and opinion shakers, who are actually

28:26

steering the politician or speaking to

28:28

the media. And that was where

28:30

the idea of invisible rulers came

28:32

from. So I wanted to

28:34

kind of like pull that phrase forward

28:36

100 years basically, right? So I thought

28:38

it was like such a captivating way

28:40

to describe it, particularly because most of

28:42

what Bernays is doing in the book,

28:45

most of the case studies in the

28:47

book are not about politics at all.

28:49

They're about marketing, right? They're about like,

28:52

I want to sell cigarettes to women.

28:54

How am I gonna do that? Well, I'm gonna call them like,

28:57

I forgot the name, the specific phrase to

28:59

use, torches of freedom. You know, you are

29:01

a liberated woman, if you are smoking, pick

29:03

up your torch of freedom. And

29:06

so you see this model

29:09

of influence as like,

29:11

we are gonna appeal to you as a member

29:13

of a group and make you think that as

29:16

a good member of a group, as a liberated

29:18

woman, you should be a smoker. And we're gonna

29:20

create that demand over time and by appealing to

29:22

your identity as a member of that group. And

29:25

so this book is kind of a fascinating read, and

29:27

again, 100 years in the future, because what he's

29:30

basically talking about are people

29:32

who are incredibly influential and

29:34

they just know how to reach and connect with

29:36

an audience. And so influencers,

29:38

like the very term influencer didn't

29:41

come out of politics, came out of marketing,

29:43

right? And I was like, you probably remember

29:45

this, the idea

29:47

of like, you can help, you

29:49

know, you can help the Gap sell jeans to

29:51

teens, right? You

29:53

can help Nike sell their sneakers by, because

29:56

like, you have like

29:59

a certain aesthetic. like you're fun,

30:01

your fans like you, you can help

30:03

a brand monetize. And so you actually

30:05

see brands going to, what

30:07

became influencers, trying to sell products

30:09

to their fans, right? It's actually

30:11

like a completely transactional thing in

30:13

the early days. And

30:16

gradually you see that kind of

30:18

move into political influencers, where instead

30:20

of selling a pair of shoes,

30:22

you're selling an ideology, right? You're

30:24

selling like a, you're selling a

30:26

topic to fight about, you're selling a culture war

30:28

opinion. And so the book

30:30

really just asks, what

30:33

does it look like when

30:35

the kind of people who

30:37

are molding opinions, suggesting ideas,

30:40

shaping the discourse, right? Getting

30:42

eyeballs on content, like

30:45

what do we call them? How do we think about them?

30:47

So I didn't intend propaganda to be like a pejorative at

30:49

all. I was thinking about it more in the context of

30:51

like, how it was used in the 1920s, which

30:54

was just like, here we

30:56

are, we're opinion shaping. That is a thing people do. And

30:59

we're gonna go ahead and do it.

31:01

And so my goal with the book is

31:03

just to kind of ask that question, right? What

31:05

happens when it's very self-directed

31:07

and you personally can earn quite

31:09

a good living off of it? Don't

31:19

you just love when someone looks at you and

31:21

says, what were you up to last

31:23

night? Well, no matter how

31:25

late you were up the night before,

31:27

Lumify Redness Reliever Eyedrops can help your eyes

31:29

look more refreshed and awake. Lumify

31:32

dramatically reduces redness in just one minute

31:34

to help your eyes look brighter and

31:36

whiter for up to eight hours. No

31:39

wonder it has over 6,000 five-star

31:41

reviews on Amazon. You won't believe

31:43

your eyes. You know

31:45

you can trust them though, because they're made

31:47

by the eye care experts at Bausch and

31:50

Lomb, and they're backed by six clinical studies.

31:52

Eye doctors trust them too. They're the

31:55

number one recommended redness reliever eye drop.

31:57

The one and only Lumify is an amazing...

32:00

drop that will have people saying

32:02

something's different about you in the

32:04

best way possible. So check out

32:06

lumifyeyes.com to learn more. Did

32:09

you know that while over 60% of

32:11

Americans dream of starting their own business,

32:13

less than 20% of them ever take

32:16

their first step? The reason? Building

32:19

a business is tough. Having

32:21

built a business or two myself, I

32:23

know just how difficult the whole process

32:25

is, but Tailor Brands is simplifying the

32:27

business journey. From launching

32:29

and managing to growing your business,

32:31

Tailor Brands isn't just another tool.

32:34

It's your online business partner from launch

32:36

to success. With Tailor

32:39

Brands, building your dream business

32:41

becomes an effortless experience. Yes!

32:43

From LLC formation to bookkeeping,

32:45

invoicing to acquiring licenses and

32:47

permits, and even setting up

32:49

your bank account, Tailor Brands

32:51

handles it all seamlessly. And

32:53

our listeners will receive 35% off

32:56

Tailor Brands LLC formation plans

32:58

using our link tailorbrands.com/JVN.

33:03

That's

33:05

tailorbrands.com/JVN.

33:09

So start your business journey today with

33:11

Tailor Brands. Obviously

33:19

things that happen on social media are stressful,

33:21

but you know, is it a security threat?

33:24

It depends on if something happens to you. But

33:26

what security threats does this pose

33:28

to people at large? Just the

33:30

level of polarization that plays

33:32

out on social media on so many

33:34

different terms. I mean, I think about

33:37

Charlottesville. I

33:39

think about times where people have lost their

33:41

lives in protests and in all

33:43

sorts of different ways. Are we over

33:45

concerned? Are we under concerned? Have we been

33:47

here before? What did you

33:49

find from researching the book about where we are?

33:52

So one of the things that I point to quite

33:54

a bit actually was this guy

33:58

by the name of Father Coughlin. Oh

34:00

my God. Yeah, there's a whole thing on

34:02

Father Coughlin. On Ultra, they

34:04

talked about him on Ultra, Rachel Maddow's podcast

34:07

about that. Oh, I have to go listen.

34:09

Oh my God. It's about

34:11

that senator from Minnesota who is

34:13

a fucking Nazi apologist who from

34:15

inside the Senate was distributing

34:17

Nazi propaganda. He was basically working in

34:19

cahoots with the Nazis to try to

34:21

get us not to go into World

34:24

War II. Then he

34:26

ended up dying in this weird plane

34:28

crash. He's the only sitting senator to

34:30

ever die in office. Then

34:32

after he died, this book that got recovered

34:34

from the plane crash, you have got to

34:37

listen to Ultra, Renée. You will shit your

34:39

pants. I have to listen to this. It's

34:41

so good. Father Coughlin was part of... Wasn't

34:44

he a radio guy from the 30s and he just

34:46

had this huge... He

34:49

was really giving you

34:51

Trumpy nationalist... It

34:55

was like the most popular radio show of that whole

34:57

decade, wasn't it? Yeah. At his

35:00

peak, he had about 30 million listeners.

35:03

Three zero million fucking

35:05

30 million? That's huge

35:07

by today's standards. Right. It's

35:10

objectively huge. I think the population in

35:12

the US was maybe 120 million at

35:14

that point. You can fact check

35:16

me. It's in the book. The book was fact checked.

35:19

What I thought was really interesting about

35:21

him is you see this man of

35:23

the moment, right? Radio is relatively

35:25

new. If you can listen to some

35:28

of the recordings, he's always talked about as having this

35:30

deep baritone. He sounds...

35:32

He's a priest. He's a man

35:34

of God. He's trusted. He's

35:37

got this hypnotic way of delivering

35:40

his sermon. What you

35:42

see is he's originally very much like a

35:44

supporter of the poor. He's a populist in

35:46

the way that populism is

35:48

not in the proto-populism of today, but

35:50

in the actual populism where it means

35:53

people who are trying to support the poor

35:55

and try to create political policies that

35:57

benefit the poor. You

35:59

see him... originally a supporter of FDR, then

36:01

he kind of becomes like an avowed enemy of

36:03

FDR. He feels like he's failed, he hasn't delivered

36:05

for the people But he in

36:07

turn becomes a big fan of Mussolini

36:09

and Hitler and so you see his

36:12

radicalization happen, and

36:14

because he has this massive following,

36:16

you see him really kind of like take those

36:18

followers along for the ride And

36:20

then you see the sort of oh shit moment

36:22

right where FDR doesn't want to intervene because of

36:24

the First Amendment. He doesn't want to be seen

36:26

as, you know, stifling this critic of his. You

36:29

see the broadcasters; you know, radio is licensed,

36:31

right, so there's kind of, you know... And

36:34

you can think about it as kind of a

36:36

parallel to social media in a sense not a

36:38

government license But again somebody is controlling who gets

36:41

that slot So they're trying to figure out

36:43

like, how do we fact check him? He

36:45

starts to say crazy things. Kristallnacht,

36:47

right, where Jewish homes and businesses were

36:49

attacked, he says, oh, well, it was really the

36:52

fault of the Jews, right? So you see that

36:54

very kind of you know that rhetoric that you

36:56

can hear echoes of, or, you know, you sort

36:58

of see it again a century into the future

37:01

And so I spent a bit of time, like, trying

37:04

to explain that moment, right? You see, you

37:06

know, he's got a paramilitary organization that sort

37:08

of supports him. He's again. He's doing that

37:10

thing where he's like, well, you know, violence

37:12

is terrible, but also we need to fight

37:14

for, you know, Catholics need to stand up

37:16

and fight for their kind, right? And so

37:18

you see that that same kind of rhetoric

37:20

is out there. You

37:23

see the FBI going after these people

37:25

Because they really blow up like a

37:27

weapons factory? Yeah, yeah, so there's like

37:29

so they begin to do these sort of

37:32

street skirmishes. You know, there's a bunch of

37:34

different kind of political actions of these groups that

37:36

are nominally aligned with him, and

37:38

there's a question about how active he really

37:40

is. Yeah, there's

37:43

sort of plausible deniability there;

37:45

sometimes he's supporting it, sometimes

37:47

he's taking a couple steps back. You see

37:49

his supporters begin to protest as the broadcasters

37:51

do begin to crack down on him and

37:53

take him off the air, right? There's this

37:55

very dramatic set of

37:57

events that's happening. And you

38:00

know, you have media trying to figure out how

38:02

to counter speak effectively and then you have ordinary

38:05

like college professors that go and they

38:08

create this educational curriculum

38:10

called the Institute for Propaganda

38:12

Analysis and what I loved about it

38:14

was they literally annotated

38:17

his speeches with emoji, right? They

38:19

like, came up with this, like...

38:24

When I opened up the archives, I was like

38:26

this is actually, like, kind of fucking incredible.

38:28

I don't know how we forgot this, but

38:30

you have these professors who begin to say look

38:33

We're not going to try to fact check him.

38:35

That's a waste of time, but let us explain

38:37

why the rhetoric works, right? Here is why you

38:39

feel so emotionally enraptured

38:41

by Father Coughlin Here's the

38:43

kind of rhetoric that's being used in

38:45

this speech in this sentence in this

38:47

like smear campaign against this enemy And

38:50

so they come up with all these little

38:52

names like the glittering generality, right? And

38:54

then they literally make an emoji a diamond

38:56

emoji for the glittering generality and they take

38:59

all the speeches and they begin to release

39:01

them basically as pamphlets for the communities, for

39:03

people to just share in their, you know,

39:05

local, like, bowling club or whatever, you know, as they

39:07

just sit around listening to the radio maybe. And

39:10

they just kind of like dropped a little

39:13

diamond in the spots in the speech where

39:15

he's using that technique. And so the point

39:17

is almost like why are we wasting our

39:19

time playing whack-a-mole with fact checks when we

39:22

could be teaching people to recognize the sort

39:24

of tactics and tropes that propagandists use?

39:26

And so I spent a bunch of time on this in

39:29

the book, basically just trying to, like, make that argument: is

39:31

this sort of

39:33

like a lost knowledge from the 1930s like

39:35

a better approach to responding than

39:38

you know, than playing fact-checking games or

39:41

screaming at somebody and trying to counter them

39:43

by screaming at them? Is, like, exposing

39:46

the tricks actually a much more effective way to do

39:48

it? Which is basically the glittering

39:50

generalities, like, when you take, like... What

39:54

is a glittering generality?

39:56

It's something where you can paint an entire population, an entire group...

40:05

Oh, like, "as veterans, we're not getting our

40:07

benefits and all these immigrants keep coming in

40:09

and taking our things," so we want to

40:11

be like? Isn't that one where you're like

40:14

blaming the woes of, like, the VA on

40:16

immigration?

40:19

the grain of truth there a lot of

40:21

veterans who don't get their benefits are a

40:23

lot of that are and to have extreme.

40:25

Mental health, struggled, struggled in are

40:27

totally forgotten about that completely. Grounded

40:29

and so there is like it. It's

40:32

residents he says something about

40:34

it is true. The.

40:36

And a demonization that particular. Season

40:39

Two. Is not necessarily the

40:41

correct you know they raised to

40:43

get into the propaganda at the

40:45

other Angus. There is a problem.

40:47

The scapegoating is problem for the

40:50

argument that. There is that getting

40:52

a benefits. Isn't something that is

40:54

actively true right into their is? With

40:56

a grain of truth that ferrets it's

40:58

very hard. Because. Then you have

41:00

to respond by getting into a whole debate

41:02

about it. About it is sad

41:04

that this. Crisis.

41:08

Nuance here Who? so he can think

41:10

about that instead, you're much more likely

41:12

to respond, mostly because he. Is. Generality

41:15

appeals to you. Which is like, "biological males are a threat to women," when they're the ones, right, who fucking overfund the men and underfund the women and give, like, the women shittier training facilities. But it's like, the trans women aren't even fucking men, so, like, what are we talking about here, Sharon? You know what I mean?

41:33

Yeah, that is so... That is so true.

41:38

And

41:41

also, I realize when I started this, I was so set on misinformation and disinformation that I never really asked a guiding question, which is like: are we fucked? Is misinformation more prevalent than ever? So maybe that's, like, what it was in retrospect: like, in an election year, which we are in now, especially, first, is misinformation suddenly more prevalent than ever? And do you have any recommendations for, like, ah, how much time a day should one spend on taking in information at night?

42:10

Let's see... I think, as we said, whether people are trying to be influencers or not trying to make it a career, people don't realize how much influence they have, even just in their communities, right? So thinking about your role, like, why are you sharing, and how are you getting information out there, is really important. Are we fucked? I mean, you know, depends on who you ask. You know,

42:37

I wish I could be optimistic.

42:40

I'm really not, unfortunately, and that's distressing. And that's because one of the things that has persisted, like the belief that the 2020 election was stolen, has been really persistent. And it's not... you know, when it's expressed by, like, a person who has heard it because their media has told them and their elected leaders have told

43:05

them, that fact, that people have been misled by people who know better, or worse, cynically using them for purposes of maximizing their political power, is incredibly manipulative. And it's been happening for four years now, right? And

43:19

so that is where I think

43:21

the biggest challenge is actually going to be. The people are really dug in. They've been hearing for four years that an election was stolen from them by those other people. Every single campaign speech reinforces that. It has no basis in reality, no matter how many investigations there are into, you know, ballots and ballot integrity and voting machine integrity, no matter how many

43:47

times, like, Dominion sues Fox News, the, you know, Smartmatic, and we still pretend that somehow there was something there. And they don't full-throatedly repudiate it and emphasize on their programs, no, there was no evidence.

44:00

So we are in two very,

44:02

very kind of distinct realities about

44:05

the basic legitimacy of American elections.

44:08

And that's the thing that concerns me

44:10

because it's one thing to disagree about,

44:12

you know, trans policies,

44:15

abortion policies, more policies, you name

44:17

it, while at least acknowledging that

44:19

an election is free and fair. And

44:21

we have hit a point where whatever direction

44:24

it goes, my real concern is that people

44:26

are going to be convinced that the winner

44:28

is illegitimate. And I think

44:30

that's actually a foundational problem for

44:32

American democracy in a way that

44:34

all of the other kind of issue-based debates

44:37

are predicated on the idea that we get to

44:39

fight those out at the ballot box. And if

44:41

we then create the perception that the ballot box

44:43

itself is not legitimate, that's where I think the

44:46

real concern is. So

44:49

my own work, my own kind of focus for

44:52

2024 is actually trying to understand more of

44:54

the narratives about election

44:57

de-legitimization, right? It's just,

44:59

I am not going to, you know, in

45:01

2020, we ran a project where

45:03

we just looked at false

45:05

and misleading claims related to voting. We

45:07

didn't pay attention to Hunter Biden's laptop.

45:10

We didn't pay attention to what

45:12

candidate A said about candidate B. We did

45:14

not care about the broader kind of culture

45:16

war, you know, issues of the day. The

45:18

only thing we cared about was what, you

45:21

know, were there lies about the procedures and

45:23

kind of protocols of voting, like these things

45:25

that they vote on Wednesday, not on Tuesday,

45:28

or lie to you about when early voting

45:30

ends, things like that. And then were they

45:32

working to de-legitimize the election? And

45:34

it was overwhelmingly the latter. It was

45:37

so much de-legitimization. And that's what I

45:39

think we really need to be focused

45:41

on, just as people who believe in democracy and

45:43

want to see the American project continue. Especially with

45:45

like generative AI, it's like you could make, I

45:47

mean, you know, when Trump's always talking about like

45:49

stuffing ballot boxes and like, you could

45:51

just make videos of that sort of thing. And

45:53

that's where I think, again, kind of going

45:56

back to our chat about the, you know,

45:58

the sort of fog of unreality. that

46:00

we started to see play out in Ukraine

46:03

and Gaza, right, these sort of very pivotal

46:05

moments. I think you will see

46:07

some of that play out during an election. And

46:09

one thing that becomes a challenge is if a

46:12

fact checker or authenticator says, no,

46:14

this isn't real, then the response

46:16

is going to be, yeah, but you don't trust that fact checker.

46:19

That's a mainstream media fact checker. And so

46:23

I think that that's

46:26

one of the things that I think is going to

46:28

be challenging about generative AI. It's going to be

46:30

even if something is created and identified as created,

46:32

there's going to be people who are going to

46:34

continue to want to believe it. And

46:37

that is where I think that kind

46:39

of divergence in reality is really toxic.

46:42

Yeah, I need to make a note of this, like

46:44

what I'm curious about now, because it's like that fucking

46:46

electoral college, honey. Biden won like

46:48

those six or seven states by like 110 or something.

46:51

I think Trump won those six or seven

46:53

or whatever his coalition was by like a

46:55

little bit less. But the difference in the

46:57

popular vote is so big that it's

47:00

crazy that California, New

47:05

York and these states that

47:07

have way more population just

47:10

were still ruled by this

47:12

antiquated fucking electoral college. It was

47:14

literally meant to empower rural fucking

47:19

people, not even rural, people

47:21

who were participating in

47:23

the Transatlantic Trade directly. It

47:25

was really about like the South and like

47:27

making sure that lesser populated states didn't get

47:29

like out, like

47:31

they wanted more power. And so they fucking gave it to them.

47:33

And we're still paying the price. I mean, it is

47:36

definitely worth doing a podcast on American governance. You

47:38

got to do one on American governance. Shit.

47:41

Okay, so you said something really important just a moment ago, which was, I

47:44

think a lot of people don't or you said a

47:46

lot of people don't even realize like the power that

47:48

they have to combat misinformation in their own lives and

47:51

like with their own platform in their own communities. It

47:53

makes you think about certain family members of mine who

47:55

just like, don't fucking

47:57

talk about anything because they're like, that

47:59

doesn't affect me. So I'm just gonna like not

48:02

fucking talk about it at

48:05

all. And actually, I'm going to like enable

48:08

all these other fuckers talking

48:11

to you. You know who you are. You

48:15

guys, I found out that like a friend of

48:17

the friend of a friend of a family member

48:19

who's actually closer than what I just said, literally

48:21

had a fundraiser for fucking Casey DeSantis in

48:24

my home fucking town. I've been in that

48:26

house. Like fuck

48:28

me with this empty coffee

48:31

cup. Like I cannot

48:33

handle these people. So anyway, is there

48:35

a trusted source of information anymore or

48:38

what? Man,

48:41

you know that I think... Besides

48:44

me. I know everyone. I am. No,

48:48

no, no, no. That's really the question,

48:51

right? That's one of the things that you see.

48:55

And I write about, I have a whole chapter in the book

48:57

on COVID and I was like, oh boy,

49:00

we're gonna lose some folks out there. But because I

49:02

think that you can again, hold two ideas in

49:04

your head, which is: institutions, at times,

49:06

fell short, especially from a

49:08

communication standpoint. And also

49:10

there were people who profit from

49:13

and maximize their own clout based

49:15

on undermining confidence in institutions. So

49:17

both of these things are true. And one

49:20

of the things that's become a challenge is

49:22

you do have this like kind

49:25

of, you know, this fracturing

49:28

along tribal and identity

49:30

based lines. And

49:33

what you see happen there is like,

49:35

you can just dismiss a media outlet

49:37

because they're not trustworthy. I

49:39

have tended to, like,

49:41

okay, what can I get from the Associated Press? What

49:45

can I get? Like, where are the areas

49:47

where the Wall Street Journal and the New

49:50

York Times and, you know, Reason

49:52

and Mother Jones all kind of have

49:54

the same body of facts. You know, is there at least

49:56

some, maybe they're going to spin it differently, but like, where

49:58

are the facts in that? in that kind

50:01

of rubric.

50:04

And then I think about like, if

50:07

I want to share to somebody like a family

50:09

member or a friend or someone where like we're

50:11

just not aligned on an issue and I think

50:13

that they have that information, which

50:16

outlet are they going to be most receptive to and

50:18

how can I find a way to

50:22

present something palatably? I don't know if that's the

50:24

right answer, that's how I have started doing it.

50:27

But another way you can do it is like, you

50:30

can see somebody post something or share something and

50:32

you can send them a DM. You don't have

50:34

to necessarily like blow them

50:36

up in the public comments. If

50:39

you have a huge disagreement with someone over

50:42

something as complicated as the Middle East

50:44

or as easy as your recycling, but

50:47

it really depends on you having like a shared

50:50

set of facts, but both people don't have

50:52

a shared set of facts or haven't even done enough

50:54

research to really understand

50:56

the history and what we're talking about.

50:59

Like, is it worth a conversation? If someone's

51:02

really into propaganda, are

51:05

you not the right person to have that talk

51:07

with them? Can we come to consensus? And

51:10

is there any like personal policies that

51:12

could make an impact on how we

51:14

could be better at that? I

51:17

have tried to do, you know,

51:19

I have friends,

51:22

you know, I think across the political spectrum, I had

51:24

a fellowship in 2017, right? And

51:26

it was the Bush

51:28

and Clinton foundations, the LBJ foundation and

51:31

Bush too. So it was like half Democrats,

51:34

half Republicans. And I

51:36

really liked it because, this

51:39

was in 2017, I felt like it

51:42

was at a time when polarization felt like it

51:44

was getting worse and all of a sudden, I

51:46

had like these 50 people that I saw once

51:48

a month and then talked on the internet for

51:50

this sort of fellowship program for six months. And

51:53

I felt like I came away with such a

51:58

group of people where my experience was that

52:00

we were always engaging in good faith. That

52:02

was the big line, right? You know?

52:06

And I just felt like, and you know, we're still friends, you

52:08

know, it's been six years now, very,

52:11

very close. They're like a second family. And what I love about

52:13

it was that it was just a way to think about, you're

52:16

not gonna always agree. You're gonna

52:18

have your fights. We don't have

52:21

to have them in full public view. We

52:23

can have conversations privately. For

52:27

me, I, that's

52:29

how I've chosen to be most of the time

52:32

on social media. Like, look, there are a couple of areas

52:35

where I do feel like

52:38

I'm going to get in there and I'm gonna fight, right? There

52:40

are like, you know, I have three kids and

52:42

education policy is really important to me, vaccine policy is

52:45

really important to me. And in certain areas where I'm

52:47

like, okay, this is where like, I will let it

52:49

all, you know? But

52:52

like, are there ways to do it without being,

52:57

without like smearing people, right? Without making

52:59

somebody into a caricature. And I know

53:01

what that feels like because it happens

53:03

to me quite a bit too, right?

53:06

And it feels bad. More

53:09

importantly, I imagine you have this feeling,

53:11

you feel like you can't fight that, right?

53:13

And so you're always gonna see like that one

53:16

negative person, that one asshole who's like, in your

53:18

mention, smearing you as a thing. You feel almost

53:20

like you have to respond. Those are

53:22

the moments where I'm like, you know, I actually don't. Unless I

53:24

feel like I might be able to speak to

53:27

the bystanders where like it's worth responding for the

53:30

sake of the bystanders, then sometimes I'll do it.

53:32

But otherwise I don't feel that I necessarily, it's

53:34

not my job to have to fight with that

53:36

person in that moment. If

53:40

I don't feel like it's a good faith encounter, like again,

53:42

I said, I feel like I have this community of people

53:44

where I do feel like I understand like, a

53:47

good faith discussion can leave you feeling closer

53:49

to the person afterwards. It can leave you

53:51

feeling more informed. Maybe you didn't come to

53:53

an agreement. Maybe you agree to disagree, but

53:56

you have that, that sense

53:58

that you've come away, like, as

54:00

humans who have had a

54:03

conversation as opposed to wasting

54:05

your time just

54:08

being like shit on by some rando who's just gonna

54:10

go off and do it to the next person. Which

54:12

like even if they are trying to do it

54:14

in good faith on social, it is really hard because

54:16

it's like you see the text in whatever mood

54:18

you're kind of in. Exactly, however you

54:21

choose to read it in your like emotions and your

54:23

feelings in that moment, yes. It's just not

54:25

an easy place to feel like you've had

54:28

a conversation to come up closer with someone

54:30

when it's through text only and you've never

54:32

met them. Right, right and

54:34

I think I've had, I mean you probably even had this

54:36

with your friends, right? You get a text and

54:38

you like feel like they're being snippy maybe and

54:40

you're like offended by it and

54:43

then you know I had this happen

54:45

with a friend of mine and I actually was like okay

54:47

I've valued this relationship let me call her, I

54:50

can actually ring her phone which is not a

54:52

thing I normally do, you know? And

54:56

you know and I felt really glad that

54:58

I had made that choice, right? It was a complete

55:00

misunderstanding. I had read the situation wrong.

55:03

It was you know, she

55:06

did not realize that I was far more upset

55:08

about the issue that she'd kind of you know

55:10

made comments about than I was and it was

55:13

again a relationship, like a moment

55:15

of connection between two people. I

55:17

think social media, we're not necessarily

55:20

equipped to be broadcasting

55:22

at all times, receiving feedback

55:24

at all times. Everything

55:27

you ever do is forever. I'm sure

55:29

you know this. You might think it's fleeting

55:31

but someone has screenshotted it and they will

55:33

be there, you know? You know? And

55:38

then you live in like in

55:40

fear and self-censorship about you know

55:42

the like what is the, I

55:44

mean you do this but I definitely, when

55:47

I'm speaking about some contentious political issue or

55:49

even things that shouldn't be contentious like

55:51

me saying the 2020 election was not stolen, as

55:53

I'm speaking I'm like, I'm interpreting,

55:56

like, what is the worst faith interpretation

55:58

of this sentence? Who in

56:00

my circle can't account

56:03

for the context? What does it mean for

56:05

my kids? Who is gonna

56:07

just, like, screenshot this?

56:09

As I'm speaking I feel

56:11

like, I mean, for me, I've been doing

56:13

this now for, like,

56:15

seven years, and

56:17

you develop a weird, weird skill,

56:20

right? For us,

56:22

is there any, like, policy? Like, we're talking about Father Coughlin, like the censorship and First Amendment stuff. Is there any policy being made around this? I mean, are there policies? Because with misinformation and disinformation, like, someone could really think that, like...

56:47

Really, one thing that Nadia taught in that episode about mis- and disinformation that I didn't want to hear is that, you know, both sides kind of do it. And my whole thing, as like a fuckin' leftist, I was like, yeah, but when the left does it, it's like to push the principle that, like, protects more people, whereas I feel like when people on the right do it, it's like typically the policy harms more people than what it's trying to protect, which I feel like is kind of a significant distinction to me.

57:14

We didn't, like, really end up getting there, she was just, I was like, I don't know, because it was like vaccines, or like abortion, where it's, you're hurting more people than, but then it's like, also, the flip side is that "we're protecting women's sports," but it's like, that's such a conversation. So,

57:36

aired like his early as Hill or like our

57:38

a Move to Sell Free Speech or mean I

57:40

think even just like on social how I met

57:42

a made it so that you couldn't. Like.

57:44

I did this abortion. This video about

57:46

abortion a few weeks ago. That literally.like so

57:49

many lakes in if it. were

57:51

really hate gays or got up like a

57:53

million views but it had like. Really

57:55

low sears and most the time if

57:58

I had something that got to.money. We

58:00

got that me like this year it

58:02

would be like off the charts but

58:04

I could see how this new policy

58:06

of like no political stuff it's impacting.

58:08

they algorithm on Instagram a lot like

58:10

my podcast out Park Platform end my

58:12

regular when because they do talk about

58:14

political things they can see the way

58:16

that it's like doesn't reach the same

58:18

people in the Met Adidas are just

58:20

kind of encourages you to like, talk

58:22

about hair and like seek your and

58:24

handling you know yes. That's

58:26

a really big question. So, my reaction is: nobody is ever happy. Every single group feels that the algorithm is out to get them. I can tell you that most of the time it is conservatives claiming that the algorithm has been out to get them, if you talk to the people that are already engaged. I'm really not on Twitter much anymore. I don't enjoy it.

58:48

Yeah, I got off it too. I enjoy my friends' stuff, but none of this, you know. I like keeping up with people on there, but I'm not battling with some guy about policy in the comments anymore. So, there's this idea that people on Twitter live with, this idea that the algorithm is suppressing you.

59:10

And one thing that I've always found very interesting with a lot of the people who are most riled up about it: it usually started in, like, 2017. You heard it from influencers talking about these labels, you know, some had a label put on their content, it got downranked, in some extreme cases blocked from sharing, in the most extreme cases taken down. But you'd

59:34

see ordinary people talking about how shadow-banned they were, and I thought, this is such an interesting phenomenon. I'm like, why do you think that is? What's the evidence? Why do you feel that this is a thing? There was this swirl of cases with no labels on their tweets, sort of like, they were replying to people, but they were convinced of it anyway, and repeatedly the answer was that their engagement was down: my friends don't see all of my content, right? They were basically convinced that because the people who follow them didn't see all their posts, that was evidence that there was really conservative-viewpoint-based suppression going on. Whereas the

1:00:13

reality is, the platform decides; the algorithm is

1:00:15

the kingmaker. Basically I spend a whole bunch of

1:00:17

time on this in the book in the context

1:00:20

of like a case study on the Facebook Watch tab, because we have visibility into it. A tool called CrowdTangle gives us visibility into what's happening on the platform; researchers can look at public posts through the data.

1:00:35

I'd been paying attention for a while, and there were these videos of women, the next thing I see in my feed, like, making SpaghettiOs on the counter, these really weird food videos that were having this moment, constantly surfacing in your feed. So, because I'm into these things, I started looking into it, and there's a YouTuber with, like, you know, tens of millions of subscribers, and he DMed me, he was seeing the same thing, like, what's up with this on my feed? And we were talking about it, because he's pretty different from me, he lives in a different part of the country, you know, so why are we seeing the same content?

1:01:11

I started looking at it, and these pages actually don't have many followers at all. A lot of them have very, very small followings, but the algorithm is just pushing and pushing and pushing them out. And what you see in the comments is, like, it's nasty food content with a sensational headline. It is a weird thing: people are watching it with, like, a sick fascination, like they're grossed out but can't look away. It's got that quality of a train wreck. And

1:01:39

this is the thing that the algorithm, when it was loosey-goosey, was boosting. You see billions of views on this content network. And then all of a sudden the algorithm changes, and the views just crater, it's like a vertical drop off a cliff. And that's because somebody, you know, some team or whoever is responsible for looking at the Facebook metrics, maybe realizes that the comments are not very favorable. The comments are like, why is this gross stuff in my feed again? It counts as engagement, but it's not, like, positive.

1:02:08

So what you start to see is, like, the algorithm gives and the algorithm takes away. And in this particular case, what I liked about the case was there was nothing remotely political about this content at all. It was purely, like, we have decided that this is not good for us anymore, this is not good for users, and so boom, there it is. And you see this network over the next six months trying to climb back up, to get back to where it was. The reason I use this example is: the platform is never neutral;

1:02:39

they are always out there trying

1:02:41

to decide what to show you

1:02:43

and your followers.

1:02:46

And it's often times like they're

1:02:48

trying to balance things that are

1:02:51

explicitly harmful, right, and illegal, against

1:02:53

things that are highly offensive and

1:02:55

inflammatory. And there's a line there that they have to draw, and particularly on things that are offensive and inflammatory, different people have different points of view on what is offensive and what is inflammatory. So they're out there, you know, tweaking these levers behind the scenes, and it feels like everybody ends up angry and disgusted and irritated at moderators. Ah, but

1:03:19

the only real solution to this I

1:03:21

would argue is like giving users more

1:03:24

direct control over their experience, right? Where,

1:03:26

when they have chosen to follow somebody

1:03:28

that is treated as a very strong

1:03:30

signal, and then that content is pushed to them, or they see a lot more of it. The flip side of that, though, is you are basically saying, if people choose to follow, you know, heinous people, purveyors of hate, then as a platform, no matter what, that's what they're going to see. And either way, you make determinations for

1:03:52

what should be surfaced, what should be amplified,

1:03:54

what should be de-boosted. It is actually

1:03:56

a very, very complicated series of questions because

1:03:58

it has to work across the entire

1:04:01

system. So that becomes a problem, right? Somebody who is, like, a good person creating content that they feel is, you know, in the interest of sharing humanity and helping people is ultimately going to be subject to the same rules as people who are race baiting or doing other things, if you decide to say, like, if you follow somebody, you should see all their posts.

1:04:22

So. We're nearing the end, so I want to ask about your book, Invisible Rulers: The People Who Turn Lies Into Reality. You've already given us, like, a little itty-bitty peek into what you cover, but is there any, like, people who you cover, things you cover, that you were really particularly surprised about, or do you wanna leave us, like, a little breadcrumb for people to get the book and read? Oh yeah, I

1:04:44

mean, you know, a lot of it is kind of a memoir, which is not what I was expecting. I set out, like I said, to write this book about what propaganda looks like in the modern era, you know, where are the interesting questions, how do I tell them. And what happened with that, you know, congressmen started smearing my colleagues and me. Jim Jordan subpoenaed me, dragging us in front of Congress, because we did work on the 2020 election. And I thought, like, wow, this is really... there's a line there.

1:05:17

You talk all about being subpoenaed by Jim Jordan in the book. About what happens when, like, all of a sudden me and my colleagues became the subject of, like, you know, propaganda campaigns about our work, in certain ways, and what that was like.

1:05:36

Again, you feel like, oh my gosh, how do I correct the record, and then you realize that for some people that just is the news, and you're experiencing it at a very personal level. And so, you know, the book really, again, started off as this sort of, you know, provocative investigation of political influencer culture, and then turned into, like, oh hey, now I'm a character in it. You

1:06:06

guys, if that's not, like, a reason to fucking run out and get this book, where she writes all about that experience... Renee, thank you so much for your time, thank you for coming on, thank you for writing this incredible book, and thank you for doing this hard-ass, thorny, nuanced work, queen. We appreciate you, we salute you, and thanks so much for coming on the show. You guys, did we learn the things or did we learn the things? We absolutely

1:06:32

did, and now I have so many more questions. So, misinformation: that's when something is wrong accidentally. Disinformation is intentionally untrue and is much more related to propaganda. I also thought, really, sitting here, like, propaganda is really, like, information that's spread by someone who has, like, a political agenda, which is why, like, honey, I'm part of the gay agenda, which is, like, keeping queer people alive and having, like, good access to, like, economic security, and, like, housing, and, like, being able to be okay and to thrive. Ah, so that's really what our agenda is, and it's not as scary as the others. I took a lot of interesting

1:07:12

things away from this conversation. One

1:07:15

of them is that, like, we really

1:07:17

got to read Renee's book, because getting

1:07:19

subpoenaed by Jim Jordan sounds like something that

1:07:22

would not be very fun. Anyway, but we

1:07:24

need to listen to this. We really get

1:07:26

involved in things that we think have a

1:07:28

direct impact on our lives and when we

1:07:31

don't think that those things have a direct

1:07:33

impact on our lives, we just don't concern

1:07:35

ourselves. And I don't know that this

1:07:38

is for everyone, but I just thought that

1:07:40

was interesting, the ways that people take

1:07:42

some information and run away with it on

1:07:44

TikTok, or just on social media.

1:07:47

It is so dehumanizing and so absolute

1:07:50

and, like, the good faith part, when Renee was like, are people really engaging in these conversations in good faith, I think especially on TikTok, a lot of the people are not engaging in these conversations in

1:08:03

good faith, they are rage

1:08:05

baiting or clickbaiting

1:08:07

and they are painting themselves

1:08:09

as like the arbiter of

1:08:12

morality. And I think no one is really, I mean, we really need to look at what people are actually doing. Like, it's just, I don't know, I think it's really interesting, the ways that, like...

1:08:25

So many things are true at once. I

1:08:27

think that actually a lot of people who are trying

1:08:32

to make content to illuminate an

1:08:34

issue end up sometimes doing more

1:08:37

harm than good, which is something

1:08:40

I'm really interested in trying to

1:08:42

learn more about now. I also thought that the

1:08:45

way that social media, like she was talking about,

1:08:47

really shifted everyone to becoming like a commentator

1:08:49

and the way that a commentator is

1:08:51

not a journalist and the training is not

1:08:54

the same, that was really interesting and

1:08:56

something for us to spend more time on.

1:08:58

And it's scary to think about

1:09:00

the generative AI and where that's going to

1:09:02

be going. Whew! We need to see what's going to keep happening. My worry is, it's like, you can't really fact-check a generated image, because it may seem legit if the computer can't tell that it's, like, a made-up image. That would have been, like... I'm sorry, I just moved my little chair in my car. Oh,

1:09:20

and then also the foundational issue with

1:09:22

the 2024 election: de-legitimization. This is really what Renee remains most concerned about, the fact that so many people still disagree on whether or not the 2020 election was legitimate. Okay, so I'm

1:09:37

now really curious about several things.

1:09:40

Ah, one of them is that

1:09:42

Father Coughlin, I need to understand more

1:09:44

about that. I'm also curious about

1:09:47

narratives that state

1:09:49

actors try to put out in the

1:09:51

United States. I'm also more curious about the Mueller report now. And

1:09:57

Russia, China, Iran Collaboration.

1:10:00

What does that look like? Is that

1:10:02

true? So really, what I took

1:10:04

away from this episode is that like

1:10:06

how you do anything is really how

1:10:09

you do everything, and almost anything

1:10:11

in our life is political.

1:10:13

And so just really see that with open eyes.

1:10:18

I love you guys, and thanks for listening to Getting Curious with me. Thanks so much for your time. You've been listening to Getting Curious with me, Jonathan Van Ness. You can learn more about this week's guest and their area of expertise in the episode description, and follow us on Instagram @CuriousWithJVN. New episodes drop every Wednesday, and make sure to tune in every Monday for episodes of Pretty Curious, our podcast that we love all about beauty. Get into it! Still can't get enough, and want to go a little spicy with us this season? Subscribe to Extra Curious on Apple Podcasts for commercial-free listening and our subscription-only show, Ask JVN, where we're talking sex, relationships, and really anything that's ever crossed my mind. Our

1:10:55

theme music is "Freak" by Quiñ; thank you so much to her for letting us use it. Thank you as well to our editor and engineer. Getting Curious is produced by me, Chris McClure, and Julia Melfi, with production support from Julie Carrillo, Anne Currie, and Chad Hall.
