Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:01
Support for Land of the Giants comes from Mint
0:03
Mobile. Do you like having to
0:05
deal with hidden fees baked into your traditional
0:07
wireless plan? Didn't think so. Thankfully,
0:10
Mint Mobile is doing things differently.
0:13
Right now, Mint Mobile's plans start at just $15
0:15
a month, and every plan comes
0:17
with unlimited talk and text plus high-speed
0:20
5G, which means you can browse, buy,
0:22
and set up your phone plan totally online.
0:24
I've done this myself with my own money. It's
0:27
easy. To get your new unlimited wireless plan
0:29
for just $15 a month and get
0:31
the plan shipped to your door for free, go
0:34
to mintmobile.com slash
0:36
giants. That's mintmobile.com
0:40
slash giants. Cut your wireless
0:42
bill to $15 a month at mintmobile.com
0:45
slash giants.
0:55
Breaking news overnight, violence in
0:57
the streets of Ferguson, Missouri. Tear
0:59
gas, Molotov cocktails, and
1:01
gunfire. Protesters
1:03
furious over the police shooting of unarmed
1:06
black teenager Michael Brown. The
1:08
National Guard, the National Guard now
1:10
being deployed to the area. The very
1:12
latest overnight and the new details.
1:14
I was sitting on my couch and I saw
1:17
the protests on CNN. In
1:19
August 2014, DeRay McKesson
1:22
was living in Minneapolis. He was watching coverage
1:24
that didn't make any sense to him. And it
1:26
looked like the wild protesters. It was like, these
1:28
people are
1:29
nuts and they don't care about community and they're
1:31
destroying things. This was another very
1:34
tense night. And police say
1:37
that this was not civil disobedience. This
1:39
was aggression toward police.
1:42
And I remember going on Twitter and Twitter was just telling
1:44
a different story. On Twitter, it was
1:47
the police are trying to kill us. This is crazy. It
1:49
left this Biden street for one and a half hours. Why they kill us unarmed
1:51
kid. And I just remember being
1:54
like,
1:55
it's not me, right? I just remember
1:57
that dissonance. McKesson
2:00
decided to drive south eight hours to
2:02
Ferguson. Do you think if you
2:04
were not consuming Twitter back then that
2:07
you would have gone anyway? Was
2:09
the coverage on CNN enough to provoke
2:11
that emotion in you or was it the stuff
2:14
you were seeing on Twitter?
2:15
Without Twitter, I wouldn't have gone.
2:17
And when McKesson got to Ferguson, he stayed
2:19
on Twitter.
2:20
Twitter was like how you knew where to go to volunteer.
2:23
Like, this thing happened or this
2:25
is where the protests will be tomorrow. Twitter
2:27
was- It was useful on the ground for people
2:29
who were there to organize. From day one,
2:31
yeah. Other protesters began looking
2:34
to him for information. So like I was
2:36
like the town crier. I was the person doing
2:38
a lot of the TV interviews, pushing back on the police narrative.
2:41
Very quickly, it was like if I tweeted a
2:43
location and a time, people would come. That was
2:45
sort of my superpower. And your role was
2:47
sort of broadcaster. If you needed a message
2:49
to get out, I was your guy. And people
2:52
just trusted me. So like if I came and tweeted
2:54
it, they knew it was real. People
2:56
would sort of take it as a serious thing. Twitter
2:58
wasn't just an organizing tool.
3:00
McKesson realized he could use it as a megaphone.
3:03
When I realized that the traditional media was
3:05
not going to be our help at all,
3:08
I was like, okay, this is Twitter is like literally
3:10
the only place where we have a chance.
3:15
People take for granted now that when you think about protests,
3:17
you've seen aerial footage, you've seen all this stuff.
3:19
Remember in the early days, the state of Missouri put
3:21
a no-fly zone over St. Louis. So they
3:23
controlled almost all of the
3:25
narrative, you know. Like, there was the mainstream media
3:27
with CNN. Those sort
3:29
of places weren't really pushing back on the police back then. So
3:32
Twitter was like our only mechanism. McKesson
3:35
wasn't the only one who noticed Twitter's influence
3:38
on the protests. I remember being out on
3:40
West Florissant, which was like the main street, the protest one. And
3:42
I saw him in a white t-shirt and I was like, I think that's
3:44
Twitter guy. Twitter
3:47
co-founder Jack Dorsey grew up in St. Louis,
3:49
a few miles south of Ferguson. At the
3:51
time, Dorsey was on Twitter's board, but
3:54
he didn't have an operating role at the company. But
3:56
he was still recognizable
3:58
and it meant something to have
4:00
a big deal tech founder on the
4:02
ground during a roiling, ongoing
4:04
protest. It was definitely
4:07
unusual. I interviewed Dorsey
4:09
a couple years later at the Code Conference, and
4:11
he told me why he went to Ferguson. Over
4:13
the past nine years, we saw
4:15
so many acts of
4:18
activism and revolution
4:21
and questioning carried
4:23
out through Twitter, but it
4:25
was always somewhere around
4:27
the world. It was never this close
4:30
to home, and I just felt
4:32
I had to be there. I had to bear witness to what was
4:34
happening. The idea that Twitter
4:36
could be used for activism, meaningful activism
4:39
that helped change things, had been around the
4:41
company from the start. We opened
4:43
this season with the story of Iranian protests in 2009
4:45
and the widespread belief that
4:48
Twitter played an important role on the ground
4:50
during those events.
4:52
And in retrospect, that seemed like a stretch.
4:55
But over the next five years, Twitter really
4:57
did play some role in activism around the
4:59
world, and it was definitely important
5:02
in Ferguson. Dorsey
5:06
was proud of the way people were using his invention.
5:09
You can hear it in that clip from Code.
5:12
By the time we did that interview, Dorsey was CEO
5:14
of Twitter again, and he was doubling
5:17
down on this message. He agreed
5:19
to appear at Code on one condition. He
5:21
wanted to bring someone else on stage with
5:23
him. It was DeRay McKesson.
5:26
And Dorsey showed up wearing a t-shirt.
5:28
It had the Twitter logo and the words, Stay
5:31
Woke. In 2009,
5:33
Twitter's leaders had been nervous about taking
5:36
sides, but people within the company
5:38
had always imagined Twitter could be used
5:40
for social good. And now Dorsey
5:43
was explicitly saying that this was the
5:45
company's ambition, to
5:47
be a tool of revolution. The
5:50
idea was, if people have access
5:53
to Twitter, and they can say whatever they want, they'll
5:55
do good things with that power. But
5:58
that idea was about to come up against
6:01
reality.
6:02
Because at the same time Dorsey and McKesson
6:04
were on stage with me in the spring of 2016, there
6:07
was an entirely different cast of characters
6:10
harnessing Twitter's power to tell a story.
6:13
And their frontman, he crushed it
6:15
on Twitter. You know what? I have millions
6:18
of followers at Real Donald Trump. I
6:21
have millions of followers.
6:28
This is Land of the Giants: The Twitter Fantasy.
6:30
I'm Peter Kafka.
6:39
Donald Trump loved Twitter. And
6:42
at one point in Twitter's history, its founders would have been
6:44
delighted to see a president of the United States
6:46
tweeting nonstop. But
6:48
the way Trump used Twitter vexed
6:51
its employees in ways they would have never
6:53
predicted years before. And
6:55
that helped push Twitter to rethink its responsibilities
6:58
and remake itself on the fly.
7:01
We're still dealing with the consequences. To
7:05
find out how Trump got on Twitter in the first place,
7:07
I asked the reporter who knows him better than
7:10
any other journalist.
7:11
My name is Maggie Haberman. I am a senior political
7:14
correspondent for The New York Times.
7:15
Haberman has covered Trump for years, starting
7:18
back when he was just a New York real estate guy
7:20
with a spotty record. Her biography
7:22
about him is called Confidence Man.
7:24
Haberman says Donald Trump learned about Twitter
7:27
back in 2009 from his
7:29
book publicist. He was promoting one
7:31
of his many books. And so
7:33
for one of the books, the person
7:35
helping him promote it suggested
7:38
to him in a meeting, there's
7:40
this thing called Twitter. And this would be a good
7:43
way to promote your book. The tweets
7:45
were not done by him. Initially,
7:47
they were done by aides, mostly
7:50
by a guy named Justin McConney.
7:52
Trump was raised on TV and print newspapers.
7:55
When he complained to journalists about their coverage,
7:58
he'd scrawl something on their story with a marker and mail
8:00
it to them. He didn't use a computer.
8:03
But one day, Trump surprised his own team.
8:05
A tweet suddenly showed up on Trump's Twitter
8:08
account and Justin McConney had not
8:10
done it. And McConney later described
8:13
to a reporter the moment
8:16
as being like the moment in Jurassic
8:18
Park when the dinosaurs can open the doors.
8:22
Trump locked in right away. He
8:24
was increasingly
8:26
and authentically himself. He
8:29
was savage about people who he
8:31
considered to be his enemies.
8:34
And he was testing it. His assistant
8:37
sent an email to one of his political
8:39
aides making clear that Trump had been testing
8:41
out messages on Twitter and
8:44
looking at what took off and what didn't.
8:46
And he really put a tremendous amount
8:48
of
8:49
work into this Twitter feed. Trump
8:52
started out using Twitter just like anybody else.
8:54
He posted boring stuff about himself. Then
8:57
he figured out that people were more interested in his tweets
8:59
about celebrities like Robert Pattinson and
9:01
Kristen Stewart. Over time, he
9:03
honed it as a political weapon, using it to spread
9:05
birther conspiracies about Barack Obama.
9:08
He obviously jumped into the swimming
9:10
pool of social media like Hulk jumping
9:13
into the pool, like all the water and everybody else is
9:15
flying out of the pool.
9:18
Ben Smith is the former editor in chief of Buzzfeed
9:20
and the co-founder of Semafor.
9:22
In the 2010s, Buzzfeed represented the bleeding
9:25
edge of social media driven journalism.
9:27
For Smith,
9:28
Twitter provided an unending supply of
9:30
story ideas, a tip line open to
9:33
everyone on earth.
9:34
At first, I thought Twitter was an incredible
9:36
assignment desk because there was a gap
9:39
between basically the questions
9:42
that were being asked explicitly and latently
9:44
on Twitter and the capacity of people who were on Twitter
9:46
to get them answered. That was a kind of assignment
9:48
desk where it's like, the assignment
9:50
is here is a question that we don't know the answer to. You
9:53
can take your reporting tools and go answer it.
9:55
As we've mentioned, journalists flock to Twitter right
9:58
away. Reporters would gossip
9:59
there, they'd share their scoops there, they'd praise
10:02
and fight with each other. And even if
10:04
you were a journalist who didn't spend time tweeting,
10:06
you'd still use it to see what other people were
10:08
talking about. And that could affect what you'd
10:10
cover. By the time Trump announced
10:13
his campaign for president in 2015, he had 3 million followers
10:17
and an instant coverage making machine.
10:20
Trump live tweeting the Democratic
10:22
National Convention, posting Bernie
10:25
Sanders totally sold out to crooked
10:27
Hillary Clinton.
10:28
Trump tweeted today. Happy Cinco de Mayo, the
10:30
best taco bowls are made in Trump Tower Grill. I
10:32
love Hispanics.
10:33
A lot of other people
10:35
were thinking about social media and had like
10:38
social media consultants thinking with them
10:40
about how to optimize engagement on
10:42
social media. He was watching television and tweeting
10:44
at the TV and for
10:46
a while really programming television. He was
10:48
saying, talk about this next, and then they would.
10:51
There was another group watching Trump's every
10:53
tweet. His candidacy lit up
10:55
the very online fringes of the far
10:57
right. They figured they finally had
11:00
a guy who believed what they believed. And
11:02
those folks knew how to work Twitter to
11:04
run influence campaigns. They'd
11:07
learned it
11:07
during Gamergate.
11:09
They knew that you could plant news online
11:12
and in small doses, if it gets
11:14
in front of the right people, it can trade
11:16
up the chain to national media.
11:19
That's Joan Donovan, who studies online misinformation
11:22
at Boston University.
11:24
That fact of tricking
11:26
journalists, hoaxing journalists became
11:28
like a drug on
11:31
4chan. It was like a game.
11:33
One big aim of the alt-right trolls was to
11:35
get Trump to retweet their stuff. And
11:38
once he started campaigning for president, sometimes
11:40
he'd do that.
11:42
Trump was very good
11:43
at following certain
11:45
provocative people on
11:47
Twitter and then either replying
11:50
or retweeting them. And so Twitter
11:52
was this amplification mechanism
11:54
that was a bit of a wink and a nod. I
11:57
knew we were in for trouble during the...
12:00
election when Trump had retweeted
12:02
a Pepe meme.
12:04
Pepe the
12:06
Frog is a cartoon frog.
12:08
He didn't start out as an alt-right meme but he
12:10
became one. There's a whole movie about it
12:12
if you really want to go deep. The main
12:15
thing to know about Pepe is that if you didn't know
12:17
what you were looking at, you saw a cartoon
12:19
frog.
12:20
But if you were an alt-right person who liked memes,
12:23
you knew it was a wink and a nod toward you.
12:26
Here's Maggie Haberman again. In a weird
12:28
way, a retweet is almost like a perfectly crafted
12:31
thing for Donald Trump because
12:33
it's a way for him to pass off someone else's thought,
12:35
take some ownership, but have a little distance if he
12:38
wanted to.
12:38
I didn't say that. And he would do
12:40
that. The thing that gets me in trouble is retweets.
12:43
The retweet is really more of
12:45
a killer than the tweets. The tweets I seem to do
12:47
pretty well with.
12:49
This is a way for him
12:50
to own and not own, which is something he really likes
12:52
because he loves avoiding accountability.
12:55
Trump liked avoiding accountability. He
12:58
liked ginning up attention even more. And
13:00
there was a surefire way to do that.
13:02
One of the things that you do is you say something
13:05
that's really transgressive, really sexist
13:07
or sexual or racist
13:09
or just crazy.
13:11
Ben Smith again.
13:12
And all the sort of well-thinking
13:15
establishment media and establishment figures
13:17
wave their fingers at you. And that
13:19
signals to people who feel really alienated
13:21
from that establishment that you actually are an outsider.
13:23
Like if Wolf Blitzer thinks you are
13:26
upsetting, then like you must be doing something
13:28
right. I don't think it was caused by social
13:30
media, but in a moment when these social
13:32
media platforms were optimizing engagement,
13:35
there was this other incentive for these right-wing
13:37
populists to just be as outrageous as they could
13:39
be.
13:40
Trump is the most successful user
13:43
from a politics standpoint of Twitter.
13:46
That's Jason Goldman, an early Twitter executive
13:48
and board member. I think he understood
13:51
intuitively that one of the things that Twitter
13:54
allows you to do is write your own headline, even if
13:56
it's not true, that
13:57
you can just tweet the thing and then that becomes a
13:59
thing the media says.
13:59
Trump says the sky is green. Like, you know, and
14:02
he understood that just because that's how he engages
14:05
in his public life generally.
14:07
He just asserts a reality and allows
14:10
everyone to react to it.
14:11
Goldman had left Twitter years before Trump
14:13
ran for president,
14:15
but he was thinking about the platform a lot because
14:18
during Trump's campaign, Goldman was
14:20
actually serving in the Obama administration as
14:22
the chief digital officer.
14:25
From the time Twitter started, its founders
14:27
had made free expression a bedrock principle.
14:30
Twitter's co-founder, Biz Stone, wrote an early blog
14:32
post calling this out. It was called, The Tweets
14:35
Must Flow. Twitter's
14:37
first leaders believed that the antidote to bad
14:39
speech was more speech. Goldman
14:42
used to think that, too, but now
14:44
he was starting to question Twitter's free speech
14:46
absolutism.
14:48
It felt like it was metastasizing
14:50
into real-world harm in a way that
14:52
was, like, different than what had happened before,
14:54
even pre-Trump.
14:56
Like, Gamergate, Trump,
14:58
Pizzagate, all those things happened
15:00
while I was at the White House. It was this
15:02
notion that now there's just going to be
15:05
real-world, off-the-keyboard violence
15:08
and threats of violence and intimidation and harassment,
15:11
mobilization of hate-based campaigns,
15:15
and none of, like, my assumptions of,
15:17
okay, like, let people work it out online,
15:20
were really going to hold anymore. Goldman's
15:22
boss agreed. Obama himself
15:25
was very aware of these
15:28
developments. He always had, like,
15:30
an appreciative but skeptical
15:33
view of social media
15:35
in particular. Like, I see how this has been
15:37
good, but also it's kind
15:39
of bad, right? Like, there's
15:42
a downside to it. I can see how these tools
15:44
are going to be used to organize
15:46
in ways that are not positive,
15:49
that are not about hope and change, but are about
15:51
violence and hate and intimidation and threats.
15:55
Sometime after the election, but before
15:57
Trump's inauguration, Goldman went
15:59
to a meeting in the Oval Office with
16:01
Obama. And he
16:03
asked me to stay behind and talk to him. Does
16:06
that happen a lot? It happens occasionally. Yeah.
16:09
I think the thing that was unusual about this particular time was he
16:11
was sitting in the chair that the president sits in, which
16:13
is like in front of the fireplace. And I sat in the chair
16:15
where like the Pope sits, like the guest chair,
16:18
which like normally staff doesn't sit in, kind of a
16:20
faux pas on my part.
16:21
And he says, well,
16:24
you know, not thrilled with how
16:26
this election turned out. Like something very kind
16:28
of like low key. And I was like, yeah,
16:31
I'm also not super thrilled
16:33
about it. Like doesn't seem to be great.
16:35
And he's like, and you know, a lot of
16:37
reasons why this happened, but
16:39
in part, it's kind of your
16:41
fault. Coming
16:44
up, Twitter grapples with a very
16:46
tough question. What happens
16:48
when its most powerful super user, who
16:51
also happens to be the most powerful person
16:53
in the world, creates havoc on
16:56
the platform? And
16:58
mind you, come on. Support
17:05
for The Land of the Giants comes from Mint Mobile. Traditional
17:10
phone companies seem to have lost the plot when it comes to contracts these
17:12
days. The
17:15
phone plan appears simple, but then it turns out to
17:17
be a minefield of hidden fees. But what can you do? Everybody
17:20
needs a phone, right? So we just have
17:22
to navigate the financial hazards the phone companies throw at
17:25
us? We do not. Mint Mobile ditched retail locations,
17:27
which lets them pass those savings on to you.
17:31
Browse, buy, and set up a phone plan totally
17:33
online. It's an easy shopping experience
17:35
that could save you time and money. I've done it myself
17:38
with my own money. It works. Right
17:40
now, Mint Mobile's plans start at $15 a month, and
17:43
every plan comes with unlimited talk and text,
17:46
plus high-speed 5G. To
17:48
get your new unlimited wireless plan for
17:50
just 15 bucks a month, and to get that
17:52
plan shipped to your door for free, go
17:55
to mintmobile.com slash
17:57
Giants. That's Mint Mobile.
17:59
Cut your
18:03
wireless bill to 15 bucks a month at mintmobile.com
18:06
slash Giants.
18:16
Trump was very good at Twitter, but
18:19
that wasn't the only reason he won the presidency.
18:22
Shortly after the 2016 election, a lot
18:24
of people were convinced Trump had gotten serious help
18:27
from Russia. What we're talking
18:30
about is the beginning of cyber
18:32
warfare. You have a huge
18:34
problem on your hands because
18:37
you bear this responsibility. You've
18:39
created these platforms, and
18:42
now they are being misused.
18:44
And you have to be the ones to
18:47
do something about
18:48
it, or we
18:50
will.
18:52
That's Senator Dianne Feinstein in November 2017,
18:55
during a hearing with lawyers from Twitter, Facebook,
18:58
and Google. At the time, there
19:00
had been a flurry of reports about Russia's attempt
19:02
to interfere with the election using social media,
19:05
and the big platforms were starting to provide
19:07
evidence of that campaign. Twitter
19:09
said that in the three months before the election, 36,000
19:12
Russian bots had posted 1.4 million
19:16
election-related tweets. Now
19:19
it's worth saying that since 2017,
19:21
some research and reporting has argued
19:24
pretty convincingly against the idea that
19:26
Russian trickery on social media swayed the
19:28
election. But back then, much
19:30
of Washington and the public felt panicked, and
19:33
Twitter's leadership was rattled too.
19:35
When Twitter was criticized for
19:37
failing to address Russian interference
19:40
in American elections, we
19:42
felt that really deeply.
19:44
Yoel Roth managed product trust
19:46
for Twitter at the time.
19:47
I was angry about it. Executives
19:50
were angry about it. I remember having a
19:52
meeting with Jack Dorsey to talk about what
19:54
we should do, and he was upset
19:56
that somebody would violate Twitter in that
19:59
way. That was the abiding
20:01
feeling that it wasn't just about bad PR. It
20:05
was about people rightfully
20:08
being upset with the violation
20:10
of this space. And that as a company,
20:12
we had a responsibility to do something about
20:14
it.
20:15
In 2015, in the aftermath of Gamergate,
20:18
Twitter had started rethinking how to handle
20:20
abuse on its platform.
20:22
It was a recognition that you shouldn't
20:24
put the burden on the victims of
20:26
harmful activity to need to see it, be
20:28
traumatized by it, and then report it. But
20:31
actually, that companies could do some
20:33
of that work themselves.
20:35
So Twitter fundamentally changed how it
20:38
managed the site. It was no longer
20:40
just going to respond to users complaining about bad
20:42
behavior. It would actively go looking
20:44
for bad behavior and root it out.
20:47
The aftermath of the election accelerated the shift.
20:50
The openness of the platform that had felt so
20:52
core to its founders now felt
20:54
more dangerous. And by 2017,
20:58
Jack Dorsey was saying that enforcing the company's
21:00
content policies would be Twitter's top
21:02
priority. The company expanded
21:05
its definition for speech it didn't want on the
21:07
service. It rolled out new policies
21:09
that would suspend users who glorified violence
21:12
against individuals or groups. It
21:14
put sensitive content filters over
21:16
images that denigrated people based on race,
21:19
religion, or gender. It also hired
21:21
a lot more people to work on safety. But
21:24
the company still had the tweets must flow
21:26
in its DNA. So the task
21:28
of the Trust and Safety team became balancing
21:30
two ideas.
21:32
More speech is better for Twitter,
21:34
but some speech is harmful for Twitter users
21:37
and for democracy. Anika
21:39
Collier Navaroli joined Twitter's
21:41
Safety Policy team in 2019. She
21:44
was a lawyer and spent years researching
21:46
online speech.
21:47
We were told that we were supposed to balance
21:50
free expression and safety. And
21:53
so many of us were thinking, how does
21:55
societal power play into the fact
21:57
here? While we're making this sort
21:59
of balance, we're inevitably saying,
22:01
like, we are going to protect
22:04
the free expression of X group
22:07
over the safety of this other group, right?
22:09
That's an inherent decision that's being made. And also,
22:11
in the reverse, you know, when we are limiting free expression,
22:14
right, like, whose safety are we
22:16
uplifting?
22:18
In practice, regulating speech at Twitter looked
22:20
like this: automated systems
22:22
dealt with the most obvious stuff. When
22:24
the automated system wasn't sure, questionable
22:27
tweets went to a human content moderator.
22:29
And in the really hard cases, Navaroli's
22:32
team would get involved.
22:34
We tended to evaluate on a good
22:36
day maybe five tweets. We were the last
22:39
stop on the content moderation train. We
22:41
wrote the rules, the Twitter rules, things that
22:43
you see externally that say what you can and you can't do. My
22:45
team was responsible for updating those for a couple
22:47
of different areas. So if
22:49
there was a gray area case, or
22:52
if it was coming from a high profile
22:54
user, what we called a VIT, a very
22:56
important tweeter,
22:57
and it fell within those policy areas,
23:00
it landed on my team's desk. And
23:02
these are human beings? Human beings. This
23:05
is manual work: discussing, Slacking,
23:07
writing, what should we do about this? This
23:09
is not a computer solving a problem. There are no
23:11
computers involved. There's a
23:12
lot of people in Slack and in Google
23:15
Docs doing a lot of writing.
23:17
A lot of what Navaroli did was hold her
23:19
nose and let the bad tweets stay up.
23:22
One of my most common refrains
23:24
that I used every single day when, you know,
23:26
assessing content was saying, literally,
23:29
quote, like, I don't love it. And
23:31
it was just my way of saying, like, I wouldn't
23:33
say this, I probably wouldn't hang out with somebody
23:35
who said this.
23:36
But is it against the rules?
23:39
No. And
23:40
so in a way, it really was a
23:43
couple of people sitting in a room
23:46
trying to, like, do their best to say, like, okay,
23:48
well, what should we do with this? The
23:51
final decisions on the hardest cases
23:53
of free speech on the internet's self-declared
23:55
global town square were
23:58
left up to a handful of employees.
23:59
What you're
24:01
doing every single day
24:03
is
24:03
driving the news cycle. What
24:06
you
24:06
spent the day doing is what everybody's
24:08
talking about
24:08
the next morning on Twitter. What you're
24:10
doing that day is what's driving the
24:12
conversation. So I thought to myself, like, holy
24:14
shit, no one should have this
24:16
job. Like, this job should not exist. There is
24:19
so much power
24:21
that is in my hands that is happening
24:24
behind closed doors that has no checks
24:26
and balances.
24:27
Navaroli and her team dealt with very
24:29
important tweeters like J.K. Rowling
24:31
and Kanye West.
24:33
But there was one V.I.T. who
24:35
got treated differently. My
24:37
team had access to every single
24:40
account on Twitter except for Donald
24:42
Trump's account. Jack Dorsey
24:45
made the final decision on Donald
24:47
Trump's tweets. Anything
24:50
that involved him, it had to go all the way up
24:52
to the top. You'll
24:56
remember some of President Trump's worst tweets.
24:59
There was the time he suggested his impeachment would
25:01
lead to a civil war. There
25:03
were racist tweets like the one telling
25:05
Democratic representatives, and just to be clear,
25:08
American citizens, Ilhan Omar,
25:10
Rashida Tlaib, and Alexandria Ocasio-Cortez
25:13
to quote, go back and help fix
25:15
the totally broken and crime infested places
25:17
from which they came. Then there was
25:19
the tweet insulting little rocket man Kim
25:22
Jong-un about the size of his nuclear
25:24
button. Would Trump's tweets
25:26
set off World War III? Here's
25:29
Yoel Roth again.
25:30
Twitter was paralyzed by what to do
25:32
about Donald Trump. Almost
25:35
since the beginning of his candidacy, he
25:37
had been saying things and posting
25:39
things on social media that seemed
25:41
like they violated our rules. But
25:44
there was ambivalence, even from the earliest
25:46
days, about the idea of
25:48
Twitter moderating content
25:50
coming from a candidate
25:53
for president or a sitting president
25:55
of the United States.
25:57
Before Trump, Twitter viewed politicians
25:59
differently from ordinary users. The public
26:01
had a right to know what their leaders were saying, even
26:04
if it violated company policy. And
26:06
during Trump's ascent and his presidency, that
26:08
was the answer they kept coming back to. The
26:10
tweets were too important to
26:12
take down. But Navaroli
26:14
thinks Twitter had another reason for allowing Trump
26:17
extra leeway.
26:18
I very much believed one of the reasons why
26:20
Twitter executives were
26:22
so willing to sort of bend and break their own
26:24
rules and do the things that they did is because
26:27
they very much relished in the power
26:29
of having Donald Trump use Twitter
26:32
as his megaphone. The
26:34
thing that made Twitter relevant
26:36
and made it the hottest thing
26:38
in town again was its use by
26:41
Donald Trump starting in the 2016 election.
26:44
So even as Twitter stepped up its moderation
26:46
efforts across the platform, it left
26:49
Trump alone.
26:50
And in the meantime, it made a subtle but
26:52
important shift.
26:54
For many years, the company's position on
26:56
misinformation was that we're not the arbiters of
26:58
truth, that people could have conversations
27:01
on Twitter. Some of them would be true, some of them
27:03
would be false. And eventually,
27:05
through those conversations, the truth wins
27:07
out.
27:08
That's Yoel Roth again, explaining why Twitter used
27:10
to be fine with people tweeting things that weren't
27:13
true. But Twitter began to change
27:15
its position in 2019.
27:18
We want to give this president
27:20
the opportunity to
27:22
do something historic
27:25
for our country.
27:27
In May of that year, an altered video
27:29
that made Nancy Pelosi appear drunk
27:32
spread on Twitter and other social platforms.
27:35
The pressure that Twitter faced in the
27:37
aftermath of that incident led
27:39
the company's executives to ask me to
27:41
think about what a policy approach could
27:43
be like to address misinformation.
27:46
Twitter was no longer just concerned about
27:48
people abusing other Twitter users or
27:51
interference from state actors. Now
27:53
it wanted to step in when people were making things
27:55
up. One obvious solution
27:57
would be to take those tweets down. But
28:00
Roth didn't think that would work.
28:02
Because simply removing misinformation
28:04
doesn't cause falsehoods to
28:06
go away, it just causes
28:08
them to move around. We would be playing an endless
28:11
game of whack-a-mole against permutations
28:13
of the same lie. And so we thought,
28:16
look, instead of us just censoring this stuff,
28:18
what if we elevated credible content
28:20
so that people could make up their own minds?
28:23
So Twitter created a policy about
28:26
deepfakes and other altered media that gave it
28:28
multiple options to deal with intentionally misleading
28:30
stuff. In the worst cases, it might
28:33
actually take the tweets down. It could
28:35
also tweak the algorithms so the tweets were less
28:37
likely to show up on your timeline. But
28:39
its preferred solution was labels.
28:43
If the tweet contained something that was wrong but didn't
28:46
technically break Twitter's policies, the
28:48
site could use a label to add context and
28:50
links to verified sources of information.
28:53
At the time, the deepfake policy seemed
28:56
like an incremental step made in response
28:58
to new threats. When you step back,
29:00
though, it's quite a journey. At
29:02
the beginning of its life, Twitter believed
29:05
its users' tweets were almost always sacrosanct.
29:08
Now it was going to put its thumb on the scale by
29:10
telling everyone that this tweet was wrong.
29:13
And here's something else you should look at instead.
29:17
And in January 2020, we
29:19
were just ready to start
29:22
testing out this feature,
29:24
and then the pandemic happened. And so
29:26
all at once, we're dealing with
29:29
deepfakes and manipulated media. We're
29:31
dealing with COVID-19 misinformation,
29:34
with an untested alpha
29:36
version of a product that we didn't
29:38
really have the ability to roll out at scale.
29:42
But we had a responsibility to do something anyway.
29:45
Twitter ended up using its deepfake rules as a model
29:48
for its new COVID policy, which it rolled out in
29:50
May 2020. And it used
29:52
the same kind of tiered approach. A
29:54
tweet about masks not working might stay
29:57
up with a label. But if you
29:59
told people to drink bleach, that might not.
30:02
Twitter based its decisions on whatever governments
30:04
and health agencies like the Centers for Disease
30:06
Control and the World Health Organization recommended.
30:10
That seemed sensible in theory, but
30:12
in practice those recommendations changed
30:14
constantly. Add in the fact
30:16
that one of the major sources for bad information
30:19
about COVID was the President of the
30:21
United States. So what
30:23
started out as a science problem quickly became
30:25
a political debate,
30:27
which made it a whole lot harder for Twitter to
30:29
make calls about
30:30
what was an opinion and what was harmful advice.
30:33
And then
30:35
the 2020 election started heating up.
30:38
As states were starting to go into lockdowns
30:40
early in the pandemic, California
30:42
Governor Gavin Newsom announced that he would
30:44
be mailing ballots to every eligible
30:47
voter in California, a measure that
30:49
a number of other states subsequently adopted.
30:52
And Donald Trump took to Twitter to claim without
30:54
evidence that this would lead to
30:56
rampant voter fraud. And
30:59
that this was going to be the first step towards
31:01
stealing the election from him in 2020.
31:04
This was a big turning point for Twitter.
31:07
After years of leaving Trump alone,
31:10
it stepped in. So Twitter applied
31:12
a label to one of his tweets, and the label
31:15
said, get the facts about mail-in ballots.
31:18
And that was the very first time that
31:20
Twitter took a visible moderation action
31:22
against him.
31:23
And it wasn't the last. Trump
31:26
continued to tweet lies about election fraud and
31:29
COVID, and Twitter kept labeling his tweets
31:31
with corrections. The
31:34
company started to take an even more aggressive stance
31:36
against some ordinary users too,
31:38
which meant that throughout 2020, Twitter
31:41
found itself taking action against a lot
31:43
of conservative and right-wing users. And
31:46
as you would expect, that did not make conservatives
31:48
very happy. Twitter routinely
31:50
found itself in the crosshairs of folks like Tucker
31:53
Carlson.
31:54
By offensive, they mean that the left doesn't
31:56
like it. And that is the new standard,
31:59
and there's only one response under that
32:01
standard: silence the person who
32:03
disagrees with you.
32:05
That's why censorship is now everywhere.
32:08
That's why the tech companies started censoring the president.
32:10
That's
32:10
why they're getting more and more aggressive in silencing
32:13
you.
32:14
How much were you guys thinking internally
32:17
about what the political reaction would
32:19
be if and then eventually when
32:22
you moderated the president's tweets?
32:25
It was absolutely a factor.
32:26
Since 2017, Twitter executives
32:30
have been somewhat regularly hauled
32:32
in front of Congress, sometimes to get yelled
32:35
at about Russian interference, but then also
32:38
oftentimes in the same hearing to get yelled at
32:40
about being biased against conservatives.
32:42
This narrative, what I would
32:44
argue was a long-running campaign to
32:46
work the refs and get Twitter to moderate
32:49
less, was already happening long
32:51
before the 2020 election. And
32:53
so
32:54
when we had to make decisions about whether
32:56
to moderate Donald Trump or anybody
32:59
else, it was with a recognition that
33:01
there could well be a retaliation.
33:03
The
33:04
first time Twitter labeled one of Trump's
33:06
tweets, Yoel Roth found out
33:08
exactly how personal that retaliation
33:11
could be. I
33:12
was, let's
33:13
say, a mid-level employee and
33:15
then all of a sudden I was portrayed as the chief
33:18
architect of censorship at Twitter. My
33:20
photo was on the cover of The New York Post. Kellyanne
33:23
Conway was talking about me on Fox News.
33:26
He's
33:26
the head of integrity and his
33:28
name is Yoel Roth. He's at
33:31
Y-O-Y-O-E-L. Somebody in San Francisco
33:33
will wake him up and tell him he's about to get
33:35
more followers.
33:36
Conway was one of Trump's top aides
33:38
and she was a fixture on Fox News.
33:40
When she called out Roth, it triggered an outpouring
33:43
of personal attacks and threats. And
33:45
the company realized in that
33:47
moment that if we took
33:49
content moderation actions targeting Donald
33:52
Trump or others, that
33:54
that type of retaliation was part
33:56
of what could happen as a result.
33:59
Navaroli, from the safety policy team,
34:01
says the threat of retaliation and the fear of
34:03
appearing biased made Twitter pull some of
34:06
its punches.
34:06
The most abuse that was happening in the platform was
34:09
coming from Trump and his supporters. Twitter had
34:11
the ability to sort of tamp that down, but
34:13
decided not to, because the question in the room
34:15
was, well, how are we gonna tell the difference between
34:17
people who are doing abuse and Republicans? And
34:20
there was no answer, because it was one
34:23
and the same.
34:24
Here's Del Harvey, who ran the entire trust
34:27
and safety operation.
34:28
I would say that senior leadership was
34:31
more concerned about
34:34
the narrative that
34:37
Twitter was attacking conservatives.
34:40
And in fact, at times,
34:42
like asked, are we doing
34:44
that? And we're like, are you kidding
34:46
me?
34:47
No, we aren't. If
34:50
you're seeing a discrepancy, it's because they're breaking
34:52
the rules more.
34:54
Navaroli really started to worry about the line
34:56
between political speech and real world violence
34:58
during one of the presidential debates that September.
35:00
The moderator, Chris Wallace, asked
35:03
Trump to tell white nationalists and militia
35:05
groups like the Proud Boys to stand down.
35:08
Instead, Trump said, Proud
35:10
Boys, stand back
35:12
and stand by.
35:14
And so there was this sort of correlation between what
35:17
he was saying offline and what he was gonna say on Twitter. And
35:19
so when he said stand back and stand by,
35:22
we ended up having a conversation internally and
35:24
folks are saying, that's too far,
35:27
right? We've drawn a lot of lines,
35:29
and that's one that we're saying he cannot
35:32
say. And what had happened
35:34
along 2020 is that, you know, there was always this
35:36
sort of underlying conversation about
35:38
a violent overthrow of the government.
35:41
And that had started early in 2020 with COVID-19, the
35:44
sort of mask mandates and the sort of deep state
35:46
conversation that was happening. And it was very, very
35:49
fringe, but by the time
35:50
of the presidential debates, it was becoming
35:54
more mainstream. And so what we saw
35:56
people then beginning to say was very
35:58
loudly, you know, I am locked
36:00
and loaded. I am standing back and standing
36:03
by. I am ready for a second
36:05
civil war or another American revolution.
36:09
Things were feeling very high stakes.
36:12
Twitter, still scarred by Russian
36:15
interference in the 2016 election, was
36:17
more worried than ever about its platform
36:19
being abused. But it was also
36:21
worried about being charged with overreaching, which
36:24
is exactly what happened about a month
36:26
before the election. That's when
36:28
the New York Post ran a story about
36:30
Hunter Biden's laptop. The
36:33
Post story included emails that, it said, connected
36:35
Joe Biden to a Ukrainian energy company.
36:38
It also described videos of Hunter Biden
36:40
having sex and doing drugs. The
36:43
Post story set off all kinds of alarms
36:45
for people, including people at Twitter who
36:48
worried about a repeat of the 2016 campaign,
36:51
where Russia seeded hacked emails to
36:53
help Trump. Because the Post
36:55
story really looked dubious. The
36:57
laptop had supposedly come from a Delaware
37:00
computer repair guy via Rudy
37:02
Giuliani, who was Trump's lawyer at the time.
37:05
Former Trump advisor Steve Bannon was the
37:07
only other source named in the story. Even
37:10
the Post reporter who wrote the story reportedly
37:13
refused to use his byline because he was
37:15
concerned about the story's credibility. Facebook
37:18
initially slowed distribution of the story to
37:21
give time for fact checkers to confirm
37:23
it. Twitter went further. It
37:25
prohibited users from sharing links to the
37:27
story. And in some cases punished
37:29
users who had tweeted links to it. It
37:32
initially said the story violated its hacked
37:34
materials policy. Two
37:37
days later, Jack Dorsey reversed the
37:39
Biden laptop decision. In
37:42
a 2022 interview, Yoel Roth
37:44
told Kara Swisher that the company's overreaction
37:47
was understandable, but
37:49
still an overreaction.
37:51
Look, when you're weeks out from
37:53
an election, when you've seen what happened in 2016 and
37:56
you've seen the immense political and
37:58
regulatory pressure to focus
37:59
on election integrity to address
38:02
malign interference. And when you feel
38:04
a responsibility to protect the integrity
38:06
of the conversations on a platform from
38:09
foreign governments expending their resources
38:11
to interfere in an election, there were
38:13
lots of reasons why the entire industry
38:15
was on alert and was nervous. But a mistake?
38:18
For me, even with all of those factors,
38:21
it didn't get there for me. So it was a mistake?
38:23
In my opinion, yes.
38:25
The origin story of Hunter Biden's laptop
38:27
certainly seemed suspect at the time. But
38:30
we've never seen evidence that it was a
38:32
repeat of the Russian interference from 2016. Meanwhile,
38:36
the contents of Biden's laptop have
38:38
remained a story for years. Twitter's
38:41
decision to block the story caused immediate
38:43
blowback. Senator Ted Cruz
38:46
hammered Dorsey during a video conference hearing
38:48
just before the election.
38:50
So, Mr. Dorsey, your position is that you
38:52
have the power to force a media
38:54
outlet. Let's be clear. The New York Post
38:56
isn't just some random guy tweeting.
38:59
The New York Post has the fourth highest circulation
39:02
of any newspaper in America. The New York Post is 200
39:04
years old. The New York Post was
39:06
founded by Alexander Hamilton. And
39:08
your position is that
39:10
you can sit in Silicon Valley
39:12
and demand of the media,
39:15
that you can tell them what stories they can
39:17
publish, or you could tell the American people what
39:19
reporting they can hear. Is that right?
39:21
No. You know, every person,
39:24
every account,
39:25
every organization that signs up
39:27
to Twitter
39:28
agrees to a terms of service. A terms
39:30
of service is- So media outlets must genuflect
39:33
and obey your dictates if they
39:35
wish to be able to communicate
39:37
with readers. Is that right? No,
39:39
not at all. We recognized an error in
39:41
this policy and specifically the enforcement. The
39:44
way Twitter fumbled the post story was a godsend
39:47
for conservatives who claimed big tech was biased
39:49
against them.
39:50
Here was proof.
39:52
But it was also a problem for anyone worried that
39:54
tech companies had so much power they'd
39:56
become de facto governments.
39:59
Cutting off access to
39:59
information is one of the most powerful tools a
40:02
government has. And here it was Jack
40:04
Dorsey and his team censoring a story.
40:06
After
40:09
the election was over, Twitter drew down
40:11
the number of moderators working on election misinformation.
40:14
Twitter's leaders figured the danger had passed. They
40:17
were wrong.
40:18
They're trying to steal an election. They're
40:20
trying to rig an election.
40:22
And we can't let that happen. In
40:24
the days after the election, protesters
40:27
descended on election offices. Through
40:29
November and December, lawyers for
40:31
the Trump campaign filed dozens of suits
40:33
all over the country alleging fraud. Trump
40:37
true believers coalesced around hashtags
40:39
like stop the steal or locked
40:41
and loaded. But inside Twitter's
40:43
trust and safety team, there was disagreement
40:45
about how seriously they should take those
40:47
tweets. Navaroli wanted Twitter
40:50
to take down tweets with the hashtag locked
40:52
and loaded because she thought they were calling
40:54
for violence. Del Harvey saw
40:56
it differently. There were
40:58
definitely calls for us to, you know, just suspend
41:00
every account that's tweeting locked and loaded.
41:03
And we took samples and looked at them.
41:06
And, you know, accounts would be like, locked and loaded
41:08
with my glass of bourbon to watch tonight's
41:11
episode of CSI. So
41:14
what if we instead take accounts
41:16
where they're in a network
41:18
where it's more likely to be higher risk,
41:20
maybe because they're tied to more suspended accounts
41:23
or they have previous violations. Any
41:25
of those prioritize getting
41:27
those reviewed by humans and actioned
41:30
and then lowering visibility for
41:32
other tweets to try to make sure that we
41:34
aren't fanning the flames
41:35
further.
41:38
On December 19th,
41:40
Trump sent a tweet that raised the level
41:42
of alarm inside Twitter. Quote,
41:44
big protest in DC on January
41:46
6th, be there, we'll be
41:48
wild.
41:50
People are ready to overthrow the government
41:52
in some sort of violent way. And Donald Trump
41:54
tweets like, how about January
41:57
6th at the United States Capitol
42:00
Washington, D.C.
42:01
Not everyone thought Trump was using that tweet
42:03
to organize a riot. New
42:06
York Times reporter Maggie Haberman thought Trump
42:08
was just organizing another rally.
42:10
I read it as him calling his
42:12
supporters to come to Washington the day of the
42:14
certification of the Electoral
42:16
College. Have you changed your view
42:19
of what that tweet meant and what he
42:21
thought he was doing there in retrospect? No,
42:23
I never. That's what I
42:24
thought then. That's what I think
42:25
now. I
42:26
think he was trying to summon his supporters
42:28
as a show of force.
42:30
To Navaroli, though, Trump's motivation
42:32
was less important than the way his followers interpreted
42:35
the tweet.
42:36
What I saw happening was the same individuals
42:39
who had been so ready and willing to commit violence
42:41
were then saying, perfect,
42:44
right? Now we have a time, a day,
42:46
and a place in which we are going to do
42:49
this thing.
42:52
So I don't sleep the night
42:55
of January 5th. January 6th, I'm
42:57
up and I'm pacing the floor
43:00
of my apartment. I've been saying this for months. I've
43:02
been telling anybody who would listen something bad is
43:04
going to happen today.
43:10
I was actually watching it live on
43:12
Twitter, right? And so I
43:14
was watching people
43:16
just live tweet the event. And it was very clearly
43:19
playing out in real time with
43:20
the text and audio,
43:23
right? And so I knew, like,
43:24
the Capitol had been breached. People
43:27
were saying, like, you know, here's
43:29
a
43:32
picture of where the Capitol has been breached. We
43:34
should go through here. The riot triggered
43:36
panic inside Twitter's trust and safety
43:38
team. Navaroli says she got
43:40
new marching orders. The day of January
43:43
6th, I am told two things
43:46
that I am supposed to do. One
43:48
is to find a reason and
43:51
a way to permanently suspend Donald Trump. And
43:54
two, to make the insurrection
43:56
stop.
43:56
Navaroli says her bosses gave her the leeway
43:59
she'd been asking for for months to
44:01
take direct action to ban accounts
44:03
and take down tweets in real time.
44:05
So this is the time where people are
44:08
literally calling for the vice president
44:10
to be assassinated. And so, you
44:13
know, I'm
44:13
jumping on Twitter and taking down these tweets live.
44:15
Just manually? Nothing technologically
44:18
advanced. Twitter dot com, search,
44:21
hashtag execute Mike Pence. I
44:23
felt like I was like a Capitol security guard,
44:25
but I was like digitally watching over the
44:28
building. And
44:33
I remember at the end of the day, there was
44:35
this video that went around of these
44:38
folks who worked at the Capitol sweeping
44:40
up the glass, right? And sort of all of the
44:42
damage that had happened. And I just
44:45
resonated so deeply with that
44:47
because it felt like that was my job.
44:50
After Trump tweeted his support for the rioters,
44:53
Twitter suspended his account for 12 hours.
44:56
Then two days after
44:58
the riots, he tweeted, quote, to
45:00
all those who have asked, I will not
45:03
be going to the inauguration on January
45:05
20th.
45:06
And what we actually saw happening was people
45:08
beginning to plan for a second insurrection.
45:12
So many folks were saying like, oh, I didn't get
45:14
to
45:14
participate the first time, right? Like I didn't make
45:16
it, you know, on January 6th to the Capitol.
45:19
But the thing that was being planned was not just
45:22
for the United States Capitol, but for capitals
45:24
all over the country.
45:26
Trump spent years tweeting the most
45:28
outrageous stuff. In the end,
45:30
the one that got him in the most trouble was
45:33
about him not going to an event.
45:36
That tweet about not attending
45:38
the inauguration and the
45:40
very clear,
45:42
immediate
45:44
groundswell of response where people
45:46
started talking
45:47
about inauguration day as being a target
45:51
made it really clear that we needed
45:53
to take action at this point. Jack
45:55
Dorsey was on vacation in French
45:57
Polynesia. So Twitter's general
45:59
counsel, Vijaya Gadde, reportedly
46:02
made the final call to permanently suspend
46:04
Trump.
46:06
There was blowback both ways.
46:08
We should have done it sooner. We should have done
46:10
it a different way. We should have never done
46:12
it in the first place. And
46:14
it's still the best decision that we made at
46:16
the time with the information we had.
46:18
I asked Maggie Haberman how Trump reacted
46:21
to getting banned from Twitter.
46:23
Very very very angry.
46:25
He was very angry.
46:26
Again, talking about his recognition of the
46:28
power of that tool, he knew what
46:30
was being taken away from him. He cares
46:33
about being Donald Trump.
46:35
Being president fed the brand, augmented
46:37
the brand, but the brand was him.
46:39
Navaroli ended up giving testimony to
46:41
the House committee investigating January 6th. She
46:45
said that Twitter bore some of the responsibility
46:47
for the insurrection. She told the committee
46:49
that Twitter should have done more to tamp down
46:52
on calls for violence.
46:53
Everybody thinks that their platform
46:56
is so special and so unique and they do
46:58
something so different that they're not going to
47:00
have the same problems that Facebook had, right?
47:02
Like Facebook was the one where everybody meddled in the
47:04
elections and where all the bad stuff happened. And
47:06
I think Twitter thought that, right? And here
47:09
I am saying like, yeah, you can give the 2016
47:11
election to Facebook, but like the 2020 election
47:14
belonged to Twitter.
47:16
My colleague Lauren Good asked Del
47:18
Harvey whether she thought there was anything Twitter
47:21
should have done differently.
47:22
I don't know that there was much else we could have
47:24
done differently. I think that in general,
47:27
in a perfect world, there would have been
47:29
more that we could do, but
47:31
we don't live in a perfect world. And also
47:34
this was a physical attack happening
47:37
in Washington, D.C. Like
47:39
we weren't there. We
47:42
were doing our thing to try to keep the internet
47:44
safe. What
47:45
do you think Twitter learned from
47:48
it as a company? Well,
47:52
I think most of the people who would have had learnings
47:55
from it are gone now.
47:57
So I don't know that
47:58
there is much left.
48:02
Just about all the people who
48:05
worked for Del Harvey left after Elon
48:07
Musk bought the company a year ago, which
48:10
is not an accident. Musk
48:12
had made it clear that when he bought Twitter, one
48:15
of the things he wanted to do was reverse
48:17
what he saw as Twitter's overreach. Musk
48:20
promised to bring back accounts that had been banned
48:22
under Twitter's increasingly strict moderation
48:24
policies. One account
48:27
in particular.
48:28
It was not correct to
48:32
ban Donald Trump. I think that was a mistake.
48:34
I think it was a morally bad decision. To
48:37
be clear. And
48:39
foolish in the extreme.
48:42
In the next and final episode of
48:44
Land of the Giants, the Twitter fantasy, Elon
48:47
Musk buys Twitter and then
48:50
regrets it.
48:51
What about the rest of us? Audio
48:59
clips from CNN, Megyn Kelly presents
49:01
ABC News, Tucker Carlson Tonight, The
49:03
Knight Foundation, Fox News, and The Financial
49:05
Times. Additional footage provided
49:07
by Getty Images.
49:09
Land of the Giants, the Twitter fantasy, is
49:11
a production of Vox and the Vox Media Podcast
49:14
Network. Matt Frasica
49:16
is our lead producer. Oluwakemi
49:18
Aladesuyi is our producer. Megan
49:20
Kunein is our editor. Charlotte Silver
49:23
is our fact checker. Brandon McFarland
49:25
composed the show's theme and engineered this episode.
49:29
Art Chung is our showrunner. Nishat Kurwa
49:31
is our executive producer. I'm Peter
49:33
Kafka. If you like this episode, as
49:35
always, please share it. And you can follow
49:38
the show by clicking the plus
49:39
sign in your podcast app.
49:52
Thanks for
49:54
watching.
49:59
What's worse is dealing with the
50:02
hidden fees on top of that. Thankfully,
50:05
Mint Mobile can provide you an alternative.
50:08
Right now, Mint Mobile's plans start at just $15 a month.
50:12
And every plan comes with unlimited talk and
50:14
text plus high-speed 5G, which
50:16
means you can browse, buy, and set up a phone plan
50:19
totally online. To get your
50:21
new unlimited wireless plan for just $15 a month,
50:24
and to get that plan shipped to your
50:26
door for free, go to MintMobile.com.
50:29
That's MintMobile.com.
50:34
Cut your wireless bill to $15 a
50:36
month at MintMobile.com.