Episode Transcript
0:01
This is On the Media's midweek podcast.
0:04
I'm Brooke Gladstone. This week,
0:06
Bloomberg reported that social media
0:08
posts about Israel and Hamas have
0:11
led to a sticky cesspool of confusion
0:13
and conflict on Elon Musk's
0:15
X. On Saturday, just
0:17
hours after Hamas fighters from Gaza
0:20
surged into Israel, unverified
0:22
photos and videos of missile
0:24
airstrikes, buildings and homes
0:27
being destroyed, and other posts depicting
0:29
military violence in Israel
0:31
and Gaza crowded the platform.
0:34
But some of the horror was actually
0:37
old images passed off as new
0:39
media. We've also spotted many
0:41
fake videos circulating online,
0:43
in particular claiming to show
0:46
children captured by Hamas.
0:48
Especially this appalling video
0:51
that's been circulating and seen two
0:53
million times on this post alone on
0:55
X. Some of this content was posted
0:58
by anonymous accounts that carried
1:00
blue checkmarks, signaling
1:01
that they had purchased verification
1:04
under X's premium subscription
1:07
service. Some military footage
1:09
circulating on X was drawn from
1:11
video games. On Sunday,
1:14
Elon Musk recommended to his
1:16
159 million followers
1:18
two accounts for, quote, "following
1:21
the war in real time." The
1:23
same accounts have made false claims
1:25
or anti-Semitic comments in
1:27
the past. Avi Asher-Shapiro
1:30
covers tech for the Thomson Reuters
1:32
Foundation. Welcome back to the show,
1:34
Avi.
1:35
Thank you for having me.
1:37
So on Tuesday, the European
1:39
Union industry chief Thierry Breton
1:42
told Elon Musk that
1:44
he needed to tackle the spread of disinformation
1:47
on X to comply with
1:49
new EU online content
1:51
rules. Didn't Musk take
1:54
some stuff down afterwards?
1:56
I'm not sure about that. I've been looking at
1:59
X for the last week or so to see
2:01
how easy it is to find something that's just
2:03
completely made up about what's going on in
2:05
Israel and Gaza and see how easy it is
2:07
to find a verified or big account pushing
2:10
something that's just obviously made up.
2:12
That typically takes me a couple seconds. I
2:14
found this terrible video that was tweeted
2:17
over and over again of a woman being burned
2:19
alive that had been passed off as happening
2:21
in the conflict. And it was a seven-year-old video
2:23
from Guatemala. And then, you know, a day
2:25
later, it disappeared.
2:27
That one really got around. In
2:29
my doctor's office, the doctor's assistant
2:32
said that it had kept her up all night crying.
2:34
That image itself? Yeah.
2:37
Wow. Wasn't that
2:39
repurposed everywhere from India to the Middle East?
2:41
Yes. It tends to crop up during crises.
2:45
Opportunistic people put it online to
2:47
try to generate interest. You asked
2:49
the question, didn't Musk take stuff down? I
2:51
mean, I have no idea why that post
2:54
doesn't exist anymore on X,
2:56
right? There's a real lack of transparency
2:58
about what's going on and how they're tackling these
3:00
issues. There's all sorts of rules.
3:03
The company has stopped issuing transparency
3:06
reports, which are the tools that we as reporters
3:08
would use to sort of parse how policies
3:11
were enforced. And they've threatened to sue
3:13
independent researchers who have tried to measure
3:16
the spread of hateful content on
3:18
the website. It is a bit of a black box
3:20
as to what's going on. And we are reduced to sort
3:22
of looking at the feed ourselves and saying,
3:24
wow, there's a lot of crazy stuff on here that doesn't
3:27
look true.
3:28
It's weird, isn't it? Because
3:30
there's so much true stuff that's
3:33
horrible enough.
3:35
Yeah. I mean, that's what I've been saying over and over
3:37
again. I mean, aren't
3:37
there enough bone-chilling images
3:40
and video to go around? One
3:43
of the things you have to understand is that Musk
3:45
has significantly changed the incentives
3:48
for how people can use his platform.
3:50
Before he took over, there wasn't
3:52
a route to make money as a creator
3:55
on Twitter. He created this verification
3:57
scheme where you pay for verification,
4:00
which allows you to get more reach and get injected
4:02
into people's algorithmic feed. And then you
4:04
can get paid out on the other side.
4:07
If you have a viral tweet, you can get a share
4:09
of the revenue. He has created
4:11
the conditions to incentivize some potentially
4:14
very unsavory behavior in a
4:16
moment like this. And the question is, has
4:18
he created the parallel institutions,
4:21
rules, hired the staff to
4:23
guard against the worst externalities
4:26
of that kind of economic system on the platform?
4:29
I don't know. We know that he fired
4:31
half of the staff when he took over. We
4:33
know he's steadily ground
4:36
down the trust and safety and other teams that
4:38
are supposedly tasked with doing this kind
4:40
of work. He just completely
4:43
axed huge content moderation teams,
4:45
people who had language expertise. Before
4:48
Musk took over, Twitter was sort of a hybrid platform. They
4:50
had hundreds of staffers who were doing
4:52
editorial style functions, right? There'd be something
4:55
happening. They would create a moment. They would create
4:57
these sort of carousels where they put
4:59
authoritative sources around a certain
5:01
issue. All of that's gone. It's
5:03
been outsourced to this thing called Community
5:05
Notes. Musk is trying to do what
5:08
they used to pay people to do with volunteers. So
5:11
now they have people who volunteer to
5:13
append labels to tweets. And
5:15
there's stuff that's really good about that. It's
5:17
more democratic. But there was
5:19
a great piece earlier this week by NBC's
5:22
tech team that got inside of the Notes
5:24
program and saw that they were overwhelmed.
5:27
They didn't have enough volunteers. There wasn't professional staff
5:29
doing the work. And meanwhile, they were racking
5:31
up hundreds of thousands of views, making claims
5:34
about churches being destroyed that weren't, about
5:36
military aid being provisioned that wasn't. And
5:39
this was all while this kind of beleaguered
5:41
team of volunteers, who are now on the front
5:43
lines of this, was left to label it at
5:45
the pace that they could manage. To recap,
5:48
Twitter was not a platform where
5:50
you could
5:50
monetize your engagement by
5:52
clicks. Now it is.
5:55
Other platforms have done that, like YouTube. And
5:58
apparently they had a stiff learning curve
6:00
in figuring out which accounts could be monetized
6:03
under what circumstances and so forth.
6:05
Do we have any clear guidelines
6:07
from X about when you
6:10
can and how you can monetize
6:13
or when they will demonetize
6:15
your account? They
6:16
have a page on their
6:18
website which is called Creator Monetization
6:21
Standards
6:22
which does lay out all of the different
6:25
things that you can do to lose your monetization
6:28
privileges, right? They
6:30
have all sorts of things. If you're promoting a
6:32
pyramid scheme, literally they have a section that says
6:35
if you're promoting miracle cures you
6:37
can lose your monetization. For me
6:39
the key question here is what
6:41
kind of architecture has Twitter built around
6:44
its monetization program to
6:46
actually create good incentives for people
6:49
who want to make money on the platform to not
6:51
go viral posting
6:53
demonstrably false information or titillating
6:56
information that's misleading. I just
6:58
don't know. I think they have an obligation
7:01
to tell journalists and tell
7:03
the public to demonstrate that they're taking
7:05
those rules seriously. Over
7:07
the weekend Musk tweeted
7:09
to his 150 million followers
7:12
that they should follow two accounts for
7:14
updated information on the conflict. He
7:17
was giving guidance. One of them was
7:20
to an account called @WarMonitors
7:22
and the other was
7:25
@sentdefender. These are two accounts
7:27
known for spewing lies
7:30
and although he eventually took down the tweet, 11
7:32
million people ended up seeing it. So tell
7:34
me what kind
7:35
of hard truths are you deriving
7:38
from the propping up of so-called
7:40
citizen journalists?
7:42
There's been a couple instances where people have found
7:44
like strange anti-Semitic things that
7:47
one of the accounts has said and then I think they
7:49
had made mistakes in the past where they've sort
7:51
of claimed certain things had happened and they
7:53
were wrong. What Musk has
7:56
done is said that Twitter put
7:58
its thumb on the scale in the past, that the people,
8:01
the executives and the people who worked at the company
8:03
had certain political biases that they were
8:05
not being honest about and that
8:07
he has ushered in a new era of sort of openness
8:10
and democratic horizontalism on the platform.
8:13
But really what he's done is he has just replaced it
8:15
with a new set of preferences that sort of revolve
8:17
around him. Right. And so you'll see that in moments
8:19
like this, where he's like, huh, like, what am I finding interesting
8:22
on the internet? Like, let me recommend it. Like it's a
8:24
very like personalist approach to ruling
8:27
the platform. Right. You'll see it in
8:29
his unilateral decisions about rules,
8:31
but then you also see it in more subtle
8:34
ways around, you know, introducing algorithmic
8:36
feeds. Like in the past, Twitter
8:38
had human beings curating moments
8:41
and authoritative sources. Now they're
8:43
using algorithms. Like, is that really fairer
8:46
or better? Like, no, it's like he's come up with a different
8:48
way of showing you information that has its own
8:50
set of pitfalls. Right. As we were talking about
8:52
earlier, like, I don't really know why
8:54
a certain thing is in my feed. It's a verified
8:57
account that I don't follow, that I've never looked
8:59
at before. Twitter has decided to inject
9:01
it into my feed. That's an editorial decision they've made.
9:04
And there's no accountability. I mean,
9:06
that's the difference between his
9:08
brand of citizen journalism,
9:11
I guess, and what responsible
9:13
news outlets do.
9:15
I want to be clear about one thing that I really think
9:17
is important to underline, which is that, you
9:19
know, Twitter before Musk had a lot
9:22
of problems. And like, as someone who reported
9:24
on the platform and tried to bring accountability
9:26
to the platform, there are certain things they hid
9:29
that were particularly frustrating to me. For
9:31
example, you know, although they released transparency
9:34
reports, they didn't release information about
9:37
when a government would contact them and ask them
9:39
to take down information that was violative
9:41
of their terms of service. And that was a real problem.
9:43
Right. So you have to understand that, like, you
9:46
know, these platforms, it's not like there was this
9:48
perfect thing that like Musk came in and like
9:50
threw a wrecking ball into. That
9:52
being said, you know, he has taken a
9:54
lot of steps that have made it even harder to
9:57
sort of assess the social impact
9:59
of his platform. And I think threatening to
10:01
sue researchers who try
10:03
to collect information about the spread
10:05
of hateful speech is really
10:08
chilling. How are you meant to sort of keep
10:10
tabs on this place that is
10:12
run by the richest man in the world who seems to have
10:14
a trigger finger on defamation lawsuits?
10:17
So how did X respond
10:20
to inquiries about all
10:22
the disinformation proliferating on
10:24
the platform around Israel and
10:26
Hamas? How did this response
10:28
compare to Twitter's initial
10:30
response to the flood of information in the aftermath
10:33
of Russia's invasion of Ukraine
10:35
back in February of '22? I've
10:38
been speaking to people at Twitter who were
10:40
on the front lines when the company
10:43
was trying to respond to the initial
10:45
days of the Ukraine crisis. You
10:47
see a totally different posture.
10:50
They had human rights lawyers on staff.
10:52
They had a Trust and Safety
10:54
Council of NGOs
10:56
and groups around the world with experts. They
10:59
were consulting about how to make these decisions.
11:01
They released some groundbreaking new
11:03
rules around images of prisoners of war where
11:06
they were trying to apply international humanitarian
11:08
law to how the company was dealing with
11:11
images coming out of the conflict. So I'm sure
11:13
they made a lot of mistakes. And I'm sure that
11:15
there are plenty of things you could point out. But
11:18
it was a very different posture. At the
11:20
moment now, there's very little information
11:22
coming out of the company. They've tweeted some long,
11:25
mega-length tweets saying that they're taking
11:27
this seriously and that they have staff
11:30
looking at stuff. But they don't
11:32
have the same level of
11:34
granular detail, like long blog
11:37
posts that the company's kind of policy teams were
11:39
releasing in the early days of the Ukraine war
11:41
where they were trying to communicate
11:43
to the public how they were going to handle
11:46
information coming out of the conflicts.
11:48
Okay, so then how
11:51
do more responsible actual
11:53
news outlets try to staunch
11:55
the flow of false wartime
11:58
videos and images?
12:00
There are ways that you can basically
12:02
use satellite imagery, you can use
12:04
mapping technology, you can use
12:07
metadata, so that you can look at an image
12:09
that's been posted online and say with a reasonable
12:11
degree of certainty where it was taken, using
12:14
context clues, looking at images in the background,
12:16
and say, ah, this is actually from Gaza,
12:19
this is not from Guatemala. You can
12:21
look at the metadata inside
12:23
of images or inside of videos to sort
12:25
of get a sense of who might have originally posted it.
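(A minimal sketch of that kind of metadata check, assuming Python with the Pillow imaging library; the filename and helper name are illustrative, not a tool X or any newsroom is known to run:

    # Inspect an image's EXIF metadata: capture time, camera model,
    # editing software, and GPS coordinates can all help test whether
    # a photo is what a post claims it is.
    # Assumes Pillow is installed (pip install Pillow); "photo.jpg" is
    # a placeholder for a locally saved copy of the image in question.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def dump_exif(path: str) -> None:
        exif = Image.open(path).getexif()
        if not exif:
            # Most social platforms strip EXIF on upload, so absent
            # metadata proves nothing; original files are more useful.
            print("No EXIF data found.")
            return
        # Top-level tags: DateTime, Make, Model, Software, and so on.
        for tag_id, value in exif.items():
            print(f"{TAGS.get(tag_id, tag_id)}: {value}")
        # GPS data lives in its own sub-directory, IFD 0x8825.
        for tag_id, value in exif.get_ifd(0x8825).items():
            print(f"GPS {GPSTAGS.get(tag_id, tag_id)}: {value}")

    dump_exif("photo.jpg")

A capture date years before the event, or coordinates on the wrong continent, is exactly the kind of quick tell described here.)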
12:27
Is that how you figured out that the Burning
12:30
Girl was not from Israel?
12:33
No, Brooke, I figured that out by Googling
12:36
the words Burning Girl
12:38
video. And the first thing that came
12:41
up was a CNN story
12:43
from 2015 that said, here is a viral video
12:47
of this terrible thing that happened in Guatemala.
12:49
It was a two second thing. Whoever
12:52
had posted that, which was a verified account
12:54
that had thousands of followers, had either
12:57
not bothered to do that, or they themselves
12:59
had actually re-skinned this video
13:02
and passed it off. I'm not sure of the origins
13:04
of it, but you don't need to
13:06
be a whiz to debunk some of
13:08
this stuff. It's really about the investigative
13:10
power of it. You've seen these investigations,
13:14
they've done incredible work. I think
13:17
it was the Post that did an incredible
13:19
recreation of the killing of Shireen Abu Akleh,
13:22
the Palestinian reporter in
13:24
the West Bank recently, which showed
13:27
definitively that it was an Israeli sniper
13:29
bullet.
13:30
Even though the Israeli
13:32
army denied it. Right, and
13:34
how did they do it? There's these firms like Forensic
13:36
Architecture, who could go and recreate
13:39
in digital form these streets and
13:41
the ballistics, and then you pair it
13:43
with images taken from the time. And so they're amazing
13:45
investigative tools. The fact that newsrooms are putting
13:47
these kinds of people into the field does
13:50
all of us a service, because I think we ultimately
13:53
will have a clear-eyed
13:55
sense of what's going on in Israel
13:57
and Palestine.
13:59
Now that we've seen
14:02
the utter failure of X during
14:04
this crisis, what
14:07
are we missing out on? Because there
14:09
are other sites, Mastodon, Bluesky,
14:11
they've tried; none obviously have
14:13
risen yet, certainly, to
14:16
the influence that Twitter has had in
14:18
terms of being a springboard for awareness
14:21
and protest and pure
14:24
information.
14:25
Well, one of the trends that people are talking a
14:27
lot about is the TikTokification,
14:30
or the sort of like Discordification
14:32
of social media. That this era of
14:35
sort of like an open platform where
14:37
you could sort of search around and you sort of
14:39
like create your own experience and people
14:41
would sort of post to the world and you
14:43
would go figure out what you wanted to follow,
14:46
that it's ending, right? And then what's replacing
14:48
it is much more either algorithmically
14:50
driven places like TikTok, where you just
14:52
turn it on and strap in, you're like, show me what
14:55
you got. Or these places like
14:57
Discord and Telegram, which are closed communities,
14:59
which are not searchable in the same
15:01
way, where people kind of cordon themselves off
15:04
into different little groups and share there.
15:07
So I think it's possible we are missing this era
15:10
that I think had a lot of positive externalities.
15:13
People kind of got to choose a little bit about what they saw,
15:16
and they could search widely around the
15:18
world and learn about things. There
15:20
was an openness to the design. I
15:23
don't think people are gonna design like that anymore.
15:25
So what do you think are the consequences
15:28
of an even greater amount of misinformation
15:30
than usual on this platform, in
15:33
the context of this conflict?
15:35
What's happening right now is the creation
15:38
in real time of a historical record of
15:41
this terrible, terrible, bloody
15:44
conflict. And flooding
15:46
the zone with BS doesn't help anybody,
15:48
right? There's always been a fog of war, but
15:52
an algorithmically driven fog of war
15:54
that actually injects potentially
15:57
false information in front of our
15:59
eyes as we scroll through the internet is a different
16:01
level of dystopian thinking. Yeah.
16:04
An algorithmically driven fog of war
16:07
does a level of disservice to the public
16:10
discourse that we haven't seen before.
16:13
Avi, thank you very
16:14
much. Thank you, Brooke.
16:15
Avi Asher-Shapiro covers
16:17
tech for the Thomson
16:19
Reuters Foundation. Thanks
16:22
for listening to OTM's Midweek podcast.
16:25
And please check out The Big Show, which
16:27
posts on Friday. It's the final
16:30
part of our three-part
16:30
collaboration with ProPublica
16:33
called We Don't Talk About
16:35
Leonard. And it's about the conservative
16:38
movement's well-funded effort to
16:40
take over America's courts and
16:42
the man at the center of it all. You
16:45
can find all three parts, of course, wherever
16:47
you get your podcasts.