Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
This podcast is supported by KPMG.
0:02
Your task as a visionary leader is simple.
0:05
Harness the power of AI, shape the future
0:07
of business. Oh, and do
0:10
it before anyone else does without leaving people
0:12
behind or running into unforeseen risks. Simple,
0:15
right? KPMG has got you, helping
0:17
you lead a people-powered transformation that
0:19
accelerates AI's value with confidence. How's
0:22
that for a vision? Learn more
0:25
at www.kpmg.us/ai. Casey,
0:28
a little bird told me it's your birthday today. It's
0:30
my birthday, Kevin. Happy birthday. Thank
0:32
you so much. And let me just say,
0:34
58 looks great on you.
0:37
Thank you. I never felt better.
0:41
Really. You're not 58. Well, in internet years,
0:43
I think I probably am, at least 58,
0:45
if not older. So no, I feel
0:47
good. Yeah. I got you a present. What's
0:49
that? You want to see it? Yeah. OK. I
0:54
wrapped it and everything. Now I have to warn you, I'm not
0:56
a good wrapper. So it really... I'd like to hear you at
0:58
least spit a few bars. Hey, I
1:00
hear what you did there. Thank
1:02
you. I actually think this is beautifully wrapped. It's
1:04
a nice sort of, you know, some brown
1:07
and glitter paper. A
1:10
multi-voice changer. Yeah. So this is
1:13
because sometimes we get listeners writing
1:15
in to say that they can't tell our voices
1:18
apart. So this is
1:20
to give you some options for how to
1:22
transform your voice. That's great. Should we listen to
1:24
a few of them now? Yes. Just describe
1:26
what it is. This is a sort of miniature
1:28
bullhorn that is purple and plastic. And
1:31
it is still in its packaging. Yeah. Open it
1:33
up. Let's try it. All right. Let me just
1:35
cut through these zip ties here. And
1:40
it has many different modes. And the way you
1:42
adjust it is by turning those sliders. And then
1:44
you pull the trigger. And then you talk into
1:46
it. And it changes your voice. OK. So let's
1:48
just... Hello? Hello? Hello?
1:52
Pretty good. It's giving robot. Let's
1:55
try to do high pitch. So that's just A,
1:57
it says. Is
2:00
this high pitch? Yes. OK. Do
2:02
you like this one? I like that
2:05
voice. So if I just
2:07
did this forever, then people would
2:09
just confuse us with the piecey.
2:11
Yes. And then they would
2:13
no longer be confused about our voices.
2:15
No time, no longer listen to heartbeats.
2:17
Maybe, I guess. It's
2:20
very good. I like it. Well,
2:23
happy birthday to me. I'm
2:30
Kevin Roose, a tech columnist at The New
2:32
York Times. I'm Casey Newton from Platformer. And
2:35
this is Hard Fork. This week on the
2:37
show: The Surgeon General wants
2:39
to issue a warning about social media.
2:41
Should Congress let him? Then, former Stanford
2:43
researcher Renee DiResta joins to talk to
2:45
us about her new book on modern
2:47
propaganda and whether we're losing the war
2:50
against disinformation. And finally, The Times'
2:52
David Yaffe-Bellany stops by to tell us how
2:54
crypto could reshape the 2024 election. Well,
3:06
Kevin, this week we start with a warning. Yes.
3:08
But not our warning. A Surgeon General's warning.
3:10
Or I guess we should say an attempted
3:12
Surgeon General's warning. Yeah, let's
3:14
talk about it. This was really interesting.
3:17
This was maybe the biggest tech news
3:19
of the week. And it came out
3:21
earlier this week from Surgeon General Vivek
3:23
Murthy. That's right. And it is a
3:25
story that we have been following here
3:27
on Hard Fork. I would say essentially
3:29
since the beginning, right? Because last May,
3:31
the Surgeon General issued an advisory about
3:34
the risks of social media for young
3:36
people. In that advisory, he wrote both
3:38
that there were potential risks of harm
3:40
for excessive use of social media, particularly
3:42
for some groups of young people and
3:44
also that there could be benefits for other groups of people.
3:46
We talked about that here. Then more recently,
3:49
we had Jonathan Haidt on the show in March.
3:51
He wrote this book, The Anxious Generation, which
3:53
went on to become a bestseller, kind of
3:55
covers this similar idea that social media can
3:57
be really risky. We talked to young people.
4:00
They called into the show. We talked to them about how they felt
4:02
about it. And since then, this
4:04
has just been, frankly, one of the biggest
4:06
debates of the year, wouldn't you say? Totally.
4:08
I mean, I have been talking with parents
4:10
for months now. I would
4:12
say that the debate sparked by Jonathan
4:15
Haidt's book has become a true social
4:17
phenomenon. And I've seen this book on
4:19
desks and shelves everywhere. I've heard from
4:21
just many, many people about this.
4:23
And we got so much feedback on the
4:26
episodes that we did, not just with Jonathan
4:28
Haidt, but with the listeners who wrote in
4:30
and who we talked to about this issue.
4:32
So I would say this is one
4:35
of the biggest debates about technology
4:37
right now is the effects of
4:39
social media on teens and adolescent
4:41
mental health. Absolutely. And while there
4:43
are a lot of folks who
4:45
wrote in who are very sympathetic
4:47
to the ideas expressed by both
4:49
the Surgeon General and Jonathan Haidt,
4:51
there's also been some pushback. Candice
4:53
Odgers, who's a professor at UC
4:55
Irvine, wrote in the journal Nature,
4:57
quote, hundreds of researchers, myself included,
4:59
have searched for the kind of
5:01
large effects suggested by Haidt. Our
5:03
efforts have produced a mix of
5:05
no, small and mixed associations. Most
5:07
data are correlative. So in other
5:09
words, efforts to prove once and
5:12
for all, find the smoking gun,
5:14
say, hey, you look at Instagram
5:16
too long, it's going to make
5:18
you depressed. She's saying we
5:20
have not been able to find a very large
5:22
effect for that. And the tech platforms themselves have
5:24
been pushing back on this idea for years, right,
5:26
that they are sort of causing mental health problems
5:29
among young people. But
5:32
I would say this has become like a kind
5:34
of kitchen table debate in America and around the
5:36
world. It has also spawned
5:38
a bunch of legislation and attempts to
5:40
actually try to reform social media through
5:42
new regulations and laws. That's right. So
5:44
more than half of the states in
5:46
the US are moving forward with some
5:48
form of legislation aimed at protecting children
5:51
who use the internet. Laws passed in
5:53
Utah and California have already faced legal
5:55
challenges because, of course, it's very hard
5:57
to regulate social media in a way
5:59
that doesn't infringe on the First Amendment.
6:01
I believe New York just passed a
6:03
bill this month that restricts social media
6:05
companies from using algorithms in kids' social
6:07
media feeds without parental consent. So we'll
6:09
see how that one plays out. My
6:12
guess is that'll be subject to a big legal
6:14
challenge as well. So Kevin, as you say, this
6:16
is maybe the big kitchen table debate about tech
6:18
so far this year. But Kevin, what if I
6:21
were to tell you that all of that was
6:23
just a prelude to what happened this very week?
6:26
I would believe you. So the surgeon general wrote
6:28
an op-ed in your newspaper,
6:30
so congrats on the big scoop,
6:33
where he says that social media
6:35
should add cigarette-style warning labels to
6:37
social media platforms. In the opening
6:40
paragraphs of his op-ed, he said,
6:42
we don't have perfect information,
6:44
but the mental health crisis among young people
6:46
is an emergency. Kevin, what did you make
6:49
of this op-ed? It was really
6:51
interesting, in part because I think, a
6:53
lot of people know that the surgeon
6:56
general puts warning labels on cigarette packages,
6:58
and we have seen those for decades
7:00
now. And there's actually some
7:02
evidence that those warning labels can increase
7:04
awareness of the risks of tobacco, and
7:06
that they can change behavior among the
7:09
people who see them. And
7:11
so what the surgeon general essentially called
7:13
for in this opinion essay is
7:16
applying the same kind of approach to social
7:18
media, where if you're a teenager and you
7:20
log onto a social media platform, maybe there's
7:22
a little banner, maybe there's a little warning
7:24
label, and it says something like the
7:27
use of social media by adolescents
7:29
has been linked to mental health
7:31
harms. And this is something that a
7:33
lot of parents and teachers have
7:35
been calling for, but it's
7:38
one thing to have sort of a citizens movement around
7:40
this stuff. It is another thing to have the
7:42
surgeon general of the United States say that social
7:44
media platforms should carry warning labels. Well, that
7:47
is certainly what he is counting on, right?
7:49
That he can use the authority that came
7:51
from many surgeon generals ago pointing out that
7:53
smoking caused cancer to use that credibility to
7:55
say now essentially, hey, you look at Instagram
7:57
or Snapchat too long, you're gonna have problems.
8:00
But I have to say, Kevin, I was not impressed
8:02
with this statement. All right. Walk me through why you
8:04
were not impressed. Well, what I want
8:06
to take issue with something you just said, which
8:09
is that these warnings have been associated with a
8:11
change in behavior. Well, I think that's true in
8:13
a broad sense. I think it's important to remember
8:15
all the other things that were happening that contributed
8:17
to people smoking less, because just a
8:20
few years after they started putting out those warnings,
8:22
Congress banned advertising for cigarette ads on TV
8:24
and radio. And then we began
8:27
to see the banning of smoking in public
8:29
places. Right. So the warning, yes, was part
8:31
of a package of things that appears to
8:33
have had a very positive effect. But the
8:35
idea that a warning in and of itself
8:37
really did much, I'm actually not convinced at
8:39
all. Yeah, I
8:41
mean, I also think it's a more
8:44
nuanced argument that the surgeon general is
8:46
making. He actually writes, to
8:48
be clear, a warning label would not on
8:50
its own make social media safe for young
8:52
people. Like he is not calling for this
8:55
to be the only thing that the federal
8:57
government does to deal with the issue of
8:59
young people's mental health and social media. He
9:01
also is supporting still this legislation in Congress.
9:04
He wants companies to be required to
9:06
share data about the
9:08
health effects of their apps
9:11
with independent researchers and
9:13
allow independent safety audits. He
9:15
also calls for these sort of phone
9:17
free zones that parents and schools can
9:20
set up. But
9:22
I think the sort of narrow question
9:24
of this warning label, I
9:26
just don't see what it harms. Do you actually
9:28
see people being
9:30
hurt as a result of, if you
9:32
were a teenager and you had to click past
9:35
a little warning label when you spent
9:37
too much time on Instagram, do you actually think that
9:39
would hurt your life? No, but what if I'm a
9:41
14-year-old LGBT kid and I have parents who aren't supportive
9:43
and I say, can I create an Instagram account? And
9:45
my parents say, no, you can't. It's like not safe
9:48
for you. And it's like, okay, well, I'll just go
9:50
be, feel very alone
9:52
for the next couple of years. Like that
9:54
doesn't seem great to me. I just think
9:56
that this warning is going to be used
9:59
as a pretext to keep kids off social
10:01
media who might get some benefit from
10:03
them. And look, it's not that I'm saying
10:05
that there's no risk to some groups of
10:07
teens, but I think everything is just sort
10:10
of like getting flattened into this very just
10:12
like kind of ham fisted warning when we
10:14
need like more targeted solutions like the surgeon
10:16
general was proposing last year. Yeah, well, we
10:19
should also just say like the surgeon general
10:21
cannot unilaterally just start slapping warning labels on
10:23
products and social media platforms. This actually would
10:25
require an act of Congress
10:28
to put a warning label on Instagram
10:30
or TikTok or any of these platforms. I was a
10:32
little surprised by that. Were you? Yeah, I kind of
10:34
was too, because I kind of thought like, what's the
10:37
point of being the surgeon general if you can't like
10:39
snap your fingers and put warning labels on stuff? Congress
10:41
has to be like, okay, you can warn people. What
10:43
did we need the surgeon general for was my question.
10:45
But I think it's, you know, it is a position
10:48
that has a lot of symbolic importance. This is sort
10:50
of the top doctor in
10:52
the nation. And I think
10:54
it matters if the surgeon
10:56
general says, you know, this thing that
10:58
your teens are using may
11:00
be harmful to them. Well, it does matter.
11:02
But I have to say, I was really
11:04
disappointed by this statement because as I'm reading
11:06
through both the op-ed and an accompanying
11:08
interview that he did with reporters
11:11
at the Times, he does not bring
11:13
any new science to the table, right?
11:15
So a year ago, he brings forth
11:17
what I thought was this very measured
11:19
look at teens and social media. And
11:22
then a year later, he's in
11:24
The Times saying that he believes
11:26
that the risk of using social
11:28
media is so great to adolescents
11:31
that the benefits of
11:33
potentially using it do not outweigh
11:35
the potential harms. That's an
11:37
incredibly bold statement to be
11:39
making without having subsequent
11:42
evidence to support it, right? When the
11:44
surgeon general came out and said, smoking causes
11:46
cancer, there was really, really good science
11:48
about that. This I think is
11:50
a much less settled question. And so I
11:52
think to skip all the way to, well,
11:54
now we need to slap a warning on
11:56
every website where like teens can post, I
11:58
thought it was actually and
16:00
President Biden just sort of wash their hands of it and
16:02
say, well, what do you want us to do? We put
16:04
some texts on a website, right? And
16:07
I just feel like the moment that we've
16:09
gotten to this feeling like the next obvious
16:11
thing to do in the teen mental health
16:13
crisis, it just feels absurd to me. Yeah,
16:15
I do think, you know how like in
16:18
Europe, some of the warning labels on cigarettes
16:20
have like images with that, like a photo
16:22
of like a lung after it's
16:24
been like decimated by years of tobacco use.
16:26
It's like a very visual warning label. And
16:29
I think we should do the same thing with social media.
16:31
We should just like put up an
16:34
image of someone whose brain has been totally rotted
16:36
by spending too much time on social media and
16:38
like the kinds of crazy stuff that they're posting
16:40
on their feed. And this is what will happen
16:42
to you if you spend six hours a day
16:44
on Instagram. Yeah, except you know what it would
16:46
be. It would just show that person getting gradually
16:48
hotter over time as they started eating right, they
16:50
started working out, they started paying obsessive attention to
16:52
their body. So that's what the warning would
16:54
be. Well, I don't think
16:57
that's necessarily true, but I do like, so
16:59
look, I think, you know, we could have
17:01
a productive conversation about what, if anything, a
17:03
warning label should say. I also think we
17:06
should talk about where it would appear because
17:08
we know that not all social networks are
17:10
created equal. Adolescents are not having mental health
17:12
problems from spending too much time on LinkedIn,
17:15
right? This is a problem that is very
17:17
specific to a certain subset of social networks.
17:19
I would say Instagram, maybe Snapchat
17:21
should be in there. TikTok maybe should be in there
17:24
too. These are the ones where there really
17:26
is this kind of visual comparison going
17:28
on of, you know, what your body looks like,
17:30
what your face looks like. These are the kinds
17:32
of places that can be really unhealthy for teens
17:34
to spend lots of time. You know, another thing
17:37
I've been thinking about as I've been reading the
17:39
Surgeon General is, could he
17:41
offer a more targeted warning about a more
17:43
obviously real problem? There's this story that I've
17:45
been wanting to talk about on our show
17:47
and we've just not been able to find
17:49
a way to do it because it is
17:51
just so, so upsetting. We try not to
17:54
bum people out too much, but there's this
17:56
issue and there've been a lot of great
17:58
investigations into it over the past. and
20:00
without harming your mental health. All it says is
20:02
this thing might be dangerous to you. I mean,
20:04
this point, I totally agree with you. If
20:07
I had kids and they were going to school and
20:09
I found out that their school was offering a social
20:11
media literacy class and they took it and it was
20:14
the same kind of thing as like driver's ed where
20:16
you got like half credit or whatever, it
20:19
sounds like a non-solution when you're gonna
20:21
say, well, what we really need is
20:23
education literacy. When people say that to me,
20:25
I sort of feel like they're throwing their hands up and it's
20:27
like, okay, well, what's actually gonna solve the problem? But
20:30
this whole story is about media literacy.
20:32
It's about understanding how systems are designed,
20:35
how they are likely to make you
20:37
feel, what strategies you can use if
20:39
you find yourself in a spot of
20:42
harm. What are some likely scams or
20:44
dangers that you might find on these
20:46
systems? Like it would be amazing
20:48
if the platforms actually offered that kind of literacy,
20:51
right? And maybe that is an area where I'm
20:53
like, yeah, Congress actually go ahead, mandate that they
20:55
do something like this for these teens. But if
20:57
they're not gonna do it, school districts could do
20:59
it, parents groups could do it, nonprofit groups could
21:01
do it, but I agree with you. That is
21:03
what I would like to see that I think
21:05
actually starts to make a dent in this problem.
21:07
Yeah, but in the meantime, I don't think this
21:09
idea of a surgeon general's warning is necessarily a
21:11
bad idea in the same way that I think,
21:14
putting warnings on cigarettes didn't immediately
21:17
curb smoking overnight. It wasn't like people
21:19
stopped smoking because they knew all of
21:21
a sudden it was bad for them.
21:23
But it is kind of a little
21:26
visual reminder if you are going to
21:28
the store to pick up a package
21:30
of cigarettes, it just sort of makes
21:32
you pause or at least think for
21:34
one second before you hand over your
21:37
money and get your Marlboros. Like
21:39
it does actually have a psychological effect.
21:41
And I actually don't mind the idea
21:43
that teens before they spend four
21:46
hours on Instagram would get a little
21:48
visual just pop up or something to just say,
21:50
are you sure you wanna do this? This could
21:52
be bad for you. Yeah, I mean,
21:54
when you put it that way, it doesn't sound like
21:57
that big a deal. Again, I'm just like, what are
21:59
the odds that we apply this warning and
22:01
it has any meaningful improvement in the lives of
22:03
these teens. I just truly struggle to see like
22:05
the causal connection. I mean, I think the
22:08
effect that it could have is on actually
22:10
parents. I know so many parents who are
22:13
struggling with what to do with
22:15
their kids when they reach adolescence about social media.
22:17
Do I give them a smartphone? Do I let
22:19
them have an Instagram account? And
22:22
a lot of parents just feel very powerless
22:24
in those situations because all their kids' friends
22:26
are on social media. There's this sense that
22:28
by sheltering them away, you are actually limiting
22:30
their ability to be social with their friends.
22:33
A lot of parents don't feel like they
22:35
have a lot of backup when it comes
22:37
to limiting or controlling the
22:39
ways that their teens use social media. And
22:42
I actually do think that having the Surgeon General
22:44
of the United States put a warning on these
22:46
social media sites that say, this could be bad
22:49
for your teen's mental health. I think that could
22:51
embolden parents to say, look, it's
22:53
not just me saying that this stuff is
22:55
bad for you. The Surgeon General says it's
22:57
bad for you, too. And it could help
22:59
them feel a little more confident in actually
23:02
setting some policies for their own kids. I
23:04
can't believe you disagree with me like this on my birthday, by the way.
23:08
What did I do to you? Jesus.
23:14
When we come back, we'll talk to
23:16
Renee DiResta about her new book on
23:18
disinformation and how to win the war
23:20
against it. This
23:46
podcast is supported by KPMG. Your
23:48
task as a visionary leader is simple. Harness
23:51
the power of AI. Shape the future
23:53
of business. Oh, and do
23:55
it before anyone else does without leaving people
23:57
behind or running into unforeseen risks. Simple,
24:00
right? KPMG's got you. Helping
24:03
you lead a people-powered transformation
24:05
that accelerates AI's value with
24:07
confidence. How's that for a vision? Learn
24:10
more at www.kpmg.us/ai.
24:14
All right, guys, how would you describe
24:17
our podcast? Matter of opinion. Extremely
24:20
civilized exchange of high-minded ideas. I
24:22
swear, if somebody says dinner party
24:24
conversation, I'm slapping them. It's an
24:26
airing of grievances, right? Somewhere
24:29
in between, I hope. Maybe the easiest
24:31
way to explain what Matter of Opinion
24:33
is, is actually to share what our
24:35
listeners have to say about us. Listener
24:38
Tobias said, Matter of Opinion
24:40
is a great podcast for anyone
24:42
engaged with social issues and politics
24:44
on any level. The
24:47
lighthearted, but testy conversations about
24:49
truly divisive topics pique my
24:52
interest. Lighthearted, but testy.
24:54
That's totally you, Ross. I'm putting that on
24:56
my headstone. My back is getting a little
24:58
sore from all this patting. From
25:01
New York Times opinion, I'm Michelle Cottle. I'm
25:04
Ross Douthat. I'm Carlos Lozada. I'm Lydia Polgreen.
25:06
And don't just take our word for it.
25:08
Make up your own mind and follow Matter
25:10
of Opinion wherever you get your podcasts. Well,
25:15
Kevin, I hate to brag, but it is my birthday. Last
25:18
week at Platformer, we broke some news. Yeah,
25:20
what was the news? So the
25:22
Stanford Internet Observatory, which is this
25:25
small, but I think very influential
25:27
group that studied the way groups
25:29
use online tools to spread disinformation
25:31
is basically being dismantled
25:33
and will no longer exist as we know
25:35
it. And why is this a big
25:38
deal? So this group
25:40
was the most savvy and
25:42
well-connected among the tech platforms.
25:44
And they had really good
25:46
relationships with companies like Facebook
25:48
or Twitter when that existed.
25:51
And so as elections would take
25:53
place, the SIO as they
25:55
called it, would be in close
25:57
communication with the platforms to understand.
32:00
the subpoenas and lawsuits and all the BS. And
32:03
I really just wanted to write a book kind of like Manufacturing
32:05
Consent did in the 1980s, right? Where
32:07
he's writing about, here's this incentive structure and here's the
32:09
outputs that come out as a result of it. And
32:11
I thought we haven't really had an update to that
32:14
in 40 years or so, so maybe
32:16
I'll write one. And then of course, couple
32:19
months later the conspiracy theories about us started, then the
32:21
subpoenas came down and then I was like, well, I
32:23
didn't want to write a memoir, but I guess I
32:25
am. But now you are, yeah. I
32:28
want to talk a bit more about
32:31
this idea that something has changed about
32:33
propaganda over the past couple of decades.
32:35
What had you noticed in your work
32:38
that made your ears perk up and
32:40
say, there's something interesting here to dig
32:42
into? I spent a
32:44
lot of time looking at the anti-vaccine movement as a
32:46
new mom in 2013 to 2015 timeframe, right? But
32:50
I didn't actually think of that as propaganda at the
32:52
time, that was not, you know, I thought of it
32:55
as activism, right? We are fighting, we are pro-vaccine people,
32:57
we are fighting those anti-vaccine people. We have a law
32:59
we want to pass in California, we have a campaign
33:01
and we're going to fight it on the internet. And
33:03
the thing that's interesting about it is whenever you have
33:06
a political campaign, there's a, there's like a start date
33:08
and an end date, right? But
33:11
they did not see it as having a start date and
33:13
an end date for them. This was like, this was their
33:15
thing. And they were on it 24/7 and
33:18
they had built an entire community.
33:20
There were thousands of anti-vaccine accounts.
33:22
This is years prior to COVID,
33:24
just to clarify, we're talking about
33:26
measles vaccines here. I thought
33:29
it was interesting that I thought of this as something
33:31
that this is activism, we're turning it on and off,
33:34
but they thought of it as something that was persistent.
33:36
They were going to message forever because they were really
33:38
true believers in this idea that vaccines caused autism. So
33:41
that was kind of my first experience. And then I
33:44
felt like I had sort of seen the future, right? I
33:46
was like, this is how every campaign is going to be
33:48
fought. I can run ads on Facebook. I can granularly target
33:50
down to the zip code level and nobody knows who the
33:52
hell I am. And I don't have to tell them. And
33:54
this is absolutely insane actually. This
33:57
is wild. Integrity
36:00
Partnership was sort of an ad
36:02
hoc group of organizations that were
36:05
looking specifically at attempts to undermine
36:07
the election on social
36:09
media. Absolutely. So
36:11
I think sometimes you see right wing media
36:13
report that it was somehow related to Hunter
36:16
Biden's laptop, that's BS, or that had something
36:18
to do with political speech in
36:20
the context of candidate A saying something about
36:22
candidate B. But you were doing
36:24
things like monitoring, there was this
36:27
thing Sharpie Gate, which is just
36:29
sort of a conspiracy theory about
36:31
people basically manipulating ballots. Yes. So
36:34
the scope was exclusively limited to things related
36:36
to policies and procedures having to do with
36:38
voting. So it was that kind of stuff,
36:40
Sharpie Gate, making an allegation that a
36:43
felt tip pen would invalidate a ballot. And
36:47
that specifically, the sort of other piece
36:49
of it was the delegitimization narratives. So in the
36:51
context of Sharpie Gate again, that those felt tip
36:53
pens were only given to Trump supporters to steal
36:55
the election. So we
36:57
were only looking at narratives related to the election.
36:59
We did not care about Hunter Biden's laptop. But
37:04
what winds up happening is
37:06
that work, which did involve
37:08
occasionally speaking with state and local
37:11
election officials. And you
37:13
see Fox News call Arizona for Biden and
37:15
all of a sudden the Sharpie story goes
37:17
from being just some people concerned
37:20
about Sharpie markers to that's
37:22
how they stole Arizona. And
37:26
so state and local election officials meanwhile throughout the
37:28
entirety of election day are trying to ensure that
37:30
people have confidence in the election and that
37:32
if there is something irregular or weird that is
37:34
happening that they know about it. But
37:37
election officials are not supposed to be sitting on the internet
37:39
all day long. That is not their actual job. It is
37:41
in fact our job. And
37:43
so we were like, well, okay, we can communicate with state
37:45
and local election officials. And what this meant was
37:47
that they would occasionally send tips basically, hey, can you
37:49
look at this tweet? Can you look at
37:51
this post? Oftentimes we looked at it, but
37:53
it was some random guy with two followers
37:55
who was like wrong on the internet. But
37:58
then sometimes there were these things. that got a whole
38:01
lot of attention and a whole lot of pickup. And
38:04
in certain cases, we would also tag tech platforms and just say,
38:06
hey, you should have a look at this. Now,
38:08
this was reframed as government
38:11
officials were using us, giving us things
38:13
that they wanted taken down and that
38:15
we were then telling platforms to take
38:17
them down, that this was like this
38:19
vast operation. And they turned it from
38:22
being state and local election officials to
38:24
being like DHS itself, because
38:26
DHS is responsible kind of at a
38:28
federal level for elections. And
38:30
so you have this conspiracy theory that
38:32
we are somehow being used. And I
38:35
mean, the numbers that these sort of
38:37
like right wing blogs start to write
38:39
are like 22 million tweets, entire narratives
38:41
nuked from the internet with an AI
38:43
censorship super weapon. I'm not kidding. That
38:45
was the actual phrasing, right? And
38:49
this is the sort of thing where in
38:51
a normal polity, this would
38:53
be seen as tinfoil hat BS. But
38:55
in this reality, Jim
38:58
Jordan is like, yes, we need to investigate this.
39:01
We need to investigate the AI censorship super weapons
39:03
run out of Stanford University. So let's
39:06
just finish the narrative arc here, because
39:08
we have this sort of pushback from
39:10
the right to these groups
39:12
and platforms that are engaging in what
39:15
they believe is politically motivated censorship. Jim
39:17
Jordan starts sending out all these not
39:19
only letters, but subpoenas. He gets a
39:22
bunch of emails from platforms communicating with
39:24
governments and academic institutions. And
39:28
then something happens with Stanford
39:30
itself, which is spending tons
39:32
of money. I think you've
39:34
said millions of dollars to
39:36
defend. Yeah, to defend against
39:39
these claims and to respond to these
39:42
subpoenas. And maybe
39:44
at first that feels like they're sort of
39:46
on your side. They're sticking up for the
39:49
research team. But at
39:51
some point, that seems like it started to
39:53
change. And you recently learned that your position
39:56
at the Stanford Internet Observatory was being
39:58
discontinued, that there was not going to
40:00
be more funding made available for you
40:03
to continue working there. And
40:05
I'm just curious what your emotional reaction was
40:07
when you heard that. So
40:09
several of us were given that news at
40:11
the same time that there was no funding.
40:14
And I think the reaction
40:17
was disappointment, obviously. Well,
40:19
my reaction was disappointment, I can't speak for them. We
40:22
have a really close-knit team and we
40:24
do amazing work together. And
40:26
again, I think for us, the immediate reaction
40:29
was, are there
40:31
ways to get funding for particular project
40:34
lines? The
40:36
child safety work, I mean, I think
40:39
it's perhaps just to make
40:41
it clear, there are certain tools that we
40:43
have to build to be able to do that work.
40:45
It's not something you can just do anywhere because it
40:47
is illegal to view that kind of content in addition
40:49
to being seriously damaging. And so a
40:52
lot of what we've done is design
40:54
ways to do certain types of very
40:57
delicate trust and safety research in ways
40:59
that enable the team to
41:02
put out amazing work while not bumping into
41:05
some of the sort of terrible sides of it. I
41:08
also felt like because we have that breadth,
41:10
because all of us work on all of
41:12
these different project areas, I work on trust
41:15
and safety, I work on our generative AI
41:17
stuff, I work on our election integrity or
41:19
information integrity work, we built
41:21
an entire center on the idea of
41:23
all of these challenges are interrelated because
41:25
they happen on the same system, like
41:27
there's structural things here, how
41:29
can we have that pipeline from quantitative
41:32
empirical research to policy recommendations that also
41:34
take into account the way that all
41:36
of these things are related? There are
41:39
very, very few institutions that have that
41:41
type of analytical capacity and that have
41:43
that vision of the internet
41:46
as a complex system. And so
41:48
while there are many, many excellent institutions that do
41:51
deep work on topic A or topic B
41:54
or that write exceptional policy briefs, what
41:57
we really wanted to build at SIO, what we
41:59
did build at SIO, over five years was this
42:01
ability to study a very complex system at
42:04
a holistic level and make
42:06
material impacts across a fairly broad
42:09
array of topics. So
42:11
I want that to exist. And
42:13
I want to be doing it too. Also, I
42:15
would say, even if you're somebody
42:17
who says, wow, there's this censorship
42:20
industrial complex and these academics have
42:22
gotten out of control, I just
42:24
want to remind us that what
42:27
we're talking about is should universities
42:29
be able to study the way
42:31
that narrative spread online? Should they
42:33
have a sense of which narratives
42:36
are gaining popularity? Which accounts are
42:38
responsible for spreading them? How
42:40
do these networks of ideas work? This
42:43
is a sort of objective study of
42:45
reality, and it is somehow being
42:48
painted as this malicious effort
42:50
to censor speech. So I just want to say
42:53
that you can have different opinions about what should
42:55
we do about tweets we don't like and
42:58
should we be able to study the way
43:00
that information spreads online? I
43:02
think what I actually found
43:04
most disturbing and the thing that I think I'm
43:06
going to wind up writing about is that there's
43:09
a couple of tweets that go out from like
43:11
House Judiciary GOP and Jim Jordan saying explicitly, victory,
43:14
right? And that's a thing that I
43:17
think I don't know what it takes to jolt academia
43:19
out of its complacency, to
43:21
make them realize that this was
43:23
the objective, right? That the objective
43:26
was to silence the work of
43:28
a First Amendment-protected research
43:31
project and team. And
43:33
my frustration there is that
43:35
when you have a sitting
43:38
government official with subpoena power
43:40
gloating about killing First Amendment-protected
43:42
work and then saying freedom of speech
43:45
wins, I mean that is, you know, I feel like
43:47
Orwellian is the most over-used word on Twitter, but
43:50
like man, is that really it? I
43:53
want to ask about another part of the
43:55
criticism of some of the work that you
43:57
do that I actually think is important, is
44:00
sort of interesting. And it's not about Stanford
44:02
or academia in particular, but it's about actually
44:05
the role the government plays in
44:07
this whole universe of online platform
44:09
manipulation and disinformation. There's
44:11
this word jawboning that gets used a
44:13
lot in these debates. And for people who
44:15
aren't familiar, it's basically jawboning is sort
44:18
of when the government is kind of applying
44:20
pressure to private companies to kind of do
44:22
something they want. Even if
44:24
they're not legally required to. Right. It
44:27
could be as simple as someone
44:29
from the White House sending
44:31
an email to the trust and safety team
44:34
at Facebook or at another social
44:36
network and saying, hey, we've got
44:39
these 50 accounts that we believe
44:41
are spreading misinformation. Maybe you
44:43
should take a look at them. And maybe you want to
44:45
sort of apply a label to them
44:47
or maybe even take them down. And
44:49
we know, in part, thanks to some
44:51
of these subpoenaed emails, that this
44:54
kind of thing actually did happen. There were
44:56
people in the Biden White House emailing platforms,
44:58
talking with them about trying to
45:00
get certain content taken down. And that sometimes
45:02
the platform pushed back and refused to do
45:05
that, but sometimes they went along with it.
45:07
So do you think this issue
45:09
of jawboning is real? And do you think the
45:11
government has overstepped when
45:13
it comes to trying to enforce
45:15
social media policy on these platforms?
45:18
I think it's a really interesting
45:20
question. Jawboning is bad. And we
45:22
should be able to hold that idea in our head and
45:24
say, it is bad. It is not a
45:27
thing that we, as a democracy, should
45:29
want our government to do. There's a couple
45:31
of nuances there, meaning that the government
45:33
also has speech rights. The government also
45:35
has particular incentives, for example, during a
45:38
pandemic to communicate with platforms about here
45:41
we are trying to prevail upon you for why you
45:43
should do this thing. I
45:45
think that that's best done perhaps a little bit
45:47
more publicly. I think, though, interestingly, when you
45:50
do see Biden say something like in
45:52
a public press conference, like, what did
45:54
he say, you're killing people? Was that
45:56
the sentence? That's also sort of viewed
45:58
as like, whoa. This was something that
46:01
he said about. Facebook during the pandemic,
46:04
basically accusing them of killing people by
46:06
not removing more misinformation about
46:08
vaccines and things like that. Right. So
46:11
there's a whole spectrum of government communications,
46:13
public and private. One of the
46:15
things that we see is governments,
46:17
not the United States, but
46:19
other governments making explicit content
46:22
takedown requests explicitly to throttle their
46:24
political opposition. You see the Modi
46:26
government requesting Sikh politicians in Canada
46:28
have their content throttled so that
46:31
it can't be seen in India.
46:34
Right. That is, I would
46:36
argue, rather transparently censorship in the actual
46:38
sense of the word. So
46:41
this is a worthwhile thing to be looking
46:43
at. I think that Google
46:45
in particular will put up these transparency
46:47
reports where it says the government requested
46:50
action on, and then it will sort
46:52
of list content that governments request action
46:54
on. I think that's a very reasonable
46:56
thing for tech platforms to do, which
46:58
is to say when these requests or
47:00
asks come in, we're going to make
47:03
them public. Right. And that
47:05
provides then, I think, kind of a check on
47:07
government, because if they don't want that request being
47:09
made public, then maybe they won't make it. Right.
47:12
Or if they feel like it's a it's a
47:14
very, very important thing and a thing that they
47:16
want to request, they can either do it publicly
47:18
themselves or make it public after the fact. I
47:20
think we need government and platforms to have open
47:23
channels of communication, particularly because
47:25
there are certain areas where you do
47:27
see meta in some of its adversarial
47:29
threat reporting about state actors in particular,
47:32
like China, saying the government no longer
47:34
talks to us because it's afraid of
47:36
being seen as somehow any
47:39
communication is jawboning. And that, I think, is also a
47:42
very, very bad state for us to be in. Your
47:45
book is sort of about how we ended
47:47
up in the place that we are now, which
47:49
is where you have millions of
47:51
Americans who are deeply invested in
47:54
conspiracy theories. It kind of feels like we have
47:56
what you call bespoke reality, where everyone is
47:58
just kind of stitching together their own version
48:01
of events based on the sources
48:03
that they're following, the influencers they pay attention
48:05
to and trust. We
48:08
don't have a sort of broad
48:10
consensus reality anymore. You also
48:12
have some ideas in your book about how we could sort
48:15
of start to make our way back
48:17
to something like consensus reality, how we
48:19
could start to turn the
48:21
tide of disinformation and extremism and
48:23
all this stuff. Can
48:26
you walk us through some of your ideas for that? Yeah,
48:28
so a big area
48:30
of focus for me has been design and
48:32
that's because I think
48:35
people hope for regulation. I'm a little bit
48:37
more of a skeptic on the regulatory front
48:39
and that's mostly because I don't, from a
48:41
purely pragmatic standpoint, I just don't see how
48:43
anything gets passed in the United States, right?
48:45
So the basic, you know, we've
48:47
been talking about tech reform and tech accountability and
48:49
so on and so forth and everything from antitrust
48:52
to child safety to privacy to, you
48:54
know, and then a whole
48:56
slew of like very, very bad bills also,
48:58
but nothing gets passed anyway. So
49:01
I think what we look at here is
49:03
the question of can you, you know, what
49:05
did we used to do to arrive at
49:07
consensus? We've always had heated debates. How did
49:10
we get to a point where we could
49:12
not have any kind of ability to bridge?
49:15
I think one of the things that happens
49:17
is when you have heated debates in your
49:19
local neighborhood, you usually talk to your neighbors,
49:21
right? You're geographically constrained. You see these people at
49:23
the bus stop, you see them at the library.
49:27
You don't spend all of your time screaming obscenities at
49:29
them, you know? No, you go
49:31
on to Nextdoor like a reasonable person
49:33
and you write an all caps post complaining
49:36
that your neighbor set off fireworks at 11
49:38
p.m. and it woke up the dogs and
49:41
no, that might just be my neighborhood.
49:44
I am no longer on Nextdoor. Yes,
49:51
no, I think there's a, you know,
49:53
you can have civil disagreements
49:55
in the real world. I think it's hard to look somebody in
49:57
the face and like accuse them of being a... Politicians
58:00
have increasingly been engaging with the crypto
58:02
industry as part of a strategy to
58:04
win their elections. Well, tell me about
58:06
this. So last year, RFK Jr., who's
58:09
running for president on a third-party platform,
58:11
chose a crypto event in Miami as
58:13
the place to make his big campaign
58:15
debut, and he declared that he was
58:17
a big lover of the crypto industry.
58:20
And then, just over the last month,
58:22
Donald Trump has also been invoking crypto
58:25
in his campaign speeches and positioning himself
58:27
as a friend of the crypto industry.
58:29
And now, even apparently, President Biden is thinking about
58:32
meeting with the crypto industry to talk about policy.
58:34
Well, that is interesting, although Kevin, when it comes
58:36
to RFK Jr., we can never forget that a
58:38
worm did eat part of his brain. That's
58:42
very true. So, you know, it's
58:44
been a little weird as someone who's been sort
58:46
of following the crypto industry for a while to
58:48
sort of see this, you know, this sort of
58:50
turn of events where politicians who used to dismiss
58:53
crypto out of hand are now apparently taking it
58:55
seriously. And I think
58:57
it's just a very revealing story about how
58:59
the crypto industry has been working behind the
59:01
scenes to kind of drum up support among
59:03
lawmakers to try to beat back some of
59:05
these regulations that it thinks are going to
59:07
hurt its ability to make money, and also
59:10
how it's using its money in very conventional
59:12
ways to try to influence the upcoming election.
59:14
So this week, three of my colleagues, David
59:16
Yaffe-Bellany, Erin Griffith, and Teddy Schleifer, published
59:19
a story titled How Crypto Money is
59:21
Poised to Influence the Election. Basically,
59:24
it's about this new attempt that the crypto industry
59:26
is making to raise a bunch of money and
59:28
to start super PACs and to start distributing it
59:30
to candidates in races where they think their support
59:32
could make a big difference. And I'm very excited
59:34
about this, because any time I hear about a
59:36
lot of crypto money going somewhere, I think it's
59:39
a fresh opportunity for people to eventually be incarcerated.
59:41
Right. So I
59:44
thought this was a very revealing piece, not just because of
59:46
what it said about the crypto industry, but because of what
59:48
it says about politicians and how easily some of them apparently
59:51
can be bought or at least convinced
59:53
to take crypto more seriously. So
59:55
to talk about this piece and what it
59:57
means, we've invited our old pal, David Yaffe
1:00:00
Bellany. DYB, back on the show. BRB with
1:00:02
DYB? Well, BRB, have you ever used
1:00:04
that joke with DYB? Yeah. David
1:00:11
Yaffe-Bellany, welcome back to Hard Fork. Thanks
1:00:14
so much for having me. By my
1:00:16
count, this is your seventh appearance
1:00:18
on this show. You are the
1:00:20
most frequent Hard Fork guest. How
1:00:22
does it feel? Well, do I
1:00:24
get some sort of medal or like any
1:00:26
hardware to signify this achievement? We just put
1:00:28
an NFT in your crypto wallet. You'll want
1:00:30
to check that out later. That's even better,
1:00:33
Davey. DYB, where are we catching you
1:00:35
right now? I'm coming
1:00:37
to you live from Puerto Rico,
1:00:39
where I'm on a real grueling
1:00:41
hardship reporting assignment for the next
1:00:43
few days. You're just like sipping
1:00:45
margaritas with Frances Haugen, aren't you?
1:00:48
Yeah, basically. Well,
1:00:51
I hope you're getting hazard pay for your
1:00:53
arduous reporting trip to Puerto Rico. Before
1:00:56
we dive into the story of crypto
1:00:58
money and the 2024 election, I think
1:01:00
it would
1:01:03
be helpful if you just sort of
1:01:05
mapped the terrain of crypto politics for
1:01:07
us a little bit. And
1:01:09
I want to start by asking
1:01:12
you about how crypto is being
1:01:14
viewed on the right and specifically
1:01:16
by former President Trump. Because until
1:01:18
fairly recently, he was not
1:01:20
a fan of the crypto industry. He used
1:01:23
to say stuff like calling Bitcoin a scam.
1:01:25
But recently, he's totally flip-flopped. And
1:01:28
this year, he has declared himself
1:01:30
a friend of crypto. He's accepting
1:01:32
campaign donations in crypto. He's taking
1:01:34
up causes that matter to crypto
1:01:36
supporters. He recently met with Bitcoin
1:01:38
miners at Mar-a-Lago. And he's been
1:01:40
saying stuff in his speeches like,
1:01:42
I will end Joe Biden's war
1:01:44
on crypto. He even has his
1:01:46
own NFT series. So what
1:01:49
happened? So that's a really good
1:01:51
question. Like you said, he had this kind of
1:01:53
long history of disparaging comments about crypto. And
1:01:56
really, up until even kind of earlier
1:01:58
this year, group
1:06:00
of PACs, the largest of which
1:06:02
is called Fairshake. Those
1:06:05
groups are sitting on a pool of money, more
1:06:07
than $150 million, which
1:06:10
in the tech world is not an astounding amount of
1:06:12
money, but
1:06:15
in politics, it can really make a huge difference.
1:06:20
So lay out the political agenda of these
1:06:22
PACs. What
1:06:25
do they hope to accomplish? They
1:06:28
want to elect pro-crypto candidates. They're
1:06:31
talking about sending questionnaires along to candidates to
1:06:33
gauge their views on crypto, and then the
1:06:35
idea is to elect people who
1:06:38
will back pro-crypto legislation. That
1:06:41
could be a bill that strips a lot of
1:06:43
power away from the SEC that
1:06:45
says that cryptocurrencies are not actually
1:06:47
securities, and therefore they're
1:06:49
allowed to be offered and traded the way they
1:06:51
have been in the US. And
1:06:55
what kinds of races are these
1:06:57
crypto super PACs most focused on
1:06:59
right now? So Fairshake,
1:07:01
the biggest of the PACs, announced
1:07:03
a couple of months ago that
1:07:05
it was going to focus on
1:07:07
four Senate races, including two that
1:07:09
are very competitive, that involve Democrats
1:07:11
who are looking pretty vulnerable in
1:07:13
their re-election efforts. And that's, there's
1:07:15
the Senate races in Montana and
1:07:17
Ohio. So it's Jon Tester in
1:07:19
Montana and Sherrod Brown in Ohio,
1:07:21
who are both kind of vocal
1:07:23
Democratic critics of the crypto industry,
1:07:26
kind of facing re-election in those crucial
1:07:28
states. And
1:07:30
are these super PACs mostly or exclusively
1:07:32
supporting Republicans? Because there are some Democrats
1:07:34
who are seen as pro-crypto,
1:07:38
or at least a little less anti-crypto
1:07:40
than maybe Elizabeth Warren and other
1:07:42
very anti-crypto Democrats. So are
1:07:44
they supporting any Democrats or
1:07:46
independents? Absolutely, and the
1:07:48
PACs and the companies that are backing
1:07:51
them are very quick to say that
1:07:53
they consider this a bipartisan issue, they
1:07:55
see strong supporters of crypto on both
1:07:57
sides, etc., etc. It's
1:08:00
true that one of the first major
1:08:02
expenditures by Fairshake was in the California
1:08:04
Democratic Senate primary, where the group spent
1:08:06
about $10 million on attack ads against
1:08:09
Katie Porter, who was one of the
1:08:11
Democratic candidates and was seen as sort
1:08:13
of a close ally of Elizabeth Warren.
1:08:15
And so she was defeated and Adam
1:08:17
Schiff ended up winning that race. And
1:08:20
Schiff went on to
1:08:22
meet with Coinbase and some other
1:08:24
crypto firms at Coinbase's offices a
1:08:27
few weeks after that election. So
1:08:29
you definitely see these
1:08:31
groups kind of rubbing shoulders with Democrats
1:08:33
as well as Republicans. And
1:08:35
how much of this activism by the crypto
1:08:37
industry do you think has been helped by
1:08:40
the fact that crypto prices are quite high
1:08:42
right now? I mean, if we were talking
1:08:44
in 2022 when the sort of
1:08:47
crypto industry had collapsed and all
1:08:49
these coins, their value had
1:08:51
fallen precipitously, there just might
1:08:53
not have been as much money to spend
1:08:55
on these races. So how much is the
1:08:58
fact that like Bitcoin is close to an
1:09:00
all time high now that a lot of
1:09:02
crypto prices have recovered and are booming again?
1:09:04
How much has that helped these attempts to
1:09:06
influence the political process? Yeah, I
1:09:08
mean, it's unquestionably a big part of
1:09:10
it. I mean, you know, most of Coinbase's
1:09:12
revenue comes from transaction fees on crypto
1:09:15
trades. And crypto trading ramps up and the
1:09:17
sizes of those trades tend to be bigger
1:09:19
when the market is doing well. And so
1:09:21
Coinbase does a lot better when the
1:09:23
market is doing well, generates a lot more
1:09:26
revenue. And you can see that in its earnings
1:09:28
reports every quarter. And so,
1:09:30
you know, Coinbase has more money to spend
1:09:32
now than it would have had two years
1:09:34
ago. And, you know, thus it can it
1:09:36
can afford to lay out 50 million dollars
1:09:38
on a PAC. You
1:09:42
mentioned the Katie Porter race where
1:09:44
the crypto people got what they wanted.
1:09:46
Are there other examples of them winning?
1:09:48
Like do they feel like they have
1:09:51
some real momentum? So
1:09:53
one sort of cautionary thing I would say
1:09:55
is it's always like difficult to determine like
1:09:57
causation here. Like we know that Katie Porter
1:09:59
lost and we know. that the crypto industry
1:10:02
spent a lot of money in that race,
1:10:04
but like, you know, was one a result
1:10:06
of the other. It's not totally clear. They're
1:10:08
very quick to claim that scalp, but I
1:10:10
think that we probably need more evidence before
1:10:12
we can like definitively say that this money
1:10:15
is like shaping the elections. I mean, another
1:10:17
claim that sort of backers of some of
1:10:19
these PACs are making behind the scenes is
1:10:21
that Sherrod Brown's position on some crypto issues
1:10:23
has kind of softened, you know, he's voiced
1:10:26
a willingness to vote for some pro-crypto legislation
1:10:28
as a result of the threat to spend
1:10:30
a huge amount of money in his race.
1:10:33
But if he had simply put his position on
1:10:35
the blockchain, it would have been immutable and that
1:10:37
it never could have either softened or hardened. So
1:10:39
that's something that candidates should be thinking about. Exactly.
1:10:42
This is how we stop the flip-flopping that
1:10:44
bedevils our political process, yeah. So
1:10:46
obviously there are parts of this that just sound very
1:10:49
traditional and sort of about
1:10:52
a special interest trying to
1:10:54
influence the political process, whether through
1:10:56
big campaign donations or super PACs.
1:10:59
But there's also this idea among some people
1:11:01
I talked to in the crypto industry about
1:11:04
the crypto voter, right? There's this idea that
1:11:06
a lot of crypto leaders have that there
1:11:08
are millions of Americans out there for whom
1:11:10
crypto is a very important
1:11:12
issue and who will vote for candidates
1:11:14
who support crypto and won't vote for
1:11:17
candidates who don't support crypto. What
1:11:20
do you make of that theory about the
1:11:22
crypto voter? I
1:11:24
mean, I know Casey is a single
1:11:26
issue crypto voter. Correct. Every
1:11:28
single decision is shaped by these issues.
1:11:30
So it seems plausible to me. This
1:11:33
is something I joked about with my colleague, Kellen
1:11:37
Browning, who used to be on the tech
1:11:39
team and covers politics now. And I
1:11:41
said to him a few months ago while he was sort
1:11:43
of on the campaign trail, like,
1:11:45
so are you running into a lot of
1:11:48
these single issue crypto voters? And he just
1:11:50
laughed. Like, of course, nobody's talking about Bitcoin
1:11:52
at a Trump rally or whatever. But
1:11:54
the industry has these surveys
1:11:57
that are exclusively commissioned
1:11:59
by the...
1:20:05
Before we go, just a note, if
1:20:07
you want to hear more about the
1:20:09
Surgeon General's call for a warning label
1:20:11
on social media platforms, The Daily has
1:20:14
an episode out today featuring an interview
1:20:16
with the Surgeon General Vivek Murthy himself.
1:20:18
So go check that out if you want to hear more. Hard Fork
1:20:22
is produced by Whitney Jones and Rachel
1:20:24
Cohn. We're edited by Jen Poyant.
1:20:27
We're fact-checked by Caitlin Love. Today's
1:20:29
show was engineered by Daniel Ramirez. Original
1:20:32
music by Elisheba Ittoop, Marion
1:20:34
Lozano, Rowan Niemisto and Dan
1:20:36
Powell. Our audience editor is
1:20:38
Nell Gallogly. Video production
1:20:40
by Ryan Manning, Soyo Roque and
1:20:43
Dylan Bergeson. Check us out on
1:20:45
YouTube at youtube.com/hardfork. Special
1:20:47
thanks to Paula Szuchman, Pui-Wing Tam,
1:20:49
Kate LoPresti and Jeffrey Miranda. You
1:20:52
can email us at hardfork at
1:20:54
nytimes.com. I'll be accepting birthday wishes
1:20:56
all week. And also crypto
1:20:58
donations, for sure. You've
1:21:22
always known just how smart she is. From
1:21:27
her early aptitude for science, to
1:21:29
her first big discovery. And
1:21:31
now that graduation is right around the corner, she
1:21:34
can continue to excel. America's
1:21:36
Navy offers her the chance to get hands
1:21:38
on training and experience for the career she
1:21:40
was born for. From reactor physics
1:21:42
to IT and cyber systems or engine design, it's
1:21:44
the smart way to get ahead. Learn
1:21:47
more at navy.com. America's Navy.
1:21:50
Forged by the sea.