Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:00
Hey there, cyber listeners, Matthew here. This episode
0:02
was brought to you by DeleteMe. A
0:05
few years ago, I did some reporting that made
0:08
quite a few people pretty angry. I got a
0:10
few death threats, and I started to worry about
0:12
how much of my private information was available in public.
0:14
So we've got a safety team here at Vice
0:16
that helps journalists navigate these kinds of situations,
0:18
and the single best thing
0:21
they did was sign me up for DeleteMe.
0:24
So, you know, how would you like to wake up one
0:26
day and discover your bank account has been emptied, or
0:28
get overdue notices for credit cards you never applied
0:31
for? Or worse, you don't
0:33
realize how much of your information is available to
0:35
scammers on the internet, and how susceptible you
0:37
and your family are to
0:40
identity theft and fraud. Now
0:42
you can get your data removed with
0:44
DeleteMe. These are just two of
0:46
the many reasons that I personally recommend
0:48
DeleteMe. DeleteMe is a subscription
0:50
service that removes your personal info from
0:52
the largest people search databases
0:54
on the web, and in
0:56
the process helps prevent potential ID theft, doxing,
0:58
and phishing scams. And this is not just
1:01
a one-time thing either. It is always working
1:03
for you. I've been using
1:05
it for a few months now, and
1:07
I'm constantly getting updates about
1:10
where my information is popping up and
1:12
how to remove it. And one
1:15
of the best parts: a lot of the stuff just
1:17
gets automatically removed for me. To put
1:19
it simply, DeleteMe does all the hard work of
1:21
wiping you and your family's personal info off
1:23
of the web. Data brokers hate DeleteMe.
1:26
When you sign up for DeleteMe, it
1:28
immediately goes to work scrubbing all your
1:30
personal information from data broker platforms.
1:40
Your personal profile is no longer theirs to
1:42
sell. So take control of
1:44
your data and keep your private life private
1:46
by signing up for DeleteMe now.
1:49
There is a special discount for our
1:51
listeners: today, get 20% off
1:53
your DeleteMe plan when you go to
1:56
joindeleteme.com/cyber and use
1:58
promo code cyber at checkout.
2:00
The only way to
2:03
get 20% off is
2:05
to go to joindeleteme.com/cyber,
2:07
enter code cyber at
2:10
checkout. That's joindeleteme.com/cyber code
2:12
cyber. Hey there cyber listeners,
2:14
Matthew here. Just a quick show note
2:17
at the beginning of the episode, we
2:19
get into legislation around artificial intelligence in
2:22
this episode. It's a really great
2:24
conversation. I want to note that
2:26
hours after we got done recording
2:28
this call, the Federal Communications Commission
2:30
declared that the use of AI
2:32
generated voices in robocalls is a
2:35
scam and is illegal. The
2:37
way they did this was pretty interesting because
2:39
it was not new laws.
2:41
It was using old
2:43
laws to basically confirm that this
2:45
new technology falls under the auspices
2:48
of a scam. It
2:50
was essentially a ruling that used
2:52
the Telephone Consumer Protection Act, which is from 1991,
2:55
to say that
2:59
generating a fake voice
3:02
and calling someone and trying to get them to do
3:04
something is a scam. As
3:06
we talk about in this episode, this happened
3:08
because before the New Hampshire primary voters were
3:11
getting calls from an AI generated Biden who
3:13
was telling them not to vote. Again,
3:16
this happened literally hours after
3:18
we got done having this conversation and
3:21
that is why it is not mentioned in this
3:23
episode. So
3:54
we are with Leah and Janus.
3:56
Can you all introduce yourselves? Janus,
3:58
will you start? Hi,
4:01
I'm Janus Rose. I am a
4:03
senior editor at Motherboard where I cover AI
4:05
and other topics. I'm
4:09
Leah Holland. I use they/she pronouns, and I'm
4:11
the Campaigns and Communications Director at the digital rights
4:13
organization Fight for the Future. And
4:16
can you tell me a little bit about what Fight
4:18
for the Future does and what you all are working
4:20
on? Yeah, so
4:22
Fight for the Future is a
4:25
queer women-led national digital rights
4:27
nonprofit that really focuses on
4:29
issues of surveillance and privacy
4:31
and censorship. We like to
4:33
be the grassroots voice of
4:36
the users of technology and
4:38
speak up for their
4:40
interests when corporations or legislators
4:42
are making bad decisions on
4:44
behalf of the masses online.
4:47
So Janus, you've been talking about doing
4:50
kind of an AI episode
4:52
with this specific event for
4:54
a while. And then we had a
4:56
news story. There's a pretty great news peg for it. I
4:58
mean, it's a horrible story, but it's a good news peg
5:00
for what we're going to be talking about. I wanted you
5:03
to kind of run us through that first. It's kind of
5:05
like table setting. Yeah, so I
5:07
think it was last week. There was
5:09
this big story about Taylor Swift, as
5:11
a lot of big stories tend to
5:13
be these days. And
5:15
essentially, there was
5:17
this website that started producing
5:20
deep fake AI nudes of
5:22
Taylor Swift. And they
5:24
started, of course, spreading on social
5:27
media, including on Twitter, which is
5:29
now called X because Elon Musk
5:31
is an edgelord and
5:34
Facebook and Instagram and other
5:36
places and Reddit. And
5:38
basically, this was
5:42
a thing that was entirely predictable to pretty
5:42
much anyone who has been paying attention to
5:45
the space. These are
5:47
AI models that are able to
5:49
generate images and they've been getting
5:51
increasingly sophisticated, increasingly powerful. We've
5:54
been covering them for quite a while at motherboard
5:56
since the days when they
5:58
released DALL-E, which is OpenAI's
6:00
first foray
6:03
into these kinds of, like,
6:05
generative AI things. Yeah,
6:07
it kind of ended up
6:09
in this kind of big headline
6:11
way where, you know, here's a
6:13
famous woman who is essentially being
6:16
humiliated on a massive, you
6:19
know, international scale by this
6:21
technology that is created by
6:24
large tech corporations that are
6:26
for the most part, completely
6:28
unregulated in producing these tools.
6:31
And this is a thing that usually, as you
6:34
said, we've been covering this for a long time, this has been
6:36
going on for a while, years. And it seems to
6:38
me that a big part of the
6:40
news story this time was that it kind
6:43
of breached containment, right? It
6:45
went from discord groups where
6:47
it's being shared privately, which doesn't make it
6:49
right. I'm just saying that that's kind of
6:51
where these things kind of percolate in an
6:54
unfortunate way and then made it into more
6:56
mainstream sources. Do we have any
6:58
idea like how or why that happened? Like
7:00
why did this start getting circulated on Twitter
7:02
all of a sudden? Yeah,
7:05
well, I mean, there's been a couple of events leading
7:07
up to this that have been kind of testing
7:09
the waters, so to speak. And
7:11
like there were a couple of incidents
7:14
that were, you know, cited in some
7:16
of this legislation that we're going to
7:18
talk about that involved, for example, people
7:20
making fake songs
7:23
that essentially used
7:25
a large cache
7:27
of audio of the singing voices of
7:29
people like Drake and The Weeknd in
7:32
order to produce new songs that
7:34
sounded like they were being sung by
7:36
those artists. And
7:38
that was like one thing that happened. That was, I
7:40
think, a couple of months ago. And
7:43
then, you know, there was, of course,
7:45
that viral George Carlin
7:47
standup routine that was awful, which
7:49
we wrote about where the
7:51
George Carlin family is actually suing
7:53
the
7:56
podcast, which is, like, sponsored
7:58
by an AI company that
8:01
is completely anonymous, which is really weird
8:03
and shady. And they won't, the
8:05
comedians have signed NDAs and won't disclose the
8:07
exact nature of their relationship or who's behind
8:09
the AI. Also, fun
8:11
follow up, after the lawsuit dropped, one of
8:13
the guys copped to writing the whole thing.
8:16
Yeah, I saw that.
8:18
Which is really fun, which is like another, like put a
8:21
pin in that. We'll
8:23
come back to that. I don't know if that's gonna
8:25
really save you, like... No, it's not. I
8:27
think, by saying, oh yeah, by the way,
8:29
I wrote it. Ha ha, just kidding. It wasn't
8:32
an AI. Yeah, I don't know if that's gonna
8:34
really help your case. No,
8:36
absolutely not. So, Leah,
8:38
what did you, I assume you were following
8:40
the story, what did you make of it? And
8:43
also, what did you make of the calls
8:45
for legislation afterward? Because there
8:47
was quite a bit of them. Yeah,
8:49
what I made of it was
8:51
that something like this was more
8:54
or less inevitable. And the impacts
8:57
of AI impersonation or
8:59
deep fakes or humiliation
9:02
of women
9:04
and people from marginalized communities in particular are
9:06
only gonna get worse
9:10
as we adjust to the new reality of these
9:12
tools being so broadly available
9:15
and don't really have, if we even have
9:17
on the internet at all, maybe cultural norms
9:19
about what we're supposed to do or aren't
9:21
supposed to be doing with images and what
9:24
have you, I think that this is
9:26
something that we are right now
9:29
in the start of adjusting to
9:31
both when it comes to three
9:35
different categories of people that I think
9:37
are important to think about. The first
9:39
of which is your Taylor Swift, the
9:41
second of which is your abuse survivors,
9:44
your everyday people who have
9:46
been victims of revenge porn and what have
9:48
you. And then the third here also is
9:50
politicians. Everybody's screaming from
9:52
the rooftops about that Biden
9:54
deep fake robocall telling people not to
9:56
vote. And I think that in the election year, we're gonna
9:58
be seeing a lot more play on that.
10:01
And we haven't only seen
10:03
calls for legislation here, we've
10:05
seen actual legislation being proposed,
10:07
several different bills, federally that
10:09
on their surface seem to
10:11
be trying to do a
10:13
good thing but, unfortunately, it seems,
10:18
would potentially make the situation
10:18
worse or make the overall
10:20
status of artists in particular, of the
10:22
next generation of Taylor Swifts, worse than
10:24
it is today. First of
10:27
all, can you tell me more about the Biden
10:29
deep fake call? Yeah, so
10:32
in New Hampshire, I think
10:34
a couple of weeks back
10:36
during the primary, there was
10:38
a really interesting operation that
10:41
was carried out, I think it was
10:43
first flagged by an election
10:45
organizer whose phone number was
10:47
spoofed to call people in
10:49
New Hampshire and play
10:51
a recording of what
10:54
sounded like Joe Biden telling them not to
10:56
turn out and vote. And
10:59
the election organizer found out
11:01
that this had happened when people started calling
11:03
her number back. And,
11:06
and that
11:09
is very obviously a concerning
11:12
thing when you can take the voice of
11:14
a politician and tell those
11:16
who support them to do something that they
11:18
don't want them to do.
11:20
And I'll also add to
11:23
that simultaneously, we also are now seeing
11:25
politicians like Donald Trump claiming that photos
11:27
they don't like, that
11:30
don't make him look good, are actually
11:33
AI deep fakes and that those photos
11:35
aren't real. So it's just it's a
11:37
mess is what it is. We're
11:40
entering into a lovely world where I mean,
11:42
not only can you not trust audio
11:45
video that you see, especially online,
11:48
but then people can use it as a
11:50
way to dismiss any information they don't like.
11:53
I was actually thinking about what is
11:55
a term for that last phenomenon you
11:57
just described because in I
12:00
know in activist spaces, there's a term
12:02
called fed jacketing or snitch jacketing, which
12:04
essentially means when you when people start
12:07
accusing one another of being secret informants
12:09
or secret like government agents.
12:12
And so I feel like in the same way we're
12:14
about to because of the fact that the
12:17
media environment is so
12:19
completely, like chaotic and
12:21
untrustworthy. And because of AI playing a
12:24
huge part in that it feels like
12:26
we're about to start seeing like AI
12:28
jacketing or some some I don't
12:30
know, I was trying to come up with like a
12:32
better term for it. But like, I've been noticing this
12:34
recently. And also with your reporting,
12:37
Matt, on Palworld. There were
12:39
people... Palworld is
12:42
this video game that we've been talking about,
12:44
where it's essentially Pokemon, but, like, it's made
12:46
by this other company, and they kind of
12:49
sort of look like Pokemon, but
12:51
they're not. And people hate this game because
12:53
it's, like, ripping off of Nintendo's
12:55
IP. And some people were accusing it of
12:57
using AI generated assets, but there was no
13:00
evidence for that. So it's just kind of
13:02
like another example of people, you
13:04
know, using this like environment
13:06
of distrust to, like, throw
13:09
accusations. The Palworld
13:11
one is a really good low-stakes example.
13:14
Because I remember watching
13:17
that, it was one account
13:19
on Twitter that accused
13:21
the game of stealing
13:23
assets and using generative AI to change
13:25
them just enough, and then
13:28
provided like screenshots that
13:30
proved quote unquote, that this is what this is
13:32
what had happened. And it's
13:35
one of those things where like you hear about it,
13:37
and then it just becomes part of the conversation
13:39
around the game. Like, oh, they used AI, did you
13:41
know that they used AI isn't this terrible, etc, etc.
13:44
And the company had previously published a
13:46
game that used generative AI in it,
13:48
it was like a social deduction game
13:50
where you're trying to figure out what
13:53
portrait was created by
13:56
generative AI. Kind of like
13:58
Among Us using Midjourney,
14:00
basically. Then the person that
14:02
had done the original tweet citing the evidence admitted
14:04
that they just made the whole thing up later,
14:07
and that it was complete bullshit. So
14:09
yeah, I think AI jacketing is a good term. They
14:12
were also just posting screenshots of
14:15
the CEO's tweets saying like, oh
14:17
yeah, I think the generative AI
14:20
is interesting. That was
14:22
their hard evidence. And then we
14:24
also see, I've seen because I'm also
14:26
an author and a huge publishing nerd,
14:28
we've seen cover artists
14:30
in publishing be accused
14:32
by the cancel
14:35
mob that exists within publishing,
14:38
rightly sometimes and also wrongly sometimes,
14:40
of using AI to
14:43
make book covers when, in reality, this
14:45
is an artist who's worked a very
14:47
long time to make something extraordinarily beautiful
14:49
and that sort of controversy
14:51
and just the sheer mental
14:55
and emotional weight of that sort
14:57
of attack. They're really serious and
14:59
pretty harmful, this whole idea
15:01
that if you touch AI at all, then
15:03
we should throw you off a cliff. I
15:06
was following one of those stories last year
15:09
and didn't get a chance to really report
15:11
out on it. It
15:13
was mostly happening in, like, Facebook groups. And
15:16
it was a freelance artist who
15:19
people had decided had used AI
15:21
to generate her covers. And
15:23
I could not tell. I
15:25
couldn't quite decide
15:28
to my own satisfaction
15:30
if she was using
15:32
generative AI or if
15:35
she was just a very talented artist.
15:37
Some of the tells were
15:39
there. Maybe she used it to
15:41
do a pass. But it was
15:43
difficult. It was really hard. And that's ultimately
15:45
part of why I didn't report on it.
15:47
Because I couldn't make a determination either way.
15:49
And there have been
15:53
scandals around the use of generative AI
15:56
by major publishers in the past few
15:58
years. Like
16:00
this stuff does happen. Everyone's
16:02
getting paranoid. It's terrible. I
16:06
think that that is symptomatic
16:09
to a certain extent of the fact that we are
16:11
grappling with this thing as a culture and how it's
16:13
going to be and how we're going
16:15
to think about it and what are the acceptable
16:17
ways to use it and what are the
16:20
acceptable ways to criticize it. And
16:23
we're not there yet collectively. That
16:26
growing pain, to me, while I think very
16:28
painful for a lot of people, seems
16:30
relatively normal in terms of technology.
16:34
We do kind of go through this every time there's a sea
16:36
change, right? And it's going to take I
16:39
think on the legislative side, unfortunately, probably years to
16:41
catch up and we're probably going to do some
16:44
things that we shouldn't do. Yeah.
16:48
Janus, you were going to say something about paranoia.
16:50
Oh, just that, you know, the
16:52
kind of paranoia is made worse by the fact
16:54
that, you know, it actually
16:56
is happening. Like, people are
17:00
generating, like using generative AI to
17:02
do all these kind of like deceptive things. And
17:04
so, you know, it's
17:07
because of that that the sort
17:09
of accusations become effective.
17:11
Like, it wouldn't be effective if we
17:13
weren't currently living in a
17:15
time when it was so trivially
17:18
easy to generate visual propaganda using
17:21
systems created by
17:24
large corporations that were built using
17:26
millions of other images taken without
17:28
permission. And playing
17:32
off of that paranoid surface area, one
17:34
of the most common quote-unquote solutions that
17:36
I see proposed to try to
17:39
make the internet
17:41
trustworthy again or some sort
17:44
of Don Quixote type
17:46
task is to enforce
17:52
labeling mechanisms across every
17:54
tool that uses generative
17:58
AI to make images or
18:00
video or music or what have you. And on
18:03
the surface that could seem like a
18:05
good idea but we have a lot
18:07
of concerns about the reality that in
18:09
many places generative AI tools will be
18:11
made that don't have those AI
18:14
labeling routines and
18:16
that if we
18:18
come to trust that what is labeled as AI
18:20
is AI and what
18:22
is not labeled as AI is
18:24
real, we lay the
18:27
ground for the most sophisticated and
18:29
most manipulative actors with the highest
18:31
stakes to be able to fool
18:34
us. And there's all
18:36
these secondary cascading effects
18:38
of content filtering on platforms
18:40
and whether or not we're surveilling
18:42
and collecting data on who's making
18:44
AI images at the point of
18:47
creation, and there's just
18:49
a lot there, and I see a
18:51
lot of folks trying to go down that path
18:53
and I think
18:55
like you said Matthew it's really
18:58
important to think these through all
19:01
the way because
19:05
even a little bit of "we should label
19:07
it" could have a lot of unintended effects. What
19:10
would that even look like? Are you
19:12
talking like a watermark? What's been suggested?
19:14
Yeah they're talking about
19:17
visual watermarks they're
19:19
talking about invisible
19:22
cryptographic watermarks that then your
19:24
Instagram, or what have you,
19:26
could read and then pull
19:28
a label in the app from that
19:31
but the internet's ability
19:33
to get around
19:35
restrictions such as these is
19:37
well established. I
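The invisible cryptographic watermark Leah describes, something a generator embeds and a platform's app can verify, can be sketched in a few lines. This is a toy illustration only: the key name, the LSB embedding, and the verifier are all hypothetical, and real schemes (C2PA-style signed metadata, or learned pixel-domain watermarks) are far more elaborate and more robust than this. It also shows the brittleness she's pointing at: any edit to the image silently destroys the label.

```python
# Toy sketch: embed an HMAC tag in the least significant bits of
# "pixel" bytes, so a platform holding the key can verify the image
# came from a given tool. Everything here is illustrative, not any
# real tool's scheme.
import hmac
import hashlib

KEY = b"tool-signing-key"  # hypothetical key shared by generator and platform
TAG_BITS = 256             # HMAC-SHA256 tag = 32 bytes = 256 bits

def embed_watermark(pixels: bytearray) -> bytearray:
    """Write an HMAC of the high bits into the LSBs of the first 256 bytes."""
    high = bytes(p & 0xFE for p in pixels)          # content, ignoring LSBs
    tag = hmac.new(KEY, high, hashlib.sha256).digest()
    out = bytearray(pixels)
    for i in range(TAG_BITS):
        bit = (tag[i // 8] >> (7 - i % 8)) & 1
        out[i] = (out[i] & 0xFE) | bit              # one tag bit per byte's LSB
    return out

def verify_watermark(pixels: bytes) -> bool:
    """Recompute the HMAC and compare it to the bits stored in the LSBs."""
    high = bytes(p & 0xFE for p in pixels)
    tag = hmac.new(KEY, high, hashlib.sha256).digest()
    for i in range(TAG_BITS):
        expected = (tag[i // 8] >> (7 - i % 8)) & 1
        if (pixels[i] & 1) != expected:
            return False
    return True

img = bytearray(range(256)) * 4      # stand-in for 1024 bytes of pixel data
marked = embed_watermark(img)
print(verify_watermark(marked))      # True: the platform could label it as AI
marked[500] ^= 0x02                  # any edit or lossy re-encode...
print(verify_watermark(marked))      # False: ...and the label silently vanishes
```

The last two lines are the point: a tool that skips the embedding step, or a single re-encode of the file, leaves an unlabeled image that the "trust the label" model would treat as real.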
19:41
want to go back a little bit and talk about more in-depth
19:43
some of the legislation that's being
19:45
proposed. You know, some of it's being proposed
19:48
at the state level it's being proposed at
19:50
a federal level I think
19:52
the most well-known one is probably the No
19:54
Fakes Act. Can you walk us through some
19:56
of this stuff? And is
19:58
it good? Is it bad? I
20:01
think it's, I think that it is coming
20:03
from the right place with legislators. So
20:05
the No Fakes Act, I think is a good one to
20:08
walk through. There's
20:11
also the No AI Fraud Act, which
20:13
is maybe the next evolution of No
20:15
Fakes, or at least
20:17
proposed more recently. And
20:19
some of the terms in there are a little bit more reasonable.
20:22
But this approach that both
20:24
of those acts in particular are taking right
20:26
now is the idea that
20:28
they should create a new federal
20:31
intellectual property right to
20:34
everyone's likeness. So
20:36
that's to your face, to your
20:38
voice, your
20:41
mannerisms, essentially a right to your body.
20:45
And its representation in the
20:47
digital world. And
20:50
the idea behind doing that is that if you have a
20:53
right to that, and it's your intellectual
20:55
property, then you can sue people who
20:57
use it without your permission. But
21:01
the problem in creating that new right, and
21:03
with this legislation, the first problem that I
21:05
have with it is that it's
21:08
establishing a new transferable intellectual
21:10
property right. And
21:14
I believe that it's incredibly important for
21:16
us to really question whether
21:18
we want to let people sell
21:20
the rights to this representation of
21:22
their physical body in
21:24
the digital realm and video and
21:26
music, what have you. And
21:29
it's funny, because it felt like some
21:32
sort of Swift deepfake Groundhog Day when
21:34
the deep fakes were of Taylor Swift,
21:37
because I had just been walking
21:39
through the impact of the No AI
21:44
Fraud Act in particular, in the
21:46
context of Taylor Swift's own experience
21:48
as an artist, where
21:50
early on, she signed away
21:52
the intellectual property rights
21:55
to her master recordings, which
21:57
were ultimately acquired by the guy Scooter Braun,
21:59
who's, you know, infamous for
22:01
reasons that you can Google. And
22:04
to think about it: if at that
22:06
time there had also been a right
22:08
to Taylor's face, or a right to
22:10
the sound of her voice...
22:12
The incentive of these major content companies,
22:15
the rich men of Hollywood, to get as
22:17
many rights as they can so that they can
22:19
exploit them maximally, would have probably meant
22:22
that contract that she signed at that time
22:24
would have included her face and her voice,
22:26
and Scooter Braun could essentially own those
22:28
too. And
22:31
that is incredibly concerning, and
22:33
to me is the fundamental
22:35
flaw in the No
22:37
AI Fraud Act,
22:40
because what it does is
22:42
it sets up all these artists that it's
22:44
trying to protect to
22:46
be exploited without their consent
22:48
by AI, with that
22:50
AI being wielded
22:52
by the person who has acquired
22:54
the rights to their body.
22:57
Also, wouldn't that mean, like, you would
22:59
have to register yourself in a government
23:01
database of some kind before
23:03
you were able to pursue a lawsuit or
23:05
a criminal case against someone
23:07
making deep fakes of you?
23:10
You know, I don't think so. I think
23:12
that because it's an intellectual property
23:14
right, it actually is
23:16
something that you have automatically. But
23:19
you are correct that the
23:21
right is the right to sue.
23:24
It's not the right to have
23:27
the AI fakes of you
23:29
removed from the internet. It's
23:31
not a takedown. It's
23:33
the right to get into
23:35
a legal battle with various
23:37
entities, some of them
23:39
better resourced than others, and to endure
23:44
a protracted legal
23:48
process, which is a lot to
23:50
seriously demand of a human being.
23:53
How do you think we should legislate or
23:55
handle this? I don't
23:57
have all the answers here. For sure it's
24:00
incredibly complicated, but one
24:02
of the places to start is with
24:04
the existing laws that a bunch of
24:06
these affected people that we're
24:08
already talking about are already suing under.
24:11
Most states have a right
24:13
of publicity law that allows them to
24:15
sue, or that allows you know, celebrities
24:18
or public figures to sue if people
24:20
misuse their images in some sort of
24:23
manner that is commercial or could be construed
24:25
as commercial. And then at the same time
24:27
we have defamation law
24:29
which can often be a
24:32
way for average people
24:34
to sue those who
24:36
humiliate them and, well,
24:39
it protects celebrities less. So between
24:41
those two, depending on where you
24:43
live and in the
24:45
majority of states, there's some good stuff there.
24:47
There's some good mechanisms, at least in terms
24:49
of if you want to get a lawyer
24:51
and if you want to sue the person
24:54
who's causing you misery. But
24:57
still that doesn't actually give victims
25:02
of these deep fakes a
25:04
real-time way to say, hey, wait,
25:06
get this disgusting part of me off
25:09
the internet, this is humiliating me and
25:11
it's running everywhere. And
25:14
while it pained me, because I
25:16
know how extensively something like the
25:19
Digital Millennium Copyright Act has been
25:21
abused, their sort of
25:23
notice-and-takedown systems. I'm
25:25
talking to a lot of people and none of us really
25:27
see another way
25:29
to make something that
25:33
is actually responsive to the harms that people are
25:35
going to be experiencing. But what we can get
25:37
right this time, and I think with
25:39
Donald Trump right now claiming that actual
25:41
photos of him are deep fakes, is
25:43
that unlike the DMCA, we can build
25:45
a system where there are actual consequences
25:47
if you abuse it. If you
25:50
say that an embarrassing video, a
25:52
video of police misconduct or what have you,
25:54
is a deep fake and it actually isn't.
26:00
Oh, that's interesting. I
26:03
didn't even think about the possibility
26:05
of a public person getting
26:07
into trouble for lying about a real image being a
26:09
deep fake. Oh, yeah, that's coming.
26:13
Has anyone proposed legislation, or
26:15
is this just, like, conversations? This
26:18
is conversations, because
26:20
I think that a lot
26:23
of the people who are looking at
26:25
no fakes or no AI fraud or
26:27
what have you, and saying that
26:30
these are terrible laws with
26:32
unintended extremely harmful consequences are also really
26:34
feeling for the reality that people are
26:36
going to face with these technologies and
26:38
that and knowing that we need to
26:40
do something. Happily,
26:43
we're having much more of a proactive
26:45
conversation here amongst ourselves about what we
26:47
do want than I think we
26:50
maybe had with previous online revolutions.
26:53
That was one of the things that struck me
26:55
initially when I wrote about the No Fakes Act,
26:57
and then there was another law in Tennessee that
26:59
I think was proposed the same day, which
27:01
is that, you know, and this is like what I wrote
27:04
about in the article I wrote a couple of weeks
27:06
ago, which is like, it just, it seems like
27:08
the gist of these laws is sort of intended
27:11
to protect celebrities, and
27:13
maybe the rest of us kind
27:15
of, sort of. And
27:18
that's kind of like where, where I
27:20
came at this from, which is, like, the fact that
27:22
like, you know, a lot of people have been
27:24
talking about this as a concern for a while. But
27:27
now that Taylor Swift is mad. Now
27:29
that, like, The Weeknd and Drake are
27:32
mad about this, and it's not just
27:34
you know, it's not
27:36
just like photos. It's also
27:38
like music and, you know, voice rights in
27:40
music, and then all other kinds of
27:42
stuff that could be considered you know, intellectual
27:45
property or, you know, personalized
27:47
sort of like intimate representations
27:49
of someone's person. It
27:52
seems to me like that's
27:54
where this always starts and ends
27:56
when we get quote
27:58
unquote privacy protection, or
28:00
something that is supposed to, at least in
28:02
theory, be protecting privacy,
28:05
is that it generally winds up
28:07
protecting famous and rich
28:09
people and doesn't do a whole
28:11
lot for regular people
28:13
who are facing abuse and harassment
28:15
and you know sexual violence
28:17
and You know the
28:20
copyright system, as you were just mentioning,
28:22
you're saying, like, oh, you
28:24
know, we keep looking at this, like, we
28:26
don't see any other way of enforcing it.
28:28
Like, what would be different about this?
28:31
Compared to like you know the fact
28:33
that you know when people when
28:36
artists, for example, have people profiting
28:38
from their art that
28:40
they release, like music or otherwise, you
28:43
know, there is an avenue for redress, but it's not
28:45
very accessible unless you have a lot of money to
28:47
litigate it. So what's different here when it
28:49
comes to this stuff, if in the past this has
28:51
been kind of the status quo? That's
28:54
a great question Because we are
28:56
really swimming upstream against the
28:59
headline grabbers here if
29:01
something horrible is being done to Taylor Swift and
29:04
by God, Josh Hawley can trot out there with a
29:06
bill and wave it in the air, and
29:08
it's gonna get covered You know
29:10
all over creation. The motive
29:12
there is really clear
29:15
and straightforward: politicians like laws that
29:17
grab headlines, they like flashy partnerships
29:19
with celebrities. And I
29:22
would also say that the
29:24
lobby of those IP
29:26
rights holders the major labels and
29:29
Publishers and content companies and what have you
29:31
is extremely powerful and really well organized and
29:33
from the moment that this all blew up
29:36
They've been in legislative offices, you know gunning
29:38
for a bill that's gonna benefit, you know
29:40
the Universal Music Groups of
29:43
the world. And yeah, that's why, for
29:47
myself, I turned towards, and I think that there
29:49
are also legislators who are thinking
29:51
in this way, better
29:54
tools for
29:57
everyday people. Because I
29:59
don't think that there's going to
30:01
be an effective way to censor it, or,
30:06
well, we can't put the rabbit back
30:08
in the hat with AI. What
30:10
we need is proactive tools to address
30:12
it in a way that is minimally
30:15
invasive when it comes to surveillance
30:17
and censoring speech and slapping upload
30:19
filters across the whole internet isn't
30:21
the right thing either. I would
30:23
look at something like, well, we've
30:25
got Google reverse image search and
30:27
we know that that works pretty
30:29
good. And we've got the DMCA
30:31
and that, there's
30:35
an established protocol for that. And we've
30:37
learned a lot
30:40
since that legislation went in.
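The matching step behind the reverse-image-search-and-takedown idea Leah sketches, catching re-uploads of a photo someone has reported, is often done with perceptual hashes. The sketch below is a minimal illustration using a simple "average hash"; real platform systems use much more robust schemes (PhotoDNA- or PDQ-style hashes, or learned embeddings), and every name and threshold here is made up for the example.

```python
# Toy perceptual hash ("average hash") for matching re-uploads of a
# reported image: 8x8 grayscale in, 64-bit hash out. Nearby hashes
# (small Hamming distance) mean visually similar images. Illustrative
# only; not any platform's actual matching system.

def average_hash(gray: list[list[int]]) -> int:
    """Hash an 8x8 grayscale image: one bit per pixel, above/below the mean."""
    flat = [p for row in gray for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_reupload(reported_hash: int, candidate: list[list[int]], thresh: int = 8) -> bool:
    """Flag a candidate upload whose hash is within `thresh` bits of a reported image."""
    return hamming(reported_hash, average_hash(candidate)) <= thresh

# A reported image, a slightly brightened copy, and an unrelated checkerboard.
reported = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
tweaked = [[min(255, p + 10) for p in row] for row in reported]
unrelated = [[255 if (r + c) % 2 else 0 for c in range(8)] for r in range(8)]

h = average_hash(reported)
print(is_reupload(h, tweaked))    # True: small edits survive the hash
print(is_reupload(h, unrelated))  # False: a different image doesn't match
```

The appeal of this family of techniques for a takedown system is that the platform only needs a hash of the reported image, not the image itself, and minor crops, re-encodes, or brightness tweaks still match, which is exactly the "please grab it off the internet" behavior being described.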
30:42
And so can we slap something together
30:45
that doesn't reinvent the wheel and
30:47
just gives people the right to say, hey, this is
30:50
a horrific photo, a
30:52
horrific fake photo of me. Please
30:55
grab it off the internet and
30:57
I can make that request in a
30:59
way that the platforms have to
31:01
be accountable to. And that's the other thing, it's
31:03
really hard to be heard as an individual
31:05
when platforms are dealing with so many
31:07
users. About that
31:09
point on platforms being accountable, this is kind
31:11
of like something I say a lot in
31:13
when I'm talking about this topic is that
31:15
we're kind of addressing a symptom of
31:18
a problem here and not the actual problem.
31:20
And the problem is that we have all
31:23
these giant tech companies that
31:25
are producing this technology and they basically
31:27
have no regulation and they're kind of
31:29
just doing whatever they want. And
31:33
that's, there's even like these
31:35
like lobbying groups like Elon Musk has
31:37
like this like AI Institute that's like essentially
31:39
saying, you must let us develop AI and
31:41
if you don't, then you're killing people, people
31:44
will die. And so, that's kind of
31:47
like what I always frame this around is
31:51
like we're dealing with a symptom and
31:53
not the like actual problem, which is that
31:56
like, and even the
31:58
way in which these tech companies address this
32:01
problem sometimes is very much like band-aid
32:03
oriented. Like I was writing an article
32:05
a couple of weeks ago about the
32:09
filtering system on some of these
32:11
things like on OpenAI, like
32:13
has been constantly needing to patch
32:16
ChatGPT and like
32:18
DALL-E and all these like image generators
32:20
because people keep finding ways
32:23
to like get
32:25
past the like content filter
32:27
system that like prevents them
32:29
from generating certain types of
32:32
images through all these kind of tricky ways
32:34
and it's just this cat and mouse game.
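That cat-and-mouse dynamic can be sketched in a few lines. This is a toy illustration, not OpenAI's actual filter (whose internals aren't public); `BLOCKLIST` and both prompts are made up for the example:

```python
# Toy sketch of keyword-based prompt filtering and why it invites a
# cat-and-mouse game: each patch blocks one exact phrasing, and a
# trivial respelling slips past the check.

BLOCKLIST = {"blocked phrase"}  # hypothetical banned terms

def should_refuse(prompt: str) -> bool:
    """Refuse if any banned term appears verbatim in the prompt."""
    lowered = prompt.lower()
    return any(term in lowered for term in BLOCKLIST)

print(should_refuse("please draw the blocked phrase"))  # True: caught
print(should_refuse("please draw the bl0cked phr4se"))  # False: bypassed
```

Real systems use classifiers rather than literal string matching, but the structural problem is the same: each fix only covers the phrasings the defenders have already seen.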
32:37
And you know, when it
32:39
comes to some of these, I was reading this paper
32:41
and when it comes to some of these systems what
32:43
they're actually doing is that they're just, they're
32:46
still generating the content and then they're just not
32:48
showing it. So it's not even that
32:50
they're like preventing the content
32:52
from being created in the first place, they're just filtering
32:55
it out. Like they're just saying, you
32:57
know, the equivalent of like, I
32:59
can't do that, Dave, is
33:01
what the
33:03
sort of mitigation policy is for a lot of
33:05
these things. But really it's off
33:07
camera sketching that image you asked for and
33:09
just storing it in a digital warehouse somewhere
33:12
and all the forbidden images you'll never
33:14
be able to see. Yeah,
33:16
but it's like, even if nobody sees that
33:19
image that's indicative of a larger problem, which
33:21
is that like, we don't actually know how
33:24
to stop that from happening because the
33:26
training has already occurred.
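The generate-then-hide pattern described here can be sketched minimally. This is a hedged illustration only; `fake_generate` and `looks_unsafe` are made-up stand-ins, not any real model or classifier API:

```python
# Sketch of output-side filtering: content is produced first, then a
# post-hoc check decides whether the user ever sees it. The generation
# itself is never prevented.
from typing import Optional

def fake_generate(prompt: str) -> str:
    return f"<image for: {prompt}>"  # created regardless of content

def looks_unsafe(output: str) -> bool:
    return "unsafe" in output        # placeholder safety classifier

def generate_then_filter(prompt: str) -> Optional[str]:
    output = fake_generate(prompt)   # step 1: generate anyway
    if looks_unsafe(output):
        return None                  # step 2: hide, don't show
    return output

print(generate_then_filter("a cat"))         # shown to the user
print(generate_then_filter("unsafe thing"))  # hidden (None)
```

The point of the sketch is that the filter sits after generation: the system's capability is unchanged, only its visible surface is.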
33:28
These systems are already built on
33:31
top of billions of images that
33:33
were taken without permission. And
33:35
you know, some of them, we wrote another story about
33:37
this a couple of weeks ago, LAION,
33:40
which is probably one of the
33:43
most commonly used image databases used in
33:45
generative AI. It
33:47
contains billions of images that are taken from
33:49
web scraping. And it was found that I
33:51
think about 3000 instances of CSAM, of
33:56
child exploitation were found in
33:58
this massive database and
34:00
it's like you know that's what
34:04
we're dealing with here it's like the
34:07
sort of like base problem has already
34:09
occurred where we're just getting the results of
34:11
it now and you can filter the results
34:13
but that doesn't ultimately solve
34:16
where this all came from. All
34:20
right so I'm just going to pause there for a break
34:22
we'll be right back after this. Are
34:28
you ready to enhance your future in tech?
34:30
Then it's time to make your move to
34:33
the UK the nation that
34:35
has more tech unicorns than France
34:37
Germany and Sweden combined the
34:40
nation that was third in the world
34:42
to have a one trillion dollar tech
34:44
sector valuation, the nation
34:46
where great talent comes together
34:49
visit gov dot uk/great talent to
34:51
see how you can work live
34:54
and move to the UK. We're
35:00
driven by the search for better but when it comes
35:02
to hiring the best way to search for a candidate
35:04
isn't a search at all. Don't search
35:07
match with Indeed. If
35:09
you need to hire you need Indeed. Indeed
35:12
is your matching and hiring platform with over
35:15
350 million global monthly visitors according to
35:17
Indeed data and a matching engine that
35:19
helps you find quality candidates fast.
35:22
Ditch the busy work. Use Indeed
35:25
for scheduling, screening and messaging so
35:27
you can connect with candidates faster
35:29
and Indeed doesn't just help you
35:31
hire faster. 93% of
35:33
employers agree that Indeed delivers the highest
35:35
quality matches compared to other job sites
35:38
according to a recent Indeed survey. When
35:40
I was hiring back then, it was so
35:42
slow and overwhelming. I wish I had used
35:44
Indeed. Leveraging over 140 million
35:47
qualifications and preferences every day, Indeed's matching
35:49
engine is constantly learning from your preferences.
35:51
So the more you use Indeed the
35:53
better it gets. Join more than 3.5
35:56
million businesses worldwide that use Indeed to
35:58
hire great talent. And
36:01
listeners of this show will get a $75 sponsored
36:04
job credit to get your jobs and more
36:06
visibility at indeed.com/cyber.
36:09
just go to indeed.com/cyber right now and
36:12
support our show by saying you
36:14
heard about indeed on this podcast Indeed
36:16
dot com slash cyber, terms and conditions apply.
36:19
Need to hire? You need Indeed.
36:21
Hey there cyber listeners, Matthew here. This
36:23
episode was brought to you by delete me
36:27
A few years ago, I did some reporting that made
36:29
quite a few people pretty angry I got
36:31
a few death threats and I started to worry about
36:34
how much my private information was available in public so
36:36
we've got a safety team here at Vice and that
36:39
helps journalists navigate these kinds of situations and The
36:42
single best thing they did was sign me up
36:44
for delete me So, you
36:46
know, how would you like to wake up one day and
36:48
discover your bank account has been emptied or
36:50
get overdue notices for credit cards you never applied
36:52
for or worse You
36:54
don't realize how much of your information is
36:57
available to scammers on the internet and how
36:59
susceptible you are and your family is
37:01
to identity theft and fraud Now
37:04
you can get your data removed with delete
37:06
me These are just two of
37:08
the many reasons that I personally recommend delete
37:10
me delete me is a subscription service that
37:12
removes your personal info from the largest people
37:15
search databases on the web and
37:18
in the process Helps prevent potential ID theft
37:20
doxxing and phishing scams and this is not
37:22
just a one-time thing either It is always
37:24
working for you. I've been
37:26
using it for a few months now
37:29
and I'm constantly getting updates about
37:31
where my information is popping up
37:34
how to remove it and One of the
37:36
best parts a lot of the stuff just gets
37:39
automatically removed for me. To put it
37:41
simply, Delete Me does all the hard work of wiping
37:43
you and your family's personal info off
37:45
of the web data brokers hate delete me
37:47
when you sign up for delete me It
37:49
immediately goes to work scrubbing all your
37:52
personal information from data broker platforms. Your personal
38:02
profile is no longer there to sell. So
38:05
take control of your data and keep your
38:07
private life private by signing up for Delete Me. Now
38:10
at a special discount for our listeners.
38:13
Today get 20% off
38:15
your Delete Me plan
38:17
when you go to
38:19
joindeleteme.com/cyber and use promo
38:21
code cyber at checkout.
38:23
The only way to
38:25
get 20% off is
38:27
to go to joindeleteme.com/cyber
38:29
and enter code cyber
38:31
at checkout. That's joindeleteme.com/cyber
38:33
code cyber. AI
38:36
might be the most important new computer
38:38
technology ever. It's storming every industry and
38:40
literally billions of dollars are being invested.
38:43
So buckle up. The problem is
38:45
that AI needs a lot of speed and processing
38:47
power so how do you compete without costs spiraling
38:49
out of control? It's time to
38:51
upgrade to the next generation of the cloud, Oracle
38:54
Cloud Infrastructure or OCI. OCI
38:57
is a single platform for your
38:59
infrastructure, database, application development and AI
39:02
needs. OCI has four
39:04
to eight times the bandwidth of other
39:06
clouds, offers one consistent price instead of
39:08
variable regional pricing and of course nobody
39:11
does data better than Oracle. So
39:13
now you can train your AI models at twice the
39:15
speed and less than half the cost of other clouds.
39:18
If you want to do more and spend less like Uber,
39:21
8x8 and Databricks
39:23
Mosaic, take a
39:25
free test drive
39:27
of OCI at
39:29
oracle.com/cyber. That's oracle.com/cyber.
39:31
oracle.com/cyber. Alright
39:35
cyber listeners, welcome back. We are
39:37
back on talking about legislation and
39:39
artificial intelligence. On
39:42
CSAM, I noted that several of the
39:45
local bills were
39:48
pretty narrow. I think
39:50
Ohio specifically is
39:53
just about minors and
39:56
I don't know why they like zeroed
39:58
in on that specific thing and didn't
40:00
do something for adults, but the
40:02
proposed like Ohio draft legislation would narrowly
40:05
target sexual
40:08
like basically deep fakes made of people under
40:10
the age of 18. I thought
40:13
that was an interesting carve
40:15
out, no protection for anyone above the age
40:18
of 18. Legislators do
40:20
often find it easier to
40:22
pass bills that are about
40:24
saving the children. That's a
40:26
common technique that we've experienced
40:28
across the range of the issue
40:30
areas that we work on. At
40:34
the same time, I haven't reviewed
40:36
that bill specifically and what
40:39
its enforcement mechanisms are, but
40:41
I do think that narrowly
40:43
targeted legislation that is
40:46
focusing on specific harms is not a
40:48
terrible direction to go generally.
40:50
You just have to make sure that you
40:52
aren't burning down the whole internet
40:55
or important resources from our religious communities
40:57
or our online speech
40:59
or what have you in pursuit
41:03
of those protections. I
41:08
feel like we're going to. I
41:10
feel like it may not be Taylor
41:13
Swift, but something is going to happen
41:17
to somebody that has power that they are
41:19
not going to like. Then
41:22
there will be a law that oversteps.
41:25
It feels like what is going to happen
41:27
in the next few years. Am
41:30
I completely out of bounds? Oh,
41:33
I've been waiting for that ever since this
41:35
whole thing blew up. It
41:37
started up over a year ago now. Absolutely.
41:41
We're going to see something massively
41:45
impactful to somebody. I mean, who's
41:47
more powerful than Taylor Swift? I
41:49
don't really know. But happening to
41:51
a legislator or with
41:55
a compelling impact on democracy or
41:57
what have you. That may force us
41:59
to confront faster than
42:01
we're ready what we need
42:03
to do to mitigate
42:06
these harms. And
42:08
I think that these early
42:11
legislative drafts and the criticisms of them and
42:13
what have you are
42:17
getting civil society
42:21
and forcing legislators into conversations where
42:23
we are talking about how
42:26
not to throw the baby out with the
42:28
bathwater and that part of
42:30
the reason that we are where
42:32
we are today with the mass
42:35
image scraping and the biometric
42:37
databases and just being able
42:39
to look people up on some
42:41
data broker website. And
42:45
I was thinking of a case
42:47
where a family was almost tricked
42:51
into sending bail money for their
42:53
son after somebody got
42:55
all the phone numbers for the
42:57
family and impersonated the son's voice
43:00
and this, that and the other thing. A lot of
43:03
the vulnerabilities we're facing here is just because we don't
43:05
have data privacy in the
43:07
US in any sort of substantial way. We
43:10
really haven't grappled with one of
43:12
the worst and most extreme harms
43:15
of the era of
43:17
the internet that some say that we're
43:19
leaving. And
43:21
I would say too, that coming
43:23
into the election, thinking about Cambridge
43:25
Analytica, but
43:28
with this technology and the ability to
43:30
analyze and iterate at scale is extremely
43:33
concerning. And again, that
43:35
goes back to privacy and
43:38
that we don't have it and then that data
43:40
is just laying there to be exploited. Yeah,
43:44
that's a good point that we've kind of, we've moved on
43:46
to like a new phase of the internet having
43:49
not solved any of the problems of
43:51
the previous phase of the internet. And
43:54
now we have, and I always, I keep
43:56
saying that like the sort of deep fake phenomenon
43:58
is gonna be like the... death of
44:00
not the literal death, but it's
44:02
going to be the end of like boomers
44:05
like falling for like Taylor Swift
44:07
videos like saying here click here
44:09
for a free
44:12
cookware set which actually happened. That
44:15
was another story we did a couple weeks ago
44:17
about somebody deep faking
44:19
Taylor Swift's voice and like getting
44:21
people to like click on a
44:23
link for you know cookware set
44:25
that wasn't being offered and didn't
44:27
exist and it was like you
44:29
know it's so trivially easy. That's
44:31
the stuff like the sort of industrial scale
44:33
that comes into play when which
44:36
is all built off of the fact that we didn't
44:38
really address a ton of these
44:40
issues before they became like able to be
44:42
automated. Oh I would say that
44:44
it's the death knell for Millennials
44:47
too. Oh yeah.
44:50
I think the Millennials
44:52
are the ones being targeted by Taylor Swift. The
44:54
boomers are all getting phone calls. That's
44:57
true. So this is just like
44:59
this is the advanced robocall for the people
45:02
that grew up online. And I wonder, I
45:07
wonder how much of all of this will have to burn down
45:10
to avoid getting hurt, and that's
45:12
one of my big concerns. But I
45:15
also don't know, I don't know what you
45:17
do about it. I just, I
45:19
simply don't. Well
45:21
I would say that there are a fair amount
45:23
of parallels. I remember back
45:26
in high school whether or not you're allowed to
45:28
cite Wikipedia and whether you can trust the Internet
45:30
whether you can trust what you read on the
45:32
Internet, and that sort
45:34
of neat media literacy and the
45:37
the existential dread of a website
45:39
that you can change instead of a newspaper that you
45:41
can't was really present
45:43
in that time. And this
45:47
to an extent feels like
45:49
that although we're talking about
45:51
much more convincing mediums to
45:53
our little animal brains that
45:55
have not evolved for any
45:58
of this. And
46:01
so while I'm not sitting
46:04
in the most optimistic place here, I
46:06
also don't feel that it's completely foreign
46:08
territory and to me at least is
46:10
encouraging. It's a
46:12
fair point. We've been through big,
46:15
frightening changes about the
46:17
way the information sphere works before and though
46:19
to Janice's point we may have not solved
46:22
some of the previous issues we are surviving
46:24
and still using the internet. The
46:29
ways we use the internet I think will definitely change. And
46:32
that's my transition into
46:34
talking about the
46:37
back half of the conversation today. Just
46:40
kind of how we started talking about
46:42
this offline Janice, which is that the
46:44
AI thing is interesting because
46:47
it's in a boom cycle right
46:49
now that I think
46:51
looks very different to you and I because
46:53
we're journalists and our
46:55
inboxes are very strange. It
46:58
looks a lot on the surface like
47:01
what Web3 and crypto looked like a
47:03
year ago, right? Could
47:06
you tell me about your
47:09
thoughts on how AI
47:11
has become the new Web3? Oh,
47:14
for sure. I mean, I guess the
47:16
best way of phrasing this is like
47:19
I invite, well, okay, I don't invite
47:22
other people to look in my inbox, but like
47:25
if I were to give you a
47:27
sampling of my email inbox on
47:29
a weekly basis, you would
47:31
see countless pitches, some
47:33
of the worst pitches that I've ever seen in
47:36
my life from all these companies that didn't exist
47:38
six months ago. And
47:40
oftentimes, and I've been meaning to do a
47:42
more sort of like in-depth
47:44
investigation about this and maybe even write about
47:46
it. A lot of times I
47:48
will look up these companies that didn't exist six months
47:51
ago and I will find that their founders are
47:53
the same people who did crypto
47:55
startups and Web3 startups like a
47:57
year ago. And, you know,
47:59
the… this kind of driving home for me,
48:01
the fact that we are
48:03
in this kind of looping hype
48:06
cycle when it comes to tech and when it
48:08
comes to the sort of like
48:10
economics and like the phase
48:12
of capitalism that we're in where
48:14
there's really no new ideas. Everyone
48:17
is just kind of like tweaking the ideas
48:19
a little bit each time and trying to
48:21
sell their new grift. And
48:23
you know, AI is the new grift. There is
48:25
a difference I should mention. I
48:28
think that unlike NFTs, for
48:30
example, there is an actual use
48:32
case in a lot of cases for AI
48:35
to automate things. I think that like the
48:38
allure of automation is pretty undeniable.
48:40
And I think that especially
48:43
when it comes to companies
48:46
that are looking for ways to cut
48:49
costs and do like the sort of
48:51
siren call of that is just completely
48:54
unable to be resisted. Beyond
48:57
that, there are actual
48:59
use cases in which AI can be,
49:01
I think, used to
49:04
help people. However,
49:07
you know, it has
49:09
all these problems that we've been talking about. That's
49:11
now and in a lot of ways,
49:14
the sort of like cycle that
49:17
we've gotten into where it seems
49:19
like it's becoming faster. Like Mark Zuckerberg was
49:21
pitching everybody on the metaverse like a year
49:24
ago. Was it
49:26
that recently? And he's
49:28
already divesting from that? And it's, you
49:31
know, it's kind of like a weird
49:33
death spiral type vibe right now in
49:35
tech. Yeah, he's
49:37
got to shed what, 22 percent of his workforce to
49:39
get returns back
49:41
up on the company, which
49:43
worked, you know, short
49:46
term actions produce lovely short term results.
49:48
Yeah, everybody's in short term right now.
49:51
That's the thing. Everybody is like it's just
49:53
what's the latest buzzword soup that
49:55
we can sell people on? I
49:57
think everyone's in short term most of the time. It's
50:00
kind of the way the business operates. It's
50:03
what it feels like to me anyway. The
50:05
way business operates. Yeah, I recall
50:07
the moment when I was reading
50:09
the "crypto VCs are
50:11
moving to AI" article in Business
50:16
Insider or who knows what it was. And
50:20
my first reaction was just, Oh,
50:22
here, here we go. Here we go. And
50:25
the new AI
50:27
thought leadership, human
50:30
rights, whatever
50:33
the Human Artistry Campaign, but it's
50:35
the RIAA and these
50:37
sorts of like
50:40
new organizations that
50:42
are being set up often by like
50:44
VC capital or established interests or
50:46
what have you, to sort of astroturf
50:48
on the work that the
50:51
activists have been doing for years.
50:54
It was from our perspective,
50:57
it was really profound over
50:59
the past year to see
51:02
how erased
51:04
so much of the like decades of
51:06
work of folks with the Algorithmic Justice
51:08
League or like our work
51:10
on facial recognition was
51:13
from the larger conversation.
51:15
And the gut instinct there
51:17
was 110% to build something
51:19
new rather than
51:21
to go to the analysis and
51:24
advocacy that had been happening for years
51:27
upon upon years and
51:29
this faith and actually a lot of
51:31
good friends and good starting
51:33
points for ways to think
51:36
about this and a roadmap
51:38
for it. It would actually help AI
51:40
be that force for good instead of
51:42
just another tool to surveil you.
51:46
It's kind of Silicon Valley's attitude all over, right?
51:49
It's trapped in this eternal present moment
51:52
with no sense of the
51:55
history that it's attempting to disrupt,
51:57
quote unquote, right? It
51:59
doesn't understand that a lot of
52:02
the products and services it's trying to
52:04
offer are already being offered in a
52:07
way that the marketplace deems efficient and
52:10
that you know if
52:12
you try to release XY or Z product you're just
52:14
going to end up where the market
52:16
is now but in 30 years
52:18
time and after hurting a lot of
52:20
people and maybe not making nearly as much money as
52:22
you thought you would. And then
52:25
they are, it's like, it's capitalism
52:27
again. They're devised to
52:29
do that, that is how
52:31
they are supposed to operate, legally, out
52:34
of obligation to their shareholders, which
52:36
hurts, you know. I just
52:39
saw a good example of what we're
52:42
talking about. I just saw a screenshot of
52:45
a tweet of someone complaining because
52:47
Revel the like startup that has I don't
52:49
know if they have this where you are
52:52
Leah but it's like the sort of like
52:54
mopeds that you can rent with an app
52:57
and drive around Revel is apparently
52:59
closing in New York City so they're
53:01
not gonna, they're done and
53:04
somebody was complaining about this thing great
53:06
now that Revel isn't around there's
53:09
no way to get between North
53:11
and South Brooklyn without paying $65
53:13
and someone was just like
53:17
well how about these alternatives
53:20
that you can definitely get you between North
53:22
and South Brooklyn for less than $65
53:24
and it's like the B34 bus and the G train
53:30
so it's like you know that that's just kind of
53:33
like you know reinventing reinventing
53:35
the wheel and calling it a tech startup
53:37
is kind of like the thing also
53:40
I was looking through my
53:42
deleted email inbox just
53:44
to give some examples of like the type
53:47
of garbage that I'm getting one
53:49
of these things says AI girlfriends
53:51
are on the rise could they
53:53
make men behave better? My thought
53:58
about that: absolutely not. But they
54:00
will... It's like the general sense
54:02
of this is like, have you
54:05
done any thought about what this
54:07
problem is and that you're trying to
54:09
address? Like, why does it exist? And
54:12
that goes back to the age old, the whole
54:14
technology will save us thing. Like, we don't have
54:16
to address human problems because we can just make
54:18
it apps, which is one of my long-term
54:20
gripes in any space. Yeah.
54:23
It's just so apparent that even
54:25
a lot of the stuff that we're talking about with AI and
54:28
what have you, like at the core of
54:30
that is like economic anxiety, is lack of
54:33
education and social ills. And we don't want
54:35
to touch that. So
54:37
your inbox looks full of
54:39
AI girlfriends, Janice. Oh, yeah, for
54:41
sure. I think it's really responding
54:44
to the fact that, yeah, that people
54:47
are very anxious about the
54:50
economy for good reasons. And it's because
54:52
there's kind of like the money is
54:54
kind of gone, right? It's like people
54:56
are losing jobs, entire
54:59
industries are kind of like
55:01
dying out. It's
55:03
very tempting to basically say,
55:06
hey, we found a way to fix this
55:08
complicated social problem with
55:10
an app. And I kind of almost
55:12
understand why a lot of people are
55:14
able to accept that. It's sort of in the same
55:16
vein as the sort of mega
55:18
church preacher saying, I have
55:20
this thing, this one way.
55:23
It's kind of, you know, and that's like we've
55:25
talked about Silicon Valley as kind of like this
55:27
new kind of almost religious institution. And
55:29
it kind of resembles that in a lot
55:32
of ways. It's offering people very
55:34
simplistic solutions to complicated problems
55:36
that most people don't have
55:38
time to like fully think
55:40
about or analyze. People
55:44
are drawn to that because they still want to have hope. You
55:46
know, that's fair. That's understandable, even
55:49
if it's false. And
55:51
even if ultimately the reality is we've just got much,
55:53
much bigger work to do than AI girlfriends.
55:56
Yeah, AI girlfriends are going to distract us from the
55:58
work we have to do. I think
56:00
that's a lovely place to close
56:02
out the conversation, unless Janice
56:04
has something else. No,
56:08
I'm good. I'm going to go watch a bunch of copyright
56:10
lawyers talk
56:12
about all this next. Thank
56:15
you so much for coming on to cyber and walking us through
56:17
this. It's
56:19
been such a pleasure. It's really been
56:21
a good conversation. Yeah, we're happy to
56:23
come back again. Thank
56:32
you.