Episode Transcript
0:03
Search Engine is brought to you by NerdWallet.
0:06
NerdWallet lets you compare top travel credit
0:08
cards side by side to maximize your
0:10
spending. Some even offer 10x
0:13
points on your spending. So
0:15
what could future you do with better
0:17
rewards? A free flight? Room upgrades?
0:20
I personally would really like to fly
0:22
somewhere for free. That sounds fantastic. Future
0:24
me would really appreciate it. You
0:27
can compare and find smarter
0:29
credit cards, savings accounts, and
0:31
more today at nerdwallet.com. NerdWallet.
0:35
Finance smarter. Reminder, credit
0:37
is subject to lender approval and terms
0:39
apply. Not
1:03
all search engines are good. Some
1:05
are morally dubious. A
1:08
couple months ago, I found myself using a
1:10
morally dubious search engine. This
1:12
search engine lives on a website. I'm not going to tell you
1:15
the name of it for reasons that will become clear, but
1:17
I am going to describe to you how it works. So
1:22
you open the site, you upload a
1:24
photo of a person's face. I
1:27
can upload a photo of myself right now. Wait
1:30
about 30 seconds for the search to run,
1:32
and then I get taken to this page,
1:34
which includes all these different photos of my
1:36
face from all these different places on
1:38
the internet. I can click on
1:40
any of the photos and it'll take me to the site
1:43
where the photo lives. This
1:45
is Shazam for faces. If
1:48
I put it in a stranger's face, I'll almost
1:50
always get their real name because it'll take me
1:52
to one of their social media profiles. From
1:54
there, I can typically use a different search
1:56
engine to find a physical address, often a
1:58
phone number. And on the
2:01
site itself, I also usually see stills
2:03
from any videos the person has appeared in.
2:06
When I first learned about this site, I
2:09
did what you do when you get Google for the first time. I
2:11
looked myself up. And then I
2:14
started looking at my friends. And
2:16
it took about 30 seconds before I saw things
2:18
that made me pretty uncomfortable. I
2:21
was just seeing stuff I should not be seeing. I
2:23
don't know the most delicate way to say this, except
2:27
people I knew had compromising stuff on
2:30
the internet. Stuff
2:34
they had put there, but not under
2:36
their real names. And I don't
2:38
think they knew, I certainly hadn't known, that
2:40
the technology to search someone's face was right
2:43
around the corner. I
2:47
decided to stop using the search engine. The
2:50
line between general internet curiosity and
2:52
stalking, this felt like the wrong side of
2:55
it. It felt seedy. But now, even
2:58
just knowing this tool existed changed
3:00
how I thought in the real world. I
3:03
found myself trying to reach for it. The way any
3:05
digital tool that works begins to feel like another
3:07
limb. I found a driver's
3:09
license on the floor of a dance club. The
3:12
person had a name too common to Google,
3:14
like Jane Smith. But I realized, you
3:16
just find their face with the search engine. Another
3:19
night, two people at a restaurant were talking. One
3:22
of them, the guy, was telling what sounded like
3:24
a very personal story about the vice
3:26
president of America. Who
3:28
was this guy? I realized, if I
3:31
snapped a photo of him, I now
3:33
had the ability to know. We
3:35
take for granted the idea that we have a degree of
3:37
privacy in public, that we are mostly
3:40
anonymous to the strangers we pass. I
3:42
realized, this just wasn't true anymore.
3:48
Right now, there are a lot of discussions about AI
3:51
chatbots, about the ethics and problems
3:53
of a very powerful new technology.
3:57
I feel like we should also be talking about
3:59
this technology, these search engines, because
4:01
my feeling using one was we
4:04
are not at all ready for this. This
4:06
thing that is already here.
4:08
And I wanted to know, is it
4:10
too late? Is there a way to stop
4:12
these tools or limit them? And I
4:14
especially wanted to know who
4:17
unleashed this on us. So
4:19
I called the person you call when you have
4:21
questions like this. Can you introduce yourself? Sure.
4:23
I'm Kashmir Hill. I am a technology reporter
4:25
at the New York Times. And I've been
4:27
writing about privacy for more than a decade
4:30
now. Kashmir is one of
4:32
the best privacy and technology reporters in
4:34
America. She published a book a few
4:36
months ago about these search engines and
4:38
about the very strange story of how
4:40
she discovered that they even existed. It's
4:42
called, appropriately, Your Face
4:44
Belongs to Us. Her
4:47
reporting follows a company called Clearview AI, which
4:49
is not the search engine I was referencing
4:51
before. Clearview AI is actually much
4:53
more powerful and not available
4:55
to the public. But
4:57
in many ways, Clearview created the blueprint for
4:59
copycats like the one I'd found. Kashmir
5:02
told me the story of when she learned
5:04
of Clearview AI's existence back when the company
5:06
was still in deep stealth mode. So
5:09
I heard about Clearview AI. It was
5:11
November 2019. I was in Switzerland doing
5:13
this kind of fellowship
5:17
there. And I got
5:19
an email from a guy named
5:21
Freddy Martinez, who worked
5:23
for a nonprofit called Open the
5:25
Government. And he does public
5:27
records research. And he's obsessed
5:29
with privacy and security as I am. And I've known
5:32
him for years. And he sent me
5:34
this email saying, I found
5:36
out about this company that's crossed the
5:38
Rubicon on facial recognition. That's how he
5:40
put it. He said he'd gotten this
5:43
public records response from the Atlanta
5:45
Police Department describing this facial recognition
5:48
technology they were using. And
5:50
he said, it's not like anything I've seen before. They're
5:53
selling our Facebook photos to the cops.
5:56
And he had attached the PDF he got
5:58
from the Atlanta Police Department. It was 26 pages. And
6:02
when I opened it up, the
6:04
first page was labeled
6:06
privileged and confidential. And
6:09
it was this memo written by
6:11
Paul Clement, whose name I recognize, because he's
6:13
kind of a big deal lawyer, was solicitor
6:16
general under George Bush, now in private practice.
6:18
And he was talking about the
6:20
legal implications of Clearview AI. And
6:23
he's describing it as this company that
6:25
has scraped billions of photos from the
6:28
public web, including social media sites, in
6:30
order to produce a facial recognition app where you
6:32
take a photo of somebody and it returns all
6:35
the other places on the internet where
6:37
their photo appears. And he said,
6:39
we've used it at our firm. It returns
6:41
fast and reliable results. It works with something
6:43
like 99% accuracy. There's
6:46
hundreds of law enforcement agencies that are
6:48
already using it. And he
6:50
had written this memo to reassure any police
6:53
who wanted to use it that they wouldn't
6:55
be breaking federal or state privacy laws
6:57
by doing so. And
6:59
then there was a brochure for Clearview
7:01
that said, stop searching, start solving, and
7:04
that it was a Google for faces. And
7:06
as I'm reading it, I'm just like, wow,
7:10
how have I never heard of this company before? Why
7:13
is this company doing this and not Google
7:15
or Facebook? And does
7:18
this actually work? Is this real?
7:20
Because it is violating things that I
7:22
have been hearing from the tech industry for
7:24
years now about
7:27
what should be done about facial recognition technology.
7:30
I flashback to this workshop I've
7:32
gone to in DC organized by the Federal
7:34
Trade Commission, which is kind of our de
7:37
facto privacy regulator in the United States. And
7:39
they had a bunch of what we call stakeholders
7:42
there. Google was there. Facebook was there. Little
7:44
startups, privacy advocates, civil
7:47
society organizations, academics. Good
7:49
morning, and I want to welcome all of you, both
7:51
here in Washington, DC and those
7:54
watching online, to
7:56
today's workshop on facial recognition technology.
7:58
This workshop that Kashmir remembers, it
8:00
happened in two thousand eleven, is called
8:02
Face Facts. Or maybe FaceFacts. The
8:05
video of the workshop, up on the
8:07
FTC's website, shows a string of
8:09
speakers standing at a podium in front of
8:11
a limp-looking American flag. We will
8:14
focus on the commercial use, that is,
8:16
on the possibilities that these technologies open
8:18
up for consumers, as well as
8:21
their potential threats to privacy.
8:27
They're talking about the nitty-gritty of facial
8:29
recognition technology, what safeguards need to be put
8:31
in place around this technology that's
8:33
rapidly becoming more powerful. And everyone in the
8:35
room had different ideas about what we should
8:38
be doing. You know, Facebook at that
8:40
point was doing tagging of friends in photos,
8:42
and there's some people there saying we need
8:44
to have this banned. But there is one
8:46
thing that everybody in the room agreed on,
8:48
and that is that nobody should build a
8:50
facial recognition app that you could
8:52
use on strangers and identify them
8:55
on the spot. To avoid
8:57
the one use case anybody fears, which
9:00
has to do with exactly that. The man
9:02
you're hearing now, that's the CEO
9:04
of a facial recognition company that would soon
9:06
be acquired by Facebook, saying
9:08
they had prevented the use
9:10
case no one wanted: strangers' faces
9:13
for the input, and the
9:15
people that you want identified
9:18
as the output.
9:24
This is
9:29
one thing that we wanted to make
9:31
sure that doesn't happen. And so
9:33
now I'm looking at this memo
9:35
that says that has happened. Right.
9:37
And so, yeah, I was very shocked,
9:39
and I told Freddy I'm definitely gonna look
9:41
into this as soon as I'm back in the United
9:43
States. And that's what I did. So
9:48
at this point, it's November twenty nineteen.
9:50
Here's what Kashmir knows about Clearview
9:53
AI: it's supposedly a very
9:55
powerful technology that has scraped billions of
9:57
photos from the public web. And
10:00
it's being used by the Atlanta Police
10:02
Department. She doesn't know who's
10:04
behind the company, but she
10:06
has ideas about how to find them. She
10:08
starts calling their clients. And
10:13
so I reached out to the Atlanta Police Department,
10:15
they never responded. Other FOIA's were
10:17
starting to come in that showed other departments
10:19
using Clearview, and I just
10:21
did a kind of Google dorking
10:23
thing where I searched for Clearview
10:25
and then site.gov to see if
10:28
it showed up on budgets.
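A quick aside for the technically curious: "Google dorking" just means using search operators to narrow results. Here is a sketch of the kind of query Kashmir describes; the exact phrasing is my guess, not her actual searches:

```
"Clearview AI" site:.gov
"Clearview" site:.gov filetype:pdf
```

The site:.gov operator restricts hits to government domains, and filetype:pdf (a hypothetical refinement) would surface the kind of published budget documents where a vendor's name shows up.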
10:30
Oh, that's really smart. Yeah, and
10:32
so I started seeing Clearview, and it was really tiny
10:34
amounts, like $2,000, $6,000, but
10:37
it was appearing on budgets around the country. And so
10:39
I would reach out to those police departments and
10:41
say, hey, I'm looking to Clearview AI, I
10:43
saw that you're paying for it, would you talk to me? And
10:47
eventually, the first
10:49
people to call me back were
10:51
the Gainesville Police Department, a detective
10:53
there named Nick Ferrara. He's
10:56
a financial crimes detective, and he
10:58
calls me up on my phone. He said, oh, hey, I
11:00
heard that you're working on a story
11:02
about Clearview AI, I'd be happy to talk to
11:04
you about it. It's a great tool. It's
11:07
amazing. And he said he would be the
11:09
spokesperson for the company if they wanted. So he
11:11
told me, he's just like, this is great. He
11:13
loved it. He said he had a stack of unsolved
11:16
cases on his desk where he had a photo of
11:18
the person he was looking for, like a fraudster,
11:21
and he'd run it through the state facial recognition system,
11:23
not gotten anything. And he said he ran it
11:25
through Clearview AI, and he got hit after hit. And
11:28
he just said it was this really powerful tool.
11:30
It worked like no facial recognition he'd used before.
11:33
The person could be wearing a hat, glasses, looking
11:35
away from the camera, and he was still getting
11:38
these results. And this is sort of the positive
11:40
case for any of this, which is that if
11:43
a dangerous person who has committed violent
11:45
crimes is out in the world, and
11:47
there's some photo of them where maybe
11:49
they were like robbing a bank, and their mouth was
11:51
covered, and there's a hat low over their head, and
11:53
if a cop can take that surveillance
11:56
still, plug it into a big machine, and
11:58
find this person's name. we live in
12:00
a safer world. Right. This is the
12:02
ideal use case. Yeah. Solving crimes, finding
12:04
people who committed crimes, bringing them to
12:07
justice. Yeah. And so Nick Ferrara,
12:09
this detective, said, yeah, it works incredibly well. And I
12:11
said, well, I'd love to see what the results look
12:13
like. I've never kind of seen a search like this
12:15
before. And he said, well, I can't
12:17
send you something from one of my investigations, but
12:19
why don't you send me your photo, and I'll
12:21
run you through Clearview, and I'll send you the
12:24
results. So I do that. I send some photos
12:26
of myself. How do you pick the photos? I
12:28
tried to choose hard photos. So I had
12:31
one where my eyes were closed, one
12:33
where I was wearing a hat and sunglasses, and
12:36
another that was kind of like an easy photo, in case
12:38
those other two didn't work. And
12:40
then I waited to hear how it went, and
12:43
see for myself how well this software works. And
12:46
Nick Ferrara ghost me. He
12:49
just totally disappears. Disappears. Won't
12:52
pick up when I call him. Doesn't respond
12:54
to my email. Kashmir
12:56
says she tried this again with a
12:58
different police officer in a different department,
13:01
and the same thing happened. They were friendly at first.
13:04
Kashmir asked them to run a search on her
13:06
face. They agreed. And then they were
13:09
gone. And so eventually,
13:11
I kind of recruited a
13:14
detective in
13:16
Texas, a police detective, who
13:19
was kind of a friend of
13:21
a friend at the Times, and said, oh,
13:23
you're looking at this company. I'm happy to download the
13:26
tool, tell you what it's like. And
13:28
so he requests a trial of Clearview.
13:30
And at this point, Clearview was just
13:32
giving out free trials to any police
13:34
officer, as long as they had an
13:36
email address associated with the department. And
13:39
so- It's what Facebook did when they first
13:41
opened up with college campuses. Yeah, exactly. It
13:45
was exclusive, just for government workers. And
13:48
so he goes to their website, where
13:50
he can request a trial. Within 30
13:52
minutes, he's got Clearview on his phone. And
13:55
he starts testing it, running it on
13:57
some suspects whose identity he knows. And
13:59
it works. He tried it on himself,
14:01
and he kind of had purposely not put
14:03
a lot of photos of himself online because
14:06
he was worried about exposure and people coming
14:08
after him who he had been
14:10
involved in catching, sending to jail.
14:13
And it worked for him. It found this
14:15
photo of him on Twitter, where he was
14:17
in the background of someone else's photo, and
14:19
he had been on patrol, so it actually had
14:21
his name tag on it. So it would have been
14:24
a way to get from his face to his name.
14:27
And he immediately thought, wow, this is
14:29
so powerful for investigators, but it's going to
14:31
be a huge problem for undercover officers. If
14:34
they have any photos online, it's going to be a way to
14:36
figure out who they are. And
14:39
so I told him about my experience with other
14:41
officers running my photo, and he ran my
14:44
photo. And there weren't
14:46
any results, which was weird because I have a lot of
14:48
photos online. Like, it just came up like nothing.
14:52
Nothing. And then within
14:54
minutes, he gets a call from
14:56
an unknown
14:58
number, and when he picks up, the person says, this
15:00
is Marco with Clearview AI
15:03
tech support, and we have some questions
15:05
about a search that you just did. Oh, my God.
15:08
And he says, why are you running photos of this
15:10
lady from the New York Times? And
15:12
the detective kind of plays it cool. And
15:14
he's like, oh, I'm just testing out the
15:16
software. How would I know somebody in New
15:18
York? I'm in Texas. And anyways, his account
15:21
gets suspended. Oh, wow. And this
15:23
is how I realized that even though Clearview
15:25
is not talking to me, they have put
15:27
an alert on my face. And
15:29
every time an officer has run my
15:32
photo, they've gotten a call
15:34
from Clearview telling them not to talk to me. Just
15:41
to spell out what Kashmir believed was
15:43
going on here, these police officers may
15:45
have thought they were using a normal
15:47
search engine like Google. But
15:50
what they hadn't counted on was that someone
15:52
on the other end of that search engine
15:54
seemed to be watching their searches, surveilling
15:57
the cops who were using the surveillance
15:59
technology. It was
16:01
a moment where Kashmir saw clearly how this
16:03
mysterious company, by being the first to build
16:06
this tool no one else would, had granted
16:08
itself immense power to monitor
16:10
Kashmir, to monitor these cops. This
16:13
company, whose product would reduce the average
16:15
American's privacy, was keeping quite a
16:18
lot of privacy for itself. Of
16:20
course, Kashmir is fortunately for us a
16:22
nosy reporter, so all this cloak and
16:24
dagger behavior just made her more curious.
16:27
She tries to crack into the company a bunch of different
16:29
ways. She's reaching out to
16:31
anybody online who might have links to the company. She
16:34
finds an address listed on Clearview AI's
16:37
website. It's in Manhattan. But
16:39
when she goes there in person, there's
16:42
no such address. The building
16:44
itself does not exist. It's a real
16:46
Harry Potter moment. Finally,
16:48
she tries something that does work. On
16:51
the website Pitchbook, she can see two
16:53
of Clearview AI's investors. Peter
16:55
Thiel. No luck there. But
16:58
also an investment firm based in New York.
17:05
They're north of the city, and
17:07
they weren't responding to emails or phone calls. So
17:10
I got on the Metro North and went up
17:12
to their office to see if they had a
17:14
real office. And it was
17:16
kind of an adventure being there. The office was
17:18
empty. All their neighbors said they never came in.
17:21
I kind of hung out in the hallway for
17:23
about an hour. A FedEx guy came. He dropped
17:25
off the box. He says, oh, they're never here. And I
17:27
thought, oh, my gosh, this is a waste of a
17:29
trip. But then I'm walking out
17:32
of the building. And it was on
17:34
the second floor. And I'm coming down the stairs. And these two
17:36
guys walk in. And they
17:38
were wearing lavender and pink. And they
17:40
just looked moneyed. They
17:43
stood out. And I said, oh, are
17:45
you with Kirenaga Partners, which is the name of
17:47
this investment firm? And they look up and
17:49
they smile at me. And they say, yeah, we are. Who are
17:51
you? And I said, I'm Kashmir Hill. I'm
17:53
a New York Times reporter who's been trying to get in touch
17:55
with you. And
17:58
their faces just fall. I said, I
18:00
want to talk to you about Clearview AI. And
18:02
they said, well, Clearview AI's lawyer said that we're not supposed
18:04
to talk to you. And
18:08
I was around seven months pregnant at this time. And
18:10
so I kind of opened my jacket
18:12
and just clearly displayed my
18:15
enormous belly. And I was like, oh, I've
18:17
come so far. It was cold. It was raining
18:19
out. And David
18:21
Scalzo, who's the main guy, main investor
18:23
in Clearview at Kirenaga, he says, OK. So
18:28
Kashmir and the two investors go inside the
18:30
office. Kashmir tells them: all
18:33
this not talking, it's making
18:35
Clearview AI look pretty nefarious. She
18:39
has a point. And so one
18:41
of them agrees to go on the record and starts
18:43
talking about his vision for the company that he has
18:45
invested in. David Scalzo
18:47
said, right now, they're just selling
18:49
this to kind of like retailers
18:52
and police departments. But
18:55
our hope is that one day, everybody
18:57
has access to Clearview. And
18:59
the same way that you Google someone,
19:01
you'll Clearview their face and be able to see
19:04
all the photos of them online. He says, yeah, I
19:06
think we think this company is going to be huge.
19:09
And now they give Kashmir the information
19:12
that she'd really been looking for, the
19:14
names of the people who are actually
19:16
responsible for this tool. And
19:19
they said, oh, yeah, we're really excited
19:21
about the founders. And they
19:23
say it's this guy, Richard Schwartz, who's
19:25
kind of a media politics guy, worked
19:28
for Rudy Giuliani when he was mayor.
19:30
And then there's this tech genius, real
19:33
mastermind, young guy. And
19:35
his name's Hoan Ton-That. And
19:38
we're in a conference room. So I'm like,
19:40
can you write that up on a whiteboard
19:42
for me? How do you spell Hoan Ton-That?
19:44
And so he writes it out. And
19:46
this is the first time I figure out who
19:49
the people are behind this. After
19:53
the break, the story of how Hoan
19:56
Ton-That and his engineers got your face
19:58
and my face and 3
20:00
billion photos worth of faces into
20:02
this powerful new search engine. Search
20:10
engine is brought to you by Seed Probiotics. Small
20:13
actions can have big benefits, like
20:15
how taking care of your gut can support whole
20:17
body health. Seed's DS-01
20:19
Daily Synbiotic benefits your gut, skin, and
20:21
heart health in just two little capsules
20:24
a day. My relationship
20:26
with my body is a bit of a nightmare. Probiotics
20:28
can help with things that are important to me
20:30
like digestion and skin health. Your
20:32
body is an ecosystem and great health starts in
20:35
the gut. Your gut is a
20:37
central hub for various pathways through the body.
20:39
And a healthy gut microbiome means benefits for
20:41
digestion, skin health, heart health, your immune system,
20:43
and more. Probiotics
20:46
and prebiotics work best when used consistently
20:48
like other routine health habits. Seed
20:50
subscription service easily builds DS1 into
20:53
your routine with no refrigeration needed.
20:56
Trust your gut with Seed's DS-01
20:58
Daily Synbiotic. Go to seed.com/search
21:00
and use code 25SEARCH to
21:02
get 25% off your first
21:04
month. That's 25% off
21:07
your first month of Seed's
21:09
DS-01 Daily Synbiotic at seed.com/search.
21:12
Code 25SEARCH. Search Engine
21:14
is brought to you by Aura Frames. Looking
21:17
for the perfect gift to celebrate the moms in your
21:19
life? Aura Frames are
21:21
beautiful, wifi-connected digital picture frames that
21:23
allow you to share and display
21:25
unlimited photos. It's super easy
21:27
to upload and share photos via the Aura app.
21:30
And if you're giving Aura as a gift, you
21:32
can even personalize the frame with preloaded photos and
21:34
memories. I struggle with Mother's
21:36
Day. I always feel like it's really hard to
21:38
figure out what to give my mom. Photos
21:41
of me are, unfortunately, something that she would really
21:43
enjoy. From grandmothers to new
21:45
mothers, aunts, and even the friends in
21:47
your life, every mom loves an Aura
21:49
Frame. Named the best digital photo frame
21:51
by Wirecutter and selected as
21:53
one of Oprah's favorite things, Aura
21:56
Frames are guaranteed to bring joy to moms
21:58
of all ages. Right now,
22:00
Aura has a great deal for
22:03
Mother's Day. Listeners can save on the
22:05
perfect gift by visiting auraframes.com to
22:07
get thirty bucks off plus free shipping
22:09
on their best-selling frame. That's A-U-R-A
22:12
frames.com. Use code SEARCHENGINE
22:14
at checkout to save. Terms
22:16
and conditions apply. Hoan
22:38
Ton-That, it turned out, wasn't hard
22:41
to find. He had an internet trail.
22:43
This guitar you're hearing is part of
22:45
the trail. Hoan had a habit,
22:48
in one chapter of his life, of posting
22:50
YouTube videos of himself pretty capably
22:52
playing guitar solos. In the videos,
22:54
he doesn't speak, but he seems
22:57
tall and slender, with long black
22:59
hair. Fashionable. Hoan has Vietnamese
23:01
roots. Raised in Australia, he moved
23:04
to San Francisco in two thousand
23:06
seven. His internet breadcrumbs suggest a
23:08
strange collage of a person: a
23:10
Bay Area tech guy who presents
23:12
in a slightly gender-fluid way, has
23:14
photos from Burning Man, but who
23:16
also seems like a bit of
23:18
a troll. In a Twitter bio,
23:20
he claims to be, quote, an
23:23
anarcho-transsexual Afro-Chicano American
23:25
feminist studies major. What?
23:27
What is clear is that Hoan came to America
23:29
with big dreams of getting rich on the
23:31
internet. He
23:35
started in the FarmVille era of
23:37
Facebook apps, when you could make
23:39
money building the right stupid thing online.
23:41
Nothing he tried really took off. Not
23:43
apps like friend quizzes or romance
23:45
quizzes, not later efforts like an app
23:47
that took an image of you and
23:49
photoshopped Trump's hair onto your head.
23:52
In twenty sixteen, Hoan would move to
23:54
New York. At some point he deleted much
23:57
of his old presence, but Kashmir found an
23:59
old artifact of who he was on
24:01
the internet back then. I found
24:03
an archived page of his from
24:05
Twitter on the way back machine.
24:08
And it was mostly him kind
24:10
of retweeting like
24:12
Breitbart reporters and
24:15
kind of saying, why are all the big cities
24:17
so liberal? Yeah. He doesn't have a Twitter
24:19
account. He doesn't have a Facebook account. It
24:21
seemed like, wow, this is weird. Like this guy
24:24
is in his 20s, I think. But he doesn't
24:26
have a social media presence beyond
24:28
like a Spotify account with some songs
24:31
that he apparently had done. It was
24:33
a strange portrait. But I came away
24:35
thinking, wow, this person is a character.
24:38
And I really want to meet him. At
24:40
this point, it seemed like the company understood that
24:42
Kashmir Hill was not going to go away. A
24:46
crisis communications consultant reached out and
24:48
eventually offered to arrange an interview with
24:51
Hoan Ton-That. When I met
24:53
him, he was not what I expected him to be, which
24:55
was he still had the long black hair.
24:58
But now he had these glasses that
25:00
felt very like office worker glasses. And
25:02
he was wearing this blue suit. And he just looked
25:04
like a security startup CEO. OK.
25:08
Which just, again, wasn't what I
25:10
expected based on everything else I saw about him.
25:12
We met at a WeWork because
25:14
they didn't have an office. I would find
25:16
out that he mostly kind of worked remotely,
25:19
did a lot of the building of Clearview AI. At
25:21
the time, he lived in the East Village. And he
25:23
kind of just did it in cafes, like places with
25:25
free Wi-Fi. So they
25:28
booked a room at WeWork for our
25:30
meeting. The crisis communications consultant was there.
25:32
She'd brought cookies. What type of
25:34
cookies? Chocolate chip. And
25:37
I feel like they were Nantucket or
25:39
like Sausalito cookies. I can't remember the brand.
25:42
But yeah, we had lattes at
25:44
the WeWork cafe. And we sat down. And
25:47
I just started asking my questions. And for
25:50
the most part, he's answering them. And
25:52
we had a couple of hours
25:55
to talk. And he
25:57
really was telling me a lot. So it was
25:59
a complete one-eighty.
26:04
In person, he is very charismatic,
26:06
very open. And he would
26:08
be evasive about some things. He wouldn't
26:10
describe anyone else involved in
26:12
the company besides Richard Schwartz, his
26:14
cofounder. But otherwise he was open, and
26:16
I was like, you have built
26:18
this astounding technology, like, how did
26:21
you do this? How did you go
26:23
from what you're telling me about, these Facebook
26:25
apps and iPhone games,
26:27
to building this? And he said, well, I'm
26:30
standing on the shoulders of giants. And
26:33
he said there's been this
26:35
real revolution in AI and
26:37
neural networks. A lot
26:39
of research that the
26:41
most brilliant minds in the
26:43
world have done, it's been open-
26:45
sourced, you know, they've put it on the
26:47
internet. Hoan told Kashmir that in twenty
26:50
sixteen, in the early days of
26:52
building what would become the Clearview
26:54
AI facial search engine, he taught himself
26:56
the rudiments of AI-assisted facial
26:59
recognition by just, essentially, googling. He went
27:01
on GitHub and hunted for face
27:03
recognition code, read papers by experts in
27:06
the field. He told her, quote,
27:08
it's gonna sound like I googled flying car
27:10
and then found instructions on it. Which
27:13
wasn't too far off. Until pretty
27:16
recently, facial recognition existed, but was
27:18
somewhat crude. What
27:20
Hoan was learning on the internet
27:22
was that machine learning, neural networks,
27:24
had just changed all that. Now,
27:27
computers could teach themselves to recognize a
27:29
face, even at an odd angle,
27:31
even with a beard, provided
27:33
the computer was given enough images
27:35
of faces, training data, to learn
27:37
on.
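A concrete way to see the idea, using the kind of open-source tooling Hoan would have found on GitHub. This is a hedged sketch with the face_recognition library (built on dlib), not Clearview's actual code; the image filenames are placeholders:

```python
# Sketch: a neural network maps each face to a 128-dimensional vector
# (an "embedding"); two photos of the same person land close together,
# even at odd angles or with a beard, because the network learned that
# from training data. Open-source face_recognition library, not Clearview's.
import face_recognition
import numpy as np

img_a = face_recognition.load_image_file("photo_a.jpg")  # placeholder files
img_b = face_recognition.load_image_file("photo_b.jpg")

# One 128-d encoding per face found in each image.
enc_a = face_recognition.face_encodings(img_a)[0]
enc_b = face_recognition.face_encodings(img_b)[0]

# Euclidean distance between embeddings; the library's docs suggest
# roughly 0.6 as a reasonable same-person threshold.
distance = np.linalg.norm(enc_a - enc_b)
print("likely the same person:", distance < 0.6)
```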
27:39
We reached out to Clearview AI for the story. We didn't get
27:41
a response. But in the years since his interview
27:43
with Kashmir, Hoan has given plenty of interviews to
27:46
the press. One
27:48
thing I do respect is the fact that you decided to
27:50
come here, like, for an interview.
27:52
And thanks, Pat, for having
27:54
me on. That's from one he did with the
27:57
YouTube show Valuetainment. The video shows
27:59
Hoan in a suit, looking again
28:01
like a standard tech exec, just with unusually
28:03
long hair. Here he
28:05
describes what his research process for this search
28:07
engine was like. I was looking at
28:09
the advances in AI. So I
28:11
saw ImageNet, which is a competition for recognizing
28:14
things in images. Is this a computer? Is
28:16
this a plant? Is this a dog or
28:18
a cat? And the results got really good.
28:20
And then I looked at facial recognition, and
28:22
we'd read papers. So Google and Facebook
28:25
both had deep learning papers on
28:27
facial recognition. And then, hey, can I
28:29
get this working on my computer? And we
28:31
ended up getting it working. And what
28:33
we realized was getting more data to
28:35
train the algorithm, to make it accurate
28:37
across all ethnicities, Iranian people, black people,
28:39
white people, brown people. That was really
28:42
key to improving performance. This
28:44
would be Hoan's real innovation, a somewhat
28:47
dark one. His advantage was
28:49
how he would find the training data he
28:51
needed. He built a scraper,
28:53
a tool that would take, without asking, photos
28:56
of human faces pretty much anywhere on the public
28:58
internet they could be nabbed. He
29:00
also hired people who built their own
29:02
scrapers to hoover up even more photos.
29:05
He said part of our magic here
29:07
is that we collected so many photos. And he
29:09
built the scraper. He hired people
29:11
from around the world to help him
29:13
collect photos. And so it's
29:16
similar to, like when people talk about
29:18
large language models right now and companies
29:20
like OpenAI, some of what
29:22
they're doing is tuning their neural networks. But a lot
29:24
of what they're doing is feeding their neural networks. It's
29:26
like they have to find every text that's ever been
29:28
published in every library and then they run out of
29:30
all the library text and they have to find transcripts
29:34
of YouTube videos which maybe they shouldn't be
29:36
loading in there. It's part of what he
29:38
had done correctly to get his product ahead
29:40
of where the other ones were. It's
29:42
just like he was not a genius at making
29:44
the underlying AI. That was
29:47
mostly open source. He was passionate about finding
29:49
faces on the internet to put into it.
29:51
Yes. And so where was he
29:53
looking? Oh man. So the first place
29:55
he got faces was Venmo. This is funny to
29:57
me because, as a privacy journalist, I really love
30:00
it. I remembered people being upset
30:02
at Venmo's approach to privacy, which at
30:04
the time was, if you signed up
30:06
for Venmo, your account was public by
30:08
default. And your profile photo
30:10
was public by default. And so he built
30:13
this scraper, you know, this
30:15
bot that would just visit venmo.com every
30:17
few seconds. And Venmo
30:19
at the time had a real-time feed
30:21
of all the transactions that were happening
30:24
on the network. And so he would
30:26
just hit Venmo every few seconds and
30:28
download the profile photo, the link to
30:30
the Vemo profile. And he got
30:32
just millions of photos this way from Venmo
30:34
alone. And this is essentially
30:36
what he was doing, but with, I
30:39
mean, thousands, millions of
30:41
sites on the internet, Facebook, LinkedIn,
30:43
Instagram, employment sites, yeah, just anywhere
30:45
they could think of where there
30:47
might be photos.
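A rough sketch of the kind of scraper Kashmir is describing, polling a public feed and saving profile photos. Everything here is hypothetical: the endpoint, feed format, and field names are invented stand-ins, since Venmo's public firehose was shut down long ago:

```python
# Hypothetical sketch of a feed scraper: poll a public, real-time feed
# every few seconds, save each new profile photo plus the profile link
# it came from. Endpoint and field names are invented stand-ins.
import os
import time
import requests

FEED_URL = "https://example.com/api/public-feed"  # hypothetical endpoint

os.makedirs("faces", exist_ok=True)
seen = set()

while True:
    feed = requests.get(FEED_URL, timeout=10).json()
    for item in feed.get("transactions", []):  # invented field names
        photo_url = item["actor"]["picture"]
        profile_url = item["actor"]["profile_url"]
        if photo_url in seen:
            continue
        seen.add(photo_url)
        image = requests.get(photo_url, timeout=10).content
        path = os.path.join("faces", f"{len(seen):08d}.jpg")
        with open(path, "wb") as f:
            f.write(image)
        # keep each face linked to the profile it came from
        with open("faces/index.tsv", "a") as idx:
            idx.write(f"{path}\t{profile_url}\n")
    time.sleep(5)  # "hit it every few seconds"
```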
30:52
I just want to butt in here to say all of
30:54
this is completely astonishing. I
30:57
know people at the dawn of social media
30:59
who just didn't want to join Facebook or
31:01
didn't understand why you would voluntarily offer your
31:04
private life to the public, but
31:06
I don't think anyone, or at least anyone
31:09
I knew, had an imagination
31:11
sufficiently tuned to Dystopia to know
31:13
that if you had the
31:15
brazenness to upload your photo to Vemo or
31:18
to LinkedIn, you could one day
31:20
be feeding your face into a machine. A
31:22
machine that today, if you go to a protest,
31:25
is capable of using a photo of
31:27
your face to find your name, your
31:30
email, your employer, even your physical address.
31:33
Who knew this was the future we were fumbling our
31:35
way towards? I asked Kashmir
31:37
about all this. I
31:40
use Venmo, I use Facebook. I'm
31:43
barely sure that when I signed up, I signed in
31:45
terms of service. I did not read it carefully. I
31:47
don't think there was a section in there, like, okaying
31:49
that my face could be used in photo-scraping software. Is
31:52
what he did legal? So
31:54
Venmo and Facebook both sent
31:56
Clearview AI cease and desist
31:59
letters saying... stop scraping our sites
32:01
and erase the photos that you took, delete the
32:03
photos that you took, but they
32:05
never sued. So this hasn't been tested in court,
32:07
whether it's illegal or not. So it's still a
32:10
bit of a gray area and it
32:12
hasn't been tested with Clearview because none
32:14
of these companies have sued them. In
32:18
one interview, shot just a month after his
32:20
conversation with Kashmir, Hoan sat down with a
32:23
CNN reporter who asked about this, the legality
32:25
of his project. Is everything
32:27
you're doing legal? Yes, it is. So
32:30
we've gone through and have
32:32
some of the best legal counsel from Paul Clement,
32:34
who used to be the Solicitor General of the
32:36
United States. He's done over 100
32:39
cases in front of the Supreme Court.
32:41
And he did a study independently saying
32:44
this is not, the way it's used
32:46
is in compliance with the Fourth
32:48
Amendment. All the information we get is publicly
32:50
available, and we have a First Amendment right
32:52
to have public information on the internet. And
32:55
you have to understand what it's also being
32:57
used for. We're not just taking your information
32:59
and selling ads with it or trying to
33:01
get more. We're actually
33:03
helping solve crimes with this. So
33:06
your counsel is making the argument
33:08
that there's a First Amendment right
33:10
to information that is publicly on
33:12
the internet? Yes. And so
33:15
if you take something like Google, Google
33:17
crawls the internet, collects all these web pages,
33:19
and you search it with keyword terms, we're the
33:21
same thing. You take all this information on public
33:23
web pages and search it with a face. Hoan's
33:27
making a pretty radical argument here, even though
33:29
his tone doesn't suggest it. He's
33:31
saying that someone being able to take your
33:33
face and use it to make a search
33:35
that will pull up your name, possibly your
33:37
address and more, is nothing new. It's
33:40
just like Google. His
33:43
point is that Google collects every instance of your
33:45
name on the internet. Clearview AI is
33:47
just doing that, but with your face. And
33:49
you know, attaching it to your name. Whether
33:53
you agree with this idea or not, it
33:55
has happened and it has fundamentally
33:57
changed how privacy works. Kashmir
34:00
says that most of us are just not prepared for
34:02
this brave new world. I just
34:04
don't think that most people anticipated
34:07
that the whole internet was
34:09
going to be reorganized around your face.
34:12
Right. And so a lot of people
34:14
haven't been that careful about the kind of photos they're in
34:16
or the kind of photos they've put up of themselves
34:18
or the kind of photos they've allowed to be put
34:20
up of themselves. And Juan
34:23
actually did a Clearview search of me there.
34:26
And I said, oh, well, last time this happened, there were
34:28
no results for me. And he said, oh, there must have
34:31
been some kind of bug. He wouldn't
34:33
admit that they had put this alert on my face,
34:35
that they had changed my results. But
34:37
he ran my face and there were just tons
34:39
of photos, like lots of photos I knew about.
34:43
But in one case, there was a photo of me at an
34:45
event with a source. And I was like,
34:47
wow, I didn't realize. I hadn't thought that through
34:49
now that if I'm out in public with a
34:51
sensitive source and somebody takes a photo and puts
34:53
that on the internet, that could be a way
34:55
of exposing who my sources are. And
34:59
it was really stunning how
35:01
powerful it was. For me, there were
35:03
dozens, if not hundreds of photos. Me
35:06
kind of in the background of other people's photos.
35:08
I remember there were, I used to live in
35:10
Washington, D.C. and there were photos of me at
35:12
the Black Cat, which is a concert venue, just
35:14
in the crowd at the show. It
35:17
was incredibly powerful. And
35:19
I remember asking him, I was like,
35:21
you've taken this software somewhere no one
35:23
has before. You have created this really
35:26
powerful tool that can
35:29
identify anybody. All these photos of them,
35:32
you're just selling it to law enforcement. But now that
35:34
you've built this and you've described to me the
35:36
accessibility of building this, there's going to be
35:39
copycats. And this is going
35:41
to change anonymity, privacy as we know
35:43
it. What do you think about that? And
35:47
I remember he kind of was silent for a little
35:49
bit and he said, that's a really good question. I'll
35:51
have to think about it. And it was just
35:54
this stunning moment of seeing in
35:56
action people that are making these
35:58
really powerful technologies who really just. are
36:00
not thinking about the implications, who are just
36:02
thinking, how do I build this? How do
36:04
I sell this? How do I make this
36:06
a success? Since
36:10
Hoan's interview with Kashmir, it
36:12
seems like maybe he's had more time to think
36:14
through better answers to hard questions. We've
36:16
watched a lot of these subsequent interviews. What
36:19
you notice is that now, he'll say
36:21
that as long as he's CEO, he'll make sure
36:23
his tool is only ever in the hands of
36:25
law enforcement, and in some cases, banks. And
36:28
he'll point again and again to
36:30
the one strong reason why Clearview AI
36:32
does need to exist. Without Clearview AI,
36:34
there are so many cases of child
36:36
molesters that would have never been caught,
36:38
or children who wouldn't have been saved.
36:41
Child predator will be extorting children
36:43
online. You don't even know about it. Sex
36:45
tortion, child abuse, child abusers, crimes against
36:48
children, dark web, troves and troves of children's
36:50
faces. These are kids that would have been
36:52
identified. This is why our customers
36:54
are very passionate about keeping the technology and making
36:56
sure it's used properly.
37:00
It's hard to take the other side of
37:02
that argument, but of course, Clearview AI is
37:05
not just being marketed as an anti-child predator
37:07
tool. A Clearview AI investor told
37:09
Kashmir he hoped one day it would be in
37:11
the hands of regular people, and potential
37:13
investors in the company were given the tool
37:15
to use on their phones, like just
37:18
to use as a party trick. Will
37:20
Clearview AI actually ultimately roll this tool
37:22
out for wide use? Well,
37:25
it sort of doesn't matter whether they do or not. Because
37:32
remember, the copycats already have.
37:36
After the break, how people are using
37:38
and abusing this technology. Great.
37:51
Rocket Money is a personal finance app
37:53
that finds and cancels your unwanted subscriptions,
37:56
monitors your spending, and helps lower your bills so
37:58
that you can grow your savings. With
38:00
Rocket Money, I have full control over my subscriptions and
38:02
a clear view of my expenses. I
38:04
can see all my subscriptions in one place, and if I
38:06
see something I don't want, Rocket Money can help me
38:08
cancel it with a few taps. Rocket
38:11
Money will even try to negotiate lower bills for you
38:13
by up to 20%. All
38:15
you have to do is submit a picture of your bill and
38:18
Rocket Money takes care of the rest. They
38:20
will deal with customer service for you. Rocket
38:23
Money gave me a free trial because they're on the podcast.
38:25
I started using it. I really like it. I'm
38:27
actually paying for a subscription for them. It's
38:30
not very much money, but the funny thing
38:32
is that it shows me my Rocket Money
38:34
subscription on Rocket Money, and I
38:36
have considered trying to get Rocket
38:38
Money to negotiate the prices of
38:41
their subscription down against themselves. We'll
38:43
see if it works. Rocket Money has
38:45
over 5 million users and has saved a total of
38:47
$500 million in canceled subscriptions,
38:51
saving members up to $740 a
38:53
year when using all of the app's features. Don't
38:56
waste money on things you don't use. Cancel
38:59
your unwanted subscriptions by going
39:01
to rocketmoney.com/search. That's rocketmoney.com/search.
39:10
Search Engine is brought to you by Mint Mobile. My
39:12
favorite spring cleaning takeaway is the
39:14
post-clean clarity you get. How
39:17
have I been living like this? It's kind
39:19
of like when you find out that you've been
39:21
paying a fortune for wireless. So Mint Mobile has
39:23
phone plans for $15 a month when you purchase
39:25
a three-month plan. Wow, how
39:27
have I been affording this? It's time to
39:29
switch to Mint Mobile and get unlimited talk, text, and
39:31
data for $15 a month. For
39:34
me, compared to my traditional wireless carrier,
39:37
that's like almost $100 in
39:39
savings. To
39:41
get this new customer offer and your
39:43
new three-month unlimited wireless plan for just
39:46
$15 a month, go to
39:48
mintmobile.com/search. Visit
39:51
mintmobile.com/search. Cut
39:53
your wireless bill to
39:55
$15 a month at
39:58
mintmobile.com/search. Upfront
40:00
payment required. New customers on
40:02
first 3 month plan only, speeds
40:04
slower above 40GB on unlimited plan,
40:07
additional taxes, fees and restrictions apply.
40:09
See Mint Mobile for details. So
40:29
you leave that conversation at that point, do you
40:31
write your story? Yeah, I think
40:33
the story came out about a week after that
40:35
interview. And what was the response to the story?
40:37
Did people understand the size of the thing? Yeah,
40:40
so it was a front page Sunday story,
40:42
they call it a bulldog story when you're
40:44
Sunday A1. A bulldog
40:46
story? Yeah, a bulldog, I believe, because you open the paper and
40:49
then it was the whole spread on the inside as well.
40:51
And it was a big deal. I remember
40:53
it landed and my phone was just blowing
40:56
up because I was being tagged. This is back
40:58
when Twitter was still a healthy
41:01
space for conversation. So my Twitter is blowing
41:03
up. I'm getting all these emails. People want
41:05
to book me to talk about it on
41:07
TV on the radio. Like it was just
41:09
this huge deal. People were stunned
41:12
that it existed, that it
41:14
was using their photos without
41:16
consent. Just the way Clearview
41:18
had gone about it, the fact that they had
41:20
surveilled me as a journalist tried to prevent
41:22
the reporting on the story. It was
41:25
a huge deal. I thought this is going to be one of
41:27
the biggest stories of the year. This
41:31
is January 2020. I
41:35
see. Shit.
41:37
So the pandemic happens and
41:40
does it just kind of
41:42
like... Yeah,
41:47
like I was hearing that there were
41:49
going to be hearings in DC.
41:52
They start getting cease and desist
41:54
letters, lawsuits happen, but then March
41:56
2020 COVID hits and it just
41:59
instantly changed the conversation in the U.S.
42:01
and around the world to
42:03
health concerns, safety concerns. And
42:06
then I started seeing people talking about, can
42:08
we use facial recognition technology to fight COVID?
42:10
Can we start tracking people's faces? See
42:13
where people were with other people,
42:15
if there's a known exposure. Can
42:18
we track people? And there's this
42:20
talk about, yeah, using facial recognition
42:22
technology. So it's like
42:24
we almost skipped the scared outrage phase
42:26
of the technology because the needle
42:30
kind of juddered on everything with COVID? Yeah,
42:32
we did a little bit. I mean, for certain
42:35
groups like European privacy regulators, they all
42:37
launched investigations into Clearview and they
42:39
essentially kicked Clearview out of their countries.
42:41
They said it was illegal. I mean, there were
42:43
repercussions for Clearview, but I feel like the
42:45
bigger conversation, what do we do about this
42:48
right now? It just it
42:50
got pushed aside by that larger concern around
42:52
COVID. Somehow our debate
42:54
about these search engines was just one
42:56
of the infinite strange casualties of COVID,
42:59
a conversation we never quite got to have. In
43:02
the meantime, Clearview AI's copycats have continued to
43:04
go further than the company itself, offering
43:07
their search engines online for the public to use at
43:09
a small cost. None of these
43:12
search engines is as powerful as Clearview, but
43:14
all of them are powerful enough to do
43:16
what privacy advocates were worried about back in
43:18
2011. The tool
43:20
I would end up finding online was one of those
43:22
copycats. I have been noticing
43:24
more and more people using them, mainly to
43:26
settle scores with strangers on social media. I
43:30
asked Kashmir where she has noticed people using
43:32
these search engines in the wild since she
43:34
published her book. I've
43:36
seen news organizations using it. One of
43:38
the more controversial uses was a news
43:41
organization that used it after October 7th
43:43
to try to identify the people who
43:45
were involved in the attacks on Israel.
43:47
Oh, wow. And I was a little
43:50
surprised to see it used that way.
43:52
Why were you surprised? I was
43:54
surprised just because it's still
43:57
controversial, whether we should be
44:00
using face recognition this way. And
44:02
the same site that was
44:04
using it had published stories about how controversial
44:06
it is. That there were these search
44:08
engines that have scraped the public web, and
44:11
that they invade privacy. Yeah, so
44:13
I think it's still complicated. It was
44:15
a news outlet that had done the "maybe we shouldn't
44:17
have this" stories, and then they were also using
44:19
the tech. Yeah, like maybe
44:21
this technology shouldn't exist, but also it's
44:23
there, so we're going to use it. Which sort of reveals
44:25
the story of every piece of technology we've ever had a
44:27
stomach ache about, which is, we say we don't want it
44:30
to exist. And then some contingent
44:32
circumstance arises in which at least some of us
44:34
feel like, well, it's OK for me to use
44:36
this here, even if I don't think it
44:38
should exist. Case by case basis.
44:41
I did ask Kashmir whether she'd seen these
44:43
search engines used in a clearly positive way.
44:47
I've heard of people using it on dating sites to
44:49
figure out if the person they're talking
44:51
to is legit. Make sure
44:54
they're not being catfished. Make sure this person
44:57
is who they say they are. I've heard
44:59
about it being used by
45:01
people who have compromising
45:03
material on the internet. Say they
45:05
have an OnlyFans or something
45:08
they don't want to exist on the internet, and
45:10
they've used these search engines to figure out how
45:12
exposed they are. And some of
45:14
the search engines do let you remove results.
45:16
And so they've done that. They've gotten rid
45:18
of the links to stuff
45:20
they don't want the world to know about. I've talked
45:22
to parents who have used these search
45:24
engines to figure out if there's photos
45:26
of their kids on the internet that
45:29
they don't want to be out there.
45:31
Oh, wow. So one woman I talked
45:33
to, she's an influencer. She gets a
45:35
lot of attention, and she didn't want
45:37
it blowing back on her kids. So
45:39
she stopped featuring them in any of her videos,
45:41
and she searched for them with one of these
45:43
search engines and found out that there was a
45:46
new photo of one of her kids. A summer
45:48
camp, I think, that one of the kids had
45:50
gone to had posted photos publicly. And
45:52
so she asked them to take it down. But
45:55
yeah, I mean, there are some positive use
45:57
cases for these engines. What
46:00
Cashmere is saying is that the most positive
46:02
use cases for these search engines might just
46:04
be finding compromising content on
46:06
the internet about yourself first before someone
46:08
else using one of these search engines
46:10
does, which seems like a
46:12
questionable upside. Kashmir has also seen
46:14
facial search engines used in a way that I have
46:17
to say was just breathtaking in
46:19
its pettiness. She recently
46:21
reported on how the owner of Madison Square
46:23
Garden was using facial recognition
46:25
and surveillance to ban from
46:28
the venue anyone who worked at
46:30
the law firm his venue was in litigation
46:32
with. Kashmir even tested
46:34
this. She tried to go see a
46:36
hockey game with a personal injury lawyer, something one used
46:38
to be able to do freely. So
46:40
I bought tickets to a Rangers game
46:43
and brought along this personal injury attorney whose
46:45
firm was on the ban list just because
46:47
I wanted to see this for
46:49
myself. And yeah, so I met
46:52
her. I was meeting her for the first time that night. We
46:54
stood in line. Thank you. Just
46:56
the ticket? Yeah. Just, I
46:58
see the seat. There we go. We were
47:00
walking in. We put
47:03
our bags down on the conveyor belt
47:06
and just thousands of people streaming into Madison Square
47:08
Garden. But by the time we picked them up,
47:10
a security guard walked over to us. He
47:13
said, I need to see some ID from
47:15
her. She shows her driver's license. And he said, you're
47:17
going to have to wait here. Just give
47:19
me one moment. Sir, I actually need you to stand
47:21
by. Management just has to come see you. And
47:24
we appreciate your patience. Just hang out for me. Okay. A
47:27
couple minutes to go. We'll get someone down to talk
47:29
to. And a manager came over and gave her this note
47:31
and told her she wasn't allowed to come in. Wow.
47:34
She had to leave. While the firm is
47:36
involved in active litigation,
47:38
here we go, you're barred from the
47:40
Garden. It was insane to
47:42
see just how well
47:44
this works on people just in the real
47:47
world walking around. Yeah. It was
47:49
so fast. God, that's crazy. When
47:52
Kashmir reported this story, she actually
47:54
heard from facial recognition companies who said they
47:56
were upset that Madison Square Garden was doing
47:59
this. It was making their tools
48:01
look bad. It was not how they said they were
48:03
supposed to be used. But misuse
48:05
of any technology, it's almost a given.
48:08
And facial recognition is being misused
48:10
not just by corporations, but also
48:12
by individuals. So there
48:14
was a TikTok account where
48:17
the person who ran it, if somebody
48:19
kind of went a little viral on
48:21
TikTok, he would find out who they
48:23
were and expose them. The one video
48:26
that really struck me is during
48:28
the Canadian wildfires when New York City kind
48:30
of turned orange. Somebody
48:33
had done a TikTok that
48:35
just showed Brooklyn and
48:38
what it looked like. It looked like something from
48:40
Blade Runner. And this
48:42
guy walks by and he became
48:44
the hot Brooklyn dad. And so
48:46
the TikTok account found out who
48:48
hot Brooklyn dad was, and
48:51
then found out who his son was.
48:53
And said, if you want to date somebody
48:55
who's gonna look like hot Brooklyn dad
48:57
one day, here's his son's Instagram account.
49:00
That is wildly bad behavior. That's
49:02
crazy. Cause that person didn't even consent to being
49:04
in the first video, but I'm sure people sent
49:06
it to him and were like, hey, the internet
49:08
thinks you're hot. Don't worry, they don't know who
49:11
you are. And they
49:13
not only invaded his privacy further, but invaded his
49:15
kids' privacy. Yeah, just for fun. And
49:17
so that account was doing a lot of that. And
49:20
404 Media wrote about it and
49:22
eventually TikTok took the account down. I
49:27
mean, the thing that sort of hovers around all
49:29
of this is that prior to the invention of
49:31
these things, it was like the internet had taken
49:34
a lot of everyone's privacy, but the one thing
49:36
we had was the idea that if
49:39
people didn't know your name, or if you
49:42
did something under a pseudonym, there's a degree of
49:44
privacy. And now it's like your face follows you
49:46
in a way it wasn't supposed to. Or
49:49
the internet follows your face, is how I think about
49:51
it. It
49:53
feels like there's a world in which technology like this
49:55
would just be like fingerprint databases
49:57
where law enforcement would have it. that
50:00
the general public wouldn't have access to it.
50:03
Isn't that one way this could be going
50:05
instead of the way it's happening? Yeah,
50:07
that is definitely a possible future outcome,
50:09
where we decide, okay, facial recognition technology
50:11
is incredibly invasive, kind of in the
50:14
same way that wiretapping is. So
50:16
let's only let the government and law
50:19
enforcement have access to this technology legally.
50:22
They're allowed to get a court order or a warrant
50:24
and run a face search in the same way
50:26
that they can tap your phone line with judicial
50:28
approval. And the rest of
50:30
us shouldn't have the right to do it. I think
50:32
that's one way this could go. Seems
50:36
preferable. It
50:39
seems good, but then you also think about governments
50:41
that can abuse that power. So recently
50:44
here in the US, Gaza
50:46
has been such a
50:48
controversial issue and you have people out
50:51
doing protests and there was a lot of
50:53
talk about, well, if you are on
50:55
this side of it, then you're aligned with terrorists and you
50:57
are not gonna get a job. We're
51:00
gonna rescind job offers to college students who are on
51:02
this side of the issue. And it's
51:04
very easy to act on that
51:06
information. Now you can take a photo of that
51:09
crowd of protesters and you can identify every single
51:11
person involved in a protest and then
51:13
you can take their job away. Or if you're
51:15
police and there's a Black Lives Matter protest against
51:17
police brutality, you can take a photo and you
51:19
can know who all those people are. But
51:21
I think you notice now when you see
51:23
photos from the protest, all these students
51:26
are wearing masks. They're wearing COVID masks or
51:28
they're wearing something covering their face and
51:30
it's because they're worried about this. They're
51:32
aware of how easily they can be identified.
51:35
And the thing is it might work,
51:37
but I have tested some of these search engines
51:40
and if the image is
51:42
high resolution enough, even wearing
51:44
a mask, somebody can be identified. Really?
51:47
So just from like nose and eyes
51:49
and forehead? Yes, I did
51:51
this consensually with a photo of
51:53
my colleague, Cecilia Kang, who covers
51:56
tech policy in DC.
51:58
She sent me a photo of herself with a medical mask
52:00
on. I ran it through one of the search engines
52:03
and it found a bunch of photos of her.
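For a rough sense of why a mask doesn't always defeat matching: the comparison runs on whatever facial region the model can see, and at high resolution the eyes, brows, and forehead can carry enough signal. The sketch below is purely illustrative; embed_face is a hypothetical stand-in for a learned embedding model, not any specific vendor's system.

```python
# Illustrative only: matching on the unmasked upper half of a face.
# `embed_face` is a hypothetical stand-in for a learned face-embedding
# model; real search engines use far more capable networks.
from PIL import Image
import numpy as np

def upper_face_crop(image: Image.Image,
                    box: tuple[int, int, int, int]) -> Image.Image:
    """Keep only the eyes/forehead region of a face box (left, top, right, bottom)."""
    left, top, right, bottom = box
    return image.crop((left, top, right, top + (bottom - top) // 2))

def embed_face(face: Image.Image) -> np.ndarray:
    """Hypothetical embedding: a normalized grayscale thumbnail vector."""
    pixels = np.asarray(face.resize((32, 32)).convert("L"), dtype=float).ravel()
    return pixels / (np.linalg.norm(pixels) or 1.0)

# Given a high-resolution photo, even this reduced region can be compared
# against an index of known faces by nearest-neighbor search.
```

There's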
52:10
a world, you can imagine it, where someone passes
52:12
a law and these tools are no longer offered
52:14
to the public. They become like
52:16
wiretaps, something only the police are allowed
52:18
to use. We would get
52:20
some of our privacy back. But,
52:23
and this might not come as a surprise, there
52:25
have been problems when the police use these tools as well.
52:28
These search engines sometimes serve up doppelgangers: images
52:30
of people who look like you, but who
52:33
are not you, which can
52:35
have real consequences. Kashmir reported
52:37
the story of a man who was arrested for a
52:39
crime he was completely innocent of. The
52:41
crime had taken place in Louisiana; the man
52:43
lived in Atlanta. The police department
52:46
had a $25,000 contract with
52:48
Clearview AI, though the cops wouldn't
52:50
confirm or deny that they'd used Clearview AI
52:52
to misidentify him. How
52:57
do these search engines deal with errors? Do
53:01
they like correct things? If they make a
53:03
mistake, is there a process? So
53:05
in the minds of the creators of
53:07
these systems, they don't make mistakes. They
53:10
aren't definitively identifying somebody. They are
53:13
ranking candidates in order of confidence.
53:15
And so when Clearview AI talks
53:17
about their technology, they don't say
53:19
they're identifying anyone. They say that
53:22
they are surfacing candidates, and that
53:24
ultimately it's the human being who's
53:26
deciding which is a match. It's
53:28
the human making the mistake, not
53:31
the system.
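To make "ranking candidates in order of confidence" concrete, here's a minimal sketch. It assumes face photos have already been turned into embedding vectors; the names and the 128-dimension size are illustrative, not Clearview AI's actual implementation.

```python
# A minimal sketch of "ranking candidates in order of confidence":
# score every face in the index against the query embedding and return
# the best matches. Names and sizes here are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(query: np.ndarray,
                    index: dict[str, np.ndarray],
                    top_k: int = 5) -> list[tuple[str, float]]:
    """Return top_k candidate photo IDs ordered by confidence, best first."""
    scored = [(photo_id, cosine_similarity(query, emb))
              for photo_id, emb in index.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy usage with random 128-dimensional embeddings standing in for real ones:
rng = np.random.default_rng(0)
index = {f"photo_{i}": rng.normal(size=128) for i in range(1000)}
for photo_id, score in rank_candidates(rng.normal(size=128), index):
    print(f"{photo_id}: confidence {score:.3f}")  # candidates, not identifications
```

The system never says "this is the person"; it hands a human an ordered list, which is how the vendors frame responsibility. So if I were running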
53:33
for local office somewhere, and there
53:35
was a video of someone who looks
53:37
like me doing something compromising, and
53:39
someone wrote a news story being like, hey, we put his face in
53:42
the thing and this is what we found, and
53:44
I went, hey, you're smearing me. They'd
53:47
be like, we're not smearing you. We're just pointing out that you look like
53:49
this guy doing something he's not supposed
53:51
to do in a video. Right, it's the
53:53
news service that covered it that smeared me,
53:55
not the facial recognition engine. But
53:57
for the person in jail, they know that they would not have
53:59
been in jail if this technology didn't
54:02
exist. Yes, exactly. So there's this
54:04
power of the government, right? Power of
54:06
corporations. And then just as individuals, I
54:08
think about this. Basically every
54:10
time I'm at dinner now at a
54:12
restaurant, and there's people sitting around me,
54:15
and I start having a juicy conversation,
54:17
whether it's personal or about work. And
54:19
I think, wow, I really need to
54:21
be careful here, because anybody sitting around
54:23
me could, if they got interested
54:25
in the conversation, snap a photo of your face. And
54:28
with these kinds of tools, find out who you are.
54:31
That's what I always think about. I was
54:33
at a restaurant recently, and it was outdoor
54:35
dining, and I was with a friend. And
54:37
in the next closed booth, there
54:40
was this person. They took a phone
54:42
call, and they were like, one sec. This is Kamala
54:44
Harris. And I think they were joking. But I could
54:46
hear them. And I was like, oh, I
54:49
could just run their face. I could kind of
54:51
figure this out. I might be able to find
54:53
out privileged stuff about a conversation with a very,
54:55
very senior member of the US government. I
54:58
felt real nausea. I felt
55:00
nausea at the possibilities. Yeah,
55:02
I mean, I think there's just so many
55:04
moments in our daily lives where we just
55:06
rely on the fact that we're anonymous. Like,
55:09
you know, you're at dinner. You're having this
55:11
private conversation, and then creepy PJ is going
55:13
to be sitting there and looking up
55:15
your connection to the vice president. Does
55:18
it make you more careful, are you different in
55:20
public now? Yeah, I mean, I
55:22
just think that is the risk of facial
55:24
recognition technology, the same way that we feel
55:26
this concern about what we put on the
55:29
internet, like the tweets you write, the emails
55:31
you write, the texts you send, just thinking,
55:33
am I OK with this existing
55:35
and possibly being tied back to me, being
55:37
seen in a different context? That
55:39
is going to be our real world experience. You have
55:41
to think all the time: is something that I'm saying
55:43
right now something that could be overheard by
55:46
a stranger, something that could get me in
55:48
trouble, or something that I would regret? And
55:50
I don't know. That just terrifies me. I don't want to
55:52
be on the record all the time,
55:55
every minute, anywhere I am
55:57
in public. I just kind of assume that these
56:00
things that you're doing aren't gonna
56:02
haunt you for the rest of your life or follow you for the rest
56:04
of your life or be tied to you. Unless
56:06
you're a celebrity with a very famous face. And
56:09
it's been funny because I've talked with various people
56:11
who do have famous faces and I talk about
56:13
this dystopia where it's like everything you
56:15
do in public will come back to haunt you. And
56:18
usually after the interview they'll say, that's my life.
56:21
And I'm like, yes, what this
56:23
technology does is it makes us all
56:26
like celebrities, like famous people. Minus
56:29
the upsides. Minus the upsides. What
56:34
do you do if you don't wanna be in
56:36
these databases? Don't have photos
56:38
of yourself on the public internet. It's hard
56:40
not to get into these databases.
56:43
These companies are scraping the
56:45
public web.
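For a sense of the mechanics, here's a minimal sketch of the scraping pattern, at a vastly smaller scale. It just collects image URLs from one public page; this is not Clearview AI's actual pipeline, and the page URL would be whatever a crawler visits.

```python
# Minimal sketch of web scraping for images: fetch a public page and
# collect every image URL on it, to be queued for face detection and
# indexing. Not any specific company's pipeline.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def collect_image_urls(page_url: str) -> list[str]:
    """Return absolute URLs of all <img> tags on a public web page."""
    response = requests.get(page_url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [urljoin(page_url, img["src"])
            for img in soup.find_all("img") if img.get("src")]

# Repeated across billions of pages, every face found this way can be
# embedded and added to a searchable index.
```

So we can't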
56:47
get out of Clearview's database. And
56:50
there's no federal law yet that gives us the
56:52
right to do that. European privacy
56:54
regulators have said that what Clearview AI
56:56
did was illegal and that Clearview needs
56:58
to delete their citizens' data. And
57:00
Clearview basically said, we can't tell who
57:02
lives in Italy or who lives in
57:05
the UK or who lives in
57:07
Greece. So there's not really much we can
57:09
do. It's
57:11
funny though, because I'm not a technology
57:13
CEO, but if you asked me to actually fix
57:15
that problem, I could fix that problem. Like
57:17
you could say, anybody can email us and ask
57:19
to be taken out if they prove that they
57:22
live in Greece. You would think they could actually
57:24
do something about it.
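The fix being gestured at here would be simple in code, whatever the operational headaches. A hedged sketch with hypothetical names: keep a registry of verified deletion requests and filter those people out of results.

```python
# Hedged sketch of an opt-out registry: names are hypothetical, and a real
# system would need identity verification, appeals, and re-scrape suppression.
opted_out: set[str] = set()  # IDs of people with verified deletion requests

def register_opt_out(person_id: str) -> None:
    """Record a verified request (e.g., proof of residency in Greece)."""
    opted_out.add(person_id)

def filter_results(ranked: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """Drop opted-out people from ranked candidates before display."""
    return [(pid, score) for pid, score in ranked if pid not in opted_out]
```

Yeah, this work gets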
57:26
so complicated. For a while, Clearview AI
57:28
was honoring requests from Europeans who wanted to
57:30
be deleted from the database. But
57:33
then at some point they just stopped and
57:35
said, actually, we don't feel like we
57:37
need to comply with European privacy laws
57:39
because we don't do business in Europe anymore.
57:41
God. Yeah. They're like
57:44
ungovernable. Yeah,
57:47
in some jurisdictions, you can get the
57:49
company to delete you. In the
57:51
US, there are a few states that have
57:53
laws that say you have the right to
57:55
access and delete information that the
57:57
company has about you. California is one of
58:00
those states. If you live in California, you
58:02
can go to Clearview AI and
58:04
give them your driver's license and a photo of you
58:06
and they'll show you what they have of you in
58:08
the database. And if you don't like it, you can
58:11
say delete me. But there are
58:13
only a few states that have such a
58:15
law. For most of us, like here in New
58:17
York, we don't have that protection. So we can't
58:19
get out of Clearview's database. Facial
58:24
recognition is hard because these companies
58:26
are based in places that don't
58:28
have great privacy laws, like the
58:31
United States. And they're making people
58:33
around the world searchable. It
58:36
really is a hard problem. And in a larger
58:38
sense, as a country, a society, a
58:41
world, if we were like, we
58:43
just don't want this technology to exist, I know
58:45
this is kind of like a child's question, but
58:47
what would it look like to put the genie back
58:49
in the bottle? I mean, make it illegal, force
58:52
all companies to delete the algorithms. You have
58:54
to decide, are we talking about all facial
58:56
recognition, your iPhone opening when you look at
58:58
it? Or are we talking about
59:01
just these big databases that are searching for
59:03
your face among millions or billions of other
59:05
faces? I don't think that's going
59:07
to happen. I don't think it's going away. But
59:09
I do think we have this kind
59:12
of central question about facial recognition.
59:14
Should these companies have the right
59:16
to gather all these faces from the public internet
59:19
and make them searchable? I think that is
59:21
something that could be shut down
59:23
if we wanted it to be. Welcome
1:00:00
back. So
1:00:25
quickly before we go this week, we
1:00:28
are heading towards the end of season one
1:00:30
of Search Engine. Is
1:00:32
there going to be a season two of Search Engine? How
1:00:34
has season one gone? Great questions.
1:00:37
We will be answering them, all of them, and
1:00:40
whatever other questions you have about Search
1:00:42
Engine's present and future in
1:00:44
questionably transparent detail at our
1:00:46
upcoming board meeting. The
1:00:49
date is Friday, May 31st. We
1:00:51
will be sending out the details with the time and a
1:00:53
Zoom link to join. This is
1:00:55
only for our paid subscribers, people who
1:00:57
are members of incognito mode. If
1:01:00
you are not signed up, but you want to join this
1:01:02
meeting, you've got to sign up. You can
1:01:04
do so at searchengine.show. You get
1:01:06
a lot of other stuff too. You can read about
1:01:08
all the benefits on the website. Again,
1:01:10
that URL is searchengine.show. If you're a
1:01:13
paid subscriber, look out for an email
1:01:15
from us next week and mark your
1:01:17
calendar, May 31st, 2024. Search
1:01:27
Engine is a presentation of Odyssey and
1:01:29
Jigsaw Productions. Search Engine was created
1:01:31
by me, PJ Vogt, and Sruthi Pinnamaneni, and is
1:01:33
produced by Garrett Graham and Noah John. Fact-checking
1:01:36
this week by Holly Patton. Theme,
1:01:39
original composition, and mixing by Armen
1:01:41
Bazarian. Our executive producers
1:01:43
are Jenna Weiss-Berman and Leah Reiss-Dennis. Thanks
1:01:46
to the team at Jigsaw, Alex Gibney, Rich
1:01:48
Perrello, and John Schmidt, and
1:01:51
to the team at Odyssey. J.D. Crowley,
1:01:53
Rob Miranda, Craig Cox, Eric Donnelly,
1:01:55
Kate Hutchinson, Matt Casey, Maura Curran,
1:01:58
Beth Zunafranci, Kurt Courtney, and Hillary
1:02:00
Sheets. Our agent
1:02:02
is Oren Rosenbaum at UTA. Follow
1:02:05
and listen to Search Engine with PJ Vogt now
1:02:07
for free on the Odyssey app or
1:02:09
wherever you get your podcasts. Thank
1:02:12
you for listening. We will see you in two
1:02:14
weeks when...