460. AI, Internet Scams, and the Balance of Freedom | Chris Olson

Released Monday, 1st July 2024
1 person rated this episode

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

At Hyland, we're all about celebrating

0:02

little wins and

0:04

little ways to innovate digital processes.

0:07

There's no customer pain point too small for

0:09

us to help with. Maybe

0:11

that's why more than half of

0:14

the Fortune 100 looks to Hyland

0:16

to connect their content and data,

0:18

improve processes, and turn little efficiencies

0:21

into big wins for their customers

0:23

and clients. Highland

0:25

Intelligent Content Solutions for

0:27

innovators everywhere at hyland.com

0:44

Hello, everybody. Today, I have the

0:46

opportunity to speak with Chris

0:49

Olson, who's CEO of the

0:51

Media Trust Company. His

0:54

company is involved, occupies

0:57

the forefront of attempts to

1:00

make the online

1:02

world a safer place.

1:04

He mostly works with corporations to do

1:06

that, mostly to protect their digital assets.

1:08

But I was interested

1:11

in a more broad-ranging conversation discussing

1:14

the dangers of online

1:16

criminality in general. A

1:19

substantial proportion of online

1:22

interaction is criminal. That's particularly true

1:24

if you include pornography within that

1:26

purview because porn itself constitutes

1:29

about 20 to 25 percent of Internet

1:31

traffic. But there's all sorts of criminal

1:33

activity as well. And so Chris and

1:36

I talked about, for example, the people

1:38

who are most vulnerable to criminal activity,

1:40

which includes elderly people

1:42

who are particularly susceptible to

1:45

romance scams initiated

1:48

on dating websites, but then

1:50

undertaken off those sites and

1:54

also to phishing scams

1:56

on their devices that indicate, for example, that

1:58

something's gone wrong with the device and that

2:00

they need to be repaired in a manner

2:02

that also places them in the hands of

2:04

criminals. The sick and

2:06

infirm are often targeted with

2:08

false medical offers. 17-year-old

2:11

men are targeted with

2:13

offers for illicit drug

2:16

purchase and juvenile

2:18

girls, 14,

2:20

13, that age, who are interested in

2:22

modeling careers, for example, are frequently targeted

2:24

by human traffickers. This is a major

2:27

problem. The vast majority of elderly people

2:29

are targeted by criminals on a

2:31

regular basis. They're very well identified

2:33

demographically. They know their ages, they

2:36

know where they live. They

2:38

know a lot about their online usage

2:40

habits and they have

2:42

personal details of the sort that can

2:45

be gleaned as a consequence of continual

2:47

interaction with the online world. And so

2:50

I talked to Chris about all of

2:52

that and about how we

2:55

might conceptualize this as a society when

2:58

we're deciding to bring order to

3:00

what is really the borderless, the

3:03

ultimate borderless wild west

3:05

community. And that's the

3:07

hyper connected and

3:10

possibly increasingly pathological

3:12

online world. Join

3:15

us for that. Well,

3:17

hello, Mr. Olson. Thank you for agreeing to

3:19

do this. We met at the

3:21

presidential prayer breakfast not so long

3:23

ago and we had an engaging

3:25

conversation about the online

3:27

world and its perils. And I

3:29

thought it would be extremely interesting

3:31

for me and hopefully for everyone

3:33

else to engage

3:36

in a serious conversation about, well,

3:39

the spread of general criminality

3:41

and misbehavior online. And so

3:44

do you wanna maybe start by telling

3:46

people what you do and

3:48

then we'll delve more deeply into the general

3:50

problem? Great, yes. And thank you, Jordan.

3:52

Thanks for having me. I'm the

3:54

CEO and founder of the Media Trust

3:56

Company, not intended to be an oxymoron.

3:59

Our primary... My primary job is to help

4:01

big tech and digital media

4:03

companies not cause harm when

4:05

they monetize audiences and when they

4:08

target digital content. So

4:10

let's delve into the domains of

4:12

possible harm. So you're working with

4:14

large companies. Can you give

4:17

us who, like what sort of

4:19

companies do you work with? And then

4:21

maybe you could delineate for us the

4:24

potential domains of harm. Yeah,

4:27

so I work with companies that

4:29

own digital assets that people visit.

4:31

And I think maybe to set

4:33

a quick premise, cybersecurity

4:36

is a mature industry designed

4:38

to monetize the CISO, the

4:40

chief security officer, generally

4:43

protecting machines. So there's a

4:45

mindset geared to making sure

4:48

that the digital asset is

4:50

not harming servers, the

4:52

company or government data. Our

4:55

difference is that we're helping companies that

4:57

are in digital. So think big media

5:00

companies. We're helping them protect from harming

5:02

consumers, which is the difference between digital

5:04

crime, which is gonna target people, and

5:07

cybersecurity, which is generally targeting corporates and

5:09

governments and machines. So

5:12

now does your work involve protection

5:17

of the companies themselves also

5:19

against online criminal activity or

5:21

is it mostly aimed at

5:24

stopping the companies themselves from what would

5:27

you say, mostly I

5:29

suppose inadvertently harming their consumers

5:32

in pursuit of their enterprise

5:35

and their monetization? Yeah, so the

5:37

great question. And I think that's

5:39

where the heart of the matter

5:41

is. So our primary job is

5:43

to watch the makeup of what

5:45

targets digital citizens' devices. The

5:48

internet is made up of roughly 80% third-party code.

5:51

And what that means is when a consumer's

5:54

visiting a news website, when they're checking sports

5:56

scores, when they're visiting social media, the

5:59

predominance... of activity that's running

6:01

on their machine is coming from

6:03

companies that are not the owner of

6:05

the website or the mobile app that

6:08

they're visiting. That third-party

6:10

code is where this mystery

6:12

begins. So who actually controls

6:14

the impact on the consumer

6:17

when they're visiting an asset

6:19

that is mostly made up of source

6:21

code and content coming from other companies?

6:23

So our job is to look at

6:25

that third-party content to discern what is

6:28

good and bad based on company policies,

6:30

based on what might be harming the

6:32

consumer, and then informing those

6:34

companies what is violating and how they

6:36

can go about stopping that.
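
(To make the third-party-code point concrete, here is a minimal sketch, assuming a Python environment with the requests and beautifulsoup4 packages; it is not The Media Trust's tooling. It lists the third-party origins a page statically pulls script content from. A real scan would have to drive a full browser, since most ad code is injected dynamically after the ad auction runs.)

```python
# Sketch: list third-party hosts a page loads scripts or iframes from.
# Static HTML only; dynamically injected ad code, which is the bulk of
# the problem described above, requires rendering in a real browser.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup


def third_party_origins(page_url: str) -> set[str]:
    first_party = urlparse(page_url).hostname
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    origins: set[str] = set()
    for tag in soup.find_all(["script", "iframe"], src=True):
        host = urlparse(tag["src"]).hostname  # relative URLs give None
        if host and host != first_party:
            origins.add(host)
    return origins


if __name__ == "__main__":
    for origin in sorted(third_party_origins("https://example.com")):
        print(origin)
```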

6:39

What sort of third-party code concerns

6:43

might they face or have they faced? What

6:45

are the specifics that you're looking for? Maybe

6:48

you could also provide us with some

6:50

of the more egregious examples of the

6:52

kinds of things that you're ferreting

6:55

out, identifying ferreting it out and

6:57

attempting to stop. Yeah,

6:59

so I think putting any digital

7:01

company into the

7:03

conversation is critical. So

7:06

we're talking about tech support

7:08

scams and romance scams targeting

7:10

seniors. That is an

7:12

epidemic. If you're a senior

7:14

and you're on the internet on a regular

7:16

basis, you're being attacked, if not daily, certainly

7:18

every week. That

7:21

is now a cultural phenomenon. There's movies being

7:23

produced about the phenomenon of seniors

7:26

being targeted and attacked online. It's

7:29

teens. So a 17-year-old male is

7:31

being bombarded with information on how to buy

7:34

opioids or other drugs and having them shipped

7:36

to their house. If you're

7:38

a 14-year-old female and you're interested in

7:40

modeling, you're being approached by human traffickers.

7:43

The sick and infirm are frantically searching

7:45

the internet for cures. While

7:48

that's happening, they're having their life savings stolen.

7:51

So our job is to watch

7:53

that third-party content in code, which

7:55

is often advertising. It's basically

7:58

a real estate play on the internet. and what keeps

8:00

the consumer active on the digital asset to

8:04

find that problem and then give

8:06

it back to the company. I can

8:09

jump in quickly in how we go

8:11

about doing that. So we become a

8:13

synthetic persona. We've been doing

8:15

this for not quite two decades, but getting on

8:17

19 years. We

8:20

have physical machines in more than 120 countries.

8:23

We know how to look like a senior citizen, a

8:26

teenager, someone with an illness, and

8:28

then we're rendering digital assets as those

8:30

personas, acting more or less as a

8:33

honeypot to attract the problem that's

8:36

coming through the digital supply chain, which runs

8:38

on our devices. And I think that's gonna

8:40

be a key part of this conversation as

8:43

we go. Most of that action

8:45

is happening with us. And

8:48

so it's difficult for tech companies and media

8:50

companies to understand fully what's happening to us.

8:53

That's the point of their monetization, right? That

8:55

moment in time. So our job is to

8:57

detect these problems and then help

8:59

them make that go away.
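
(A hypothetical sketch of the synthetic-persona idea just described. The Persona fields, header choices, and scan_as helper are invented for illustration and are not The Media Trust's API; the real system renders pages in full browsers from physical machines in 120-plus countries so the ad auction actually runs.)

```python
# Hypothetical honeypot persona: present the traits of a vulnerable
# profile, fetch a page as that profile, and record what comes back.
from dataclasses import dataclass, field
from urllib.parse import urlparse

import requests


@dataclass
class Persona:
    label: str
    user_agent: str               # an older browser/OS profile
    accept_language: str          # locale feeds demographic targeting
    interest_cookies: dict = field(default_factory=dict)


GRANDMOTHER = Persona(
    label="85-year-old in a senior community",
    user_agent="Mozilla/5.0 (Windows NT 6.1; Win64; x64)",
    accept_language="en-US",
    interest_cookies={"segment": "retired,senior-living"},  # illustrative
)


def scan_as(persona: Persona, page_url: str) -> list[str]:
    """Fetch a page as this persona; return the hosts touched.

    Sketch only: a production scan lets the page execute, captures every
    network request it triggers, and classifies each served creative
    against policy (tech-support scams, illicit pharma, trafficking lures).
    """
    resp = requests.get(
        page_url,
        headers={
            "User-Agent": persona.user_agent,
            "Accept-Language": persona.accept_language,
        },
        cookies=persona.interest_cookies,
        timeout=10,
    )
    hops = [r.url for r in resp.history] + [resp.url]
    return [urlparse(u).hostname or "" for u in hops]
```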

9:02

Right, okay. So you

9:05

set yourself up as a replica

9:08

of the potential target of the

9:10

scams, and then you can deliver

9:12

the information that you gather about

9:14

how someone in that vulnerable position

9:17

might be interacting with the company's

9:19

services in question to keep the criminals

9:22

at bay. Let's go through these

9:25

different categories of vulnerability to

9:27

crime that you described. I

9:29

suspect there's stories of great

9:31

interest there. So you started

9:33

with scams directed at seniors.

9:35

So I've had people

9:37

in my own family targeted

9:39

by online scammers who

9:41

were in fact quite successful at

9:43

making away with a good proportion

9:45

of their life savings in one

9:47

case. And I know that seniors

9:51

in particular, who

9:53

grew up in an environment

9:55

of high trust, especially

9:57

with regards to corporate entities.

10:00

they're not particularly technologically

10:03

savvy, they're trusting.

10:05

And then you have the additional complication,

10:09

of course, in the case

10:11

of particularly elderly seniors, that their cognitive

10:14

faculties aren't necessarily all that they

10:16

once were. And

10:18

they're often lonely and isolated too. And

10:20

so that makes them very straightforward

10:23

targets for especially people, for people

10:25

who worm into their confidence.

10:29

You talked about, was it romance scams

10:31

on the senior side? It

10:33

is romance scams on the senior side.

10:36

Okay, so lay all that out, tell

10:38

us some stories and describe to everybody

10:40

exactly what they would see and how

10:42

this operates. Okay, so a senior is

10:45

joining a dating website, just

10:47

as a teenager or someone

10:49

in middle age would do, they're looking for

10:51

romance. There are people on

10:53

the other side of that, that

10:55

are collecting data on that senior,

10:58

potentially interacting with them. Once

11:01

they get enough information on that particular

11:03

senior, they're gonna start to find them

11:05

in other ways. Send me emails

11:08

and information, let's move off the dating site.

11:10

They're gonna start calling them on the

11:13

phone. As that starts

11:15

to evolve, it's that information collection,

11:18

getting them to do certain things online that sucks

11:20

them deeper and deeper in. From

11:22

that moment forward, they become very much

11:24

wedded and emotionally oriented towards that person

11:27

that they're involved with. And

11:30

the theft goes from there. Right,

11:32

so you go on a dating

11:35

website, as say someone in

11:37

your 70s, you're

11:40

lonely and looking for companionship.

11:44

There are scam artists

11:47

that are on those dating websites

11:49

as well, who must

11:51

have what? I

11:53

suspect they probably have keywords and profiling

11:55

information that enables them to zero in

11:58

on people who are likely

12:01

targets. Do you know how sophisticated is

12:03

that? Like, do you think that the

12:06

criminals who are engaged in this activity,

12:08

how good is their ability to profile?

12:10

Do you think they can identify such

12:12

things as early signs of cognitive degeneration?

12:15

I think this is organized crime and

12:17

they have their own algorithms and processes

12:19

to identify people. I

12:22

also, to your earlier point, people

12:24

believe what they see on computers.

12:26

They're following what's being provided to them,

12:29

which makes them relatively easy marks. So

12:32

once that process starts, they're reeling them

12:34

in. If they lose a fish, that's

12:36

no problem because they're going after so

12:39

many in any given day. They

12:41

also have infrastructure in local

12:43

markets to go deal with

12:45

people personally. So this is a

12:48

very large criminal organization that has a lot

12:50

of horsepower to identify and then attack. Right,

12:54

okay, so do you have any sense?

12:57

See, I hadn't thought about the full implications

12:59

of that. So obviously,

13:02

if you were a psychopathic scam

13:04

artist, posing

13:06

as a false participant

13:10

on a dating website would be

13:12

potentially extremely fertile ground, not only

13:15

for seniors who could be scammed

13:17

out of their savings, but you

13:19

also mentioned, let's say,

13:22

younger people who are on

13:24

the website who might be useful

13:26

in terms of human

13:28

trafficking operations. So

13:31

do you have any sense, for example, of

13:33

the proportion of

13:36

participants on

13:39

a given dating platform that are

13:41

actually criminals or psychopaths in disguise? Because

13:43

here, let me give you an example.

13:45

You undoubtedly know about this, but there

13:47

was a website, can't

13:50

remember the name of it, unfortunately.

13:53

I believe it was Canadian that

13:55

was set up some years ago to

13:58

facilitate illicit affairs. And

14:01

they enrolled thousands of

14:03

people, all

14:06

of whose data was eventually leaked, much

14:09

of that to great scandal. The

14:11

notion was to match people

14:13

who were married secretly with other people

14:15

who were married to have illicit affairs.

14:18

They got an awful lot of men

14:20

on the website and almost no women.

14:22

And so they created tens

14:25

of thousands, if I remember correctly,

14:27

fake profiles of women to

14:29

continue to entice the men to maintain what

14:31

I believe was a monthly

14:35

fee for the service. Ashley

14:37

Madison, it was called. Right,

14:40

and so obviously... Starting

14:45

a business can be tough, but thanks to

14:47

Shopify, running your online storefront is easier than

14:49

ever. Shopify is the global commerce platform that

14:52

helps you sell at every stage of your

14:54

business. From the launch your online shop stage,

14:56

all the way to the, did we just

14:58

hit a million orders stage, Shopify's there to

15:01

help you grow. Our marketing team uses Shopify

15:03

every day to sell our merchandise, and we

15:05

love how easy it is to add more

15:07

items, ship products, and track conversions. Shopify helps

15:10

you turn browsers into buyers with the internet's

15:12

best converting checkout, up to 36% better compared

15:15

to other leading e-commerce platforms. No matter

15:17

how big you want to grow, Shopify

15:19

gives you everything you need to take

15:21

control and take your business to the

15:24

next level. Sign up for a $1

15:26

per month trial period at shopify.com/JBP. Go

15:28

to shopify.com/JBP now to grow

15:31

your business, no matter what

15:33

stage you're in. That's shopify.com/JBP.

15:39

Our dating website would

15:41

be wonderful hunting grounds for

15:43

any kind of predator. And

15:46

so do you have any sense of what proportion

15:48

of the people who are participating

15:51

on online dating sites are actually

15:53

predators, criminals? I

15:55

don't know what percentage of the

15:58

participants on the sites are predators,

16:00

but where we come in

16:02

in our expertise is

16:04

that everyone that is visiting is

16:07

giving information into sort of the

16:09

digital ecosystem. And so

16:11

the issue from there is that

16:14

they're then able to be targeted wherever

16:16

they go online. So there's information that's

16:18

being collected from the site that they're

16:20

visiting. That is then moving out

16:22

into the ecosystem so that wherever they go,

16:24

they're being pulled back and targeted. In

16:27

an example, like in Ashley Madison, a

16:30

criminal may be able to get the

16:32

digital information about the people whose data

16:34

was stolen, come back to

16:36

them six months later, coming from

16:38

another website via email or SMS

16:40

text, and then press the attack

16:42

at that stage. For

16:44

us, in becoming a digital persona, our

16:47

job is to look like someone

16:50

based on the information that sites

16:52

have collected about them. So we

16:54

look like an 85-year-old

16:57

grandmother living in a senior

16:59

community. When you become that type of

17:01

profile, no matter who else is engaging

17:03

with you online, the algorithm and the

17:05

content that is gonna be served to

17:07

you is coming from criminals,

17:10

regardless of their activity on that particular

17:12

site that you're visiting. It's simply based

17:14

on who you are. So

17:17

artificial intelligence has been around in

17:19

use for digital media

17:21

and targeting people since 2010-11. So

17:25

the initial, initial use case was

17:28

collecting data on us. That was

17:30

the key initial step for AI

17:32

utilization. The second step

17:34

was then turning that around and

17:37

targeting people better, right? So AI

17:39

was first used to collect information,

17:42

make things interesting behind

17:44

the scenes for people. Second, creating

17:46

better audience segments, which enable

17:49

that targeting. This

17:51

third phase that's happening today, you

17:53

see ChatGPT and the LLMs

17:56

being used in regular use. The

17:58

third big stage is writing content

18:01

on our devices on the fly. So

18:04

regardless of where the criminal actor

18:06

is, regardless of how they're moving

18:08

into the ecosystem and what initial

18:11

buying point, they're able to

18:13

find that person, write content on the

18:15

fly that's particularly tailored to what the

18:18

digital ecosystem knows about them to

18:20

create the situation where they then respond and

18:22

the criminal activity can occur. Right,

18:25

and so what that implies is well

18:27

then, I suppose is that we're

18:32

going to see very sophisticated LLM

18:35

criminals, right? Who

18:38

will be able to, this is the

18:40

logical conclusion of what you're laying out is

18:43

that they'll be

18:45

able to engage, huh, so I just

18:49

saw a video, it's

18:52

gone viral, it was released about three weeks

18:54

ago that portrayed the

18:57

newest version of ChatGPT

19:01

and it's a version that can see you through

19:04

the video camera on

19:06

your phone and can

19:08

interact with you very much

19:11

like a person. So they

19:14

had this ChatGPT

19:16

device interacting with

19:19

a kind of unkempt, nerdy

19:22

sort of engineer character who was

19:25

preparing for an interview, a job

19:28

interview and the ChatGPT

19:30

system was coaching him on

19:33

his appearance and his presentation and

19:35

I think they used Scarlett

19:37

Johansson's voice for the

19:40

ChatGPT bot. It

19:42

was very, very flirtatious,

19:44

very intelligent, extremely perceptive

19:48

and was paying attention to this engineer

19:51

who was preparing his

19:54

interview like

19:56

a, what would you say,

19:58

like the girlfriend of his dreams would, if

20:00

he had someone who was paying more attention to

20:02

him than anyone had ever paid to him

20:05

in his life. And so

20:07

I can imagine a system like that set up

20:10

to be an optimal criminal, especially

20:12

if it was also

20:14

fed all sorts of information about

20:16

that person's wants and likes. So

20:19

let's delve into that a little bit. How

20:22

much of a digital footprint do

20:24

you suppose? How well are each

20:26

of us now replicated online

20:29

as a consequence of the criminal

20:32

or corporate aggregation of our online

20:35

behavior? So the

20:37

typical senior, for example, how much

20:39

information would be commonly available to

20:43

criminal types about, well, the typical

20:45

senior, the typical person, typical 14-year-old

20:47

for that matter? Right. The

20:50

majority of their prior activity

20:53

that they've engaged in online.

20:55

So corporate digital

20:57

data companies, their

20:59

job is to know as much

21:02

about us as possible and then

21:04

to target us with information to

21:06

maximize profit. That's the core goal.

21:10

Criminals have access to that data and

21:12

they're leveraging it just like a big

21:14

brand advertiser would. So

21:16

they know it's a grandmother and they're going to

21:18

put in something that only runs on the grandmother's

21:21

device, which makes it very, very difficult for big

21:23

tech and digital media companies to see the problem

21:26

before it occurs. I think

21:28

another thing that's really important to understand

21:30

is this is our most open border.

21:33

So we've got an idea of national

21:35

sovereignty. There's

21:38

lots of discussion on whether or not our

21:40

southern border is as secure as it should

21:42

be. Our actual devices,

21:44

our cell phones, our televisions, our

21:46

personal computers are open

21:49

to source code and information coming from

21:51

any country, any person at any

21:53

time, and typically resolved

21:55

to the highest bidder. Right.

21:58

Right. So the digital world,

22:01

the virtual world, is

22:03

a... is

22:05

it a lawless frontier? I mean, I guess

22:07

one of the problems is, like, if I'm

22:10

targeted by a criminal gang in Nigeria, what

22:13

the hell can I do about that? I mean, take

22:15

the case I mentioned to you of my relative

22:18

who was scammed out of a

22:20

good proportion of their life

22:22

savings. That gang

22:25

was operating in Eastern Europe. We

22:27

can more or less identify who they were, but

22:30

there is really nothing that could be

22:32

done about it. These

22:35

are people who are operating well out

22:37

of any physical proximity, but also even out

22:39

of, hypothetically, the jurisdiction

22:41

of, well, say, lawmakers

22:43

in Canada, police services

22:46

in Canada. And so how

22:50

lawless, how

22:52

is it, how should we be conceptualizing the

22:54

status of law in

22:58

the online

23:00

and virtual world? Yeah,

23:02

and I think this is where

23:04

the major rub is. So I'm

23:06

gonna walk back and talk about

23:09

cybersecurity as an industry first. So

23:11

cybersecurity is relatively mature. It

23:14

is now geared to monetizing the chief

23:16

security officer, the chief information security

23:18

officer. What that means is

23:21

it's providing products and services designed

23:24

to protect what they are paid to

23:26

hold dear, which is the corporate

23:28

asset, so the machines and the data for

23:30

the corporation. If you

23:32

are part of the government, which is where we're going to

23:34

go in the conversation, then your job

23:36

as a CIO or a CISO is to

23:39

protect government machines. Governments will

23:41

tell you that they're protecting you, right, to

23:43

protect you from digital harm. What

23:45

that means today is they're protecting your

23:47

data on the DMV website.

23:49

That's basically the beginning and the end

23:52

of cybersecurity and digital

23:54

protection. There's legislation

23:56

which is occurring, coming from attorneys

23:58

general, from states, from

24:01

the federal government in the US to a degree,

24:04

other countries seem to be further ahead, seeking

24:07

to protect people from data collection.

24:09

And that's your GDPR in

24:12

Europe. Many states in the

24:14

United States are putting

24:16

some rules in place around what corporations can

24:18

collect, what they can do with the data. The

24:21

predominant use case is to provide a consumer

24:23

with an opt-out mechanism. Most

24:25

consumers say, okay, I wanna read the content,

24:28

they're not doing a whole lot with the

24:30

opt-out compliance. So that's not

24:32

been a big help

24:34

to your typical consumer, but

24:36

it's really the mindset that's the problem

24:38

and the mindset of corporate and government

24:40

that is at issue. And

24:42

so governments need to tactically engage

24:46

on a 24/7 basis with

24:48

digital crime in the same way that

24:50

they're policing the street. So

24:52

the metaphor would look like this. If

24:55

grandmothers were walking down the street and being

24:57

mugged or attacked at the rate that

24:59

they're getting hit online, you would

25:02

have the National Guard policing every

25:04

street in America. The

25:07

government needs to take a step forward. And when I

25:09

say the governments, that is, governments need

25:11

to take a step forward and do

25:13

a better job at policing people tactically.

25:16

And that does not mean that they're going after

25:18

big tech or digital media companies. It

25:21

means that they're protecting people with the

25:23

mindset that they're gonna

25:25

go ahead and cooperate with the

25:27

digital ecosystem to do a better

25:29

job, to reduce overall crime. Right,

25:32

so your point appears to be

25:34

that we have

25:37

mechanisms in place, like the ones that are

25:39

offered by your company that

25:43

protect the

25:45

corporations against the liability that

25:48

they would be laden with

25:51

if the data on their servers

25:53

was compromised. But that is by no

25:56

means the same thing as

25:58

having a police force that's

26:00

accessible to people, individual people who are

26:02

actually the victims of criminal activity. Those

26:04

aren't the same things at all. It's

26:06

like armed guards at

26:09

a safe in a bank compared to police on

26:11

the street that are there

26:13

to protect ordinary people or who can be called. Have

26:16

I got that about right? Yes,

26:18

and digital crime is crime. So

26:20

this is when you're stealing grandmother's money,

26:24

that is theft. We don't

26:26

need a lot of new laws. What we

26:28

need to do is actively engage with the

26:30

digital ecosystem to try to get

26:32

in front of the problem to reduce overall

26:35

numbers of attacks, which reduces the number

26:38

of victims. And to date, when

26:40

we think about digital safety, it's

26:43

predominantly education, and then

26:45

increasing support for victims. Victims

26:48

are post-attack. They've already had their

26:50

money stolen. Getting in front

26:52

of that is the key. We've got to start

26:54

to reduce digital harm. I've

26:56

been doing this for a good number of

26:58

years, and the end of that conversation does

27:00

reside with local and state

27:03

governments. And ultimately, the federal government

27:05

in the United States is gonna

27:07

have to find resources to actively

27:09

protect beyond having discussions about legislating

27:11

data control or social media as

27:14

a problem. Okay, so I'm trying to

27:16

wrestle with how this is possible, even

27:19

in principle. So now you

27:21

said that, for example, what your company does

27:23

is, and we'll get back into that, is

27:25

produce virtual victims, in a

27:28

sense, false virtual victims, so that you can

27:30

attract the criminals, so that you can see

27:32

what they're doing. So I

27:34

presume that you can report on what you find

27:36

to the companies so that they can decrease the

27:40

susceptibility they have to exploitation by

27:42

these bad actors. But that's

27:45

not the same thing as actually tracking

27:47

down the criminals and holding them responsible

27:49

for their predatory activity. And

27:52

I'm curious about

27:54

what you think about how that's possible, even

27:56

in principle, is first of all, these criminals

27:58

tend to be, or can even easily be,

28:01

acting at a great distance in jurisdictions

28:04

where they're not likely to be held

28:07

accountable in any case, even by

28:09

the authorities, or maybe they're even

28:11

the authorities themselves. But also, as

28:14

you pointed out, more and more, it's

28:17

possible for the criminal activity

28:19

to be occurring

28:21

on the local machine. And

28:23

so that makes it

28:26

even more undetectable. So I don't

28:29

see, I can't

28:31

understand easily, you obviously in a

28:33

much better position to comment on this, how

28:36

even in principle, there

28:38

can be such a thing as, let's say

28:40

an effective digital police force. Like even if

28:42

you find the activity

28:45

that someone's engaged in and

28:47

you can bring that to a halt by

28:49

changing the way the data is handled,

28:52

that doesn't mean you've identified the criminals

28:54

or held them accountable. So what,

28:58

if anything, I can't understand how that

29:00

can proceed even in principle. Sleep

29:04

is a foundation for our mental and physical

29:06

health. In other words, you've got to have

29:08

a consistent nighttime routine to function at your

29:10

best. But if you're struggling with sleep, then

29:12

you've got to check out Beam. Beam isn't

29:14

your run of the mill sleep aid. It's

29:17

a concoction carefully crafted to help you rest

29:19

without the grogginess that often accompanies other sleep

29:21

remedies. A bunch of us here at The

29:23

Daily Wire count on Beam's dream powder to

29:25

knock us out and sleep better through the

29:27

night so we can show up ready for

29:30

work the next day. Just mix Beam Dream

29:32

into hot water or milk, stir or froth,

29:34

and then enjoy before bedtime. Then wake up

29:36

feeling refreshed without the next day grogginess caused

29:38

by other sleep products. Dream contains a powerful

29:41

all-natural blend of Reishi, magnesium, L-theanine, Apigenin, and

29:43

Melatonin to help you fall asleep, stay asleep,

29:45

and wake up refreshed. And with it now

29:47

being available in delicious flavors like cinnamon cocoa,

29:49

chocolate peanut butter, and mint chip, Better Sleep

29:51

has never tasted better. And today, listeners of

29:54

this show get a special discount on Beam's

29:56

dream powder. Get up to 40% off for

29:58

a limited time when you go to shopbeam.com/Peterson

30:00

and use code Peterson at checkout. That's shopbeam.com,

30:02

and use code Peterson for up

30:04

to 40% off. So

30:10

the digital ecosystem is made up of

30:12

a supply chain, just like every other

30:14

industry. There are various steps that a

30:16

piece of content is gonna go through

30:19

before it winds up on your phone. So

30:21

it's running through a number of different companies,

30:24

different cloud solutions, different servers

30:27

that put content out. Okay,

30:29

they're intermediaries. And

30:31

so a relationship between those digital

30:33

police with the governments and those

30:35

entities on a tactical basis is

30:38

really the first step. Seeing

30:40

crime and then reporting that back

30:42

up the chain so that

30:44

it can be stopped higher and higher up

30:47

towards ultimately the initiation point

30:49

of where that content is delivered.
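
(A toy model of that "reporting up the chain" step, with invented names: a flagged creative reached the device through a chain of intermediaries, and the report walks back hop by hop toward the initiation point so each party can cut the source off upstream.)

```python
# Toy model: report a policy violation back up the delivery chain.
from dataclasses import dataclass


@dataclass
class Hop:
    name: str
    abuse_contact: str


# Ordered from the initiation point down to the consumer-facing site.
# Every name here is invented for illustration.
DELIVERY_CHAIN = [
    Hop("origination-buyer.example", "unknown"),  # criminal's entry point
    Hop("reseller.example", "abuse@reseller.example"),
    Hop("ad-exchange.example", "abuse@ad-exchange.example"),
    Hop("publisher-site.example", "trust@publisher-site.example"),
]


def report_violation(chain: list[Hop], finding: str) -> None:
    # Start at the publisher, closest to the device, and notify each hop
    # upstream; the last reachable hop identifies the buying account.
    for hop in reversed(chain):
        print(f"notify {hop.name} ({hop.abuse_contact}): {finding}")


report_violation(DELIVERY_CHAIN, "tech-support scam served to senior persona")
```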

30:52

So it seems fantastic, but it is

30:55

possible. Well, the criminals need

30:57

to have, they need

30:59

to use intermediary processes in

31:01

order to get access to

31:03

the local devices. And so

31:05

you're saying that I believe

31:07

that those

31:09

intermediary agencies

31:11

could be enticed, forced,

31:14

compelled, invited to make

31:17

it much more difficult for the criminals to

31:19

utilize their services. And I guess

31:21

that might actually be effective. But

31:23

that still doesn't... does

31:26

that aid in the identification of the actual criminals

31:28

themselves? Because I mean, that's the advantage of

31:30

the justice system, right, is you actually get

31:32

your hands on the criminal at

31:35

some point. Yes, and I think

31:37

ultimately it does. So

31:40

you have to start and you have to start

31:42

to build the information about where it's coming from.

31:45

You then have to cooperate with the private entities.

31:48

Our digital streets are managed and made up

31:50

of private companies. It's not a government run

31:53

internet. All of the information that's said to

31:55

us, at least in Western society, is

31:57

coming from these private companies. So

32:00

I think rather than having an antagonistic

32:02

relationship between governments and private companies where

32:04

they're trying to legislate to put them

32:06

into a position, that may be appropriate

32:09

for certain rules and regulations. It may

32:11

be appropriate to raise the age of

32:13

accessing social media from 13 to 16 or 18. And

32:18

that is a proper place for

32:20

the government to be legislating. On the

32:22

other hand, an eye towards

32:24

reducing crime is critical. And

32:27

the ethical and moral mindset among

32:30

all of the parties, and that's

32:32

governments through our corporations, has to

32:34

be solely on protecting people. And

32:37

I think that's something that is significantly

32:39

missing. It's missing in the

32:41

legislation. It's missing in cybersecurity.

32:44

It's not something that we've engaged

32:46

in as a society. So

32:48

there are a few countries, and

32:50

I think even a few states in the US,

32:53

that are looking at a broader whole of

32:56

society approach. That whole

32:58

of society approach is a mimicking

33:00

of how the internet and the digital

33:02

ecosystem works, which is certainly a whole

33:05

of society activity, right? So

33:07

it is the thing that influences and affects

33:09

all of us every single moment

33:11

of every single day. Engaging in

33:13

that, looking across the impact of society

33:15

and doing better via cooperation

33:18

is a critical, critical next step.

33:21

How often do you think the typical

33:23

elderly person in the United States

33:26

say, is being

33:29

successfully, no, is being

33:31

first communicated with by

33:34

criminal agents, and then how often

33:37

successfully communicated with?

33:41

What's the scope of the problem? The

33:43

scope is, if you're a senior citizen, in

33:47

particular, if you're a female senior citizen, roughly 78

33:49

to about 85 years old, we

33:52

see that two and a half to 3% of

33:55

every single page impression or app

33:57

view is attempting to,

34:00

to target you with some form of crime or

34:03

influence that's gonna move you towards crime. So

34:06

it is highly, highly

34:08

significant.
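
(Rough arithmetic on that figure, with an assumed browsing volume; only the 2.5 to 3 percent rate comes from the conversation above.)

```python
# If 2.5-3% of page or app views shown to a 78-85-year-old woman carry an
# attack attempt, even modest browsing produces attempts every single day.
DAILY_PAGE_VIEWS = 60    # assumption: roughly an hour of casual browsing
ATTACK_RATE = 0.025      # low end of the 2.5-3% range quoted above

attempts_per_day = DAILY_PAGE_VIEWS * ATTACK_RATE
print(f"~{attempts_per_day:.1f} attack attempts per day")  # ~1.5
print(f"~{attempts_per_day * 7:.0f} attempts per week")    # ~10
```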

34:11

In some ways, looking at this, it is shooting fish in a

34:13

barrel to make a dent. So

34:15

you're concerned that the legal system isn't gonna be

34:17

able to find the criminals. There

34:20

is so much to detect and stop and

34:23

so much room to turn them off

34:25

quickly, right? That we

34:27

can gain a significant reduction in

34:29

digital crime by working together

34:31

and considering society as a whole instead of

34:33

the different pockets and how can we legislate

34:35

or how can we try to move a

34:37

private company to do better on their own?

34:41

Okay, so let's now let's, okay. So we talked

34:43

a little bit about the danger that's

34:45

posed by one form of con game

34:50

in relationship to potential criminal

34:53

victims and that was senior romance

34:55

scams. What

34:57

are the other primary dangers that are posed to

34:59

seniors? And then let's go through your list. You

35:01

talked about 17 year olds who

35:04

are being sold online access to drugs.

35:06

That includes now by the way, a

35:08

burgeoning market in under

35:11

the table hormonal treatments

35:16

for kids who've

35:19

had induced gender dysphoria. So

35:21

you talked about seniors, 17 year olds who

35:24

are being marketed illicit drugs, 14

35:27

year olds who are being enticed into let's

35:29

say modeling and people who are sick and

35:31

infirm. So those are four major categories. Let's

35:34

start with the seniors again, apart from romance

35:36

scams, what are the most common forms of

35:38

criminal incursion that you see? The

35:41

most common form is the tech

35:43

support or upgrade scam. And

35:46

essentially the internet knows that

35:48

you are senior. When you're

35:50

going to a website that you and I would visit, instead

35:53

of having a nice relationship with that site and

35:55

reading the content and then moving on to something

35:57

else, you're getting a pop-up or some bit

36:00

of information that's telling you there's something wrong

36:02

with your computer. You either need to call

36:04

a phone number or you need to click

36:07

a button, which then moves you down to

36:09

something else that is

36:11

more significant. This

36:13

is happening millions and millions and

36:15

millions of times per day. And

36:18

it is something that we can all

36:20

do something about. Attempting to educate

36:22

seniors to try to not listen to the

36:25

computer when it's telling you to do something

36:27

is not working. So that-

36:29

Well, no wonder. I mean, look, to

36:32

manage that, it's so

36:34

sophisticated because once

36:36

you've worked with computers for 20 years, especially if

36:38

you grew up with them, you

36:40

know when your computer, your

36:43

phone, is telling you something that's actually

36:45

valid and when it isn't, it doesn't

36:47

even look this, a lot of these

36:49

criminal notifications, they don't even look right.

36:51

They look kind of amateurish. They

36:53

don't have the same aesthetic that you'd

36:56

expect if it was a genuine communication

36:58

from your phone but

37:00

man, you have to know the ecosystem to be able

37:03

to distinguish that kind of

37:05

message from the typical thing

37:08

your phone or any website might ask

37:10

you to do. And educating seniors, it's

37:12

not just a matter of describing to

37:14

them that this might happen. They

37:17

would have to be tech-savvy cell

37:20

phone users and it's hard enough

37:22

to do that if you're young, much

37:24

less if you're outside that whole

37:27

technological revolution. So I can't see

37:29

the educational approach. The

37:32

criminal is just gonna outrun that as fast as

37:34

it happens. So yeah, so that's pretty,

37:36

so 3%, hey, that's a lot.

37:40

That's about what you'd expect. Yeah,

37:42

it is highly significant. And

37:45

I think getting in front

37:47

of this problem requires cooperation

37:49

with states, moving

37:51

that tactically to have the idea of a

37:53

police force looking at digital. And I think

37:56

one of the things that both sides, whether

37:58

it's private companies or states, It's

42:00

the most advanced AI at your

42:02

fingertips. Expand your world

42:04

with Meta AI. Now

42:07

on Instagram, WhatsApp, Facebook, and

42:09

Messenger. Yeah,

42:13

for us, there

42:16

is a path that leverages that content

42:18

to bring it to the device. And

42:20

I think understanding that mechanism and how

42:22

it's brought forward versus looking at the

42:25

content, and I'll give you an example

42:27

of what's happening in political advertising as

42:29

we speak, understanding

42:32

the pathway for how that content is

42:34

delivered is ultimately how we get back

42:36

to the criminal or the entity that's

42:39

using that to perpetrate

42:41

the crime. The actual creation

42:43

of the content is incredibly difficult to

42:45

stop. It's when it moves out to

42:47

our devices that it becomes something that

42:50

we need to be really paying attention to. So

42:53

in political advertising up to October

42:56

of this past year, our

42:58

customers asked us to flag the

43:00

presence of AI source code. So

43:03

the idea there was they didn't want to

43:05

be caught holding the bag of being caught

43:07

being the server of AI generated

43:10

political content, right?

43:12

Because that just, it looks bad in the news.

43:14

Someone's letting someone use AI. It's going to wind

43:17

up being disinformation or some form of deep fake.

43:20

By October, we essentially stopped

43:22

using that policy because we

43:24

reached the point where greater than 50% of

43:28

the content that we were scanning had some

43:30

form of AI. It may have been to

43:32

make the sun a little more yellow, the

43:34

ocean a little bit more blue, but

43:36

using that as a flag, right? To

43:39

understand what's being delivered out, once

43:41

you get over 50%, you're looking at more than

43:45

you're not looking at. That's not a good

43:47

automated method to execute on digital safety.
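
(The arithmetic behind retiring that flag, as a toy sketch with invented volumes: a triage flag is only useful while it is rare, and once it fires on more than half of all creatives it no longer narrows anything.)

```python
# Once "contains AI source code" fired on >50% of scanned political
# creatives, flagging meant reviewing the majority of all traffic.
def review_load(total_ads: int, flag_rate: float) -> int:
    """Creatives sent to human review if every flagged item is checked."""
    return round(total_ads * flag_rate)


for rate in (0.02, 0.10, 0.55):
    print(f"flag rate {rate:.0%}: {review_load(100_000, rate):,} of 100,000 to review")
```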

43:51

So as we move forward, we have a

43:54

reasonably sophisticated model to detect

43:56

deep fakes very much

43:58

still in a test mode, but it's

44:00

starting to pay some dividends.

44:03

And unquestionably what we see is

44:06

using the idea of deepfakes to create

44:08

fear is significantly greater

44:10

than the use of deepfakes. Now

44:13

that's limited to a political advertising

44:15

conversation. We're not seeing a lot

44:17

of deepfake serving in

44:19

information or certainly not in the paid

44:21

content side, but the

44:24

idea of fearing what's being delivered

44:26

to the consumer is

44:28

very much becoming part of a mainstream

44:30

conversation. Yeah,

44:33

well, wasn't there some

44:35

insistence from the White House

44:37

itself in the last couple of weeks that

44:39

some of the claims

44:42

that the Republicans were making with

44:44

regards to Biden were a

44:47

consequence of deepfake audio, not

44:49

video, I don't think, but audio? If

44:51

I got that right, does that story ring a

44:53

bell? And I think where

44:56

we are at this stage in technology

44:58

is very likely there is plenty of

45:00

deepfake audio happening around the candidates. So

45:02

whether you're Donald Trump or Joe Biden,

45:04

or even local political

45:07

campaigns, it's really that straightforward.

45:10

I think on the video side, there are gonna be people

45:13

working on it left and right. I

45:15

think it's the idea of using that as a weapon

45:18

to sow some form of

45:20

confusion among the populace. Some doubt, right?

45:22

Some doubt is gonna be dramatically more

45:25

valuable than the actual utilization of deepfakes

45:27

to move society. Oh, that's, you do,

45:29

eh? So you do think

45:31

that even if the technology develops to the

45:33

point where it's easy to use, so

45:36

you think that it'll be weaponization

45:38

of the doubt that's sowed by

45:40

the fact that such things exist.

45:43

And we've been watching this

45:45

for a very, very long time, and our

45:47

perspective is coming at this from a digital

45:49

crime and a safety

45:52

in content. Safety in content typically

45:54

means don't run adult content in

45:56

front of children, don't serve weapons

45:59

in New York. They're not gonna

46:01

like that. Don't have a

46:03

couple walking down the beach in Saudi

46:05

Arabia. Right, their ministry of media is

46:07

gonna be very unhappy with the digital

46:09

company that's bringing that kind

46:11

of content in. Eye-of-the-beholder safe

46:14

content, drugs and alcohol, right? Targeting the

46:17

wrong kinds of people. So we

46:19

look at this from a lens

46:21

of how do you find and remove

46:24

things from the ecosystem? If

46:26

we continue down the path that we're on today,

46:28

most people won't trust what

46:31

they see. And so we're discussing education.

46:33

They're gonna self evolve to a point

46:35

where so much of the information that's

46:37

being fed to them is just gonna

46:39

be disbelieved because it's gonna be safer

46:41

to not go

46:43

down that path. I'm

46:46

wondering if live events, for

46:48

example, are going to become

46:50

once again, extremely

46:54

compelling and popular because they'll be the only events

46:56

that you'll actually be able to trust. I

47:00

think so, frankly, cause I mean, you're- I

47:02

think it's also critical that we find a way

47:05

to get a handle

47:07

on kind of the anti-news and get

47:09

back to the entities

47:11

promoting trust in journalism, that

47:16

is a very meaningful conversation and it is something that

47:18

we need to try to get back to. It's

47:20

much less expensive to have automation

47:22

or create something that's gonna create

47:25

some kind of situation where people

47:27

continue to click. That's

47:29

a terrible relationship with the digital ecosystem.

47:31

It's not good for people to have

47:33

that in their hand. And

47:36

with the place where digital crime

47:38

is today, if you're a senior

47:40

citizen, your relationship is often net

47:42

negative with the internet.

47:45

Right, you may wanna stick to calling your kids

47:48

on voice-over-IP where you can see their

47:50

face. Lots of different ways to do that

47:52

in video calling, but doing other

47:54

things on the internet, including things as simple

47:56

as email, it

47:59

may be more dangerous. about

1:16:00

the machine. How are

1:16:02

you feeling about your chances of control

1:16:06

over or our

1:16:08

chances for that matter of

1:16:10

control over online criminality and

1:16:12

how successful do you believe

1:16:15

you are in your attempts

1:16:17

to stay on

1:16:20

top of and ahead of

1:16:22

the criminal activity that you're

1:16:24

trying to fight? For

1:16:26

our customers that prioritize digital safety,

1:16:31

the vast majority of what might run

1:16:33

through to attack someone is

1:16:35

being detected and removed. They

1:16:38

need to have the appropriate mindset. They need to

1:16:40

be willing to go up onto the

1:16:42

demand source to remove bad

1:16:45

activity that's gonna be coming down. You don't

1:16:47

just wanna play whack-a-mole. You have to engage

1:16:49

in that next step. Those

1:16:52

that do are very successful and

1:16:55

create safe environments. It is not possible to

1:16:57

make this go away. The

1:16:59

pipes, the way that the internet works, the way the

1:17:02

data targeting works, it's just not something you can eliminate

1:17:04

entirely. But there are companies that

1:17:06

are in front of this that will

1:17:08

withhold millions of dollars in

1:17:10

revenue at any given moment to

1:17:12

prevent the possibility of

1:17:14

targeting something and having something bad

1:17:17

happen. But there are a

1:17:19

lot of companies that are not willing to go that

1:17:21

far. I think right now in

1:17:24

some of the bigger companies, we see a lot

1:17:27

of risk towards this, who's gonna

1:17:29

win the ChatGPT, who's

1:17:31

gonna win the LLM race. There

1:17:34

is so much at stake in

1:17:36

that from a competitive and revenue

1:17:38

perspective. The companies

1:17:40

that can monetize that the best

1:17:44

are going to start to leap forward. When

1:17:46

you're looking at the world from a, how does

1:17:48

my technology win versus how do I safely get

1:17:50

my technology to do the things that I want?

1:17:52

That's when you start to run a lot of

1:17:55

risk. We're in a risk

1:17:57

on phase and digital right now. But

1:18:00

your earlier claim, I think, which is

1:18:02

worth returning to, was that over

1:18:06

any reasonable period of time, there's

1:18:08

the rub, the companies

1:18:10

that do what's necessary to

1:18:13

ensure the trust

1:18:15

of what you say, to

1:18:20

ensure that their users can trust

1:18:22

the interactions with them are going

1:18:24

to be the ones that are

1:18:27

arguably best positioned to maintain

1:18:30

their economic advantage in the years to come.

1:18:33

And I think- That's the problem. Yes,

1:18:36

and those that are willing to

1:18:38

engage with governments to do

1:18:40

a better job, to ultimately find the bad

1:18:43

actors and take them down, they're

1:18:45

going to be a big part of making the ecosystem

1:18:47

better, rather than insulating and hiding

1:18:50

behind a sort of risk legal regime

1:18:53

that's going to not want to bring data forward to clean

1:18:55

up the ecosystem. Okay, okay,

1:18:57

okay. Well, for anybody watching

1:18:59

and listening, I'm

1:19:01

going to continue my discussion

1:19:03

with Chris Olson on the Daily Wire

1:19:05

side of the interview. I'm

1:19:09

going to find out more about, well,

1:19:11

how he built

1:19:13

his company and how his interest in

1:19:16

prevention, understanding, and preventing online crime

1:19:19

developed, and also what

1:19:21

his plans for the future are. And

1:19:23

so if those of you who are watching and listening are

1:19:26

inclined to join us on the Daily Wire side, that

1:19:28

would be much appreciated. Thank you

1:19:30

to everybody who is watching and listening for

1:19:32

your time and attention. And thank you very

1:19:34

much, Mr. Olson, for, well, fleshing

1:19:37

out our understanding of the

1:19:40

perils and possibilities that

1:19:42

await us as the internet rolls

1:19:44

forward at an ever increasing rate.

1:19:46

And also for, I would say,

1:19:48

alerting everybody who's watching and listening

1:19:51

to the, what

1:19:53

would you say, the particular points of

1:19:55

access that the online criminals

1:19:57

have at the moment. when

1:20:01

we're in our most vulnerable states,

1:20:04

sick, young, seeking,

1:20:08

old, all of those things, because

1:20:10

we all have people, we all know people

1:20:13

who are in those categories and are looking

1:20:16

for ways to protect them against the people that

1:20:18

you're also trying to protect us from. So thank

1:20:20

you very much for that. Thank

1:20:22

you, thanks for having us, me. You

1:20:25

bet, you bet. And again, thanks

1:20:27

to everybody who's watching, listening to the film

1:20:29

crew down here in Chile

1:20:31

today in San Diego, thank you very much

1:20:34

for your help today, guys, and to the

1:20:36

Daily Wire people for making this conversation possible.

1:20:38

That's much appreciated. Thanks very much,

1:20:40

Mr. Olson. Good to talk to

1:20:42

you. Thank you. When

1:20:45

you find a deal on your

1:20:47

favorite thing in the McDonald's app

1:20:49

and order it, does that technically

1:20:51

count as online shopping? Save

1:20:54

money with the app.

1:20:56

Ba-da-ba-ba-ba. At participating McDonald's,

1:20:58

prices may vary.
