The FTX Finale + A European Spyware Bombshell + Roblox Buys Cheating Company

Released Thursday, 16th November 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

A

0:00

little over a year ago, right

0:02

as Super

0:04

Bowl ads were spreading the gospel

0:06

of a new era of finance akin

0:09

to the invention of the wheel, we

0:11

started to see hints, Scott. Hints

0:14

that a group of nerds in a penthouse in the

0:16

Bahamas had allegedly set off a

0:18

financial landmine of their own design

0:21

that would vaporize billions of dollars,

0:24

real and imagined, of other people's

0:26

money. And

0:28

you know what today is? The

0:30

reckoning?

0:31

Today is the day we can stop saying allegedly.

0:37

That is true. One of the joys of being any

0:39

kind of media is that you have to use the

0:41

word allegedly a lot.

0:43

You say it a lot, but not today. Not

0:47

us. Last Thursday, Sam Bankman-Fried,

0:49

co-founder of the cryptocurrency exchange FTX,

0:52

was found guilty on a bunch of charges,

0:54

including fraud, conspiracy,

0:56

and money laundering. We got to get into it. It's

0:58

a big day. It is. It

1:01

is a big day. We're going to do that and a few

1:03

other stories. Some interesting game

1:06

cheat story that came to us from our Discord.

1:08

So thank you, Void, for that one. I have to cover

1:11

it. Fascinating, fascinating topic.

1:13

Yeah, it's a really interesting one. An

1:16

unstoppable force and an immovable object

1:18

in the world of video game shooting. It's

1:20

a great story.

1:22

And then I want to talk about a little story

1:24

that came out of Europe: spyware regulation, an

1:27

Amnesty International report into something called the Predator

1:29

Files. It is a pretty wild

1:31

story. FTX, crypto,

1:35

Europe spyware, Roblox. Let's

1:38

get into all of it on this chatty episode

1:40

of

1:41

Hacked.

1:55

Scribbledy boop bop

1:57

boom. We're

2:00

doing drum and bass now. We're doing drum and bass for the theme music.

2:02

It's drum and bass. It's fast. 160 beats

2:04

per minute. Strap in. Strap in. Get

2:07

ready. Come with us. Come

2:09

with us. All the time. Yeah.

2:12

How you doing, man? We recorded 360 BPM and

2:14

then slowed it down. Is this true?

2:16

You got that woozy... Yeah, totally. Yeah.

2:19

I'm good. I'm good. How

2:22

you doing? I'm doing good. It's been a second since we did

2:24

one of these. We had our great conversation

2:26

with Jack Rhysider from Darknet Diaries.

2:29

Check it out if you missed it. And then I went on vacation.

2:31

You did. You did. I

2:34

went on vacation. And now I'm back. I'm

2:36

currently on vacation. I

2:39

am out on vacation. I'm very close to you. I've

2:41

been living your Pacific Northwest

2:44

rain lifestyle and I got to say it. I don't

2:46

love it. I don't love it. Not for

2:48

me. It's rainy, huh? It's very, very

2:51

wet and cold. And I don't

2:53

dig it. Yeah. It's

2:56

a lot. You want to get the sun lamp. You

2:58

want to get that vitamin D. We've talked

3:00

about this before, but you got to take steps. You

3:04

don't have to deal with snow, but you do have to deal with seasonal

3:06

depression on the scale I've never really experienced before.

3:09

So it kind of all comes out in the wash. The

3:11

thing I've been noticing is where

3:13

you are in Vancouver, you know, Arc'teryx,

3:16

Gore-Tex jackets. See,

3:18

I'm even further west and it's like

3:20

you can tell the tourists because we're all

3:22

in Arc'teryx and Gore-Tex jackets

3:25

and the locals are literally

3:27

wearing deep sea fishing like

3:29

outfits. Full rubber

3:31

rain suits, you know, huge

3:34

bib pants, like massive

3:37

rain boots. It's like they pulled all the

3:39

stops out. Only the tourists are the ones who were

3:41

like, oh, Gore-Tex will keep me dry. And

3:44

they're like, no idea. Yeah. You

3:46

need those, yeah, those giant fishing

3:49

boots that go up to your shins. I

3:52

know what you're talking about. That makes a lot of sense. Waders.

3:55

You need waders, basically. Yes. You

3:57

need Paddington Bear. You need the whole outfit. It's

4:00

a little bucket hat. That's

4:04

a good look. That's a very west

4:06

coast look, that's a very far-side-of-the-island

4:08

look. I'm glad

4:10

to have you out here even if only for a little while. Oh,

4:13

thanks, buddy We

4:16

should thank our patrons. Jordan, over to you.

4:19

They are really lovely

4:21

patrons. And we should

4:23

probably talk about the merch store, which has been

4:25

going great. The store's got a Hacked Podcast hat. I know

4:27

I love hats. Enjoy your hat. No,

4:30

I love a mug, a bunch. We get

4:32

people buying it, I don't mind. Hacked merch really is

4:34

so sick. You get a bunch of orders every day,

4:36

which is amazing. Love to see it. Hope

4:39

you guys love the merch Merch

4:41

is like one of those things that I know we could have

4:43

just slapped the logo on something But we wanted to

4:45

like there's some fun designs in there We

4:48

wanted to work with our designer to make something cool And

4:51

that's part of why it took such a dang long time. Seeing

4:54

that people are still down, are still

4:56

excited for merch, means a lot. Almost

5:00

as much, equal to, as

5:03

much as it means when you support us on Patreon. Hacked

5:05

podcast.com redirects right to Patreon. Great way

5:07

to support the show. Oh, we got some new patrons

5:09

me too. We've got a list to shout out. I think I would like

5:11

to thank Jacob

5:14

Jacob Jacob Casper

5:16

Moncie. Thank you so

5:18

much. Thank you so much And

5:21

means a lot You've left it

5:23

just just Thank

5:25

you P the letter P You

5:28

know just the letter lowercase lowercase

5:30

letter P you are there's like a swagger

5:33

of One letter lowercase.

5:36

It's like Q on Star Trek. I'm not

5:38

like a lowercase. I'm that Thank

5:41

you so much. Means the

5:44

world to us. Broderick Duncan. Thank you. Thank

5:48

you, Broderick Becca

5:50

Sanchez. Thank you so much. Micah. I

5:52

think set a Micah. I think like I like

5:54

my apologies. Okay? My my my my

5:57

is messing up the name, so it's I'm glad this time

5:59

it might be you You got me. This is we

6:01

support one another during this trying time. This is Micah.

6:03

Thank you so much I think these next two were pretty

6:05

good on Quincy Z Love

6:09

you and Diego Fierro. I think

6:12

I think it's hard to mess up Quincy and

6:14

Diego Diego

6:16

Fierro congratulations. That's a That's

6:20

a name. I'm a big appreciator of Of

6:23

a name with flow and Diego Fierro and

6:26

Quincy, you know one of my favorite fruits

6:29

I love a quince. Also a big

6:31

Quincy Jones guy. Thriller.

6:35

Thank you all so much. Support the

6:37

show by kicking on over to Patreon,

6:40

hackedpodcast.com, to find our Patreon. And if we

6:43

if we missed you... Or buy a bucket hat. Yeah, a bucket

6:45

or a visor. Buy a bucket. It's

6:48

all good. If we did miss

6:50

you as a patron, we apologize.

6:52

Fire us a note, DM us

6:55

on X, or fire

6:57

us a note in the Hacked Discord. But

7:00

yeah, if we did miss anyone, we greatly

7:02

apologize. But I think just

7:05

aside from that we should maybe plug

7:07

Hotline Hacked. Yes

7:11

So we're sitting on some

7:14

killer content. We got to decide

7:16

when we want to drop the first one But

7:19

we've gotten enough that we're gonna at least

7:21

do a Hotline Hacked. Maybe

7:24

a couple and if you want to get your strange

7:26

tale of technology in. It can be

7:28

a hack you were a part of it can be a funny

7:30

little, little computer whoopsie-

7:32

doodle that you did. If you just got a funny,

7:35

interesting story about technology that you want

7:37

to share with us: hotlinehacked

7:39

.com. You

7:41

can see a ChatGPT-generated website

7:45

that is in no way shape or form optimized

7:47

for mobile. Very proud of it. It

7:50

took me minutes to make and I think

7:52

I have a future in web dev. Yeah,

7:55

hotlinehacked.com, check it out. And

7:57

yeah, let's just go with

8:00

the show. But

8:02

yeah, it seems like it's been a while. We

8:04

recorded that episode with Jack prior to Halloween.

8:07

I think that was the last episode that we recorded.

8:09

So it feels like it's been a hot

8:12

minute since we've done one of these. So it's

8:14

nice to be back behind the mic. Yeah,

8:17

a lot of stuff has happened. We

8:19

got to talk about the FTX trial. That

8:21

one's pretty crazy. There's some stories that

8:24

have started but haven't quite gotten to a point

8:26

to talk about it. In

8:28

the States, the executive order. President

8:30

Joe Biden signed an executive order on October 30th

8:33

to establish the first substantial US

8:35

regulation on AI. We're

8:37

not really talking about that today because it

8:39

hasn't really fully happened yet. But

8:43

I did want to bring up the fact that the White House

8:45

revealed that a pretty big influence

8:48

on this like most recent push was the

8:51

bad guy in the new Mission Impossible

8:53

movie. I'm

8:55

not sure if you had followed this story or

8:58

had seen the new Tom Cruise motion

9:00

picture Mission Impossible. You

9:03

know, I haven't. So please enlighten me with

9:06

how this is more of a joke than it already sounds.

9:10

It's pretty much all right on the surface. In

9:13

that movie, the bad guy and spoiler

9:15

alert, if you haven't seen

9:18

the new Mission Impossible, the

9:20

villain is a sentient AI. And

9:23

apparently, amongst other

9:26

things, that movie

9:28

was a big part of why there is now

9:30

a push for substantial US regulation

9:33

governing artificial intelligence. And

9:35

that's really fun to me because that's the first part

9:37

of a two part movie.

9:41

It's the first of two and the second one hasn't come

9:43

out yet, which makes me feel like there's probably an immense

9:45

amount of pressure on Tom Cruise to get that

9:48

right, given that it seems to be informing

9:50

domestic policy. It's

9:53

funny that they've completely overlooked

9:55

the warning signs that were given to us in

9:58

1984 in Terminator. Yeah,

10:01

and they're only paying attention now to Mission

10:03

Impossible It's

10:06

got that star power, you know what I mean?

10:08

They saw yeah Hey,

10:11

our Arnie was like a senator

10:13

wasn't he? He was a senator Yeah,

10:16

he was California. Yes. He was

10:18

exactly. See, he knew

10:20

our great AI-fighting movie

10:22

stars are in the game. We

10:25

didn't need to bring that up, but I just thought it was fun. That

10:28

was a good time. That

10:30

is a good one. Yeah, so we're good. We're gonna follow

10:32

that story I think a little bit more seriously

10:35

as it starts to develop but

10:37

I feel like we're entering into that that

10:39

first era of regulations

10:42

and guidelines governing AI. It's gonna be a whole

10:44

big mess. We're gonna talk about it as it unfolds

10:46

a good contrast

10:49

to the Policy and

10:51

you know the regulations that they're putting

10:53

on AI. They're doing it quite quickly when you

10:55

compare that to something like crypto. Which

10:58

this episode will be speaking of

11:00

yeah speaking of The

11:03

you know, we're decades in

11:05

at this point, or at least one, and

11:07

still no regulation. Surprising how that

11:09

works. Maybe Tom Cruise needs to make

11:12

a crypto movie. That's cursed

11:14

energy you just put out into the world. Yeah. Most

11:19

people run from that, Tom. Yeah,

11:24

I'm sure it's bound to happen. And

11:26

I see you have a story in here that suggests

11:28

I think that, on the far

11:30

side of the FTX trial, which we're gonna get to, we're gonna

11:33

start seeing more of it Yeah increased

11:35

financialization of crypto by the financial

11:37

industry that it was trying to get away from seems

11:40

to be maybe the new actor But

11:42

we'll get there we'll get there we get it there

11:44

we'll get there we get there the What

11:48

else. Let's talk about, quick, the Humane AI

11:51

pin have you seen this thing? I have seen

11:53

this thing I followed it. I watched

11:55

the very very very dry

11:58

announcement video. That, like, anti-salesmanship

12:01

launch video is pretty great. So

12:05

yeah, a company called Humane, helmed

12:08

by ex-Apple employees, yeah, Humane, helmed

12:10

by ex-Apple employees, Imran Chaudhri

12:12

and Bethany Bongiorno, which

12:15

had been sort of operating a little bit in stealth mode

12:17

with this product that everyone knew the basic

12:19

idea behind it. It was a, my

12:22

phrase not theirs, a screenless smartphone

12:26

empowered by artificial

12:28

intelligence. Everyone knew they'd

12:30

been working on it. Everyone knew that's kind of where it was going. It finally got announced

12:33

last week. We get the big kickoff

12:35

video and the takes, the

12:37

hot takes, they are a coming. What do you think of

12:40

this thing? Honestly?

12:42

Yeah. You here for it? I

12:45

think it's, yeah, I think it's. You

12:47

already have one. You bought it in the midnight black.

12:50

Yeah, I've already bought it, yeah. I'm talking

12:52

to it right now. It's actually recording my podcast

12:54

for me. I

12:59

think it's a neat concept. I just don't

13:01

know if it's gonna

13:03

be the smartphone killer. Like it seems like

13:07

a 10th of the product or

13:09

a hundredth of the product of a smartphone. Sure.

13:12

So it's

13:13

like, obviously we have Siri and

13:16

one of these days Siri will be GPT

13:19

enabled and will be much more functional.

13:22

And it seems like what they've done

13:24

is create essentially a pin on pseudo

13:27

Siri. Like, I don't know. I

13:31

see this being a much

13:33

worse implementation than a smart glasses

13:36

or a smartphone

13:39

or a, I don't know, it

13:41

just feels half cut. And

13:45

I just don't know how it's gonna compete with some of the

13:47

larger, more sophisticated solutions.

13:51

Yes. So for anyone that hasn't seen

13:53

this product, it's a pin with,

13:55

I think they call it a Laser Ink. It's a small

13:58

projector on it, essentially. The pin, it's on

14:00

your lapel, it listens to you,

14:02

you talk to it primarily through voice commands.

14:06

When you do need to see something, it has this little projector

14:08

that you put your hand in front of. That's,

14:11

that's kind of the whole product here. And I guess if I was

14:13

trying to think of the most generous thing I could

14:15

say about it, because I

14:17

like a big swing. I

14:20

think this is if you are a person that struggles

14:23

with chronic phone addiction, and you need

14:25

to have a small portable computer with you.

14:28

This is a really interesting way of taking

14:30

the addicting part, the screen out of a phone

14:32

and trying to preserve as much of the utility as humanly possible.

14:36

They're using so that's, that's okay,

14:39

I can see that being useful for that person.

14:42

In order to empower that, the big idea

14:44

here is what if we used GPT

14:47

style AI, natural voice

14:49

computing, as like a layer of

14:51

mediation between you and the device, instead of

14:53

having to take the device out and actually use it, you can

14:56

talk to it using GPT,

14:58

literally chat GPT as a service.

15:01

That's a cool, that's a really, really

15:04

cool idea. There's

15:06

a thing in like tech product design about

15:08

don't build features, build products.

15:11

And that to me, unfortunately feels like a feature

15:14

that in a generation of Android or two,

15:17

they'll have just woven in all of the bard

15:19

style natural, like language

15:21

computing into the phone at that level, to

15:24

the point that you could just talk to it like it's a humane

15:26

pin totally and have it work the same

15:28

way. The

15:31

issue for me is that like, when

15:34

I do need that really

15:36

high information bandwidth

15:39

thing that only a screen can offer, like if I

15:41

need a map with five locations

15:44

pegged on the map that I can see all at once, when

15:47

I need to be able to see seven products with their prices,

15:50

and I want to be able to see them all at the same time, that

15:52

really high information bandwidth thing. I just

15:55

want the thing to have a screen on it totally. Like,

15:58

could you... So I'm not sure how this improves on that. Imagine

16:00

booking an Airbnb with this thing. Hell.

16:04

Hell. Actual hell. Yeah.

16:07

Like, hey, I need an Airbnb in Vancouver, in the West

16:09

End. Yeah. And they'd be

16:11

like, found these 76 listings. Yeah. I

16:14

have... Can I describe them to you? Can I describe

16:16

them to you? Well, one

16:18

of the example

16:20

cases they give is like, you know, you can talk

16:23

to it and have it do things. Like, go through my contacts

16:25

list and find this person's phone number. Go

16:27

through my mailbox and find this piece of data.

16:31

But it's like, what if it doesn't do it well? Like

16:33

what if it gives you your

16:36

confirmation code for your flight from three

16:38

years ago rather than your confirmation code from the flight

16:40

from this week? And it's like, and there's no... All

16:43

of a sudden you're like having an argument with this

16:46

four inch piece of glass that's pinned to your coat.

16:49

I just... Yes. I don't know. And

16:53

further to that point, why isn't that something

16:55

that I could say into my AirPod or

16:57

my Google Bud? Totally. Like that

17:00

way of interacting with the computer is

17:02

a forward looking,

17:05

very humane, gets you off your phone

17:07

feature. It's not... I

17:09

haven't seen proof that this needs to be its own product.

17:12

It's a great feature at an OS level

17:14

for phones.

17:18

I think that question of like, well, what is the unique

17:20

use case that this empowers? I

17:23

was reminded anytime people talk about an iPhone

17:25

killer, my brain goes back

17:28

to the famous like the iPhone launch event

17:30

where he does the thing where he says it's

17:32

an iPod, it's a phone, it's a

17:35

revolutionary internet communicator. And he keeps

17:37

saying those three things and you realize he's talking about one

17:39

product, the iPhone. Boom, we have mobile

17:41

computing. That's how that event

17:43

starts with these three use

17:45

cases that are now in one device. This

17:48

video starts with the colors

17:50

that it's available in. Like

17:53

they're not... It's like you haven't... You

17:55

don't know what that killer app is, that use

17:57

case, that reason I have to have

17:59

this.

18:00

as opposed to a phone.

18:04

Unless I just can't

18:06

deal with the responsibility of having a screen in my pocket,

18:08

which is real. I'm not belittling

18:10

that for some people. I think this

18:12

has real utility of just, you can't

18:15

handle owning a phone, try

18:18

this out, fair enough. I

18:21

wonder why it's not on your wrist. That seems

18:23

maybe a little more useful, but that's just a minor

18:25

detail. It does have, yeah,

18:27

exactly. It does have a camera. It

18:29

does have other little functions,

18:32

but yeah, to me, I don't

18:34

know. It feels, when

18:37

I think of Meta Ray-Bans, the

18:39

ultimate output of the Meta Ray-Bans

18:41

is essentially this

18:45

built into a set of glasses, but also with

18:47

the ability to show you basic

18:49

pieces of information. Have a basic HUD,

18:52

or the ability to not

18:54

project into your

18:56

hand in single color,

18:59

some form of functionality.

19:02

It's like, I don't know. Yeah,

19:04

I agree with you on the function versus product

19:06

thing. This does feel like

19:09

when Siri and, like, Google

19:11

AI, get fully integrated, that

19:14

you'll be able to, like, say Hey Siri

19:16

on your AirPods, and then boom, oh no, my

19:18

iPhone's lit up. The-

19:22

A lot of people's iPhones just lit up and I don't believe

19:24

that. But

19:28

the, yeah, I don't know. I just can't see it, I

19:30

don't know. Maybe I'm missing

19:33

some extensible nature. Like the thing that made the

19:35

iPhone so successful was the App Store, right?

19:37

The ability to take it and use it as a platform.

19:40

And I'm seeing lots of those platform-ish

19:44

devices. Those are the ones that really changed

19:46

the world. I just don't know how you would

19:48

use this as a platform. Suppose you could

19:51

ingrain new UIs and GPTs

19:54

and, or sorry, AIs into it, but

19:57

yeah, I don't know. It's interesting.

20:01

But if I had to bet on it, I probably

20:03

wouldn't. I

20:05

imagine that I think you nailed it.

20:10

I think that GPT is an intermediary layer between

20:12

you and your computer. It's

20:15

a cool idea that will slowly be implemented

20:17

in devices that will be probably much more successful.

20:20

And as a form factor goes, I would much

20:22

rather have a pair of glasses I can pop

20:24

on. Because the amount of fidelity

20:27

and information that you're getting out of that laser ink

20:29

projector feels kind

20:32

of about as much information as we could probably

20:34

get into a heads-up display on a transparent pair

20:36

of glasses in the next five years. I'd

20:40

much rather just see it in the corner of my

20:42

eye. You

20:45

brought up the App Store, and that's maybe worth one last thing we touched on before

20:47

we dive into some other stories, and

20:50

they launched GPTs, their

20:52

version of essentially an App Store. You

20:55

can now create these little custom versions of Chat

20:57

GPT that take prompts, instructions,

20:59

information you've given it, and save it as

21:01

its own little agent you can come back to. Pretty

21:04

cool. You know

21:06

what? I've got to take a victory lap on this, son.

21:09

I feel like I called this. I feel like

21:11

I called this. Domain-specific

21:14

GPTs, you just sense it's

21:16

going to come. So soon there will be an

21:18

HR policy GPT that you buy

21:21

a subscription to that will write all your HR

21:23

policies, like basic

21:25

legal contract review, stuff like that.

21:30

You can see this

21:32

coming a mile away, and I think it's actually brilliant,

21:35

honestly, to train a domain-specific

21:38

AI to really know the intricacies

21:41

of a specific piece

21:43

of language. This makes sense. Yeah.

21:46

The second this got rolled out publicly,

21:48

so it's only available for folks that have, I

21:50

think, GPT Plus or Pro or whatever they call

21:52

it, I have a little

21:55

folder that has emerged over the last while of

21:57

saved prompts, basically.

22:00

It's very useful to be able to write

22:03

out a paragraph, slowly finesse it over time,

22:05

and then just keep coming back to it. Say you want to make

22:07

notes about an article or process

22:10

a large piece of text down into a specific format. Day

22:14

one, I just immediately took all those prompts. One

22:16

by one created GPTs based on each one of them,

22:18

saved them as little agents, and now

22:21

they're just there. It's

22:23

a small thing, but it really does radically

22:25

change that user experience of

22:27

working with these chatbots because now I suddenly have these little

22:30

programs, these little apps I can use. Sure,

22:32

that have retained the context of the discussions

22:35

you've had with them. Exactly. Yeah.
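[Editor's note: a minimal sketch of the "saved prompt reused as a little agent" idea described above, assuming the OpenAI Python SDK and a hypothetical note-taking prompt. It illustrates the workflow the hosts describe; it is not their actual setup, nor the GPT builder itself.]

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A prompt refined over time and saved for reuse -- the "little agent."
SAVED_PROMPT = (
    "Condense the article I paste into tight bullet-point notes, "
    "grouped by theme, with a one-line takeaway at the end."
)

def run_agent(article_text: str) -> str:
    # The saved prompt acts as standing instructions; the article is the per-use input.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": SAVED_PROMPT},
            {"role": "user", "content": article_text},
        ],
    )
    return response.choices[0].message.content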

22:38

I noticed when you give it your

22:40

prompts, it's not just doing a copy paste.

22:42

It actually refines the prompt. It

22:45

asks you a series of questions that it then uses

22:47

to refine your original prompt to more

22:50

accurately give it instructions

22:52

to itself. Smart. For lack of

22:54

a better way of putting it. It is refining

22:56

itself to make sure that you're getting the result

22:59

it understands you want, not just doing the

23:01

letter of the initial prompt, which

23:03

is I think different than when you just feed it the

23:05

prompt. So far, pretty good. Even

23:08

I think four or five months ago, you

23:10

started seeing a lot of TikTok videos and things

23:12

coming out with people who

23:14

were trying to tell you the best way to use

23:17

chat GPT. A lot of them was starting

23:20

to interact with it, being like, ask

23:22

me the questions that you need to

23:24

know to write me the best cover letter for this

23:26

job application. Yeah. And

23:29

then it asks you a prompt of 10 questions and then

23:31

you answer those and then bang, like, knocks

23:34

you out this perfect cover letter. And it's like, okay, like

23:36

I, you could see where we were going for

23:38

sure. I'd been off social

23:41

media just because I was on vacation.

23:43

But when I came back to it, it was right around

23:45

when those were being announced. And

23:48

the number of people who

23:50

at some point had been like all about dropshipping

23:53

and then we're all about creating internet. We're

23:55

all about that. But who

23:57

had like moved on to. This

24:00

is the new app store. It doesn't require

24:02

any code and here's how everyone's

24:05

gonna make all the money in the world. And then I went

24:07

and made one. I was like, nah, dog, this is too easy. The

24:09

barrier of entry for this is way

24:12

too low for this to be the new grift. There's

24:14

zero friction to making this thing.

24:17

Like

24:18

it's just too easy to make a GPT

24:21

for you to be able to do any kind of real hustle.

24:24

Aside from like selling courses,

24:26

teaching people how to make GPTs for money

24:29

or something like, there's no actual utility

24:31

here, I think. Yeah. As

24:33

a money making grift, I think it's just a good

24:36

useful thing to wire GPT into

24:38

itself and let you save the results. But

24:40

I don't think it's gonna be, it might be a bit of a

24:42

gold rush, but I don't know. I think you'll

24:44

see some really, just like everything, right?

24:47

Like there's, for every YouTube

24:49

channel that's successful, there's a 10

24:51

million of them that aren't. And I think you'll

24:53

see that here. You'll see a lot of garbage get

24:55

created quickly, probably around like

24:59

marketing stuff, like writing content for

25:02

websites that give fake

25:04

reviews about products so that they get referral commissions.

25:07

I think you'll see a lot of that usage. But

25:10

also I think you'll see some people

25:12

take it and really drive it, you know,

25:15

like really train an AI on a specific

25:18

domain topic that most people spend money

25:20

on, like legal help and

25:23

HR stuff. And you

25:26

pick one of those relative

25:28

categories that cost you, every time you pick up the

25:30

phone and call your lawyer, it costs you $1,100. All

25:33

of a sudden, if you can replace that

25:37

in a small sense, like obviously not entirely,

25:39

but just if you need something quick

25:41

and you can replace it for like 30 bucks

25:44

a month, like why not? Yeah. Why not?

25:46

Yeah, there was, I think a lot of startups,

25:49

I think that's true and like cataclysmic

25:52

for a couple of companies. Where

25:54

there were, I'd spoken with a couple

25:57

of people who were doing

25:59

some kind of a, kind of an AI based startup

26:02

that was basically just a chat GPT

26:04

wrapper with a different training set like

26:06

woven into it. Sure. Like,

26:09

oh, that's a bad day. That's

26:11

a bad day for those companies. Yeah, you just got

26:14

got like, oof. Because

26:18

those just got really easy to make like

26:20

GPT wrappers are no longer a thing

26:22

you can monetize in the

26:24

same way. There will definitely be a market for it. Some

26:27

people will get very rich. I

26:29

can go to venture capitalists and raise $50 million

26:32

for my AI company. I

26:35

don't think that's going to work the same way today

26:38

that it did yesterday. You are completely

26:40

out of the Microsoft world, right? You're entirely Mac? I

26:42

am. Yes, I am not.

26:45

I don't currently own any Windows device.

26:47

Yes, I have a few. And

26:50

I've gotten a new preview release of Windows 11. I

26:52

feel like this just touches into this. And

26:55

it's got their AI built

26:57

into it now. So you

26:59

get this, why can I not remember what it's

27:01

called... Copilot. So

27:03

you essentially have chat GPT woven

27:06

into Edge and the OS

27:09

at a base level. So you can just ask it all kinds

27:11

of things. So this is going to be

27:13

interesting. How do you find

27:15

it? Because I've heard mixed results about

27:17

the first gen of it. Is it

27:19

like, wow, what a brilliant idea executed

27:21

at about 20% operating like

27:23

potential. That's pretty much how I find

27:26

it. Let's just say that I hid it away and

27:28

put it away pretty quick. But

27:31

the idea is that eventually

27:34

it will be very functional. And

27:36

that's going to be interesting. It's

27:39

going to be an interesting little revolution we go through here

27:41

as we figure out the uses for these things

27:43

as well as figure out how to best

27:45

implement them into our life and into our existing

27:48

processes. Like I will say, if

27:50

you do go on social media, the

27:52

amount of obviously AI

27:54

generated images I now see is crazy.

27:58

Like you can... missing fingers or the

28:01

spelling's wrong on a t-shirt or something. You

28:03

can just tell that they're AI generated. But every

28:07

garbage marketing post you see or

28:09

memes, tons of them are AI generated

28:11

now. So just saving

28:15

so much time in generation

28:17

and artistic design, I guess. People

28:20

are not doing them anymore. They're just getting them generated.

28:22

Yeah. We

28:25

mentioned earlier that we were on vacation recently. Partway

28:28

through the trip, a little background game emerged,

28:30

which is

28:32

is the photo in that giant billboard or

28:35

sign or whatever we're looking at of

28:37

real people? As

28:40

we were in Mexico City, big density, lots

28:42

of outdoor advertising.

28:45

A lot. Like a lot, dude. I

28:48

was pretty bowled over once I started trying

28:50

to pay attention to it of like, no, that

28:52

doesn't look real. That was generated. Yeah.

28:57

Like that the way that they've all sort of because

28:59

so many of the images in the training sets are of

29:02

models, their understanding of what

29:04

humanity looks like is far too pretty. Like

29:06

everyone has the cheekbone structure and

29:09

just like that squint, the model squint.

29:12

It's kind of uncanny. And then you start really

29:14

paying attention to it. You start to

29:17

see it everywhere. And it's like, well, it

29:19

makes sense that eyeglass

29:22

store would have hired models. So the

29:24

fact that GPT produced humans

29:26

that look like models is OK. But why

29:28

does the guy squinty eyes have so many wrinkles

29:30

around them that are like looping all the way back

29:32

around the eyeball into the forehead? Like weird

29:35

stuff that doesn't actually happen.

29:38

I guess we're probably only a generation of AI

29:41

away from that not really being recognizable. I think

29:43

it's going to become increasingly less recognizable. This

29:46

is this sweet spot where there's all these, like, first

29:48

computer ghosts peppering our,

29:52

like, out-of-home advertising ecosystem. It's great.

29:55

It was a good time. Well, I don't think it's

29:57

just in advertising. Like, the whole... The

30:00

demise of body image

30:02

is upon us, but it

30:05

started with Snapchat filters and

30:07

Instagram filters, and is now AI makeovers.

30:11

So you upload your photos to these things and

30:13

these AIs essentially make

30:15

you perfect. And then it's like then you just

30:17

use that as your profile photo. And it's like that is not

30:19

what you look like. The

30:22

new Pixel phone. There's

30:25

like a lot of great discussion

30:27

about like what is a photo, but the

30:29

new Google Pixel phone really, like, poured

30:31

a lot of kerosene on that discussion with

30:34

the inclusion of AI. They're

30:36

increasingly adding AI photo

30:39

editing features into the photos

30:41

app, which is one step

30:44

away from being added to the shutter button,

30:47

where when you press the button, you are not capturing

30:50

something based on what really happened.

30:52

But all of that face smoothing and

30:55

you know, subtle aesthetic tweaks

30:57

are moving increasingly like it used

31:00

to be, you would take a photo, you would

31:02

export it out of the photos app into the editing app.

31:04

Now it's moved out of the editing app into the photos

31:07

app, it will inevitably move into

31:09

the shutter button itself. Where

31:11

you're never really capturing, you're never opening

31:14

a sensor capturing light and closing it. You're

31:16

kind of just gathering information with which to synthesize

31:19

a really nice looking image. And

31:21

that's like a really big meaningful shift

31:23

in like what it means to take a photo.

31:26

Because for most people, they just want a good photo of their family.

31:29

And if everyone's been like, had Vaseline

31:31

smeared over the lens by an artificial intelligence

31:34

to make sure that they're looking at the camera and they're smiling

31:36

and you can see their teeth and there's the right number of them.

31:39

For most people, that's a cool thing. But it is it

31:41

is weird and it is different. Just

31:43

like whitens the teeth, straightens them,

31:46

replaces the eyes, smooths

31:49

any of the wrinkles, it's

31:51

like it adds makeup, just color neutralizes

31:54

and balances you, takes any sheen

31:56

away, maybe does your hair for you. Yeah,

31:59

it's very great.

33:59

Yeah,

34:01

that's selfie that first

34:03

time you see the selfie cam not flipped

34:06

in reverse. Oh, you know what I mean? We're like this

34:08

where zoom reverses you so that

34:10

it doesn't break your brain and then you see yourself

34:12

just filmed and you're the wrong direction.

34:15

Yeah. What you're used to. Yeah, just

34:17

the sort of like baby version of that. Why

34:20

don't we move on to something somehow

34:23

less dystopian than that

34:25

ordeal. A vast international

34:29

spyware network. Predator

34:32

files. So Amnesty International a couple

34:35

of weeks ago, we never got a chance to talk about this, published

34:38

this really incredible investigation

34:42

into this series of spyware attacks

34:44

using a piece of spyware called Predator that

34:46

had targeted not government

34:48

people, not just government folks, members

34:51

of civil society, journalists, politicians

34:53

and academics throughout the European Union,

34:55

the United States and Asia.

34:58

There's a couple of really interesting things to

35:01

this story that

35:03

are why I wanted to talk about it. So

35:06

story concerns something called the Intellexa Alliance.

35:09

All of the names here are ominous. Try and keep track.

35:11

They're the Europe based developer of

35:15

Predator and a series of other surveillance

35:17

based derivatives of this piece of

35:19

spyware that have according

35:22

to, I'm just going to borrow the quote from Agnès Callamard,

35:25

secretary general at Amnesty International

35:28

have done quote nothing to limit who was

35:30

able to use the spyware and for what purpose.

35:35

The report published October 9th by Amnesty International

35:37

Security Lab outlined

35:40

how it targeted though not necessarily infected

35:42

members of US Congress, president

35:45

of the European Parliament Roberta Metsola,

35:48

Taiwan president Tsai Ing-wen,

35:50

and German ambassador to the US Emily Haber.

35:53

There is a much longer list. Big fancy

35:55

people all targeted through this. The

35:58

investigation, done in partnership with the European

36:01

Investigative Collaborations and backed in

36:04

depth by additional reporting by Mediapart and Der Spiegel,

36:09

dug into the mechanics of how Predator works:

36:11

it is a zero-day based highly invasive

36:14

spyware that gives

36:16

access to the device's microphone, camera,

36:19

all of its stored local data,

36:21

contacts, messages, photos, videos, and

36:24

the user will be unaware that it has been infiltrated

36:26

by this. Between February

36:29

and June of 2023. Yeah, it's good stuff. Amnesty

36:31

International said that social media platforms

36:33

were being used to publicly

36:35

target 50 accounts belonging

36:38

to 27 individuals and 23 institutions. Essentially,

36:41

this is just a click on a link piece of spyware. So

36:44

these were very public appeals

36:47

to people to try and click on a link that would

36:49

have infected their devices with this Predator

36:51

spyware as part of a

36:53

campaign called ReplySpy. Part

36:55

of the reason I was intrigued by this is

36:58

that Europe is governed by a series of

37:00

regulations that are meant to prevent Europe-based

37:04

and regulated companies like

37:07

the Intellexa Alliance, which is this network of companies,

37:09

all based and regulated in Europe from

37:12

producing products like this and then selling

37:14

them to countries that

37:16

they have good reason to believe will use them to

37:18

target citizens, journalists,

37:21

and political dissidents. There is supposed

37:23

to be a regulatory framework in

37:26

Europe preventing the use and development

37:28

of these tools by those

37:30

types of people. And I

37:32

think this is a really important story because

37:35

it reveals that there is a large Europe-based

37:37

international conglomerate that is manufacturing

37:40

these products and selling them to

37:43

regimes all over the world with

37:45

seemingly zero friction from that regulatory

37:48

framework.

37:51

I just found it very interesting. It's a long

37:53

read. It's

37:55

not a light one, but it is worth looking at. I

38:00

don't know, it feels similar

38:02

to DDoS as

38:04

a service and stuff, and this is just spyware as

38:06

a service, private company that makes these

38:09

things. There's a lot

38:11

of companies that make stuff like this and

38:13

then license it to governments,

38:16

and whether you like it or not depends on where

38:19

your political allegiances lie. For

38:22

sure. So it's the world we

38:24

live in, you know? It

38:26

is. And I feel

38:28

like we don't, I haven't heard a lot of conversation

38:31

about, you know, this is a, it's

38:33

a product, it's manufactured in one

38:35

place and it's exported in other places,

38:38

and yet it's one that like has

38:40

legal implications that are

38:42

a little bit different than, you know, I make a chair

38:45

and I sell it overseas. It's like, it's

38:47

a chair that can be used to spy on

38:51

a political dissident or a journalist. And

38:55

like you said, where your politics

38:57

fall probably dictates whether or not you think that's a good or

38:59

bad thing, but it probably shouldn't because

39:02

as much as you might be okay with a

39:04

certain group of people being targeted

39:06

by these things, there's probably some group of people

39:09

you think shouldn't be targeted by these

39:11

tools. As long as they can

39:14

be sold as liberally as a chair, you're

39:16

never going to be able to make sure that that's the case. You know

39:18

what I mean? Yeah. I think the

39:21

big, I think the big shift is, is that I think back

39:23

in the, like if we, if we look at this through the lens

39:26

of history, you know, when

39:28

a nation state or something, a three letter agency

39:31

decided that they needed to track and

39:33

focus on somebody, there's regulatory

39:35

involvement, legislative and judicial involvement,

39:38

you know, you need it warrant, all

39:40

this stuff. And now

39:43

this, you know, there

39:45

was always private investigators, so companies

39:47

could kind of do something like

39:50

this. And now I feel like

39:52

in the digital age, it's like, that's

39:54

kind of the mirror of this is like three

39:57

letter agencies and nation states are still using

39:59

software. like this. They might not be buying this piece,

40:01

but they might have other pieces. But then

40:05

at the same time, you've got other bodies that

40:07

would typically go to... The entire

40:10

domain of private investigation needs

40:12

to be largely digital now. If I had to meet with

40:14

a PI that didn't know anything about digital stuff,

40:16

I would be like, absolutely not. So it's... Yeah.

40:20

What does that guy even do? I go through

40:23

garbage. I just look at trash. I

40:25

follow people around. They're on their phones lots.

40:27

I don't know what they're doing on them. Exactly.

40:32

So the... See, yeah, it's just...

40:34

I don't know. It's just one of those things. Obviously, not

40:36

great. Don't love it. Don't love violations

40:38

of people's privacy, but it doesn't

40:42

surprise me. Let's just say that. It

40:44

doesn't surprise me. No. I

40:46

get that. Yeah. I sense that you are... Yeah. It's

40:49

like, yeah, bad, but

40:51

we knew... I feel like we kind of knew about

40:53

this and it's not... It isn't surprising.

40:55

Yeah. Which

40:58

is itself surprising. You

41:01

know? Is it

41:03

though? Yeah. It is... You bring

41:06

up a really interesting point that at a certain scale,

41:09

nation states probably don't need to... The

41:11

sufficiently large actors don't really need to

41:14

license these tools because they're developing their own.

41:16

They don't need to go to someone that found the zero days because

41:18

they have their own zero days. But there's

41:21

this sort of like middle band of

41:24

regimes and governments and actors who

41:26

don't have those things, but do have the resources

41:28

to try and deploy a tool like this. They can't

41:31

make it, but they can use it. And

41:34

the market has just sort of like, yeah,

41:37

self-selected for that. Cool. So

41:40

we'll make that. You're big enough to

41:42

buy it, but you're not big enough to manufacture it. We'll manufacture

41:44

it for you. Yeah. Not a lot of people can make their

41:46

own space rockets, but a lot of people want to go to space.

41:49

So someone will make it. It's

41:52

like anything.

41:53

Yeah. I'm reminded of that Crypto AG

41:55

story, the Swiss company coming out of World

41:57

War II. It manufactured encryption devices

41:59

for all of

42:01

these different countries on either side of the iron curtain

42:03

for decades. And then

42:05

decades and decades later,

42:07

it turned out that the CIA had bought, secretly bought

42:11

a share in the company and was installing backdoors

42:13

into the encryption device the entire

42:16

time. And when I read something like this, and this very insidiously named

42:19

Intellexa Alliance

42:21

producing this Predator spyware, God the names. Near

42:24

the names. Near

42:26

the names. And you do wonder,

42:28

you're like, in like a decade, who are we going to find

42:31

out owned this? Like that's the question

42:33

I have. Like who's really behind

42:35

this? The tinfoil

42:38

hat part of me

42:40

starts to raise its hand.

42:42

Yeah, I don't know. Nothing

42:45

surprises me. Not much surprises me anymore these days, especially

42:48

when it comes to this stuff. I find, like, the spouse-

42:50

ware, you know, I

42:53

find that segment more

42:55

disturbing, maybe because

42:57

it's more personal than something like this. This just seems

43:00

like a corporate service offered to large

43:03

entities and governments to do what they

43:05

were going to do anyway. So this is

43:07

capitalism, you know? Yeah.

43:11

This is supply for the demand. Well, I mean, that's

43:13

a good way of putting it. You have, it was bound

43:15

to turn into something like this, where if a

43:18

zero day is as valuable

43:20

as we all understand it, a vulnerability

43:22

that the manufacturer of a device we

43:24

all carry doesn't even know about is worth that much.

43:27

It would make sense that an ecosystem of businesses,

43:30

suppliers and buyers would naturally

43:32

start to grow

43:34

like moss around that incredibly high

43:36

value thing. That's

43:39

just what happens when those emerge.

43:42

So I think you're right that it sort of feels inevitable

43:44

in a way that spouseware and stalker-

43:46

ware, I

43:49

guess are inevitable in a different way that when you zoom

43:52

in close enough, humans tend to do icky

43:54

things to one another and find technology

43:56

to empower them. But that feels like a different.

43:59

I feel like you don't even have to. to zoom in that close.

44:02

Feel like it's a pretty macro thing that we're

44:05

taking part in. That's true,

44:07

that is true. Well, that's

44:09

Amnesty International's Predator Files. Give it a look,

44:12

look it up. It is a really fascinating read.

44:14

It's a long PDF they've put out, but

44:16

I think it's important,

44:18

it's an interesting one. Yeah. Okay,

44:21

where do we wanna go next? A break, or should

44:23

we just keep going? You know

44:26

I love taking breaks. You know I love advertisements. On

44:28

ad break, advertisements. Listen

44:30

to our voice saying things that

44:32

people ask us to after the beep.

44:39

Selling a little or a lot, Shopify

44:42

helps you do your thing however you cha-ching.

44:46

Shopify is the global commerce platform

44:48

that helps you sell at every stage of

44:50

your business. From the launch your online

44:53

shop stage to the first real life store

44:55

stage, all the way to the oh my god,

44:57

did we just hit a million orders stage,

45:00

Shopify is there to help you grow. Whether

45:03

you are selling scented soap or offering outdoor

45:05

outfits, Shopify helps you sell everywhere.

45:08

From their all-in-one e-commerce platform to

45:10

their in-person point of sale system, wherever

45:13

and whatever you are selling, Shopify's

45:16

got you covered. Shopify helps you turn

45:18

browsers into buyers with the internet's best converting

45:20

checkout, 36% better on average compared

45:23

to other leading commerce platforms.

45:26

What I love about Shopify is that no matter how big you

45:28

wanna grow, Shopify gives you everything you need

45:30

to take control and take your business to

45:33

the next level. Sign up for

45:35

a $1 a month trial period at

45:37

shopify.com slash hacked,

45:40

all lowercase. You go to shopify.com

45:43

slash hacked right now to

45:45

grow your business no matter what stage you

45:48

are in. That's shopify.com

45:51

slash hacked. Today's podcast

45:54

is sponsored by Nutrisense. That was the sound

45:56

of the Nutrisense biosensor. It's a small

45:58

device you can put on the back of your arm. NutriSense then

46:00

provides real-time feedback on how your body responds

46:02

to the foods that you are eating, your exercise,

46:05

stress, and even your sleep. With NutriSense,

46:07

you can just take a photo of your meal, adjust your

46:09

portion size, and NutriSense does the rest.

46:12

NutriSense helps you track your data, see your glucose

46:14

trends, and understand your macronutrient breakdown

46:16

for each meal. You also get an overall

46:19

glucose score for each meal based on your body's

46:21

response. You'll be matched with a board-certified

46:24

nutritionist who will review your data

46:26

and answer all your questions. Plus,

46:28

they can help you get a personalized nutrition plan so you can help

46:30

achieve your goals. Try NutriSense

46:33

today. It will open your eyes in profound ways to how your food,

46:35

exercise, and lifestyle choices are affecting

46:37

you. What's more, it empowers you with a real-time

46:39

feedback loop showing the consequences of your food

46:42

and lifestyle choices. It is a powerful

46:44

tool for understanding your body and affecting

46:46

positive change in your life. You

46:48

can get all this today. NutriSense has a special

46:51

offer for our listeners. Visit NutriSense.com

46:53

slash hacked and use promo code hacked to

46:56

start decoding your body's messages and pave

46:58

the way for a healthier life. Be sure to tell them that

47:00

you learned about NutriSense on Hacked Podcast.

47:02

That's NutriSense.com slash hacked. Save $30

47:05

off your first month, plus get a month of

47:07

board-certified nutritionist support.

47:10

If you were like us, you already probably

47:12

know how amazing Notion is. They're

47:15

the sponsor of today's episode. I'm very happy to

47:17

have them sponsoring Hacked. I

47:20

use Notion every single day. I use it to manage

47:22

notes and documents for Hacked. We use it to manage

47:25

stuff at our larger company. We are big

47:27

Notion people, so I was very excited to learn that

47:29

they have launched a new AI tool called

47:32

Q&A. It is a personal assistant

47:34

that responds in seconds with exactly what

47:36

you need right inside of your Notion document.

47:39

It is incredibly cool. I used

47:41

it just the other day to take a bunch of subjects that we were

47:43

going to be talking about in the show and synthesize some

47:45

notes. It's very, very useful

47:47

for content creation and media like we do. I'm

47:50

a big Notion guy. Notion AI can now give

47:53

you instant answers to your questions using information

47:55

from across your wiki, projects, docs,

47:57

and meeting notes. It's kind of

47:59

like having a little... little AI based on all

48:01

of your projects and your work and your voice.

48:03

It's really, really interesting. If you have

48:05

an urgent question you'd normally turn to a coworker

48:08

to answer, you can just ask Q&A

48:10

instead. It's going to search through thousands

48:12

of your own documents in seconds and answer

48:14

your question in clear language, no matter how

48:17

large or complex your workspace is.

48:20

You can try Notion AI for free

48:22

when you go to notion.com slash

48:25

hacked. It's all lowercase letters, notion.com

48:28

slash hacked to try the powerful, easy

48:30

to use Notion AI today.

48:32

When you use our link,

48:34

you're supporting our show. I'm a

48:36

real user of Notion, can't recommend

48:38

it enough. It's fantastic. And if you

48:40

want to try it out and you want to see all the cool new AI stuff

48:43

they've done, notion.com slash

48:45

hacked.

48:49

The holidays start here at Kroger with a

48:51

variety of options to celebrate traditions

48:53

old and new. You could do a classic

48:55

herb roasted turkey or spice it up

48:58

and make turkey tacos. Serve up a go

49:00

to shrimp cocktail or use

49:02

simple truth wild caught shrimp for your first

49:04

Cajun risotto. Make creamy mac

49:07

and cheese or a spinach artichoke

49:09

fondue from our selection of Murray's cheese. No

49:11

matter how you shop, Kroger has all

49:13

the freshest ingredients to embrace all your holiday

49:15

traditions. Kroger, fresh for everyone.

49:22

We're back.

49:25

Where do we want to go now? Do we want to go to the video

49:28

game cheating quarter or do we want to dive into the crypto

49:30

pit? No, let's leave

49:32

the, I feel like we

49:34

had such a heavy intro piece, like

49:37

the pieces, they were pretty socially

49:39

and ethically heavy. I think we

49:41

go to the video

49:43

game cheating because it's more insane. And

49:46

then let's go back to crypto, which is inevitably

49:48

morally heavy. But

49:52

honestly, I think it's kind of a good news story today,

49:55

but we'll get to that. We'll get to that. Yes. This

49:57

is a fascinating one. You flagged that a

50:00

member of our Discord found this story.

50:04

I'm really excited to talk about this. You wanna lay it down for

50:06

people? Yeah, absolutely. So,

50:08

Void in our Discord brought

50:10

the story forward. I dropped

50:13

a few questions out, and we had a little lengthy chat about it one

50:15

night, and it was, I'm shocked.

50:19

Let's just say that. So,

50:23

Roblox, video

50:26

game, obviously. That can't be how we say it

50:28

this whole time. Roblox, Roblox, I don't

50:30

even know how you say it. Roblox?

50:33

No one's ever heard it said out loud, actually, weirdly enough.

50:35

I say Roblox as in robot. Yeah,

50:38

I think that's right. And I didn't wanna just say Roblox

50:40

later, like I was passive-aggressively correcting

50:42

you, so we had to get

50:45

in front of it. So, I

50:48

have been saying Roblox forever because

50:50

I think it sounds funnier, even though I

50:53

think, or more funny. I think Deep

50:55

Dine knows that that's not it. Yeah, sure,

50:57

I got you. I've been saying it for so long that it's now

51:00

in my lexicon. Now it's your lexicon? So,

51:03

for the sake of this episode, I will say Roblox.

51:06

Just, just for. I hope I'm right, but yeah,

51:08

okay, okay. Roblox. I

51:11

feel like an asshole if this pronounced Roblox. Roblox.

51:16

Anyway, okay, so Roblox has

51:19

just signed a partnership to

51:22

reduce the amount of cheating going on in

51:24

the Roblox environment. Oh,

51:26

thank God. With the company that

51:28

makes the cheat, the biggest known

51:31

cheat, Synapse, so

51:33

they have, and when Void

51:36

had initially written the story out, I

51:38

was like, where's the proof that this is the company

51:41

that makes the cheat?

51:43

Like, I see them talking, but no, they never

51:46

directly mentioned that Synapse is the cheatmaker,

51:49

to which he replied with a developer

51:53

link from the dev forum for Roblox,

51:55

where they essentially go on to say, we

51:57

understand how strange this might be, but

52:00

you know this is for the best. The

52:05

biggest cheat manufacturer for

52:07

the game has just been hired

52:10

to prevent cheating in the game. It

52:12

just feels like a shakedown

52:15

that I've never seen before. A shakedown?

52:19

Yeah, the

52:21

developer post that you

52:23

mentioned has some really great stuff in it. It

52:28

reads like that scene in a movie where the

52:30

person answers the door and there's

52:32

a cop talking to them. They have this conversation

52:35

and it gets sinister partway through when you realize

52:37

that the kidnapper is standing just outside

52:39

of you pointing a gun at the

52:42

person talking's back. A very

52:44

blink if you're okay type of situation.

52:48

To read from user BitDancer

52:50

Roblox staff on this developer forum, as

52:52

we made clear at RDC 2023,

52:55

cheating and exploiting are not welcome on our

52:57

platform. We are using every measure

52:59

within our power to combat it. Let's start with the next

53:02

paragraph. To take another step towards

53:04

this goal, we are pleased and excited to share

53:06

that we have entered into a close partnership with Synapse

53:09

Softworks, LLC. Synapse

53:11

is well known in the reverse engineering scene,

53:14

which is a very polite euphemism for they're

53:16

one of the single largest cheating

53:19

actors that

53:21

exists on our platform. We have

53:24

hired them in order to stop

53:26

people from cheating on our platform. It's

53:30

the end of Catch Me If You Can. It's

53:34

the end of Catch Me If You Can. I

53:36

have to read the beginning of the next paragraph in

53:38

this post because it's actually my favorite set

53:41

of sentences. Oh,

53:43

yeah. It

53:46

might seem strange that we're teaming up with Synapse,

53:49

but we believe it will be the most fruitful option

53:51

when it comes to reinforcing the platform's security.

53:54

For the longest time, we've been on

53:56

opposite sides of the battlefield. We've

54:00

grown to respect their skills and they've

54:02

grown to respect our vision. Oh my

54:06

god It's

54:08

like our vision of making this end please

54:11

make it end for the longest time we've

54:13

been trying to shut them down and

54:16

We've respected and we think we've grown

54:19

to respect how good they are

54:21

at getting around our anti-cheat and

54:23

so therefore they've grown to respect

54:26

our vision of giving them money to stop creating

54:29

cheats. That's what I feel like that says.

54:31

Allegedly. Yeah, that's what it says. I'm

54:34

so tired of this fly that's been buzzing around

54:36

my face. I've decided to offer it equity

54:39

in my face to

54:41

pay off the fly. Yeah. Honestly,

54:44

like, what is this? I have

54:47

to think that there was a

54:49

countdown,

54:50

like a security update countdown

54:53

of like they did this So now

54:55

we're gonna try and do this and then

54:57

you release that update and then they find another way around

54:59

it and at a certain Point you would say if this happens

55:02

seven more times Where we put out a

55:04

patch and then they get around the patch We're

55:06

just buying them and it went seven six

55:08

five, four, three, two, fuck it, we're buying Synapse.

55:11

Yeah, like I I feel like

55:13

this was something that someone knew was coming

55:15

and then October

55:18

27th, we got to just drop the update. It's

55:20

happening. But to me, it's just

55:23

literally negotiating with

55:25

the terrorists. Like,

55:28

if this becomes like

55:30

Activision, if you parallel

55:33

this to what's going on with Activision and

55:36

EngineOwning, the big Call of Duty cheat

55:38

manufacturer. They're

55:41

locked up in court. They're suing each other

55:43

but EngineOwning is still releasing

55:45

cheats and still kind of causing

55:48

damage to the Call of Duty environment,

55:50

but they're trying to destroy it,

55:53

Activision is trying to destroy EngineOwning.

55:56

Which is like, you know what you would kind

55:58

of expect You

56:01

know, where

56:03

Roblox is taking the opposite initiative

56:05

where they're saying, you know what, we're just

56:07

going to give them a bunch of money. And

56:11

granted, I'm sure they're going to deliver some

56:13

services and be like, hey, you know, this is how

56:16

we always get around your anti-cheats

56:18

and here's like, here's some things we can do to tighten

56:20

up where some of the information goes and how you can

56:23

secure it against modification, etc., etc.

56:26

So I'm sure they'll get value out of it, which

56:29

is good, but I'm sure most of the engineers

56:32

that work on Roblox knew a good

56:34

chunk of it anyway. So the,

56:37

I feel like you're just setting and paving a path

56:40

for like, like,

56:42

hey, you want to get rich, Jordan? It's

56:46

like, let's just build a cheat for a major free

56:48

to play video game until the developer

56:50

comes along with a check and says how much for you

56:52

guys to piss off. And it's

56:54

like, I feel like that becomes, that

56:57

becomes their biggest problem is they might have taken out

56:59

one, you know, one

57:02

problem, but they might have created hundreds more,

57:04

especially because the people that play Roblox,

57:08

the developers and content creators in it

57:10

are used to creating things for money. And

57:13

they've all developed significant technical

57:15

skills. All

57:18

it's going to take is like 50 of them to be like, you know, what's

57:20

better than making a hundred grand

57:22

making these things inside of this platform is just

57:24

hack the platform and wait for the millions to show

57:27

up. Totally. It's

57:29

like, yeah, it feels like you've

57:31

kind of hired a really good blue team at a

57:33

certain point where it was like, for years, we've waited

57:36

for you to post an update and then we reverse

57:38

engineer it and create exploits for it. Now,

57:42

while you're working on it, we can be

57:44

doing that process of reverse engineering and how

57:47

people will crack it so that you

57:49

can hopefully ship it without that vulnerability.

57:52

Like you almost want to keep these folks

57:54

in like a separate building off

57:57

to the side and they just get like

57:59

an early alpha of everything, and it's like

58:01

your job, just keep doing what you were doing. Totally.

58:05

We don't want to hear you. We'll see you in the cafeteria

58:07

and we won't make eye contact and

58:09

your whole job is just to fuck up our shit before

58:11

someone else has the chance to do

58:13

it. Honestly, pretty good idea. Patch

58:16

the code and send the patch over to the

58:18

master branch please. Exactly.

58:21

But even then, I feel like they've

58:23

created a new financially rewarding ecosystem. It's

58:29

like why would I build content in the game when I can

58:31

just try and hack the game and then wait for them to

58:33

buy me too? I

58:35

don't know. It feels like a more extreme

58:38

version of a bug bounty almost where it's just

58:40

like, yeah, you could crack us and try and sell it

58:42

but if you don't want to break the law, maybe you

58:45

crack us and then sell

58:47

it to us. It's like kind

58:49

of that a little but

58:52

it is different. It's different. Great.

58:55

I love it. Talk

58:59

about bug bounties quick because I'm

59:02

always A, surprised when companies don't

59:04

have bug bounties and B,

59:07

I'm generally surprised at how little

59:10

the rewards are for some of them. Sure.

59:13

There's some critical systems out there that even if they

59:15

do have a bug bounty, the bug bounty is like $4,000 and

59:20

it's like, okay, it's like really? I

59:23

don't know. If I get a zero

59:26

day into your infrastructure platform

59:28

or whatever it is, it could cost millions

59:31

and millions and millions of dollars and losses

59:33

and lawsuits and you're going to give me $4,000. I

59:36

don't know if this is just a total

59:39

digression but I'm always kind of generally surprised

59:41

at the level that bug bounties rewards are

59:43

for critical things. Some have big

59:45

ones like some of them are. I think Tesla,

59:47

historically used to have a pretty massive

59:50

one. If you could hack the

59:52

car. Yeah. You

59:55

want that. Yeah. You want a

59:57

big bounty on something on a thing

59:59

that you drive. Yeah,

1:00:03

that's true. Every so often you see one, you're like, that

1:00:05

doesn't seem worth, A, the effort

1:00:07

and, B, the damage. Oh,

1:00:13

well, shall we inevitably...

1:00:15

Yeah, let's take

1:00:17

it to the crypto corner. The

1:00:20

crypto pit. Talk

1:00:23

about crypto but a fun one. Maybe

1:00:25

not fun. A lot of people lost a lot of money. Yeah,

1:00:28

a startling moment in the world of cryptocurrency.

1:00:32

Sam Bankman-Fried, co-founder of the cryptocurrency

1:00:34

exchange FTX, found guilty on charges of fraud,

1:00:37

conspiracy, and money laundering, following

1:00:39

a month-long trial in New York where Bankman-

1:00:41

Fried faced accusations of orchestrating

1:00:44

one of the largest financial frauds in

1:00:46

history.

1:00:48

We've been following the story for a while. A

1:00:50

lot of people have been following

1:00:52

this story. We've never done the big

1:00:55

deep dive episode, the big interview

1:00:57

because it's it's the

1:01:00

story is just being so thoroughly told

1:01:02

so well by so many different people. And that's

1:01:05

kind of one of the important things about it, I think is

1:01:07

that this was one of the first large crypto

1:01:10

scam, grift, crime, whatever you want to call

1:01:12

it, it got really, really

1:01:15

mainstream attention. And

1:01:18

it kind of finally came to a head. They

1:01:20

paid for mainstream attention, you know, their

1:01:22

name on F1 cars, their name

1:01:25

on stadiums, their Larry

1:01:27

David. Super Bowl ads, man. Yeah. Yeah.

1:01:29

He does the new wheel. Yeah, the world's

1:01:32

best quarterback out there pounding the pavement

1:01:34

for you. You've got, you know,

1:01:36

they, they definitely didn't hide

1:01:38

away from the limelight. So I think

1:01:41

that all that did was amplify

1:01:43

the the collapse.

1:01:46

Yes. So so a little

1:01:48

over a year, a little less than a year ago, FTX

1:01:52

cryptocurrency exchange, sort of the darling

1:01:54

of the crypto world collapsed. And

1:01:56

the very broad, we

1:01:59

can now kind of say this, users

1:02:02

tried to collectively withdraw billions of dollars

1:02:04

from FTX and they were unable to do

1:02:07

so. And as it was kind

1:02:09

of soon after revealed that the money from

1:02:12

FTX had ended up over in the coffers

1:02:14

of a supposedly

1:02:16

independent but unfortunately

1:02:18

interwoven company called Alameda

1:02:21

Research, which was Bankman-Fried's trading

1:02:23

firm that had made these enormous

1:02:26

bets on various parts of the crypto

1:02:29

ecosystem, some of which were owned by

1:02:31

Bankman Fried and FTX itself. So

1:02:34

when the story broke in this, I think it was a CoinDesk

1:02:37

report, making

1:02:39

these early allegations there

1:02:41

was, you know, naturally a run on FTX,

1:02:43

the idea that FTX might not have their money

1:02:45

made everyone take out their money, which meant suddenly

1:02:47

FTX didn't have any money. It

1:02:49

was revealed over the course of the trial,

1:02:52

primarily by this one of the sort of three,

1:02:55

two to three other major players in this drama.

1:02:58

FTX co-founder and executive Gary Wang

1:03:02

testified that from FTX's inception in 2019,

1:03:04

customers' funds had pretty much the entire

1:03:06

time always been flowing between

1:03:09

FTX and into the bank accounts owned

1:03:11

by Alameda, which was then able to

1:03:13

sort of do whatever they wanted to

1:03:15

with those user funds. A

1:03:19

really important revelation. And

1:03:21

this is really the story of Sam Bankman-Fried's

1:03:24

like kind of main compatriots at Alameda

1:03:26

and FTX, people that had been with

1:03:28

him since the beginning turning on him over

1:03:30

the course of this trial. This was

1:03:33

a sort of a I don't know if I want

1:03:35

to call the story of betrayal because I don't have a ton

1:03:37

of sympathy for the betrayal, but really it

1:03:39

was people turning against one another. Yeah.

1:03:42

Under the threat of a massive legal repercussion. But

1:03:46

one of the big really interesting moments that

1:03:48

we learned about over the course of this trial

1:03:50

was that Gary

1:03:52

Wang had hard coded an exception,

1:03:55

this really important exception into FTX

1:03:58

that made it so that Alameda. research

1:04:01

was the only user on the exchange

1:04:03

that was allowed to have a negative

1:04:05

balance, which functionally

1:04:07

meant they were able to borrow from customer

1:04:10

funds.

1:04:12

This sort of negative balance

1:04:14

exception turned out to be the thing

1:04:17

that the whole collapse turned on. And

1:04:19

a major part of this trial was Wang's

1:04:21

allegations that Bankman Fried directed

1:04:24

him explicitly to

1:04:26

create that exception to allow

1:04:28

Alameda to borrow basically

1:04:30

unlimited money from FTX, which

1:04:33

is what resulted in the whole thing collapsing.
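To make that mechanism concrete, here is a minimal, purely hypothetical sketch in Python, not FTX's actual code; the account name, function, and logic are invented for illustration, of how a single hard-coded negative-balance carve-out in a withdrawal check can let one account spend into pooled customer funds while every other account stays capped at its balance:

    # Hypothetical illustration only, not FTX's real code.
    PRIVILEGED_ACCOUNTS = {"alameda-research"}  # the one account exempt from the check

    def can_withdraw(account_id: str, balance: float, amount: float) -> bool:
        """Return True if the exchange should allow this withdrawal."""
        if account_id in PRIVILEGED_ACCOUNTS:
            # No balance or margin check: this account may go arbitrarily negative,
            # which in practice means drawing on pooled customer deposits.
            return True
        # Every other user: withdrawals are limited to the current balance.
        return amount <= balance

Every other account fails the check the moment it asks for more than it holds; the privileged one never does, which is the whole exception in a few lines.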

1:04:35

You know, I just got to jump in. I

1:04:38

got to take another victory lap because

1:04:40

I didn't know you said this. And we initially

1:04:42

talked about this. I said, I bet what they did

1:04:44

is they allowed them to never be margin

1:04:47

called. They allowed them to bypass

1:04:49

the margin called safety limits. And

1:04:51

that's exactly what we were talking about. And

1:04:54

that's exactly what they did. And that's exactly what

1:04:56

they're doing. Another lap around the track. Who knows? No,

1:04:59

it would seem you called it. One

1:05:02

of the things that I did find really interesting about

1:05:04

this, if you did pay close attention to the trial,

1:05:07

is they started releasing all of the private

1:05:11

kind of like DM groups, like text

1:05:13

message groups, WhatsApp

1:05:15

groups that had a bunch of

1:05:18

the key players, including SBF's

1:05:20

dad, who I think, and

1:05:23

this is, I've seen some coverage of this coming

1:05:25

out of it, that I think he

1:05:29

might be in some trouble.

1:05:32

At least that's what I've read a few things talking

1:05:34

about it. His involvement

1:05:37

with it was not as, you know,

1:05:40

innocent, or arm's length, as originally thought.

1:05:43

Yeah. So I

1:05:45

think there could be some knock on, at least

1:05:49

what I've read a few opinion pieces there could be

1:05:51

some knock on

1:05:52

lawsuits and civil suits

1:05:55

and things like that coming down the pipe, which

1:05:57

I think is to be expected given the size

1:05:59

of it.

1:05:59

the fraud. So yeah, sure.

1:06:02

Yeah, there's inevitably going to be a round

1:06:05

of appeals. I'm sure that this

1:06:07

story isn't actually done. And all

1:06:09

these people are going to continue to exist as we

1:06:11

should talk about in a second. But

1:06:15

it does feel like a little bit of a chapter closing.

1:06:17

Totally. It felt like there was this group of

1:06:19

people who were running this ecosystem

1:06:21

of companies spending lavish sums of other people's

1:06:24

money. And it all collapsed.

1:06:26

And the result of that was that they turned

1:06:29

on each other. It is an

1:06:31

architect of turning on each other. I'm sure it was again,

1:06:34

under the immense weight of, you know,

1:06:36

legal fallout. But they all turned

1:06:38

on each other and they all started attacking each other. And

1:06:40

the result was that Sam Bankman-Fried, while

1:06:45

he has plans to appeal, has been convicted of

1:06:48

two counts of wire fraud, four counts of conspiracy

1:06:50

to commit fraud, one count of conspiracy

1:06:52

to commit money laundering, and is staring

1:06:55

down the barrel of a lengthy prison sentence.

1:06:57

That sentencing is set to conclude on March 28.

1:07:00

So we'll probably do a minor check back on what

1:07:02

happened. But that's not

1:07:04

really the interesting part to me. 110 years is

1:07:08

what his maximum sentence is or something

1:07:10

like that. It's pretty

1:07:12

gnarly. Yeah, I doubt

1:07:14

it'll be that. But who knows? I'm

1:07:16

not sure that I honestly don't. I'm not

1:07:19

even sure that it should be. I think the

1:07:21

important part.

1:07:23

Yeah,

1:07:24

I'm not sure what the important part is. It's an extreme

1:07:27

waste of a bunch of capital and

1:07:29

time and human utility. It's a big

1:07:31

dumb, shitty, bad thing. Which I cannot

1:07:34

wait. Glad he didn't get away with it. In

1:07:37

Michael Lewis's new book, Going Infinite,

1:07:39

which I've just started. Interesting.

1:07:41

I am a big Michael Lewis fan as somebody who enjoys

1:07:44

economics and finance as well as

1:07:48

good story. Michael Lewis generally

1:07:51

delivers on that. If you don't know, he wrote

1:07:53

the Big Short and a bunch

1:07:55

of other great books. I've

1:07:57

read them all. And this is his newest

1:07:59

one. Just came out. It's about Sam Bankman-

1:08:01

Fried. Can't wait to read it, and would

1:08:04

love to have him on the podcast So I just need to

1:08:06

find out how to get a hold of him, but he'd

1:08:08

I feel it would be a great great chat I'm

1:08:11

curious. I know a lot of that book was written,

1:08:15

I'm not sure when it was, but I think it was a while

1:08:17

ago Like I think it was written maybe

1:08:20

prior to the collapse of FTX Or

1:08:23

during the early days of it. So I'm really curious

1:08:25

to read it. He's

1:08:28

both such a like hawkish

1:08:30

critic of structural abuse but

1:08:34

also loves a good character like and I

1:08:37

could see it going either way like does he fall in love

1:08:39

with the character of SBF or is he

1:08:41

taken aback by the scale of fraud

1:08:43

and blah blah blah blah everything we've been talking about Like

1:08:46

I could imagine a Michael Lewis book that goes one

1:08:48

of two directions about Sam Bankman-Fried. I'm

1:08:51

really curious which direction it goes.

1:08:54

I'd love to hear what you think about it. And I

1:08:56

would love to talk about it. I'd love to read

1:08:58

it and talk about it with him on

1:09:00

the show I think if I'm not

1:09:03

mistaken, I think I remember in

1:09:05

some of the group text messages that came

1:09:07

public through the trial Yeah, Michael

1:09:10

Lewis was like in the Bahamas

1:09:13

to interview Sam when

1:09:15

it blew up and If

1:09:18

I could be making that up, but

1:09:20

I feel like I did read

1:09:22

that Access

1:09:26

journalism is so interesting. The

1:09:30

idea of it is something I'd love to get

1:09:32

to do and you just have to find the right subject But to really

1:09:34

just be like no, I'm gonna spend three weeks with you Let's put

1:09:36

a month with you. I'm gonna document everything

1:09:38

and we're gonna create something that's really, like,

1:09:41

not rooted in just the hour

1:09:43

or two that you sat down for an interview and you and I

1:09:45

have done so many interviews like that. This is

1:09:47

great But the idea of just embedding

1:09:50

with a person to try and tell their story so

1:09:52

compelling and such a massive

1:09:54

threat to your ability To be honest and objective

1:09:57

what was going on because you're like,

1:09:59

this is my roommate

1:09:59

like I hang out with them all day. It's

1:10:02

such a fascinating thing. I'm excited

1:10:04

to read that book. And I'm excited for

1:10:07

the end, for now, of

1:10:10

the saga of

1:10:12

Sam Bankman-Fried, FTX, and the

1:10:15

Bahamas-based financial nuclear

1:10:17

bomb. To speak

1:10:20

of how it's not ending, let's

1:10:22

just talk about how a few

1:10:25

of the former FTX executives,

1:10:27

fresh off the stands of testifying

1:10:30

at the trial, are creating

1:10:32

their own crypto exchange.

1:10:35

What could go wrong? What could go wrong? Here's

1:10:37

the best part. They are putting it in

1:10:39

Dubai, which, as we've covered

1:10:42

previously on the show, is the, and

1:10:44

this is not any indication of whether

1:10:46

this is trustworthy or not. I'm just stating

1:10:48

it as a fact that Dubai houses

1:10:51

most of the scams and multi-level

1:10:53

marketing scams that exist in the world. A

1:10:55

lot of them are headquartered out of Dubai. And

1:10:58

this new exchange, which

1:11:00

is titled Trek Labs,

1:11:04

Trek Labs is also founded

1:11:07

and headquartered out of Dubai. So

1:11:10

I think,

1:11:11

yeah,

1:11:12

I don't know if it's the greatest look. Correlation,

1:11:15

not causation. Yeah,

1:11:22

two former colleagues of Sam Bankman-Fried who are

1:11:24

now starting a crypto exchange

1:11:27

days after he was found guilty on seven

1:11:29

charges of fraud, which is a wild

1:11:31

look. Wait a month. Yeah. Like

1:11:34

I'm not saying you're up to anything fishy. That's not fair. You're

1:11:36

allowed to have another job. You're allowed to start new businesses.

1:11:39

A hundred percent with you on that. There's

1:11:42

no indication that any of these people had anything

1:11:44

to do with anything. All

1:11:46

of the benefits of doubts

1:11:49

being given from a purely

1:11:51

optics and marketing perspective, give

1:11:54

it a fortnight. Give it a little bit

1:11:56

before you send out the press release saying

1:11:58

that you are. starting this

1:12:01

cryptocurrency exchange that had previously

1:12:03

received funding from FTX's

1:12:06

venture arm. It's just not

1:12:08

a good look. Yeah,

1:12:10

I feel, but yeah,

1:12:13

I guess it's to be expected, you know, people were involved,

1:12:16

they saw the success, they saw how to replicate

1:12:18

the success, they saw what worked, they have that knowledge,

1:12:20

that domain knowledge, you have to expect

1:12:23

that they're gonna take it and go with it. I

1:12:25

assume they probably won't do the whole

1:12:27

Alameda, you know, piece

1:12:29

of it, but

1:12:32

I think that being a large crypto exchange

1:12:34

is a very viable business, I believe, at this point,

1:12:36

so it is as expected.

1:12:38

It's just it's a interesting

1:12:41

knock-on and yes, I do agree that maybe waiting

1:12:44

until at least the sentencing is

1:12:46

over would have been maybe a beneficial

1:12:50

to the look, to the optics.

1:12:53

A generous read of this could be that they

1:12:55

know explicitly how not

1:12:58

to commit a very specific set of crimes. I'm

1:13:02

articulating that as a joke, but there's a there's

1:13:05

an actual thing there of like we know

1:13:07

what it looks like when you have corruption

1:13:09

inside one of these organizations, we know

1:13:11

what it looks like when one hand is like talking to

1:13:13

the other behind its back, like we we know what that

1:13:15

looks like we're not going to do that. Yeah.

1:13:18

We've seen the pitfalls, like maybe that's it, maybe

1:13:20

it's this is we all had this

1:13:22

experience, we saw the good that

1:13:24

can be done according to them, but we

1:13:27

also saw the pitfalls now watch us

1:13:29

do all of the good and make all the money without stepping

1:13:31

in the pitfalls. Yeah, they also saw what

1:13:33

brought it down and saw what

1:13:35

evidence was needed to ruin

1:13:37

them. Yeah, totally. Seeing

1:13:40

what not to do and seeing how not to get caught are very

1:13:43

distinct things. They're really importantly different

1:13:45

than each other. I think

1:13:47

just to kind of wrap

1:13:50

up the Bitcoin

1:13:52

stuff. I think we got an NFT thing to chat

1:13:54

about, but the there's been

1:13:56

a lot of speculation. I don't know if you follow the crypto

1:13:59

market, but the crypto market's been very volatile,

1:14:01

typically in an up trajectory the last couple

1:14:03

months or last month. Yeah.

1:14:06

And a lot of it has to do with

1:14:09

the potential

1:14:11

approval and launch of a bunch of ETFs,

1:14:14

exchange traded funds that

1:14:16

will kind of spot price crypto

1:14:19

into real regulated securities

1:14:21

markets, which

1:14:24

is to me, I can't

1:14:26

fathom how either,

1:14:29

I know right now it's in front of the U.S. Securities

1:14:31

and Exchange Commission, the SEC, the

1:14:34

American one, but I can't fathom how they'll approve

1:14:36

these things as

1:14:38

it essentially will, it's a

1:14:40

tacit acceptance that they are securities

1:14:43

and always have been securities, but

1:14:47

will not regulate the actual internal

1:14:50

security like the Bitcoin itself,

1:14:52

but it will allow us to create

1:14:55

securities around those

1:14:57

unregulated securities. Yeah.

1:15:02

So it's a weird, it would be a strange

1:15:05

step towards the legitimization of crypto.

1:15:07

It's already very legitimate right now, hundreds of billions

1:15:09

of dollars, Super Bowl ads, but from

1:15:11

a purely financial perspective, it's a really

1:15:14

big step to say that this is just

1:15:16

a security now. For

1:15:19

anyone that doesn't know an ETF,

1:15:21

I think a lot of people listening to this would, but an ETF

1:15:23

is just essentially a pooled investment that holds

1:15:25

a bunch of other assets instead of just

1:15:28

one. So you're not just buying one stock,

1:15:30

you're buying a thing that's got a bunch of other stuff

1:15:32

inside of it, doesn't have to be just stocks, but

1:15:35

you're buying a basket. I think is that a fair way

1:15:37

of describing an ETF for a person

1:15:39

that's unfamiliar? Yeah, it holds a bunch of like,

1:15:41

well, typically an exchange traded fund

1:15:43

will hold like an S&P 500 ETF,

1:15:48

will hold a percentage weighting

1:15:52

of all of the things that make up the S&P

1:15:54

500 basket. Crypto,

1:15:56

like if they do a Bitcoin ETF, it will just

1:15:59

strictly hold 50. So we'll only

1:16:01

probably own one

1:16:03

asset class inside of it or one asset inside

1:16:06

of it. But the issue is... Well,

1:16:09

there could be a crypto ETF that holds a basket.

1:16:13

Why wouldn't you just have the full crypto ETF

1:16:15

that's like the S&P 500 of crypto?

1:16:18

It's got Ethereum. It's got, like,

1:16:20

it's just got all of them in it.
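A minimal sketch of the basket idea being described here; the tickers, weights, and prices are invented for illustration and do not correspond to any real fund:

    # Illustrative only: made-up weights and prices, not a real ETF.
    weights = {"BTC": 0.60, "ETH": 0.30, "SOL": 0.10}         # target weights, sum to 1.0
    prices  = {"BTC": 35_000.0, "ETH": 1_900.0, "SOL": 40.0}  # spot prices in dollars

    def basket_units(share_value: float) -> dict:
        """How much of each asset one share worth `share_value` dollars must hold."""
        return {asset: share_value * w / prices[asset] for asset, w in weights.items()}

    print(basket_units(100.0))
    # roughly {'BTC': 0.0017, 'ETH': 0.0158, 'SOL': 0.25}

Every new dollar that buys into such a fund has to be matched by purchases of the underlying assets in those weights, which is where the inflow effect described later in the conversation comes from.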

1:16:23

But the thing that blows me

1:16:25

away is just that, like,

1:16:29

you know, largely the crypto community,

1:16:32

since I've been witnessing

1:16:34

it so I don't know what that would be 2009 and

1:16:38

onwards, 2010 onwards, like, I'm making these dates

1:16:40

up off the top of my head, but they

1:16:43

have not wanted to be regulated, you know, they enjoy

1:16:45

the fact that they're outside of the regulations,

1:16:48

from a hyper-speculative standpoint

1:16:50

where there's no regulations and they

1:16:52

can be crazy and they can do massive

1:16:55

leveraged investments and You

1:16:57

know to the moon To

1:16:59

the moon, but also on the other side the

1:17:03

philosophical perspective.

1:17:05

Yeah, philosophically they're

1:17:08

kind of anti massive,

1:17:11

big-government finance regulation, etc.

1:17:15

The regulatory people have

1:17:17

let them live and exist

1:17:19

in this like gray area I would say because

1:17:22

it's not like they don't treat it as a security the

1:17:24

more and more like the SEC is getting involved

1:17:27

in crypto Stuff, but they're still unregulated

1:17:30

So they've kind of let them Exist

1:17:32

in this gray area for so

1:17:35

long

1:17:38

And now

1:17:40

Letting them

1:17:42

create ETFs Full

1:17:45

of this asset that they don't regulate

1:17:48

it seems I just can't see how

1:17:50

they're gonna do it You know, like there's a lot of speculation

1:17:52

if they're gonna approve these ETFs, but I just can't

1:17:56

For the life of me see how they're gonna allow it Yeah

1:18:01

You've been following the regulatory

1:18:04

response to cryptocurrency for a while and that's

1:18:06

been something you've been talking about Kind of prior

1:18:08

to even me realizing that was an interesting element

1:18:10

of the story of cryptocurrency And

1:18:13

that's definitely a huge part of this

1:18:15

like will a crypto based ETF

1:18:18

ever get approved? And what does it say about

1:18:20

institutional finance's relationship

1:18:23

with crypto? It's like a big, important thing.

1:18:27

That's kind of what this is really about. The

1:18:29

second thing it's about, that I find interesting

1:18:31

and have wondered about for a long

1:18:33

time with crypto is that I

1:18:36

was introduced to crypto as

1:18:38

a bold defiant libertarian oppositional

1:18:41

idea to traditional

1:18:43

finance. Yes traditional finance is

1:18:46

this big corrupt thing and it's dealing in commodities

1:18:48

that are being produced by central banks, and blah blah blah.

1:18:50

Crypto is supposed to exist

1:18:54

across a big raging river

1:18:56

from that whole world of traditional finance And

1:18:59

it was supposed to be empowering to people not

1:19:01

your keys not your crypto, but here's this

1:19:04

thing it's the digital version of a duffel bag full

1:19:06

of gold buried in the backyard. And

1:19:09

to suddenly say, but you know what's even better than

1:19:11

a duffel bag full of gold? If we put that duffel

1:19:13

bag in a bank and bought stocks in the bank

1:19:15

and then, like, yeah, yeah, pooled stocks together

1:19:18

and made an ETF out of it. So

1:19:20

for some of you, it wasn't about that. It was

1:19:22

about the line going up. It was about

1:19:25

a world where the value of a bunch of other things had

1:19:27

already gone up before you got in. But here's a new

1:19:29

thing that you can ride to riches. It

1:19:31

wasn't about the philosophical stuff. It was just

1:19:33

about that line And

1:19:36

then an ETF would mean more people could buy that

1:19:38

asset

1:19:40

making your line go up further. Like,

1:19:42

to me, this feels like kind of saying the

1:19:44

quiet part out loud a little bit. Totally, that's

1:19:47

exactly it. But yeah, I don't

1:19:49

know the second it goes ETF it can

1:19:51

attract, you know, institutional-grade

1:19:54

investors. You know, like pensions,

1:19:56

could you imagine? Dipping in. It's

1:19:58

like it's like something like the S&P 500

1:20:01

when a company gets rolled into the S&P 500,

1:20:04

usually their stock goes up quite a bit because

1:20:07

all of a sudden these ETFs need to acquire

1:20:11

millions of shares in it because they're holding

1:20:13

a basket of equities made up of the

1:20:15

members of the S&P 500. So same

1:20:18

thing will happen here is that if these

1:20:20

become real ETFs, the more

1:20:22

people that buy them, the more Bitcoin will need

1:20:25

to be secured and held by the ETF and

1:20:28

it will cause the line to go up. But

1:20:30

here's the thing. Bitcoin, should

1:20:35

I just say it has no utility? Is

1:20:38

that too far? You said it before. I don't know why

1:20:40

you would, I don't know why you'd waffle suddenly

1:20:42

at the finish line. Say

1:20:44

that its utility versus,

1:20:48

Yes, dubious. Many other ways

1:20:50

that we can trade in

1:20:53

fiat currencies and utility in

1:20:55

society. It offers very

1:20:58

little utility to the world. The

1:21:01

fact that it needs to be held in an ETF

1:21:04

as an unregulated asset, it just blows.

1:21:06

I just can't fathom an

1:21:09

SEC that allows this to happen. I'll be curious.

1:21:11

You'll have to keep us, I feel like you're tuned

1:21:14

into this regulatory response to crypto

1:21:16

stories. You're gonna have to keep us posted on

1:21:19

if one of these things exists and

1:21:23

whose pension suddenly is going towards it. So

1:21:27

bleak. But

1:21:30

you know, it's not bleak and we should end on. Yeah. I

1:21:33

think we can put a pin in it with this one. This

1:21:35

is just a little one. This is a petty, like petty

1:21:38

alarm bell warning at

1:21:40

a large Bored Ape NFT event in Hong

1:21:42

Kong last Saturday. Attendees

1:21:45

had their eyeballs burned when ultraviolet

1:21:48

lights typically used for tanning were used

1:21:50

as part of an onscreen display. I'm

1:21:54

not happy those people got hurt. That part's not funny.

1:21:57

No. That part isn't funny. But

1:22:01

some part of it's, I'm not sure which part of it's funny,

1:22:03

but I'm laughing, because NFTs gave people snow

1:22:05

blindness. Tell me that's not fucking funny. It

1:22:08

just is. I think that

1:22:11

the fact that Bored Ape Yacht Club managed

1:22:13

to still have a conference today, like

1:22:16

I'm not sure what's going on with NFT values.

1:22:18

I'm sure it's largely manufactured

1:22:21

by people just refusing to sell, but the,

1:22:25

yeah, I just, I'm not sure

1:22:27

what that conference is. Is it just a party? Like, I've

1:22:30

seen people walking around with, like, Bored Ape Yacht Club

1:22:32

hoodies and stuff on. Is it just a brand now?

1:22:34

I feel like it is. Yeah, I think it was

1:22:36

a speculative commodity whose

1:22:39

value was based on a story about

1:22:41

how valuable the brand is going to be. And now the value

1:22:44

of the commodity crashed and all

1:22:46

that's left is the brand and some people kind

1:22:48

of holding on for dear life. Yeah, yeah, yeah.

1:22:50

For lack of a better way of putting it like I think

1:22:53

people still own them and I think people

1:22:55

are still excited about it. And honestly, like if

1:22:58

its value isn't up and it's just a

1:23:00

media IP play of like people being

1:23:02

like, we like these apes and making cartoons

1:23:04

and comic books about them. I don't begrudge

1:23:07

that even a little.

1:23:10

It's the second they all assemble

1:23:12

and say this is gonna be worth more than Disney

1:23:15

because XY and I'm like, this

1:23:17

is just self-delusion. I'm not happy

1:23:19

that you gave

1:23:21

your eyes a sunburn staring at

1:23:23

the apes writ large 300 feet

1:23:25

tall in front of you. I'm

1:23:28

just saying it's quite the fucking picture. It's quite

1:23:30

the scene. Did you

1:23:32

have to own an ape to go to this conference?

1:23:34

That's a great question.

1:23:37

I wonder.

1:23:38

ApeFest, a three-day annual meetup of people

1:23:40

who own Bored Apes, which

1:23:43

still sell for tens of thousands of dollars. Yeah.

1:23:46

Oh, amid the 2021 NFT craze.

1:23:50

Got it. Yeah. I don't

1:23:52

know. I think you had to own a Bored Ape NFT.

1:23:54

I feel

1:23:57

like I feel like that's just like Like

1:24:00

if I was a grifter, I feel like that's the conference.

1:24:03

I want to go to. Like, I feel

1:24:05

like a room full of people that paid $75,000 for

1:24:09

an image that was digitally

1:24:11

generated of a monkey Yeah,

1:24:15

it would certainly be those are what

1:24:17

we call a highly influenceable group, hot

1:24:19

leads, hot leads, for

1:24:22

a grifter It's

1:24:24

probably a sick event. It's in Hong Kong Yeah,

1:24:26

I'll give them credit, the graphic design on the website

1:24:29

is pretty, it's pretty banging, and

1:24:31

it looks good. Yeah Totally.

1:24:34

Yeah. Yeah, it just it

1:24:36

feels like a metaphor I haven't

1:24:38

taken the time because I just got back from vacation

1:24:40

of articulating exactly how that

1:24:42

metaphor works, but staring

1:24:45

into a glowing symbol

1:24:48

that inadvertently hurts you, there's something

1:24:50

to that. I don't know what it is. Okay.

1:24:55

Okay. FTX, Roblox,

1:24:58

and European spyware. Humane

1:25:00

AI Pin. Mission Impossible is

1:25:03

influencing domestic policy. Terminator.

1:25:05

We covered a lot.

1:25:07

We got through a lot. I'm

1:25:10

happy to be back. Hopefully you guys made it this

1:25:12

far, and if

1:25:14

you didn't, totally understand. If you didn't,

1:25:17

I get it. Yeah, that's

1:25:19

real. That's really valid

1:25:21

I think that's everything. I think that's another one in the

1:25:23

bucket. I Think

1:25:26

we'll catch you in the next one Godspeed

1:25:28

I Think I need to make Godspeed

1:25:30

my sign off. Godspeed. Yeah, Godspeed.

1:25:33

You said Godspeed. I was like, oh, that's

1:25:35

got that's got a nice Godspeed Godspeed

1:25:38

safe travels. I could go with safe travels

1:25:41

and you could do Godspeed That's

1:25:44

so intense, like it's something

1:25:47

about to happen, like, are we good? This

1:25:49

podcast left me with an ominous taste

1:25:52

in my mouth. Thanks

1:26:01

for watching!
