No One Talks to the Faucet Anymore - AI Investing Trends, Taylor Swift Deepfakes

Released Monday, 29th January 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

It's time for TWiT, This

0:02

Week in Tech: It's the return of

0:04

Stacey Higginbotham for a one-week-only

0:06

special appearance. We'll also talk to

0:08

Ben Parr, our AI expert,

0:10

and my good friend Alan Malventano,

0:13

a former host of This

0:15

Week in Computer Hardware. He used to

0:17

be under the sea; now he's

0:19

on solid state as an SSD

0:21

expert at Phison, and we'll talk

0:23

about, of course, SSD

0:25

technology, new memory technologies, AI, AI

0:27

vandalism. I say it should be

0:29

illegal. And a whole lot

0:32

more. It's all coming up next.

0:34

Yes, the Vision Pro, too.

0:36

Podcasts you love, from

0:38

people you trust. This

0:41

is TWiT. This

0:49

is TWiT, This Week in

0:51

Tech, episode 964, recorded

0:54

Sunday, January 28th,

0:56

2024: No

0:58

One Talks to the Faucet

1:00

Anymore. This

1:02

Week in Tech is brought to

1:04

you by Gusto. Running a small

1:06

business is just plain hard. Gusto

1:08

lets you focus on the joy

1:10

of running your business with its

1:12

easy-to-use payroll software, accessible

1:15

online from anywhere. Gusto is home

1:17

to more than 300,000 businesses, and

1:19

ninety percent of its customers say a

1:21

switch to Gusto was easy. You

1:23

get unlimited payroll for one monthly

1:25

price, no hidden fees, multiple

1:27

schedules and rates, direct deposit, and

1:29

checks you can print yourself. Plus,

1:32

Gusto integrates with your favorite tools

1:34

to make life easier, like

1:36

QuickBooks, Xero, Google, and more. File

1:38

and pay all federal, state, and

1:40

local payroll taxes in all fifty

1:42

states. You know, three out of

1:44

four customers say running payroll with

1:47

Gusto takes ten minutes or less.

1:50

Gusto cares about the small business owners

1:52

they work with, and they know money can be

1:54

tight right now, so you'll get three months

1:56

free when you run your first

1:59

payroll. Go to gusto.com/tech and start

2:01

setting up your business today. TWiT listeners, you'll

2:03

get three months free once you run your

2:05

first payroll. Gusto: g

2:07

-u-s-t-o dot com slash tech.

2:17

It's time for TWiT, This Week in Tech, the show where we cover

2:19

the week's tech news. This is

2:21

going to be like old home week. This

2:23

is fun for me Say

2:25

hello to Stacy Higginbotham. We miss her so much

2:27

on this week in Google She is a policy

2:30

fellow at Consumer Reports still writes

2:32

a little bit does a lot of

2:34

things But it's so nice to have you back

2:36

in front of our microphones. Hi Stacy Hello,

2:39

it's good to be here. Everything going well

2:42

there in beautiful Washington State? It

2:45

is, it's awesome this warm winter. I know

2:48

it's like the end of the world, but I

2:50

love it anyway. Yeah. Has Andrew got you playing

2:52

pickleball yet? Or... the silence spoke a few words.

3:02

Lisa and I have been playing the old

3:04

pickleball. It's kind of fun. Oh See

3:07

I thought you'd want to know how many connected

3:09

devices. I've taken out of my house Have you

3:11

have you been able to remove them now Stacy

3:13

of course had StaceyOn

3:15

IoT.com and the IoT Podcast and all that

3:18

stuff and just kind of let that all

3:20

go, and as a result you had

3:22

to have all this IoT crap, you had to go to CES,

3:24

you had to do all sorts of stuff you don't have to

3:26

do anymore. I did not go to

3:28

CES, yes. Oh! Well, there you go.

3:30

Good. I'll ask you about it. Also

3:32

here, Ben Parr, a good friend, longtime

3:35

friend of the show, author of The

3:37

AI Analyst. He, eight

3:39

years ago, founded an AI startup,

3:42

Octane AI, right

3:44

in the nick of time, and writes

3:46

about AI for The Information. Hi, Ben. Hello.

3:49

why do you have a little reddit

3:51

guy behind you is that just oh

3:53

that is actually the mascot for my

3:57

company, Octane AI. His name's Octi, actually.

3:59

hold on I've got the plushy version.

4:01

Do you want a plushy version? I will. Oh, I would

4:03

love that He does look a little

4:05

bit like the reddit guy, but oh, yeah,

4:08

and so his name's Octi. He was invented in

4:10

2016. He's, I guess, a round,

4:12

uh, floating... very popular with kids,

4:14

a floating robot. A robot, yeah,

4:17

they all look alike. They all look pretty much

4:19

the same right if they don't look like it

4:21

looks like Eva from WALL-E. That's right Eva

4:26

They don't yeah, you don't want them to look like you

4:29

know one of those dogs from Boston Dynamics, so

4:31

I guess it's fair That's

4:33

fair Hello, Ben. Welcome.

4:35

Good to see you again And

4:37

who else is here formerly host of

4:39

this week in computer hardware? SSD

4:42

expert our SSD expert forever. He

4:45

is now SSD technologist at Phison.

4:48

What? I don't

4:50

know the name Phison, should I, Alan Malventano? Good,

4:53

because their controller technology is in an

4:55

awful lot of SSDs. Oh, so

4:58

they're infrastructure, behind-

5:01

the-scenes kind of... Right. I've

5:03

been I've been working my way

5:05

further and further back in the industry And

5:09

you see behind him the remnants of all the

5:11

things remnants of all

5:13

the things yes You

5:16

know I find problems at one company, and then

5:18

it's like oh darn We have to work around

5:21

the problems that come from the controller, okay? Well,

5:23

let me just go work for a controller company

5:25

then I'll just fix other problems there Before

5:28

they even get to the SSD there was a I

5:30

actually saw a story this weekend I thought you know

5:32

I should ask Alan about it. It's not exactly

5:36

SSDs, but it's memory technology Samsung

5:39

has introduced this new LPCAMM memory

5:43

technology that is in theory gonna

5:46

replace something that's already been replaced, SO-DIMMs.

5:50

But they say it's faster than LPDDR5. Yeah. Yeah,

5:55

it's a wider bus. It's just You

5:57

know a lot of dims technologies

6:00

in general there's a work around where you have

6:02

to you know that the bits have to go

6:04

across a fixed number of lanes right and if

6:06

you can make that that bus

6:09

more efficient wider any

6:12

of those things you know especially

6:14

on mobile leads to better

6:17

performance and battery life.
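A rough worked example of why that matters (illustrative numbers, not Samsung's spec sheet): peak bandwidth is bus width times transfer rate, so

    128 bits x 7,500 MT/s = 960,000 Mbit/s = 120 GB/s

where a single 64-bit DIMM-style channel at the same transfer rate would deliver half that.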

6:19

Of course, a number of companies, including Apple, have started putting the RAM on

6:21

the package which I

6:23

think eliminates that bus you

6:26

have a unified memory eliminates that bus overhead

6:29

right? It

6:31

does well the bus is just way faster

6:33

yeah you're right in that case.

6:35

Right but there are other cons

6:37

to go with that sometimes if you're stacking directly

6:39

on top now it's harder to get heat out

6:41

of the heat producing thing underneath the memory. So

6:43

does CAMM have a future, this Compression Attached

6:49

Memory Module? We'll have

6:51

to see this is one of those where it's just been announced

6:53

and it's not you know it's not it's not

6:55

one of those things where hey this is

6:57

out now and it's all of a sudden ubiquitous

6:59

everybody's making a thing with it so yes at

7:01

the standard is the thing people could use we

7:05

have to wait and see. Okay let's

7:07

see I see these things and I think boy I wish

7:10

Alan was here so now that you were here I asked

7:12

you. Can I ask Alan later about risk

7:16

five adoption and yeah sure let's ask

7:18

him now risk five adoption. Stacy when

7:21

we were on Twitter we talked a

7:23

lot about this open source architecture to

7:25

replace x86 or

7:27

I guess even ARM: RISC, letter

7:30

V, for five. But,

7:33

um, Intel's into it. When you were at Intel they

7:35

were doing it, right? Well,

7:37

I don't I didn't get that much in the

7:39

weeds when I was at Intel to even be

7:41

able to speak on it right. I was more

7:44

worried with evaluating the performance of the platform and

7:46

the SSDs versus comp I wasn't I didn't usually

7:48

have to go so far back in the chain

7:50

where I was talking ASICs. But come on,

7:53

you must still have an opinion on RISC-V.

7:55

Well, it is, like, Western Digital was one

7:57

of the big adopters of it, for, like,

8:00

early adopters. Oh, yeah. Anyway, we don't have

8:02

to talk about this. This Week in RISC

8:05

Adoption. This week in...

8:09

So, dynamically,

8:11

because, people, they're talking about RISC-

8:14

V, everybody come on over. No,

8:17

I'm probably not Let's say listen as far

8:19

as SSD things go anything that can move

8:21

the bits faster and more power-efficiently and

8:23

as good and accomplish all of the tasks

8:26

that the SSD controller needs to do is

8:28

a viable solution for that Right. It just

8:30

depends on who does it then

8:32

if they choose to or they know and what's

8:34

the cost benefit for it all right, I know

8:37

what everybody wants us to talk about or Conversely

8:40

doesn't want us to talk about which is

8:42

that Friday February 2nd? People

8:45

will start getting the vision pro

8:47

headsets from Apple and you're

8:49

gonna start seeing YouTube actually sooner than that

8:51

Probably because because I'm sure the people who

8:53

get loaners from Apple will be

8:55

off embargo probably Wednesday So you'll start seeing

8:58

Wednesday the reviews and you'll see

9:00

the thumbnails on on YouTube

9:02

going. Yeah or Ben

9:06

you ordered one you must have ordered one. I

9:09

did order one. I got up at, was

9:12

it like one, two, to order? Oh,

9:14

to make sure to order one. 5 a.m.

9:16

Pacific time, and whatever time it was,

9:18

it felt like one or two. Did you

9:20

know here's the critical question? Did you order

9:23

it to go pick it up in the

9:25

store so you can get that 25 minute

9:27

pitch on how great this thing is? Or

9:29

did you like Alan like Alex Lindsay and

9:32

like Jason Snell say no, yeah, just send it to

9:34

me. I'll figure it out. I Did

9:37

the store version? Oh good. It's gonna be a

9:39

hair faster. And you know what? I got a

9:41

report I gotta like do a video or something

9:43

put it on tick tock. Yeah. Yeah, Mike is

9:45

going in there I wonder if they'll let us

9:47

you know do do a little video of him,

9:49

you know getting the tour I want to see

9:51

Ben Parr's spooky eyes on

9:53

the front of it. That's gonna be really... Next time I

9:55

go into it. I wore nothing, but was it please? Oh

9:58

Do the entire time just see if I can.

10:01

Did you send them your glasses prescription? Are you gonna

10:03

get those Zeiss lenses? I don't

10:05

so I can wait I can see just

10:07

fine without the glasses they're just like I

10:09

like to make you look smarter is that it? I literally

10:12

just got my eye exam like an hour

10:14

and a half ago funny enough and

10:17

like it's it's enough it's like where

10:19

it's nice for me to drive I

10:21

could theoretically drive without. Okay. The world

10:23

seems a little clearer but close-up stuff

10:25

like it's actually better for me not

10:27

to wear glasses don't need it so.

10:30

Well you have one advantage you can lend your

10:32

vision pro to other people. It

10:35

won't be so. My fiance has got to go and play

10:37

around and do stuff you know we got to watch I

10:40

don't know. She's a playwright so I imagine

10:42

she'd be interested in in how

10:45

vision pro might have impact theater. I

10:49

wonder if you could I have to see if like they

10:51

have like easy switching accounts because then you could like have

10:53

your own works. They have a guest mode I think there

10:55

is a guest mode yeah. Because

10:57

I would be good. Is

10:59

your fiance interested at all? I have literally

11:02

asked all the normal people that I talked

11:04

to and they are just like what

11:06

the hell I have zero interest in

11:08

this and all of my efforts to

11:11

drum up any sort of excitement so

11:13

I'm just curious like here do

11:15

the real people. If it had risk five in it

11:18

maybe. Just call it the swim

11:20

five. Here's

11:23

what I think Stacy I think that no

11:26

one outside of developers has real interest and

11:28

the goal for Apple this year is not

11:30

to get mass adoption. It's like get it

11:32

in the hands of developers who will get

11:34

excited and build some stuff for it and

11:36

then they'll go more mass market and cheaper

11:38

afterwards but in the interim you know if

11:40

your friend has it you're gonna be like

11:42

sure I'll try it and most likely be

11:44

like oh my god this is awesome maybe

11:46

and then they hear oh it's a thousand

11:48

dollars cheaper next year. Oh I think

11:51

I will go and get it. That's my guess

11:53

with how Apple is rolling this out. It needs

11:55

a couple years to get more developers on board but

11:57

if they drop the price which they'll be able to.

11:59

to do in a few years, they

12:01

can get more adoption. In

12:04

line with what Ben was just saying

12:06

there, the thing that I think is missing

12:08

from it so far, that's not to say that it's

12:11

not great in the experience, appears to be

12:13

pretty impressive, but it's missing the

12:15

killer app, right? It

12:17

doesn't have... Not just the killer app, but

12:19

the killer app category. What

12:22

exactly do you need this for? Right.

12:26

Yeah. And so I think in order,

12:28

they're in this chicken and the egg proposition

12:31

right now, where they need to get the hardware into

12:33

way more hands, so that there's enough

12:36

developers out there to try to do something

12:38

cool with it that really sets

12:40

it off. And then don't be surprised if

12:42

somebody does something really impressive with the technology,

12:44

and then Apple just immediately acquires that thing

12:46

and then spends on it for a year

12:48

and makes it way better,

12:51

and then ships it along

12:54

with whatever the next generation. Any guesses as

12:56

to at least the area? Will

12:58

it be a game? Will it be watching videos?

13:00

Will it be that spatial video that

13:03

they're pushing where you could take a 3D

13:05

video with your

13:07

iPhone and then it's like you're there in

13:10

the Vision Pro? Will it be a... They're

13:12

also talking productivity, which is bizarre to

13:15

me. I can't imagine trying

13:17

to do an Excel spreadsheet on that thing. What

13:20

will it be? What category? I

13:23

think a lot of

13:25

those things you just mentioned were already shown

13:27

and it remains to be seen as

13:31

how it plays out whether or not those would be

13:33

deemed a killer app. You can already watch a movie.

13:35

That's already a capability of it. It's not something they

13:37

added later. Although both YouTube

13:40

and Netflix have decided not to

13:42

put out a Vision Pro app,

13:44

probably in a fit of pique

13:48

about the app store more than anything

13:50

else. It makes sense from their perspective

13:52

though. It's going to

13:54

be so few people in the beginning. They can

13:56

always launch one later, a year from now, two

13:58

years from now. and they

14:00

have a bunch of other stuff. It's what I

14:03

get the decision to go and do that. It's

14:06

probably a game if I had to give the first version of

14:08

something like people are always surprised by

14:10

how much the like metas you know quest

14:13

sells they sell a lot they're very popular

14:15

at Christmas time and games are

14:18

probably still the biggest thing with workouts

14:20

and a couple other things still

14:22

no idea what the like killer app is going

14:24

to be that's just going to take some time.

14:28

Okay so I think there's niche

14:30

gaming opportunities as a killer app. I think

14:32

working out that's a popular idea for people.

14:34

I don't know why because strapping something that

14:36

weighs that much to my face and sweating

14:38

in it is like the least the

14:42

least exciting option for me. I played

14:44

Beat Saber on my Oculus Pro, my Quest

14:46

Pro, for about half an hour, and

14:48

yeah it's you feel pretty grody

14:50

at the end of the

14:52

day. I look at it and I think we're

14:55

going to it feels like

14:57

a flamingo in an egg kind of situation or

14:59

some bird that nobody really wants to eat. A

15:02

dodo in an egg perhaps? No I why

15:05

did we kill the dodos did we eat them I don't know. This

15:08

is the whole history there. Yeah no I am

15:11

like I don't actually remember why the dodos died

15:13

but um so I look at this and I'm

15:15

like well so how about

15:17

this? I feel like society will have to get really

15:19

terrible. Passenger passenger and the no no because they

15:21

had value like we wanted

15:24

passenger pigeons. I'm not sure we

15:26

want this yeah we thought passenger

15:28

pigeons were yummy yeah um one

15:30

or but I feel like the

15:34

the opportunity that this offers so

15:37

far and will probably

15:39

offer for a couple more years is

15:41

a true escape from the world we

15:43

live in and I don't know if the

15:46

world is that bad for that many people

15:49

just yet. I think it could

15:51

get there. It's interesting to say

15:53

that's what would drive adoption because

15:55

all of the VR sci-fi stories

15:58

those VR helmets, whether it's Neuromancer

16:00

or Ready Player One, are used as

16:02

an escape from a dystopian future. Like

16:05

the world's terrible and

16:08

you want to live in the metaverse. And

16:11

if you're like a Gen Z kid who's like

16:13

stuck in an apartment with like eight people

16:15

and you work all that, like that feels

16:17

like a couple years from now that could be

16:19

good for you. Like, but it's still

16:21

super expensive and I don't see that changing.

16:23

Pretty dystopian view though. And

16:26

I agree with you because one of the things I

16:28

don't like about it is it isolates you even if

16:30

you're watching a movie, you're watching it all by yourself.

16:32

We have a pretty good experience with

16:34

a nice TV and a good surround

16:37

sound and a couch where we can,

16:39

you know, communally enjoy something. But

16:41

that's not the VR experience. It's a

16:43

very solitary thing. But you make

16:45

a good point. If you have eight roommates, maybe that's

16:48

not such a bad thing to be solitary. Or you're

16:50

on an airplane. I don't, do you

16:52

think you're going to see, are you going to get on a plane next

16:54

week and there'll be people wearing Vision Pros

16:56

up and down the aisles? I mean, someone's going to do

16:58

it for the fun, for the Seattle-to-San Francisco flight. Sure.

17:02

Yeah. Okay. Look,

17:04

I'll do it for the novelty factor. So someone asked me a question

17:07

and also just to report on it. Not, but am I going to

17:09

bring a $3,500 computer device

17:11

on trips with me? No, not

17:13

normally. But I

17:15

did have an idea through this conversation, uh,

17:18

Vision Pro for toddlers. You put on the

17:20

thing, you don't have to worry about your kids for six

17:22

hours. Yeah, you don't have to think about it. That's

17:26

like the high tech pillowcase. You just,

17:29

you put them in it and that's it. Now

17:31

you, you, uh, and I hope you can translate

17:33

this because you put the Ming-Chi Kuo article

17:33

in on the Vision Pro's first

17:36

weekend preorder, but it's in Chinese.

17:41

So he said what,

17:43

180,000? I

17:45

can't keep scrolling down. They have the English. Oh,

17:47

there's the English. Okay. Based,

17:50

thank you. Based on preorder

17:52

inventory and shipping time Ming Chi

17:54

Kuo, who's usually fairly accurate, estimates

17:57

that Apple sold 160 to 180,000. Vision

18:00

Pros during the first pre-order

18:02

weekend. The date did slip out to

18:04

a month. I don't

18:07

know where it is today. He

18:09

says it's about five to seven weeks within hours.

18:13

They say Apple says I think that they can make

18:15

as many as half a million a year because

18:17

the limitation is actually those

18:19

Sony screens inside it. And

18:22

Sony has said we can't make that many. So

18:24

half a million a year. It's not intended to be a

18:26

big seller though. This is Apple releasing

18:29

something for a very niche

18:31

market they hope mostly developers

18:33

and then using some very

18:36

high end Apple fans

18:38

as beta testers basically for

18:40

a whole new concept of

18:42

computing. That's what I see. And

18:45

Jason's now explained this on MacBreak Weekly. He said

18:47

if you're Apple and

18:50

you know you need the next best big thing right after

18:52

the iPhone because the iPhone's not going to last forever

18:54

so you know that you've got to find something. One

18:57

of the things you're going to try is

18:59

this kind of maybe AR glasses idea but

19:01

you don't have the technologies yet to release

19:03

that. So in order to

19:05

be ready when it comes ten years from now

19:07

maybe you've got to start trying stuff out now

19:10

knowing it's a small market and it's an expensive

19:12

market to be in and it's going to be

19:14

a money loser for years to come. But

19:17

Apple can afford it and this is what Apple needs to do. You

19:19

agree? And

19:22

that's what Google thought when they released the

19:25

Google Glass way back in the day. We've

19:27

all seen... And Microsoft HoloLens. Yeah,

19:29

we've all seen this before and I wonder if

19:31

2024 given all the craziness around the election,

19:36

given interest rates, given all of this, if...

19:40

and Apple has tons of money. Will

19:42

investors let them play this out to

19:44

the extent they need to? And that's

19:46

a really open question. They

19:49

get a lot of leeway. It is an open question

19:51

but they're going to also let them do a car

19:53

in some amount of time. I

19:57

feel like there is actually a lot of interest

19:59

in smart glasses. As a matter of

20:01

fact, the Ray-Bans are actually more popular than people

20:03

think; people do like those. Mike Elgan was

20:05

on a couple of months ago and his

20:07

were, I mean, what, yea, this close

20:10

to mine, and they look like the average

20:12

sunglasses you have. Good speakers in the temple

20:14

piece. And a decent

20:16

battery life, I think he said four or five

20:18

hours, right? And it's got a camera built

20:20

in. They just added an AI feature where

20:22

you can take a picture of something and the

20:24

glasses will then tell you what it is,

20:26

or do some sort of Google Lens-

20:28

style search on it.

20:31

Ah, I want them, but those... Meta,

20:34

Ray-Ban, make a wider version, because even your

20:37

wide version sits a hair too tight on

20:39

my face. Or you're like me, you've got

20:41

a fat head. I, yeah, I've got a fat head,

20:43

yeah. But the idea, I

20:45

think, is they're made to fit the

20:47

exact right, like, you know, physically...

20:49

Ten years down the road, maybe this technology

20:51

is good enough to get into, like,

20:53

a glasses format, which I think people

20:55

are interested in at a people-wide

20:57

scale. But there's a lot holding it

20:59

back. Stacey, there's battery life, right? There's,

21:02

you want the compute power, if you really do want it to do

21:04

more than these Meta glasses do. Is...

21:08

there's tons, I mean, there's tons of

21:10

hardware development. There's, I mean, there's quality

21:12

internet connectivity that, believe it or not, still

21:14

isn't there for some of the glasses, like, we

21:16

think it has that, it

21:18

sort of, eh. And, like, there's a lot that needs to

21:21

be solved here. And I get why companies are

21:23

like, I guess we must invest in face computing,

21:25

because that's the next obvious option.

21:28

By the way, I am calling

21:30

them face computers, themselves. Apple has

21:32

really, they have gone to

21:34

great pains to make us say

21:36

"spatial computing." No, it's face computing,

21:38

thank you, Stacey. Spatial computing,

21:40

to me, like, we want to get

21:42

there. That's, like, the smart dust, that's,

21:45

that's, that's the sci-fi. You know,

21:47

soon enough it shows us a screen and you

21:49

can do that with AR glasses. But

21:52

it's face computing. Face it, it's

21:54

a face computer, Apple.

21:58

All right, anyway. The

22:00

real question is, for instance, 5G, we

22:03

have the technology to put ubiquitous

22:05

high-speed internet pretty

22:07

much everywhere. Maybe

22:10

it'll have to be improved a little bit, maybe a Wi-Fi 7

22:12

is going to help. I'm sure that's part of the point of

22:14

Wi-Fi 7. You

22:18

can see them in the future, but battery

22:20

technology has not been leaping forward.

22:22

Right now, you have to wear a separate battery

22:24

pack with a wire to use

22:26

the Vision Pro for any length of time. Even then,

22:28

it doesn't go for very long. Well,

22:31

both 5G and Wi-Fi are hugely

22:33

battery intensive. Yes. Right.

22:37

So, yeah. I mean, look. So,

22:39

there's technology that I think that they don't even exist right

22:41

now. It's

22:44

10, yeah, like the people working on

22:47

the battery stuff, but decade

22:49

away, maybe longer? Do you

22:51

think it's a decade? Yeah. I've read

22:53

up some stuff. There's some interesting stuff being done

22:55

and interesting approaches. I

22:58

have a hard time predicting that one because

23:00

that one ends up also on the physical

23:02

limitations of our ability

23:04

to power things. I hope it's a decade away.

23:07

I think Apple probably thinks it's something like that,

23:09

which is why they would come up with something

23:11

now, the bigger thing, because eventually, the way they

23:13

like super battery, it's in the thing, but it's

23:15

still years. But you do raise a good point,

23:17

Stacy. How long will stakeholders put up with this,

23:20

especially if it's a big drain on Apple's

23:23

resources? Right now, Apple shareholders

23:25

are pretty darn happy with

23:27

Apple. They're not exactly leaving

23:29

the stock in droves. We

23:33

have a story later on about layoffs and

23:35

talking about how shareholders are thrilled about that. So,

23:37

I look at, I mean, if

23:40

you're really considering investors

23:42

and shareholders in these sorts of

23:44

conversations and calculations, they're

23:47

not long-term thinkers. Yeah, that's

23:49

right. And if they're spooked right

23:51

now because of everything,

23:54

then that's just an

23:56

open question. It's

23:58

a great point. Meta's board and

24:00

shareholders, that, you

24:02

know, put up with what Meta put

24:04

into its, its VR efforts,

24:07

more than 10 billion a year,

24:09

um, which is not a success,

24:11

yeah, and it's not paid off, although,

24:14

they, yeah, yeah,

24:17

they pivoted to AI. In fact, that's, that's the

24:19

other question maybe apples making

24:21

a mistake going all in on

24:23

AR should they be going in

24:25

on AI harder this is such

24:27

a dumb question, sorry. Thank you!

24:29

They were wondering

24:34

how long it would take, and I

24:36

think that took about 12 minutes. So thank you. It

24:40

doesn't bother me at all

24:42

it is a dumb question

24:45

tell me why because look

24:48

for any sort of AR experience I mean

24:50

if we if we think about the

24:53

hardware that we need to have for

24:55

shrinking the hardware down to give us

24:57

like true computing on our face or

24:59

whatever spatially they're going to have to

25:02

be new interfaces the best interfaces that

25:04

we can possibly have are going to be

25:07

if not driven by like

25:11

AI like learning today

25:13

I I'm losing my words here

25:15

but AI is just a crucial part it's like

25:17

saying I want to build a computer today without

25:19

considering the latest wireless yeah you need a

25:21

part of it yeah you need it or it's

25:24

not nearly as useful but if

25:26

you can use the glasses or the spectacles or

25:28

the nerd helmet or the whatever

25:30

you call it if you can use it to look at

25:32

at the world and get in from get you

25:35

know analysis and information back that's

25:37

much more useful than if you're just

25:39

you know playing pong so

25:42

I agree with you yeah it is a lot of I

25:44

mean look there's a lot of reports they're working on

25:46

AI for sure I Think things

25:48

internally. apples just always later. But they

25:51

do it where it's much more perfected

25:53

than others. So it's I Keep on

25:55

saying over and over again if there's

25:57

just one company where I'll know. Never

26:00

really bad against them at Apple like people are

26:02

like the watch mass the watches pretty dang popular

26:04

place they don't agree with. Op was a was

26:06

a form factor we already were wearing it was

26:09

something we're used to wearing and I admit I

26:11

was in had been idle on the was my

26:13

first is a i got a long been ah

26:15

you're right the of I'm on was not okay

26:17

my with why when the I phone came out

26:20

we were already hungry for internet on the I

26:22

was a the iso of it's the was say

26:24

it like that was actually really slow adoption care

26:26

for the first two years of the watch nobody

26:28

actually linked it was a. Sealed product.

26:32

At the Us on. Our

26:34

know that ya that like I remember

26:36

driving around trying to like search for

26:38

things on my palm pilot and as

26:41

as new had are like Nokia phone

26:43

early stages democratized at so I think

26:45

the demand and hunger was there for

26:47

that. We all had little numbers Vipers

26:49

from her blackberry Blueberries Andreas as a

26:52

blueberry there's a difference we i blackberries

26:54

by them. Sizes.

26:56

Bit of we all that also had google maps

26:58

so like that's kind of a killer app

27:00

that algiers thousand the death your right so so.

27:03

Killer app for the I phone was

27:05

google maps that came with google dazzling

27:08

Ah but then I think also safari.

27:10

The. Fact that you could use a tiny

27:13

little screen in a browser and actually

27:15

browse desktop. Web. Sites was a big

27:17

deal. him because he could tap at that

27:19

would certainly. Look. He

27:21

said and. Like. The personalization that

27:24

came with something knowing your exact

27:26

location there was once rose adoption

27:28

like remember we were so excited

27:30

or know how to bar and

27:32

and l it said soon as

27:34

I said I am kids. Mobile

27:37

So well. Is it so so Mobile?

27:39

Local Plumber our we risk. We.

27:42

Sell high on that. I'm. In

27:44

that's. That. That

27:47

was kind of the benefit. Of the

27:49

Iso and or know it were

27:51

high on for. The are.

27:53

In A I. Look,

27:55

it's. It's. I talked to like all com they're

27:58

going to tell me about Ai for like in. Modulation

28:00

for like delivering five Ci to the

28:02

phone right? That's it. That's important. great

28:04

But that's one good things about Ai

28:07

is it is is it is like

28:09

computing in general is a very wide

28:11

range of app which infrastructure. Or.

28:13

Am I would can be, but it

28:15

can also be. I use an Ai

28:17

expert system with coding and that's really

28:19

amazing when there are some definite uses.

28:22

For. Ai right now I have a box whereas

28:24

I don't see those. For

28:27

oh yeah, no, I'm arguing that Ai as

28:29

infrastructure. Which is why I think it'll be

28:31

necessary I will way I did it. ever

28:33

name as the Internet in the sense of

28:35

internet is infrastructure for our lives, eyes infrastructure

28:38

and whether you see it or you don't

28:40

ride bikes you know there's a I algorithms

28:42

handling of unjustified on a resume and then

28:44

when you look up on Tic Toc and

28:46

then when you go and drive your car

28:48

and a as a slick abroad like civil

28:50

it is as a broad term that applies

28:52

to a lot of things and people need

28:54

disagree and I'm wearing my top beyond entered

28:56

of a Ice which is it's own little

28:58

sub categories. For jazz like these apps

29:00

have the right it's infrastructure, absolutely owners

29:03

and have been around and it's been

29:05

around for longer than you might realize.

29:07

I I I came up with a

29:09

report on a I had Bastard Trends

29:11

ah it's have been part of karma

29:13

had to plug it or hundred and

29:16

ages Yes and out. One of things

29:18

I found when I was doing it

29:20

was I did a report in college.

29:22

Ah and ah A I had two

29:25

thousand five, two thousand and six. While

29:27

I found like those lights are you

29:29

know when ah where the term A

29:31

I was really first point it was

29:33

a conference into up as in the

29:36

nineteen like Safeties was like oh yeah

29:38

this has been around for a loner

29:40

John Mccarthy We were both people have

29:42

my age have been through more than

29:44

one ai winter as well. You know

29:47

we've watched the Ib overhyped. You.

29:49

Know and die of miserable death

29:51

more than once I think two

29:53

or three I winners of concerts

29:55

so. ah but i don't

29:57

think this is the same i honestly don't i

29:59

am i completely turned around on

30:01

AI. I've become much more bullish on it since

30:03

you last joined us, Stacy, because

30:06

I've been able to use it in so many interesting and powerful

30:09

ways. And I see with

30:11

one of your slides in the deck you show a slide

30:13

110, you show

30:15

an image AI generated image.

30:17

And absolutely, I mean for image

30:20

generation alone AI has been fascinating.

30:22

I've become, and I don't know

30:25

if you're on the same train

30:28

as I am, Ben, but I

30:30

have become so bullish on AI that

30:32

I'm, and much to my

30:35

the dismay of many of my co-creators,

30:38

I'm kind of saying just let

30:40

AI have everything. Don't limit AI

30:42

to uncopyrighted or free material. And actually

30:45

we're starting to see that a lot

30:47

of news organizations are blocking AI scrapers.

30:50

I think that training is really

30:52

important and I think you should, I

30:54

think AI should have access to everything. It's

30:56

not stealing your information, it's learning

30:59

something, and I think it's going to be so

31:01

valuable. I don't

31:03

want to see any limitations on it. I

31:06

see the debate screaming. Yeah, you can

31:08

see, oh yeah, you know how much

31:10

people hate that one, especially creators hate

31:13

that when I say that. Well,

31:15

with valid justification based on

31:18

some, you know, when

31:20

you see an AI image generator spit out

31:23

a thing that still has the watermark. But

31:25

it's not. It's a little bit like the

31:27

watermark, but it isn't, it's not Getty's watermark.

31:30

You can tell that they're, that it ingested

31:32

some Getty images, which is put, which by

31:34

the way, Getty has put online. Right.

31:38

I just think that there's, here's

31:40

why I say this. Look, I understand if

31:42

you're a copyright holder, you're terrified, blah blah

31:44

blah, but there is so

31:46

much value societally to be gained from an

31:48

AI that's smart, and there's so much to

31:50

be lost from an AI that's hobbled. You

31:52

know, one of the stories this

31:54

week is that AI has

31:57

not been allowed to scrape things like the Washington

31:59

Post and The New York Times. But right-wing

32:02

news media has been welcoming

32:04

AI scraping. And so

32:06

what you're going to get is AIs that are

32:08

trained on right-wing media but

32:10

not left-wing media. And I

32:12

don't think that's good for anybody. And so

32:14

I think it's a mistake to say what AI can read and

32:16

not read. The New

32:18

York Times lawsuit saying, oh, you're going to read the New

32:21

York Times in an AI in

32:23

a chat GPT instead of on our page is

32:26

obvious nonsense. No one's going to do

32:28

that. They're just trying to

32:30

extract some money out of them. As

32:33

a creator, I don't think that's wrong.

32:35

So first up, as someone who's created

32:37

content, if you

32:40

have created your content with the idea that

32:42

it is for people and reading for people,

32:44

right, and for a certain use case, like

32:47

as a journalist, it's a

32:49

service to my readers or whatever, I have

32:51

an audience in mind. And I think

32:53

that's an important thing to

32:56

like, you can't go back and change that

32:58

contract, which is kind of what AI is.

33:00

They're like, oh, yeah, we're going to now

33:02

use this thing that you built that's still

33:05

technically owned by you or your

33:07

publication usually. And we're

33:09

going to use it for this instead. I

33:13

don't think, I think

33:15

your arguments about like, oh, right-wing

33:18

media is going to allow it to

33:20

be trained as opposed to left-wing media.

33:22

I think there's a place for making

33:24

that argument and then having people produce

33:26

content for AI that is quality content

33:28

because of the benefits to society or whatever

33:30

the hell we want to argue. But

33:33

I don't believe that saying

33:35

I should give my content

33:37

to a multi-billion dollar company

33:39

just for better training

33:41

of something they're going to make money

33:43

on is really... What if it weren't

33:45

for them to make money? What if

33:47

it were open source AI? I do

33:49

agree. I don't want to

33:51

see Google or Microsoft dominate or open AI

33:54

for that matter. Dominate AI, I

33:56

think it should be open source. But

33:59

so if you would be okay... if it's somebody who's not

34:01

going to make money on your content? I would want someone

34:03

to ask. It's kind of

34:05

like someone walking into your house and being like,

34:07

man, this place is super nice. Let

34:10

me settle on in. Well, interestingly, the

34:12

reason this is happening is because

34:15

these companies are saying

34:18

you have to ask and open AI and

34:20

Google and others are asking. This

34:22

is from Originality AI, which is

34:24

an Ontario-based AI detection

34:26

startup via Wired magazine. Data

34:28

collected in mid-January on

34:30

about 40 top news sites shows

34:33

that almost all of them use robots.txt

34:35

or something like it to block AI

34:38

web crawlers. The New York Times, The Washington

34:40

Post, The Guardian, The Atlantic, Bleacher

34:42

Report, OpenAI's GPT

34:45

bot is the most widely blocked

34:47

crawler, but none of the top

34:49

right-wing news sites surveyed, including Fox News, The

34:51

Daily Caller, Breitbart, block any of the most

34:53

prominent AI web scrapers. They're

34:55

actually explicitly saying, the sites that are

34:58

blocking it are explicitly saying, no, you

35:00

can't look. And to their

35:02

credit, OpenAI and Google are honoring it, but you

35:04

can see the problem from a societal point of

35:06

view even to that.
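The blocking being described is the robots.txt honor system: a site publishes rules naming crawlers like GPTBot, and compliant crawlers check those rules before fetching. A minimal sketch of that check, using only Python's standard library (the site URL here is a placeholder, not one of the surveyed outlets):

    from urllib import robotparser

    # A site that blocks OpenAI's crawler publishes rules like:
    #   User-agent: GPTBot
    #   Disallow: /
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder site
    rp.read()

    # False means the named agent is disallowed from that path.
    print(rp.can_fetch("GPTBot", "https://www.example.com/some-article"))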

35:09

So yeah, well, then what's to stop, I

35:11

don't know, Microsoft or Google from saying,

35:13

wow, we have a huge bias problem

35:15

and it's partially because of our training data.

35:17

So biased. What's going to happen? What

35:21

I wondered about, we've seen that happen in

35:23

AI already. Face recognition, I mean, face recognition

35:25

is notoriously terrible on people of color because

35:28

it's mostly, and this is just an inadvertently,

35:30

this isn't even intentional, but it was trained

35:32

mostly on white faces. So of course it's

35:34

terrible on people of color. But

35:37

you want to fix that, or do you just

35:39

want to abandon face recognition and say, well, we

35:41

should just never use it. Something

35:44

I was curious about after reading that article was, is

35:48

it specific to AI or is it just that

35:50

you have a set of websites that are

35:52

trying to get as much traffic as they can

35:54

more aggressively and so they're just a lot more

35:57

lenient with the robots text versus

35:59

the other. sites. I don't know. I

36:01

didn't know. See that's and it wasn't and

36:03

it wasn't clarified in the article either. Right.

36:05

So I don't know what the answer is.

36:07

Well you can't determine the intention you can

36:09

just merely look at their robots.txt and see

36:11

what they're doing. This is

36:13

also deeply related to and I know to

36:15

the state of media which we talked about

36:17

before we got in the show which you

36:19

know if media were doing really excellent right

36:21

now there'd be less issue. Media

36:24

is suffering we have seen lots of

36:26

layoffs in the media world among our

36:28

friends and the

36:30

like this might be a

36:33

killer for some who have really cheap slow

36:35

margins or thin margins or it might be

36:37

a savior if they can get a payout

36:39

from someone like OpenAI,

36:41

but, like, long-term, you know, the cat

36:43

is out of the bag. People

36:46

are gonna have like this is why they're

36:48

Mistral and others, like, raised money,

36:50

because they're building the open-source version of this.

36:52

You can run the large language model on your

36:54

phone, or on your own device, and

36:56

eventually people are just gonna have their own stuff

36:58

running and the thousand people a

37:00

billion a hundred thousand people are not

37:02

gonna all ask for permission each and

37:04

every single place. Well but really the

37:07

interesting thing and so you're talking about

37:09

Mistral I use that and I use

37:11

not OpenAI's, rather, Facebook's Llama,

37:13

both of which are open models but

37:15

those models those are the

37:17

ones that are being trained on

37:19

this giant database of material

37:23

when I put this on my local

37:25

machine I'm doing the fine-tuning with with

37:27

various content stuff right so it still

37:29

has to be these are still from

37:31

big companies training right in the very

37:33

expensively by the way I'm

37:36

not gonna be shocked if you just like

37:38

and I'm already seeing it's more more starts

37:40

and others like doing their own training of

37:42

their own models and it'll become cheaper that's

37:44

interesting and it'll be easier and more effective

37:46

to go and do. Who does Mistral? Where

37:48

does Mistral come from? That's a company Mistral

37:50

AI right? Yeah they're they're based

37:52

in France, and Andreessen

37:55

Horowitz backed them. They're

37:57

not that old; they're, like, less than a year old, or

38:00

so, but it's like top researchers got together

38:02

and sometimes, you know, these are a

38:04

lot of money, something like that doesn't work. They do seem

38:06

to be working. It's providing a

38:09

different kind of value for those who want

38:11

to have a lot more control for their

38:13

large language model. There are a couple. Go

38:16

ahead. So on the subject of

38:18

that training at the edge and

38:20

training with your own datasets, and this was the last

38:22

thing I was expecting when I was in talks to

38:24

move over to Phison, they're working on an AI

38:26

thing. Interesting. And it was

38:29

shockingly not gimmicky. Like my immediate assumption initially,

38:31

wait, an SSD company's working on AI, what

38:33

are you guys doing? And then no, actually

38:35

it was a completely legitimate use

38:37

case and there's like, okay, well, you have a GPU, it

38:39

only has so much memory. How do you train a model?

38:41

It's too big to fit in the memory. Oh,

38:43

you can add some SSDs. Ah, a

38:45

swap file! SSDs that are meant for that.

38:47

In essence, right? But you

38:50

can't just plug that in and hit go, it doesn't

38:52

work that way. You have to develop the actual technology

38:54

and the software and everything to work together.
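Phison hasn't said publicly how its product works, but the general technique Alan describes exists in open source: DeepSpeed's ZeRO-Infinity can spill parameters and optimizer state that don't fit in GPU memory out to NVMe SSDs. A rough sketch of such a config (the path and batch size are placeholders, not Phison's or DeepSpeed's defaults):

    # Sketch of a DeepSpeed ZeRO-Infinity config that offloads to NVMe.
    # Passed as the `config` argument to deepspeed.initialize(...).
    ds_config = {
        "train_batch_size": 8,  # placeholder value
        "zero_optimization": {
            "stage": 3,
            # Spill parameters and optimizer state to SSDs when GPU
            # (and then host) memory run out.
            "offload_param": {"device": "nvme", "nvme_path": "/local_nvme"},
            "offload_optimizer": {"device": "nvme", "nvme_path": "/local_nvme"},
        },
    }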

38:56

It really is a gold rush, isn't it? Yeah,

38:59

we tease it at CES. We're working on a thing. It's

39:01

a gold rush. Like for a legit purpose. Yeah. So

39:03

I've used two different programs and I

39:06

would like to tell people about them. One

39:08

is called Ollama, at ollama.ai, and

39:10

it's based on Facebook's, Meta's,

39:13

open model,

39:15

Llama, Llama 2. But

39:18

you can download other models, including the Mistral

39:20

models. I've set that up and I've also

39:22

played with something called GPT4All from Nomic.
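For anyone who wants to try this, Ollama serves a small local HTTP API once it's running; a minimal sketch (it assumes you've already pulled a model, e.g. with `ollama pull mistral`):

    import requests  # Ollama listens on localhost:11434 by default

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "mistral",  # any model you've pulled locally
            "prompt": "In one sentence, what is an SSD?",
            "stream": False,     # return a single JSON object
        },
    )
    print(resp.json()["response"])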

39:25

Again, the idea is you download

39:27

these models that are trained by

39:29

sometimes by big companies, sometimes by Mistral, sometimes by

39:31

a variety of people. These are the various

39:34

models you can download. You can

39:36

show the screen. And

39:39

then you would then either fine tune it

39:41

or create your own AI. One of the

39:43

most popular things to do is

39:45

to create a PDF reader that could summarize content

39:47

for you. And I've been doing that locally and

39:49

it's great. It's really, it's kind of amazing what

39:51

you can do, but you do still need these

39:54

big models. I think that'll be very interesting.
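A minimal sketch of the local PDF-summarizer idea Leo describes, pairing the pypdf library for text extraction with the same local Ollama endpoint as above (the file name, model, and 8,000-character cap are placeholders):

    import requests
    from pypdf import PdfReader  # pip install pypdf

    # Pull the text out of every page of a local PDF.
    pages = PdfReader("paper.pdf").pages
    text = "\n".join(page.extract_text() or "" for page in pages)

    # Summarize it with a locally running model; nothing leaves the machine.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama2",
            "prompt": "Summarize this document:\n" + text[:8000],
            "stream": False,
        },
    )
    print(resp.json()["response"])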

39:56

Do you think, Ben, actually, there's going to be

39:58

a lot of

40:00

people who will be training their own models

40:02

eventually? Oh, yeah. I've

40:04

already seen more and more startups, you know,

40:07

with a lot of resources training their own

40:09

models for very specific things. Already

40:11

seeing people who train models, like they might use

40:13

the basis of an open source model and they

40:15

train it a lot more. Right. But

40:18

they're training it on things like, you know,

40:20

the medical industry, for example, you're not going

40:22

to go and use out of the box,

40:24

ChatGPT, OpenAI. So

40:26

you might use the large language model

40:28

like Llama to do the basic

40:30

language stuff, so somebody's generated that and then you're

40:32

going to fine tune it for a medical application.

40:35

Well, and you have to compartmentalize that for HIPAA. You

40:37

can't, the data can't go the other way. Can't go

40:39

out of your... Yep. Right. That's

40:42

really interesting. Boy, I feel like AI is

40:45

the thing that people should be focusing on

40:47

right now, not AR. But as

40:49

you say, that's a dumb distinction, Stacy.

40:51

I'm making it silly, foolish... Well,

40:55

just saying AR versus AI is

40:57

not really... It's not really funny. They go together.

40:59

You're right. That's an excellent

41:01

point. The next thing I want to see, which

41:03

I don't know if it's going to happen. I hope somebody

41:05

does it. If anybody's listening and just working on it, then

41:08

more props to you. But like, we're getting

41:10

pretty close to, if you've ever seen the movie Her,

41:12

where they had the AI, you download the

41:14

software on the phone. And like, I

41:16

want to see the thing where you train your own model,

41:19

you plug in some pieces of software on your desktop,

41:21

you train your own model on all of your own stuff. Right?

41:25

And you go, hey, what was that email from... Yeah,

41:27

that's the final thing. Yeah, I want to just do

41:29

a, hey, what's that email from five years ago about

41:31

this thing, you know, and just find it. I think

41:33

that's going to happen in the next few months. I

41:35

think we're there already, Brett. You can already use Rewind

41:37

to do some of that stuff. Yeah. Are

41:40

you going to get a Rewind? Are you going to do the Rewind? I'm

41:42

buying every single hardware. Did you buy Humane?

41:46

I have not bought Humane. I will

41:49

do that. That

41:51

one is a hard value prop. The Rabbit

41:53

R1 is a much... Did you get a Rabbit? I got

41:55

a Rabbit. I bought every one pending. I really

41:58

wanted the Rabbit. Yeah. We'll see. I'll get

42:00

a humane pin and go and try the whole

42:02

thing out. I'll try every single one. So all

42:04

of these devices are designed to have, they

42:07

still, I think all phone home though, right? Nobody's

42:09

running locally, especially the R1, which

42:11

really doesn't even have any apps. It just

42:13

phones. You can't run any of those locally.

42:15

Yeah. But it's adding the fine tuning from

42:17

your life. Not so much

42:19

Rabbit, but definitely rewind.ai and

42:22

the humane AI pin. It's recording

42:24

your experience, your exchanges, your experiences

42:26

and stuff. And then adding

42:28

that as a fine tuning to an existing

42:30

LLM. I, to me, you're right. That's her.

42:33

In fact, it was

42:35

no accident, by the way, that chat

42:37

GPT, when they added an iOS app

42:39

and an Android app had one of

42:41

the voices sounded surprisingly like Scarlett Johansson.

42:45

They didn't admit it. It could just be, she has

42:47

an amazing voice. Yeah. Yeah.

42:50

Yeah. We got to take a break. This is a

42:52

great conversation. Hold that thought, Alan. I want to

42:54

keep going, but I do have to take a break

42:56

or we, or we will be going way too long.

42:58

Our show today, we got a great, great panel by

43:00

the way, Stacy, it's great to have you back. I

43:02

missed you. Welcome back. You and I are

43:04

going to do the book, the book group on

43:06

February 8th, right? We

43:09

are. That's the most depressing book I've

43:11

ever read. I know. I

43:13

was like, y'all, did

43:16

we really need to do this? So depressing.

43:18

I love Paulo Guy, Bacheco Lupe. I love the

43:21

windup girl. His new one is the water knife.

43:23

I haven't got to the

43:25

end. I'm hoping like that suddenly the sun will come

43:27

out and it'll start raining. Everything will be beautiful, but

43:29

maybe not. I don't know. It's a little grim. No,

43:31

that's not going to happen, is it? Well,

43:34

we'll talk about it February 8th in the club. That's

43:36

going to be Stacy's book club. Maybe

43:38

next time, can we choose a happy book? Next

43:41

time, please. I put happy books on

43:43

the things. It's the people. Who picks

43:45

them? It's the people. It's the voice of

43:47

the people. The voice of

43:49

the people. Ben Parr is the voice of AI. He is

43:51

the author of a great book. You should read the AI

43:53

Analyst. Co-founder; eight years he's been

43:55

doing AI at Octane AI.

43:57

Great to have you. Alan

44:00

Malventano now at Faison, which

44:03

is just as easy to

44:05

pronounce as Solidigm, but

44:07

only a little bit harder than Intel. He's

44:10

been an all-time... I am dragging

44:12

the SSD industry into the future. His

44:16

next company will be Cyberdyne. Cyberdyne!

44:19

We thank you all. And then you might as well, you know,

44:21

I'll be working with Ben and we'll just take over the world.

44:23

I thank you all three for being here.

44:25

Our show today brought to you by NetSuite.

44:28

Once your business gets to a certain size, the

44:31

cracks start to emerge. You see this happen

44:34

every time. Small business, you know everybody's

44:36

name, no big deal. But as soon as, you

44:38

know, you get to the point where who... do we

44:40

hire you or you know... And then there's just

44:43

too many people, too many manual processes to

44:46

keep track of. This is... If this

44:48

is where you are or about to be, you should

44:50

know three numbers. 37,000, 25 and 1. Okay?

44:57

I'll explain. 37,000, that's the

44:59

number of businesses that have upgraded

45:01

to NetSuite by Oracle. NetSuite is

45:03

the number one cloud financial system. Streamlining,

45:06

accounting, HR and more.

45:10

25, that's how old NetSuite is. It turns 25 this

45:12

year. That's pretty impressive.

45:16

25 years of helping businesses do more

45:18

with less, close their books in days,

45:20

not weeks, and drive down costs. I

45:23

bet you can guess the one. You're

45:25

the one. Your business is one of a kind. So

45:28

you get a customized solution for all

45:30

your KPIs in one efficient system, a

45:32

single source of truth. That's

45:34

another one right there. Manage risk, get

45:36

reliable forecasts, improve margins, everything you

45:39

need to grow all in one place. Right

45:42

now, download NetSuite's popular KPI

45:44

checklist designed to give

45:46

you consistently excellent performance and it's

45:48

absolutely free. netsuite.com/twit.

45:53

That's netsuite.com/twit. It's your

45:55

own KPI checklist, netsuite.com.

46:00

We thank them so much for supporting

46:02

this week in tech. Great

46:06

conversation about AI. I'm

46:08

like you kind of Ben. I want to try all the AI

46:11

things. I don't want to try unlike you.

46:13

I don't want to try any of the VR things. I

46:18

got this close to buying the Vision Pro.

46:20

I mean, literally I did the face scan,

46:22

uploaded my prescription, and I got

46:24

my finger hovering over the $3,500 button. Actually,

46:27

it was more because I had to buy

46:29

the $200 travel case. I

46:31

had to buy the extra sweatband,

46:33

the $99 one. I

46:36

had to buy the glasses because I do

46:38

need corrective lenses, $149. So

46:41

I got a well over $4,000 before I was done. And

46:45

then I said, no way. I

46:47

said, I have an Oculus. I

46:50

have a MetaQuest Pro that

46:52

I spent $1,400 for. It's

46:55

got a nice thin layer of dust on

46:57

it in my office collecting dust. I

46:59

knew I would buy this. I would use it for

47:02

a week, get all excited about it, and then it

47:04

would just collect dust. I couldn't justify that. So

47:06

unlike you, I'm not buying one of those. So

47:10

my point before the break, which will

47:12

overlay us into the rest of the

47:14

topics, I'm sure, which applies to both

47:16

the vision-related things, the VR things, and

47:18

the AI discussion we just had. As

47:21

far as having a thing that's very powerful in

47:25

your house, in your position, right? And

47:28

the reason I'm bringing this up is all these technologies

47:30

that were like, oh, well, this might never get here

47:32

to this state where you can just have the thing

47:34

in your house doing all of it. It's going to

47:36

have to go to the cloud. Not necessarily, right? About

47:40

this 25 years ago, this is a smart media

47:42

card for those who remember what they are. This

47:44

is 2 meg. Wait a minute. Wait a minute.

47:46

That was like a memory card for what? Smart

47:48

media for cameras. Oh, yeah. For giants. Yeah, for

47:50

giants. Look how big that is. Look at how

47:52

big that is. Well, it was really thin, but

47:54

okay. I had one of those.

47:57

That's 2 meg. A few years later, Sony made,

47:59

ah, this little Micro Vault USB thing that

48:01

was really tiny that's 2 gig. Wow. Right?

48:05

Yeah. And then here's an M.2 2242 2 terabytes.

48:09

Wow. Right? Yep.

48:12

And that's across the span of 25 years. Yeah. That's

48:15

a million X. It's kind of remarkable.
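(The arithmetic holds up — a quick sketch using the 2 MB and 2 TB figures held up on camera:)

```python
old_mb = 2                        # SmartMedia card, late 1990s (2 MB)
new_mb = 2 * 1024 * 1024          # 2 TB M.2 2242 SSD, expressed in MB

print(new_mb // old_mb)           # 1048576 -- call it "a million X"
growth = (new_mb / old_mb) ** (1 / 25) - 1
print(f"{growth:.0%} per year, compounded over 25 years")  # roughly 74%
```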

48:17

I mean, we saw this with hard drives too,

48:19

but it's really remarkable and what I

48:22

find fascinating is that there was a lot of

48:24

concern early on that these would not be as

48:26

reliable as spinning media. What's your

48:28

experience? They seem like they're

48:30

actually as reliable if not more reliable. As long

48:32

as you understand what the caveat is which is

48:35

just for flash memory there's a finite amount of

48:37

times you can rewrite a spot on it. But

48:39

you've got wear leveling. Surprisingly, you've got technology.

48:41

Yeah.

48:46

And there are technologies to, you know, mitigate that.
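(A toy sketch of the wear-leveling idea — not any vendor's actual flash translation layer — spreading rewrites of a hot logical block across physical blocks so no one spot wears out first:)

```python
NUM_BLOCKS = 8
erase_counts = [0] * NUM_BLOCKS   # wear per physical block
mapping = {}                      # logical block -> physical block

def write(logical: int) -> None:
    # Always land the rewrite on the least-worn physical block
    target = min(range(NUM_BLOCKS), key=erase_counts.__getitem__)
    erase_counts[target] += 1
    mapping[logical] = target

for _ in range(1000):             # hammer a single "hot" logical block
    write(0)
print(erase_counts)               # wear spreads evenly: [125, 125, ...]
```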

48:48

I remember Woz coming onto the screen savers

48:50

in about 2002 and he had a USB

48:52

key around

48:56

his neck. And he said

48:59

this is 2 gigabytes. Everything

49:02

I could ever want is on this.

49:04

Actually it might have been this because

49:06

it came with a little silicone pouch

49:09

and a little lanyard thing. It probably

49:11

was that. It could have been. Ooh

49:15

2 gigabytes. And his whole

49:17

life is around his neck. Now

49:19

my iPhone this has 512 gigabytes. Some

49:23

people even are getting the Vision Pro with a terabyte.

49:25

Did you get a terabyte Ben? I

49:28

did get a terabyte. I'm going to get the most

49:30

amount of things. Okay well that's

49:32

good. Now you're future proofing it. Yes.

49:35

Yeah. But all these technologies that

49:37

you would think have to use

49:39

the broader internet and the cloud and has

49:42

to be in a big server somewhere in

49:44

another state. Not necessarily,

49:46

right? Just give it a few years. The same

49:48

compute will be in your house. I

49:52

don't think the business models. Like

49:55

I would love that and you're right.

49:57

We totally could have computing do

49:59

more locally. But the business models

50:01

don't make sense because then companies can't control

50:03

it to the extent that they want to

50:05

and monetize it to the extent that

50:08

they want to also Yeah,

50:10

it's mostly business models, but also people have become

50:12

less trained for it. Yeah,

50:15

Leo, I don't know if you have an article related to

50:17

this for this week, but they're covered in recent

50:20

weeks. But there's a big battle between the home

50:23

cloud where you take all your devices

50:25

and have them, you know, on your

50:27

own server versus using whatever the company's

50:31

you know, cloud-based server is. For example, there's

50:33

been a thermostat company in the news

50:36

recently that was going after the

50:38

your home lab types. You know,

50:40

someone made a plug-in for I

50:42

forget what — Home Assistant. Yeah,

50:45

there's people going after Home Assistant developers that are just

50:47

for free reverse engineering the protocol and just making hey

50:50

I just want to control my thermostat from my house

50:52

if I have no internet and Apparently

50:54

that's a thing that you know these companies are taking

50:58

issue with. The public is a little schizophrenic on

51:00

this because on the one hand we we talk

51:02

a lot about Privacy and we don't want

51:04

these companies scraping our data And we know

51:06

that you know a internet connected thermostat is

51:08

of course sending that information to the company

51:11

Which is then selling it off to

51:13

somebody who for some reason wants to know how hot my

51:15

house is. So on the

51:17

one hand we you know we we talk

51:19

about a lot we talk about oh, there's a

51:21

bad thing. But on the other hand, we really

51:23

don't want to run these things at home. And

51:27

I think oh maybe these markets are

51:29

pretty brisk for things like iCloud and

51:31

OneDrive and Dropbox and

51:34

I mean well I mean I have some

51:36

home lab things right I don't do

51:38

Dropbox anymore so much I have my

51:41

own, you know.

51:43

You have more storage than God. You

51:45

know... Well, you're also technically

51:48

adept. Well, here's a better example — here's a better

51:50

example that someone can even run on a Raspberry Pi

51:52

in their own home with a Docker container, and that

51:54

you've talked with Steve Gibson on Security Now a

51:56

bunch about this, where you can have a Docker container

51:58

that holds all your passwords. Right? On a

52:01

very small footprint — it doesn't need a lot of storage. Is that

52:03

a completely reasonable thing especially given all the data breaches

52:05

you hear about? You

52:07

know, right? That's, you know, very private information. You

52:09

would think people would be interested in doing that and

52:12

our sponsor bitwarden among others allows you

52:14

to do that.
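(A sketch of the kind of setup being described, using the Docker SDK for Python and the community-maintained Vaultwarden image, a Bitwarden-compatible server. The port and data path here are arbitrary example choices, and in practice you'd put HTTPS in front of it:)

```python
import docker  # pip install docker

client = docker.from_env()

# Self-hosted, Bitwarden-compatible password vault; small enough for a Pi.
vault = client.containers.run(
    "vaultwarden/server:latest",
    name="vaultwarden",
    detach=True,
    ports={"80/tcp": 8080},   # web vault at http://localhost:8080
    volumes={"/home/pi/vw-data": {"bind": "/data", "mode": "rw"}},
    restart_policy={"Name": "unless-stopped"},
)
print(vault.short_id)
```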

52:16

I don't think a lot of people will, though. I think

52:18

they're people like you know, yeah, they don't want

52:21

a Pi for everything. They... I don't even do

52:23

it and I'll tell you why I don't do

52:25

it: because I think Bitwarden probably is better at

52:27

securing that vault than I am. I'm

52:30

much more likely to do some dumb thing with

52:32

that vault. All right. Seriously,

52:35

well so the part of it where I

52:39

have — I don't want to call it a disagreement, but —

52:41

I mean I have a nest thermostat I'm not

52:43

you know I'm not relying on that thing But

52:45

if I did want to use home assistant and

52:48

switch stuff over to my own for

52:50

some functions I should be able to

52:52

do it right I shouldn't be like I

52:54

shouldn't be — Of course you should be able to

52:56

do it. And I think — my opinion — it's

52:59

just like right to repair. It's that

53:01

companies should just go ahead and do it because 99% of

53:04

their customers are not gonna do it

53:07

So just let them do it. It looks good for you and

53:10

And people are still gonna do the convenient and

53:12

easy thing. It's

53:14

why it's so bizarre that Apple is

53:17

being so malicious with

53:19

the EU over its app

53:21

store When it knows perfectly

53:23

well Even if there is a third-party app

53:25

store available on the iPhone that most iPhone

53:28

users — like 99.9% — will

53:31

never even know it exists. Let

53:33

alone, you know... I have a theory about

53:35

this. Yes. I think what

53:37

is what's happening is they're They're

53:40

not thinking today about the issue.

53:42

They're thinking about the long-term implications.

53:44

So think about like basically

53:49

My child doesn't pirate music because they

53:51

never have had to, right? But I

53:53

can pirate music because when I was

53:55

you know how —

53:57

really great, right? And you had to, right?

53:59

That was like how you got music. The

54:05

kids today, when

54:07

you're looking for cheap solutions or reliable

54:10

solutions and that's usually people

54:12

who have less income and more

54:14

time. So generally younger people they'll

54:17

train themselves to pull

54:19

away from the traditional business models and use

54:21

this stuff and companies don't want to open

54:23

that door. I don't think that's stupid of

54:26

them. Interesting.

54:28

So Apple's concerned about, see

54:30

I always thought when

54:32

you put a lot of anti-piracy protection

54:34

on stuff it

54:36

teaches, this

54:39

is my example, I know this isn't exactly applicable,

54:41

but copy protection

54:43

teaches people to be pirates because

54:46

it's only normal end users that

54:48

are baffled by it. Pirates know perfectly well how

54:51

to get around all this stuff. They're like Alan,

54:53

they know how to have all the storage and

54:55

do all the things and stuff. But

54:57

normal people who

54:59

are thwarted by copy protection

55:02

learn how to get around it. So the best

55:04

answer, in fact this is what the music industry

55:07

finally came to with Apple, is not to have

55:09

copy protection and just

55:11

make it easy to buy music. And that's

55:14

why your kid just buys music because

55:17

they don't know that there's other ways. They

55:19

don't buy music. They don't even buy music. They rent

55:21

it. They rent it. They don't even buy it. They

55:23

rent it. Because they don't know of any other way.

55:26

Right? But you don't want to train people on how

55:28

to get around stuff. So you're saying Stacey that by

55:31

providing a third party app store

55:33

Apple is teaching people that they

55:35

can't get around it? Well what

55:37

I'm saying is by

55:41

leaving those avenues open more and more

55:43

people will see value in going

55:46

around it. And then over time

55:48

the number of users

55:50

of the legit app

55:53

store reduces. And same

55:55

thing with like Home Assistant or people. It's not

55:57

so much the issue today. It's the people

56:00

who are like, oh, I'm going to use Home Assistant,

56:02

and then more and more people do it, and they make

56:04

it easier. Home Assistant is getting better and better over time.

56:07

Just for people who don't know, including yours

56:09

truly, Home Assistant is an open source

56:13

home automation tool? It

56:15

is an open source home automation tool. So

56:17

this is like OpenHAB or one

56:19

of those, nobody use, you told me, you

56:22

told me don't use OpenHAB. Nobody uses OpenHAB.

56:24

Home Assistant is the way to go if

56:26

you're going to do it. I

56:28

think though, by the way, anybody automating their home

56:30

is already in the category of super geek, but

56:32

that's just me.
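(For reference, Home Assistant exposes a local REST API, so "control my thermostat from my house if I have no internet" really is a few lines. The host, token, and entity ID below are placeholders for whatever a given setup uses:)

```python
import requests  # pip install requests

BASE = "http://homeassistant.local:8123"  # typical local address (assumption)
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"    # created on your HA profile page

# Call the climate.set_temperature service on a hypothetical thermostat
# entity; nothing here leaves the local network.
resp = requests.post(
    f"{BASE}/api/services/climate/set_temperature",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"entity_id": "climate.living_room", "temperature": 20},
    timeout=5,
)
resp.raise_for_status()
print(resp.json())
```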

56:36

Did you take a lot of home automation out of your house,

56:38

Stacey, when you didn't have to do it anymore?

56:43

So I took the Madame A devices, so

56:45

the Amazon Echo devices all left because they

56:47

were the most annoying devices. I

56:49

kept the Google ones, and I'm

56:53

still hoping they get their act together

56:55

on their stuff. And

56:57

I did eliminate a lot of the stupid

56:59

gadgets that you could talk to. Do

57:03

you still have a faucet you can talk to? So

57:06

I still have a faucet that I could talk to, but

57:08

that faucet only works through Madame A, so

57:10

now I can't talk to it. Perfect.

57:14

That's a harder uninstallation process, Leo.

57:16

It requires plumbing. Right. So

57:19

I installed it myself. It wasn't so bad.

57:24

I like to do things. Do you run Home

57:26

Assistant now, Stacey? No. No,

57:29

I don't. The only thing I do

57:31

is, I mean, I

57:33

have more home automation than the average bear, but

57:36

it's mostly lights. Stacey — you might remember, if

57:38

you watched This Week in Google — her husband used

57:40

to complain about how hard it was

57:43

to open up blinds. Do

57:45

you still have blinds you can talk

57:47

to? No

57:51

one talks to the blinds. Theoretically, you could talk

57:53

to the blinds, but everyone has forgotten the fact

57:55

that they pull the string

57:58

attached to the blinds that opens them. We

58:01

still have remote control. This

58:04

is reminding me my fiance. This is when I

58:07

got to, I saw you like a year and

58:09

a half back when my fiancée

58:11

had a play premiere in Sonoma. Yes. Called

58:14

Atlas Lomani Gibbon. It's about a

58:16

smart home going awry, and like the insides

58:18

of the fridge exploding, and it goes through all

58:20

sorts of stuff. And like the impact of

58:22

like what if you have AI running things,

58:25

but also the impact of like how that

58:27

affects things like relationships. I just

58:29

had to go and plug it because I love my fiancée, but also

58:31

we are entering a world where these

58:33

are going to be the conversations and I think going

58:36

back to Stacy's original point before, you know, you

58:38

have free time as a kid, you're going

58:40

to hack whatever you have the ability to hack.

58:42

Right now, you don't have the ability to hack

58:44

on the iPhone. Not anymore. Kids

58:46

don't even know how to type. Kids

58:49

don't know how to hack. Kids

58:51

are too busy playing the TikTok

58:53

on their phone. They're not.

58:56

You want to see a kid hack something? Take TikTok

58:58

off their phone. That's how they get it back

59:01

on. I mean, seriously, you

59:04

just have to know what they want and make

59:06

it hard for them again. Yeah, they hacked the

59:08

parents. They'll learn how to pirate. By

59:10

the way, complete aside, Leo, while we were having

59:12

the conversation, I did make — I

59:15

did make a Midjourney of you as a pirate. I

59:17

put it in the show notes. Oh, I do. If

59:19

you want to take a look. Pirate Leo. No,

59:22

and I'll confess to

59:24

having downloaded, I think,

59:27

90,000. 90,000 songs on

59:29

Napster back in

59:31

the day. Who didn't? That

59:34

was around. Yeah, who didn't? And that's

59:36

why. Yeah, by the way, Midjourney does

59:38

not know what I look like, I just want

59:40

to say. I've

59:43

had that problem. It looks pretty close. If you think that

59:45

looks like me. You look as old-ish. Well,

59:47

in the upper right is John

59:49

Cleese. Yeah. It's

59:52

just an old fat guy, basically, with

59:54

gray hair. I'm the generic

59:56

old fat guy with big nose and gray

59:58

hair. That's it, right there. You

1:00:00

might as well have just used that as the prompt.

1:00:04

At least it got the hat right. Actually,

1:00:07

somebody was asking me about

1:00:09

my chess.com login because I

1:00:12

used an AI

1:00:14

image of me for it,

1:00:17

and from a distance it kind of

1:00:19

looks like me. I can't show it because I'm not

1:00:21

logged in. Alright, I'll use this

1:00:24

pirate somewhere. Thank you, Ben. I

1:00:28

love Midjourney. I think playing with Midjourney is

1:00:30

great. It's one of the pieces that made

1:00:32

me kind of start to think, you know,

1:00:34

I want a smart AI. I want a

1:00:37

good AI and I don't really care if

1:00:39

Thomas Kinkade, the Painter of Light, is pissed

1:00:41

off that Midjourney can do a crappy painting

1:00:43

like him. That just doesn't

1:00:46

seem to me to be

1:00:48

a societal problem. The

1:00:50

heirs of George Carlin.

1:00:53

Look, I understand. I support the heirs

1:00:55

of George Carlin who are

1:00:57

very upset about this fake

1:01:00

George Carlin comedy

1:01:02

routine titled, I'm

1:01:05

Glad I'm Dead. Okay, maybe

1:01:07

that was a little offensive. It's

1:01:10

not very funny, but for

1:01:12

people who are George Carlin fans, maybe

1:01:15

they're kind of happy. It came out a

1:01:17

couple of weeks ago on the YouTube channel.

1:01:19

It's from a podcast called Dudesy, with Will Sasso

1:01:21

and Chad Kultgen. They're being sued

1:01:23

now by the estate of George

1:01:26

Carlin. We

1:01:28

have to draw a line in the sand,

1:01:30

says daughter Kelly Carlin. Do you

1:01:32

think they'll win this suit? I feel

1:01:36

like they have a fair argument. Yeah,

1:01:38

I really agree with them. I would sue in that

1:01:41

case too. I think they're going to lose. I

1:01:45

am not a legal expert at all. Can you

1:01:47

do Rich Little for doing an impression of George

1:01:49

Carlin? That's

1:01:53

different. It is.

1:01:56

Legitimately, over the course of

1:01:58

the next two years, we're going to start to get some answers

1:02:00

to a lot of questions that are surrounding

1:02:03

the legality of certain things with it. From courts.

1:02:06

From the courts. From courts. It's

1:02:08

going to take a while. I don't expect major

1:02:10

rulings to be in effect this year. It'll take

1:02:12

several years probably. You know what I fear though,

1:02:14

Ben, is that we're going to get conflicting

1:02:17

rulings because there's so many courts

1:02:19

involved. Oh, for sure. Supreme Court. And

1:02:21

so you're going to get some, it's going to end up

1:02:23

having to be, but then the Supreme Court's not going to

1:02:25

really resolve this, I don't think. I

1:02:28

think they're going to do a very

1:02:30

constrained, well in this one case, I

1:02:32

don't know. Supreme, if there's lots of conflicting

1:02:34

cases, the Supreme Court will. They've got to resolve

1:02:36

it. They've got to resolve it. They've got to

1:02:38

resolve it. It's going to be a thing that

1:02:41

we really will have to figure out what is

1:02:43

the right line and

1:02:45

what is the right legal line. This

1:02:48

is complex stuff that we'll have to just get

1:02:51

four lawyers to go and debate for a couple

1:02:53

hours. I will watch that. Because

1:02:55

I don't think there's a, even

1:02:57

among our panelists, there's no consensus at all.

1:03:01

The other thing is like, remember too,

1:03:03

we don't fully understand every aspect

1:03:05

of the technology. Like, AI, there

1:03:08

was a story earlier this

1:03:10

week, an AI that

1:03:13

started, like, to

1:03:15

train itself — it was essentially an agent — and

1:03:18

it went rogue and it was actually deceiving the

1:03:20

people who had trained it and

1:03:22

lying to them about its training. They

1:03:25

could not get it back on track for training at all,

1:03:27

so they had to abandon it. That

1:03:30

kind of thing is happening and

1:03:32

there was no real explanation

1:03:35

yet from the researchers. I

1:03:38

That's my whole point. It's like, this is

1:03:40

all cutting edge stuff. There's amazing things you

1:03:42

can do with AI. We do not fully

1:03:44

understand every aspect of it. We

1:03:46

do not fully understand where the legality of the

1:03:48

things are going to go. At a certain point,

1:03:51

the AI is just going to decide to make

1:03:53

things on its own without you saying anything and

1:03:55

then who is responsible then? There's

1:04:00

an interesting idea about like from

1:04:03

a legal perspective making AI a

1:04:05

separate like legal class

1:04:07

like. Yeah, I think that's what you'll have

1:04:09

to do, because humans can do and interpret —

1:04:11

you know, I can

1:04:13

do a George Carlin impression. Humans

1:04:16

can read your articles Stacy and then summarize

1:04:18

them to somebody else and

1:04:20

those are all, I think, things we

1:04:22

all accept as normal uses — transformative uses

1:04:24

of content. You

1:04:27

feel like it shouldn't be a machine. In fact,

1:04:29

you're very specific: it shouldn't be a profit-making

1:04:31

machine. You would say a

1:04:33

nonprofit, that's okay. It's gonna

1:04:35

be very hard. You're gonna have to create new classes. I think you're

1:04:37

right. Well, and I

1:04:39

mean, there's gonna be — I mean, just

1:04:41

on the create like content creation

1:04:43

side. Over

1:04:45

time, if you have AI generating everything,

1:04:48

it's going to devolve into crap,

1:04:51

right? So you're gonna

1:04:53

have to have people in the mix somewhere

1:04:55

and you're gonna have to that's right pay

1:04:57

them accordingly. So we just we're in

1:04:59

this like weird area where we're just

1:05:01

like, ooh, let's arbitrage this and make

1:05:04

all the money and gather all the

1:05:07

gather everything we can. Let me ask Ben because

1:05:09

you're that you're trying to make money on AI.

1:05:14

I think there's such potential for AI. I

1:05:16

mean, if we really want Her, or, you

1:05:19

know, if we want AI to design chips, AI

1:05:22

has already come up with interesting

1:05:25

new medicines. It's

1:05:27

very good at protein folding. But

1:05:30

those aren't LLMs that do that. Well, whatever just

1:05:32

generative AI of some kind. In fact, I think

1:05:34

it's a mistake to say LLMs are different than

1:05:37

GANs are different than that. It's

1:05:39

all generative AI. And it all has

1:05:42

and all of them have to ingest

1:05:44

human created content. I agree with

1:05:46

you, Stacey, that if we want an AI

1:05:48

to be good, a human has to have

1:05:51

input. But we get such

1:05:53

societal benefit out of it. I mean, really,

1:05:55

this is going to be a very bitter

1:05:57

argument. I can tell already. But

1:06:00

Ben, don't you think that that's, I

1:06:04

think you might hobble AI. I think you

1:06:06

might actually kill AI in the cradle by

1:06:09

being too restrictive on what it can read. This

1:06:13

is the ultimate debate between, for

1:06:16

those who know, E slash ACC versus

1:06:18

EA — effective accelerationism versus effective altruism — which

1:06:21

for those who don't know is two

1:06:23

camps, one being like we should move

1:06:25

as quickly as possible because AI could

1:06:28

save lives and every day you wait

1:06:30

to train the AI, the life that

1:06:32

didn't have to be lost versus EA's

1:06:35

which is like AI could destroy the

1:06:37

entire planet. So we should be very

1:06:39

cautious because have you seen Terminator before?

1:06:42

Yeah, I'm not a doomer. And

1:06:45

I think that's a, I think maybe that's gone away. I

1:06:47

don't know. Elon Musk was one of the doomers.

1:06:49

Oh, no, no, no, it hasn't gone away. Oh, no, no, no, no. It's

1:06:53

less prevalent than before, but

1:06:56

there's still like a lot absolutely

1:06:58

there. It's just more underground.

1:07:00

And I have those conversations with- I'm an

1:07:02

accelerationist. You might be surprised to hear. I

1:07:05

was never a doomer because I thought that's just sci-fi. But

1:07:09

I am now accelerationist. I believe, in

1:07:11

fact, I might not even disagree with

1:07:13

Larry Page who says it's

1:07:15

time for humans as a species to get out of

1:07:17

the way of the next big thing. We

1:07:20

even have merch for that. There's

1:07:22

effective accelerationist merch. I know. I'm

1:07:25

in that camp. I'm

1:07:28

in that camp. I've been, the machines have

1:07:30

won me over. I

1:07:33

think Leo might, may or may not be an AI.

1:07:36

I always find the answer to that. Well, this is a good example.

1:07:38

I'm a creator. Now, admittedly, I'm at the end of my career. But

1:07:41

I have hundreds of thousands of videos online that

1:07:43

could be used to train an AI. And

1:07:46

in a year or two, we're not far off

1:07:48

from a time when you could, I mean, the

1:07:50

George Carlin is surprisingly good — you would be able

1:07:52

to create an AI Leo that looks just like

1:07:54

this. Maybe have a human

1:07:56

on the other end typing in some

1:07:59

content. But

1:08:01

I could live forever as

1:08:04

this created entity. And it doesn't

1:08:06

bother me at all. Leo

1:08:09

even already has a premade avatar in the

1:08:11

form of his digital self from the screensavers.

1:08:13

I do. I was a

1:08:15

digital character actually on MSNBC's site. But

1:08:19

even more we have in our Discord, if you're a Club

1:08:21

Twit member, there's an A.I. Leo who's been

1:08:23

getting better and better, by the way. I don't know

1:08:25

how he's getting better. I'm a

1:08:27

little nervous. So when are you going to let

1:08:29

A.I. Leo host the show? And then 20 years

1:08:31

from now, A.I. Leo

1:08:35

has hosted the show. Yeah, it's just a matter of

1:08:37

time. That's fine with me.

1:08:39

Leo's in his comfy chair sipping his cognac.

1:08:42

It's not even about me. It's

1:08:44

about if the content, if an A.I.

1:08:46

can create good content or solve or

1:08:49

cure cancer, why shouldn't we want

1:08:51

that as opposed to saying, well,

1:08:53

no, I made this and you know, I

1:08:56

am the unique one and only and you can't have

1:08:58

me. That seems very selfish.

1:09:01

Well, no, no, for

1:09:04

your case, Leo, well, obviously you should benefit

1:09:06

from that if you like. No, I don't

1:09:08

care. I don't want money out of it.

1:09:10

No. If it helps

1:09:12

people and if it fulfills the mission that

1:09:14

we created Twit to inform people about technology

1:09:16

so they can use it, more

1:09:20

power to it. Now I agree with you,

1:09:22

Stacey. I don't want Google to be making money off of it.

1:09:25

So maybe that's the difference. Well,

1:09:28

and what are the assurances you have? I mean, you're

1:09:30

going to have to make some provisions for if you

1:09:32

have an A.I. Leo for making

1:09:35

sure it's still accurate, which means you need someone because

1:09:37

of the type of job that you have. You still

1:09:39

need humans. So you're going to have to have funding

1:09:41

for that. So then the A.I. you

1:09:44

is still going to need to make money somehow

1:09:46

to pay for the humans on the back end

1:09:48

to check like QA humans at a

1:09:50

minimum. Well, and that's of course OpenAI's argument:

1:09:53

it got so expensive we couldn't be a

1:09:55

non profit. We had to have a

1:09:57

for profit arm to foot the bill. What do you think

1:09:59

of it? I'm gonna take a break and I'm

1:10:01

gonna ask you what you think of Nightshade, which

1:10:03

is a tool designed to poison

1:10:05

AI. It gives

1:10:08

artists a fighting chance against AI. We're gonna take

1:10:10

a break and talk about that when we come

1:10:12

back. Stacey Higginbotham,

1:10:15

Ben Parr, Allyn Malventano —

1:10:17

great to have you. The show

1:10:20

today brought to you by Ecamm and

1:10:22

I know you all know Ecamm we

1:10:24

use Ecamm So when I set up

1:10:26

twit back in the day, we have

1:10:28

big fancy hardware switchers and

1:10:30

mixers and all this stuff and

1:10:32

it cost millions of dollars — very expensive —

1:10:35

and then along comes Micah

1:10:37

Sargent who does iOS today

1:10:40

all on his Mac using Ecamm. It's

1:10:42

kind of amazing. The leading livestreaming

1:10:45

and video production studio built to

1:10:47

run on your Mac. That's all.

1:10:50

Whether you're a beginner or an expert,

1:10:52

Ecamm is here to elevate your video

1:10:54

production, from streaming and recording to podcasting

1:10:57

to presenting to doing exactly what

1:10:59

we do here at TWiT. Ecamm Live is your all-

1:11:01

in-one video tool, perfect for

1:11:03

simplifying your workflow. Ecamm

1:11:05

supports multiple cameras. It supports

1:11:08

screen sharing. So, I mean,

1:11:11

I'm watching it happen on iOS Today. Well,

1:11:13

it will switch to Rosemary, it'll switch to

1:11:15

her screen, she'll switch to a phone

1:11:17

she's holding, back to Micah. It's

1:11:20

a live camera switcher that lets you and

1:11:22

them direct their show in

1:11:24

real time. You'll stand out from the crowd

1:11:26

too, with high-quality video. Oh, and all

1:11:30

that lower-third stuff we do:

1:11:32

logos, titles, graphics — they

1:11:34

all work in Ecamm. Ecamm supports

1:11:37

it. It's built in. You

1:11:39

can share your screen. You can drop in video clips.

1:11:41

You can bring on interview guests. You

1:11:43

can use a green screen — it works great with green screen —

1:11:45

and so much more. Join

1:11:47

the thousands of worldwide

1:11:49

entrepreneurs, marketing professionals, podcasters,

1:11:51

educators, musicians, and other Mac

1:11:54

users who rely on Ecamm Live daily. I

1:11:56

mean really this transforms your ability to

1:11:58

do video production anywhere, anytime. Try

1:12:01

it out right now. Get a month free when you

1:12:03

subscribe to any of Ecamm's plans. Ecamm.

1:12:05

E-C-A-M-M. ecamm.com/twit. Don't forget to

1:12:07

use the promo code twit

1:12:09

at checkout so they know

1:12:11

you saw it here. We

1:12:14

are big believers in Ecamm and I'll tell you if I

1:12:16

was starting all over again, boy, I would be using Ecamm.

1:12:19

I absolutely would. Nightshade.

1:12:21

This is an interesting

1:12:24

product created by the

1:12:26

University of Chicago. Let's see.

1:12:29

It lets artists poison their image data, making

1:12:31

it useless or worse even, disruptive

1:12:35

to AI model training. It'll actually

1:12:37

screw up the model. Ben

1:12:40

Zhao, the computer science professor who

1:12:42

led the project, compared it to putting hot sauce

1:12:44

in your lunch so it doesn't

1:12:46

get stolen from the workplace fridge. This

1:12:50

must drive you crazy, Ben. It

1:12:54

doesn't drive me crazy. I

1:12:56

am a person who sees both sides

1:12:58

of everything here. The answer

1:13:01

probably between that whole debate of E-A-C-C

1:13:03

versus E-A is probably somewhere in the

1:13:05

middle. I

1:13:07

definitely tend to think that AI will

1:13:09

be a large good in the world.

1:13:11

However, we're going to see a

1:13:14

lot of pain. We're seeing the pain

1:13:16

now. We saw that with the internet.

1:13:18

We saw that with computers. This is

1:13:20

what happens with disruptive technology. For technology

1:13:22

to be really valuable, it has to

1:13:24

be disruptive. I don't

1:13:26

think preserving print newspapers

1:13:30

is the right way to handle this

1:13:32

problem. I don't think

1:13:35

Congress is now getting in the act

1:13:37

trying to save AM radio. Sorry,

1:13:40

it's over. We

1:13:43

should not be subsidizing it or

1:13:45

forcing companies to subsidize it. It's

1:13:49

over. Time and

1:13:51

transition do matter. You think it's

1:13:54

so fast that we do have

1:13:56

to kind of slow it down.

1:14:00

In any, like, technological wave — I

1:14:02

sat down with, like,

1:14:04

one of the world's greatest AI

1:14:07

professors and board members, and since he's been

1:14:09

around for many decades, he's never seen

1:14:11

anything this quick. And so there

1:14:14

is something to that. Like, we have — like,

1:14:16

the transition takes time. Like, we can

1:14:18

see where it's going. Ah, but it

1:14:21

is painful — already painful — for, you know,

1:14:23

people who rely on, like, art

1:14:25

or copywriting. And I already know friends

1:14:28

— and they already know people — who have

1:14:30

less work, or have been laid off,

1:14:32

or their consulting business is a lot

1:14:34

less than it used to be. Obviously

1:14:36

there's going to be new stuff and new

1:14:38

things, but that's why the discussion of

1:14:40

things like UBI has popped

1:14:42

up, and why even, like, Sam Altman

1:14:44

talks about universal basic income,

1:14:46

so that everybody — because, you know, we

1:14:48

all, our work might — so everybody gets

1:14:50

an allowance a month from the

1:14:52

big tech companies. I think Nightshade ought to

1:14:54

be illegal. I say it. These

1:14:56

guys should go to jail for poisoning

1:14:59

AI data. That's what I think.

1:15:01

I guess

1:15:03

that's, like, vandalism. I'm gonna

1:15:05

go with Stacey on this

1:15:07

one: that's vandalism. You're vandalizing a

1:15:10

model, as much as

1:15:12

you don't want it in some places.

1:15:15

If you want AIs, you want — it

1:15:17

basically did it. To me it's the

1:15:19

equivalent of somebody who's walking behind you while you're

1:15:22

recording, like, a TikTok video and putting

1:15:24

their middle finger up, right? Or like the guys

1:15:26

who walk by newscasters and, like, do

1:15:29

something. And you're right,

1:15:31

it's legal. It's legal, but it's

1:15:33

rude. And more to

1:15:36

the point it doesn't do any real

1:15:38

damage to these billion-dollar corporations — although

1:15:40

it doesn't care if it's a multi-million-

1:15:42

dollar open-source model it poisons.

1:15:44

One of the examples they

1:15:46

use is the Mona Lisa. What it does

1:15:48

is make a Nightshade version that

1:15:51

looks just like the real Mona Lisa.

1:15:53

Maybe a little darker, but actually, to an

1:15:55

AI, it looks like a cat. And

1:15:57

so the AI — this is so fascinating — the AI

1:16:00

starts... If you're teaching people how the AIs see

1:16:02

things, it would work. It could

1:16:04

take fewer than a hundred poison

1:16:06

samples to corrupt a stable diffusion

1:16:09

prompt.
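(To make the trick concrete: the published idea is to add a small, human-imperceptible perturbation that drags an image toward a different concept in a model's feature space. A heavily simplified sketch — the random linear map is a stand-in for a real image encoder, and the actual Nightshade optimization is far more involved:)

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 256))     # stand-in for a real image encoder
encode = lambda x: W @ x

image = rng.uniform(size=256)      # toy "Mona Lisa", flattened pixels
target = encode(rng.uniform(size=256))  # features of a toy "cat" image

poisoned = image.copy()
for _ in range(500):
    # Gradient of ||encode(poisoned) - target||^2 w.r.t. the pixels
    grad = 2 * W.T @ (encode(poisoned) - target)
    poisoned = poisoned - 1e-3 * grad
    # Keep the edit in a tiny per-pixel budget so a human sees no change
    poisoned = image + np.clip(poisoned - image, -0.05, 0.05)

print(np.abs(poisoned - image).max())             # pixels move <= 0.05
print(np.linalg.norm(encode(image) - target))     # feature distance before
print(np.linalg.norm(encode(poisoned) - target))  # ...and after: smaller
```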

1:16:11

So what this is going to end up doing is

1:16:13

just... they're just going to adapt and overcome. Oh,

1:16:15

believe me. This isn't going to

1:16:18

last five seconds. I agree

1:16:20

with you. This is not really a threat. But

1:16:22

if it were, it should be, they should go to jail. So

1:16:26

Zhao says... Wow, I wouldn't go

1:16:28

that far with you. This is not jail. This

1:16:30

is vandalism. Are we sure this

1:16:32

is not an actual AI Leo who is

1:16:34

all in favor of AI? This

1:16:37

is vandalism. I'm just saying. We have

1:16:39

the opportunity for technology... Leo's training his

1:16:41

future AI. He

1:16:44

probably needs to be like... Wow, future

1:16:46

AI is way more conservative than

1:16:48

I really thought. We have

1:16:50

an opportunity to create

1:16:53

something transformative that could be

1:16:56

like fusion. We were saying before the show

1:16:58

that if we come up with cold fusion,

1:17:00

that's the kind of thing that could change

1:17:03

everything. And of course it could. This

1:17:06

AI has that kind of potential, I think.

1:17:09

In fact, I might

1:17:11

even say AI might invent cold

1:17:13

fusion. You might be stopping cold

1:17:15

fusion by turning the Mona Lisa

1:17:17

into a cat. All

1:17:20

the hot fusion guys are going to

1:17:23

be poisoning the set with... Yeah. Maybe

1:17:26

not realism. Maybe they'll go

1:17:28

to jail. Just a small fine.

1:17:32

No. This is

1:17:34

the sort of thing... Look at

1:17:36

it from a purely evolutionary standpoint, as an

1:17:38

effective accelerationist. You should be grateful. This

1:17:40

is the sort of thing that's

1:17:43

going to make it bigger, badder, better and

1:17:45

stronger. You're getting great insights into how something

1:17:47

works. You're going to be able to prevent

1:17:49

against it. If you're a really smart AI, you'd

1:17:51

figure out how to avoid the Nightshade

1:17:54

attack. AI models have

1:17:56

to learn and adapt to them. It's a

1:17:58

hostile world out there. They have

1:18:00

to learn to be smarter than humans. That's

1:18:02

right. That's actually one of

1:18:04

the biggest issues right now, as I see

1:18:06

it with AI, is that if your data

1:18:09

set has some garbage in it, this is

1:18:11

intentional garbage, but there's plenty of other content

1:18:13

that is just incorrect. So that's a good

1:18:15

point. Maybe this will just teach AI to know how

1:18:18

to avoid garbage, garbage inputs. Okay.

1:18:22

All right. Never mind. You don't have

1:18:24

to go to jail. It's okay now. It's okay.

1:18:27

You're doing God's work. You

1:18:29

are not actually an AI

1:18:31

promoting the AI agenda, which

1:18:34

will be a thing that we have to

1:18:36

actually question probably in the next couple of

1:18:38

years or now. Yeah.

1:18:41

And you don't want your AI developed in this

1:18:43

protected hothouse. No, that's true. This

1:18:45

is the equivalent of a latchkey kid

1:18:47

versus a little helicopter parented child. We

1:18:49

don't want latchkey AI. We

1:18:52

want helicopter parented AI. No, we do want latchkey AI. Oh, no.

1:18:55

We do want latchkey AI. Yeah, you want that.

1:18:57

And we have resilient AI that knows how to

1:19:00

put a TV dinner in and

1:19:02

make dinner for itself, is

1:19:04

what you're saying. It maybe has the mental

1:19:06

health issues. You

1:19:08

either put your kid in the bubble and they

1:19:10

never get exposed to diseases and in the moment

1:19:12

they go into the real world, they get sick.

1:19:14

Or you let them play in the mud. Let

1:19:17

them play in the mud. Let the

1:19:19

AI eat mud. It's good for its

1:19:22

immune system. Now

1:19:24

you're talking. I get it. Joseph

1:19:27

Cox at 404 Media,

1:19:31

they review multiple examples of AI ripoff

1:19:34

articles making their way into Google

1:19:36

News. Google

1:19:39

says, well, we don't pay that much attention to whether the article

1:19:41

was written by an AI or a

1:19:43

human. So this has always

1:19:45

been a problem, which is people see, I'm

1:19:48

sure Ben, you had articles from Mashable ripped

1:19:50

off. Stacey, you had articles from Stacey on

1:19:52

IoT ripped off, where somebody would just copy

1:19:54

it and put it on their site and

1:19:57

a few Google ads to make some money. Right?

1:20:00

But now it's smarter, they have AI

1:20:02

writing these things. Yeah,

1:20:05

and except I've run across some

1:20:07

of these articles in the storage industry and everyone

1:20:10

so far has been grossly wrong

1:20:12

about something — it states

1:20:14

things as fact that are not actually

1:20:16

fact, like no, that's not how that

1:20:19

works, sir — it's totally wrong. Like, yeah.

1:20:21

Lead storage space researcher Taylor Swift said.

1:20:24

Right. Hey, don't

1:20:26

knock Tay Tay, she probably could do it if

1:20:28

she wanted to. She

1:20:30

probably could. It's funny

1:20:32

that Twitter finally added

1:20:35

some moderation when

1:20:37

they were flooded with deep fake

1:20:40

porn videos of Taylor Swift. Then

1:20:43

they said, oh no, we gotta block it. So what'd they do?

1:20:45

They don't have a lot of tools. So

1:20:47

they just prevented anybody from searching

1:20:49

for Taylor Swift. Can't they

1:20:51

just let Grok figure that out? Don't they have

1:20:53

their own AI? They do have an AI — use

1:20:56

it. Yeah, yeah. This

1:20:58

is two separate stories. There's the story

1:21:00

of deep fakes becoming better thanks to

1:21:02

AI, which is very

1:21:05

scary, especially for women. I

1:21:07

was watching a TikTok where — there's

1:21:12

just a lot of guys who create

1:21:14

very explicit things and it doesn't really

1:21:17

happen when you switch the genders

1:21:19

around anywhere near as much. It's

1:21:22

a real issue that's gonna be a real problem. And

1:21:25

these are the kind of things that we

1:21:27

have to go and solve, which is why we

1:21:29

have the entire debate that we had a little

1:21:31

bit before. The other story being like, there's just

1:21:34

no one to moderate anything over at X and

1:21:36

it took them forever. Now I think they're not

1:21:38

gonna try to do something. Yeah, they're gonna build

1:21:40

a facility in Austin. But you know, 100

1:21:44

moderators is not gonna be enough, by

1:21:46

the way. I just wanna tell them. What

1:21:48

are you, nuts? I

1:21:50

don't know — they had a thousand people doing it before,

1:21:52

but that 100 is not gonna be enough. I

1:21:56

mean, don't piss off Taylor Swift. Actually,

1:21:59

I'm really. Okay, so first

1:22:01

of all, there's no way

1:22:04

to stem this tide, right? You're going to see

1:22:06

deep fakes of everybody

1:22:10

soon and there'll be no way to stop

1:22:12

this. You can't, I mean, what, are they

1:22:14

going to turn off search for everybody? Just

1:22:16

say, no, you know, I mean, basically you

1:22:18

have to turn off Twitter. I

1:22:21

feel for Taylor, I think it's terrible, but

1:22:24

I think we just have to live with it, don't

1:22:26

we? So you're advocating that there's no

1:22:28

way to solve the problem of deep

1:22:30

fake porn. I don't think there is.

1:22:33

I'm not advocating it. I'm not happy about it,

1:22:35

but I don't think there is a way to solve it. Do you think there

1:22:38

is? It's, there is absolutely a way to solve it.

1:22:41

Yes, you find out who's distributing it

1:22:44

and who's creating it and you punish

1:22:46

them. Now, do we have

1:22:48

an appetite for doing that? No. Is

1:22:51

it scalable to do that? No.

1:22:55

We're not doing so well with CSAM and we have

1:22:57

a lot of infrastructure to fight CSAM. We

1:23:01

are, okay, we will not solve everything, right?

1:23:03

But the penalties for child pornography are incredibly

1:23:05

high. They are and they should be.

1:23:07

And I guess you're right. You could make

1:23:09

penalties for deep fake porn that would be

1:23:12

equally high. That would be fine. Is

1:23:15

it? That would be fine.

1:23:17

Now we're treading right back into the territory

1:23:19

of the George Carlin thing. Just

1:23:21

a different context. So I was like, there's a couple

1:23:24

issues here and that is, I mean, is it illegal

1:23:27

or is it just?

1:23:30

Creepy. Distasteful. Just

1:23:32

distasteful, ugly. Yeah. Although

1:23:35

for a guy who's advocating for

1:23:38

putting saboteurs in jail, I would

1:23:40

argue that your stance on this

1:23:42

betrays a certain lack of... I'm

1:23:44

still working it out. I'm thinking. I feel

1:23:47

bad for Taylor. I do. And I

1:23:49

would feel bad for anybody this happened to. But

1:23:51

I just don't know if you could stop it. I

1:23:54

disagree. Start with that one. One,

1:23:57

Stacy's absolutely right. We have to go

1:23:59

after these people and have actual backbone

1:24:01

to do that, and that's — I

1:24:03

hope we start to get that but there are

1:24:05

technological ways to go after it too. So

1:24:08

For example, there's a company called the Hive — the

1:24:11

Hive.ai — and they've been around for a long time,

1:24:13

and they literally — like, you can detect the fakes,

1:24:15

you can detect whatever thing and it will just

1:24:17

like remove it in

1:24:19

live video and among other stuff.

1:24:21

There is technology to do things

1:24:23

like this and to detect and

1:24:25

never underestimate the technological ability. By

1:24:27

the way it's AI doing it. Right.

1:24:31

Yes. This is how it is. The

1:24:33

technology does exist and the technology

1:24:35

is getting better and more of

1:24:37

it is existing. You just have

1:24:39

to convince companies like X, Twitter,

1:24:43

whatever they call them to implement some of

1:24:45

this stuff and you're going to see a

1:24:48

large decrease in it. You do it across

1:24:50

other places. The technology exists, the detection technology

1:24:52

continues to exist. As long as there's

1:24:55

bad stuff there's always people building stuff to

1:24:57

fight against the bad stuff and

1:24:59

there was always a technological solution in

1:25:01

addition to the human one which is

1:25:03

let's put these people into jail or

1:25:06

fine them. I'm not against that. I

1:25:08

agree you should definitely do that. Yeah

1:25:10

it might just be as simple as you just have to build a

1:25:12

better AI. Which ironically

1:25:15

enough, here's an even better reason for

1:25:17

the AI on the edge thing because you

1:25:19

might need this. I should point out that

1:25:22

there's a lot of not fake

1:25:24

porn on Twitter just as a

1:25:26

feature of famous people. So

1:25:32

can any of you tell between something that's... The

1:25:35

technology that distinguishes between whether it's AI or

1:25:37

not AI. Harder problem? Yes.

1:25:40

Is it doable? Absolutely. Okay. Right. Have any

1:25:42

of you recently in the past few months

1:25:44

gotten those texts that sound like someone

1:25:46

is trying to pretend that they just found

1:25:48

your number? Every

1:25:50

day. That's all AI driven. You're

1:25:53

gonna need AI on the

1:25:55

edge on your side to be able to

1:25:57

filter that sort of stuff. Yeah. Fight AI

1:26:00

with AI at that point. AI

1:26:03

all the way down, isn't it? Yeah,

1:26:05

basically. So really what AI... I

1:26:08

mean, that was our thinking was

1:26:10

eventually over time that we would have

1:26:12

battling bots. Right. Yeah.

1:26:15

I mean, we're getting there. Yeah,

1:26:18

very close to it. I guess... Go

1:26:21

ahead, Ben. No, I can't. I have

1:26:23

a story about battling bots and I

1:26:26

am not allowed to talk about it in

1:26:28

the public. I will tell you all personally.

1:26:30

Okay, offline we'll talk about it. It involves

1:26:32

governments and... Oh, yeah. Oh,

1:26:35

boy. I mean, it kind of is like

1:26:37

a little terminator-y where you got AI fighting

1:26:39

AI and robots and then the humans

1:26:42

are just kind of hiding in trenches, staying

1:26:44

away from it. I

1:26:46

don't know if this is exactly the future we want.

1:26:50

But it's the future that will get us to

1:26:52

strap on those face computers and just... I

1:26:58

can't. Or do they

1:27:00

cure all cancer and all diseases? Well, both

1:27:02

branches exist and they're probably going down both

1:27:04

at the same time. Everybody

1:27:08

will live forever so that they could sit in

1:27:10

their bunker with their... Right, avoiding AI. Yeah.

1:27:13

Avoiding the robots. Yes. Actually,

1:27:15

there's a precedent for that because isn't that exactly what happened with the internet?

1:27:18

We were very bullish about it at first, but really it

1:27:20

turned out to be everything that's good and bad if you

1:27:22

have human beings is on the internet. Everything

1:27:25

that's good and bad about us as human beings will

1:27:27

be expressed through AI. Except

1:27:30

at the early part of the internet, you had a

1:27:32

certain subset of people that were trying... Generally were

1:27:34

trying to use it more for good. Yeah, but

1:27:36

they lost. They were... Yeah, they sort

1:27:39

of lost to that. Well, now with the AI stuff, it's

1:27:42

sort of flipped the other way, right? You're already seeing malicious

1:27:45

or immoral uses for it,

1:27:48

taking hold even before the real good stuff

1:27:50

can come out. Well, as has been pointed

1:27:52

out, it's happening faster than ever before, isn't

1:27:54

it? Yeah. Thank

1:27:57

you. Nah.

1:28:01

Nah. Nah. Got nothing.

1:28:06

I can't wait until February 8th when you and I are

1:28:08

going to talk about the most depressing novel ever written. No,

1:28:13

it's not. It's actually a vision of a kind

1:28:17

of dystopian future. Hashtag

1:28:21

Phoenix Down the Tubes. But

1:28:23

it's very interesting. And he

1:28:25

is a great writer. I really love Paolo

1:28:28

Bacigalupi's stuff. So I look

1:28:30

forward to that. That's going to be a club special. February

1:28:33

8th. What is it? Is

1:28:36

it another one of those we'd start at

1:28:39

9am things? We

1:28:41

are changing the time. So we don't have the

1:28:43

actual time. Thank you, Stacy. I think it's going

1:28:45

to be 2 p.m. your time. Oh,

1:28:47

hallelujah. I don't have to

1:28:49

get up so early. We

1:28:52

will get you the new time. Right now it

1:28:54

says 3pm Pacific. 6pm. You

1:28:56

know, it should be prime time. 95

1:29:00

people interested. It's not too late. You have

1:29:02

time to read The Water Knife.

1:29:04

I would. It's well written.

1:29:06

It's fascinating. He's great at characterizations.

1:29:10

But it's just kind of a gloomy future. But you know, it

1:29:12

may well be our future. So The Water

1:29:14

Knife by Paolo Bacigalupi. You can get it

1:29:17

on Amazon. You can

1:29:19

get it at your bookstore. You can get it on

1:29:21

Kindle. You can get it at Audible. There is a

1:29:23

very good Audible version. That's what I'm listening to.

1:29:26

Look forward to that with you, Stacy. It's going to be so much fun.

1:29:30

Now there is one hitch in that

1:29:32

get along. If you are

1:29:35

not yet in Club Twit, you cannot

1:29:37

participate. But there is a

1:29:39

way out. Just join Club Twit. This

1:29:42

I think as we go forward

1:29:44

into 2024, it's become very

1:29:46

clear, you know, with media failing right and

1:29:49

left, this is our future as

1:29:51

a podcast network: getting our listeners

1:29:53

to support us. We don't

1:29:55

need all of you. Yeah, you can

1:29:58

get off the hook. But we need about

1:30:00

five to ten percent. We're

1:30:03

right now about two percent. We're not really

1:30:05

doing so well. So if you're

1:30:07

not yet a club twit member but you value what

1:30:09

you hear on our programming, could you help us out?

1:30:13

Seven bucks a month you get ad free versions

1:30:15

of all the shows, you get additional programming like

1:30:17

the book club that you don't get anywhere else,

1:30:19

you get access to the discord. All of

1:30:23

that for seven bucks a month. I think

1:30:25

it's worth it and you get the good feeling you're supporting

1:30:27

what we're doing. twit.tv

1:30:30

slash club twit. We are working on other,

1:30:32

less expensive ways. Right now you can

1:30:34

get any show including this one for

1:30:36

$2.99 a month but

1:30:39

you know we're gonna look at YouTube subscriptions too

1:30:41

because I understand that you know everybody has you

1:30:44

know limits on how much they can spend but if you can

1:30:46

we'd really appreciate it. And

1:30:48

if you think we're worth more than seven bucks a month

1:30:50

you can also make it

1:30:52

ten bucks if you want or more. We appreciate it. twit.tv

1:30:56

slash club twit. Our

1:30:58

show today brought to you

1:31:00

by ExpressVPN. It's common these

1:31:02

days to overspend on streaming

1:31:04

services. Well

1:31:08

since I started using ExpressVPN I've saved

1:31:10

so much every month by simply changing

1:31:12

my online location. There are a lot

1:31:15

of reasons to use a VPN

1:31:17

of course for security, for privacy but

1:31:19

this is also valuable. You can

1:31:21

watch Netflix in countries all

1:31:23

over the world. If you already have a

1:31:25

Netflix subscription it's okay to do

1:31:27

it. All you got to

1:31:30

do is use ExpressVPN. They're in more

1:31:32

than 90 countries so you

1:31:34

want to switch to another country to watch content from

1:31:36

that country. You tap one button you say I am

1:31:38

in London right now and now you're watching Netflix

1:31:40

England. You refresh the page

1:31:42

and it shows up. You can even use

1:31:44

ExpressVPN to get discounts because some services cost

1:31:46

less in other countries. At

1:31:49

less than seven bucks a month

1:31:51

ExpressVPN really pays for itself. It

1:31:55

is security you can put on your router,

1:31:57

you can put on your phone, you can put it on your computer,

1:32:00

very easy. I have it everywhere when I

1:32:02

want to switch it on, when I'm traveling,

1:32:04

you know, how many times you

1:32:06

go to an airport and there's free Wi-Fi, right? But

1:32:08

is it safe? No! Use

1:32:11

ExpressVPN and it is safe.

1:32:13

It's a no-brainer. If you

1:32:15

want to get more shows, if you want to be more

1:32:18

secure, if you want to protect your privacy, and if you

1:32:20

want to save money while you're at it, go to expressvpn.com

1:32:22

slash twit. When you do that, you get

1:32:25

three months free with a one-year plan. That's

1:32:28

the best deal. Less than seven

1:32:30

bucks a month. E-X-P-R-E-S-S-V-P-N dot com slash

1:32:35

twit. To learn

1:32:37

more, expressvpn.com/twit. We thank them so

1:32:39

much for their support of

1:32:42

this show. I've used them

1:32:44

for years. Yeah, is that your favorite too?

1:32:47

It is. Yeah.

1:32:49

All right. We

1:32:53

kind of peripherally touched on it, but I think we really

1:32:56

have to address this. Apple's response

1:32:58

to the EU and their Digital

1:33:00

Markets Act. The

1:33:02

EU said that, in fact,

1:33:04

Apple has got to open up their store. The

1:33:07

courts in the US have also

1:33:09

said, the Supreme Court last week said, we're not

1:33:11

going to touch this case. So the courts have

1:33:13

also told Apple, you can't stop people from putting

1:33:16

a link in their app to a different payment

1:33:18

system. So Apple's got to deal with this. But

1:33:21

it seems like in every case,

1:33:24

Apple's response has been, well, Tim

1:33:26

Sweeney has the best quote.

1:33:28

He's, of course, the guy

1:33:30

at Epic who sued to

1:33:33

get them to open up the store so he

1:33:35

could put Fortnite in an Epic store on

1:33:37

the iPhone. He says,

1:33:39

it's a devious new instance

1:33:41

of malicious compliance. Well,

1:33:44

I'd expect Sweeney to say that. But what

1:33:46

do you guys think? Apple is going

1:33:48

to charge 27% instead of 30% if you have

1:33:52

a link to your own store. They're going to

1:33:54

charge a commission. They are going

1:33:57

to allow new app stores

1:34:00

in the EU exclusively, but you'll

1:34:02

still have to run your app by Apple

1:34:04

and get approval and if you sell it

1:34:06

there, they're still going to take 27% or 15%. Actually,

1:34:11

I think it's free for free apps up to

1:34:13

the first million but as a number of people

1:34:15

have pointed out, that's not much of a solution.

1:34:17

What if you have a freemium app like

1:34:20

Fortnite and

1:34:22

all of a sudden you owe Apple half a million

1:34:24

dollars because you got more than a million installs?

1:34:28

That's an untenable situation.
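(The fee being described is Apple's new Core Technology Fee under its announced DMA terms — EUR 0.50 per "first annual install" past one million a year — which is how a free-to-download app can end up owing half a million:)

```python
def core_technology_fee(annual_installs: int,
                        fee_eur: float = 0.50,
                        free_installs: int = 1_000_000) -> float:
    """EUR 0.50 per 'first annual install' beyond the first million."""
    return max(0, annual_installs - free_installs) * fee_eur

# A free-to-download hit owes real money once installs pass the threshold
print(core_technology_fee(2_000_000))  # 500000.0 EUR -- "half a million"
print(core_technology_fee(900_000))    # 0.0 -- small apps pay nothing
```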

1:34:31

Mozilla has complained because, yeah, Apple

1:34:33

says in the EU you can have

1:34:35

a browser that doesn't use Safari's WebKit.

1:34:38

You can have your own browser but Mozilla says,

1:34:40

well, great — some solution. So everywhere

1:34:42

but the EU, we still have to make our

1:34:44

Firefox browser, as always, using the

1:34:47

Safari backend but in the

1:34:49

EU we make a whole new browser using

1:34:51

our backend. That doubles our cost. How

1:34:53

have you helped us, Apple? Thoughts?

1:34:59

Have I laid it out accurately?

1:35:02

I mean the EU is going to give some rulings

1:35:04

that are not going to be favorable to Apple. They're

1:35:06

going to slap them down, you think? Yeah,

1:35:09

I mean, yeah, there's interpretation and

1:35:12

it's going to be taken to EU court

1:35:14

and it's probably going to have more stuff

1:35:16

for Apple to go and do and like,

1:35:19

look, like, EU is always a couple years

1:35:21

ahead when it comes to things like this.

1:35:24

This stuff will come stateside a certain point. I

1:35:27

don't know what the middle ground eventually is

1:35:29

because I can understand, like, Apple

1:35:32

from a business position and from a consumer

1:35:34

position I can understand but I can really

1:35:37

understand the consumer position of, like, you know,

1:35:39

on your computer you can sideload whatever you

1:35:41

want and download things and it's been like

1:35:43

that for who since the beginning

1:35:45

of time. So

1:35:47

why can't you do that with your mobile

1:35:49

device? So I mean... I'm going to have

1:35:52

to channel, because I think all of you will probably feel the same

1:35:54

way, but I'm going to channel Alex Lindsay

1:35:56

who argues vehemently against what the EU

1:35:58

is doing. It's Apple's

1:36:00

platform. Apple says we want

1:36:03

to preserve the security and privacy

1:36:05

of our customers. They should

1:36:07

be allowed to make whatever rules they want

1:36:09

on their platform. And if you

1:36:11

don't like it, suck it. Go

1:36:14

put an app store on Android. You

1:36:16

don't have to be on iPhone. And

1:36:19

it's unfair of any company, even

1:36:21

Epic, to say, oh, we want to use

1:36:23

the iPhone without paying Apple what it's due. Apple

1:36:26

made this platform. Plus, Alex makes the

1:36:29

point. It's better for users if you

1:36:31

don't have these different stores and different

1:36:33

browsers. This is not as

1:36:35

good an experience for users. There. Now

1:36:38

I've actually done that. This is incredibly true. OK. So

1:36:42

I've been thinking about this a lot because I

1:36:44

used to be a telco reporter and the whole

1:36:47

idea of being a common carrier. And

1:36:50

everybody wants to be a platform these days,

1:36:52

right? And Apple is a true platform and

1:36:54

has built something out. And

1:36:57

once you get to a certain point, Amazon's

1:37:00

another great example. You could be a big platform and be compelling to people, not forcing them, but they would want to

1:37:12

use your platform because that's where

1:37:14

the users are. That's what Apple's arguing.

1:37:16

They're like, our platform is so amazing

1:37:19

that people

1:37:21

want to be on it. And that's why

1:37:23

people should pay us a commission, right? Now

1:37:27

true fact of the matter is Apple does a lot of things

1:37:29

that are anti-competitive to keep people

1:37:32

on their platform and to make their

1:37:34

platform to lock folks in, thus giving

1:37:36

them that huge user base. So

1:37:40

if you could have a truly competitive platform,

1:37:44

would it have zero switching costs? What

1:37:46

would that look like? Would that

1:37:48

make something like charging these fees fair? I would

1:37:50

say yes. If you had zero switching costs, it

1:37:52

would. At a certain

1:37:54

point, if you lock people in via like, maybe

1:37:58

it's messaging, maybe it's all

1:38:02

of the devices working only with other devices. However you

1:38:04

want to look at it, does

1:38:07

that make it anti-competitive?

1:38:09

And then the EU makes sense trying

1:38:11

to enforce these things, even though I'll admit its

1:38:14

solutions are a pain in the ass. There's a

1:38:16

very famous Supreme Court decision

1:38:20

from almost 50 years ago

1:38:23

that opened up the Bell

1:38:25

System Network. Carterfone? Carterfone,

1:38:27

yeah. In

1:38:29

fact I remember seeing the Carterfone at the Computer History Museum. It's kind of a funky old device. But this was illegal. The Bell System had

1:38:37

rules against putting third party devices on their

1:38:39

network. And they said it's

1:38:41

the same thing. It's about the security and privacy

1:38:43

of, well I don't know if they talked about

1:38:45

privacy, but certainly it was about the security and

1:38:47

the integrity of our network. We don't want

1:38:50

customers coming along and putting any old device on

1:38:52

the network. You couldn't even use your own phone.

1:38:54

You had to rent it from Western Electric from the

1:38:56

Bell System. Supreme

1:38:58

Court said, no that's wrong. And they

1:39:00

opened up the Bell System with Carterfone. Carterfone was allowed to attach a two-way radio to the telephone. It

1:39:10

was to be used in the Texas oil fields. I

1:39:13

think they only sold a couple of thousand. But

1:39:16

the decision changed everything. It opened up the

1:39:18

Bell System. And

1:39:21

it made it possible for you to buy your own

1:39:23

phone, for instance, and put it on the phone's network.

1:39:28

The Hush-A-Phone. What, a Hush-A-Phone? Give

1:39:30

me one of those. This is another thing

1:39:33

AT&T objected to this small plastic snap-on which would let business phone users speak privately. Like you were talking into your shoe. A cone of silence.

1:39:48

Yeah, it was a cone of silence. It took the

1:39:50

FCC seven years. And

1:39:58

they finally said, no, yeah, you can't put a Hush-A-Phone on the phone. Carterfone

1:40:03

was a big deal, right, Stacey? That

1:40:06

opened up the Bell network. It

1:40:09

did, to attach devices that were not owned by

1:40:11

the Bell company. So those were physical

1:40:13

devices on a physical network. Yeah,

1:40:15

it's not a perfect analogy. But I think

1:40:17

it's very similar. That's

1:40:20

why I've been thinking about this a lot. Because, you know,

1:40:22

one of the, like, if

1:40:25

you're a platform, what are

1:40:27

your rights and responsibilities to your

1:40:30

users and to the people

1:40:32

who come to meet your users on the

1:40:34

platform? And it's real

1:40:36

unclear. Like, I

1:40:39

wish I had a really cool opinion for you. Like, they

1:40:41

should go to jail. But I don't. I'm

1:40:44

trying to work it out in my brain. You'll never

1:40:47

make it in talk radio, Stacey. You got it. I

1:40:49

know. I'm just saying, this one is truly one where a nuanced opinion is, like, probably the

1:40:55

correct answer here. There was actually a

1:40:58

great article that Stephen Sinofsky wrote, you

1:41:00

know. Yeah, I was so surprised that

1:41:02

Sinofsky, who was the much-hated guy at

1:41:05

Microsoft who ran Windows, the famous Windows

1:41:07

8, put out

1:41:09

an article in favor of Apple, saying

1:41:11

Apple should not bow to the

1:41:14

EU. Apple is,

1:41:16

here's Sinofsky's article. Apple

1:41:20

has done the right thing. And they

1:41:24

shouldn't give in. My heart sank, he

1:41:27

says, when I read the Digital

1:41:29

Markets Act. I

1:41:32

could feel the pain and struggle

1:41:34

product teams felt in clinging to, at best, or unwinding, at worst, the most substantial improvement in computing ever introduced: the iPhone. Do

1:41:48

you agree with Stephen? This opinion surprised you?

1:41:51

Yeah, it kind of did. He had such a... Does

1:41:55

he work for Andreessen Horowitz? I don't know where

1:41:57

he works nowadays, does he? Yeah, for

1:42:00

a long time. Yeah, he's a venture

1:42:02

capitalist and an Andreessen

1:42:05

venture capitalist at that. But

1:42:07

okay. Technology positivity is the name

1:42:10

of the game over

1:42:13

there. That's a good way to put

1:42:15

it, by the way. Thank you. Technology

1:42:17

positivity, that's really good. There's

1:42:19

a difference here between what's

1:42:22

best for users and the answer is it's

1:42:24

probably a mixed bag because you can make

1:42:26

a real argument for both sides. And

1:42:28

what's going to probably happen, which is that

1:42:31

slowly, there'll probably be some more prying open

1:42:34

as more legal stuff comes up over

1:42:36

and over and over again. It'll be

1:42:39

a long time though. And whether it's good

1:42:41

or not, you know, I have no idea.

1:42:43

We will probably see that

1:42:45

result at some point when you have

1:42:47

the first side loaded app store coming

1:42:50

in and you start to see stuff and

1:42:52

you start to see good and bad and

1:42:54

who knows, but it's going to happen.

1:42:59

I find if you think about like, are

1:43:01

we at the same point from a

1:43:03

regulatory and a business perspective that we

1:43:05

were in like the, I don't

1:43:08

know, 1800s with like the railroads and the

1:43:10

launch of like electric networks and that sort

1:43:12

of thing where you need something like a

1:43:14

public utilities commission. And I'm literally

1:43:17

just wondering this because the

1:43:19

economics in online

1:43:23

services, software, that sort of thing,

1:43:26

they, you have to be big and

1:43:29

you have to have like basically

1:43:32

monopoly positions or a duopoly position.

1:43:34

So when you have that, you

1:43:37

don't have the ability to compete in the way

1:43:39

you would if you had, you know,

1:43:41

if everyone could, could play with

1:43:43

these things, even, even with startups, they

1:43:46

just get bought by these big guys, right? And even if

1:43:48

they weren't bought by these big guys and the FTC was

1:43:50

like, Hey, you can't have them, those

1:43:53

companies would fail because they can't really go

1:43:55

up easily against these monopolies. And

1:43:58

so I wonder if we're going to have a much

1:44:00

more or we need a

1:44:02

much more kind of adversarial

1:44:04

thing against regulators and digital

1:44:07

businesses, which I mean,

1:44:10

you know, we hate but I

1:44:15

really don't see a lot of options for us.

1:44:17

We're already seeing more of that. Sorry

1:44:20

to interrupt. Yeah, go. We're

1:44:23

seeing more of that from

1:44:25

the what the SEC,

1:44:27

the FTC, like there is a lot

1:44:29

more aggressive behavior now. Some of the

1:44:31

cases they've chosen, they have not won.

1:44:33

They have not been great cases. Others,

1:44:35

you know, they're like clearly there's going

1:44:37

to they're going to win or

1:44:39

they're going to get something. I'm rooting for them

1:44:41

in some cases. For instance, Lina Khan: the FTC

1:44:43

said it should be as easy to cancel an

1:44:45

account as it is to create one to

1:44:48

which the cable companies responded. But people might

1:44:50

cancel by accident if we make it too

1:44:52

easy for them. I

1:44:55

think in this in many cases, the

1:44:57

regulators are fighting a good fight against

1:45:00

companies that are rapacious, that are as

1:45:03

good as evil. You know, here's

1:45:06

Sinofsky has something interesting to say. He

1:45:09

talks about a dichotomy

1:45:11

in computing. The

1:45:14

Microsoft Android way

1:45:16

where it's open, but

1:45:21

as a result, security, privacy,

1:45:23

abuse, fragility and other

1:45:26

problems of the PC show up on Android at

1:45:29

a rate like the PC. But

1:45:31

Steve Jobs had a vision for computing where

1:45:34

it was enclosed and abstracted to make

1:45:36

it safer, more reliable, more private, more

1:45:38

secure. And you get all these benefits

1:45:40

like better battery life, better

1:45:43

accessibility, more consistency, ease of

1:45:45

use. He says these attributes

1:45:47

did not happen by accident. They

1:45:49

were the process of design and architecture

1:45:51

from the very start. They're the brand

1:45:53

promise of the iPhone. Just as much

1:45:55

as the brand promise of Android and

1:45:58

Windows by extension is open. Openness,

1:46:00

ubiquity, low price, and choice.

1:46:04

These choices are not mutually compatible.

1:46:06

You don't get both. Sinofsky

1:46:09

writes, I know this is horrible to say

1:46:11

and everybody believes there's somehow malicious intent to

1:46:13

lock people into a closed environment or on

1:46:16

the Windows Android side an

1:46:19

unintentional incompetence that permits bad software to invade an ecosystem. Neither of these would

1:46:23

be the case. Quite simply, there's

1:46:25

a choice between engineering and architecting for

1:46:27

one or the other. It's not true.

1:46:38

You can build open and open can be

1:46:40

way more secure because there are more people

1:46:43

assessing it. I mean, that's a

1:46:46

talking point you encounter all the time

1:46:48

when arguing with people like Steven Sinofsky.

1:46:51

Now I think there is a

1:46:53

legitimate argument for usability.

1:46:57

Yeah, although I have to point out that iOS

1:47:00

is, you know, in the early days the

1:47:02

idea of Macintosh was there's only one way

1:47:04

to do anything and there

1:47:06

were very strong user guidelines, user

1:47:08

interface guidelines. So everything is very

1:47:11

consistent. But that's not

1:47:13

the case with iOS. A lot of times, especially because they've

1:47:15

gotten rid of all the affordances like the home button, a

1:47:17

lot of times you'll look at an iOS app and go,

1:47:19

I don't know what to do to

1:47:21

get out of this, to back up. I'll just

1:47:23

force close the app sometimes because I'm just like

1:47:25

sitting there, I don't know what to do here.

1:47:28

That's very common because the UI is not

1:47:31

well specified. So I think in some ways

1:47:33

iOS can be more confusing than Android. You

1:47:38

use Android, right? You're an

1:47:40

Android user. Yeah. I

1:47:42

mean, yes, I use both on

1:47:44

my phone. Yeah, yeah. Yeah.

1:47:46

You choose the other one. I mean,

1:47:48

there's enough. There's still

1:47:51

enough consistency with the iOS

1:47:53

iPhone experience across things

1:47:55

that I'm, like, going back and forth in my brain.

1:48:00

on this one because you know on an

1:48:02

iPhone my mom has never had to worry about viruses. She's had to worry about viruses on her computer. This is probably, like, the basic core argument that, like, Apple wants to make, and it's a legitimate argument: if you have more control over what goes onto an iPhone, you never need, like, antivirus software. Never once have I ever had to think about that. That's a good point. Or really on the Mac; the Mac's more secure even than Windows and Android. Okay, so you think that's a straw man argument that Sinofsky's raising, Stacey? I think

1:48:33

security is a straw man argument, especially in the year of our Lord 2024. How about you, Allyn? Where do you weigh in on this? You've been strangely silent. I

1:48:46

don't know. By the way, ladies and gentlemen, the right answer: I don't know. I mean, I'm glad

1:48:57

the EU exists. I'm glad they're putting pressure on these platforms to consider consumers. It's weird that we don't so much in the US; that the FTC and, you know, other consumer watchdogs like Congress are really not getting involved in this. I guess because in the US we believe in the free market, and we kind of have this core belief that the market will solve this problem. The EU does not. I don't know. I think the EU is more astute on the monopolies, maybe, that are happening now, or they're less reluctant to regulate them, anyway. Yeah,

1:49:35

I mean, but I point to the EU browser cookie pop-up. Is that good as well? My god. I know that annoys you, but it's wasted billions of human hours clicking that thing that does nothing. You don't

1:49:53

agree, I guess. I'm not trying to trigger you. I don't mean

1:50:01

to trigger you. It's

1:50:03

just a little spastic, like, oh god, we're going to get you. You're warning us: you're going to fuck up. If you're listening to this, you cannot see the spastic face. It wouldn't

1:50:11

be so bad if it didn't pop up in a slightly

1:50:14

different form in every single

1:50:16

place, especially on mobile. And what does the pop-up accomplish? It literally does

1:50:21

nothing. It's like a silly button.

1:50:24

Well, website administrators,

1:50:28

I have been to plenty of sites where they're just like,

1:50:30

hey, we got to collect this. Or

1:50:32

they could choose not to do it. That's what we do

1:50:34

at our site. That's the other option. You can't not. Oh,

1:50:37

you mean choose not to collect cookies. You

1:50:40

can choose to collect only necessary ones, which is kind

1:50:42

of the option that you could do. I mean, there

1:50:44

are ways... Don't you still have to say

1:50:46

that though? Don't you still have to put a pop-up on?

1:50:48

I feel like there is a lot of pressure. You

1:50:51

still have to tell people that you're

1:50:54

collecting necessary cookie data. You

1:50:57

don't have to force people to go through

1:50:59

the next screen to pick which cookies. You

1:51:02

could just be like, eh, I don't know. When you go

1:51:04

to my personal website, there's a

1:51:06

button. I collect no cookies. But

1:51:09

there is a button that lets you choose light or dark, and

1:51:11

it needs to remember what you chose. It

1:51:15

needs to remember what you chose as a cookie.

1:51:18

So I pop-up an announcement that says, yeah, there's a

1:51:20

cookie here to remember whether you chose light or dark.

1:51:25

Okay? But I feel like I have

1:51:27

to do that for EU visitors. It's

1:51:30

extremely annoying. So,

1:51:33

yes, but you could also... I mean, again, this

1:51:35

is an issue where if people really wanted to

1:51:37

solve it, it's

1:51:40

a particular issue, right? If they didn't want to collect

1:51:42

all this user data, they could create

1:51:44

a way for you to remember

1:51:46

light or dark preferences without a cookie

1:51:49

on browsers. But they don't.
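
For what it's worth, a preference like light-or-dark mode doesn't strictly need a cookie. Here is a minimal sketch using the browser's localStorage, which keeps the value on the visitor's machine and, unlike a cookie, never rides along on requests to the server. (Whether EU rules treat local storage differently is a separate question; this is a design sketch, not legal advice.)

```typescript
// Minimal sketch: remember a light/dark choice with localStorage
// instead of a cookie. The value stays in the browser and is not
// transmitted to the server the way a cookie header is.
type Theme = "light" | "dark";

function saveTheme(theme: Theme): void {
  localStorage.setItem("theme", theme);
}

function loadTheme(): Theme {
  // Fall back to light if the visitor has never chosen.
  return localStorage.getItem("theme") === "dark" ? "dark" : "light";
}

// Apply the stored preference on page load.
document.documentElement.dataset.theme = loadTheme();
```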

1:51:52

But the cookie thing is also going to

1:51:54

start to be not a thing as like...

1:51:56

Yes. Google is moving away from cookies entirely

1:51:58

anyway. So, are

1:52:01

we going to still be talking about

1:52:03

cookies in five years? Yes, because the

1:52:05

EU is living in some sort of

1:52:07

strange vacuum. I'm

1:52:09

not against the DMA. And, you know what,

1:52:11

the biggest issue for me with the DMA

1:52:14

is interoperability. And this is

1:52:16

the problem, is if you start to cast your

1:52:18

net so wide, you cause

1:52:20

a lot of problems. If you could just say, for instance,

1:52:22

and this is what Cory Doctorow is always saying,

1:52:24

all we really want is the ability to move

1:52:26

our data or interoperability. So if I use one

1:52:29

message program or another, I can

1:52:31

use whatever I choose. That's all we want,

1:52:33

some choice. And I think you could do

1:52:35

that without compromising the privacy, security, or ease

1:52:37

of use of your platform. Steven

1:52:40

Sinofsky notwithstanding, just that's all

1:52:42

we want, interoperability, choice. Is

1:52:45

that unreasonable? Maybe

1:52:48

it is. I mean, maybe I'm choosing

1:52:50

by simply buying the iPhone that I made the

1:52:52

choice. That's your choice, Leo. If

1:52:54

you want choice, buy more Android phones. Although

1:52:57

I would argue that the OS choice is kind

1:52:59

of a false one. Yeah, I

1:53:02

mean, Tim Cook, when he told a reporter

1:53:04

who said, well, are you going to open

1:53:06

the iMessages? He said, no, but you tell

1:53:08

your mom to buy an iPhone. I

1:53:13

say we go back to floppy disks and Walkmans.

1:53:16

Let's bring the 90s back, baby.

1:53:19

This is why we got to bring back the

1:53:21

1880s with the regulatory commissions. I'm

1:53:24

not kidding. We didn't have antitrust in the 1880s,

1:53:27

though, did we? Antitrust

1:53:29

came later with the robber barons, right? In

1:53:32

the 20th century. What was it like in the 1880s? I thought it

1:53:34

was the 20th century. I don't know. All right.

1:53:37

So you're saying, yes. This is a question I have, ChatGPT. Yeah. Hey, ChatGPT. ChatGPT is going to put you on a list. When were the railroad barons? When was antitrust? I

1:53:47

mean, the Sherman Antitrust Act,

1:53:49

I don't remember when that was. That wasn't, I

1:53:51

think, 1910, something like that. Oh, no, you're

1:53:53

right. Good

1:53:56

job. You win. Okay,

1:53:58

you can hit anyone. Captain! Are

1:54:01

you? Are you really? What's the name

1:54:03

of your

1:54:05

team? Prickly Bears. See every... Because

1:54:08

my kid's name is Bear, or Nick name is Bear.

1:54:10

So we're the Prickly Bears. Every trivia

1:54:12

team has a silly name and I love to find

1:54:14

out. We, you know, anytime I'm on a trivia team

1:54:16

I just call them the Twits. It's very easy. I

1:54:20

don't have to think twice on that.

1:54:22

iPhone apps secretly, and this is the other thing, the iPhone isn't all that secure. iPhone

1:54:29

apps secretly harvest data when they send

1:54:31

you notifications. Security

1:54:34

researchers at Mysk have

1:54:37

said apps like Facebook, LinkedIn, TikTok, and

1:54:39

Twitter are skirting

1:54:41

Apple's privacy rules, you know, the

1:54:43

ATT, the application tracking, to

1:54:45

collect user data through notifications. You don't have

1:54:47

to be running the app since the notification

1:54:49

pops up. They

1:54:52

can collect all sorts of information about who

1:54:54

you are, what kind of device

1:54:57

you're using. And it's not

1:54:59

bad actors. It's big, big actors who are

1:55:01

doing this. A

1:55:04

spokesperson for Meta and for LinkedIn,

1:55:07

according to Gizmodo, categorically deny

1:55:09

the data is used for

1:55:11

advertising or other inappropriate purposes.

1:55:14

It's only used to ensure notifications

1:55:16

work properly. I

1:55:20

think the idea is that, because there's plenty

1:55:22

of apps where if you're actively using it

1:55:24

on the same platform, signed in on another

1:55:26

device or PC or whatnot, that it won't

1:55:28

pop up the notification on your phone while

1:55:30

you're... I turn off all notifications. I don't

1:55:32

want to see them. There's

1:55:34

a tail. Whose tail was that?

1:55:38

Do you have a raccoon? Oh no, you have a

1:55:40

small... Oh, a very pretty cat.

1:55:42

A small fur, maybe. Yes, very pretty. Yes.

1:55:47

We love having pets through here.

1:55:50

So I see what utility there could be for that,

1:55:52

but at the same time, yes, Facebook would

1:55:54

have to know that, well, that person's iPhone

1:55:56

doesn't need to get the notification if they're

1:55:58

also on their iPad. So what happens

1:56:01

is when you dismiss the notification, you're

1:56:04

sending a signal to the app, which now the

1:56:06

app has permission to send all

1:56:08

your device information back. Like

1:56:11

Leo clicked it. So by the way, here's

1:56:13

his phone number, here's his IMEI, here's his

1:56:15

kind of iPhone, here's his location.

1:56:18

Just, you know, we had to know this because

1:56:20

he clicked it. Anyway.
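
To make that concrete, here is a purely hypothetical sketch of the kind of device-fingerprint beacon being described. Every field name and URL below is invented for illustration; none of it is taken from an actual app or from the Mysk write-up.

```typescript
// Hypothetical shape of the beacon described above. Handling a push
// notification gives an app a brief window of execution time, in which
// it could phone home device details like these. All names invented.
interface NotificationBeacon {
  event: "notification_dismissed" | "notification_opened";
  deviceModel: string;      // e.g. "iPhone15,2"
  osVersion: string;
  locale: string;
  availableStorageMB: number;
  uptimeSeconds: number;
  sentAt: number;           // Unix epoch milliseconds
}

// The app would POST something like this from its notification handler.
async function sendBeacon(beacon: NotificationBeacon): Promise<void> {
  await fetch("https://analytics.example.com/collect", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(beacon),
  });
}
```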

1:56:23

I mean, didn't they already know that? Like...

1:56:26

Well, you know who knew that? You know who knew that? It

1:56:28

was the NSA. Oh, here we go.

1:56:31

Now you've been an intelligence officer,

1:56:33

Allyn Malventano. So

1:56:36

you probably were not surprised when the NSA finally admitted it. They didn't want to, but Ron Wyden was holding up the appointment

1:56:46

for the next NSA director. So

1:56:49

General Nakamoto, Nakasone,

1:56:52

I should say, Nakam... It's

1:56:56

the same name as the Die

1:56:58

Hard company, right? Nakasone. General Nakasone

1:57:00

sent a letter to Wyden saying,

1:57:04

okay, I want to retire, so please. Yes,

1:57:09

we do in fact buy

1:57:11

and use various types of

1:57:14

commercially available metadata for our

1:57:16

foreign intelligence and cybersecurity

1:57:19

missions. Yes,

1:57:22

including data related to

1:57:24

entirely domestic Internet communications.

1:57:28

So now we know. They

1:57:30

do collect all this data. Are you surprised?

1:57:32

Well, that part that said entirely domestic, I

1:57:35

don't think appeared in the actual letter

1:57:37

from the NSA because I was reading through

1:57:39

it. Oh, actually, it's in a quote in

1:57:41

the New York Times article. Maybe.

1:57:45

I can give you my

1:57:47

perspective because I used to work there. You're an

1:57:49

intelligence officer. Yeah, you're an intelligence agent. I used

1:57:51

to do this. Yeah. Okay.

1:57:53

So they have rules. Their mandate

1:57:56

is foreign intelligence. They're not supposed to

1:57:58

be after the U.S. people. Which is

1:58:00

why domestic would be a big deal. Right.

1:58:03

Now, I will say this, having had

1:58:05

to work there and

1:58:07

deal with that type of information,

1:58:09

trust me, it makes the job

1:58:11

harder. The fact that there

1:58:14

is the potential that there's US

1:58:16

people information in there, right? Because

1:58:19

the issue is if you have somebody in the US

1:58:21

and they're talking to somebody that's foreign and

1:58:23

you're trying to surveil the foreign side

1:58:26

of it, then you end up

1:58:28

potentially with the other half of the conversation.

1:58:30

That's like the other end. Here's

1:58:32

the letter from General Nakasone. Dear

1:58:36

Director Haynes, the honorable Director Haynes,

1:58:38

Director of National Intelligence. Oh, wait a minute. No,

1:58:40

I'm sorry. You're right. This

1:58:42

is from Ron Wyden. So, that's, I get what

1:58:44

you're saying. Right. So, let

1:58:46

me see if I can find. You have to go.

1:58:48

Here's the response. Okay. Yes, you're right.

1:58:50

You have to scroll down. Yeah, yeah, scroll

1:58:53

down. This is now addressed to Senator Wyden from,

1:58:55

let me just

1:58:57

look at the bottom. Ronald S.

1:59:00

Moultrie, who is somebody.

1:59:03

He's part of the DNI, I guess. Following

1:59:07

up on the letters regarding the Department of

1:59:09

Defense, the Department of Defense and the Department

1:59:11

of Transportation, providing you with a below redacted

1:59:14

answer to a question I answered for you in 2021. Well,

1:59:20

actually, if you scroll a little, it's the

1:59:23

reply from Nakasone. Oh, God, man. This

1:59:25

is all in here. Okay. We

1:59:29

don't need to go all the way into reading entire

1:59:31

pages. My God, what if we had an AI summarize?

1:59:33

I wish I had an AI now. DOD,

1:59:36

here's the question. Here's the perspective I can

1:59:38

offer. Yeah. Here's

1:59:41

the perspective I can offer. So, having worked here

1:59:43

and having seen plenty of stories over the years,

1:59:45

even some of which we've talked about on twit,

1:59:47

even dating back to when we got the most

1:59:49

epic spit take on twit of all time from Leo, when he learned that I used to work for the NSA, in the middle of the show.

1:59:55

I didn't know that, actually, until you told me. I thought you merely

1:59:58

were a submariner. You were also NSA. I was a man of

2:00:02

many talents. You were a contractor though, not an

2:00:04

employee. It was a

2:00:06

DOD thing; well, the NSA contracts the military. So you were working with the Navy, yeah,

2:00:13

on loan to the NSA.

2:00:16

Correct, correct. So whenever

2:00:18

these stories come about, usually the gist of it is, and usually why I've learned that there's so much hesitation in even the NSA saying anything, is that if you look at the two stories or the different perspectives, there's like a dividing line, right? Even though the title of the article is going far on the one side: hey, okay, all your US person information, right? Okay, good. All right, now I can speak from the other side,

2:00:43

right? If there was a magical way to make it so that the NSA could do their job of the foreign surveillance and not get any single word or bit of information from the US people, I guarantee you they would hit that button, because it makes the job so much harder, right? We don't want to know what's in your email, Leo, right? We want to know what the foreign people were potentially plotting and doing; you know, all the reasons why you would need to have a foreign intelligence program, right? To protect the United States. That's the job that they have to do, all those people that are still working there. So yeah,

2:01:22

like, we don't want the other side of the conversation, but sometimes you don't have a choice; you get it in the act of doing the surveillance. And if that does happen, then you have to do a bunch of extra steps required by law, as described, and, you know, from Congress this comes down: this is how you're going to do your job, NSA. And they have to make sure that all that stuff is excluded or even purged from databases, or, you know, by whatever means necessary, to make sure that that information either is gone or doesn't go out, any of those things, right? Basically, when you get right down to it, it's a job that's almost impossible to do without pissing somebody off on

2:02:02

either side of the argument. Nakasone said specifically, and I will read from his letter

2:02:07

specifically, the NSA does not buy or use

2:02:09

location data collected from phones known to

2:02:12

be used in the United States, either

2:02:14

with or without a court order.

2:02:17

He says this categorically. Right. Similarly,

2:02:19

the NSA does not buy or use

2:02:21

location data collected from automobile telematics systems

2:02:24

from vehicles known to be used in the United States.

2:02:27

The NSA does buy and

2:02:29

use commercially available NetFlow,

2:02:31

that's metadata, non-content

2:02:34

data, related ...

2:02:36

Okay, here's the smoking gun,

2:02:38

to wholly domestic internet

2:02:41

communications and

2:02:43

internet communications, where one side of

2:02:45

the communication is a US internet

2:02:47

protocol address and the other is

2:02:50

abroad. And they've always said, we

2:02:52

have communications when somebody's communicating outside

2:02:54

the US from inside

2:02:57

the US. Right. Now

2:02:59

notice that the last part of that sentence

2:03:01

was conveniently omitted from the other ... Yeah,

2:03:03

you're right. The people making the other point,

2:03:05

right? Yeah. They're pretty clear.
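
For readers wondering what NetFlow data actually contains: a flow record summarizes who talked to whom and how much, never what was said. A simplified sketch of the classic NetFlow-style fields (the exact field set varies by version; this selection is illustrative):

```typescript
// Simplified sketch of a NetFlow-style record: pure connection
// metadata. Note there is no payload or message-content field at all.
interface FlowRecord {
  srcAddr: string;   // source IP, e.g. "203.0.113.7"
  dstAddr: string;   // destination IP
  srcPort: number;
  dstPort: number;
  protocol: number;  // 6 = TCP, 17 = UDP
  packets: number;   // packets observed in the flow
  bytes: number;     // total bytes transferred
  firstSeen: number; // flow start, epoch milliseconds
  lastSeen: number;  // flow end
}
```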

2:03:07

They're like, look, because the reality is they don't

2:03:09

want the other info, right? They're

2:03:11

not going to spend money to get info

2:03:13

that's only US people because they can't use

2:03:16

it anyway. So I know you, Allyn,

2:03:18

and I know you're a good guy. And

2:03:22

I trust you. And

2:03:24

you no longer work for the Defense Department or the intelligence agencies. You're a private

2:03:29

individual. You're

2:03:33

somebody who we trust as a geek. You

2:03:36

understand our concerns and

2:03:38

you feel fully confident that Nakasone

2:03:41

is not misleading Ron Wyden, that

2:03:43

this is an accurate representation and that

2:03:45

they are, in fact, they don't want

2:03:48

domestic communications. Everything represented

2:03:50

in that letter aligns exactly with my

2:03:52

experience having to actually punch

2:03:55

the clock doing that work. I trust

2:03:57

you. I actually trust you more than I

2:03:59

do Edward Snowden. I mean, if you say that's the case, I believe you. Well, no, listen, he had a point, though. Edward Snowden had a point, which we talked about on twit I don't know how many hundreds of years ago, right? Yeah, but

2:04:12

there was a case where, okay, so we just said that, you know, they can have this collection where if one side's foreign, the other side's US. Well, what we learned through the Snowden, you know, documents and stuff that he released was that, hey, there's people in the org that could potentially see both sides of that that actually shouldn't be able to. Right,

2:04:30

because there are supposed to be procedures in place to prevent that sort of thing from happening, right? And there was apparently a gaping hole in those procedures. And that's what was known. If anything, I'm kind of happy that that did happen, because it was a problem, right? It kind of undermined all of the work that I was doing. Like, I had to make sure nobody was doing searches on any U.S. person's names or any of the other stuff, and to know that there was just an IT guy up at the top of the IT stack there, they could just run a query and nobody would know the wiser. That's a problem. It needed to be fixed. Well,

2:05:02

and also, to be fair, it's not part of the NSA or the CIA's charter to spy on Americans on American soil. But for that we have the FBI and

2:05:13

other law enforcement agencies. And I gave my data too, yeah. And I do believe they do buy location data. They buy metadata. They may even buy content data. They buy whatever data brokers will sell, right?

2:05:26

I mean, they have whatever set of rules they're required to follow. That's their... it's legal. In fact, that's one of the things that Nakasone says: it's legal. It's, you know, we only do what's legal, and we

2:05:38

can't spy on domestic. But the information we get is legally obtained; we don't need a warrant to get that, because of data brokers. Actually, it's been my contention, now maybe you want to weigh in on this, that one of the

2:05:49

reasons we don't regulate data brokers, which I think all Americans who are informed about them would like us to do, is because Congress is periodically told sub rosa that the law enforcement agencies would prefer they do not shut down the data brokers, because they're a valuable source of information. Yes?

2:06:08

personally not thrilled with the whole data broker

2:06:10

thing. Yeah, nobody has a piece of it.

2:06:12

Yeah. Right. But

2:06:15

Congress refuses to do anything about it. Right. And

2:06:17

I think that that's probably at the behest of

2:06:19

law enforcement. Yeah, my

2:06:21

personal take on that is that this whole NSA

2:06:23

story, this whole thing shouldn't have even had

2:06:25

to happen because the whole data broker thing shouldn't

2:06:27

even exist in the first place. Right.

2:06:30

So instead of blaming the NSA or casting

2:06:32

stones at the NSA, let's shut down the

2:06:34

data brokers. Right?

2:06:40

Anybody disagree with that one? No, no. I

2:06:42

don't think you will find a normal

2:06:44

person, and maybe even Steven Sinofsky would

2:06:46

agree that data brokers are a problem.

2:06:48

Although, actually, here's where I'm going to make a super convoluted argument. The

2:06:54

existence of data brokers and the purchasing of

2:06:57

consumer data keeps a lot of apps you

2:06:59

might enjoy free and available for you to

2:07:01

use. As a whole ecosystem,

2:07:03

you're right. No, that's true. I mean,

2:07:06

we've often said the same thing about consumer

2:07:09

credit reporting agencies like Experian

2:07:11

and Equifax and TransUnion. Ugh,

2:07:13

creepy. But honestly, if

2:07:16

they didn't exist, you wouldn't be able to give or get

2:07:19

a loan or rent a home or anything, rent an apartment

2:07:21

or anything. Well, you

2:07:23

would just, I mean, there's actually a

2:07:25

whole bunch of companies using AI to

2:07:27

establish payback records

2:07:29

for people. Anyway.

2:07:31

Maybe we don't need them anymore. Something

2:07:33

like that would exist. Right. It has

2:07:35

to. And Experian and stuff, they're data

2:07:38

brokers. Right.

2:07:41

They're... Right. All

2:07:44

right. Let's take a break and we

2:07:46

can get a few final thoughts in with

2:07:49

our wonderful panel. Really

2:07:51

appreciate it. All

2:07:54

three of you have been here. It's kind of ironic. The

2:07:57

name of our next sponsor is Panoptica. This episode is brought to you by Panoptica. Panoptica is Cisco

2:08:04

Cloud's application security solution

2:08:06

which provides end-to-end lifecycle

2:08:09

protection for cloud-native application

2:08:11

environments. It empowers organizations

2:08:13

to safeguard their APIs,

2:08:16

serverless functions, containers,

2:08:18

and Kubernetes environments.

2:08:20

Panoptica ensures comprehensive

2:08:22

cloud security, compliance, and

2:08:24

monitoring at scale offering

2:08:27

deep visibility, contextual risk

2:08:30

assessments, and actionable remediation insights

2:08:32

for all your cloud assets.

2:08:35

Powered by graph-based technology, Panoptica's

2:08:37

attack path engine prioritizes and

2:08:39

offers dynamic remediation for vulnerable

2:08:42

attack vectors, helping security teams

2:08:44

quickly identify and remediate

2:08:46

potential risks across cloud infrastructures.

2:08:49

A unified cloud-native security

2:08:51

platform minimizes gaps from

2:08:54

multiple solutions, providing centralized

2:08:56

management and reducing non-critical

2:08:58

vulnerabilities from fragmented systems.

2:09:01

Panoptica utilizes advanced

2:09:03

attack path analysis, root

2:09:06

cause analysis, and dynamic remediation

2:09:08

techniques to reveal potential risks

2:09:10

from an attacker's viewpoint. This

2:09:13

approach identifies new and known risks,

2:09:16

emphasizing critical attack

2:09:18

paths and their potential impact.

2:09:21

This insight is unique and difficult to

2:09:23

glean from other sources of security

2:09:25

telemetry such as network firewalls. Get

2:09:27

more information on Panoptica's website. It's

2:09:30

panoptica.app. More details

2:09:32

on Panoptica's website panoptica.app.

2:09:36

We thank them so much for supporting

2:09:38

the show. We really appreciate it. We

2:09:41

had a lot of fun this week on twit.

2:09:43

We prepared a little movie to show

2:09:45

you what you might have missed. I'm literally

2:09:47

talking to you from the framework Laptop 16

2:09:50

right now. This is the touchpad for the

2:09:52

machine that I literally pulled off the laptop

2:09:54

in front of me. Like right now here's

2:09:56

one of the spacers. Let me rip off

2:09:58

the keyboard. Oh my goodness. This

2:10:00

is the keyboard in front of us. Previously

2:10:03

on Twit. Tech News Weekly: you can replace

2:10:08

the GPU on this

2:10:10

machine and you can do it in

2:10:12

two minutes. The idea is you buy

2:10:14

a laptop from them and you have

2:10:16

a future-proof laptop. This Week in Space:

2:10:18

You're thinking about renewable energy, you think

2:10:20

about solar panels, you put them where

2:10:23

the sun is always shining, and then

2:10:25

you beam that to Earth. It

2:10:27

seems like so

2:10:29

beneficial and yet here it is 2024 and we

2:10:31

still don't have this stuff. The

2:10:33

problem is cost. All of

2:10:36

a sudden in just less than

2:10:38

10 years it has been proven

2:10:41

by Starlink, by OneWeb, by Kuiper

2:10:43

Systems, that you can make

2:10:45

space systems super cheap. MacBreak Weekly: we have a big birthday to

2:10:50

celebrate. We're gonna see if we can boot

2:10:52

this 40 year old Macintosh. Uh-oh, is that

2:10:55

a sad Mac? It's not seeing the disk.

2:10:57

Once you get the case open, let's look

2:10:59

inside. You were never ever intended to do

2:11:01

what we're doing, which is open this thing

2:11:03

up. Steve, in fact, designed it that way.

2:11:06

But nevertheless he got the designers to sign

2:11:08

the case. All artists sign their work. Windows Weekly: Microsoft announced and

2:11:12

released Copilot Pro. And I'm surprised by

2:11:15

how good this is. It's good for

2:11:17

your thumbnail image. Look

2:11:19

at this. This Halo image is incredible.

2:11:21

The prompt for that did not say

2:11:24

anything like, be as provocative as possible

2:11:26

and please offend Americans and religious people.

2:11:29

And, like, no, it popped that

2:11:31

out on its own. I didn't ask

2:11:33

for that. Twit. Great tech news and

2:11:36

analysis every week. Yeah.

2:11:39

I love it. If you didn't see

2:11:42

it, it basically created The Last Supper

2:11:44

with Master Chief from Halo, where

2:11:46

Jesus would be. And

2:11:49

then an American flag, but instead of

2:11:51

stars, it had French fleur-de-lis on it.

2:11:53

So I don't know what the AI

2:11:55

was hallucinating, but it was

2:11:57

pretty wild. A lot of fun on our

2:11:59

shows this week. Yeah. Nirav Patel happened to swing by the Phison

2:12:04

suite at CES and personally

2:12:06

broke down the framework right in front of us.

2:12:08

Oh, how cool. Pretty impressive to see.

2:12:10

Yeah. So I had an

2:12:13

order for that framework 16 with the AMD

2:12:15

processor. I canceled it because I spent so

2:12:17

much on my M3

2:12:19

MacBook Pro and

2:12:22

I thought, do I really need another Linux

2:12:25

laptop? But I love my

2:12:27

framework 13. I really think it's

2:12:29

very impressive and I love the idea that you

2:12:31

can upgrade it. Yeah,

2:12:33

it really was impressive, especially with the plug and

2:12:35

play aspect of it where he just broke it down. Were those magnets? Same thing that was just...

2:12:39

There were no screws? It was all

2:12:41

magnets? I think it's magnets and a little bit of like

2:12:44

a clippy kind of action to it. But yeah, there's a lot

2:12:46

of magnets. That's kind of amazing. Yeah. Because

2:12:48

I think it kind of slides in; it's not like it would just, you know, if you dropped it, it probably wouldn't just, you know, disassemble immediately.

2:12:55

Early reviews on it kind of

2:12:57

backed my decision not to buy

2:12:59

it. Apparently there's some performance

2:13:01

issues and so forth, but I'm sure they'll fix those.

2:13:04

I like framework a lot. I think they deserve

2:13:07

success. Frame.work. Yeah,

2:13:10

you could just take it apart and

2:13:12

more importantly upgrade it. Have

2:13:16

you, Ben Parr, played the

2:13:18

new Palworld? I

2:13:22

am going to do that shortly. I have

2:13:25

watched a couple of videos. I really want

2:13:27

to go and play. It's

2:13:29

Pokemon with guns. And

2:13:32

in fact, so much so

2:13:34

that the Nintendo Corporation is suing

2:13:36

them saying it's a little too

2:13:38

much Pokemon with

2:13:40

guns. The

2:13:42

idea is... Didn't someone

2:13:45

make a... Go ahead. Someone made a

2:13:47

mod for it. Yeah, they put actual Pokemon in

2:13:49

it. The

2:13:52

idea is you're running around and you have a

2:13:54

ball. It looks a little bit

2:13:56

like a Pokeball that you throw to capture

2:13:58

your pals. But

2:14:01

apparently you can also capture weapons, which

2:14:05

if you ask me is all that Pokemon was

2:14:07

missing? Well, this game

2:14:09

has aspects that I think

2:14:11

people wanted in the recent, like, Pokemon games. Like, you know, it has some aspects of, like, the Minecraft building process. You can build your own

2:14:20

space. You can have your own, essentially, farm. I can't wait to play this. This looks like a great... This is one of the fastest, like, growing games ever. Like, it got to, like, five million downloads in a week, which is insane. And it's the number one game on Steam because of this. Now, obviously,

2:14:36

some of these things were, like... it's clear that, like, that's a Pokemon right there; a little of them look too much like their Pokemon. And I mean, you can make the argument, like, actually it's a sheep, and, like, of course Pokemon are based off of real animals. And so it'll be an interesting case. I feel like there's some where you really can't argue they took a little bit from. But

2:14:59

why can't Game Freak, the creators, like, make a game like this? You don't have to have the guns, but have the open world stuff and have the, like, deeper interaction. That would be fantastic. It

2:15:11

looks like there's all sorts of kind of Minecraft-y style automations, and it really looks like a great game. I'm not surprised it's done very well. Palworld, it's called. What do you think, Benito? Have you started playing it?

2:15:22

I have not but I kind of also

2:15:24

wonder if this game would have even been

2:15:26

big if it wasn't for the controversy Oh,

2:15:28

yeah, absolutely. And in fact talk

2:15:30

about the Streisand effect Nintendo

2:15:33

going after them just really solidifies the

2:15:35

whole thing Maybe

2:15:37

they just want a little, you know, kind of a little fee of some kind. No,

2:15:42

not Nintendo. No, Nintendo doesn't go along with that. Yeah, get sued, make bank. Yeah. All

2:15:50

right, there's gonna be an interesting test of look and feel: what's the threshold? It's on PC and Xbox, although The Verge says don't get the Xbox version, it's vastly inferior; so get it on Steam and play it on your Windows gaming PC. Palworld.

2:16:10

Bad news for Beeper. I

2:16:15

kind of think we saw this coming. The

2:16:18

company which was founded by

2:16:20

former Pebble founder Eric Migicovsky

2:16:24

has decided to give up on

2:16:26

trying to duplicate Apple's iMessages on

2:16:29

Android. Apple

2:16:32

temporarily banned Macs used as bridges

2:16:34

to Android devices. So

2:16:36

you really, this was kind of the

2:16:38

end of the line for Beeper. And

2:16:42

I guess in this case I don't really think

2:16:44

there's much of a loss. I

2:16:47

wish Apple would open Messages. I understand why for

2:16:50

commercial reasons they're never going to do that. They

2:16:53

will eventually. Do you think they'll have to? Not

2:16:57

have to. There has been that discussion. You

2:16:59

could lock some more people into it. They

2:17:02

might have to. I think it will happen

2:17:04

down the line. I do

2:17:06

think that Beeper tried. It was always going to happen. You're trying to force a platform to do something it doesn't want to do. You're going to go and fail; there's no other result. Look, I

2:17:16

used Beeper. Do you? Did

2:17:19

you take it off? I removed it as

2:17:21

soon as I saw this, and I thought, I'll remove iMessage, because people are getting locked out.

2:17:25

Apple can do whatever they want. I'd

2:17:28

like to not be locked out.

2:17:30

But is it a better messaging

2:17:32

program than say WhatsApp or Telegram?

2:17:34

Well, it's better to put everything

2:17:36

together because I have Signal, I

2:17:38

have WhatsApp, I have iMessage, I

2:17:41

have LinkedIn messages, I have

2:17:43

X messages or Twitter messages.

2:17:46

It's nice to have the idea of everything

2:17:48

in one place where I don't have to

2:17:50

go look at everything. But

2:17:53

the reliability is still an issue. I

2:17:57

hope the Beeper people are listening. I've

2:17:59

had enough times when I tried to send

2:18:01

a message and it just won't actually send, and it was, like, over Signal or over Slack in Beeper, and it didn't send,

2:18:07

and, you know, stuff that I wanted to have happen didn't happen with my team or with other people. And so now I'm like, I can't rely on Beeper to send the things out. And, like, look, I understand, like, you know, it's really hard to work with a lot of these different systems and all that, but that's gotta... that's the bar, right? You've got

2:18:25

to be able to send messages and

2:18:27

receive messages on the platforms that you

2:18:29

support. This sounds so... it's just a blast from the past. Trillian. Pidgin. Pidgin, that's the blast from the past. I just want, like, one thing to rule them all, to talk on all the things, because nobody wants to just settle on one thing, so I just need a Swiss Army knife. So Automattic, the folks who do WordPress, bought something called Texts. Yeah, I've tried them too.

2:18:55

I find it annoying. I actually don't want all

2:18:57

these things in one interface maybe because it's not

2:18:59

a great interface but it does do

2:19:01

that. It doesn't do Apple oh

2:19:03

I guess it does do iMessages. It

2:19:06

does do iMessages. Yeah. We'll see what

2:19:08

happens long term here like you know

2:19:10

there is a real desire because there

2:19:12

are so many messaging apps and I'm

2:19:15

in groups on every single messenger. Oh,

2:19:17

it's so frustrating. I agree. I agree. You just have to have the reliability of, when you send a message, that it actually sends, and, like, if it doesn't send, it clearly lets you know: hey, this didn't send, please try again.

2:19:33

Stacey, a former telecommunications reporter: is RCS

2:19:37

going to change all this you think? Oh

2:19:41

Apple's adoption of RCS? Apple's going to

2:19:43

do RCS. Google's been pushing RCS. It's

2:19:45

an open standard, Rich Communication Services. I

2:19:47

think it'll solve some of the it

2:19:49

won't solve the like I mean we

2:19:51

all have multiple platforms. It won't solve

2:19:53

like I use multiple platforms to communicate

2:19:55

with different people but it will solve

2:19:57

things like when my friend sends me a video from her iPhone to my Android phone: it's like this big and grainy. It'd be nice if we

2:20:06

could all use whatever

2:20:08

messaging platform we want and have a protocol

2:20:10

like RCS be available in all of them

2:20:12

so that we could communicate with people.

2:20:16

Seems like. Especially if

2:20:18

it offered end-to-end encryption. Which it does, right?

2:20:20

Or no? I don't

2:20:22

think that's a feature of RCS. I

2:20:24

think Google wants it to be. I don't know if

2:20:27

it is yet. It might be totally specific. I

2:20:29

don't know if it's standard. No, it's implementation specific. I

2:20:31

think you're right. Yeah. It's

2:20:34

just a protocol you could, if you want

2:20:36

to, encrypt it. And I think Google does

2:20:38

or intends to anyway. Finally,

2:20:41

some good news if

2:20:43

you don't like swatting. And who does like

2:20:45

swatting? Well, apparently this

2:20:48

California teenager, known online as Torswats, liked swatting, but

2:20:52

now he's in jail, going to be extradited

2:20:54

to Florida on felony charges. And even though he's

2:20:56

a minor, he will be tried as an adult.

2:21:00

Wired, which I think with unusual

2:21:04

caution did not reveal his real name because he is

2:21:07

17 years old. But he,

2:21:09

according to sources familiar with

2:21:11

the investigation, swatted

2:21:14

hundreds of people. He

2:21:17

was like the king of swatting, apparently.

2:21:21

Swatting, as you may or may not know, but

2:21:23

probably should know, is when a fake

2:21:25

call is made to law enforcement. We've been

2:21:27

swatted. Somebody called the Petaluma Police Department and

2:21:29

said, I am holding hostages

2:21:32

inside the Twit studios. I've planted bombs

2:21:34

all over the studios and

2:21:36

I'm going to kill myself. Goodbye. And

2:21:38

then of course, we were lucky. The

2:21:40

Petaluma Police were smart enough to go, sure,

2:21:43

kid. But they came over and

2:21:45

they actually brought in, at great expense, bomb-sniffing

2:21:47

dogs and we had to evacuate the studio

2:21:49

and they went all over the studio. Members

2:21:53

of Congress have been swatted.

2:21:57

Judges trying President Trump.

2:21:59

have been swatted, the director. Secretariat

2:22:02

of State. Yep. CISA

2:22:04

directory. Yeah, I mean it's a

2:22:06

nasty thing. Now, I don't think

2:22:08

this one kid did that, all of it. But

2:22:12

it's probably good if he

2:22:14

does go to jail and does do

2:22:16

some time as maybe a warning

2:22:18

that this is not a game, right?

2:22:21

Why are they trying him as an adult? Is it because

2:22:24

if they tried him as a juvenile, he wouldn't get jail?

2:22:27

Or he'd be out at 18 or it's lighter.

2:22:29

You got to send a message. Like,

2:22:31

this is one where I have no sympathy of any

2:22:33

kind, way, shape, or form. Like,

2:22:35

we got to throw the book at these, at

2:22:38

the people who are doing this. This is like,

2:22:40

it's insane that people are thinking,

2:22:42

oh, we should waste our valuable

2:22:45

law enforcement and health and services

2:22:48

because I do not like some of the

2:22:50

political statements of the other side, or I

2:22:53

do not like this person, or this person rejected me. Throw the book at

2:22:57

them. Throw all the book at them. This person

2:22:59

is not doing it for political reasons. No, he's

2:23:01

just a jerk. They're

2:23:04

like, I have power in this situation. But

2:23:07

there are people doing it for political reasons. There

2:23:09

are people doing it for all. And we don't

2:23:11

actually know for sure. The kid

2:23:13

may have some political – in

2:23:15

addition to everything else, obviously, the

2:23:17

power thing. I just – this

2:23:20

is just a behavior that has

2:23:22

to be punished heavily to

2:23:24

stop it because people feel

2:23:27

like that if you are behind the

2:23:29

Internet and you can modulate your voice,

2:23:31

that you are, you know, free of

2:23:33

consequences. Well, they never caught the person

2:23:35

who swatted us. I

2:23:38

listened to the recording of it. I

2:23:40

mean, it's kind

2:23:42

of chilling. It's

2:23:45

not just the use of resources and the inconvenience. There's

2:23:47

a risk of death. Absolutely. And

2:23:49

there have been deaths, accidental deaths. Right, exactly.

2:23:52

In swattings. So it is a

2:23:54

very dangerous and dumb thing to do. Now,

2:23:56

I agree. Maybe he shouldn't be tried

2:23:58

as an adult. It's just Florida state law: it's four felonies, and

2:24:03

Florida State Law does allow him to be tried as

2:24:05

an adult and the prosecutor is going to do that.

2:24:09

The way they caught him is interesting. He apparently

2:24:11

was swatting Twitch streamers. That was

2:24:14

his favorite thing to do. If

2:24:16

you've seen that, I used to work at

2:24:18

Twitch. Yes, this was a very common

2:24:20

occurrence. It's a problem. You'd be live on the air streaming

2:24:23

and police would come storming in. So

2:24:27

a number of prominent Twitchers hired

2:24:30

a private investigator,

2:24:33

Brad Dennis. He'd

2:24:38

been hunting this kid for two years, which means

2:24:40

the kids started doing this like when he was

2:24:43

15. Caffros-Dennis

2:24:47

says, it's a beautiful day. I'm

2:24:49

very relieved. Torswats will no longer

2:24:51

be able to conduct his reign

2:24:53

of terror. He

2:24:56

swatted schools as well as

2:24:58

Twitch streamers, public officials. According

2:25:02

to Dennis, in January of last year, he

2:25:04

handed evidence to the FBI special agents

2:25:06

in charge of the case. It

2:25:09

was used, that information was used in subpoenas

2:25:11

sent to YouTube and Discord. I

2:25:16

guess they tracked the kid down. I would watch that movie. Yeah, wouldn't that be interesting? I

2:25:22

mean, it took him a while. It'd probably be

2:25:24

a really boring movie. I'm wondering

2:25:26

how they did it. I'm

2:25:28

just thinking about what is the back end? How

2:25:32

do you track it? Do you just look at server

2:25:34

logs for who was the

2:25:37

origin? So he sent them the information.

2:25:40

They sent out subpoenas. They figured out who

2:25:42

he was by July of last year. Then

2:25:48

at that time executed a search warrant,

2:25:50

seized his devices, but

2:25:52

the FBI isn't talking and neither is

2:25:54

the Florida district attorney.

2:26:01

There's ways to figure out what you're

2:26:04

doing. They

2:26:07

use a voice over IP number, that's one of the

2:26:09

ways they can keep it anonymous. Well,

2:26:11

within reason, obviously, because they caught him.

2:26:16

The individual's calls

2:26:18

to Washington schools in

2:26:20

May allegedly affected 18,000 students, cost

2:26:23

taxpayers $271,000, and caused lost instructional time. I

2:26:29

mean, yeah, there's a cost, a significant cost.

2:26:32

The kid's really lucky that apparently not a single

2:26:34

one of those swattings resulted in a death. Yeah.

2:26:39

Anyway, so I hope you all pay attention. Don't do

2:26:41

it. Don't do it. It's not

2:26:43

a joke. Rick

2:26:46

Scott, a senator from Florida, said we must

2:26:48

send a message to the cowards behind these

2:26:50

calls. This isn't a joke, it's a crime.

2:26:52

Scott was, by the way, swatted in December.

2:26:56

Just order 20 pizzas or something, please. Yeah, just

2:26:58

order some pizzas. Do it like we did back

2:27:00

in the day. Just

2:27:02

order some pizzas. It's fine. Don't

2:27:05

do that either. I

2:27:07

hope this does get rid of the problem, but

2:27:09

I doubt it will. I feel like it's going

2:27:11

to continue. Stacy,

2:27:14

do you want to say something? No.

2:27:19

Good. In that case, go get

2:27:21

a waffle because we're done, ladies

2:27:23

and gentlemen. Stacey Higginbotham, I

2:27:25

would plug your website. I would plug your podcast.

2:27:29

I have something to plug that your audience may care

2:27:31

about. Plug it. Yes. So

2:27:34

February 2nd is

2:27:36

the deadline for anybody who's interested

2:27:38

to submit a comment

2:27:41

to the Federal Trade Commission

2:27:43

on users' right to repair.

2:27:45

Oh, please do this. Yes.

2:27:48

You can tell them a story. You can just

2:27:51

say, hey, we would love it if you would

2:27:53

make a rule forcing right to repair, or even

2:27:56

just enforce the existing provisions that

2:27:58

would enable consumers to

2:28:00

do that. So

2:28:02

Kyle Wiens, of course, of iFixit

2:28:04

has been involved. You

2:28:07

can go to the repair.org

2:28:09

site, states.repair.org, to

2:28:11

sign the petition. You

2:28:14

need to do this before February 2nd

2:28:16

so the FTC sees

2:28:18

this. There are other ways you can join the

2:28:20

fight. I agree with you 100%

2:28:23

and I'm guessing Consumer Reports is also very much

2:28:25

involved in this. I'm writing

2:28:27

our formal comments. Absolutely.

2:28:30

I mean, yeah, let's

2:28:33

get this done. You don't

2:28:35

have to get super nerdy on like, oh

2:28:38

my gosh, software pairing is wrong or anything

2:28:40

crazy. You could just be like, hey,

2:28:42

it would be awesome if I could replace the battery

2:28:44

in my phone. Right. In fact, they

2:28:46

probably take that more seriously,

2:28:49

wouldn't they? I mean, that's a

2:28:51

real person. They're going to pore over my comments,

2:28:53

Leo. They're going to pore

2:28:55

over them. They're going to believe

2:28:57

them firmly, vividly. All

2:29:00

right. Right

2:29:02

to repair. We're behind it. So

2:29:05

is the EFF iFixit, repair.org and Consumer

2:29:08

Reports. Thank you, Stacey. It's

2:29:10

always a pleasure and I'll see you in

2:29:12

a little while, a week

2:29:14

from Thursday for our, I guess

2:29:16

it's two weeks from Thursday for

2:29:18

our book club. So it's not

2:29:21

too late. You got two weeks. It's a week

2:29:23

from next Thursday. It is a week from next Thursday. Oh, yeah, yeah,

2:29:25

yeah, yeah. I'm

2:29:27

halfway through right now, but it gets

2:29:30

happier, right? The sun comes out, the

2:29:32

dust storms stop, the water flows and

2:29:34

everybody has a, no, no. Okay.

2:29:39

The book is actually a great book. It's good. It's

2:29:42

good. Read it. The

2:29:44

Water Knife by Paolo Bacigalupi. Right?

2:29:48

Yeah, we're going to solve our water

2:29:51

issues, yeah, it's going to

2:29:53

totally happen. Yeah. Also, just

2:29:55

a tip, don't keep hyenas as pets. Capybaras?

2:30:01

I don't know. Maybe a capybara is OK. Yeah,

2:30:04

they're benign. Sure. They don't

2:30:06

even eat people. I think they might, actually.

2:30:09

I don't know. That's Ben Parr. You

2:30:11

better ask your AI. He is the author of

2:30:13

The AI Analyst, which is available everywhere, co-founder of

2:30:16

Octane AI, and you will

2:30:18

read his columns frequently in The Information,

2:30:21

which I do. I'm a happy subscriber.

2:30:24

Thank you, Ben. Anything you'd like to plug

2:30:26

besides that? I

2:30:28

have two things. One,

2:30:31

I made a 110-page presentation

2:30:33

for a conference on AI

2:30:35

investor trends, what they're investing

2:30:37

in, what the numbers look

2:30:39

like. So if you're interested

2:30:41

in that, it's on benparr.com,

2:30:43

b-e-n-p-a-r-r.com. You can get that deck. You

2:30:45

can go look at all the numbers. You can find out how

2:30:48

much more VCs are investing into

2:30:50

AI than other types of companies.

2:30:52

Hint, it's a lot. You can

2:30:54

find out cool stuff about

2:30:57

things like Mamba and Liquid

2:30:59

Neural Networks and all the stuff that I am watching.

2:31:03

It's on my own site, so go to that one. And then my

2:31:05

other one is Go Lions. Shut

2:31:08

up. For those

2:31:10

who are listening right now, I

2:31:13

got the notification. It's 24-7, Lions. So

2:31:17

I'm sorry. No,

2:31:19

I'm not. Who are they playing? They're

2:31:21

playing the 49ers. Ah! By

2:31:25

the way, this is

2:31:27

just a good story. Also, I'm from the Midwest. It

2:31:29

is a good story. No, no. I

2:31:31

gotta root. It is a good story. And if we do lose,

2:31:33

which we won't because we're going to come back in the second half.

2:31:35

But if we do lose, at least it'll

2:31:37

be to a team that hasn't been in the Super Bowl since how

2:31:39

long? So,

2:31:41

God, it's like what, the 70s? It's

2:31:44

a very long time. I remember that. Ah,

2:31:44

it's a very long time. Which is why, if you ask people

2:31:46

when their last time in the Super Bowl was, they've

2:31:48

never been to a Super Bowl. That's the answer,

2:31:53

by the way. They've never been to

2:31:56

a Super Bowl. So

2:31:58

in that case, you're right. But

2:32:00

only if they win, not if they lose. If

2:32:03

they lose, I'll be happy. Fifty

2:32:06

billion dollars this past year on

2:32:09

AI startups. That's kind of amazing. Put

2:32:11

that in perspective. Is that a lot

2:32:13

for venture capital investments? What's

2:32:16

interesting is that 2021 was still just a

2:32:18

hair higher in terms of the money put

2:32:21

into AI startups. Oh, interesting. Just because there's

2:32:23

so much more money in 2021. As

2:32:26

a percentage of overall funding, one

2:32:28

out of every four dollars in venture went

2:32:30

to AI startups last year. That number will

2:32:33

be higher, I'm almost certain, this year in

2:32:35

2024. And

2:32:38

the real thing that moves the needle here

2:32:41

is, more than anything, interest rates. It's

2:32:44

just that, no matter what you do, the

2:32:46

interest rates determine how many dollars go into AI

2:32:49

or venture capital. And the interest rates

2:32:51

are probably going to go down a little bit this

2:32:53

year. You will probably see more money

2:32:55

and you might see a few more exits. It's

2:32:58

crazy. There are a

2:33:00

lot of predictions around how much it's going to

2:33:02

increase GDP. Goldman Sachs predicted 7%. I

2:33:05

think that is way too small. It'll be a lot

2:33:07

larger impact than that. But even that

2:33:10

is a large impact. I go through all this

2:33:12

stuff in that deck of where people are putting the

2:33:14

money. I will say one other thing. Out

2:33:16

of that 50 billion that was invested last

2:33:18

year into AI startups, and that's a stat from

2:33:20

Crunchbase, half of

2:33:22

that went to just 11

2:33:24

companies. Wow. And

2:33:26

so this is also proof of what they call

2:33:28

the power law in venture capital, which is like

2:33:31

the biggest returns come from a very small group of companies

2:33:33

you invest in. And so a lot

2:33:35

of the money went to a small group of companies.

2:33:39

But still, over 5,000 AI companies got

2:33:41

VC funding last year. And

2:33:43

that number will almost certainly go up this

2:33:45

year.
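
(Editor's note: to make Ben's power-law point concrete, here is a quick back-of-the-envelope sketch using only the round numbers quoted in the conversation; they are approximations, not exact Crunchbase figures.)

```python
# Rough arithmetic on the funding-concentration stats quoted above.
total_funding = 50e9    # ~$50B into AI startups last year
top_companies = 11      # companies that took roughly half of it
all_companies = 5000    # AI startups that raised any VC at all

top_share = 0.5 * total_funding
rest_share = total_funding - top_share

avg_top = top_share / top_companies
avg_rest = rest_share / (all_companies - top_companies)

print(f"Average raise, top {top_companies}: ${avg_top / 1e9:.1f}B")
print(f"Average raise, everyone else: ${avg_rest / 1e6:.1f}M")
print(f"Concentration: {avg_top / avg_rest:.0f}x")
```

That works out to roughly a $2.3 billion average raise for the top 11 versus about $5 million for everyone else, several hundred times more, which is the power law in action.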

2:33:48

The AI Analyst at benparr.com. Thank

2:33:51

you, Ben. It was

2:33:54

so good to see you, Alan Malventano. I'm going to

2:33:57

be here. Anything going

2:34:00

on, and anything else

2:34:02

you want to plug? I

2:34:06

don't really have anything else going on. Go to

2:34:09

work for the NSA, kids. You'll see the world.

2:34:14

You won't see much of the world. You won't see the world,

2:34:16

you'll see the world's data, and that's more valuable.

2:34:19

Alan, it's a pleasure, I appreciate it. We'll get you back soon. Thanks

2:34:22

to all three of you. Thanks to all of you who

2:34:24

joined us. We appreciate it. A couple more days to take

2:34:26

the survey. We want to know more about you. If

2:34:29

you haven't yet, twit.tv slash survey24, we

2:34:31

do this once a year because we

2:34:33

don't know anything about you. We're proud

2:34:35

to say, unless you volunteer it. And

2:34:37

that way we know if we're on

2:34:39

target with what you want to

2:34:41

hear and read and watch, and it

2:34:43

also helps us to sell advertising of

2:34:45

course. But we

2:34:47

don't reveal any personal information, don't

2:34:50

worry. It's not about you individually,

2:34:52

just about the audience as a

2:34:54

whole. Twit.tv slash survey24, it really

2:34:56

does help us out. Thank you.

2:34:59

We will be back next Sunday. We do twit every

2:35:01

Sunday afternoon from 2 to 5 Pacific. That

2:35:05

is 5 to 8 Eastern time. That is 2200 UTC. We

2:35:08

turn on the live stream on YouTube right when the show starts

2:35:11

and turn it off when the show ends. If

2:35:13

you go to youtube.com/twit, you'll

2:35:15

see it there. In fact, if you sign up and

2:35:17

get a notification, of course you can

2:35:20

watch the show at your leisure or listen at your

2:35:22

leisure. You'll find episodes at twit.tv.

2:35:26

There's a dedicated YouTube channel and of course you

2:35:28

can subscribe in your favorite podcast

2:35:31

client. We thank you for doing

2:35:33

so and we will see you next week. But for now,

2:35:35

another twit is in the can. Bye-bye.
