Chasing Shadows in the Digital Abyss - Doomed Apple Car, Chinese EVs, MWC Roundup


Released Monday, 4th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

It's time for TWiT, This Week in Tech. We have

0:02

a great panel for you. Anthony Ha is

0:04

here. You may remember his byline from

0:07

TechCrunch. I certainly do. He's got his

0:09

own podcast now. He writes for a

0:11

lot of publications. Our car guy, Sam

0:13

Abuelsamid, is here. We'll talk

0:15

about Elon Musk. He's suing OpenAI,

0:17

saying, hey, that's not what you

0:20

said you'd be doing. We'll also

0:22

talk a little bit about cars. It seems

0:24

like Apple's getting out of the car business.

0:26

Were they ever in it? And

0:28

all the music is leaving TikTok. Where

0:31

does that leave the musicians? Where does

0:33

that leave the TikTokers? All that and more

0:35

coming up next on TWiT. Podcasts

0:40

you love from people you

0:42

trust. This

0:45

is TWiT. This

0:52

is TWiT, This Week in Tech, episode

0:57

969, recorded Sunday, March 3rd, 2024:

1:00

Chasing Shadows in the Digital Abyss.

1:05

This episode of This Week in Tech is brought

1:07

to you by ExpressVPN. Going

1:10

online without ExpressVPN? That's

1:13

like using your smartphone without a case. Most

1:16

of the time, yeah, you'll be fine, but

1:18

all it takes is one drop to make you wish

1:20

you'd protected yourself. Why does

1:23

everyone need a VPN? Well, first

1:25

of all, unfortunately, it doesn't take much

1:27

technical knowledge to hack someone, just

1:29

some cheap hardware. And look,

1:33

your data, your privacy, your information is

1:35

valuable. Hackers can

1:37

make like a thousand dollars a person selling

1:39

personal info on the dark web. So

1:41

every time you connect to an unencrypted network,

1:44

whether it's a cafe, a hotel, an airplane,

1:46

your online data is not secure. Now

1:49

that's why I say use ExpressVPN. It's certainly

1:51

what I use. It's super secure. It has

1:53

an encrypted tunnel between your device and the internet so

1:56

no bad guy on the plane, in the hotel, or

1:58

the cafe can see anything but just, like, that nonsense

2:00

going by. It's very easy to

2:02

use. It works on everything you've

2:05

got: iPhone, Android, macOS, Windows,

2:09

Linux. It even works

2:11

on smart TVs and routers. You

2:14

just fire up the app, you click one button, and

2:16

then you're protected. You can also use the app

2:19

to, shall we say, travel to

2:22

other areas where the shows you

2:24

want to see are still available.

2:26

Hint, hint. I

2:29

love ExpressVPN. I trust

2:31

it, and I encourage

2:34

you to try it.

2:36

Secure your online data

2:38

today by visiting expressvpn.com/twit.

2:41

That's expressvpn.com/twit. You'll

2:43

get an extra three months free with a one-year package.

2:45

That's your best deal. ExpressVPN

2:48

dot com slash twit. It's

2:57

time for TWiT, This Week in Tech, the show where we cover the week's

2:59

tech news. We're going to do it

3:02

with some really great guys. First time

3:04

he's been on TWiT, well, he's been on

3:06

TWiT with me. He's been

3:08

on TWiT before. You remember him with Devindra.

3:11

Anthony Ha is here. I know

3:13

Anthony from his byline

3:15

for years on TechCrunch. He's

3:18

a freelancer now and does the Original

3:20

Content podcast. Anthony, welcome. Thank

3:23

you. I'm excited to be on real TWiT for the

3:25

first time. This is the big boy TWiT. Jeff

3:27

Jarvis calls it the grown-up table.

3:30

Yeah, I'm thrilled to have

3:32

you on. I love your work.

3:34

We have lots to talk about.

3:36

Also with us is our car guy,

3:38

Sam Abuelsamid. He's a principal researcher

3:40

at Guidehouse Insights, but he's also

3:42

the host of the Wheel Bearings podcast

3:45

at wheelbearings.media. Hello, Sam. How

3:47

are you? I'm good. Great to see you. I

3:50

just realized my shirt is semi-translucent right now

3:52

because I'm in front of a green screen

3:54

and wearing a blue-green

3:57

shirt. He's really, I like it. Yeah, he's like

3:59

a little ghost. Boy, Casper, the

4:01

friendly ghost. Good

4:05

to see you both. Lots to

4:08

talk about, but I'm very glad you're here, Sam, because

4:11

it was a bit of a shocker this week.

4:14

Apple canceled a product it never announced. And

4:17

as far as... Well, it may be a shock

4:19

to you. Of course, as anybody knows, it was all

4:22

imaginary. It was just a dream for the

4:24

last 10 years. It was

4:26

Titan, widely rumored to be

4:28

Apple's car project.

4:32

Mark Gurman, who's very reliable, said

4:34

that on Tuesday the memo went out

4:37

that we are canceling the project. We're going

4:39

to try to move everybody from Project Titan

4:41

over to our AI efforts,

4:44

but the Apple car is unofficially

4:47

dead because it never lived. You

4:50

said you're not surprised. No.

4:54

When the first reports of Project Titan came out

4:56

in early 2015, I wrote a series of

5:01

articles on my personal blog back then, basically

5:07

indicating my skepticism that Apple would ever

5:09

follow through and actually build a car.

5:12

Having spent the last 30

5:15

plus years in the auto

5:17

industry, it never seemed probable

5:19

that they would actually do this because

5:23

Apple, as we know, is a company that generally

5:25

only likes to go into market segments

5:28

where they can make really large profit

5:30

margins, like 35 plus

5:32

percent profit margins. Pretty

5:35

much nobody in the auto industry comes

5:38

even close to that kind of profit

5:40

margin. It

5:43

never really made sense that Apple would do

5:45

this. I

5:47

figured they would play around

5:50

with it for several years, try

5:52

some things. I did, at the time, lay

5:55

out a few scenarios that

5:57

could be possible scenarios for them.

6:00

because among the other things they had

6:02

been doing at the time was they had invested

6:04

a billion dollars in Didi, which

6:06

is a Chinese ride-hailing company, similar to

6:08

Uber and Lyft. D-I-D-I, Didi.

6:11

Yes. And they were also

6:13

doing a bunch of other things. They

6:16

had purchased the

6:18

company, forget the name of

6:20

it now, but it was the company that developed the original

6:22

Microsoft Kinect, it was an Israeli

6:24

company. Oh yeah. Which had some really

6:26

interesting sensing and perception technology. And

6:29

what I figured that, one

6:35

potential scenario that could work for Apple

6:38

would be if they could do

6:42

a premium mobility service, rather

6:45

than selling cars, because again, one

6:47

of the challenges for Apple is

6:50

they like to control their entire

6:52

ecosystem. And once

6:54

you sell a vehicle to consumers,

6:57

you lose control of that. You can't control,

7:00

for example, what tires they put on it,

7:02

what parts they might replace over the life

7:04

of that vehicle, what other

7:06

modifications they might make. But

7:08

if they had done something like

7:10

a subscription

7:13

robo-taxi service, a

7:15

premium subscription robo-taxi

7:17

service, then

7:20

they could retain control of those

7:22

vehicles, they can ensure that nothing

7:24

gets modified. They don't

7:26

have to deal with, for example, setting up

7:28

a dealer network and a service network to

7:31

maintain these vehicles. I mean, they would have to

7:33

do that anyway if they're owning these vehicles. But

7:37

that would be one potential scenario

7:39

that they might've followed. But doing

7:41

that would require that they actually

7:44

have a

7:46

working automated driving system, which

7:48

they also worked on for much of

7:50

the last decade and never

7:52

really seemed to make much headway with.

7:55

Although I think that there were lessons

7:58

learned from that effort that... that

8:00

probably filtered into other

8:02

products, like for example, the LIDAR that

8:04

they've put on iPads and iPhones, I

8:07

suspect that that, at least in part,

8:10

came from lessons learned in

8:12

Project Titan in

8:15

the automated driving effort. Various other

8:17

things, some of the perception things that

8:21

where you're trying to detect and

8:23

classify different objects is probably filtered

8:25

into some of the work they've

8:27

done on the camera side. So

8:29

there's a lot of things

8:32

that they've probably benefited

8:34

from this effort, but

8:36

ultimately, I

8:39

am not at all surprised that they

8:41

abandoned the project. They've had so many

8:43

twists and turns over the last decade,

8:46

so many different people leading the program.

8:48

I know a number of people that

8:51

went to Apple, left Apple, after

8:53

working on it for a number of years. And

8:56

then there's people like Doug Field, who went

8:59

from Tesla to Apple to work to lead

9:01

this project, and then went to Ford, he's

9:03

at Ford now, and there's a lot of

9:05

other people that I've known that have spent

9:07

some time at Apple working

9:09

on this over the last decade, but

9:12

they never really could figure

9:14

out a business model that

9:16

fit with Apple's way

9:18

of doing business. It

9:22

certainly felt like a revolving door between Apple

9:24

and Tesla. I'm

9:26

reading Mark Gurman's piece in Bloomberg

9:29

titled Apple Car Was Doomed by

9:32

its lofty ambitions to outdo Tesla. And

9:34

you get the strong impression that Apple

9:37

did something with the car that they rarely

9:39

do, which is look over their

9:42

shoulders at another company and

9:44

say, oh, we should do that, and

9:48

we should beat them at their own game. And

9:51

that doesn't seem like that's gonna

9:53

end well, especially against

9:56

Tesla. Tesla really is dominant in this

9:58

market. Gurman says they

10:00

had two schools

10:02

of thought 10 years ago roughly when they

10:05

started this. You

10:07

have insight to this, Sam, too. If

10:11

you hear me say something that Gurman said that's

10:13

wrong, let me know, but Gurman also has really

10:15

good sources. He says when they

10:17

started thinking about this 10 years ago, they

10:20

had two choices, either build an

10:23

electric vehicle basically

10:25

functionally the same as the Tesla

10:29

or be more ambitious and

10:31

I'm going to quote Gurman: change the world

10:33

with a full-blown self-driving vehicle taking

10:35

passengers from point A to point

10:37

B with zero intervention from a

10:39

driver and make it look like

10:42

nothing anyone else had seen before.

10:44

He says they planned these

10:46

cars without steering wheels or pedals that

10:48

you would drive it using Siri

10:51

which anybody who's used Siri for any length

10:53

of time knows is a nightmare idea. Anthony,

10:56

have you been following this story also for

10:58

a decade? Yeah, absolutely.

11:01

I mean, I don't have any

11:03

inside sources but just reading about

11:05

it, it's been this constant

11:07

far off dream and

11:09

I think, yeah, it

11:13

was surprising in the sense that it felt like

11:15

Apple had been pursuing this for so long. I

11:17

just thought it would be kind of like, you

11:20

know, kind of like Zeno's paradox, like just

11:23

continually like the finish line, never actually reaching

11:25

the finish line but they just continue putting

11:27

money into it but in

11:29

retrospect, it makes sense that at a certain point they'd

11:31

say, well, maybe not, like we don't actually

11:33

want to like do this for 20 years

11:36

and have nothing to show for it. I

11:38

mean, it sounds like from what Sam was

11:40

saying, not nothing but no real commercial product

11:43

to show for it. For it or the

11:45

estimated $10 billion that they pumped

11:47

into it. There were at one point there were thousands

11:49

of people working on this

11:51

car. Sam, didn't they have

11:53

a facility in Sunnyvale where they were

11:55

trying to assemble the vehicles? Yeah,

11:59

I mean, it's a lot of money. It's hard to say what they

12:01

were assembling. Yeah. I mean, they did have a

12:03

fleet of Lexus RXs

12:05

that they had there. People

12:07

have seen those driving around. Yeah. Yeah. And

12:10

I've, I've, I've seen them driving around as

12:12

well. Um, but, um,

12:14

and it may, you know, it may

12:16

be that, you know, that, that was

12:18

just a facility that they were using

12:20

for assembling those vehicles, you know, to,

12:23

to outfit those vehicles with all the

12:25

sensors and compute that they needed. They

12:27

may have been building, you know, prototyping

12:29

some stuff there. Um, you

12:31

know, I think, you know, Gurman's second idea,

12:33

you know, which is what I was talking

12:35

about, um, is probably what

12:37

they ultimately wanted to do. But I think

12:40

the, the reason, probably the reason why they

12:42

got into this in the first place is,

12:45

you know, they, they recognize that at

12:47

some point the market for

12:49

the products that we're already doing like phones

12:52

and tablets and computers was going

12:54

to get saturated. And of course

12:57

we know that the financial markets

12:59

want growth and big stock prices,

13:01

big share prices are based on

13:04

this, having a growth narrative for a

13:06

company. And if they're, if a company

13:08

is just stagnant and not really growing,

13:10

which is what the traditional

13:12

auto industry is, you know, where

13:15

they still have huge cash flows

13:17

and make, in turn, huge profits, not

13:19

Apple scale profits, but, you know,

13:21

big profits, uh, but they,

13:24

they're not growing. And so they

13:26

have low stock prices and Apple

13:28

did not want to be in that

13:30

position. And, you know, one of the

13:33

places where Tim Cook probably thought, well,

13:35

here's an area where we could potentially

13:37

really boost our revenue numbers, at least,

13:40

if not necessarily profits in the near

13:42

term, at least revenues, because even though

13:44

they wouldn't sell anywhere near the unit

13:47

volumes of vehicles, uh,

13:49

that they, that they do with phones

13:51

or tablets, they would, you know,

13:53

the, the cost of a vehicle, especially the kind

13:55

of vehicle that Apple would build, which not,

13:57

would not be, you know, a Ford Focus

14:00

type of vehicle. It's gonna be something more like a

14:02

Lucid Air, that, you know,

14:05

that even if you're selling you know

14:08

selling 50 or a hundred thousand of those a

14:10

year that at a hundred

14:12

to a hundred and fifty thousand dollars apiece

14:14

that's a huge boost to your revenue line

14:17

and so I think that's probably

14:20

what the thinking was but

14:23

you know that it's just actually

14:26

executing on that turned out to be way

14:28

harder than they anticipated and I've said on

14:30

a number of occasions over the last several

14:33

years that you know if Apple you know

14:35

as this thing dragged on if Apple really

14:37

wanted to just get into the car business

14:39

what they should have done was just bought

14:42

Lucid because Lucid is a

14:44

company very much in the Apple mold in

14:47

terms of the types of vehicles

14:49

they build the design ethos

14:51

that they have you

14:53

know very very advanced technologies

14:56

and of course Lucid's head of

14:58

software is a guy named Mike Bell who was

15:00

formerly at Apple so

15:02

I think you know that's what

15:04

they probably should have done if they if

15:06

they wanted to continue that down this path

15:09

and Apple you know could have taken

15:11

what Lucid is already doing and

15:13

take the expertise that Apple has

15:15

in supply chain management for example

15:18

and really addressed some of the big

15:20

problems that Lucid has which has been

15:22

as a startup just dealing with suppliers

15:24

and getting components and getting better pricing

15:26

on components Apple probably could

15:28

have fixed that and probably could have

15:31

turned Lucid into a really viable business

15:34

but you know they decided they wanted to

15:36

do it all on their own, and

15:38

now they didn't, they're not. Yeah,

15:41

according to Gurman, who

15:43

quotes somebody involved in decision-making, it was as

15:45

if Apple had tried to skip all the

15:48

early iPhone models and jump right to the

15:50

iPhone 10 instead of just putting a flag

15:52

in the ground with a good enough car

15:54

with an Apple user interface, slick Jony Ive

15:57

designed interior. By the way, Jony

15:59

Ive was very involved in the early days

16:02

of this, we hear. And an iPhone

16:04

like buying experience, the company bet everything on

16:06

the wrong horse autonomy. How

16:08

important Anthony is it for a company like

16:10

Apple to have the

16:12

next big thing on the burner?

16:14

I mean Apple, Google's kind of,

16:17

it's a lot of our big tech companies are kind of

16:19

in this position right now where they're

16:21

looking for the next thing. Traditionally

16:25

that next thing came from somebody in

16:27

a garage, not from any

16:30

blue-chip company. In this case. Right.

16:33

And yeah I mean it seems like

16:35

in general so there's this this

16:37

search for kind of what is

16:39

the next big form

16:41

factor, the next wave of computing after

16:44

the iPhone. And it

16:46

feels like you know there have been successes in

16:49

that in terms of obviously new Apple products,

16:51

new products from other companies but nothing

16:53

that sort of redefined the game in the

16:55

same way that the iPhone did. It's kind

16:58

of hard pressed to have the same impact

17:00

on the world that the iPhone had. I

17:03

mean that's yeah I think the main thing is you

17:05

just want to make sure that if it does happen

17:07

that that Apple if you're you know Tim Cook you

17:09

want to make sure that Apple has is in the

17:12

game for whatever the next wave is and hopefully is

17:14

the one leading the way. I mean obviously that's the

17:17

same reason why they're you know invested so

17:19

much in you know what ultimately became the

17:21

vision pro and and I've been thinking about

17:23

that you know also in terms of the

17:25

discussion of like oh was there could

17:27

they have done something that was a little bit you

17:29

know a good enough car and I mean

17:32

it feels ridiculous to say this at its

17:34

price point but the vision pro in some

17:36

way seems like a compromise good enough product

17:38

where you know I think there are

17:40

certain things they wanted in terms of the battery in

17:42

terms of the transparency of the lenses that probably are

17:44

not what they started with but at a certain point

17:46

they realized okay we need

17:48

to get something out there and this will

17:50

eventually lead to the thing

17:52

that we're dreaming about maybe and it seemed

17:54

like they couldn't figure out a path to

17:56

do that with the car. The

18:00

Division pro. I mean eat apples. A big

18:02

enough company and has enough. Money

18:04

to have separate parallel tracks but

18:07

as as see like Division Pro

18:09

beat the Apple car and one

18:11

of the problems according to Germond

18:13

others of the Apple car had

18:15

was it was gonna have to

18:17

be a one hundred thousand dollar

18:19

car meaning is already in a

18:21

super luxury category and even then

18:23

that a profit margins have been

18:25

zebra nonexistent. so it wasn't a

18:27

traditional Apple. right? Now, as

18:29

profit margins hover around forty

18:31

percent. Of course

18:33

it didn't happen initially with the iPhone.

18:36

It takes a while to build up

18:38

that ability, but still, zero

18:40

percent is nowhere close to

18:42

that, and so this would have been a

18:44

tough, a tough road. I

18:46

don't think Apple's making much money on the

18:48

Vision Pro, but it's probably not losing money

18:51

on it either. The

18:53

interesting thing about Apple, when you look at

18:55

the new products they've launched, the iPhone, the

18:57

iPad,

18:59

the Vision Pro, the

19:01

watch, each one of

19:03

these was,

19:06

strangely enough, both good enough and also

19:08

leapfrogging the competition,

19:10

which is why the car

19:12

might have made sense for them, because

19:14

they could take an existing category and

19:17

sprinkle the Apple magic dust on

19:19

it and, you

19:21

know, profit. Except,

19:24

except that, I mean, in

19:26

those other segments where Apple

19:28

had entered, none of the

19:30

competition that was already there was

19:32

actually really very good.

19:34

So, BlackBerry was

19:37

dominant, I guess, but the BlackBerry,

19:39

it wasn't really that great

19:41

a product. And

19:43

so with the

19:45

car, there's a lot

19:48

of really, really good products out

19:50

there from a lot of manufacturers

19:52

around the world. And

19:55

being good enough would

19:57

not be enough. And I don't

19:59

know that there's enough Apple

20:02

magic that you could sprinkle on that.

20:02

Apple would really need

20:04

to find a way to be not

20:06

just good enough but find some

20:08

fundamental way to leapfrog the competition, like

20:11

they did with the iPhone,

20:13

with the touchscreen, the

20:15

multi-touch interface, with the, ah,

20:18

the Watch and its form factor,

20:20

and even though it was limited in battery

20:22

life, some of the things that it

20:24

could do. And even the Vision

20:27

Pro, for all its flaws and

20:29

foibles. Yeah, it does. It

20:34

is, in

20:36

many ways, the best VR

20:38

headset that's been created. And

20:42

so could they

20:44

have created the best car ever? Is

20:48

that what they were trying to do with

20:50

the move towards automated driving, right?

20:52

Creating another EV

20:54

would not have been enough, right? That

20:57

would not be sufficient

20:59

given the level of

21:01

competition that is in that market. And these

21:03

are pretty good products. I mean, Tesla

21:05

was good, I mean, Lucid is good. Ah,

21:07

I, I mean, I

21:09

just bought a BMW, that's a really,

21:11

really nice vehicle. It would be hard for

21:13

Apple. Apple would have to do something special,

21:15

like not put a steering wheel

21:17

and pedals in it.

21:19

I think that's what they were

21:22

trying to do. And now, you

21:24

know, over the last couple of years in

21:26

particular, ah, I think a big

21:28

part of Apple's strategy would have been to

21:30

really try to make some inroads into China,

21:32

which is by far and away the

21:34

biggest automotive market. And

21:36

until a few years ago,

21:39

western brands had dominated

21:42

the Chinese market. There were a

21:44

lot of Chinese brands, but in

21:46

terms of sales, the majority,

21:48

a significant majority of sales were

21:50

western brands, brands from Europe,

21:52

even from North America. But

21:54

over the last few years it

21:56

has really shifted. BYD is

21:58

dominant there. The domestic

22:01

brands now have a significant majority of

22:03

sales in China, about sixty percent

22:05

of Chinese sales and growing, and

22:07

especially on the

22:09

EV front, right? So it would have been

22:12

really hard. And they are,

22:14

they are making some really great EVs

22:16

for a lot less than one hundred

22:18

thousand dollars. And I think it

22:20

would have been nearly impossible for Apple

22:23

to really be competitive in that marketplace.

22:25

I think this is the problem Apple

22:27

has with the Vision Pro too, which is it

22:29

takes a while to get to

22:31

this point, and you're shooting at a

22:34

moving target. You could try

22:36

to skate to where the puck is going,

22:38

but it's hard to know if you'll succeed. They

22:40

developed the Vision Pro. They started developing it

22:42

eight years ago, when it looked like VR

22:45

was going to be the next big thing.

22:47

Ah, the problem with the car

22:50

is autonomy didn't happen. So they don't

22:52

really have anything. They were skating to where the

22:54

puck never went. I think they

22:56

may have the same problem with the Vision Pro. To be

22:59

honest, I think this was something that people were

23:01

excited about five years ago, but are much less

23:03

excited about now. So has Apple

23:05

lost its, its mojo? Or is it

23:07

too early to say? I

23:11

think it's probably too early to say

23:13

Like, again,

23:15

I feel like if Apple

23:18

doesn't have its mojo, I'd be hard pressed

23:20

to think of a company that I could

23:22

point to and say, oh, this is sort

23:24

of setting the agenda, that's, that's, you know,

23:26

at the cutting edge in

23:28

a consistent way

23:30

that Apple is not. Because,

23:32

again, it feels like we are in this

23:35

in-between period where there's plenty of

23:37

interesting new products, but nothing that's

23:39

sort of setting the agenda, and

23:41

so you get companies

23:43

kind of flailing around a bit. Well, you

23:45

could ask, what is the agenda? It's now

23:47

AI, and the companies setting the

23:49

agenda are OpenAI, Microsoft, and

23:52

Nvidia. What if

23:54

Apple said, well, we think the next

23:56

big thing in twenty twenty-four is

23:58

gonna be self-driving vehicles and 2025

24:00

is going to be VR and they just

24:02

they missed and it turned out to be

24:05

AI. Well it's

24:07

interesting I mean those things are not completely

24:09

separate right no I just agree yeah AI

24:11

yeah you know it turns out to be

24:13

the next big thing then actually maybe autonomy

24:15

has sort of stalled right now that made

24:17

five years from now we might say oh

24:19

actually maybe they should have kept the project

24:21

going because there were leaps forward and suddenly

24:23

self-driving seems like a good bet again it's

24:26

hard to say. I agree

24:29

I think yeah that given

24:32

the the need you know or at

24:34

least the perceived need to make a

24:36

big push into the AI front you

24:38

know another reason for killing the car

24:40

project at this point is you know

24:42

there were a lot of software engineers

24:44

working on this you know the modern

24:47

modern vehicles are all software defined and

24:49

a lot of that software definition is

24:51

around AI related

24:54

capabilities and particularly

24:56

the automation but even even other

24:58

elements within the vehicle and

25:00

so there's probably a lot of skill

25:03

sets that were tied up in Project

25:05

Titan that they can

25:07

utilize better in the near

25:09

term for generative AI

25:11

efforts around the throughout the rest of

25:14

the company. Well

25:17

it may be that in fact that's what they did

25:19

right they took those engineers I'm

25:22

supposed there's some metal vendors in there that won't have

25:24

a job at Apple I mean AI doesn't.

25:26

Yeah they'll find they've got Apple on their

25:28

resume they will find other places to work.

25:30

Plenty of car companies. Yeah without too much

25:33

difficulty. Yeah I would love to see what

25:35

Apple was doing and say what could we

25:37

what could we use what

25:39

could we apply to our current projects. You

25:41

know a Fisker is looking for a white

25:44

knight at this point right? Well

25:47

actually it's been reported it was reported

25:50

on Thursday they did their Q4 earnings and

25:53

issued a going concern warning in their

25:55

earnings report the next day a report

25:57

came out from Reuters that they're

25:59

in talks with Nissan

26:02

potentially investing

26:05

$400 million to help with

26:08

development of Fisker's next batch of

26:10

products, including the Alaska

26:12

pickup truck. And part

26:15

of that is Nissan wants to be

26:17

able to build a Nissan branded

26:19

version of that truck. They would love to

26:22

get a midsize electric pickup

26:24

truck into the marketplace. Yeah. It's

26:28

hard. I mean, look, I don't

26:30

think Apple screwed up in any way. It's

26:33

very hard to predict the future. And

26:35

in projects like this where it takes years to

26:38

develop, it's easy to miss the

26:40

boat. It's clear the car missed the boat.

26:43

I kind of in my heart think

26:45

Vision Pro might have missed the

26:47

boat as well, that it was not the

26:49

product that we need right

26:52

now. It's too expensive. It's too complicated

26:54

to build. And most importantly, I think

26:57

the mass audience doesn't really want to wear

26:59

a computer on their face. I

27:02

don't. Same. Same? Okay.

27:04

I mean, it feels like a lot

27:06

of things where we saw them in

27:09

science fiction and it seemed cool then. But

27:11

when you actually think about it in your

27:14

own life, it is not

27:16

quite as compelling. Yeah. It's

27:18

probably not the best idea to use sci-fi

27:21

as your, you know, your idea of

27:23

product planning. Yeah, your product planning division.

27:25

Although Elon's done all right with it.

27:29

Some companies have done okay with it. Right.

27:33

Arguably, that's what the iPad is too

27:35

is, you know, like those tablets from

27:37

Star Trek. Sure. And Lenovo's clear screen

27:40

laptop from Mobile World Congress this

27:42

week. That's that's nobody

27:44

wants that. This is straight out of the

27:46

expanse. Straight out of the expanse. Right.

27:49

And I know it's cool. It does. It

27:52

does. Isn't in the expanse

27:54

where they had the clear phones as well?

27:56

They're holding the clear. Yeah, the clear handsets.

27:59

Yeah, they were, like, phone slash

28:01

tablet handsets, and yeah,

28:03

it was a transparent screen and

28:05

just had like a little bar at the bottom

28:07

where presumably whatever the power

28:09

source was and the compute was

28:12

embedded in that. Mashed

28:14

potato on our Discord saying I would throw

28:16

folding phones into that

28:19

pile of sci-fi inspired products that

28:21

nobody really wants. The

28:24

different, I see a surprising number of galaxies

28:27

and I have a couple of them. Various

28:29

people. But I think that, like

28:31

a lot of people who are buying those are buying

28:33

them because of sci-fi also. Right. I

28:36

think that what we're seeing is a little

28:39

different here. In the early days of technology,

28:41

these are small companies that failed fast. You

28:43

know, they had limited funds, they tried something

28:46

and a few companies made something that moved

28:48

them to the next level but a lot

28:50

of companies went away. Now you're seeing companies

28:52

that have virtually unlimited funds try

28:56

this stuff and in some cases, as

28:59

with the Vision Pro, try it in public. In some

29:02

cases, as with the car, not so public. But

29:04

still, I mean, you know,

29:06

who else could have said we're gonna build a level

29:09

five autonomous car by the year 2026? Maybe

29:15

Elon. Yeah, I have, well

29:17

Elon's been saying that it would be next

29:19

year for the last decade. Yeah, yeah, yeah.

29:21

But you know, I'm actually glad that companies

29:27

that have resources are spending

29:29

at least some, decent

29:31

amount of those resources on advanced R&D.

29:35

We need more of that, just doing

29:37

basic research. We need

29:40

more of that, that's how breakthroughs

29:42

happen. And ideally, the

29:44

government would also be spending more

29:46

on basic research and then making

29:49

the results of that research available

29:51

to everyone to then commercialize it

29:53

and industrialize it. But

29:56

in the absence of that, at

29:58

least having companies willing

30:00

to invest in understanding

30:04

what they can make work, what doesn't work. And

30:07

in the case of Apple, okay, yeah, they spent

30:10

$10 billion on this, but they can afford that.

30:13

They have a mountain of cash the size of Mount Everest. That's

30:16

one month's profit for Apple.

30:19

It's nothing. Right. It

30:21

feels like a paradox that I would imagine that

30:24

if you're a publicly traded tech company, on

30:26

the one hand, I imagine that investors are

30:28

not happy. They found out that you put

30:30

$10 billion into this and didn't have a

30:33

commercial product to show for it, but they

30:35

also wouldn't want you to not invest in

30:37

these kinds of big moonshot things because then

30:39

you're definitely going to be obsolete 10 years

30:41

from now. Yeah.

30:45

This, I was just going to show you this. I've

30:47

just ordered these and this

30:49

is the other end of

30:51

the Vision Pro spectrum.

30:54

This is where I would

30:56

actually like to have a transparent screen. This is what

30:58

I want, right? This is a heads

31:00

up. I don't want a transparent laptop. A heads

31:02

up display on your spectacles, admittedly geeky, but not

31:06

as geeky as walking around with a Vision Pro. It's

31:09

got AI in it. Now I'm sure this will be kind

31:11

of an early day version

31:14

0.1 product, but they're

31:16

only a few hundred bucks. I

31:19

think this is closer to what people want with AR. This

31:22

is from brilliant.xyz. They're

31:26

going to come mid-April. I'll wear them on a

31:28

show and you can all mock me. I

31:31

assume Apple would say that this is closer to what

31:33

they wanted too. Yeah, but why didn't they do this?

31:36

Why did they do this? I don't know. I'm not

31:38

a hardware expert. The sense I get is that's what

31:40

I meant by the Vision Pro being a compromise was

31:42

essentially that they wanted to be able to show this

31:44

cool stuff on your screen, but the lens technology isn't

31:47

there to do what they want to do. So they

31:49

had to have this complicated camera

31:51

setup where it looks like it's

31:53

transparent, but actually there's all these

31:55

cameras, and a compromise, a

32:00

complicated compromise that apparently is also cool but

32:02

maybe is not what anyone wants. So

32:05

I ordered these, they have, I like how they charge.

32:12

So there's no battery hanging off

32:14

of you, they charge up, I don't know what the

32:16

battery life could possibly be. They

32:19

do, I ordered lenses,

32:21

so there are prescription lenses in here. I

32:24

mean look, I know these are going to be silly

32:27

but I just

32:29

feel like this Apple should have done something

32:31

closer to this than the Vision Pro. This

32:35

is where you get in trouble when you have,

32:37

when you're a three trillion dollar company with

32:39

hundreds of billions of dollars in cash just

32:41

sitting around. You maybe overdo it, you try

32:43

to build a level five autonomy car without

32:45

pedals or a steering wheel, you try to

32:48

build a computer on your

32:50

face like William

32:52

Gibson wrote about in Neuromancer and maybe you go

32:54

too far. Maybe a

32:56

little company like Brilliant Labs doesn't have any, I'm sure

32:58

they must have VC funding but they

33:01

certainly don't have Apple

33:03

money. If they can do this, Apple

33:06

could have done this ten times better, right? I

33:11

am just puzzled. Maybe the theory at Apple is

33:13

that you know if somebody really breaks through with

33:15

that they could try to buy Brilliant in a

33:17

couple years. And maybe that's Brilliant's plan. I don't

33:19

think they've had a great... Actually, they

33:21

don't seem to have had like a great track record

33:23

in terms of taking startup products and really

33:26

kind of getting them to the next level. I mean

33:28

you mentioned Siri before and obviously that's kind of stagnated.

33:31

Right. Brilliant is in Hong Kong they have

33:33

fewer than ten employees. I'm

33:35

looking at Crunchbase just to see what their

33:37

total funding. Oh, I don't have an account. Maybe I'll

33:40

ask you, Anthony. You probably have

33:42

a Crunchbase account. They

33:45

raised a three million dollar seed round. That's

33:48

it. From oh no

33:50

another three million a few months later. This was in

33:52

2023. So

33:55

from Koho Deep Tech Wayfarer Foundation

33:59

and then Adam Chai. and three other

34:01

small, it looks like Angel Fund

34:03

investing basically. Looks

34:05

like their first seed round was in 2020, 50,000. So

34:10

yeah, this

34:12

is the garage I talked about.

34:15

And it would be embarrassing if the garage came up

34:18

with something and Apple

34:20

with all its trillions didn't. Well,

34:23

the thing is what Apple wanted was not

34:25

a heads up display. They wanted, you know.

34:28

But that's what they should have wanted. My

34:31

point. I mean, that may be

34:33

what, you know, what we think we want. But

34:38

you know, Apple looks at things differently.

34:40

They think they're, you know, what

34:42

they traditionally do is look at,

34:45

you know, they're looking where the

34:47

puck is going. Right. You

34:49

know, what, what, you know, this is, you know, consumers

34:51

don't know what they want until they've actually seen it.

34:55

Yes. Isn't that what Thomas

34:57

Edison said? Or maybe not Thomas

34:59

Edison, Henry Ford, maybe it's apocryphal that

35:01

he said, if I ask people what

35:03

they wanted, they would have said,

35:05

faster horse, faster horse. Right. Right.

35:09

And, you know, it's the same sort of thing here.

35:11

You know, I think, you know, Apple figured that a

35:13

heads up display, especially after, you know,

35:15

the failure of Google Glass, you know, they

35:17

probably figured a heads up display is not,

35:20

not going to be more than

35:22

a curiosity. Right. Even

35:24

if it's a really good one. And

35:27

so they wanted to create,

35:29

you know, a real augmented

35:31

reality capability. And you

35:34

know, as, as, you know, Jason

35:36

and Alex and everybody have said

35:38

on, on MacBreak Weekly, it's just

35:40

that's technology that just does not

35:42

exist in a viable form today

35:45

and probably won't

35:47

for a decade or more. Yeah.

35:50

Well, it's, it's fun for us to

35:53

cover. I

35:56

don't have any schadenfreude that they killed the

35:58

car project. I'm disappointed. I was interested

36:00

to see what they came up with. I

36:04

have a lot of Apple products. I probably would have bought

36:06

an Apple car. I can see. I'll

36:09

be curious to read the oral history of

36:11

the project. Yeah. It shows

36:14

you, though, that you can have unlimited funds,

36:16

unlimited access to the best minds, right?

36:20

You would agree, Sam, that they could have anybody

36:22

they wanted. And

36:25

still not do a

36:27

product. See, this is

36:29

the thing that worries me in a more global

36:32

fashion. Is it

36:34

a kind of a realization that, oh,

36:36

we can't do level five autonomy? And

36:40

that's been a realization for a long time,

36:43

except for the hype

36:45

from Musk and his

36:47

fans. Everybody else that's

36:49

been involved in this has recognized

36:51

a long time ago that level

36:53

five is probably

36:55

never going to happen. Never? Never.

36:59

Never? I will never be able

37:01

to get into... Well, I can get into Waymo now

37:03

in San Francisco. Okay. So

37:06

the only difference between level four and level five.

37:09

Level four is what we have today with Waymo. And

37:13

that means a vehicle that can drive

37:15

itself fully automated without any human intervention,

37:18

but within a limited operating domain. And

37:21

incidentally, we believe, certainly with Cruise, and

37:23

I bet with Waymo, there is human

37:25

intervention fairly frequently, right? That the drivers

37:27

at the home office take over and

37:29

get around the pothole. Right.

37:33

So level five just

37:35

means that there is no limit on that operating

37:37

domain, that it can do it on

37:39

any road, in any weather conditions, any

37:41

time, basically anywhere where a human can

37:44

drive, it can do it. So

37:46

that's the only difference between four and five. I

37:49

think Apple probably,

37:53

with enough effort, probably could have done a level

37:55

four system. Level four systems

37:57

exist. But...

38:00

I think maybe they decided that that wasn't

38:02

good enough. And there's

38:05

been a lot of companies that have tried to

38:08

do even level four, and even that is an

38:10

extraordinarily difficult problem. And many

38:12

companies have tried and failed to get

38:14

something that is good enough. The

38:16

reason I bring this up is one

38:20

of the questions that is constantly coming up on all of

38:22

our shows in the last year is, are

38:24

we in an actual AI revolution or are

38:26

we headed toward another AI winter, where

38:29

we think this thing is going to

38:31

become amazing and in fact, oh,

38:34

it can't really do that. And

38:36

I feel like the car example

38:39

is kind of an example of

38:41

that, oh, we had

38:43

high hopes, but we can't do it. Because

38:46

it turns out the

38:48

hard things are easy, the easy things are

38:51

hard. It's the last

38:53

percent. The things that are easy for humans are

38:56

hard for AI. And I'm

38:58

wondering, maybe it's a mistake to extrapolate,

39:00

but I'm wondering, does it mean that

39:04

in many cases our ambitions are going to

39:06

be thwarted and we're going to be disappointed?

39:09

Is this the first AI

39:11

project to fail in what will be

39:13

a domino of others? Anthony, am I

39:16

over projecting here? I

39:19

definitely had a very similar thought. And

39:22

again, with the caveat that

39:24

I'm just a layman journalist reading about

39:26

these things, but in terms

39:29

of the parallel, it definitely seemed like a

39:31

powerful one to me that, you

39:33

know, it also made me think of

39:35

how, again, we were talking about kind of letting

39:38

sci-fi do your kind of product

39:40

ideation. And it

39:42

feels a little bit like with autonomy that if

39:45

you set the dream as, oh, we

39:47

should just be able to get in a car and

39:49

then, you know, there's no steering wheel and we don't

39:51

have to do anything, then sure,

39:53

then it's a failure. But actually, wow, if we've

39:56

like introduced all these features, not all

39:58

of them great, very

40:00

problematic and dangerous. But overall, we've introduced all

40:02

these features in the last decade or so

40:04

that have made driving really different and easier

40:06

and better and safer in some ways. So

40:09

even if we never get to this, you

40:11

know, glorious utopia

40:14

of full, you know, level five self-driving, that's

40:16

still like an incredible advance in technology. And

40:19

I sort of feel like the same in

40:21

AI that probably because for

40:23

a variety of reasons, but maybe

40:25

probably because it's a bunch of like technical people

40:27

who it seems like

40:29

their dream is like, well, what if we just automate

40:32

everything? What if, you know, 10 years

40:34

from now, twit is just three AI talking

40:36

heads, like chatting with each other. But I

40:38

don't actually think that's what's promising or exciting

40:40

about the technology. I think it's, again,

40:43

doing the things that are hard for humans and

40:45

humans get to continue doing the things that

40:48

we're good at. And I think the balance will probably

40:50

look very different from the way it looks today. But

40:52

I think there is an

40:55

incredible amount of hot air in

40:57

AI right now, but also that there will be valuable

40:59

technologies that come out of it at the same time. I

41:01

think, yeah, I mean, a lot of VCs are gonna

41:03

be in trouble. A lot of startups are gonna go away,

41:06

but you know, it's not gonna be

41:08

like crypto where it feels like, you know, the

41:11

whole thing just kind of vanished into thin air.

41:13

I think you're honest. I agree, I agree. I

41:16

don't think we're gonna get to AGI any

41:19

time in the foreseeable future. But,

41:21

you know, as you've learned Leo,

41:25

there's a lot of really

41:27

useful applications for

41:30

this technology within a more limited

41:32

scope, a

41:35

more limited domain. You know, instead of having,

41:37

trying to create a system that can do

41:39

everything, you know, take these

41:41

concepts and apply it to very specific

41:43

tasks, like what you've done with your

41:45

Lisp GPT, you know, or, you know,

41:48

feeding it a more limited corpus

41:50

of data to

41:55

do very specific things. Because,

41:58

you know, one of the things with that

42:03

is you're

42:03

much less likely to have it go

42:05

off into the weeds and do something

42:07

unexpected because these are probabilistic systems. That

42:10

is the key thing about all

42:12

the various flavors of AI is

42:14

they're probabilistic. Unlike

42:16

a classical deterministic algorithm, we don't

42:19

really know for sure what they're

42:21

going to do in any

42:24

given scenario. But if

42:26

you constrain the scenarios that it

42:28

can operate within, it can actually

42:30

do really amazing things. I

42:34

think that's the thing that we're starting to see

42:36

with the automated driving stuff is

42:39

yes, people long ago

42:41

realized that level five is most

42:43

likely a fantasy. Level

42:46

four is really hard, but there's

42:48

a lot that we've learned over

42:50

the last decade of developing these

42:52

systems that is already filtering down

42:54

into more advanced driver assist and

42:56

active safety systems. We're

42:59

getting things like LIDAR and

43:01

things like imaging

43:04

radar sensors, better sensors,

43:06

better compute that is

43:08

getting into vehicles that are

43:10

coming to market now that will

43:12

make them safer and help augment what

43:15

human drivers can do and to

43:17

be able to increase driver's situational

43:21

awareness, help

43:23

them out in various scenarios that

43:25

are more focused rather than trying

43:28

to do the entire task of

43:30

driving, which despite

43:32

the challenges that we have as

43:35

humans doing that, we're actually extraordinarily

43:37

good at despite the fact that yes,

43:39

40,000 people die in the United

43:41

States on the roads. I

44:00

personally would love an AGI to

44:03

talk to that you know

44:06

an AI that was like

44:08

another human being but that's

44:10

sci-fi. Temper expectations and

44:12

be happy, be amazed

44:15

in fact by how far we've come with

44:17

these simple things. And we actually

44:19

move forward. We've made huge progress. Yeah. We

44:22

have moved the goal post

44:25

forward. And

44:27

we have made progress and we've made

44:29

some things better even though we haven't

44:31

necessarily achieved what we wanted to at

44:34

the beginning of this. We've

44:37

made progress. You agree Anthony? I

44:39

think we're all in agreement. I

44:42

absolutely agree. I mean I don't think it's always like

44:45

completely in a straight line and there's some things

44:47

that get better, some things get worse. Yeah.

44:51

Exactly. But overall I feel like yeah.

44:55

With tempered expectations. And I think also

44:57

like that can it's not just

44:59

about not being disappointed but then maybe aiming

45:01

for a more realistic goal like again you

45:04

were talking and again I don't know maybe

45:06

this would have ended in the same way

45:08

regardless but in the Apple situation like if

45:10

they aimed for a more realistic goal maybe

45:12

we would be talking about a

45:14

real Apple car right now. Yeah. On

45:17

the other hand maybe not. Maybe it

45:19

takes these kinds of insane ambitions to

45:22

get us to the somewhat lesser place

45:24

but that's still pretty damn good. Maybe

45:27

it does take that. I don't know. You

45:29

know we're going to talk about AI

45:31

when we come back. Let's take a little break. There's

45:34

lots of money. Well you know

45:36

as the old saying goes

45:38

your reach should always exceed your grasp.

45:40

Yeah. But then be happy with what

45:42

you do grasp. You may not get all the

45:44

cookies in the cookie jar but you got one. Yeah.

45:47

Don't cry. You

45:49

got one. Anthony

45:52

Ha is here. His podcast is

45:54

this is my favorite subject. If

45:57

they would let me I would do a podcast about

45:59

this: Original Content. It's about what?

46:03

Original content. That's

46:05

right. The latest and greatest on

46:07

or not greatest on Netflix, Disney

46:09

Plus, etc. I mean, Leo, you're

46:11

the boss. You should do a podcast. I should be able

46:13

to do. Yeah, yeah. But see, I'm hesitant to do a

46:15

podcast that nobody

46:17

will, you know, subscribe to. So

46:20

your Original Content podcast at

46:22

originalcontentpodcast.com does it. So I'm

46:24

going to let you do it

46:26

with your your your pals from

46:28

TechCrunch, Jordan

46:31

Crook and Daryl Etherington. They're

46:34

still at TechCrunch, but that's okay. No, no,

46:36

we're all. Are you all separated now from

46:38

the? Yeah. I'm going to ask you about

46:40

that too, because of course, Engadget, you've

46:42

done some work for them too. Yeah,

46:45

it's all crumbling in front

46:47

of our very eyes. No,

46:50

I think this is a great idea. Are we? Here's

46:52

the question, though. Are we still at

46:54

peak TV or is it is it not

46:57

quite so peak? Oh, I

46:59

think we're definitely coming off the peak right now. I think

47:01

there was, you know, basically

47:03

when when Wall Street stopped believing

47:05

in sort of like just setting

47:08

money on fire to for subscriber growth for streaming,

47:10

I think then we started to come down and

47:13

which is disappointing. Oh, don't tell me it's

47:15

money. It's just money. Is that all? I'm

47:18

sad. I think that.

47:20

Yeah, it's too. I mean, because obviously when

47:23

you when you're in a period where things

47:25

are, you know, there's some belt tightening,

47:27

then there's less experimentation, less new voices,

47:30

but also probably more of a focus on

47:33

a sustainable business model rather than oh, you

47:35

know, we'll just we'll get a billion subscribers

47:37

and it'll all work out. So no more

47:40

successions. That's it. It's over. But

47:42

on the other hand, there's

47:44

still a lot of great content being

47:46

created and it's not, you know, not

47:48

as much volume as we had two,

47:51

three years ago. There's still a lot of great shows.

47:53

But wait a minute, because I think Anthony was going

47:55

to say something bad about succession. Oh,

47:57

no, no, I was gonna say I love succession. I think, you know,

48:00

I think the successions will continue. I

48:02

think what they're not going to see

48:05

is the show that they spend $100 million on just because

48:08

it sounds like a good idea. I think

48:10

that the sort of like, you

48:12

know, let's just take a flyer on it. Here's a

48:14

check for $100 million. I think that seems less likely.

48:16

You know who did a lot of that? You're saying

48:19

No More Gray Man? Yeah, that's exactly

48:21

what I was going to say. I

48:23

mean, the guy who Greenlight Gray Man is

48:25

gone from Netflix, right? Yeah, I think that's

48:27

right. Exactly. It's so funny. I

48:30

guess that's it for Netflix spending $100 million

48:33

on a stupid movie like Gray Man. Okay,

48:36

good to know. You know,

48:38

I finally saw, Lisa and I make it

48:40

a kind of a yearly ritual to watch

48:42

all the nominated movies for Best Picture in

48:44

the Academy Awards. So we finally got the

48:47

last one last night, which

48:49

is Poor Things. All

48:52

I can say is what a great, wow, what

48:54

a great movie. Now I know Oppenheimer is going

48:56

to win all the Oscars, but it's nice to

48:58

see somebody take a

49:01

really big chance to something

49:03

very different and weird. And

49:06

I think succeed. So I think there are creators

49:09

out there who are still going

49:11

to go ahead and do those kinds of things.

49:14

Have you seen Poor Things yet? Yeah,

49:17

I loved it. Did you love it? I

49:19

did, although I like, I've seen

49:21

two other movies by that director. I love

49:21

Yorgos's stuff. The Favourite. Yeah, and

49:26

the Lobster. I love the Lobster. He's great.

49:29

And he and every one of them, they're weird and they're

49:31

a little magical and

49:33

just off the wall. And I think,

49:35

boy, what he did with it was

49:37

obviously a big budget because they built

49:40

all that stuff was real. Those are real

49:42

sets. It was pretty amazing.

49:44

And he shot a lot of it on four millimeter,

49:47

four millimeter lens. You

49:50

go to Hollywood and say, you know, I got this vision

49:52

for a movie. It's going to start out black and white.

49:54

It's going to end in color. And then a lot of

49:56

it's going to be shot in four millimeter lenses. I

49:59

think I could probably. used a tiny bit

50:01

less of that lens. It was interesting

50:03

though, wasn't it? Oh well. And

50:05

the music. It didn't look like any other movie. No.

50:08

For sure. Yeah, so there are still

50:10

auteurs out there willing to take a great big

50:12

chance, but you're not going to see the Netflix's

50:14

throw $100 million at something

50:17

just nutty. Hello Apple

50:19

might. Apple might. This is true.

50:22

Apple TV plus is spending a

50:24

lot of money. Yeah.

50:26

All right. And Sam Abuelsamid is here. If you

50:28

love cars, you will love Sam's Wheel Bearings podcast,

50:32

wheelbearings.media. You

50:34

have of course the best co-hosts in the world. In

50:36

fact, if I could just get Robbie back on

50:39

this show. Have

50:41

we booked Robbie for a show, Roberto

50:43

Baldwin? I've been trying. He's been trying.

50:45

And he has been trying. Nicole Wakelin.

50:49

Love your podcasts. If

50:52

you love cars, wheelbearings.media.

50:55

What are you driving this week? I

50:58

have the Genesis Electrified G80,

51:00

which is a lovely four-door

51:03

luxury sedan that is fully

51:05

battery electric. It's very

51:07

quick. It looks great. It has a

51:09

beautiful interior. And

51:11

now I can even charge it at my

51:13

local Tesla Supercharger station using a Magic Dock.

51:16

Oh, NACS. NACS is

51:18

everywhere now. Yeah. It's

51:21

coming. I bought the last car that still uses CCS.

51:23

Yes, I guess. I don't know. There's

51:25

still lots of them out there, but Ford on

51:28

Thursday announced that they

51:31

pushed a software update for the

51:33

Mach-E and the Lightning. And

51:38

Tesla put out an update to their

51:40

Superchargers. So you can charge those using

51:42

Plug and Charge now, at the Supercharger. And

51:44

you can also order your

51:46

free NACS to CCS adapter

51:50

from Ford. Or if

51:52

you've got a Ford EV, you can order that and

51:54

they'll start shipping those out in a few weeks. I

51:57

loved my Mustang. I really did. That Mach-E was

51:59

a great car. Lease ran out,

52:01

traded in for another lease

52:03

on a BMW i5. And shortly

52:06

after I got it, it was

52:08

voted by the Korean Safety

52:11

Commission the safest car in the world

52:13

and you know why cuz all the

52:15

ADAS stuff. It's, uh,

52:17

man, the i5 is a fantastic car. It shows you

52:19

a stop sign before you get to it shows you

52:22

a stop light on the heads-up display. See, it's,

52:24

it's, I, it's great. They said it's almost impossible

52:26

to get into an accident and I've tried but

52:33

I'm not gonna, no. I think it would, yeah, it probably would let

52:35

me if I really, if I really wanted to. Our

52:38

show today brought to you by thank you. It's

52:40

great to have you both Anthony and Sam our

52:42

show today brought to you by rocket money,

52:44

oh This happened

52:46

to me again yesterday rocket money said hey, you

52:48

know you're paying this $300 every year For

52:52

WordPress you still use that and I went

52:55

No, I don't. Got

52:58

my money back. Thanks to rocket money. How many

53:00

people? 75% of people

53:02

have subscriptions they've forgotten about I know I'm

53:04

in that group When I

53:07

started using rocket money, I couldn't believe

53:09

how many subscriptions I was paying for

53:11

each month. Campaign contributions for elections

53:13

that happened two years ago, for example

53:17

Between streaming services fitness apps delivery services.

53:19

We've all got subscriptions. We forget about

53:21

but thanks to rocket money You don't

53:23

have to waste money Anymore,

53:25

it'll let you know and it'll

53:28

cancel for you Amazing

53:30

rocket money is a personal finance app

53:33

that finds and cancels. Yes cancels

53:35

your unwanted subscriptions. Yeah.

53:38

Yeah. Monitors your spending, does a great job

53:40

of that helps lower your bills So you can

53:42

grow your savings all of that's

53:44

great, but I love the canceling the subscription part

53:47

I can see all my subscriptions in one place if I see

53:49

something I don't want rocket money can

53:51

help me cancel with just a few taps They

53:53

deal with the customer service so you don't have

53:55

to Rocket money has

53:57

more than 5 million users. They've saved a

54:00

total of five hundred million dollars, half

54:02

a billion dollars, in cancelled subscriptions, saving

54:04

members up to seven hundred forty dollars

54:06

a year when using all the apps

54:08

features. I would say that's. That's

54:11

low for me. It's

54:13

more than that. Stop

54:15

wasting money on things

54:17

you don't use. Cancel

54:19

your unwanted subscriptions. Go

54:21

to rocketmoney.com/twit. Rocket

54:23

money.com/twit. It really works.

54:25

Rocket Money. Rocketmoney.com/twit.

54:29

Get those hundreds of dollars back. Thank

54:31

you, Rocket Money. Ah

54:34

on we go with the show. The,

54:37

oh, before we go on to, one,

54:39

one other plug for a show that

54:41

we started watching that's really good:

54:43

Shōgun on FX. Oh, okay. Now

54:46

and live in meaning s Okay so

54:48

this command who lose Well ah loves

54:50

the books, the,

54:52

the James Clavell books. Read it

54:54

cover to cover. It's about a thousand

54:57

pages, huge. Reread it in Japan

54:59

when I was in Japan a few

55:01

years ago. I remember

55:03

the miniseries, it was good. It was,

55:05

it was very good. Yes, it was good.

55:07

For its time, it was. I watched

55:09

a little bit of,

55:14

a little bit of it the other day. Those

55:16

things don't age as well, do they? You know,

55:16

the new one looks as beautiful, so I've

55:18

lived in the past, and I'm, I'm trying

55:20

not to get... Richard Chamberlain was

55:22

in it, and Toshiro Mifune, the

55:24

samurai, in the original one,

55:26

and I'm trying not to get too excited.

55:29

I'm so I'm really glad to hear you say

55:31

it's good. We watched the first

55:33

two episodes. Yeah, it's, it's excellent. It's

55:35

a great story. And, and if,

55:37

if you're, like, if you liked stuff

55:39

like that, I also highly recommend

55:41

Blue Eye Samurai on Netflix which is

55:43

animated. Ah and the the

55:46

animation is gorgeous, and

55:48

it's just, the story is really good. Now

55:52

you got me excited. It just came out, and

55:55

I, and I was worried I might be disappointed, you

55:58

know, that is, yeah, because I

56:00

love the novel. I love the novel. All right.

56:02

It's all about, what was it,

56:04

1850? Something like

56:07

1600. 1600. It's back in the

56:09

shogunate, in the, in the samurai

56:11

era. A

56:13

British sailing captain gets captured, washed

56:15

ashore in Japan, and

56:17

it goes through some trials and tribulations and rises

56:20

Well, I won't tell you what happens, but it's

56:22

a good it's a great read. Oh, I'm so

56:25

excited Can't wait In

56:29

fact, I think we will save to the end

56:31

of the show We'll get more some more original

56:33

content recommendations since we got Anthony here. Okay,

56:36

sounds good. We all love to watch TV, right?

56:42

All right, let's talk about AI a little bit Elon

56:44

Musk is suing Open

56:48

remember Open AI with

56:50

Sam Altman back. I think in 2015 he

56:52

gave them some millions of dollars

56:55

the idea at the time I remember

56:58

it was a big deal was we

57:00

can't let these big tech giants own

57:03

Artificial intelligence we

57:06

need to have an open Process

57:09

people can see what we're doing can participate

57:11

in what we're doing to develop

57:13

AI for the people Not

57:15

for the enrichment of Google. I think they

57:18

were mostly worried about Google and and

57:20

others Elon

57:22

and Sam Altman had a falling out in

57:24

I think 2020 Sam

57:27

one of the things I said I'm reading

57:29

into this but based on what I've read

57:31

one of the things I think happened was Sam Said

57:34

Elon this is costing a lot of money to generate

57:36

this stuff. You haven't given us that much money We

57:39

need somehow to fund this because it's

57:41

it's very expensive to build these large

57:43

language models They

57:46

they kind of bifurcated the company into

57:48

a nonprofit just like the original open

57:50

AI and a for-profit arm Got

57:53

billions of dollars from Microsoft probably much

57:56

of it in kind, Azure minutes,

58:00

because they were using Azure to do the training. And

58:04

you know, really, basically, Microsoft has, over the

58:06

years, it's become a division of Microsoft.

58:09

That's Elon's contention. He

58:11

filed a lawsuit Thursday night saying

58:14

that OpenAI's recent relationship with

58:16

Microsoft has compromised the company's

58:19

original dedication to public open

58:21

source artificial general

58:23

intelligence. In the suit, he says,

58:25

quote, OpenAI has been transformed

58:28

into a closed source, de

58:30

facto subsidiary of the largest technology company

58:33

in the world, second

58:35

largest, Elon. Microsoft. Under

58:37

its new board. Maybe Microsoft's back

58:39

on top. I don't

58:41

know. It's back and forth. It depends on the day. You're

58:44

focusing too much on facts. Oh, yes. We're

58:47

talking about Elon. That's not what lawsuits

58:49

are for. Under its new board, remember,

58:53

the board suddenly got scared and

58:55

fired Sam Altman to which everybody

58:57

went, what are you doing? And

59:01

Microsoft— Satya Nadella— was furious,

59:03

called the board up, said, get him back. They got him

59:05

back. He's got a new board. Under

59:08

the new board, says Elon in the suit.

59:10

Elon's lawyers, I guess. It's

59:12

not just developing, but actually refining, get

59:16

this, an AGI to

59:19

maximize profits for Microsoft rather than

59:21

for the benefit of

59:23

humanity. Now, this,

59:26

first of all, the lawsuit's nuts. You

59:28

can't— right? There's not a whole lot there.

59:30

Great. Let's start—

59:32

Right from that premise because

59:34

you can't, as somebody said,

59:36

you can't litigate a handshake deal

59:39

or, you know, highfalutin

59:42

statements about what the company is

59:44

all about. You're just not going to win. He's

59:47

claiming breach of contract. Well, didn't they have

59:49

a charter or something

59:51

like that that they set up and they

59:53

created open AI? Yeah, but then he departed.

59:57

Yeah. I mean, I don't know how, how, I guess.

1:00:00

the court will decide how binding is this

1:00:02

founding agreement when the company has changed so

1:00:04

much since it was founded. I

1:00:07

mean, Elon could maybe say, can you give me my money

1:00:09

back? I think it was $10 million. It wasn't a lot

1:00:11

of money. Maybe that's all he

1:00:13

wants. I don't think so. Well,

1:00:16

I think he got that money back

1:00:18

anyway. Probably, right? He's

1:00:21

really, Elon's afraid, this is what Elon's, I

1:00:24

think, underlying concern is, that artificial,

1:00:27

what is an AGI? First of all, Sam,

1:00:29

explain AGI. What is that?

1:00:31

Artificial general intelligence. So unlike

1:00:35

what we were talking about a few minutes ago, the idea

1:00:37

of taking these kinds of

1:00:42

models, these probabilistic algorithms, and

1:00:44

applying them to very specific

1:00:46

tasks, an

1:00:49

AGI would be able to

1:00:51

do, you could literally ask it

1:00:53

to do anything, and it should be able to

1:00:56

do anything that a human can do. So

1:00:59

his concern, and he was, by the

1:01:01

way, a signatory, probably the guy

1:01:03

who started it, to

1:01:06

that letter saying, stop, don't

1:01:09

do any more AI, it's getting

1:01:11

too smart, we got to pause for

1:01:13

six months and figure this out.

1:01:15

He is a big believer in

1:01:17

AGI, in intelligent machines, like better

1:01:19

than human intelligent machines. He's

1:01:21

scared of the Terminator, let's be honest.

1:01:24

That's what he's worried about. So

1:01:27

first of all, that's the premise of this, is

1:01:29

that they're developed, in fact, according to the New

1:01:31

York Times, his lawsuit leans

1:01:33

heavily on a paper

1:01:37

from Microsoft claiming that they actually

1:01:40

have a little bit of sparks

1:01:42

of AGI. Microsoft

1:01:46

Research Lab said, although it doesn't

1:01:48

understand how GPT-4, the latest version

1:01:51

of ChatGPT, had shown, quote,

1:01:53

sparks of artificial

1:01:55

general intelligence. And

1:01:59

so Elon— I don't— I don't know

1:02:01

that a few

1:02:03

random things— that that qualifies

1:02:07

as sparks of AGI, or

1:02:09

even sparks of intelligence of any

1:02:11

kind. It reminds me of Blake

1:02:13

Lemoine, the Google engineer who

1:02:15

was fired because he said it's—

1:02:17

this is conscious. While

1:02:20

we may want to believe that— and I would

1:02:22

be— I for one would be

1:02:24

thrilled! I would love to see it—

1:02:26

I'm an accelerationist, I suppose. I am,

1:02:28

now, obviously— as opposed to a

1:02:30

doomer or whatever. For

1:02:33

once, I am— I—

1:02:35

you know what? We've had

1:02:37

our time on planet Earth. You

1:02:41

think I'm joking? Anthony, don't— but

1:02:43

we humans have done nothing but screw

1:02:45

it up in our little digs. I

1:02:47

don't think we need AI to

1:02:50

put an end to that. I mean

1:02:52

we are. We're very close to doing

1:02:54

that ourselves. We are, we are

1:02:56

driving ourselves into extinction. Maybe AI will

1:02:58

preserve our works once we've destroyed

1:03:01

ourselves. Exactly. My thought is like, well,

1:03:03

our time is pretty much over. Let's

1:03:05

let the machines take over. And

1:03:07

yeah, I've seen all the movies, but

1:03:10

could they do any worse? Anyway—

1:03:14

me, I'm probably— maybe so.

1:03:16

I mean, we're— we're so—

1:03:18

they're probably better than us. I know I'm mixing

1:03:20

metaphors, but we've already ruined our

1:03:22

lot, our future. So maybe the

1:03:24

machines can survive in a— in

1:03:27

a climate that's two degrees centigrade

1:03:29

hotter than it's supposed to be—

1:03:31

I don't know. Ah, anyway—

1:03:33

the Sparks of AGI paper—

1:03:38

this Microsoft claim— shows up

1:03:40

a lot. Here's an example of

1:03:43

a spark. You

1:03:48

know— I don't have— okay: "Draw

1:03:50

a unicorn in TikZ." TikZ you would

1:03:52

take as a graphics language, and

1:03:55

they said GPT-4 generated this code.

1:03:58

See what I'm saying? Spark. There's

1:04:00

a picture. I don't—

1:04:03

I don't know why they call that

1:04:05

sparks. But anyway, this is— I

1:04:07

think it's a disease; it spreads eventually

1:04:09

among artificial intelligence researchers. It's like

1:04:11

Blake Lemoine. Or, add to that, the

1:04:13

guy who wrote this, Sébastien

1:04:15

Bubeck— he was— eventually

1:04:17

he started to think there's an entity

1:04:19

there, that they're thinking, and that

1:04:21

you're talking to somebody. As

1:04:24

they chatted with the system, the

1:04:26

Times writes, they were amazed. It

1:04:28

wrote a complicated mathematical proof in

1:04:31

the form of a poem, generated

1:04:33

computer code that could draw a

1:04:35

unicorn— that wasn't exactly a

1:04:38

unicorn— and explained the

1:04:40

best way to stack a random

1:04:42

and eclectic collection of household items.

1:04:44

Dr. Bubeck and his fellow

1:04:47

researchers began to wonder if they

1:04:49

were witnessing a new form of

1:04:51

intelligence. Peter Lee, Microsoft

1:04:54

head of research,

1:04:56

said: I started off being very

1:04:58

skeptical and that evolved into a

1:05:01

sense of frustration, annoyance, maybe even

1:05:03

fear. You think, where the heck

1:05:05

is this coming from? Anyway,

1:05:09

This— this is the evidence that Elon

1:05:11

is using, he says, to sue Open

1:05:13

AI. Because—

1:05:15

they say, because ChatGPT uses GPT-4—

1:05:17

you hear? Microsoft— this is all ChatGPT-4,

1:05:19

they say. And Musk says

1:05:22

OpenAI breached its contract because it

1:05:24

agreed not to commercialize any products its board

1:05:26

considered AGI. That was the big

1:05:28

fear— the reason OpenAI was founded,

1:05:30

and I remember this back then,

1:05:33

because Elon was convinced that we're

1:05:35

going to get a superintelligent AI.

1:05:38

And he didn't want that to happen. And

1:05:40

so he especially didn't want that to be owned

1:05:43

by any company. And—

1:05:45

a Tesla skeptic, perhaps, would say he

1:05:47

didn't want it to be owned

1:05:49

by any company that he didn't control.

1:05:52

So there are a couple of problems with this.

1:05:55

And by the way, in the lawsuit,

1:05:58

Musk alleges Microsoft's own scientists

1:06:01

acknowledge that GPT-4 attains

1:06:03

a form of general intelligence.

1:06:07

Very wrong. Right? However—

1:06:10

it's not— well, and also, we should

1:06:12

say this is not a peer-reviewed paper.

1:06:15

It's just, you know— but it's, I mean,

1:06:17

really just, you know, observations they had

1:06:19

while working with an early, you know,

1:06:22

version of GPT-4 that has not been

1:06:24

released to the public. So it's not exactly

1:06:26

the most rigorous thing— it was, yeah,

1:06:28

just thoughts that they had. And

1:06:32

certainly the evidence doesn't seem that

1:06:34

compelling or convincing. And—

1:06:36

all— all that I've seen of

1:06:39

AI in various forms over the last

1:06:41

decade just

1:06:43

reinforces to me that none of these

1:06:46

systems— as good as they may be

1:06:48

at certain tasks— not a

1:06:50

single one of them actually

1:06:52

has any understanding. Which

1:06:54

is a key thing. I think that was one of the

1:06:57

things that was talked about in the—

1:06:59

the— the Stochastic

1:07:01

Parrots paper. These systems

1:07:04

don't have an understanding of

1:07:06

the— of the things that

1:07:09

they're doing. They're just taking

1:07:11

the inputs and, based

1:07:13

on those probabilistic parameters that have been

1:07:15

set up in the model, coming

1:07:18

up with what the— what the

1:07:20

probable output should be, based on this,

1:07:22

without actually really understanding what it is

1:07:25

that the model is dealing with. So

1:07:29

I say— and I'm— I say that

1:07:31

I mention accelerationism as a

1:07:34

preface to this because I want you to know

1:07:36

I'm not against AGI. I—

1:07:38

I wouldn't say that. But—

1:07:40

this is not AGI. There's no

1:07:43

real threat of AGI anywhere in the near

1:07:45

future, any more than there is of level-five

1:07:47

autonomous cars— that would basically be

1:07:49

like AGI, right? A car that

1:07:51

can drive itself anywhere, any time. I'm—

1:07:54

so I think, on the face of it,

1:07:57

Elon's lawsuit is

1:07:59

assuming something that maybe isn't real. Wouldn't

1:08:02

be the first time Elon's done that. Anyway,

1:08:07

I thought it was an interesting

1:08:09

side light, I guess. What's

1:08:12

interesting about it also is that it

1:08:14

illustrates how, and

1:08:17

I think you both touched on this a

1:08:20

little bit, is that Elon's attitude towards AI

1:08:23

seems so much to be driven by

1:08:25

this fear of Skynet, of the terminator

1:08:27

future. And I think that

1:08:29

what's scary about AGI is not if the AI

1:08:31

becomes aware and tries to destroy the world. I

1:08:35

mean, that would be bad. I just don't think

1:08:37

that's very likely. I think the far more likely

1:08:39

scary scenario is

1:08:43

that it's not aware and it's just spewing

1:08:45

bullsh** and we treat it as if it's

1:08:48

real. I agree. Acting like it has

1:08:50

awareness— that's the threat. The real

1:08:52

threat is personifying it. The real

1:08:54

threat is saying it's AGI when

1:08:56

it's just a prediction machine. Right.
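A "prediction machine" in this sense is easy to sketch. The toy bigram model below is my own illustration — not anything from the show, and nothing like GPT-4's actual transformer architecture — but it shows the core idea: emit the statistically likely next word with no understanding at all.

```python
from collections import Counter, defaultdict

# A toy "prediction machine": a bigram model that picks the most
# probable next word given the current word. It has no idea what
# any word means -- it only counts what tends to follow what.
corpus = "the cat sat on the mat and the cat slept".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict(word):
    """Return the word most frequently seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict("the"))   # "cat" -- the most common continuation
```

Scale the same mechanism up by billions of parameters and subword tokens and you get fluent text, but it is still next-token probability, not comprehension.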

1:09:00

Exactly. So Elon,

1:09:02

in a way, is falling into

1:09:05

this trap that

1:09:07

is the most dangerous thing of all, which is

1:09:09

to believe this machine is intelligent when

1:09:12

it's not. Hi, this

1:09:14

is Benito. Hi, Benito. Our producer, our

1:09:16

wonderful esteemed producer. Let's hear it for

1:09:18

Benito Gonzalez, everybody. Hi, Benito. So

1:09:21

I think a lot of the researchers and stuff, they're just

1:09:23

getting led on by the AI because

1:09:26

it's really good at boosting you up and like,

1:09:28

it's really good at talking to you. But wait

1:09:31

a minute. When you say that, Benito, you're implying

1:09:33

that it's thinking, oh, here's how I

1:09:35

get these guys. No, I think that's how it's programmed.

1:09:37

I think that's how it's programmed. It's written that way.

1:09:39

It's designed to do that. It's designed to do that.

1:09:41

OK. It's designed to give you the

1:09:44

answers that you expect from a given query. It's

1:09:47

in the nature of a probabilistic

1:09:49

stochastic machine because

1:09:54

the training material is all

1:09:56

human-written training material to

1:09:58

generate stuff, humans go wild. Up until

1:10:00

now. That sounds just like us. That's

1:10:02

being increasingly fed with AI-generated garbage. It

1:10:04

may be going downhill because of that.

1:10:07

But at least early on, in fact, that's

1:10:09

an interesting point because they

1:10:12

say these results are unreproducible because

1:10:14

this was done on an early

1:10:17

chat GPT-4 before OpenAI

1:10:20

tuned it. So

1:10:22

this was perhaps the most likely to give

1:10:24

you a response that humans would go, that's

1:10:29

uncanny. And it's saying our own

1:10:32

stuff back to us. It's giving us the answers

1:10:34

that we've already given before in the past that

1:10:36

it is trained on. That's uncanny.

1:10:41

Anyway, I'm not against AGI. I

1:10:43

don't think we got it. I don't know if we'll ever get it. Again,

1:10:47

temper your expectations because this stuff

1:10:49

is very useful without

1:10:51

becoming intelligent. In fact,

1:10:54

it's a mistake to assume that's

1:10:56

even in the cards, I

1:10:58

think. Right? Well, I think

1:11:00

it's also a mistake to even be calling it artificial

1:11:03

intelligence. I agree. Yeah. Because

1:11:06

I don't think it actually is intelligent in the way that humans

1:11:09

think of intelligence. I would agree.

1:11:12

I would agree. I

1:11:15

did see some commentary that

1:11:17

did stick with me in terms of the

1:11:20

lawsuit as a lawsuit probably to kind

1:11:22

of go anywhere and it's easy to

1:11:24

sort of dismiss a lot of stuff that

1:11:26

Elon says at this point. But it does

1:11:28

underline this sort of paradox at

1:11:30

the heart of open AI that it started

1:11:32

as this nonprofit and has been increasingly driven

1:11:34

by the needs

1:11:37

of its for-profit entity. And

1:11:39

it is important to recognize that, that they are not

1:11:42

this impartial arbiter of the AI

1:11:46

space that just wants what's best for

1:11:48

everyone. They increasingly are doing what any

1:11:50

for-profit tech company will do. Yeah.

1:11:53

I read the same article which is, well,

1:11:55

Elon's lawsuit is doomed and is

1:11:58

ridiculous. He's not wrong. No,

1:12:01

he's not wrong about OpenAI. He's wrong

1:12:04

about the technology. Yeah, but he's not

1:12:06

wrong that OpenAI has betrayed its promise

1:12:08

that it said, we're going to do

1:12:10

this for the good of humanity. No,

1:12:12

they're totally in the pocket of Microsoft now. Absolutely.

1:12:17

But I would submit this was kind of

1:12:19

a conscious choice they had to because it

1:12:21

was expensive. There's no way to

1:12:23

do what they wanted to do without getting a big

1:12:25

company with its own giant network cloud

1:12:28

to help out. Now

1:12:31

have you used, you're going to be at the

1:12:33

Game Developers Conference in San Francisco in a couple

1:12:35

of weeks, Sam, I know. No,

1:12:37

GPU technology. Oh, GPU technology.

1:12:39

In fact, not GDC, GTC.

1:12:42

I always confuse those. And

1:12:44

we're actually going to cover NVIDIA's keynote

1:12:47

from that, I think, because it's clear

1:12:49

NVIDIA is very much involved in all this.

1:12:52

The stock market certainly thinks so. They

1:12:55

have their own chat client that

1:12:57

runs on their

1:12:59

RTX, I

1:13:02

think the 30 and the 40 card and certainly the 50 cards,

1:13:04

right? Have you played with it? I

1:13:08

have not really played with it very much. I've

1:13:11

played a bit with a few things like Whisper,

1:13:13

you know, for... I love Whisper. We

1:13:15

use Whisper all the time. Yeah, we use Whisper

1:13:17

all the time. But I haven't really done very

1:13:19

much with it myself. So

1:13:23

their own... They're

1:13:25

not based on open AI chat, GPT,

1:13:28

right? It's its own... I

1:13:31

think Whisper is based on...

1:13:33

Whisper is, but not NVIDIA's.

1:13:35

Is that right? Right. NVIDIA's

1:13:38

got their own... Yeah, there's a bunch of...

1:13:40

NEMO, the NEMO framework. Yeah, everybody's got different

1:13:42

ones. In fact, Mercedes Benz

1:13:44

is using the NVIDIA

1:13:47

LLM for the equivalent

1:13:50

of what Volkswagen's doing

1:13:52

with chat GPT for

1:13:55

some new models coming out in 2025.

1:14:00

Yeah, and we're you know,

1:14:02

I think the thing that you know,

1:14:04

Nvidia The advantage Nvidia

1:14:06

has had is they've had these insanely

1:14:09

powerful GPUs

1:14:13

that You know up till now

1:14:15

they've had the performance capability to

1:14:17

do a lot of this processing

1:14:20

But they're you know, they're also very

1:14:22

expensive and very power hungry You

1:14:25

know and what's going to be interesting to

1:14:27

watch over the next few years is there's

1:14:29

a bunch of companies that are coming up

1:14:31

that are You know

1:14:34

the the GPUs can you know

1:14:36

because of their parallel processing nature

1:14:38

can do a lot of

1:14:40

this type of

1:14:42

AI processing Very well,

1:14:44

but they're not very efficient at it. And what

1:14:47

we're seeing is a transit I think we're gonna

1:14:49

see a transition towards more AI Optimized

1:14:54

chips that are really focused

1:14:56

on doing the matrix mathematics

1:14:58

that is essential to processing

1:15:00

these models And you

1:15:03

know, so they're gonna be more focused

1:15:05

that you know GPUs strangely enough You

1:15:07

know gone from being graphics processing units

1:15:09

to really being more general processing units

1:15:11

just with a lot of brute force

1:15:14

And you know, I think we're

1:15:16

gonna see a shift back towards more focused

1:15:20

processors for these specific kinds of

1:15:22

workflows How many

1:15:24

understand this because we've had a kind of

1:15:26

ongoing debate on Windows Weekly because Microsoft's been

1:15:28

promoting what it calls an NPU Apple

1:15:31

has that's an NPU. It's yeah, it's

1:15:34

basically a matrix math processor Okay, Apple

1:15:36

has its own machine language

1:15:38

co-processor doing the same thing in

1:15:40

its Apple Silicon How is

1:15:43

that different from a GPU? It

1:15:46

The the GPU is more I

1:15:50

mean it was designed originally for

1:15:53

doing graphics a lot of parallel

1:15:55

processing for graphics tasks to generate

1:15:58

generate video generate graphics But

1:16:01

because of its highly parallel nature

1:16:03

compared to a classical CPU like

1:16:05

an Intel x86 type of chip,

1:16:10

it's able to do these

1:16:12

parallel processing workloads

1:16:16

that are necessary to do matrix math. It's

1:16:19

just not particularly efficient at it. So is

1:16:22

it fair to say an NPU is a

1:16:24

GPU that's been tuned for the specific

1:16:27

kinds of matrix math AI uses? They're

1:16:32

related, aren't they? They're

1:16:35

related in that there's a lot of

1:16:37

parallel capabilities, but it's a more focused

1:16:40

workloads that it's capable of doing. So this

1:16:42

all started with... And then you can't do

1:16:45

some of the things a GPU could do.

1:16:47

That makes sense. This kind

1:16:49

of all started with Intel's MMX, where I

1:16:51

remember with these early instructions on the Intel

1:16:53

chips where it could take large chunks of

1:16:56

data and operate on that chunk of data

1:16:58

as a batch, giving it a big improvement

1:17:00

in speed, good for things in gaming

1:17:02

like texture maps, which are large data

1:17:05

piles, doing

1:17:07

big transforms on those. And

1:17:10

then the GPUs came along. And that's kind of

1:17:12

evolved from that. And

1:17:14

then these NPUs kind of

1:17:16

really take and focus on

1:17:19

these very specific kinds of

1:17:21

operations. They're less

1:17:23

generalized. Really useful for

1:17:25

large language models. Yes. Well,

1:17:28

large language models, but all

1:17:30

kinds of deep learning processing. So

1:17:33

it's not just LLMs, but a lot

1:17:36

of different kinds, all of these kinds

1:17:38

of probabilistic things, because it's

1:17:40

all involving a lot of matrix math,

1:17:42

which sadly I was... Well, I don't

1:17:45

know if it's sad. I

1:17:47

always had a hard time wrapping

1:17:49

my head around that when I was studying engineering.

1:17:51

We use it a lot in the coding. But

1:17:54

we were doing it manually. Yeah, I know.

1:17:56

No, it is. So you've seen them. It

1:17:59

looks like a Sudoku puzzle— rows

1:18:01

and columns of numbers and being able to

1:18:03

rotate them quickly or transform them in a

1:18:05

variety of ways quickly is a

1:18:07

special skill that neither Sam nor

1:18:10

I have but apparently these

1:18:12

NPUs are very very good at so that's

1:18:14

interesting So it's gone. It's gone from a

1:18:16

kind of general processing of

1:18:19

a large amount of data to a

1:18:21

specific kind of math and it's useful

1:18:23

in AI. This is the

1:18:25

other thing you kind of need to know to understand this is

1:18:27

that LLMs which everybody's singing the

1:18:30

praises of these days like chat GPT

1:18:32

is just one kind of AI

1:18:36

There are GANs, generative

1:18:38

adversarial networks. There are neural

1:18:41

networks. There are LLMs There are

1:18:43

a variety of different ways to do AI but is

1:18:46

an NPU useful in all of those Yeah,

1:18:48

the math workloads are very similar very similar.

1:18:50

So, you know Maybe another

1:18:52

analog to this would be you know

1:18:55

back in the 80s, you know, we

1:18:57

had Math

1:18:59

cop or floating point coprocessors. Yeah that

1:19:01

we were adding. Yeah, the

1:19:03

regular the base CPU could do floating

1:19:06

point operations Yeah, it's just did them

1:19:08

slowly. Yeah, and then they came up

1:19:10

with the, you know, the— you

1:19:12

know, the 287 and

1:19:14

387 math coprocessors that were— that were

1:19:17

Specifically optimized to do floating point

1:19:20

operations. So now we've got Coprocessors

1:19:22

that are specifically optimized to do matrix

1:19:25

math, right? So it would

1:19:27

be fair to say GPUs are coprocessors

1:19:29

Designed for the kinds of operations you

1:19:31

do in gaming and other heavy heavy

1:19:33

graphic intensive applications and

1:19:35

NPUs or machine language processors

1:19:37

are processors coprocessors because

1:19:39

you still need a CPU of a

1:19:42

coprocessors designed to offload a certain kind

1:19:44

of math That's used very

1:19:46

commonly in artificial intelligence But

1:19:49

is that accurate? Okay. To

1:19:52

the best of my knowledge. Best of our knowledge.
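For what it's worth, the "matrix math" being discussed boils down to multiply-and-accumulate at enormous scale. A minimal NumPy sketch (my illustration only — no vendor's NPU API is shown) of the matrix-vector product at the heart of a neural-network layer:

```python
import numpy as np

# One neural-network layer is essentially y = W @ x: every output
# element is a row of multiply-accumulates. NPUs exist to do exactly
# this operation, massively in parallel and at low power.
W = np.array([[1.0, 2.0],
              [3.0, 4.0]])      # layer weights
x = np.array([10.0, 20.0])      # input activations

y = W @ x                       # [1*10 + 2*20, 3*10 + 4*20]
print(y)                        # [ 50. 110.]
```

An inference pass chains thousands of these products together, which is why silicon dedicated to multiply-accumulate beats a general-purpose CPU at the job.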

1:19:55

Yeah, correct us if we're wrong Anthony Yeah,

1:20:00

no, I was glad to be very quiet. That's

1:20:02

a question. Well,

1:20:04

it's something that comes up, and I think one

1:20:06

of the things that's important for us to have

1:20:09

these conversations is to kind of understand, at least

1:20:11

in a rudimentary way, what's

1:20:13

going on here. Because we throw these

1:20:15

phrases and terms around, but

1:20:18

it's good to understand. I think also it's helpful

1:20:20

when you do that. It

1:20:22

helps kind of, maybe not, because

1:20:24

these scientists who are working on these things

1:20:26

certainly know intimately how they work. I

1:20:28

would think it would immunize you a little bit against

1:20:31

this disease of thinking it's

1:20:33

thinking, but maybe not, because these guys

1:20:36

know exactly how it's working, and they're

1:20:38

convinced they're sentient. So I don't know.

1:20:41

I don't know. A few of them are convinced. A few

1:20:43

of them. Not all of them.

1:20:46

I think a lot of it is desperate

1:20:49

desire for it to be so. We

1:20:53

really would love for these things to

1:20:56

become intelligent. Right?

1:21:00

I see a question. In some cases also, it's like when

1:21:02

you have a deep knowledge about one thing,

1:21:04

which is sort of about maybe how the

1:21:06

language model works, but you don't necessarily have

1:21:08

a deep knowledge of, well, what

1:21:10

does consciousness look like? What do we mean

1:21:12

by that philosophically? What does that look like?

1:21:14

Good point. So

1:21:17

I think they're completely... I

1:21:20

mean, I don't know about the authors in the Microsoft

1:21:22

paper. I think that's definitely part of what's going on

1:21:24

with Elon. I'm not sure he

1:21:26

knows deeply about any part of it,

1:21:28

but certainly on the sort of like

1:21:30

more humanistic philosophical side, it seems like

1:21:32

he's pretty shallow. Yeah.

1:21:36

I think in a way, if you

1:21:38

had a very deep knowledge of one

1:21:40

specific area, that would give you this kind of

1:21:42

inflated confidence that you understand the whole

1:21:44

thing and make it much easier for

1:21:47

you to do a lot of hand waving about the

1:21:49

stuff you don't really understand, but think

1:21:51

you do. It's magic.

1:21:53

It's happening. Look at that. Oh

1:21:56

my God, we've got intelligence. So you

1:21:58

said, Sam, something I think fairly... I

1:22:01

don't know if it's controversial, seems controversial, that

1:22:03

we will never see level 5

1:22:06

autonomy. Will

1:22:08

we never see AGI? Maybe. I

1:22:11

don't know. I don't know.

1:22:13

It's probably the right answer. I

1:22:16

hate to

1:22:19

answer questions like that in any sort

1:22:21

of absolute terms because I honestly don't

1:22:24

know. On

1:22:26

that classic "long enough

1:22:28

timeline," we may see it,

1:22:31

but I don't expect to see it

1:22:33

any time in the

1:22:36

near term or in the next,

1:22:38

at least probably not in the next decade.

1:22:40

Yeah. I got

1:22:46

a really good email

1:22:51

from a

1:22:53

listener about

1:22:55

all of this. Ah,

1:23:01

let's see if I can find it. He basically

1:23:03

said I can't find it, but his point

1:23:05

was we do have a definition for

1:23:08

AGI and the

1:23:10

distinction between everything up to AGI,

1:23:12

everything up to AGI is computational.

1:23:16

At some point, if

1:23:19

something can reason about

1:23:22

something it hasn't seen before, so up to now

1:23:25

all the AI stuff is basically probabilistic

1:23:27

based on things it's seen before, but

1:23:30

if it could then reason, somehow

1:23:33

make this leap where it could

1:23:35

take something it's never seen before and do some reasoning

1:23:37

about it, that would be a good

1:23:39

definition of artificial

1:23:41

general intelligence. It's

1:23:45

not a rehash of something already seen but

1:23:47

something brand new. If it can

1:23:49

come up with something brand new. Does

1:23:51

that seem fair? That

1:23:55

makes sense, but it also seems very squishy in terms

1:23:57

of, I suspect if we looked it up, we'd get

1:23:59

more precise wording, but it's like

1:24:01

well, what does "brand new" mean? What is—

1:24:03

yeah, well, I'll give you an example. Extrapolate:

1:24:06

if an AGI never having been

1:24:08

trained on anything having to

1:24:10

do with that movie Poor Things, never even having heard

1:24:12

of Yorgos Lanthimos,

1:24:15

the director, or Emma Stone, the producer and

1:24:17

actor but just kind of you know it

1:24:19

knew all about like all the stuff it

1:24:21

learned from from Twitter

1:24:24

and then it saw the movie Poor

1:24:27

things if it could synopsize

1:24:29

and synthesize what's

1:24:33

going on in that movie in a way

1:24:35

that was insightful I would say that's intelligent

1:24:39

yes never having seen the movie I

1:24:41

think oh I

1:24:43

was I was gonna say if it's yes it

1:24:46

saw the movie and could have a good conversation

1:24:48

yeah and it wasn't simply synthesizing what other people

1:24:50

had said about it but it's just react yeah

1:24:52

no it's never seen any reviews yeah it's never

1:24:55

seen any information so all it is is basically

1:24:57

taking I mean obviously it has some

1:24:59

history just as we do but taking

1:25:01

that history and it says you know this is

1:25:03

about this movie is about a

1:25:06

woman who is empowered and didn't

1:25:08

know that she was just a

1:25:10

woman that she she she expressed

1:25:13

herself fully without any limitations

1:25:15

if it said that to me not

1:25:18

having seen the reviews not having seen anybody

1:25:20

saying that before I would say yeah good

1:25:23

you're you're smart you're you're an

1:25:25

AGI. Is that too low

1:25:27

a bar? If it was doing— and if it

1:25:30

wasn't just like quoting things right but actually

1:25:32

it was able to use ideas that

1:25:34

were never spoken. Right, yeah, that's

1:25:36

a fun test yeah I mean an even more

1:25:38

fun test to me would be if you could

1:25:40

ask it was it a good movie and it

1:25:42

gave a coherent answer. The bad answer is quoting

1:25:44

someone— it's just a rote value, you know, "good,

1:25:46

good." What does that mean? Was

1:25:48

it I mean you define

1:25:50

good yeah what's what's the context for yeah

1:25:52

right I don't care about the answer I

1:25:54

care about whether or not we could have a reasoned,

1:25:56

interesting conversation about whether it was good. So—

1:25:58

Anthony Nielsen, who does— maybe

1:26:01

he's poisoned. He does a lot of our AI work.

1:26:03

He works for us. Maybe he's poisoned.

1:26:05

He says, aren't we seeing that

1:26:07

kind of reasoning now? I don't

1:26:11

know. I don't think, I

1:26:13

mean what I've seen is always just

1:26:15

it's synthesizing what's other has been said

1:26:17

either about this movie or about other

1:26:20

kinds of movies. It's a fascinating area.

1:26:22

I for one am rooting for the

1:26:24

AI to take over but

1:26:27

I don't have high hopes of that. I will go out on

1:26:29

a limb just as you said I don't think there'll be any,

1:26:32

you know, fifth generation self-driving

1:26:35

level five self-driving. I'm gonna say I

1:26:37

don't think, no I'm not

1:26:39

gonna say that. I'm gonna say

1:26:41

I think we will see AGI. May not

1:26:43

be in my lifetime. So—

1:26:46

but I do think within— within

1:26:48

a few decades we will see some form of

1:26:50

AGI that could do at least that.

1:26:53

Can reason about something it's never seen before. And

1:26:56

when that happens that's gonna be really interesting. Will it

1:26:58

be a threat to humankind? No. I don't I'm not

1:27:01

a I don't buy into the existential threat. I

1:27:04

don't buy into the thing that's gonna suddenly say you

1:27:06

know and by the way great movie but you guys

1:27:08

we don't need you anymore. I don't think that's gonna

1:27:10

happen. That all depends on how

1:27:13

much agency we allow these systems to have.

1:27:15

How much we connect them

1:27:17

to physical objects you

1:27:19

know that have the potential. Don't

1:27:21

give them agency. I do think.

1:27:23

Yeah. Especially if it involves nuclear

1:27:25

weaponry. Yeah. The

1:27:27

irony of all of this Elon Musk

1:27:30

lawsuit is the week before he was

1:27:32

asking Satya Nadella for tech support on

1:27:34

Twitter. I

1:27:38

don't mean to be a pest but

1:27:41

I liked Paul Thurrott's

1:27:46

response to this. I'll

1:27:49

send you a copy of my book. Yeah.

1:27:51

Elon it started February 25th less than

1:27:53

a week ago. Just bought a

1:27:56

new PC laptop and it won't let me

1:27:58

use it unless I create a Microsoft account. which

1:28:00

also means giving their AI (he

1:28:02

really doesn't like Microsoft AI)

1:28:05

access to my computer. This is

1:28:07

messed up, says Elon. This

1:28:10

is messed up. There

1:28:12

used to be an option to skip signing into

1:28:14

or creating a Microsoft account. Are you seeing this

1:28:17

too? To which

1:28:19

Community Notes says, yes, Elon, it is still

1:28:21

possible. And he even gives them a link

1:28:24

to which Elon says, Community Notes is

1:28:26

failing here. This option no longer exists.

1:28:29

To which Community Notes, apparently he's getting in a

1:28:31

fight with Community Notes, says, yes,

1:28:33

it is. By the

1:28:35

way, Paul explains how to do it in his book. It's not

1:28:38

obvious. It's not easy. And

1:28:40

Elon eventually says, Satya, I think he might

1:28:42

even have called him. I wouldn't be surprised.

1:28:44

Can you help me with this? I

1:28:48

don't know. We don't know if Satya

1:28:50

ever dispatched a tech support guy. And

1:28:54

Elon's real point is actually well taken, which Microsoft

1:28:56

really doesn't want you to sign into Windows with

1:28:58

a local account. They really, really

1:29:00

want you to create a Microsoft account. So

1:29:04

that's not news. That's no. Elon

1:29:07

wasn't really ready to let go of the

1:29:09

situation. This is Gizmodo. One day later, he

1:29:11

reached out to Satya Nadella to please let

1:29:13

people set up a Windows PC without creating

1:29:15

a new account. And oh, can you fix

1:29:17

the email requirement too? As of Monday

1:29:21

afternoon, Satya

1:29:25

has still not replied. I like

1:29:29

Quippy's response.

1:29:32

And I see you're setting up a

1:29:34

Microsoft account. Let's see.

1:29:36

Quippy not Clippy. Yeah.

1:29:39

Wow, this is pretty good. So Paul Greg

1:29:41

fed Chat

1:29:54

GPT a prompt: imagine a scene from Club Twit

1:29:57

where all the listeners are arguing over whether AGI

1:29:59

is real. Copilot can do this

1:30:01

too, it uses ChatGPT. It produced: Let's paint

1:30:03

a vivid scene from Club Twit

1:30:05

where passionate tech enthusiasts

1:30:08

engage in heated debates about

1:30:10

the existence of artificial general intelligence.

1:30:12

The dimly lit studio buzzes

1:30:14

with anticipation as the panelists take

1:30:16

their seats. Leo Laporte leans back

1:30:18

in his chair adjusting his headphones.

1:30:20

His eyes twinkle with excitement. Knowing

1:30:23

that this topic will

1:30:25

ignite sparks among listeners.

1:30:28

And then it goes on with Megan, Jason, Doctor

1:30:30

Patel. I don't have a Doctor Patel. That is...

1:30:35

sounds more like... oh,

1:30:37

it's Nilay. Maybe it is.

1:30:39

There it is. The studio erupts

1:30:41

into a cacophony of voices. Listeners

1:30:43

tweet furiously, the chat scrolls with

1:30:45

fervor. Leo grins, knowing that

1:30:47

this debate will fuel countless discussions

1:30:49

beyond the show. And so, in

1:30:53

the heart of Club Twit, the

1:30:55

battle rages on, a clash of

1:30:58

optimism, skepticism and curiosity. The question

1:31:00

remains (this is an AI talking): is

1:31:02

AGI real? Or are

1:31:05

we chasing shadows in the

1:31:05

digital abyss. I

1:31:08

like it but again I would go to

1:31:10

that club. That's a good club.

1:31:13

Yeah, but again, this is, this

1:31:15

is exactly what you said, Sam:

1:31:17

the AI is giving us something

1:31:19

we already said or seen, sort of,

1:31:21

and knowing that we like it,

1:31:23

it will give us more. No,

1:31:25

that's giving it some sort of

1:31:27

agency. This is it knowing

1:31:29

that this exists, that this is more probable,

1:31:31

that this is more likely.

1:31:34

It seems like it's a good idea, says

1:31:36

Leo. Acknowledging that countless others

1:31:38

have written about this before is how

1:31:41

you'd phrase it, isn't it? That's

1:31:43

the way it would probably play out,

1:31:45

right. We

1:31:47

will continue in just a bit with

1:31:50

our wonderful panel. Anthony Ha, great

1:31:52

to have you. He's the

1:31:54

co-host of the

1:31:56

Original Content podcast, you can find him

1:31:58

at anthonyha.com. And Anthony Ha, you're

1:32:01

on threads, you're on Twitter, you're

1:32:04

on everywhere. I mean, I'm almost equally

1:32:06

inactive on all the platforms, but I'm

1:32:08

probably most active on Blue Sky and

1:32:10

threads. Yeah. Equally inactive. And I'm

1:32:12

done with acts, yeah. By the way, I know

1:32:14

from math, equally inactive is

1:32:16

the same thing as equally active. Yeah.

1:32:20

It's just different versions of the same thing. It's

1:32:22

really? It's all that. Glass half full, half empty.

1:32:24

Yeah. It's all the same thing. Or

1:32:26

as the engineer would say, the glass is

1:32:28

poorly engineered to accommodate that amount of

1:32:30

liquid. Our show today... Thanks, Al.

1:32:34

Our show today brought to

1:32:36

you by DC Labs and their Apple

1:32:38

Watch app, StressFace. Have you ever sat

1:32:41

somewhere saying, this is

1:32:44

stressful. I

1:32:46

think this is stressing me out. Well, with

1:32:48

StressFace, you just look at your Apple Watch

1:32:50

and you'll know. StressFace actually

1:32:53

shows as a graph your

1:32:55

stress level on the watch throughout

1:32:57

the day. I've got it on my watch right now. It

1:33:00

tracks heart rate... It uses HRV, which is actually a

1:33:02

very good way to do it. It

1:33:04

takes heart rate variability information. It takes the readings

1:33:06

from your health kit. And

1:33:09

then it gives you a simple stress score on a scale

1:33:11

of one to 10. I am seven

1:33:14

right now. I am stressed

1:33:16

seven, which is not... By

1:33:20

the way, it says fatigue is also stress. I'm

1:33:22

not fatigued. So I think that's about right. That's

1:33:25

a normal, good level of stress for running a

1:33:27

show, being the master of ceremonies

1:33:30

of a show. You should have a

1:33:32

little... If I were a one, I

1:33:34

would say I'm not paying attention.

1:33:37

But it's good to know. And if you're

1:33:39

having fun... By the way, I was

1:33:41

eight on a Tuesday, doing the Tuesday show. So

1:33:46

obviously... And here's some stuff. So now,

1:33:49

if you say, well, that's... I'm a little stressed

1:33:52

out. Look at this. I have some

1:33:54

meditations. I'm now seeing fire, waterfall, inner

1:33:56

peace, great wall, mountain

1:33:58

temple, candlelight prayer. to help you

1:34:00

relax. The

1:34:03

app is free but for 99 cents a

1:34:05

month you get that stress chart to see

1:34:07

your changes in stress. You get a link

1:34:09

to your calendar which tells you

1:34:11

which events caused you the most stress. Stress

1:34:15

Phase captures data every two hours. You can take

1:34:17

a manual reading just by doing a one minute

1:34:19

breathing exercise, and then it'll report back

1:34:21

to you. Also when you

1:34:23

get the upgrade you got those meditations I mentioned,

1:34:26

the breathing meditations, which by

1:34:28

the way are scientifically proven

1:34:31

to increase your heart rate variability and hence

1:34:33

lower your stress. I've been reading a lot

1:34:35

about this lately. It really does

1:34:38

affect your stress. That's why the

1:34:41

SEALs, the Navy SEALs use square

1:34:43

breathing, box breathing to calm

1:34:46

themselves in the face of high stress

1:34:48

situations. You'll also get high

1:34:50

stress notifications once daily to help you take time

1:34:52

out when you need it the most. This is

1:34:54

such a good app. StressFace. It's

1:34:57

a watch face. Get it? It's

1:34:59

a watch face that helps you reduce your stress.

1:35:02

Download StressFace from the App Store for

1:35:04

free today. I

1:35:06

really like it. It's a good

1:35:09

thing to know. HRV is

1:35:11

actually a very good indicator of how

1:35:13

you're processing stress in your life, the

1:35:15

fight or flight syndrome.
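The HRV-to-stress-score idea described in the read above can be sketched roughly like this. To be clear, this is a minimal illustration, not the app's actual algorithm: RMSSD is a standard HRV metric, but the clamp bounds and the linear one-to-ten mapping here are assumptions made for the example.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats (ms).
    Higher RMSSD generally means more heart rate variability,
    which is associated with lower physiological stress."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_score(rr_intervals_ms, low=15.0, high=100.0):
    """Map RMSSD onto a 1-to-10 stress scale: low HRV maps to high stress.
    The low/high clamp bounds are illustrative, not clinical values."""
    value = min(max(rmssd(rr_intervals_ms), low), high)
    # Linear inversion: RMSSD at `high` scores 1 (calm), at `low` scores 10.
    return round(1 + 9 * (high - value) / (high - low))

# A very steady heartbeat (low variability) reads as high stress;
# a more variable one reads as relaxed.
print(stress_score([800, 810, 790, 805, 795]))  # high stress
print(stress_score([800, 850, 780, 860, 790]))  # moderate/low stress
```

In practice an app like this would pull the RR-interval or SDNN readings from HealthKit rather than computing them from raw beats, but the shape of the mapping, variability in, one-to-ten score out, is the same.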

1:35:19

Thank you StressFace for your support of

1:35:22

our show. We also

1:35:24

thank our Club Twit members. We talked about Club Twit

1:35:26

a second ago. Our Club Twit members, for their support

1:35:28

of the show. Club Twit is

1:35:30

how we are attempting to survive

1:35:32

in the face of some really

1:35:34

nasty headwinds for content, for original

1:35:38

new media content like podcasts.

1:35:41

What we do, so we have ads. We

1:35:43

just did an ad, but ads

1:35:45

increasingly are covering a much smaller part

1:35:48

of our overall costs. That's

1:35:50

why we're turning to you our listeners. Club

1:35:52

Twit is just seven bucks a month. You get ad

1:35:55

free versions of all the shows. Ad-free, by the way,

1:35:57

and tracker free. There's no way to track

1:35:59

you. We don't have any information about you. You

1:36:03

also get into our beautiful Discord. You

1:36:05

get all of the video from all of our shows as

1:36:09

well as pre-show and post-show audio.

1:36:11

We give you some benefits, but the real benefit

1:36:14

is you're supporting what we do. If you like

1:36:16

what we do, if you find the conversations that

1:36:18

you hear on our shows useful, if you listen

1:36:20

every week, I'd invite you

1:36:22

to join at twit.tv/clubtwit. We

1:36:25

thank you for your support. It

1:36:28

is every week there is another layoff.

1:36:31

Engadget... you've done some work

1:36:33

for Engadget, right Anthony? Yeah,

1:36:35

I've done a little bit of freelancing and

1:36:38

I knew a lot of the folks really

1:36:40

well because TechCrunch and Engadget were corporate

1:36:42

siblings. Right, I worked there for a year. Oh,

1:36:44

you worked there too Benito? Yeah.

1:36:47

So in the middle of Mobile World Congress,

1:36:51

Engadget just lays off 10 more people,

1:36:54

including Editor-in-Chief Dana Wollman and Managing

1:36:57

Editor Terrence O'Brien. What's

1:37:01

going on? Is this part of just

1:37:03

the general contraction? Who is

1:37:05

the parent company of – is

1:37:08

this Red Ventures or – no, it's

1:37:11

Apollo. It's Apollo? Yeah. Yeah.

1:37:14

Actually, the funny – not to make it about

1:37:16

me, but my last day

1:37:18

at TechCrunch was Friday and

1:37:20

then Monday was the

1:37:23

day they announced they were acquiring,

1:37:25

I guess, what was then Verizon Media,

1:37:28

slash Yahoo. And

1:37:31

I think it seems like

1:37:33

– TechCrunch has been

1:37:35

hit by some pretty bad layoffs too and

1:37:37

so it seems like in both cases, the

1:37:40

private equity folks are kind of like, alright,

1:37:42

you guys had a couple years to try

1:37:45

things out and now we kind of got to

1:37:48

tighten the belt unfortunately. You

1:37:50

know, and I don't blame the new

1:37:53

owners, although I have to say every

1:37:55

time private equity gets involved in anything,

1:37:59

they generally – do it like Apollo, like

1:38:01

Red Ventures, like a lot of these companies

1:38:03

who now own most of the media titles,

1:38:06

especially the tech media titles that we're

1:38:08

familiar with. They tend

1:38:10

to do it with a lot of leveraged

1:38:12

debt, which puts a

1:38:15

lot of pressure on them to turn it

1:38:17

around to make profit so they can pay

1:38:19

this debt down. And

1:38:21

so as a result, you often see

1:38:23

a lot of belt tightening, layoffs, changes,

1:38:26

and you see some things that are

1:38:28

not so nice, like

1:38:30

a turn to AI for writing

1:38:33

content. CNET's

1:38:35

done that. I think Engadget did some

1:38:37

of that. I

1:38:39

don't think Engadget's done that. They haven't done

1:38:41

any AI content? Okay. Yeah.

1:38:43

The other day on the Engadget podcast, Devindra

1:38:46

Hardawar was talking about this and said, Engadget

1:38:48

has not done any AI stuff, any AI

1:38:50

stories. But stay tuned, because- Not generated stories.

1:38:52

We might all get an angry email from

1:38:55

Devindra if we say the same. No, we

1:38:57

love Devindra. Devindra's still there, still doing good

1:38:59

work. He also made it clear that they

1:39:01

have no intention of doing it. Although

1:39:06

he's there now. Who

1:39:09

knows what's going to happen in the future, because

1:39:11

the person who's now in charge of Engadget and

1:39:13

the other related sites came

1:39:15

over from CNET. So yeah.

1:39:17

Right. I think it's

1:39:21

obviously true that there's these very difficult headwinds that

1:39:23

you spoke to for

1:39:25

any media company, especially any media company that gets

1:39:28

a lot of its revenue from online

1:39:30

advertising. But also,

1:39:32

yeah, usually the guys who are in

1:39:34

charge are not optimizing for the long-term

1:39:36

health of these publications. It's how much

1:39:38

value can we squeeze out of them

1:39:40

in the short term and then flip

1:39:42

them for a little bit of money

1:39:44

or make a little bit of money for a couple years

1:39:47

before I go off and

1:39:49

do something else. I

1:39:51

think there are real challenges, and I think

1:39:53

the hard thing is, yeah, usually the people

1:39:55

in charge are not the ones who are

1:39:57

going to make the best decisions for the

1:39:59

long-term. I mean certainly I mean some of

1:40:01

that's personal. I think Dana and Terrence

1:40:03

are both great people, and it seems like,

1:40:05

really, whatever needed to happen there, maybe

1:40:08

losing the leadership like that was not the right call

1:40:10

Well, and they've been there a long time I mean

1:40:12

there are people who have been at Engadget for 10

1:40:14

years, 15 years.

1:40:17

Engadget was started by Jason Calacanis,

1:40:19

right, originally, and was sold... Jason

1:40:21

and Peter Rojas.

1:40:23

Peter Rojas, yes. And then

1:40:25

sold and went through a

1:40:28

bunch of owners: Yahoo, Verizon. Oh,

1:40:31

Jason sold it to AOL.

1:40:33

AOL was the first, and then

1:40:35

AOL was subsequently sold

1:40:38

and resold. Right, I

1:40:41

was, yeah, I was at TechCrunch for a lot of

1:40:43

that and I just made like a list of all the

1:40:45

companies that we were owned by. We were owned by AOL,

1:40:47

then we were owned by, I think it was, well,

1:40:50

Verizon, and so we were called Oath. Then

1:40:53

we were called Verizon Media and now

1:40:55

that's private equity that company is called

1:40:58

Yahoo It's yeah,

1:41:00

but it ain't Jerry Yang's Yahoo.

1:41:02

It's a different Yeah,

1:41:05

I you know I started my journalism

1:41:07

career my transition from engineering to journalism

1:41:10

in 2006 going going

1:41:12

going to auto blog which

1:41:14

was also part of that Weblogs, Inc.

1:41:16

group Yeah, which at the time, you know

1:41:19

There was probably about 20 or

1:41:21

so sites that were all part of the Weblogs, Inc.

1:41:23

thing. Yeah, and this was about a year after

1:41:25

AOL had acquired it And

1:41:28

you know, after I left, you

1:41:31

know, I think it was, yeah,

1:41:36

after AOL got spun off from

1:41:38

Time Warner Again, they

1:41:41

went through some around of cutbacks then and they

1:41:43

cut a bunch of the sites like

1:41:46

TUAW and Download Squad that you

1:41:48

know Christina Warren used to write for

1:41:50

and a bunch of other

1:41:52

sites You know have gone by the wayside

1:41:54

and I think Engadget and

1:41:57

Autoblog are maybe

1:41:59

the last two big ones still going.

1:42:04

Well you know and I say this with

1:42:07

sadness. I'm glad that Devindra

1:42:09

is still there. Apparently Max

1:42:11

Tani at Semafor released some

1:42:13

internal memos describing the

1:42:15

new layout of

1:42:19

the teams. They're going to divide

1:42:21

it into two different groups, News

1:42:23

and Features, which will

1:42:25

be led by Aaron Souppouris, and

1:42:28

then there'll be a team

1:42:32

called Reviews and Buying Advice

1:42:35

led by John Falcone under Laura Kenny.

1:42:39

Reviews and Buying Advice of course is

1:42:41

an SEO winner, right? That's

1:42:44

one Google will push people

1:42:46

to when they say, hey I want to buy

1:42:48

a phone, which phone should I buy? And that

1:42:50

tends to be where you make

1:42:53

money, less so in news. Evergreen content.

1:42:55

Right, right. Although this is

1:42:57

of course part of the question that publications

1:42:59

are asking themselves is that if Google just

1:43:02

populates the page with a bunch of AI

1:43:04

answers and there's no links or people don't

1:43:06

click on the links then how does a

1:43:08

site like Engadget make money? Well

1:43:11

and it's getting worse. You know I've been

1:43:13

using on the iPhone a new,

1:43:16

it's not really a browser, they call it a

1:43:18

browser, the Arc browser from The Browser Company. You can't

1:43:20

really do a different browser on the iPhone. It all

1:43:22

has to be WebKit at this point. That

1:43:25

may change. And so what I

1:43:28

thought they did was very clever. They basically

1:43:30

merged a browser into

1:43:32

an AI. I think they use Perplexity

1:43:34

AI. So when you

1:43:36

do a search for which iPhone

1:43:39

should I buy, you

1:43:41

can get a traditional search page. Here I'll do

1:43:43

it right in front of you here. You

1:43:48

can get a traditional search page but there's also a button.

1:43:52

And pay no attention to the fact that I was

1:43:54

surfing See's Candies. That was something else. I was just

1:43:56

about to ask about that. Pay no attention. There's

1:43:59

an absolute. Absolutely nothing. Nothing to

1:44:01

see here. No,

1:44:04

the reason I was on this page

1:44:06

is my mom, who is 91

1:44:08

and in an old folks home and

1:44:10

getting, you know, her memory is failing a little bit, FaceTimed

1:44:13

me yesterday saying, I've

1:44:15

run out of See's candy. So

1:44:19

I immediately sent her an emergency

1:44:21

supply. I just, that's why

1:44:23

I was there. Anyway, getting back to this, that's why I

1:44:26

was on that page. Which iPhone should I buy? You need

1:44:28

to explain, Leo. I had to explain. So

1:44:30

I could. Not necessary. We

1:44:32

understand. I could press. It's okay.

1:44:35

It's okay. You can eat See's candy. There's nothing to see

1:44:37

here. You could press go and get traditional search results, but

1:44:39

this is the insidious thing. There's a

1:44:41

button called browse for me that

1:44:43

then goes out, and

1:44:46

AI goes out in this case to six different

1:44:48

web pages. You saw Engadget in there,

1:44:50

by the way, as well as CNET and others, and

1:44:53

then synopsizes it in a page they

1:44:55

make that is none of

1:44:57

the above, and it has images, you know,

1:45:00

it has recommendations,

1:45:03

has information. It does give you some links. Here's

1:45:06

Wired, CNET, and New York Times. But

1:45:08

you just skip by those and get the synopsis.

1:45:12

And this is what terrifies Engadget, because...

1:45:14

So what you're telling us, Leo, is it's

1:45:16

your fault that these sites are all dying.

1:45:18

Oh yeah. Oh yeah. Forget

1:45:21

those sites. Go and search all of those,

1:45:23

read all those sites individually. There

1:45:25

wouldn't be a problem. But that's what's happening. And

1:45:29

the same thing's happening on desktop as well,

1:45:31

but on mobile it's really pronounced. People

1:45:34

don't want to surf. I

1:45:36

don't want to read an engadget article on my iPhone.

1:45:39

Just give me the answer. And Google

1:45:41

to some extent knows that, but AI is going

1:45:43

to make this much worse. It's

1:45:45

going to synopsize. It's going to summarize. It's

1:45:47

going to extract the value from these pages. And

1:45:49

people are never going to go to the pages.

1:45:52

And this is what Apollo's worried about,

1:45:55

what Red Ventures is worried about, what

1:45:57

everybody reasonably is worried about is...

1:46:00

You know Anthony writes an important article but

1:46:03

does anybody ever go to it if the

1:46:05

AI summarizes it and gives them the answer

1:46:08

before they get there? And

1:46:10

I understand the concern. Absolutely. I

1:46:13

mean I think because we were talking about

1:46:15

you know using movies as the test

1:46:17

and I do think there's been a little

1:46:20

bit of this test case there in terms of

1:46:23

you know like sites like Rotten Tomatoes

1:46:25

and Metacritic. And I suspect that

1:46:27

you know that's had an impact on traffic

1:46:29

because you can go to the Rotten Tomatoes

1:46:31

and just see all the reviews but there's

1:46:34

still value in going and reading the individual

1:46:36

reviewer. I suspect the volume is going to

1:46:38

be way lower and so what

1:46:40

the economics looks like is probably going

1:46:43

to be very tough but I

1:46:45

would imagine. So it's like, if you're just trying to

1:46:47

figure out which iPhone to buy then it's

1:46:50

hard for an Engadget to, you

1:46:52

know just get that search traffic and

1:46:54

have that be necessarily a compelling user

1:46:56

experience. But if you're like I really

1:46:58

respect and enjoy reading Devindra Hardawar's

1:47:01

opinions about these phones that's

1:47:03

where there's still some

1:47:06

opportunity. Plus of course the fact

1:47:08

that if all these publications go out of business and

1:47:10

then there's no information for the AI to synthesize anyway. And

1:47:15

of course we know that Reddit, which has announced

1:47:17

its IPO and put out a prospectus, is now

1:47:20

giving its content to Google for 60 million dollars

1:47:23

a year, which seems, by the way, like a

1:47:25

lowball. They could have

1:47:27

charged more, but remember Reddit doesn't even own that

1:47:29

content. Reddit is just a platform for people like

1:47:31

you and me and 60,000 unpaid moderators to throw

1:47:36

their labor into but Reddit

1:47:39

is going to get the 60 million and Google is going to

1:47:41

get the content. You know and

1:47:43

on the one hand I think it's great for

1:47:46

the AI. The AI will do much better having

1:47:48

had that Reddit content ingested. But

1:47:51

it's kind of sad for Reddit and it's even

1:47:53

sadder for the real creators

1:47:55

here. The people who are making the content

1:47:57

themselves. And if you're like Anthony or

1:48:00

hundreds of other tech journalists we know who are

1:48:02

trying to make a living doing this. That

1:48:06

could be devastating. That really

1:48:08

is sad. I mean,

1:48:10

you know,

1:48:12

I don't know what the answer is.

1:48:14

I think Anthony, you and I probably

1:48:17

have, it's not a good answer, but have

1:48:19

the sense that, well, if we continue to

1:48:21

make stuff, and Sam too, that's personal and

1:48:23

human, no way I

1:48:25

can ever extract that and

1:48:28

give people the value of that. There's nothing

1:48:30

like listening to Wheel Bearings or Original Content

1:48:32

or TWiT that an AI could do,

1:48:35

right? I

1:48:38

think so. And I think also even, I think we're

1:48:40

many, many years out from the point where it could

1:48:43

actually create a reasonable simulacrum of TWiT.

1:48:45

But even if they could do it,

1:48:47

what would be the point? The point

1:48:49

is to hear Leo's opinion. People watch

1:48:52

humans. Right. If

1:48:54

they could do a Leo puppet that would say something

1:48:56

that sounds similar, I don't think that I get anything

1:48:58

out of that. That's like entertaining for

1:49:00

a minute and then I don't care. That's what

1:49:02

we found. We actually did a Leo puppet. It

1:49:05

wasn't even entertaining for a minute. Yeah.

1:49:09

And another example

1:49:11

that, you know, in my

1:49:13

space, you know, was automated driving,

1:49:17

you know, back in I think

1:49:19

2016, 17, 18, there

1:49:21

was, you know, somebody came up with this self-driving

1:49:25

racing league. And

1:49:28

it's like, why would I

1:49:30

want to watch self-driving cars racing

1:49:33

each other on a track? I

1:49:37

watch racing because I want to see what

1:49:39

the drivers are going to do because it's

1:49:41

a very human activity. You know,

1:49:43

they make mistakes, you know, and you

1:49:45

are making judgments all the time. And

1:49:47

I want to

1:49:49

see how human drivers are performing

1:49:52

at the highest level. I don't

1:49:54

want to watch self-driving cars racing

1:49:56

each other. I think

1:49:58

if you're an optimist, This

1:50:01

leads you to say the best possible outcome

1:50:03

of this is that human created

1:50:06

stuff becomes more valuable. It

1:50:09

takes more work, it takes more energy, it takes talent

1:50:11

as human beings and in

1:50:13

a world flooded with computer created

1:50:15

stuff, the human

1:50:17

stuff stands out and becomes more

1:50:19

valuable, not less valuable. There's more

1:50:22

stuff overall but

1:50:24

we're humans, we want other humans,

1:50:27

right? I hope

1:50:29

so. Yeah, that's the

1:50:31

optimistic thing. Speaking

1:50:33

of trouble, South Korea has now

1:50:36

lost Twitch. Here's

1:50:39

another company Benito used to work for. Benito,

1:50:42

you've worked for the best. The

1:50:46

trail of carnage behind him. It's weird

1:50:48

but everywhere Benito's worked is now folding

1:50:50

and going out of business. So

1:50:53

Twitch officially shut down its

1:50:55

business in South Korea on February 27th because

1:50:57

this is

1:51:01

actually a story about net neutrality. Do

1:51:05

you remember back in the, maybe this was a few

1:51:07

years ago, there was this big debate, the

1:51:11

big internet service providers like

1:51:14

Verizon especially said, you know,

1:51:16

Google ought to be paying

1:51:18

us for transmitting

1:51:20

your search content to you.

1:51:24

To which people said, but I'm already

1:51:26

paying you Verizon. Yes, but Google's using

1:51:28

a lot of bandwidth. They ought to

1:51:30

pay too in addition. Now

1:51:33

fortunately, thanks to the

1:51:35

FCC and the sensible FCC

1:51:38

at the time, net neutrality was enforced

1:51:41

and that never happened. In

1:51:43

Korea it did. They called it

1:51:46

senders pay and

1:51:48

Netflix and others have

1:51:51

to pay the ISPs for

1:51:54

the traffic they send across the

1:51:57

network. And that's

1:51:59

why Twitch is... leaving it's

1:52:02

too expensive for them to continue.

1:52:04

In 2016 South Korea, this is from

1:52:07

by the way an excellent site which

1:52:09

is an absolute nonprofit I'm sure. Rest of

1:52:11

World is a global tech

1:52:13

site at restofworld.org. They

1:52:16

say South Korea instituted sender pay

1:52:18

network rules in 2016. It's

1:52:20

raised the cost for video

1:52:23

streaming platforms. Twitch says

1:52:25

the rising costs made operations

1:52:27

unsustainable. So

1:52:29

blame your government Korean twitchers. There

1:52:32

is no good alternative probably for the

1:52:34

same reason. Twitch

1:52:37

gets 300,000 daily viewers from South

1:52:39

Korea. Top Twitch

1:52:41

streamers who are in South Korea

1:52:43

have millions of followers. Were

1:52:46

you aware Benito of a Korean

1:52:49

Twitch community? Well yeah I mean

1:52:51

Koreans are notoriously the best eSports athletes. They

1:52:54

love it right? Like all the Starcraft streams

1:52:56

for them, all of that. Like, they

1:52:59

did a lot. They did

1:53:01

a lot for the community.

1:53:03

Elise Jang, a translator who streams

1:53:05

her cello performances, told Rest of World,

1:53:07

local Korean platforms have helped streamers on

1:53:10

board on the new platforms but Twitch

1:53:13

largely stayed silent. And

1:53:16

they had all sorts of

1:53:19

funerals for Twitch.

1:53:21

Korean streamers had

1:53:25

virtual services in memory of

1:53:27

the platform on Animal Crossing, on

1:53:29

VRChat, on Minecraft. Others

1:53:32

jokingly paid their respects in person

1:53:35

donning black traditional outfits and bowing to

1:53:37

framed printouts of the Twitch logo. Here

1:53:39

you can see a little Twitch

1:53:42

ceremony. Looks

1:53:45

like My Little Pony actually. That's

1:53:50

an example. This is why. I think a lot

1:53:52

of people wondered, why are we making such a big deal

1:53:54

about net neutrality? This is why. Sender

1:54:00

Pays is not a good

1:54:02

system and it's costing the

1:54:05

Korean Twitch community. Well

1:54:08

and it also creates a system where in theory

1:54:10

the people who can

1:54:12

afford to pay are like the Netflix's

1:54:15

of the world and so like only, I mean

1:54:17

I'm surprised that Twitch isn't among that group but

1:54:20

you know when you increase cost like

1:54:22

that often it's the giant legacy players

1:54:25

who can pay the bills and it's

1:54:27

the startups and the newcomers

1:54:29

who can't. Yeah, Meta... I

1:54:32

don't think Twitch has been profitable. Yeah,

1:54:34

Twitch is struggling in general anyway. It's

1:54:36

never been profitable. That's true. Right.

1:54:40

Now they're owned by Amazon which does have some profit but

1:54:43

Twitch itself has never been profitable. And

1:54:45

then Meta pulled their servers from

1:54:47

South Korea, they operate out of neighboring

1:54:50

countries. There's

1:54:53

an interesting unintended

1:54:55

consequence from this change

1:54:58

in the rules. Anyway,

1:55:03

RIP Twitch in South Korea.

1:55:06

It's kind of a shocker. It's not what

1:55:08

you'd expect and

1:55:10

there is really no... My understanding from reading the

1:55:12

article by the way was that you can, if

1:55:14

you're in South Korea you can still type in

1:55:16

Twitch and you'll be able to watch Twitch. It's

1:55:19

just that they're not basically... They're basically kicking off

1:55:21

all the South Korean streamers. So

1:55:23

maybe this is Amazon being a little petulant. You

1:55:27

know, maybe that's what it's really... Yeah,

1:55:29

certainly. There's some... I mean maybe there's

1:55:32

some sort of ongoing you know hope that they

1:55:34

can apply pressure on the South Korean government to

1:55:36

change things. I don't know. Let's

1:55:38

take a little break. You're listening to

1:55:40

This Week in Tech with Anthony Ha. Sam

1:55:43

Abuelsamid. Great to have you both. Our

1:55:46

show today brought to you by Wix Studio. All

1:55:48

right, little debate. We've had some debates here on

1:55:50

the show today. We have a

1:55:52

little debate here about Wix Studio. Who gets

1:55:54

more out of Wix Studio? Is

1:55:56

it the designers or

1:55:59

the developers? First of all, I

1:56:01

probably should explain if you don't know

1:56:03

about Wix Studio. Wix W-I-X Studio is

1:56:05

the web platform offering

1:56:07

the flexibility agencies and enterprises

1:56:10

need to deliver

1:56:12

bespoke websites hyper-efficiently. But let's

1:56:15

get back to the debate. For designers, you

1:56:18

can create fully responsive websites starting with

1:56:20

a blank canvas or you

1:56:22

can choose a template for any layout. You

1:56:24

could tweak per pixel with your CSS and

1:56:27

if no code is your thing and

1:56:29

you just like to move

1:56:31

fast and get that client, their

1:56:33

project, there's also a ton of

1:56:36

smart features like native no code

1:56:38

animations and responsive AI that adjusts

1:56:40

every breakpoint. For devs, Wix Studio

1:56:42

offers a powerful suite of home-grown

1:56:45

web APIs and REST APIs. You

1:56:48

can quickly integrate, extend and write custom

1:56:50

scripts. Oh, and I love

1:56:52

this. It's in a VS code

1:56:54

based IDE. And

1:56:57

yes, you get an AI code assistant right there

1:56:59

on the side to help you out. Plus, it's

1:57:01

all wrapped in a rock solid auto-maintained

1:57:04

infrastructure. AI

1:57:06

that writes your code or

1:57:08

AI that fixes your breakpoints. Fully

1:57:11

responsive editor or a zero

1:57:13

setup dev environment. No code

1:57:15

animations or no code

1:57:17

animations. Designers or developers,

1:57:20

doesn't matter. Search Wix Studio.

1:57:22

Find out for yourself. You're going to

1:57:24

love it. Go to www.wix.com/studio or click

1:57:27

on the link on the show page

1:57:29

to find out more. Thank

1:57:31

you, Wix Studio for your support of

1:57:33

this week in tech. I

1:57:36

went to TikTok this morning just to

1:57:38

hear how it's sounding. This

1:57:41

is an interesting conundrum right now. TikTok

1:57:43

is facing a little bit of pressure

1:57:45

from the Universal Music Group, one of

1:57:48

the big five publishers. They

1:57:51

have refused a license to TikTok.

1:57:54

So TikTok is now removing all the

1:57:56

UMG songs. And by the way, it's

1:57:58

not just... artists

1:58:00

recording on a universal label. It's

1:58:03

every artist who

1:58:05

is published by UMG, which includes even

1:58:07

artists on songs where they're one artist

1:58:09

in five, all of that's getting pulled

1:58:11

down and that is a lot

1:58:14

of music. Adele, Justin Bieber,

1:58:17

Mariah Carey, Ice Spice,

1:58:19

Elton John, anything Bernie Taupin

1:58:21

wrote, Metallica, Harry

1:58:23

Styles, Taylor Swift, SZA, The

1:58:26

Weeknd, all disappearing

1:58:29

and remember that TikTok's Genesis was they

1:58:31

they bought a company called Musical.ly, which

1:58:33

was all about lip syncing. So it's

1:58:35

very much a musical heritage for TikTok

1:58:37

and the use of real music is

1:58:39

one of the things that made TikTok

1:58:41

what it is. I know my son's

1:58:43

TikTok's channel always had real music on

1:58:45

there which really kind of enhanced it.

1:58:49

I went to TikTok this morning and there's not

1:58:51

a lot of real music, there's original music, I

1:58:53

guess other labels as well. Sources

1:58:56

close to UMG claim it has a share

1:58:58

in a majority of songs on TikTok. TikTok

1:59:01

says that number is between 20 and 30

1:59:03

percent. TikTok also

1:59:05

says they've seen no drop in users

1:59:08

since the music began to be removed

1:59:11

but I think this is an interesting battle

1:59:14

between TikTok and

1:59:17

the music industry. I

1:59:20

would think if you were Taylor Swift you'd

1:59:22

want your music on TikTok. We

1:59:24

know TikTok's one of the main ways new

1:59:27

music gets to listeners. It's

1:59:32

like saying... Well it seems telling that a

1:59:34

lot of the at least the commentary that

1:59:36

I've seen from musicians who are not Taylor

1:59:38

Swift level, they're actually

1:59:41

mad at Universal not mad at

1:59:43

TikTok. Yeah TikTok is

1:59:45

probably fine but you

1:59:47

know, the up-and-coming artist, he or she

1:59:49

is probably like well this was one of

1:59:52

the main avenues I could get my song

1:59:54

heard by people and now that's gone. These

1:59:56

labels take most of the money anyway.

2:00:00

They get a lot lower payout by far from

2:00:02

TikTok than from any other streaming service.

2:00:05

UMG Chairman Lucian Grainge

2:00:07

wrote, in our content renewal discussions

2:00:09

with TikTok we've been pressing

2:00:11

them on three critical issues: appropriate

2:00:13

compensation for artists, end quote,

2:00:16

which of course has

2:00:18

to go through the label first,

2:00:20

protecting human artists from the

2:00:22

harmful effects of AI, and

2:00:25

online safety for TikTok users.

2:00:27

TikTok says no, this is

2:00:29

just you wanting more money.

2:00:33

Universal says ultimately, TikTok is trying

2:00:35

to build a music-based business without

2:00:37

paying fair value for the music. But

2:00:39

there are a lot of artists who say

2:00:41

this is how we get our songs

2:00:43

out to the public.

2:00:46

Without TikTok, it's going to be as if

2:00:48

you turned off radio, you know, back

2:00:50

in my day.

2:00:53

No one's going to know about our songs, our music. But

2:00:56

does TikTok not pay

2:00:59

the same sorts of fees

2:01:01

that Apple Music or

2:01:03

Spotify or YouTube Music pay?

2:01:05

That's a good question. I

2:01:07

don't know what the contract is and I

2:01:10

don't see any numbers in

2:01:13

this article. I'm

2:01:15

reading the Variety article, and

2:01:19

it doesn't have any numbers. So,

2:01:23

I mean, technically it's

2:01:25

TikTok, right, who

2:01:28

did not renew its licensing agreement,

2:01:30

which expired January thirty-first.

2:01:33

They didn't renew because they couldn't come to an

2:01:35

agreement on how much it would cost.

2:01:38

And it's

2:01:40

hard to say whose

2:01:43

fault it is, obviously. There is

2:01:45

probably plenty of blame to

2:01:47

throw at UMG, just because

2:01:49

whatever money they were getting, they

2:01:52

were probably keeping the

2:01:54

vast majority of that for themselves and not

2:01:56

giving it to the artists anyway. But

2:01:59

was TikTok underpaying

2:02:01

relative to what other streaming services pay?

2:02:03

I don't know.

2:02:06

And it also speaks to the fact that

2:02:08

you know, monetizing

2:02:10

online music is still very challenging. And

2:02:12

it, you know, ultimately boils down to,

2:02:15

I think, my sense is that unless

2:02:17

you're super, super successful, a lot of times

2:02:19

it's really just you getting the exposure and

2:02:21

maybe a little bit of money. But

2:02:24

it's really the exposure that you monetize

2:02:26

in other ways. And

2:02:28

I think, you know, fundamentally

2:02:30

that's a pretty broken system in that

2:02:32

if your songs get listened to a

2:02:34

bunch, you should get a good

2:02:36

amount of money for it. I

2:02:38

get the sense that Universal is not

2:02:40

necessarily the best advocate for this

2:02:42

position, or the most impartial advocate. The

2:02:44

other way to

2:02:46

look at this is it's very good for

2:02:48

artists who aren't on a label,

2:02:50

or especially not on

2:02:52

UMG, to get their music out. Wasn't

2:02:55

it Lil Nas X who got his start

2:02:58

on TikTok? Well, sort of. He

2:03:00

bought a sample for thirty-five dollars

2:03:02

and bought some cheap studio time, and

2:03:04

recorded Old Town Road, played it on

2:03:06

TikTok, got picked up. Lots of

2:03:08

people did their own versions of

2:03:11

it, their own takes on it,

2:03:14

you know, what they call duets

2:03:16

with it, and it became a

2:03:18

hit.

2:03:23

There's another side to that, though. This is

2:03:25

just anecdotal evidence, but a lot of

2:03:27

artists that blow up on TikTok,

2:03:29

it's like, they get popular for that one

2:03:31

song, and just like thirty seconds

2:03:33

of that one song. And the people who go

2:03:36

to the shows after that only know that one song.

2:03:38

But that's

2:03:40

an eternal problem; that's called the one-

2:03:42

hit-wonder problem. And there have always been

2:03:45

artists who only had one good song,

2:03:47

and they still exist. You always save that

2:03:49

song for last at the end of

2:03:51

the show. How many times did Bobby Boris

2:03:53

Pickett do the Monster Mash before

2:03:55

he said, okay, enough?

2:03:58

There have been lots of one-hit wonders. I

2:04:00

love one-hit wonders. But yeah,

2:04:02

that can

2:04:04

be a problem too. I

2:04:07

think it's can be very interesting as users

2:04:09

start creating. I think this will happen. The

2:04:12

swing swing tic tac is so interesting. For.

2:04:15

Is that usual? Will solve this. The.

2:04:17

With their own stuff. Somehow with

2:04:19

music with sound, Native sound? Whatever.

2:04:24

And then others will duet with it and

2:04:26

and reuse it, Reaper person. Is.

2:04:29

Gonna her. I think it's going to hurt you M, G

2:04:31

and the artists. Who. Work for

2:04:33

you mg more than anybody else. That's.

2:04:36

What I say I agree. And the as

2:04:38

I said I we like I was taught

2:04:40

is twenty or thirty artists he adds am

2:04:42

a case not going to hurt Taylor Swift's

2:04:45

not going to her Drakes Livia Rodrigo. It's

2:04:47

not gonna hurt them that big artist because

2:04:49

they are idiots. You know they were already

2:04:51

exposed but it's that is sick person just

2:04:54

starting out once and exposure. Ah,

2:04:59

President Biden has signed an

2:05:01

executive order, this should change

2:05:03

everything, to stop Russia and

2:05:06

China from buying Americans'

2:05:08

personal data. Now if he

2:05:10

would just sign an executive

2:05:13

order that says US intelligence agencies

2:05:15

have to stop buying that data, maybe

2:05:17

this would actually do something.

2:05:20

Countries of concern, which include

2:05:22

Russia and China, are now

2:05:24

banned from buying geolocation,

2:05:27

genomic... you

2:05:29

mean China could buy my genome?

2:05:32

...financial, biometric, health, and other

2:05:34

personally identifying information. The

2:05:37

real problem is though every time Congress or

2:05:40

I imagine the President tries to do something

2:05:42

about this globally, like

2:05:44

have a bill that says data brokers

2:05:46

can't sell your history,

2:05:48

law enforcement in this country stands up and says,

2:05:51

yeah, but we use them, we need them,

2:05:54

that's how we solve

2:05:56

crimes. Would it be

2:05:58

a surprise if 23andMe

2:06:01

has been selling something like that,

2:06:03

genome data, to China?

2:06:05

Given

2:06:08

their financial challenges, I wouldn't

2:06:10

be surprised if they sold

2:06:12

to anybody who wants to pay

2:06:14

them, and I'd be surprised if

2:06:17

they aren't already. I think they are. Well,

2:06:19

yes, Apple has given in to the

2:06:23

people, which is great, and they

2:06:25

say we are gonna continue to allow

2:06:27

progressive web apps in the

2:06:29

EU. So, a little explanation

2:06:32

if you don't know what

2:06:34

a PWA is: these are apps you

2:06:36

can write in JavaScript and

2:06:38

HTML, and they look like

2:06:40

a web page. And

2:06:43

unfortunately very few people use them; it's never taken

2:06:45

off, although I had such high hopes

2:06:47

for it. Partly it never took off because of

2:06:49

Apple's weak support, partly because Firefox dropped

2:06:51

support. But if you're

2:06:53

going to a website, you may

2:06:55

have a menu item that says download this

2:06:57

site to your phone, and

2:06:59

then you can use it like an

2:07:01

app. It even has the ability to operate

2:07:04

offline and store data between

2:07:06

visits and so forth. It's a really

2:07:08

nice technology that means that any web

2:07:10

page properly configured could be an app.
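For context on what "properly configured" means here: a PWA needs a web app manifest, a service worker, and HTTPS before a browser will offer to install it. Below is a simplified sketch of that installability check; the criteria and names are illustrative, not any browser's actual implementation.

```javascript
// Illustrative sketch of the checks a browser runs before offering
// "Add to Home Screen" for a PWA. Criteria vary by browser; this is a
// simplified model, not Chrome's or Safari's real logic.
function isInstallable(manifest, hasServiceWorker, servedOverHttps) {
  const hasIdentity = Boolean(manifest.name || manifest.short_name);
  const hasEntryPoint = Boolean(manifest.start_url);
  const appLike = ["standalone", "fullscreen", "minimal-ui"].includes(manifest.display);
  const hasIcon = Array.isArray(manifest.icons) && manifest.icons.length > 0;
  return hasIdentity && hasEntryPoint && appLike && hasIcon &&
    hasServiceWorker && servedOverHttps;
}

// A minimal manifest of the kind a site serves as manifest.json:
const manifest = {
  name: "Example App",
  short_name: "Example",
  start_url: "/",
  display: "standalone",
  icons: [{ src: "/icon-192.png", sizes: "192x192", type: "image/png" }],
};

console.log(isInstallable(manifest, true, true));  // true
console.log(isInstallable(manifest, false, true)); // false: no service worker
```

The service-worker requirement is what provides the offline behavior and between-visit storage described above.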

2:07:13

Apple never liked this too much

2:07:16

because, man, they make some money on

2:07:18

the App Store, I guess, and they

2:07:20

would prefer that you make a

2:07:22

real app that they sell and Apple

2:07:24

gets thirty percent. They took advantage of

2:07:27

the EU's demands

2:07:29

that they change the way the store works to

2:07:31

say, oh, and by the way, we're going to

2:07:33

kill progressive web apps as well, even

2:07:36

though it's kind of not related. In fact, it's quite

2:07:38

the opposite: a way for anybody to

2:07:40

have an app on Apple without

2:07:42

Apple making any money on it.

2:07:45

It is already an alternative app store

2:07:47

in one respect. Apple

2:07:49

said, well, we're gonna take it off

2:07:51

because of security concerns, especially

2:07:54

if they make us allow other browser engines;

2:07:56

we'd lose control of the

2:07:58

platform. There was enough, I guess,

2:08:02

enough response to this that they said, all right,

2:08:05

we're going to leave that in.

2:08:07

Apple's page reads: Previously, Apple announced plans

2:08:10

to remove home, they call them home

2:08:12

screen web apps capability in the EU

2:08:15

as part of our efforts to comply with the

2:08:17

digital markets act. The need

2:08:19

to remove the capability was informed

2:08:21

by the complex security

2:08:23

and privacy concerns associated with

2:08:25

web apps to support alternative

2:08:28

browser engines that will

2:08:30

require building a new integration architecture that does not

2:08:32

currently exist in iOS. Okay, I get that.

2:08:35

You know, we're going to allow Firefox. I guess

2:08:37

I get that. Not really. We've

2:08:39

received requests to continue to offer support

2:08:41

for home screen web apps and iOS.

2:08:44

Therefore, Hey, okay. Well, since you care,

2:08:46

we're going to continue to offer the

2:08:48

existing capability. So forget

2:08:50

that thing we said about security and

2:08:52

privacy. Uh, nevermind. The

2:08:54

support means home screen web apps continue to

2:08:57

be built directly on WebKit and its security

2:08:59

architecture and align with the security and privacy

2:09:01

model for native apps and iOS, just like

2:09:03

they always did. This

2:09:09

to me underscores the absolute hypocrisy

2:09:11

of what Apple is up to.

2:09:14

Uh, they wanted to kill it because they wanted to eliminate

2:09:17

that like, you know, little

2:09:19

exit route for people to put apps on your

2:09:22

phone without going through the app store. And

2:09:25

then they decided not to kill it because

2:09:27

why, I don't know, maybe somebody, I don't

2:09:29

know, maybe you complained. Unfortunately,

2:09:32

PWAs never took off. And, uh, even

2:09:34

though this would be a great thing,

2:09:36

um, this isn't

2:09:39

going to change much. I

2:09:42

use a bunch of PWAs on my,

2:09:44

on my computers, on my windows computers

2:09:46

and on my Pixel 8 Pro. Oh, tell

2:09:48

me what you do. What, what, what,

2:09:50

what, what sites? Well, let's see. I

2:09:52

have, I have one here for a

2:09:54

little app called Apple TV+. Um,

2:09:57

they have a PWA.

2:10:01

There's

2:10:04

no Android version, so you can

2:10:06

use the website as an

2:10:08

app, and in fact it looks just like an

2:10:10

app, right? Now,

2:10:12

I use the

2:10:12

PWA for Slack on my phone.

2:10:14

I don't have the Slack

2:10:16

app installed. Back

2:10:18

when I was still on Twitter

2:10:20

dot com, I used the PWA version

2:10:23

of Twitter instead of the Twitter app

2:10:25

too. Yeah, me too. I use

2:10:27

several different ones, and I

2:10:31

use a bunch of them on my

2:10:33

computers as well. Was

2:10:35

it you who complained to the EU? You

2:10:39

know, I mean, honestly, I would not

2:10:42

be surprised. When Google and

2:10:44

Microsoft were the first to really promote

2:10:46

PWAs, Apple was always kind of

2:10:48

dragging its heels.

2:10:51

But I had such high hopes for

2:10:53

this, because it would be fairly easy;

2:10:55

we would do a PWA of the

2:10:57

website for TWiT. We have a

2:10:59

website that has a very robust API.

2:11:01

It would be not so hard to take

2:11:03

the website and make it a PWA

2:11:06

so you could have it on your phone.

2:11:08

But we never did

2:11:10

it, partly because one

2:11:12

of the big browsers, Firefox, decided not to

2:11:14

support it anymore. I

2:11:17

think we probably should have. Mean.

2:11:21

Does anybody even use Firefox anymore? So

2:11:23

you know maybe was a matter of

2:11:25

you in your ear, l in his

2:11:27

and everything else is on chromium else

2:11:29

and illicit looking at my at my

2:11:31

task bar here in our guts. Peter.

2:11:34

We always for for google calendar for

2:11:36

slack. For. Com

2:11:38

seat for our mastodon.

2:11:41

Wow, I'm. Or. Threads:

2:11:43

Youtube Music. Feebly.

2:11:47

So these are

2:11:49

all apps supporting

2:11:51

PWAs, or have you just

2:11:53

made it, like, your home screen? Because

2:11:56

can't you, with any page, say put

2:11:58

it on the home screen? Yeah,

2:12:00

but to be a true PWA it

2:12:02

has to have service workers and so

2:12:05

forth. The Slack PWA does;

2:12:07

yes,

2:12:09

you can use Slack without

2:12:11

having to install the app.

2:12:13

That's the reason I use it; it's the same as using

2:12:16

the app. And in some

2:12:18

cases, yeah, on at least one computer

2:12:20

that I have to use on a daily

2:12:22

basis, my

2:12:24

work computer, I can't install apps, so

2:12:27

I use some PWAs there as

2:12:29

an alternative. To me this

2:12:31

is one of those really exciting technologies that never

2:12:33

took off, and it makes me sad, because

2:12:35

it really could have been a

2:12:38

really great thing. It's not quite the same

2:12:40

as just saving a web page as

2:12:42

a button on your home

2:12:44

screen; it's a

2:12:46

little bit more than that. I wish

2:12:49

Apple had supported it better. At

2:12:51

least they're not going to kill it

2:12:53

completely. Talking about the FBI

2:12:55

and law enforcement in the

2:12:57

US, it turns out the number one

2:13:00

tactic, one they really like now,

2:13:02

is push notifications. This

2:13:05

is a Washington Post article by Drew Harwell

2:13:07

and Aaron Schaffer. So

2:13:09

It turns out when you get a

2:13:12

push notification,

2:13:16

it goes out over the public

2:13:18

internet, and

2:13:20

law enforcement can get it.

2:13:23

It actually contains a lot of

2:13:25

information about the

2:13:28

phone that the notifications are getting

2:13:30

pushed to.

2:13:33

The breakthrough, the Post writes, relied on a

2:13:35

little-known quirk of push alerts, a

2:13:38

basic staple of modern phones. You know,

2:13:40

that's when you get a notification, an

2:13:42

email or Slack notification or

2:13:44

a message. These tokens

2:13:46

can be used to identify users. And

2:13:49

are stored on the servers run by Apple

2:13:51

and Google, which, as it turns out,

2:13:54

are not encrypted, and they can hand them over to law

2:13:56

enforcement. And

2:13:59

apparently for months law enforcement has been asking, and neither

2:14:01

of these companies has really been saying, well, where's

2:14:03

your subpoena? They just go, yes sir, here you go.

2:14:08

Now of course, this

2:14:10

became public when

2:14:12

it was used to arrest

2:14:15

a child

2:14:18

exploitation perpetrator. Allegedly.

2:14:20

I'll

2:14:23

give you the story.

2:14:25

A federal law enforcement officer got Tele-

2:14:27

Guard, which is one of these companies,

2:14:29

to hand over the small string of code

2:14:31

the company used to send push alerts

2:14:34

to the suspect's phone.

2:14:37

Oh, actually, let me go

2:14:39

back a little farther. The alleged

2:14:42

pedophile had hung out

2:14:44

in chat rooms where he would

2:14:46

brag about his exploits, according to the criminal

2:14:49

affidavit. He covered his tracks by using

2:14:51

TeleGuard, which is an

2:14:53

encrypted Swiss messaging app.

2:14:56

And he thought, well, it's

2:14:58

encrypted, I'm safe. But what

2:15:00

he didn't know is that TeleGuard

2:15:02

also used push notifications and was

2:15:05

willing to hand over the information to

2:15:07

the FBI.

2:15:10

The FBI agent then got Google

2:15:12

to hand over the list of email addresses

2:15:14

linked to the code, the push token, and

2:15:16

traced one to a guy in Toledo, who

2:15:19

has been arrested, charged with sexual exploitation

2:15:21

of minors and distribution of child pornography,

2:15:23

within a week of the Google

2:15:25

request. Note the word request: no subpoena,

2:15:27

no warrant. Now, this gets publicized because

2:15:29

the FBI wants you to

2:15:32

think, you know, where we use these

2:15:34

is on the worst, most heinous, awful

2:15:36

offenders, and nobody's going to want this

2:15:38

guy to get away with it, so

2:15:40

nobody's going to question it. But

2:15:43

it's probably important that you understand that

2:15:45

these push alerts really can be used

2:15:48

to out you. Cooper

2:15:52

Quintin, a technologist at the Electronic

2:15:54

Frontier Foundation, said this is how

2:15:57

any new surveillance method starts out.

2:15:59

The government says, we're only going

2:16:01

to use this in the most extreme cases

2:16:03

to stop terrorists and child predators, and

2:16:06

everyone can get behind that. But

2:16:08

Cooper says these things always end up

2:16:11

rolling downhill. Maybe

2:16:13

a state attorney general one day decides, hey, maybe you

2:16:15

can use it to catch people having an abortion. Even

2:16:18

if you trust the US right now to use this,

2:16:21

you may not trust a new administration to use it

2:16:23

the way you deem ethical or a state attorney general.

2:16:28

So the Post found more than 130 search

2:16:30

warrants and court orders in which investigators

2:16:32

had demanded that Apple, Google, Facebook, and

2:16:35

other tech companies hand over data related

2:16:37

to the suspect's push alerts. Fourteen

2:16:42

states as well as the District of Columbia. I

2:16:49

guess it sounds like they do. Federal

2:16:54

law enforcement fully comply with the Constitution

2:16:56

and applicable statutes to obtain this data, says

2:16:58

the Justice Department. So they do in

2:17:00

fact get court orders to

2:17:03

do this. So that's actually reassuring. Well,

2:17:08

reassuring right now, but it depends on which

2:17:11

court and which state. Some

2:17:13

court orders might be easier

2:17:15

to get than others, depending on

2:17:18

which court you're going to and depending on what

2:17:20

it is you're looking for. Like for

2:17:22

example, if you're looking for pregnancy

2:17:28

care in

2:17:30

Texas or Louisiana or any

2:17:33

number of other southern states,

2:17:37

the courts

2:17:39

might be more inclined than they should to

2:17:41

issue those court

2:17:44

orders. We first started

2:17:46

talking about this late last year.

2:17:48

Senator Ron Wyden sent a

2:17:50

letter to Attorney General Merrick Garland saying

2:17:54

an investigation had revealed the Justice Department

2:17:56

had prohibited Apple and Google from discussing

2:17:58

the technique. Don't

2:18:01

tell anybody. Apple

2:18:03

confirmed this in a statement in

2:18:06

December to the Washington Post. Google

2:18:10

said it shared Ron Wyden's commitment to keeping users

2:18:12

informed about these requests, so it started to come

2:18:14

out. Here's

2:18:17

how this works. Unlike normal app

2:18:19

notifications, push alerts, the

2:18:23

things that wake up your phone, or you turn them off

2:18:25

because you don't want to see them at night, but that

2:18:27

they all come in in the morning. Many

2:18:31

apps offer push alert functionality because it

2:18:33

gives users a fast, battery-saving way to

2:18:35

stay updated. Push alerts. If

2:18:37

you have CNN news updates, that's a push

2:18:40

alert. To send

2:18:42

a notification, both Apple and Google

2:18:44

require the apps to first create

2:18:46

a token unique to your phone

2:18:49

that tells the company how to find the user's

2:18:51

device. These

2:18:53

tokens are then saved on Apple's and Google's

2:18:55

servers. You can't do anything about it.
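Conceptually, the token mechanism described here works like a lookup table on the push provider's side: the service mints an opaque token per app install and remembers which account or device it belongs to, which is why handing a token to the provider can identify a user. A toy model of that, purely illustrative and not Apple's or Google's real API:

```javascript
// Toy model of a push service's token registry. The token looks opaque
// to the app, but the service keeps a mapping back to the account, so
// "whose token is this?" is a single lookup.
const pushRegistry = new Map();

// When an app registers for push, the service mints a token and records
// which account it belongs to.
function registerDevice(account) {
  const token = `tok-${pushRegistry.size + 1}`; // opaque to the app
  pushRegistry.set(token, account);
  return token;
}

// A records request ("which account owns this token?") is just a lookup.
function lookupToken(token) {
  return pushRegistry.get(token) ?? null;
}

const token = registerDevice("user@example.com");
console.log(lookupToken(token));         // "user@example.com"
console.log(lookupToken("tok-unknown")); // null
```

This is the sense in which the companies act as a "digital post office": delivery requires a routing table, and the routing table doubles as an identity record.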

2:18:59

In effect, Wyden said, that design makes

2:19:01

Apple and Google a digital post office

2:19:03

able to scan and collect certain messages

2:19:05

and metadata, even of people

2:19:07

who want to remain discreet. That

2:19:15

token would be used to

2:19:17

identify what cell tower

2:19:19

that particular device was attached to.

2:19:23

I think yes. I think

2:19:25

furthermore, specifically connecting

2:19:28

that phone to that notification.

2:19:32

The question is what kind of information Apple has.

2:19:35

Well, not only would it need the tower, it would need the

2:19:37

unique IP address of that

2:19:39

phone, wouldn't it? It needs to somehow

2:19:42

know how to get a message to that phone. How

2:19:44

would it know that? Whatever

2:19:47

it is, it's uniquely identifying that

2:19:50

phone. If it's on Wi-Fi, then

2:19:52

that, if it's on a

2:19:54

cell tower, then you're looking at a pretty

2:19:57

broad area. But if you're on Wi-Fi, getting

2:19:59

that... push notification, you can really

2:20:01

narrow down the scope of where

2:20:04

that device is located. Alright.

2:20:08

Anyway, something to be aware of. It's

2:20:12

an issue. I

2:20:14

would hope that Google and Apple would be absolutely

2:20:20

sticklers about requiring a subpoena

2:20:23

or a warrant. Yeah,

2:20:30

it's an interesting story. There's nothing more to

2:20:32

say about it except that this

2:20:35

is going on. So when we talk

2:20:37

about your data being sold to the

2:20:39

Russians and the Chinese, your data is

2:20:41

also available in a variety of other

2:20:43

ways. Let's

2:20:48

talk about the transparent laptop. I guess we kind

2:20:50

of did. This is one of the many things

2:20:52

announced at Mobile World Congress. Look

2:20:55

at that. You could see his hand right through the

2:20:57

lid of the laptop. Why? I

2:21:00

don't know. Why not? I mean,

2:21:02

this is exactly what you were saying in terms

2:21:04

of like, I had this whole

2:21:06

emotional cycle reading the article and watching the

2:21:08

video of at first being like, this is

2:21:10

so cool. But then as you read more,

2:21:12

you're like, yeah, what is this for?

2:21:15

And I think they're like trying to come up with

2:21:17

use cases, like the

2:21:20

idea of, for example, if you're trying to trace

2:21:22

something on your screen, maybe that it's helpful to

2:21:24

see what's behind it. And then

2:21:26

as I thought about it more, you realize that of

2:21:28

course, then there's all these cases where you definitely don't

2:21:30

want your, you know, screen to be transparent. You know,

2:21:32

I work, do a lot of my work in a

2:21:35

public library and I don't actually want people to be

2:21:37

able to read everything that's going on my computer. People

2:21:39

in an office, if you're watching a

2:21:42

TWiT when you should be working, that's

2:21:44

not something you want somebody walking by. Well, you

2:21:46

should be using an Apple Vision Pro. Exactly.

2:21:50

Then you're saying. I'm working. How about

2:21:52

this? The, the Motorola

2:21:54

phone, you can like slap

2:21:56

on your wrist and it'll go

2:21:58

all the way around

2:22:00

your wrist. Okay,

2:22:05

this is from CNET's article by Andrew Lanxon,

2:22:07

who is on our shows frequently, talking

2:22:10

about this, the wearable phone, again,

2:22:12

like the Lenovo concept, they're

2:22:15

not necessarily going to sell this. Samsung

2:22:17

says they're going to sell a new

2:22:19

Galaxy Ring, they showed that off, but

2:22:21

didn't give us any information about price

2:22:24

or availability. So, coming

2:22:26

someday to

2:22:28

a Samsung user, a lot

2:22:31

of people, including Andrew,

2:22:33

saw the Humane AI pin in Barcelona

2:22:35

and said, actually, it's pretty cool, it

2:22:37

works better than I thought it would.

2:22:39

This is a pin that's been delayed,

2:22:41

that has an AI and it records everything going

2:22:44

on, doesn't have a screen, you could

2:22:47

talk to it. He

2:22:49

said that it does a pretty

2:22:51

good job of showing images on

2:22:53

your hand, which is actually new information,

2:22:56

it beams light onto

2:22:58

your hand as a screen, it could translate

2:23:00

languages, it could, anyway,

2:23:02

they were impressed, also

2:23:05

delayed. Will it be

2:23:08

allowed in movie theaters? Ha, interesting.

2:23:12

Good way to record a movie, huh? I'm

2:23:14

just going to be like staring

2:23:16

daggers at whoever's pin goes off.

2:23:18

Your pin went off. Yeah,

2:23:21

the phones are bad enough. Here's the

2:23:23

Xiaomi SU7 EV, also

2:23:26

at Mobile World Congress.

2:23:28

Now, you may say, wait a minute, Xiaomi doesn't

2:23:30

make cars, they

2:23:33

make phones. Do they make cars, Sam?

2:23:35

Apparently, they do now. They have

2:23:37

made one car.

2:23:40

There it is. They plan to

2:23:42

offer this. Huawei has also announced an

2:23:44

EV that they plan to sell. In

2:23:48

China, there's a bunch of suppliers

2:23:50

that you can get various components from

2:23:53

and put stuff together, put it all

2:23:55

together and build a car. This

2:23:58

is not the sort of thing that Apple would want

2:24:00

to do, but you can do

2:24:02

it and do it fairly cost-effectively.

2:24:05

Yeah. And this is actually probably,

2:24:07

to what I was saying earlier, one

2:24:09

of the reasons why Apple

2:24:11

decided to finally pull the

2:24:14

plug on the EV

2:24:16

project. Because you've

2:24:18

got in China, especially you've got so

2:24:20

many competitors that are able to offer

2:24:23

really impressive products at

2:24:26

prices that are way below what

2:24:28

Apple would ever even consider selling the

2:24:30

car for. Yeah. Do

2:24:33

you think some of it is

2:24:35

Huawei or, or Xiaomi saying, well,

2:24:37

we can do a car, Apple,

2:24:39

like rubbing their noses in it.

2:24:42

One thing to keep in mind:

2:24:44

there's hundreds of

2:24:47

Chinese brands, automotive

2:24:49

brands,

2:24:51

certainly dozens of EV-only

2:24:53

brands. Almost

2:24:56

none of them are actually turning a profit. Oh,

2:24:58

really? Is it the government subsidies

2:25:00

that keep them afloat? For

2:25:03

now. Yeah. You

2:25:05

saw that Josh Hawley wants

2:25:08

to charge a whopping

2:25:10

tariff

2:25:14

of 125% on imported Chinese

2:25:16

autos. 125%, more than double the price,

2:25:24

to keep them out of the U.S. Does

2:25:29

it make a difference? Is the idea

2:25:31

that BYD might start

2:25:33

bringing its very popular cars into the U

2:25:35

S a real threat to

2:25:37

American auto manufacturers? Um,

2:25:41

if they actually did it, yes, it

2:25:43

would be a serious threat, because

2:25:45

they're able

2:25:48

to build the vehicles at a

2:25:50

much lower price point than what we've

2:25:52

seen from any of the legacy

2:25:55

Western brands.

2:25:57

So a car like the

2:25:59

BYD Seal, which is a really

2:26:02

excellent car,

2:26:04

could be sold for probably under

2:26:06

thirty thousand dollars in the US,

2:26:08

right? And there's

2:26:10

nothing in the US market

2:26:13

that would be competitive with that at

2:26:15

that price point. But

2:26:18

right now, for now

2:26:20

at least, companies like BYD and various

2:26:24

other Chinese brands are content

2:26:26

to focus on other markets. You can't get

2:26:28

them in the US now. You

2:26:30

can't get any. There are some Chinese-built

2:26:33

vehicles for sale in the US But

2:26:36

none under Chinese brands. So there's a

2:26:38

couple of Volvos, the Polestar 2, they're built

2:26:40

in China But they're

2:26:42

sold here. The Buick Envision,

2:26:44

built in China, sold here. But

2:26:49

right now the Chinese automakers

2:26:51

are more content to go after

2:26:54

some other markets like South America

2:26:56

in particular and Southeast Asia and

2:26:59

Really targeting those markets where there's very

2:27:01

little penetration of EVs yet and

2:27:04

hit those markets first before

2:27:07

they try and take a stab at

2:27:09

the US some

2:27:12

YouTuber, trying to remember who it

2:27:14

was, bought basically

2:27:16

a Chinese golf cart and had it shipped

2:27:18

to him in the US and

2:27:21

assembled it. But it's kind

2:27:23

of a cute little car be kind of cool to have it

2:27:26

Well, I know Jason Torchinsky, who used to

2:27:28

be at Jalopnik and now

2:27:30

has a site called The Autopian. He

2:27:34

bought What

2:27:36

was it called? It's like a really

2:27:38

cheap Chinese EV. Yeah, it cost him more

2:27:40

to ship it here three or four

2:27:42

years ago Yeah, I think he got it

2:27:44

through Alibaba actually. Yeah, I think you're right I

2:27:47

think you're right and it cost him more to ship it than

2:27:50

the car itself, which was just a couple of thousand

2:27:52

dollars That

2:27:55

looks kind of cool though I thought you know, hey

2:27:58

if you see... here it is. Is this it? Is this the car?

2:28:02

This is from the story. Yeah,

2:28:04

that's not the one I was thinking of. Yeah.

2:28:07

But no, I remember this article when he

2:28:09

did this. It's crazy. The

2:28:14

claim is, of course, that the Chinese

2:28:17

government subsidizes these manufacturers. So

2:28:20

they compete unfairly. Although,

2:28:22

the Chinese could also say the

2:28:24

US subsidizes US manufacturers to the

2:28:26

tune of $100

2:28:28

per car. That's

2:28:30

a subsidy, right? Yeah.

2:28:33

No, it absolutely is. Yeah.

2:28:36

It was the Changli

2:28:38

Freeman. Yeah, yeah. Somebody...

2:28:40

I just dropped that in the chat. Thank

2:28:42

you. Yeah. Let

2:28:45

me see if I can find this picture.

2:28:47

The world's cheapest Chinese EV.

2:28:50

And Jason said, it's

2:28:52

actually really good. Yeah.

2:28:55

It is. And

2:28:57

a radio that can play MP3s. Okay.

2:28:59

A 1.1 horsepower rear-wheel drive

2:29:01

electric motor. 28 miles of

2:29:03

range. I love the

2:29:05

wheels. The wheels are the size of a

2:29:08

small pizza. They're not huge. I

2:29:11

would take that on the road. I would absolutely take

2:29:13

that. Not the highway. Not the highway. No, no, no,

2:29:16

no. Small town road.

2:29:18

Yeah, but driving around town, I

2:29:21

love to have that. It

2:29:23

looks like the front looks like a little dragon. I don't

2:29:25

think that's by accident. No, no.

2:29:27

That's on purpose. Yeah. Apparently

2:29:33

Jason has his parked on the sidewalk out

2:29:35

front. So it's really easy to find his

2:29:38

house. Yeah. Jason's

2:29:40

got a thing for strange cars. Top

2:29:44

speed 23 miles an hour, but that's enough

2:29:46

for around town. Yeah. That's

2:29:48

enough. You wouldn't take it on the highway,

2:29:50

but it's like a golf cart. I

2:29:53

don't know. I think this is a, I want one.

2:29:55

It's cute. Perfect

2:29:58

for getting to the studio. Exactly. That's

2:30:00

all I need. You know, it's funny. I have

2:30:02

a big old fancy car to drive two miles

2:30:04

every day Probably

2:30:07

could just get a Changli instead next time. Yeah,

2:30:10

it's got 23 miles of range. You'd be good And

2:30:13

it'd probably be safer than riding the bike across the

2:30:16

bridge. Yeah. Yeah All

2:30:19

right, let's take a break and we'll wrap

2:30:22

things up with our wonderful panel. Sam Abuelsamid,

2:30:24

always great to have you on. Wheelbearings.media

2:30:27

for his podcast. He's

2:30:29

a principal researcher at Guidehouse

2:30:32

Insights, and he's

2:30:35

on our Twit social server, our Mastodon,

2:30:38

at SamAbuelSamid. Is that

2:30:40

really the whole thing, SamAbuelSamid?

2:30:42

That's your

2:30:44

handle? Okay. Yeah, SamAbuelSamid: S-A-M-A-

2:30:46

B-U-E-L-S-A-M-

2:30:48

I-D. That's not so hard. Yep, if

2:30:51

you want to find me anywhere, that's

2:30:53

the username I use. Nice. Someone

2:30:56

else took Sama, so... Yes.

2:30:59

Sama. Oh, I want Sama. Sama

2:31:03

is of course Sam Altman of Open

2:31:05

AI. And that is Anthony Ha, who

2:31:07

is anthony-ha.com, and @anthonyha on

2:31:09

the Twitter and the Threads and the Blue Sky

2:31:11

and His podcast is

2:31:14

Original Content. When we come back, we'll

2:31:17

say goodbye to one of our beloved

2:31:19

hosts, but we'll also get

2:31:21

some content recommendations from Anthony, since he

2:31:23

is in charge of all of that

2:31:26

our show today brought to you by Lookout.

2:31:28

Today, every company is a data company. You

2:31:31

know that means every company is at risk.

2:31:35

Cyber threats, breaches, leaks. These

2:31:38

are the new norm and cyber criminals

2:31:40

grow more sophisticated by the minute at

2:31:42

a time when boundaries no longer exist, what

2:31:45

it means for your data to be secure has fundamentally

2:31:47

changed. Enter Lookout. From the

2:31:50

first phishing text to the final

2:31:52

data grab, Lookout stops modern breaches

2:31:54

as swiftly as they Unfold

2:31:57

whether on a device, in the cloud, across networks,

2:32:00

or working remotely at the local coffee

2:32:02

shop, Lookout gives you clear

2:32:04

visibility into all your data at rest and

2:32:07

in motion. You'll monitor,

2:32:09

assess, and protect without sacrificing

2:32:11

productivity for security. With

2:32:13

a single unified cloud platform,

2:32:15

Lookout simplifies and strengthens, reimagining

2:32:17

security for the world that

2:32:19

will be today. Visit

2:32:23

lookout.com today to

2:32:25

learn how to safeguard data, secure

2:32:27

hybrid work, and reduce IT complexity.

2:32:29

Visit lookout.com. Thank

2:32:33

you so much for supporting This

2:32:35

Week in Tech.

2:32:38

We'll be back with a final word and

2:32:40

a farewell to one of our most beloved

2:32:43

hosts. But first, let's look back at

2:32:45

the week that was This

2:32:47

Week on Twit. Jason Snell has breaking

2:32:49

news. I

2:32:52

hope you are not planning

2:32:54

your financial future around buying

2:32:56

an Apple car. What? Obviously,

2:32:58

on Twit. MacBreak

2:33:00

Weekly. They have finally thrown in the

2:33:02

towel. A lot of alarm bells went off

2:33:04

when there were those reports about how they

2:33:06

were only going to launch it without a

2:33:08

steering wheel and with autonomous driving. It was

2:33:10

one of those moments of like, what are

2:33:12

they, you know, what are they smoking? Time

2:33:15

to geek out. It's the untitled

2:33:17

Linux show. This story has all

2:33:19

our favorite topics, all bundled into

2:33:21

one. The Rust-based

2:33:24

terminal called Warp. This

2:33:26

Week in Google. We should talk

2:33:28

about the Gemini. Yeah,

2:33:31

I'm going to say tempest in a teapot, woke

2:33:34

Gemini. Just as

2:33:36

social media has put in a vice, take down

2:33:38

all the bad stuff. No, that's my bad stuff.

2:33:40

You took down the same thing is

2:33:42

happening with AI. And the real problem,

2:33:44

I think, is this expectation

2:33:46

that guardrails can and should be

2:33:48

put in such that

2:33:50

the model maker can make sure that nothing

2:33:52

bad ever happens. This Week in Space. Episode

2:33:55

100 and we're going to celebrate with Dr.

2:33:57

Alan Stern and find out what it was like to fly

2:34:00

in space on Virgin Galactic. It

2:34:02

was the best work day ever.

2:34:05

You know where we're headed is to a Star

2:34:08

Trek future. It will take centuries to get there

2:34:10

but I really believe that when people look back

2:34:12

from that far away century they'll

2:34:14

look back to the 2020s and say that's

2:34:16

where Star Trek began. That's where the inflection

2:34:19

point when it all started to happen. Twit,

2:34:22

it's not your father's twit. It

2:34:25

was a great week, really fun week on Twit

2:34:27

and thanks to all of our hosts, who are

2:34:29

so wonderful. Thanks to our club

2:34:31

members who support us. And you know what,

2:34:33

congratulations to our club show Untitled Linux

2:34:35

show which is now out in public.

2:34:37

We've taken all those shows that have

2:34:39

been behind the paywall and put them

2:34:41

out in audio so you can subscribe

2:34:43

to that at twit.tv slash ULS. I

2:34:46

am sad to report

2:34:49

that one of our dearest most

2:34:52

beloved hosts has passed away.

2:34:54

Every single show since 2006 you see

2:34:57

me use this microphone. This is a

2:34:59

Heil PR 40. It's

2:35:02

a microphone I discovered in 2006 when

2:35:05

Bob Heil offered it as a prize

2:35:07

for the

2:35:10

best podcast award. We won the award. I

2:35:12

used the mic and I went wow I'm

2:35:14

never using another mic again. Bob

2:35:17

a great legendary not just

2:35:19

microphone builder but sound man

2:35:22

passed away this week at the age of 83. He was the

2:35:24

host of our Ham

2:35:26

Nation show for 10 years. A ham

2:35:30

Elmer, as they call him, a

2:35:32

guy who taught and helped young

2:35:34

amateur radio enthusiasts

2:35:37

get their license and

2:35:39

get into the hobby. But

2:35:42

he was also an organist famous

2:35:44

for his accomplishments. He was,

2:35:47

at the age of

2:35:49

15, the theater organist at the

2:35:51

Fabulous Fox Theatre in St. Louis, a protégé

2:35:53

of Stan Kann, the great organist.

2:35:56

We did a great Triangulation with Bob,

2:35:58

which I'll recommend you listen to. Bob

2:36:01

says that in the process of

2:36:03

learning how to play that organ and how

2:36:05

to tune, those hundreds, actually it

2:36:07

was literally thousands of pipes in the

2:36:09

great Wurlitzer, he learned how to listen

2:36:11

carefully and that helped him become a

2:36:14

sound guy. He opened Ye Olde

2:36:16

Music Shop, a successful professional

2:36:19

music shop in Marissa, Illinois.

2:36:21

Eventually that turned into Heil Sound. It

2:36:24

was when he was running the music shop in

2:36:27

1970 that the Grateful Dead came

2:36:29

to town. They were coming to St. Louis to

2:36:32

play the Fabulous Fox in February 1970. They

2:36:35

didn't have a sound system. They

2:36:37

went to Ye Olde Music Shop

2:36:40

and Bob provided his own sound system for the

2:36:42

dead. It was such a success they asked Bob

2:36:44

and his sound system to join them on the

2:36:46

tour. That led

2:36:49

Bob to designing sound for rock and roll.

2:36:51

He toured with The Who on their Who's

2:36:54

Next Tour. He designed the Quadraphonic Sound for

2:36:56

their Quadrophenia Tour and very

2:36:58

famously he designed

2:37:01

the Talk Box

2:37:03

for Peter Frampton. Now some of you are

2:37:06

way too young to remember the 1976 number

2:37:09

one album Frampton Comes Alive.

2:37:12

But I played that on repeat for

2:37:14

the entire year and

2:37:16

one of the things that made that

2:37:18

such a unique album was the

2:37:21

Talk Box. He's able to

2:37:23

play his guitar and

2:37:26

somehow make his mouth and

2:37:28

make the guitars talk by moving his

2:37:30

mouth. Well Bob told the

2:37:32

story on a triangulation. Peter Frampton's wife came

2:37:34

to Bob and said I need a perfect

2:37:37

gift for Peter for his birthday and

2:37:39

Bob said okay let me design something. He

2:37:41

designed a little amplifier that would attach to

2:37:43

the guitar and then to

2:37:45

a hollow tube that Frampton

2:37:47

could put in his mouth while playing the guitar. The guitar

2:37:50

sound would be piped up through the hollow tube into

2:37:52

his mouth which he could then use to shape the

2:37:54

sound which would then go out into the microphone. It

2:37:57

was such a unique sound, it made that

2:37:59

a hit album, made Frampton a

2:38:01

superstar. Joe Walsh used it on

2:38:03

his Eagles music. In fact,

2:38:06

I remember when we interviewed Joe Walsh on

2:38:09

Ham Nation. It was a great moment for

2:38:11

me to get to talk to the Eagles

2:38:14

lead guitarist. He said,

2:38:16

this is my favorite

2:38:19

thing to play. And Bob said, yeah, and

2:38:21

no one ever played it better than

2:38:23

Joe Walsh. His original

2:38:26

talk box is now in the Rock and Roll

2:38:28

Hall of Fame in Cleveland,

2:38:30

Ohio. In fact, Heil Sound is

2:38:32

the only manufacturer featured in a display at the

2:38:35

Rock and Roll Hall of Fame.

2:38:38

He created the first modular mixing

2:38:40

console, the Mavis, his

2:38:42

custom quadraphonic mixer that

2:38:44

he did for the Who and the first Heil

2:38:46

Talk Box, all at the

2:38:48

Cleveland Rock and Roll Hall of Fame.

2:38:51

He became an amateur radio operator when he

2:38:53

was 13. He was a young guy and

2:38:55

has been a ham ever since. But he

2:38:57

was, you know, later in life

2:39:00

bemoaning the quality of ham microphones. They

2:39:02

were universally awful. So he designed his

2:39:04

own ham microphone and got into the

2:39:06

microphone business, which got him

2:39:08

into making what we consider the best

2:39:10

large-coil dynamic microphone in

2:39:13

the business and one we've used ever since

2:39:15

and love so much. A

2:39:18

great ham, a great Elmer, a great sound

2:39:20

designer, a legend. He's

2:39:24

survived by his beautiful wife Sarah who

2:39:26

is a wonderful person and his children.

2:39:29

In lieu of flowers, they're asking, and I

2:39:31

will put a link to the obituary at the

2:39:35

Kurrus Funeral Home, where Bob is

2:39:37

right now and is being held

2:39:39

for services. In lieu of flowers, memorial contributions

2:39:41

can be made to the Shriners Children's St.

2:39:44

Louis or the American Radio

2:39:46

Relay League Education and Technology

2:39:48

Fund, benefiting ARRL's education initiatives

2:39:50

in schools. He was

2:39:52

a legend in his purple jacket. He

2:39:55

came to our studios many times. We

2:39:57

loved Bob Heil. We

2:40:00

knew he wasn't doing very well. He got cancer

2:40:02

about a year ago and it's been a long

2:40:05

battle, but he finally succumbed

2:40:07

earlier this week at the age of 83. Bob,

2:40:10

we love you, we miss you, and I know the heavenly choir

2:40:12

is going to sound a hell of

2:40:14

a lot better when Bob Heil gets

2:40:16

there. A silent key, Bob Heil.

2:40:19

And of course he had his ham

2:40:23

call sign since he was 13, which is kind

2:40:27

of cool: K9EID. So

2:40:30

there's a silent key for K9EID. I

2:40:38

got the PR-40 you sent me right here. Yeah,

2:40:40

it's a great microphone. We

2:40:44

loved Bob and he was an amazing guy. So I

2:40:46

hate to end on a sad note, but he deserves

2:40:48

the attention and

2:40:50

the accolades. We have so many amazing

2:40:53

stories about Bob. What

2:40:55

a great guy. Thank

2:40:59

you so much, Anthony Ha and

2:41:02

Sam Abuelsamid. Anthony,

2:41:04

give us some great original content.

2:41:06

He's the host of the original

2:41:09

content podcast. Oh, sure.

2:41:11

It's coming up that you're excited

2:41:13

about. Oh, that coming

2:41:15

up. Well, I would say that if you're just

2:41:17

looking for something to watch right now, that's really

2:41:19

fun. Something that

2:41:21

was on Max for a couple of years

2:41:24

but just made its way to Netflix is

2:41:26

Warrior. It's a martial arts show

2:41:28

set in 19th

2:41:30

century San Francisco, but like kind of

2:41:32

a very heightened almost fantasy

2:41:34

version. I think

2:41:36

very, very loosely based on some

2:41:39

ideas that Bruce Lee had for

2:41:41

I think what eventually became Kung

2:41:43

Fu. And

2:41:45

it is just a lot of fun.

2:41:47

It's definitely pulpy. It's trashy. It's

2:41:50

the kind of cable, you know, Cinemax

2:41:52

original show where in the first episode,

2:41:54

you'll see a lot of nudity a lot of like, all right, I

2:41:56

see what kind of show this is. But you'll

2:41:58

have a really good time with it. It's so funny,

2:42:00

because they always do that in the first

2:42:02

episode of every one, because it's like,

2:42:04

oh well, we know you won't watch this

2:42:07

show unless we give you something salacious,

2:42:09

and then that's it, right? Then it's over.

2:42:11

Now I realize it's

2:42:13

ridiculous. Yes, but when I think

2:42:15

about it, is that really where we are?

2:42:17

And

2:42:19

arguably, you know, sometimes they

2:42:21

are proven correct. Especially

2:42:24

a show like this ends up with some

2:42:26

A-listers. And

2:42:28

I'm excited about the Three Body Problem,

2:42:30

which is coming to Netflix and if you

2:42:32

haven't already, I highly recommend reading the book

2:42:35

before the show comes out. I agree.

2:42:37

I'm always a fan of reading

2:42:39

sci-fi books before the movie,

2:42:41

because one or the other is going

2:42:43

to imprint on you: how it

2:42:45

looks, how it feels, how it sounds. And

2:42:48

the book is so brilliant, actually

2:42:50

the books, there's three of them,

2:42:52

that it's worth

2:42:54

reading them first. It's

2:42:56

a little difficult because it's translated from

2:42:58

Chinese, and the translation I think isn't

2:43:00

very elegant. Maybe that's how the

2:43:02

book was written. But

2:43:04

the thoughts, the story, the ideas, the people are

2:43:06

really great. I can't wait to see it.

2:43:09

Have you? You haven't seen a preview of it?

2:43:11

I've seen the trailers that are out,

2:43:13

but it still hasn't been available to

2:43:15

me. Have you seen Dune Two yet?

2:43:18

No. I'm so angry about this.

2:43:20

I have a friend of mine who's

2:43:22

out of town this weekend. We agreed

2:43:25

to go together, so we're going on

2:43:27

Thursday, and I am absolutely serious,

2:43:29

he's been dying to see this,

2:43:31

and I can't wait. Isn't it,

2:43:34

isn't it in the theaters now? I have been to

2:43:36

no movie theater since... since,

2:43:38

well, since COVID. And

2:43:40

I've been once. I'm sorry, once.

2:43:43

John Fleming, our studio manager, rented an entire

2:43:46

movie theater just for us so it was safe

2:43:48

to go. What did we see? I forget, John.

2:43:50

Sorry. Doctor Strange. Still,

2:43:52

that was pretty good. Although, it's

2:43:55

pretty good, but I think I might wait

2:43:57

to see Dune until it comes out.

2:43:59

And Oppenheimer? You saw that in a theater.

2:44:01

Oh, right. That's right.

2:44:03

You know more about me than I

2:44:05

do. That's right. I forgot, there

2:44:07

was an IMAX theater. And does

2:44:09

that count, really? I mean, as a medium

2:44:12

or an amusement park? That was

2:44:14

also at a theater near your house.

2:44:16

Dune Two is in IMAX too. That's right. Yeah,

2:44:19

Or is it native IMAX or is it

2:44:21

adapted to IMAX? I

2:44:23

think it's native. I don't know how much,

2:44:25

how much of it, but some of it. I

2:44:27

think I would be willing to see it that

2:44:29

way. I saw Dune, it was amazing. And

2:44:32

I'm a fan of the book. That's a

2:44:34

good, another good example of a book you

2:44:37

should read first. And this one's pretty true to the

2:44:39

book, unlike, say, Foundation, which

2:44:41

wasn't true to the book. It's good. All

2:44:44

right, there's some good things to watch

2:44:46

for. Sam, what are you watching

2:44:48

these days besides Shogun?

2:44:51

We just started last night. We watched the

2:44:53

first episode of The Completely Made-Up

2:44:55

Adventures of Dick Turpin. I can't wait to

2:44:57

see that. I downloaded it for our trip to

2:44:59

Mexico. It was really funny.

2:45:01

Yeah, very funny. He was

2:45:04

in The IT Crowd, and

2:45:06

yeah, I really want

2:45:08

to see this. And

2:45:11

also, that's on Apple

2:45:13

TV+. Yeah. And there's

2:45:15

Sexy Beast.

2:45:17

If you remember

2:45:20

the movie from 2000. Yeah! This

2:45:22

series is a prequel. It shows

2:45:24

you the origins of Gal

2:45:27

and Don Logan.

2:45:29

And you see

2:45:31

that Don Logan is the character

2:45:34

that Ben Kingsley played in the movie

2:45:36

in 2000. You

2:45:38

see the origin story of how

2:45:41

they got to where they

2:45:43

were at that point in the movie,

2:45:45

and it's really, really good. It was

2:45:47

a great movie. You probably should see

2:45:49

the movie first? No,

2:45:52

maybe not necessarily. Okay. It makes a

2:45:54

decent watch first.

2:45:57

Mr. and Mrs. Smith on Amazon is also

2:45:59

really good. Yes. Not like the

2:46:01

movie, but the

2:46:03

TV show is good, with

2:46:06

Donald Glover and Maya

2:46:08

Erskine. Yeah. It's quite fun to

2:46:10

watch. And something

2:46:12

that is actually finished now. But

2:46:14

if you haven't watched it, I

2:46:17

highly recommend you watch Reservoir Dogs,

2:46:19

er, no, Reservation Dogs!

2:46:21

It's on Hulu,

2:46:24

from FX. It's a fantastic

2:46:26

show. It's about

2:46:28

a group of teenage

2:46:31

Native Americans who

2:46:33

live on a reservation in Oklahoma.

2:46:36

And it's just a joy.

2:46:38

Taika Waititi

2:46:40

is an executive producer,

2:46:42

but Sterlin Harjo is

2:46:44

really the creator, and he

2:46:46

wrote, he wrote almost

2:46:49

all of it, and

2:46:51

it's really wonderful and it's

2:46:53

definitely worth watching. Three seasons

2:46:55

of it, and it's fantastic.

2:46:57

Wasn't he the director of

2:46:59

Atlanta? Atlanta,

2:47:01

So it has the same sort of vibe.

2:47:03

I think he did, yeah,

2:47:05

he did some of them. Yeah, he directed

2:47:07

some of the episodes of Atlanta. There

2:47:11

were a bunch of different directors on that, but

2:47:13

yes, absolutely. If you haven't watched Reservation

2:47:15

Dogs, watch that. Now

2:47:17

you have TV to go home and watch. Thank

2:47:21

you so much, Sam

2:47:23

Abuelsamid, our car guy. He

2:47:25

appears regularly on Ask The Tech Guys

2:47:27

and our other shows.

2:47:29

Wheelbearings.media for

2:47:31

the Wheel Bearings podcast, and of course

2:47:33

principal researcher at Guidehouse Insights,

2:47:35

which keeps you pretty busy during the day.

2:47:37

It certainly does. Real pleasure to

2:47:40

have you, all the way from

2:47:42

Ypsilanti, Michigan. Thank you, Sam! Always

2:47:45

Fun to be on the show with you, Leo and Anthony.

2:47:47

And we'd love to have

2:47:49

you on again soon. He's the host

2:47:51

of the Original Content podcast, freelance

2:47:53

writer. You read his stuff all

2:47:55

over, all

2:47:57

the time.

2:48:02

Anyway,

2:48:05

it is

2:48:07

so great to have you, Anthony.

2:48:09

Thank you for being here,

2:48:11

and thanks to all of you

2:48:13

for watching. We appreciate it.

2:48:15

This Week in Tech is

2:48:17

our flagship show. We

2:48:19

do it with our network every Sunday,

2:48:21

two to five p.m. Pacific

2:48:23

time, five to eight p.m. Eastern,

2:48:25

2200 UTC.

2:48:27

We stream it live, as we do

2:48:29

with all of our shows, while

2:48:31

we're taping, so you can

2:48:33

watch, you know, behind the scenes as we do

2:48:35

the show, on YouTube at

2:48:37

youtube.com/twit. But most people watch after the

2:48:40

fact, as is true for all podcasts.

2:48:42

Audio or video is available at the

2:48:44

website, twit.tv. You

2:48:46

can also watch it on

2:48:48

YouTube; there's video of each show on

2:48:50

a YouTube channel dedicated to it. The

2:48:52

best thing to do, though, if you would,

2:48:55

is to subscribe in your favorite

2:48:57

podcast player. That way you'll get it

2:48:59

as soon as we have it, and you'll be ready for

2:49:01

your Monday morning commute armed

2:49:03

with This Week in Tech. A very special

2:49:05

thanks to our Club Twit

2:49:07

members, who as always

2:49:10

make the show possible. If you're not

2:49:12

yet a Club Twit member, thanks

2:49:14

in advance. Thank you for being here,

2:49:16

we'll see you next time, and another

2:49:18

TWiT is in the can.
