We Just Found It on the Doorstep

Released Friday, 5th July 2024

Episode Transcript

0:00

So there's a handful of people that I'll schedule like

0:02

a monthly FaceTime call with, and most of

0:04

them, you know, almost all of them, in fact

0:06

all of them, are not local. And

0:08

then there's a handful of people that I try to do

0:10

lunch with like once a month and my

0:12

good friend Sam, he and I had our monthly

0:14

lunch today and we went to a place. But

0:18

I had a problem. During

0:21

lunch, there was music

0:23

outside, which was good. But

0:28

the Jack Brown burger joint trolled me

0:30

because they were playing a Phish

0:33

album during the entire lunch.

0:37

And all I could do was think about how

0:39

happy you would be if you were there or

0:41

if you at least knew this was happening as

0:43

it was happening. In retrospect, I should have, like,

0:45

FaceTimed or something, just to be like, listen to

0:47

this joke. And the

0:50

worst part of all, the worst part of

0:52

all, you could tell

0:54

me what songs I heard and I would probably

0:56

be like, sure. But

0:58

there were a couple of songs that even

1:00

I recognized as Phish songs. Like, you know,

1:02

not only did it have the vibe of

1:05

Phish, but I, like, had heard the

1:07

songs before and recognized them. And

1:09

I forget which ones they were. The

1:11

only one I know by name is "Bouncing Around the Room," and that

1:13

was not it. But there was one

1:15

and I'm sure this is describing half

1:17

of Phish's catalog, but where it was repeating the

1:19

same phrase over and over again. And it

1:21

was very catchy. That's fairly common. Yeah, exactly.

1:23

But anyways, "David Bowie," maybe. I'm

1:26

kind of proud of you that you recognized that

1:28

it was Phish. I'm not sure I could do that.

1:30

I don't know any of their songs. Maybe

1:33

I could pick it up based on vibe, but I don't think I've

1:35

even heard that much. So like you must. When

1:38

are you listening to Phish so much that you recognize

1:40

songs? I can give you a good heuristic, John. If

1:43

you hear a song that you don't recognize, you don't

1:46

think you've ever heard it on the radio before. Look

1:49

around the room. And if the

1:51

whitest guy in the room is slightly

1:54

bopping his head to it. That's me,

1:56

though. And zero other

1:58

people are. There's a

2:00

decent chance it's Phish. It

2:03

could be anything, like, I don't know. I

2:05

think my chances of, like, spontaneously recognizing,

2:07

you're at a restaurant, there's music playing in the

2:09

background, spontaneously recognizing Phish, I think my odds

2:11

are very low. I guess I'd have

2:13

to look for somebody with the little red

2:16

blood cell, you know, pattern

2:18

on their clothing, and if they were bopping to

2:20

it or something, then I could figure it out. Right, now

2:22

I know what that is. That's the one thing I can

2:24

recognize. Marco taught me what that is, and now I see

2:27

it on people's, like, license plate surrounds. I'm like, oh, one

2:29

of them. Anyway,

2:32

the worst part, Marco, the worst part of

2:34

this entire lunch, and about the only bad

2:37

part of this lunch, because I really do

2:39

enjoy Sam so very much. And I guess,

2:41

yes, did you like some of it? It

2:44

wasn't bad. So

2:49

we have a new member special. We have

2:51

gone back to the well, and

2:54

we have done another ATP tier list.

2:56

John, can you remind us all: what is

2:58

a tier list? I can't remind you all,

3:00

because everybody knows what a tier list is, except

3:02

for all the people who listen

3:05

to this podcast, but then they've also heard the specials

3:07

before. So: it's a tier list. You rank things, you

3:09

put them in tiers; multiple things can be in a

3:11

single tier. The top tier is S. Why? Nobody knows.

3:13

Except somebody knows, but we don't really care. The point

3:16

is, it's better than A. It's a tier list, and

3:18

it's grading: it's like A through F, and then S

3:20

on top of A. Mm-hmm. And

3:23

we graded all the iPods. No,

3:25

at least most of them, anyhow. And so

3:27

I am pretty confident that we did a

3:29

pretty good job on this. There was a little bit of horse

3:31

trading involved, but I'm pretty happy with where we ended up. We

3:34

made a handful of people that we know

3:37

very upset, and I'm sorry that you're upset,

3:39

but we're right. So if you are

3:41

curious to hear this tier list or any

3:43

of the others, you can go to

3:46

atp.fm/join, and if you join,

3:48

even for about a month (but you should do

3:50

more), then you can get to all of the

3:53

members specials. We've been trying to do one a month

3:55

for, what, like a year or two now? I forget exactly

3:57

how long it's been, but we've racked up a fair

3:59

number over the course of the last several months.

4:02

There's a handful of tier lists. We

4:04

do ATP Eats among

4:06

other things. There's a lot of, there's a lot of

4:08

good stuff in there and some silly stuff. So ATP

4:10

tier lists. And if you are a member and you

4:12

would like to watch the tier list

4:14

happen, which is not required, but is occasionally

4:17

helpful. There is a super

4:19

secret YouTube link in the show notes for members

4:21

where you can go and watch it on YouTube

4:23

as well. Please do not share that. It's the

4:25

honor system, but you can check it out

4:28

there as well. It's in the show notes

4:30

for the member special. Sorry, yes, thank you. When you

4:32

go to the iPod tier list member special, look in

4:34

the show notes, the first link will be the YouTube

4:36

video. I like this tier list because they always, we

4:38

always seem to, I think they reveal something about the

4:40

things that we are ranking. Something that

4:42

we, at least I usually didn't know going in. You

4:44

think, oh, you're just gonna rank them and people are gonna,

4:47

you know, have controversies over which is good and which is bad.

4:49

But I think in the end, when you look at the whole

4:52

tier list and you kind of look at the shape of it

4:54

and how it's worked out and how contentious the choices would be,

4:57

you learn something about it. Like I think our connectors tier

4:59

list was like that. And I think the iPod one turned

5:01

out like that too. And the reason

5:03

we made some people angry is because we

5:05

know a lot of really weird tech people

5:07

with very specific and often very strange opinions,

5:10

specifically about iPods. I think you could also

5:12

say incorrect opinions. Like

5:14

they have their reasons. At least most

5:16

of them have reasons that make some sense. I think

5:18

one of the things we learned, not to spoil too

5:20

much, is that a lot of people

5:22

have, you know, all the things that

5:24

we put in tier lists, people can have personal

5:28

sentimental reasons for. We all certainly do.

5:31

And, you know, listeners do as well. And I

5:33

think iPods, more than anything we've done before, like

5:36

the people who had opinions, they swayed

5:38

heavily into the sentimental, right? It

5:41

was, you know, it was like, this was my first

5:43

iPod. I really love this thing, right? Much

5:46

more so than the past tier

5:48

list we've done. So I think, you know, maybe

5:50

the iPod at that point was the most personal

5:52

product Apple had ever made. Yeah,

5:55

I mean, honestly, like, I had a lot of

5:57

fun with this one because, like. Even

6:00

though I hardly ever really

6:02

used iPods because by the time I

6:04

could really afford decent iPods, it

6:07

was only very shortly before the iPhone really

6:09

took over. So I only really had a

6:11

couple of years with iPods, but those couple

6:14

of years, I really liked the iPods. And

6:16

this was actually fun. So, and just for

6:18

coincidence sake, I happen to have bought a

6:21

couple of iPod nanos off of eBay a couple

6:23

of years back, just to kind of play around

6:25

with. And I took

6:28

them out the other night after we recorded

6:30

this episode and charged them up. And well, the ones that

6:33

would accept a charge, at least. Charged them

6:35

up and got to play around

6:37

with the old iPod Nano. And I will

6:39

just say, I stand

6:41

by everything I said on that episode, everything.

6:44

So feel free to listen and tell

6:46

us how wrong we are. And you too listener can

6:48

pay us $8 a month to

6:51

yell at your podcast player just a little bit more.

6:53

So we encourage you to do that. That's

6:56

absolutely great marketing. Thank you, Marco.

6:58

And by the way, our membership

7:00

episodes are DRM free. And

7:02

so if you happen to use an

7:04

iPod to listen to your podcasts, we

7:07

are fully compatible. So you can pay us

7:09

$8 a month to listen to our member

7:11

content on an iPod if you actually have

7:13

one. And you can honestly buy one on

7:15

eBay for only a few months worth of

7:17

membership fee because they're pretty cheap these days.

7:21

Indeed. And hey, what would you listen to

7:23

on an iPod if not a podcast? Well,

7:25

you could listen to music and you could

7:27

listen to music on a U2 iPod. And

7:29

so Brian Hamilton wrote in with regard to

7:31

the red and black colored U2 iPod. We

7:33

were wondering, I thought we were wondering on

7:35

the episode, certainly there was some mumblings about

7:37

it on Mastodon afterwards, how did they get

7:39

to red and black for the color scheme

7:41

of the U2 iPod? And Brian

7:43

wrote in to remind John and us about

7:46

How to Dismantle an Atomic Bomb, which was

7:48

released November 22nd of 20... or,

7:51

excuse me, of 2004. And the

7:53

color scheme on the cover art for that album

7:56

is red and black. We're worried on that one, John, Mr. U2.

7:59

Yeah, I remember. I heard it once I was

8:01

reminded of it. I mean, here's the thing. Like I

8:03

said on the episode, it's not as if red and

8:05

black became the iconic colors of the band. This was

8:08

one album that was released, obviously, at

8:10

the same time as the iPod as part of a

8:12

promotional thing, like the iPod, the U2 iPod, the first

8:14

U2 iPod was released in October, and the album came

8:16

out in November. So it's a tie-in, right? And then

8:18

there were future U2 iPods, and they were also red

8:20

and black. But at that point, U2 hadn't released a

8:23

new album. So they're all just tied to this one

8:25

album. But they have released a lot of

8:27

albums, and there were future albums, and there were past albums,

8:30

I can tell you that this one and this color

8:32

scheme did not become heavily associated with the band. But

8:34

that's the reason. That's why they went with red and

8:36

black, because of the cover of the album. Are you

8:38

saying that as an assumption? I'm genuinely asking. Are you

8:40

saying that as an assumption? No, once

8:43

I was reminded of it, I'm like, oh, yeah, that's why they did

8:45

it. I mean, it's not a great reason, but I'm pretty sure it's

8:47

the reason. Fair enough. Max

8:50

Velasco Knott writes in

8:52

that there's also another feature, and I'm using

8:54

air quotes here, on the U2

8:57

iPod. Max writes, the U2 iPods featured

8:59

signatures of the band members on the backside. I was

9:01

fine with the black-red color scheme, but couldn't stand seeing

9:03

Bono and company on the back whenever I turned them

9:05

over. Yeah, I'd forgotten about that

9:07

as well. I mean, obviously, it's a shiny back end

9:09

that doesn't show up that much. But if you really

9:12

just wanted a red and black iPod and didn't care

9:14

about the band, the signatures on the back messed

9:16

it up a little. Indeed. Nikolai

9:19

Bronvol Ernst writes to us with

9:21

regard to the DMA and Apple's

9:23

Cut. Nikolai writes, I really enjoyed

9:25

your last show, 593, not

9:27

a European lawyer. I'm also not a European lawyer, but I

9:29

am a citizen in the EU and wanted to provide a

9:31

single European's point of view. The DMA

9:34

has nothing to do with Apple's Cut in

9:36

the App Store or how much money Apple

9:38

earns from selling their hardware. It only has

9:40

to do with ensuring fair competition, citizens' rights

9:42

to freely choose services they want to use without

9:44

vendor lock-ins on interoperability, portability, and your own

9:46

data, which we here in the EU believe

9:48

belongs to the user. That was a pretty

9:51

good summary. A lot of people have written in

9:53

to say this, but I think people get hung up on the

9:55

idea. They're like,

9:57

"Apple's cut, and how the

10:00

EU is trying to control that," and they're like, the

10:02

EU is not trying to tell Apple how much money

10:04

it can make, it's just trying to do this other

10:06

thing. But the

10:09

reason it gets mixed up and the reason

10:11

people send us these emails is

10:13

because what Apple did to, you

10:17

know, supposedly comply with the DMA

10:19

while also trying to prevent competition

10:22

is an application of fees. So it's that

10:24

we, you know, OK, well, the EU says

10:27

you have to allow for competition. Apple says, OK,

10:30

sure, we'll allow competition. But all of our

10:32

competitors have to pay us an amount that

10:34

makes it so they can't compete

10:36

with us, right? And the cut we're

10:38

talking about is not Apple's cut from its

10:40

own App Store. Like when you

10:42

sell through the App Store, you pay Apple

10:44

some cut. It's the cut Apple demands from

10:46

the App Stores and the people selling

10:48

through App Stores that are not Apples on App Store,

10:51

that are selling through third party App Stores. Apple is

10:53

using money, using fees to

10:56

make the competition less competitive.

10:58

And that's what we're talking about. I know it's

11:01

even confusing when we're talking about Apple collecting its

11:03

money or Apple having its fees and stuff like

11:05

that. So I think maybe that's the source of

11:07

the confusion. And the other thing is, by the way, that

11:10

plenty of countries, including the EU, do

11:12

actually tell companies that they can't

11:15

make a certain amount of money on a certain

11:17

thing that they do. Someone wrote in to give

11:19

us the example of like credit cards, like a

11:21

MasterCard and Visa, the two big credit card networks.

11:24

I think in the EU, the fees

11:26

they charge stores to

11:28

process the credit cards are essentially

11:30

capped. And the EU

11:32

has basically said, Visa and MasterCard

11:34

own the market. You can continue to do

11:36

that, but you can't charge merchants

11:39

any more than 0.1%. The

11:43

EU has not done that to Apple. They

11:45

haven't said to Apple, hey, Apple, you can't

11:47

charge more than 10% in your own App Store.

11:50

They haven't said that at all. They haven't said anything about what

11:52

Apple can charge in their App Store. What they just want is

11:54

more competition, and Apple is saying, OK, there

11:56

can be other App Stores, but they all have to give us an amount

11:59

of money that makes it unattractive. And

12:01

yeah, we'll see how that flies. Again, the

12:03

EU has not yet ruled on the

12:06

core technology fee and all the other things that

12:08

they're investigating. So far, they've only ruled on the

12:11

steering provisions about how Apple

12:13

restricts the way apps in

12:16

its own app store can link out to third party

12:18

payment methods. But we'll see how

12:20

those other decisions come out in the coming months and

12:22

years. I don't know how long this is going to

12:25

take. But right now it's not

12:27

looking good for the core technology fee. Let's say that. We

12:30

asked for mostly tongue in cheek,

12:32

but we asked for Brexit-style

12:34

names for Apple leaving the EU.

12:38

Jared Counts was the first

12:40

we saw to suggest

12:42

"iLeave." Frederick B. Jormann suggested

12:44

"Axit" and provided a truly

12:46

heinous but hilarious, I

12:49

presume AI-generated, image for this. My

12:52

personal favorite though, was suggested several times.

12:54

First we saw was from Oliver Thomas:

12:56

"iQuit." Yeah, that's pretty good:

12:58

iLeave and iQuit. We had many more suggestions; we

13:00

thought these were the top three. The iLeave and iQuit are

13:03

cute, but I kind of like Axit

13:05

because it's as close to Brexit and

13:07

the axe thing. Like, the picture has,

13:09

like, an EU-themed

13:12

Superman holding an axe and an apple. And

13:14

yes, it does look AI generated. It's interesting

13:16

how due to the way the various models

13:19

that we're familiar with have been trained, most

13:21

people can now look at an image and

13:23

identify it immediately as AI generated based

13:26

on like the shading and the

13:28

weirdness of hands and all sorts of other stuff.

13:30

It is kind of strange how quickly that happened.

13:32

But anyway, I kind of like

13:34

Axit, but I don't

13:36

think we get to pick this name. So I mean,

13:38

I don't, MacBook one didn't really catch on. Neither did

13:41

a MacBook. Oh please, it sure did. Well, within our

13:43

little circle of podcasts. Yes. But I don't see the,

13:45

uh, the New York

13:47

Times running with Axit or iQuit. Yeah,

13:49

we don't really seem to have naming power

13:51

in the, in the greater ecosystem. If

13:55

we try hard enough, we can make fetch happen. All

13:57

right. Uh, someone anonymously wrote in

13:59

with. So

18:00

maybe there's some other factors there. How does it, how

18:02

would it possibly be measuring the sound on the inside

18:04

of your ear? Is there a microphone that's facing the

18:06

inside of your ear? I think there might be. Isn't

18:09

that how they do some of the calibration stuff? So

18:11

anyway, the point is my experience actually

18:14

using them, it

18:16

really does not feel like I'm

18:19

hearing a 95 decibel concert for three hours.

18:22

It feels like what it says: 85. Well,

18:25

how loud was the concert outside of the

18:27

ear? From your seat, did you

18:29

look at the decibel meter? If I had nothing

18:31

on, what would the level be? Yes. So I did

18:33

a couple times where I would take the AirPods

18:35

out and put them

18:38

away so they turn off and just listen and watch

18:40

and see how the watch measures the concert fully. And

18:44

it was, I don't remember exactly, but

18:46

I remember it was somewhere in the high 90s, I think.

18:48

So not quite as loud as there. So maybe

18:51

the difference is that they were

18:53

coming from 105 decibels. And

18:55

they came out to 95. And I was coming, I think, from somewhere in

18:57

the 90s down to 85. So maybe that's

18:59

the cause. Or it could just be differences

19:01

in fit. I don't know exactly

19:04

how good is the seal with their artificial ear

19:06

setup compared to my actual ear. I don't know.
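(A quick worked version of the arithmetic being described here, treating the noise reduction as a roughly constant offset in decibels; that constancy is an assumption, since real attenuation varies with frequency, fit, and the active processing involved:

    105 dB concert - ~10 dB reduction ≈ 95 dB at the ear
     95 dB concert - ~10 dB reduction ≈ 85 dB at the ear

And because decibels are logarithmic, a 10 dB drop corresponds to roughly a 10x reduction in sound power either way.)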

19:08

There's no good way to know that. So

19:11

I think the conclusion to draw here is, first

19:13

of all, what we kind of already knew,

19:15

which is they provide

19:17

some protection, suitable

19:20

for occasional concert goers, not suitable if you're

19:22

going to be working in a factory every

19:24

single day. There's different degrees of

19:26

protection that you might need. This

19:28

is not everyday protection. But also, it probably

19:31

varies a little bit between both

19:33

fit and between what exactly you're

19:36

actually listening to, like how loud

19:38

is your environment. Maybe

19:41

it can't bring down 105 decibels, but

19:44

maybe it can bring down 95 decibels. So

19:47

obviously, there are other variables here. So

19:49

I think the advice that

19:51

I would give remains the same, which is,

19:54

if you have really serious hearing protection needs

19:56

or very frequent hearing protection needs, get real

19:58

hearing protection. If you're an

20:00

occasional concert goer like me and you want

20:02

basic hearing protection for occasional concerts, this

20:05

is probably fine unless you are standing

20:07

like directly next to the giant PA

20:09

speaker. Maybe you might need a little

20:11

bit more protection. But this

20:13

seems fine to me. And every time I've

20:15

used them, I feel great afterwards and my ears

20:18

don't ring at all. And it doesn't, there's no

20:20

fatigue. So like, it seems to be

20:22

working. So maybe it just has a limit to

20:24

how much it can work. Apple

20:26

is apparently using Google Cloud infrastructure

20:29

to train and serve AI. This

20:31

is from HPCwire. Apple has

20:33

two new homegrown AI models, including

20:35

a 3-billion-parameter model for

20:37

on-device AI and a larger

20:39

LLM for servers with resources

20:41

to answer more queries.

20:44

The ML models developed with TensorFlow were

20:46

trained on Google's TPU. John, remind me

20:48

what TPU stands for. Tensor processing unit.

20:50

We talked about the

20:53

actual hardware on a past show, and how many

20:56

billions of computations or whatever they do,

20:58

and how many different operands are in each

21:00

operation. But I think it's like "tensor

21:02

processing unit" or something. It's basically, so

21:05

Google doesn't buy its GPUs from Nvidia and

21:07

put them in, it makes its own silicon

21:09

to do machine learning. It has for many,

21:11

many years, it's not a new thing. They're

21:13

called TPUs. And that's what they're currently using

21:16

to train Gemini and stuff. And if

21:18

you pay them, just like you pay AWS or whatever,

21:20

you pay Google Cloud, I believe

21:22

they will rent you their TPUs and you can train your models

21:24

on it. And that's what Apple did. Indeed.
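To give a flavor of what renting Google's TPUs looks like from the software side, here is a minimal sketch using JAX, one common TPU stack; this is illustrative only, not a claim about Apple's actual AXLearn setup. Code running on a rented Cloud TPU VM can simply ask for the attached devices and run computations on them:

    import jax
    import jax.numpy as jnp

    # On a Cloud TPU VM this prints TpuDevice entries; elsewhere it
    # falls back to whatever is available (CPU/GPU).
    print(jax.devices())

    # A jitted computation is compiled for and executed on those devices.
    x = jnp.ones((1024, 1024))
    y = jax.jit(lambda a: a @ a)(x)
    print(y.shape)

The same program runs unchanged on CPU, GPU, or TPU, which is part of what makes renting someone else's accelerators practical.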

21:28

Apple's AXLearn AI framework used to

21:30

train the homegrown LLMs creates Docker

21:32

containers that are authenticated to run

21:34

on the GCP or Google Cloud

21:36

something. What is that? Google Cloud

21:38

computing? I don't know. Computers. GCP

21:42

is like AWS. It's Amazon Web

21:44

Services, but Google. Anyway,

21:46

to run on the GCP infrastructure,

21:48

AXLearn supports the Bastion orchestrator, which

21:50

is supported only by Google Cloud.

21:53

This is a quote from their

21:55

GitHub documentation. While

21:58

the Bastion currently only supports Google Cloud Platform.

22:00

There you go. I should have kept reading. My bad. Google

22:02

Cloud Platform Jobs, its design is cloud-agnostic. And

22:05

in theory, it can be extended to run

22:07

on other cloud providers, Apple stated on its

22:09

AX Learn Infrastructure page on GitHub. Yeah,

22:12

so this is, I mean, we didn't

22:14

put this in the notes, but the rumors

22:16

are that the deal between Apple and Google

22:18

to use Gemini as part of iOS 18

22:20

as an option alongside a chat GPT, that

22:22

deal is reportedly getting closer. But this is

22:24

from the past of like, hey, Apple's got

22:26

these models, the one that's going to be

22:28

running on people's phones or the various ones

22:30

that are running on their phones, which are

22:32

smaller. And the big ones, they're going to

22:34

be running on their private cloud compute. And

22:36

these are Apple's own models, and they train

22:38

them themselves. And how do they train them?

22:40

They paid Google to use TPUs to train

22:42

their models. And so I feel

22:44

like this is interesting in that Apple's

22:50

unfriendly relationship, let's say, with Nvidia

22:52

continues, and

22:54

their friendly relationship with Google continues. It's kind of a

22:56

surprise that Google didn't do the deal. Maybe the rumors

22:59

are, I think we talked about this on a past

23:01

show, that nobody's paying anybody for

23:03

the OpenAI thing, whereas maybe Google wanted

23:05

to be paid. So we'll see how this

23:07

works out. But yeah, there seems to be

23:10

a cozy relationship between Apple and Google, because

23:12

apparently Apple either doesn't have yet or doesn't

23:14

plan to have fleets of

23:16

massively parallel machine learning silicon that

23:18

they can train their models on.

23:20

But Google does. We

23:24

are brought to you this episode by

23:26

Photon Camera, the ultimate manual shooting app

23:28

for iPhone photography enthusiasts. Whether you're a

23:31

seasoned pro or just getting started with

23:33

photography, Photon Camera is designed to give

23:35

you all the tools you need to

23:38

elevate your iPhone camera experience to new

23:40

heights. Photon Camera is a beautifully designed,

23:42

easy to use manual camera app for

23:45

iPhone, perfect for both beginners and professionals.

23:47

You can say goodbye to confusing buttons

23:49

and hidden gestures. Photon Camera is very

23:52

intuitive and comes with a comprehensive manual

23:54

to help you learn the basics of

23:56

photography. They've also just launched Photon Studio,

24:00

which lets you use your iPad, or a monitor connected to

24:02

a spare iPhone, for a big screen

24:04

preview while you shoot. It also allows

24:06

you to favorite or delete images in

24:08

real time, view metadata, and even zoom

24:10

in to inspect details closely. And Photon

24:13

Enhance is the new powerful photo editor

24:15

for iPad and Mac that's also now

24:17

available. Both Photon Studio and

24:19

Photon Enhance are included free with

24:21

your Photon Camera subscription. And

24:24

here, of course, is the best part. For

24:26

our listeners, Photon Camera is offering an exclusive

24:28

deal. You can get up to 50% off

24:31

your first year by visiting

24:33

Photon.cam slash ATP. That's Photon,

24:35

PHOTON.cam, C-A-M slash ATP. Go

24:37

there, Photon.cam slash ATP, to

24:40

claim your discount and start

24:42

exploring the power of manual

24:44

photography on your iPhone today.

24:47

Thank you so much to

24:49

Photon Camera for sponsoring our

24:51

show. John,

24:56

I hear that you have

24:58

asked Apple for help and they have said,

25:00

you know what you need? You need a

25:03

Mac Studio. Because why would anyone need a

25:05

Mac Pro? This went around,

25:07

I think, a week or two ago. Apple's got

25:09

a page: apple.com/mac/best-mac.

25:11

And the title of the page is

25:13

Help Me Choose. Answer a few questions to

25:16

find the best Mac for you. And

25:19

when this was going around, the first thing

25:21

I did was launch this page and

25:23

I wanted to go through the little wizard and answer

25:25

a bunch of questions to see if

25:27

I could reach the win condition, which is

25:30

having this tool recommend the Mac Pro. Is

25:32

that the win condition? It is the win condition. Are

25:35

you sure? And the answer

25:37

was very clear. And I was mostly telling the truth, but

25:39

occasionally I would exaggerate to

25:41

make sure I go on the Mac Pro path. And

25:44

I did not end up at a Mac Pro. It

25:46

recommended Mac Studio to me. And a bunch of

25:48

other people tried. So a bunch of people tried to

25:50

use this tool to get the Mac Pro. Nobody could do

25:52

it. And Julia Montier tried it and found out how

25:54

to cheat to win the game. If

25:57

you look at the source code, you can

25:59

see that there's a JSON blob that

26:01

defines the options for

26:03

the endpoints. And that JSON (it's

26:05

not in a JSON file, but it is JSON) does

26:08

not contain the Mac Pro. It

26:11

contains pretty much every other Mac that

26:13

Apple sells, but there

26:15

is no way to get to the Mac Pro, because

26:17

the Mac Pro is not one of the options.
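Purely as an illustration of the kind of structure being described (the field names here are invented, not Apple's actual schema), the tell would be a product list along these lines:

    {
      "products": [
        {"id": "macbook-air", "name": "MacBook Air"},
        {"id": "macbook-pro", "name": "MacBook Pro"},
        {"id": "imac", "name": "iMac"},
        {"id": "mac-mini", "name": "Mac mini"},
        {"id": "mac-studio", "name": "Mac Studio"}
      ]
    }

with no Mac Pro entry anywhere in it.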

26:20

That's weird. Is it? No, this

26:22

is Apple telling you that literally nobody wants this

26:24

computer and nobody should have it. We all agree

26:26

on this show that the current Mac Pro is

26:28

not a great computer. But it is

26:31

a computer that exists. And on

26:33

top of that, there is at

26:35

least one very specific

26:37

reason why someone might want to use it. If

26:40

one of the questions had asked, hey, do

26:42

you have a bunch of PCI Express cards that you

26:44

need to use? If

26:46

the answer to that is yes, it's literally the only

26:48

computer Apple sells that you can do that on. And

26:51

that is really the only thing to recommend it. Do

26:53

you think the people who made this quiz know

26:55

what a PCI Express card is? I

26:58

mean, it's Apple. They

27:01

have questions and answers for every other computer. It

27:03

just seems weird to me. Now, again,

27:05

I can understand saying, well, this is not a

27:08

great computer. And really, honestly, no one should really

27:10

buy it. I agree with all of that. But

27:12

when you make a help me choose tool on

27:14

your website, you should have

27:16

all of the things as endpoints. And yeah,

27:18

make the Mac Pro pretty much impossible to

27:20

get to unless you need it. But there

27:23

is a reason someone might need it. If

27:26

someone is going through this tool and saying, I

27:28

don't know what I'm going to do. I've got all these audio cards

27:30

that I need to use for, you

27:32

know, my old Mac is dying. Is there some

27:35

other computer that I can use? How would you

27:37

determine that Apple still

27:39

sells computers with card slots

27:41

in them? Everyone on Mastodon

27:43

is saying, OK, well, the people who need the Mac Pro

27:45

know it, and so they don't need to use this tool.

27:48

That's not how these tools work. You could say the

27:50

same thing about, well, the people who need an iMac know they

27:52

want an all-in-one things that they don't need to use this tool.

27:56

And if you know which computer you need, yes, you don't need

27:58

this tool. But the tool exists to lead you to

28:00

whichever product that Apple sells is best suited

28:02

for you. And it's weird to

28:04

leave just one out. And

28:08

I would just love to know if the thinking behind that process is

28:10

like... Look, if Apple doesn't want to sell them,

28:13

don't sell them, right? But they're selling them. You

28:15

can buy them for a huge amount of money.

28:18

And the tool can make

28:20

it difficult or almost impossible to get there because when

28:22

it says, how many PCI Express cards do you need

28:24

to use? The default choice should

28:26

be zero or I don't know what a PCI Express

28:29

card is. Like have a million options that regular people

28:31

will click and they will lead them off that path

28:33

and say you shouldn't buy this. But if the person

28:35

says three or any

28:37

number other than zero, you have to lead them to

28:39

the Mac Pro because it's literally the only computer they

28:41

sell with card slots. I mean, you're

28:43

gonna hate this, but so I did the whole quiz

28:45

trying to get to the Mac Pro before

28:48

you said it wasn't an option and just putting

28:50

in all like the highest requirements. Like, you know,

28:52

I need all that. I do 3D editing and

28:54

content creation and video editing and audio editing. I

28:56

need all these tools. I need to connect a

28:58

bunch of stuff to my Mac and

29:01

it recommended exactly what I'm

29:03

using right now: the MacBook Pro

29:05

16-inch. I

29:08

thought for sure I'd at least get a Mac Studio, but

29:10

nope. Well, no, because the question it asks is do you

29:13

do all your work in a single location or do you

29:15

need to be portable? Did you say, I do all

29:17

my work in a single location? I said, like, the

29:20

one-desk option, the very top option, where it's

29:22

like I do everything at the same place on a

29:24

desk. Like I even I thought for sure I'd at

29:26

least get a Mac Studio. I think a lot of

29:28

the endpoints recommend two computers. Like I didn't

29:30

just get the Mac Studio. I got recommended the Mac Studio and

29:32

the Macbook Pro. Oh, I also got two computers,

29:35

the MacBook Pro $4,000 configuration and

29:38

the MacBook Pro $3,500 configuration. I

29:42

don't know how you didn't end up with desktop because

29:44

there must have been some question that's differentiating portability.

29:46

Obviously, if you mention you ever need to take it

29:49

somewhere, they're not going to recommend it at all. Yeah,

29:51

I don't know. I don't know how great this tool

29:53

is. Wizards in general are not great.

29:55

I like their comparison ones like for the phones where it does

29:57

like columns and you can list all the features and scroll and

29:59

see they are different from each other. This

30:02

doesn't do that. But I do

30:04

think it's very strange to not

30:06

have a single one of your

30:08

computers. Remember when they were selling the trash can for

30:10

years and years, and really nobody should be buying that,

30:12

right? But if you needed

30:15

whatever GPUs it came with, for a while

30:17

it still did have the most powerful GPUs

30:19

you could buy in an Apple computer. And

30:22

if you needed those GPUs and they had a tool that was

30:24

asking you a bunch of questions, they should have had a question

30:26

that said, do you use Maya at

30:28

Pixar and need this much GPU power and then it

30:30

will lead you to the trash can? But I

30:33

don't know. It's weird. Anyway, if someone at Apple knows why

30:35

the Mac Pro is omitted from this tool, please

30:38

tell us. I'm sure it's the obvious reason, which is like,

30:40

no one should buy that. And we kind of agree, but you're selling

30:42

it, so put it in the tool. Pretty

30:44

sure it's very clear why it's omitted.

30:48

Even the very first day this Mac Pro came out, nobody

30:50

should be buying it, let alone now. It's

30:54

not nobody. It is the only computer with slots.

30:57

That's not a great reason for it to exist, and it's not

30:59

a reason for you to pay twice as much as Mac Studio,

31:02

but especially since they don't support, I believe

31:04

they don't support at all anymore, the PCI

31:07

Express breakout boxes like they used to on

31:09

the Intel things, it's literally

31:11

your only choice if you have cards. And

31:14

that's one of the reasons they should continue to make it. And do continue

31:16

to make it, and they just never ask about that. Oh,

31:19

yeah. It made me laugh quite

31:21

a bit that nobody was coming up with the Mac

31:23

Pro. I don't know. Maybe that's

31:25

a feature, not a bug. All

31:28

right. For the main, main

31:30

topic this week for your main course, we

31:33

have a plethora of different AI

31:35

related topics. And I'm going

31:38

to try to take us on a journey.

31:40

We'll probably fail, and that's OK. But basically,

31:42

this next section is AI. Huh.

31:45

That's a thing, isn't it? And

31:48

so we start on the 17th

31:50

of June, for what it's worth, with our

31:52

friend John Voorhees at Mac Stories, which

31:56

is them saying, hey, the article is entitled

31:59

How We're trying to protect Mac stories from AI

32:01

bots and web crawlers and how you can too. And

32:04

it seems like both John and

32:06

Federico are getting very wrapped around

32:08

the axle with regard

32:10

to AI stuff. And I'm not saying I don't

32:12

mean to imply that they're wrong or that's bad,

32:14

but they are getting ever

32:16

more perturbed on what's

32:19

going on with AI crawlers. And

32:21

I mean, to a degree, I get it. So,

32:23

uh, that was on the 17th of June. John

32:25

says, here's how you can protect yourself from crawling.

32:27

And then on the 21st of June, Business Insider

32:29

writes. It says, oh,

32:32

OpenAI and Anthropic seem to be ignoring

32:35

robots.txt. And if you're not familiar, if

32:38

you have a webpage or website, I

32:40

guess I should say, where, um, where

32:42

you control the entire domain, you can

32:44

put a file called robots.txt at the

32:46

root of the domain. So,

32:48

you know, it would be

32:50

marco.org/robots.txt and any self-respecting

32:52

and ethically clear crawler will start

32:55

crawling marco.org or whatever the case

32:57

may be by attempting

33:01

to load robots.txt and seeing if

33:03

there's anything there. And if so,

33:05

there's a mechanism, a schema, if you will,

33:07

by which, um, the

33:10

robots.txt will dictate who or really what

33:12

crawlers should or should not be allowed

33:15

to crawl that site. And it's by path.

33:17

They can say everything in this directory, you shouldn't crawl

33:19

everything here. You can crawl it. So you can sort

33:21

of subdivide your site to say which parts are accessible.
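For illustration, a robots.txt using that path-based scheme might look like this (the paths and bot name are hypothetical, not from any real site):

    User-agent: *
    Disallow: /private/

    User-agent: ExampleBot
    Disallow: /

The first block asks all crawlers to stay out of /private/ while leaving the rest of the site open; the second asks one specific bot, identified by its user agent, to stay out entirely.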

33:24

Yeah. And I have thoughts on that, but we'll come back to that. Yeah.

33:27

I mean, whenever you're ready to interrupt, to be honest,

33:30

feel free. Okay. Let's talk about robots.txt. Uh,

33:32

so, so, well, let me just actually very quickly, I apologize.

33:34

I gave you the, we gave you the green light and

33:36

I'm giving you the yellow light. Just very quickly, it's already

33:38

in the intersection. It's

33:40

important to note that robots.txt has never been

33:42

enforced in any meaningful

33:45

way. It's been kind

33:51

of a friendly agreement amongst pretty

33:53

much the entire world wide web. But there's

33:56

never been any real, um, wood behind the

33:58

arrow, or whatever the

34:00

turn of phrase is. We call it advisory. Yeah,

34:02

Advisory locking. It is a

34:04

scheme that people who agreed to

34:06

that scheme can use that scheme to collaborate

34:08

and work together, but there is no actual

34:10

mechanism stopping anyone from doing anything. It is

34:12

literally just a text file that you can

34:15

choose to read or not. Right.

34:17

So with that said, Marco, carry on. Yeah. And

34:19

so robots.txt is

34:22

basically a courtesy. It is

34:24

a website saying, please

34:26

maybe follow these rules, if

34:29

you would, you know. But it

34:32

is not a legal contract. It is not a

34:34

legal restriction. Um, it

34:36

is not technically enforced or

34:39

enforceable really. It is

34:41

also not universally used and respected.

34:43

And so, I can

34:45

tell you, I operate crawlers of

34:47

a sort and I don't use robots.txt. So

34:50

when Overcast crawls podcast feeds, I

34:53

don't even check for robots.txt. I just

34:55

crawl the URL as the users have entered them

34:57

or as they have submitted them to iTunes slash

35:00

Apple podcasts. What robots.txt

35:02

advisories were originally

35:04

for was not

35:06

like, Hey, search engines, don't

35:09

crawl my entire site. That's not what they were for.

35:12

What they were for was mostly

35:14

to prevent like runaway crawls on

35:17

parts of a site that were potentially infinitely

35:19

generatable. So things like if you had like

35:21

a web calendar and you can just click

35:24

that next month, next month, next month button

35:26

forever if you want to. And so a

35:28

web crawler that like, you know, indexes a

35:30

page and then follows every link on that

35:33

page. If it's hitting like a web calendar,

35:35

it can generate basically infinite links, uh,

35:37

as it goes forward or backwards in time.
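That original use case is what the format expresses most directly; a sketch, with a hypothetical calendar path:

    User-agent: *
    Disallow: /calendar/

One rule, and every cooperating crawler skips the infinitely generatable section while still indexing everything else.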

35:40

So the main purpose

35:42

of robots.txt was to kind

35:44

of advise search engines. And

35:47

it was specifically for search

35:49

engines. It was to advise

35:51

them areas of the site that

35:53

crawlers should not crawl mostly for

35:55

technical reasons, occasionally for some kind of privacy

35:57

or restriction reasons, but usually it was just.

38:00

was some kind of like legal contract that

38:02

said, you must obey my rules. That really

38:04

has never been tested until fairly recently. Like,

38:06

that was never really something that really

38:08

ever came up. I mean, there have been

38:10

a couple of things here and there,

38:12

like Google News and news publishers in certain

38:14

countries and stuff But like for the most

38:16

part that the basic idea

38:18

of robots.txt was really just

38:21

please. Like, that's it. It was like, please

38:24

do this, or don't do this. And

38:26

even then like it was often

38:28

used in ways that harmed the

38:32

actual customers using things, or

38:34

did things that were unexpected. This is

38:36

why I don't use it for Overcast

38:38

feed crawlers: because if you

38:40

publish an RSS feed and submit

38:42

it to Apple Podcasts, I'm

38:44

pretty sure you intend for that to be a public

38:46

feed. And so I

38:48

feel like it is not really my place

38:51

to then, you know, put up an alert

38:53

to my user to say, hey, this person's

38:55

robots.txt file actually says, you know, it's disallow

38:57

star on this one path that

38:59

this feed is in, and so I actually

39:01

can't do this for you. Like, that

39:03

would feel like I would have

39:05

first of all, no incentive to

39:07

do that, and second of all, because of its,

39:11

because of its intention and context as a standard

39:13

for search engines, which I'm not, this

39:15

doesn't really apply to me and my use and

39:18

and there were all sorts of things over the

39:20

years too. Like, you know, you could specify

39:22

certain user agents: like, all right, Googlebot, do this;

39:24

Yahoo bot, do this. And even that

39:26

was also problematic over the years too, because it

39:29

disadvantaged certain companies if you just had,

39:31

like, bad behavior once, or if a

39:33

site owner just had, like,

39:36

one bad thought about one of these companies

39:38

once and then, like, never revisited it or

39:40

whatever. Like, then that company was allegedly, like,

39:43

disallowed from crawling this site. Why? Well, I

39:45

mean, it's not even that. It's like, you

39:47

know, for people who don't know the technology

39:49

behind it: you say,

39:51

don't allow Googlebot. But the way you identify

39:54

Googlebot is by the user agent string, which is

39:56

part of the HTTP request, and anybody can write

39:58

anything there and more
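To illustrate just how weak that identification is, here is a minimal Python sketch (the URL is hypothetical, and the real Googlebot sends a longer user-agent string than this) showing that any client can claim to be any bot:

    import urllib.request

    # Nothing stops an arbitrary client from claiming to be Googlebot.
    req = urllib.request.Request(
        "https://example.com/",
        headers={"User-Agent": "Googlebot"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()  # served as if we were Googlebot

Sites that actually care (Google documents this for verifying the real Googlebot) have to check the requesting IP with a reverse DNS lookup rather than trusting the string.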

52:00

and more applications over time of

52:03

technologies like AI summarization and

52:05

action models and things

52:07

like that, where some

52:09

fancy bot basically is going to

52:12

be browsing and operating

52:14

a web page on behalf

52:16

of a user. That is kind

52:18

of like a browser, but it's

52:20

a very different form that I

52:22

think breaks all those assumptions with publishers.

52:25

This is one thing that I faced when I was

52:27

making Instapaper a thousand years ago. Instapaper would save the

52:29

text of a web page to read later and only

52:31

the text, not like all the ads and the images

52:33

and everything like that. I was very careful,

52:35

though, to not make features

52:38

that would enable somebody to

52:40

get the text of a page without having

52:42

first viewed the page in

52:45

a browser or a browser-like context. It would

52:47

load the whole page. They would see the

52:49

page. If there were ads, those

52:51

ads would load on the page. They would see

52:53

those ads. Then they could save what they were

52:55

seeing. Then part of that would be saved

52:57

in Instapaper and shown to them later. That

52:59

was always a very tense

53:03

balance to try to maintain

53:06

because what I didn't

53:08

want was widespread scraping of people's text

53:10

without loading their ads, but

53:13

I figured that seemed like an okay trade-off because that

53:15

was literally just saving what was already

53:17

sent to the browser and what the user was already

53:19

looking at. But a lot

53:21

of these new technologies, first

53:23

of all, I probably wouldn't attempt that today, but a lot

53:25

of these new technologies

53:29

break a lot of those little details.

53:31

If you have some kind of bot

53:33

that's doing something on

53:36

a website, suppose

53:38

it's one of these action models where you're saying, all right, book

53:41

me a flight. This stupid

53:43

book me a trip thing that all of

53:45

these AI demos from these big companies keep

53:47

trying to do even though nobody ever wants

53:49

that. Suppose you have a

53:51

book me a trip kind of thing with an AI model and

53:53

the idea is that model will go behind the scenes

53:56

and will go

53:59

operate Expedia or Orbitz behind the

54:01

scenes for you and manipulate things back

54:03

there to find the best flights and

54:05

hotels and whatever else. Well,

54:07

those sites make some of their

54:09

money via ads and affiliate things

54:11

and sponsor placements on those pages.

54:14

If you have some bot operating the site for

54:16

you, kind of clicking links for you behind the

54:18

scenes in some kind of AI context, that

54:21

bot is not going to see those ads. It's not

54:23

going to click those affiliate links. It's not going to

54:25

pick the sponsored listing. It's going to just

54:27

kind of get the raw data and that's it. And

54:29

that will be violating those sites' business models. And that,

54:31

that really has

54:33

not happened at massive scale until fairly

54:35

recently. So this really has not been

54:37

challenged. This really has not been legally

54:40

tested that much. This really has not

54:42

been worked out. Like what are the

54:44

standards? What are the laws? What are

54:46

the legal precedents? How much of this

54:48

is fair use versus not? You

54:50

know, for the most part until very recently,

54:52

we could pretty much just say, all

54:54

right, if you serve

54:56

something publicly via

54:58

public URLs and anybody can

55:00

just download it, then nothing

55:03

bad would really happen to you and your business

55:05

model for the most part. If

55:07

some bot came by sometimes and parse that page for

55:10

some other purpose, it wasn't a big deal. But

55:13

now there's a pretty

55:15

significant difference in scale

55:17

and type of replacement. Now

55:20

with a lot of these AI products

55:22

and with Google search itself, you know,

55:24

increasing over time and then more recently,

55:26

rapidly increasing, what we're seeing now is

55:28

full out replacement of the need for

55:30

the user to ever look at that

55:32

page. That's a

55:34

pretty big difference. And it's really

55:36

bad for web publishers and kind of,

55:38

you know, then consequently really bad for

55:41

the web in general. We

55:43

have a pretty serious set of challenges on

55:45

the web already, even before this new wave

55:47

of LLMs came by to

55:50

further destroy the web. We

55:53

already had a pretty bad situation for

55:55

web publishers for lots of other reasons

55:57

over the years. To have

55:59

something that removes

56:01

the need for many people to

56:03

visit a page at all, that is

56:05

going to crush publishers. And so

56:07

it does make sense why everyone's freaking out about this.

56:09

It makes a lot of sense. I

56:12

do caution people though,

56:14

I don't think it's a

56:16

very good business move or a very good technology

56:18

move to say, I'm going to just block AI

56:21

from being able to do any, to see any

56:23

of my stuff. Because

56:25

that's a pretty big hammer and that's

56:27

a pretty big blanket statement. And

56:30

you can't actually block them anyway. Like, when

56:32

it comes down to it, technically speaking, you can't, you

56:34

literally can't stop them, right? Unless

56:36

you stop everyone from viewing your website, in which case you

56:38

don't have a website. Right.

56:40

So I think it is wise to

56:43

focus on trying

56:45

to prevent uses of your content that

56:47

remove the need to visit your page.

56:50

Because that is a direct attack on your business model. That

56:52

makes a lot of sense. I don't

56:54

think it's wise to say, I don't

56:56

want any AI training or any AI

56:59

visibility of my page. That

57:01

I think is probably

57:03

short sighted and probably a bit too

57:06

much of a blanket statement. And that I

57:09

don't think it's good for any

57:11

party involved to

57:13

have that kind of blanket ban on it. I know a

57:15

lot of people want, though. What people, the publishers

57:18

in particular, want is

57:21

they want an ecosystem

57:24

of members who do agree to

57:26

some rules of politeness and

57:28

say, look, we should agree on a system that

57:30

lets me tell you that you shouldn't do

57:33

X, Y and Z on my site and you should agree to it

57:35

and we'll feel better about you if you do that. And part

57:38

of the reason I think Instapaper, your

57:40

example was not a particularly big

57:42

problem is like you said, scale. And anything

57:44

with AI in the name these days, people

57:46

flip out about it and think this is

57:48

going to be as big as Google. Instapaper was

57:50

not as big as Google. No,

57:53

it did not have billions and billions and billions

57:55

of users. If it did, if Instapaper

57:57

had Google scale, I bet there would have been a

57:59

hell of a lot more scrutiny on even the very

58:01

conservative things that you did. But because it was small,

58:03

it's not a big deal. Like that's, that's part of

58:05

the sort of the ecosystem of the web is there's

58:08

all sorts of small things that don't have

58:10

particular big scale to do and all sorts

58:12

of weird stuff. Nobody cares about them. We

58:14

allow them to exist. It's fine. But now

58:16

these big names and AI, AI is the

58:18

next big thing. You're an AI company, you

58:20

have a lot of funding. Everyone looks at

58:22

them and think that could be the next

58:24

Google. That could be the next thing with

58:26

billions and billions of users. So we better

58:28

take whatever weird stuff they're doing way more

58:30

seriously than we would take overcast. And with

58:32

even with Google, the, you know, the current

58:35

giant in the world of search and they're, you

58:37

know, trying to replace sites and giving answers on

58:39

the side or whatever. Nilay Patel coined a term

58:41

about this, I think, called Google Zero,

58:43

which is the point at which publisher

58:46

websites get zero traffic from Google search,

58:49

right? Because it's been going down and down over the

58:51

years, because, hey, you type a Google search and look,

58:53

the answer to my question that I typed is right

58:55

on the Google results page. It's unattributed. And even

58:57

if it was attributed, I don't have to

58:59

click on any link to get to it because the

59:01

answer is right there. And so Google has been sending

59:03

less and less traffic to websites and Google zero is

59:05

when you notice, hey, you know what, you know how

59:07

much traffic we're getting from Google searches? Zero.

59:09

I don't know if it's absolutely zero for everybody, but

59:11

it's sure going down. And it's a scary

59:14

world to have

59:16

what was once the massively largest

59:18

source of your traffic to your website disappear.

59:23

But yeah, like whether or not it is wise

59:25

to exclude, to try to, to ask

59:28

to be excluded from, pick, you

59:30

know, whatever AI crawler thing,

59:32

from OpenAI or Perplexity or whatever.

59:35

I think most publishers just simply want that

59:38

choice. And to have that choice, the,

59:40

the crawlers need to agree. Because again, there is no

59:42

technical way to stop this short of doing like putting

59:44

your entire site behind a paywall. And even that's not

59:46

going to stop them, 'cause they'll just pay another crawler to

59:48

go through it. Like, that's the thing about

59:50

publishing on the web. You do, it's

59:52

like DRM. You want people to

59:54

see your movie. You can't make it impossible to

59:56

see your movie. You have to give the viewer

59:58

an ability to see. your movie. But once you

1:00:00

give the viewer the ability to see your movie,

1:00:02

they can see your movie. Like, what

1:00:05

if they see it, but also record

1:00:07

it? I want them to see it,

1:00:09

but not be able to see it. Can I do that? And

1:00:11

the answer is no. Right? So if you're

1:00:13

publishing on the web, you have

1:00:15

like, it's like anything else. That's why Marco was right to

1:00:17

call this a legal thing. Like, things are published

1:00:19

all the time. They were published in paper, you know, like

1:00:21

the books or whatever. It's like, but I can take the

1:00:24

book and look at it. I can see all the letters

1:00:26

in it. Haha. The book is mine. Well, no, actually we

1:00:28

have laws about the stuff

1:00:30

that's in that book. We

1:00:32

have this thing called copyright. And even

1:00:34

though you can technically read it and

1:00:36

you can technically copy it increasingly more

1:00:39

easily over time with technology, we have laws surrounding it

1:00:41

to control what you can do with it. And

1:00:43

ROASD text, people who think of ROASD text as some

1:00:45

kind of like technological bank vault. It's no more of

1:00:47

a bank vault than you could put on a book.

1:00:49

Like you do want people to read it and you

1:00:51

can't stop them from being able to copy it. And

1:00:54

these days it's really, really easy to copy a book, especially

1:00:56

if it's an ebook, right? Setting aside

1:00:58

the whole DRM thing. What

1:01:00

you want is either, in

1:01:03

a sort of polite society, an agreement

1:01:05

among the large parties that actually are

1:01:07

significant to get along and then

1:01:10

failing that you want laws to provide whatever protections

1:01:12

you think are due to you. And yeah,

1:01:14

the Google search stuff has, I feel like, been

1:01:16

hashed out, probably in the AltaVista days, but who

1:01:19

knows? And the AI stuff has not

1:01:21

yet been hashed out. And so to move on to this next

1:01:23

one, because we have a lot of these items, Microsoft,

1:01:25

at least someone in Microsoft has

1:01:27

a very interesting notion of

1:01:30

what the deal is on the

1:01:32

web and potentially what the law should be surrounding

1:01:34

it. So this is

1:01:36

a post on the verge by

1:01:38

Sean Hollister, who writes, Microsoft AI

1:01:41

boss Mustafa Suleiman incorrectly

1:01:43

believes that the moment you publish anything on

1:01:45

the open web, it becomes quote unquote freeware

1:01:47

that anyone can freely copy and use. When

1:01:49

CNBC's Andrew Ross Sorkin asked him whether AI

1:01:52

companies have effectively stolen the world's

1:01:54

IP, Mustafa said, I

1:01:57

think that with respect to content that's already

1:01:59

on the open web, the social contract of

1:02:01

that content since the nineties has been that

1:02:03

it is fair use. Anyone can copy it,

1:02:05

recreate it with, recreate it, reproduce with, sorry:

1:02:07

recreate with it, reproduce with it. That

1:02:10

has been freeware if you like, and

1:02:12

that's been the understanding. Microsoft

1:02:14

is currently the target of multiple lawsuits alleging

1:02:16

that it and open AI are stealing copyrighted

1:02:18

online stories to train generative AI models. So

1:02:20

it may not surprise you to hear a Microsoft

1:02:23

exec defend it as perfectly legal. I

1:02:25

just didn't expect them to be so very publicly and

1:02:27

obviously wrong. And I'm not a lawyer writes Sean, and

1:02:29

that's also true for me, but I

1:02:31

can tell you that the moment you create a work,

1:02:34

it is automatically protected by copyright in the U.S.;

1:02:36

you don't even need to apply for it. And you

1:02:38

certainly don't void your rights just by publishing it on

1:02:40

the web. In fact, it's so difficult to

1:02:42

waive your rights that lawyers had to come up with

1:02:44

special web licenses to help. This

1:02:46

is so gross. Like I'm

1:02:49

not as riled up as a lot of

1:02:51

people about, you know, these AI

1:02:54

bots crawling my website, like sitting

1:02:56

here now, I don't find it that off

1:02:58

putting. I don't love it, but whatever this

1:03:01

though, this is disgusting. So this is such

1:03:03

a weird statement because everybody knows how copyright

1:03:05

works. I'm sure this person knows as well.

1:03:07

But to say that like, Oh, it's once

1:03:09

you put it on the web, it's freeware,

1:03:11

which is a term that mostly applies to

1:03:13

software. But like, the idea is you can

1:03:16

recreate it, reproduce it, uh, you know, copy

1:03:18

it. Like, no, no, no.

1:03:20

Like there, those are specifically

1:03:22

the things we actually do have laws around. What we

1:03:24

don't have laws around are the more complicated things like,

1:03:26

well, can I train AI on it or whatever? And

1:03:28

we'll get to that in a little bit. But like,

1:03:30

it's such a weird thing to say that like, Oh,

1:03:32

as everyone knows since the nineties, once you put it

1:03:35

on the web, you forfeit all ownership. That's

1:03:37

not true at all. And I think that's, that's, like,

1:03:39

it's one of the things that's great about the web

1:03:41

is, Oh, it's just like books. It's printed word, right?

1:03:43

And especially in the beginning, it was just a bunch

1:03:45

of words and we already have laws surrounding that, right?

1:03:47

And that's why there were cases about search engines. Like

1:03:50

are search engines copying it? Because, you know,

1:03:52

we got this whole, you know, giant library

1:03:54

of laws about copying texts. My website has

1:03:56

text on it and Google's copying it and

1:03:59

they've had to duke it out and say,

1:04:01

actually, what Google's doing is fine within these

1:04:03

parameters, blah, blah, blah. But

1:04:06

that fight was fought because it was an

1:04:08

example of copying. But yeah, this... I

1:04:12

mean, obviously, the Microsoft AI

1:04:14

leadership, this guy is not a

1:04:16

lawyer either. But

1:04:19

that's not how you should defend this. You shouldn't

1:04:21

defend it by saying, you know, the entire web

1:04:23

is a free-for-all. Because that's never

1:04:25

the way it's been and it's not the way it is now. There's

1:04:29

another foot-in-the-mouth problem from Microsoft. I'm not

1:04:31

sure what's going on over there, but they really need

1:04:33

to take a lesson from

1:04:35

Apple and maybe try to speak with one

1:04:37

voice instead of having individual lieutenants make really

1:04:39

terrible statements to the press. Yeah.

1:04:42

So, Louis Mantia writes with regard to permissions on

1:04:44

AI training data from the 22nd of June. Louis

1:04:48

writes, quoting John Gruber from that same day.

1:04:52

It's fair for public data to be excluded

1:04:54

on an opt-out basis rather than included on

1:04:56

an opt-in one. And then Louis

1:04:58

continues, no, no, it's not. This

1:05:00

is a critical thing about ownership and copyright in

1:05:03

the world. We own what we

1:05:05

make the moment we make it. Publishing text or

1:05:07

images on the web does not make it fair

1:05:09

game to train AI on. The 'public' in the

1:05:11

'public web' means free to access. It does not mean

1:05:13

free to use. Also, whether reposting

1:05:16

my content elsewhere is in good faith or not,

1:05:18

it is now up to someone other than me

1:05:20

to declare whether or not to disallow AI training

1:05:22

web crawlers in the robots.txt file

1:05:25

to allow, excuse me, to disallow. To add insult to injury,

1:05:27

that person may not have the knowledge or even

1:05:29

the power to do so if they're posting content

1:05:32

they don't own on a site that they also

1:05:34

don't own like social media. So this

1:05:36

is, he's so close to getting to the crux

1:05:38

of this. In the first

1:05:40

little paragraph here, he's basically declaring

1:05:43

that training AI on your data

1:05:45

is exactly the same

1:05:47

as copying and reproducing it. And that is

1:05:49

not something that the world agrees on. His

1:05:52

opinion is that it is. The courts have

1:05:55

not yet weighed in. I think

1:05:57

to the average person they would say, are those

1:05:59

the same things? They seem like they might be a

1:06:01

little bit different. Kind of in the same way

1:06:03

that indexing your content in Google is a little

1:06:06

bit different than just literally copying it and reposting

1:06:08

it on the website, right? But

1:06:10

anyway, if you agree that it's the same as copying,

1:06:12

then yeah, sure. But then the second bit is getting

1:06:14

to even more of the heart of it here, which

1:06:16

is like, okay, so let's say we do agree that

1:06:19

it's the same, which, you know, is not

1:06:21

proven yet, but anyway. What

1:06:24

about when somebody like posts a link to

1:06:26

your site on a social media network, and

1:06:28

on that website, they do a little embedding,

1:06:31

inlining of like the first paragraph or whatever,

1:06:33

like what if someone copies and pastes a

1:06:35

paragraph of your thing on another

1:06:37

website, right? Even if you had absolute,

1:06:39

somehow magical technical control to stop

1:06:42

AI crawlers crawling your website, if

1:06:45

people can read your website and quote

1:06:47

from it or embed little portions of it or

1:06:50

screenshot or do whatever on other websites, of course,

1:06:52

you don't control those other websites. And so if

1:06:54

they allow crawling, your stuff's going to end up

1:06:57

in the Google search index in the

1:06:59

AI training model or whatever, even

1:07:01

though you disallowed it from your

1:07:03

website. And I would say

1:07:05

that for the most part that we also

1:07:08

have laws covering can someone take a portion

1:07:10

of the thing that you made and

1:07:12

quote it elsewhere. There's a whole legal framework

1:07:14

deciding whether that is fair use or

1:07:16

not, and it's complicated. And the

1:07:19

law is not a deterministic machine, as the other

1:07:21

Patel, who I mentioned before, is always fond of

1:07:23

saying, but we do have a legal framework to

1:07:25

determine, can I copy and paste this

1:07:27

paragraph from this thing on this person's site and quote it

1:07:29

on my site so I can comment on it? Yeah,

1:07:32

in general, you can. Can I make a

1:07:34

parody of this article on my

1:07:36

website? Yeah, you can. There's a whole bunch

1:07:38

of things around that that have been fought

1:07:40

out in court that we have a system

1:07:42

for dealing with. But all of

1:07:45

those things, say the court determines, you sue them

1:07:47

and they say, actually, this person was allowed to

1:07:49

quote that snippet, right? You lost your fair use

1:07:51

case because it's pretty open and shut. That's fine.

1:07:54

That just got indexed by an AI

1:07:56

training bot because that person's website allows

1:07:58

them, you know, polite AI bots or

1:08:01

whatever, nevermind again, nevermind that you can't stop them.
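
(A quick note on the mechanics being described here: robots.txt is just a plain text file at the root of a site, and honoring it is entirely voluntary on the crawler's part. Below is a minimal Python sketch of how a polite crawler checks it, using the standard library's urllib.robotparser. GPTBot is OpenAI's real crawler name, but the file contents, the URL, and the "SomeNewBot" user-agent are made-up examples.)

import urllib.robotparser

# A hypothetical robots.txt: the site owner has to opt out of each AI
# crawler by name; anything not named falls through to the * rule.
rules = """
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# A polite crawler asks before fetching; an impolite one simply doesn't.
print(parser.can_fetch("GPTBot", "https://example.com/post"))      # False
print(parser.can_fetch("SomeNewBot", "https://example.com/post"))  # True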

1:08:03

Anyway, right? That's

1:08:05

just the nature of publishing.

1:08:08

No matter what, you do not

1:08:11

have absolute control over every single character

1:08:13

that you made. You

1:08:15

do have control over the entire work and the

1:08:17

reproduction of the entire work, but you don't have

1:08:19

control over other examples of fair use. And

1:08:22

Louis is saying, oh, it shouldn't be like, I shouldn't have

1:08:24

to opt out. The default should be that nobody can

1:08:26

crawl me. I mean, that's just like,

1:08:30

not only is it technically impossible, but

1:08:32

like, that's not the

1:08:34

way the web has ever worked. It has

1:08:36

always been, we're going to crawl you unless

1:08:38

you tell us don't. And

1:08:41

even the polite ones, you know, they'll

1:08:43

read the thing that you said not to do it, but

1:08:45

by default, they're going to crawl you. And I think asking

1:08:47

for a world where everything you

1:08:49

publish on your website is not only

1:08:51

not crawlable by the things

1:08:53

you don't want crawling it, but also not able

1:08:55

to be quoted by other people is clawing

1:08:59

back rights that we've

1:09:01

already decided belong to other people through fair

1:09:03

use. So then the music

1:09:05

industry decided to get involved. Yeah. Multibillion

1:09:08

dollar companies have entered the chat, as they would say.

1:09:11

We talked about this before, of like, hey, Louis Mantia doesn't

1:09:13

want people to crawl his website. What

1:09:15

can he do about it? He's just one person. Uh, the

1:09:18

music industry, they have a lot of money. They

1:09:20

have a lot of IP. This is

1:09:22

where the stuff really starts

1:09:25

going down. Yeah. So reading

1:09:27

from Ars Technica on the 24th of June: Universal

1:09:29

Music Group, Sony Music, and Warner Records

1:09:31

have sued AI music synthesis companies Udio

1:09:34

and Suno for allegedly committing mass copyright

1:09:36

infringement by using recordings owned by the

1:09:38

labels to train music generating AI models.

1:09:40

The lawsuits filed in federal courts in New

1:09:43

York and Massachusetts claim that the AI companies

1:09:45

use of copyrighted material to train their systems

1:09:47

could lead to AI generated music that directly

1:09:49

competes with and potentially devalues the work of

1:09:51

human artists. So from

1:09:54

the Verge article, there's a quote from

1:09:58

RIAA chief legal officer Ken Doroshow.

1:09:58

And that quote is: these are straightforward cases

1:10:01

of copyright infringement involving unlicensed copying of

1:10:03

sound recordings on a massive scale. Suno

1:10:05

and Udio are attempting to hide the

1:10:08

full scope of their infringement rather than

1:10:10

putting their services on a sound and

1:10:12

awful, lawful, excuse me, footing. And again,

1:10:15

that was the RIAA chief legal officer.

1:10:18

Mikey Shulman, the CEO of Suno, says

1:10:21

the company's technology is transformative

1:10:23

and designed to generate completely

1:10:25

new outputs, not to memorize

1:10:27

and regurgitate preexisting content. Shulman

1:10:30

says Suno doesn't allow user prompts

1:10:32

based on specific artists. Reading

1:10:35

from the lawsuit, the use here is far

1:10:37

from transformative as there is no functional purpose

1:10:39

for Suno's AI model to ingest copyrighted recordings

1:10:42

other than to spit out new, competing music

1:10:44

files. That Suno is copying

1:10:46

the copyrighted recordings for commercial purpose

1:10:48

and is deriving revenue directly proportional

1:10:50

to the number of music files

1:10:52

it generates further tilts the fair

1:10:55

use factor against it. Andy

1:10:58

Baio writes, 404 Media pulled together a video

1:11:00

montage of some of the AI-generated examples provided

1:11:02

in the two lawsuits that sound similar to

1:11:04

famous songs and their recording artists. Then

1:11:08

finally, we'll put a link in the show

1:11:10

notes to a Verge article that discusses what

1:11:12

the RIAA lawsuits mean for AI and copyright.

1:11:14

You know, I saw somebody say this a

1:11:16

few days ago, I don't remember who exactly

1:11:18

it was, but what's

1:11:20

going on if the RIAA are suddenly the good guys?

1:11:22

This is a weird place to be. Well, are they

1:11:25

though? So here's the thing. Like, this is the

1:11:27

tricky bit with this and we talked about this with

1:11:29

the image generators or whatever. So this is significant because

1:11:31

they're big rich companies and you have to take them

1:11:33

seriously when they bring a lawsuit because this is the

1:11:35

kind of like who can stop open AI and Google

1:11:38

and whatever. Well, you know, it's clash

1:11:40

of titans. You need other titans in here to be

1:11:42

duking it out, right? I

1:11:47

think this needs to

1:11:49

be fought out in a court in some way. I

1:11:51

say that before we see what the result will be

1:11:53

because maybe the result is not what we want

1:11:55

to happen. But like as

1:11:58

with the image things, These companies that you

1:12:00

type in a string and they produce a song for

1:12:03

you, right? These models

1:12:05

are trained on stuff. And

1:12:07

these record labels say, yeah, you trained them on

1:12:09

all our music, right? Gets back to

1:12:12

the question: what is training? Is

1:12:14

AI training? How does that relate to copying? Is

1:12:16

it just like copying? Is it not like copying

1:12:18

at all? Is it somewhere in the middle? Do

1:12:20

any of our existing laws apply to it? And

1:12:22

we've discussed this on past episodes as

1:12:24

well, especially when

1:12:27

the company doing the training, then

1:12:29

has a product that they make money on.

1:12:32

And as I said with the image training, these

1:12:35

models that make songs are worthless without

1:12:37

data to train them on. The

1:12:39

model is nothing without the training

1:12:41

data. This company that wants to make money,

1:12:43

you pay us x dollars, you can make

1:12:45

y songs, right? That's their business model. They

1:12:47

can make zero songs if they have not

1:12:49

trained their model on songs. So

1:12:51

the question is, where do those songs

1:12:54

come from? If they've licensed them

1:12:56

from somebody, if they made the songs themselves,

1:12:59

no problem, right? Again, Adobe training

1:13:01

their image generation models entirely on

1:13:03

content they either own or licensed.

1:13:06

Nobody's angry about that. That's the thing

1:13:09

you're doing: you own a bunch of images, you license

1:13:11

them from a stock photo company or whatever, you

1:13:14

train your models on them, you put the feature

1:13:16

into Photoshop, you charge people money for Photoshop, they

1:13:18

click a button, it generates an image. Whether people

1:13:20

like that feature or whatever, legality seems

1:13:22

fine. These other situations

1:13:25

where it's like, hey, we crawled your site because

1:13:27

we don't care about your robots.txt. We

1:13:29

trained our models on your data, on

1:13:31

your songs, on your whatever, right? And by the way,

1:13:33

we have no idea if these companies

1:13:35

actually paid for all the songs. Let's just

1:13:37

assume they did. They bought all the songs from,

1:13:39

you know, Sony Music

1:13:41

Warner Records or whatever, or they paid for a training

1:13:43

service. They got all the songs, they trained their model

1:13:45

on them, they're charging people to use their model, right?

1:13:49

Just like the image processing, I've always thought that if

1:13:53

you have a business that

1:13:55

would not be able to exist without content

1:13:57

from somebody that you did not pay

1:14:00

anything for, that is very

1:14:02

different than: are we training an

1:14:05

AI model for research purposes, are we training

1:14:07

it for, you know, for some purpose that

1:14:09

is not like literally making money off of

1:14:11

you. And this particular case is like, okay,

1:14:13

not just that they're making money, but the

1:14:15

thing they're providing is, quote, not transformative. They

1:14:17

keep using that word because that's one of

1:14:19

the tests for like fair use. Is the

1:14:21

work transformative? Have they taken the

1:14:23

thing that existed but made something new out

1:14:25

of it? And that's the argument in court,

1:14:27

whether it is not transformative.

1:14:29

And also, is it a substitute? There's

1:14:32

another one of the fair use tests. Is it

1:14:34

a substitute for the product? Is someone not going

1:14:36

to buy a Drake

1:14:38

album because fake Drake sounds

1:14:40

just as good and they just listened

1:14:42

to fake Drake, right? Is it a substitute

1:14:44

for it? It doesn't mean does it sound exactly like it? That's

1:14:46

a whole other sad area of

1:14:49

law of like, does song A sound too much

1:14:51

like song B and they have to pay them

1:14:53

whatever when they're all made by humans, right? This

1:14:55

is like, would someone pay for

1:14:57

this instead of paying for this?

1:14:59

Is one a substitute for the

1:15:02

other? And that's what they'll be

1:15:04

duking it out about. But I think

1:15:06

at its root, it is sort of like,

1:15:08

where does the value of this company come from?

1:15:12

Every company has to take input

1:15:14

from somewhere. They manufacture something and they sell it to

1:15:17

you or they have a service, they wrote the software

1:15:19

for it, they pay someone to run the servers and

1:15:21

they sell it. There's a there's sort of a value

1:15:23

chain there. And a lot of these companies are like,

1:15:27

we would make more money if we

1:15:29

don't have to pay for the things that make

1:15:31

our product valuable. So we don't want

1:15:33

to have to license all the music in the world, but

1:15:35

we do want to train an AI model on all the

1:15:37

music on the world so that we can make songs that

1:15:39

sound as good as all the music in the world. But

1:15:42

we don't want to have to pay for any of that. And

1:15:44

that seems to be not

1:15:47

a good idea from my perspective. And this is one

1:15:49

of the different ways you can look at this, all

1:15:52

ethical, legal. I think one

1:15:54

of the frameworks that I've

1:15:56

fallen back on a lot is practical.

1:16:00

If, you know, for any given thing, say, if

1:16:02

we allowed this to happen, would

1:16:04

it produce a viable,

1:16:07

sustainable ecosystem? Like,

1:16:11

would it produce a market for

1:16:13

products and services? Would it

1:16:15

be a rising tide that lifts all boats? Or

1:16:17

would it like, burn the forest to the

1:16:19

ground and leave one tree left in the middle? Right? You

1:16:21

know what I mean? Like, that

1:16:23

practical approach, people like to jump on, like we talked

1:16:25

about before with the Viticci and MacStories stuff and everything,

1:16:27

like they want to go to the moral and ethical

1:16:29

thing. They're stealing from us. It's our stuff. They have

1:16:31

no right. And even when I was saying before, like,

1:16:33

oh, they're, they don't want to pay for this stuff,

1:16:35

but they want to make money off of it or

1:16:37

whatever. But practically, and this is not the way the

1:16:39

law works, but this is the way I think about

1:16:41

it. Practically speaking, I'm always asking myself, if

1:16:44

this is allowed to fly, what

1:16:46

does this look like? Fast forward this. What,

1:16:48

you know, is this viable? Right? What

1:16:51

if, if everyone's listening to fake Drake,

1:16:54

is Drake, or the next Drake, not able to

1:16:56

make any money? Does human

1:16:59

beings making music become an unviable business? And

1:17:01

all this is just an increasingly gray soup

1:17:03

of AI generated stuff that loops in on

1:17:05

itself over and over again. Right? Like

1:17:07

where are the, you know, and we have the same

1:17:10

thing with publishing on the web. Like, does Google destroy

1:17:12

the entire web because no one needs to go to

1:17:14

websites anymore? They just go to Google. Right. Unfortunately,

1:17:18

when these cases go, you know,

1:17:20

to court, no one is thinking that. That's not how

1:17:22

the law works. The law is going to be, is

1:17:24

this fair use, whatever? Just does Congress pass new laws

1:17:26

related to this or whatever. But what I really hope

1:17:28

is that the outcome of all these things and the

1:17:30

thing I'm always rooting for is can we get to

1:17:33

a point where we have an

1:17:35

ecosystem that is sustainable, which

1:17:37

means it's probably, you know, whatever they're suing for is like, I think they

1:17:40

want like $150,000 for every song or something. That is not a sustainable solution.

1:17:45

You can't train an AI model when you pay $150,000 for each song that you

1:17:47

trained it on because you need basically

1:17:49

all the songs in the world. That's a

1:17:51

big number. That's stupid. We do

1:17:53

want AIs that can make little songs, right?

1:17:55

I think that is a useful thing to have, right?

1:17:58

So we need to find a way where we can... have that, but

1:18:01

also still have music

1:18:04

artists who can make money making actual music, setting aside

1:18:06

the fact that the labels take all the money and

1:18:08

the artists rarely get anything anyway, which is a separate

1:18:10

issue. Right. And there was a good article

1:18:12

about that recently, about the

1:18:15

labels, Spotify, and the artists, and the terrible relationship

1:18:17

there that screws over artists. Anyway,

1:18:22

I think I really hope that the outcome

1:18:24

of this is some kind of situation where,

1:18:26

where there's

1:18:30

something sustainable. There's like, I keep using ecosystem, but

1:18:32

it's like, you know, you have to have enough

1:18:34

water, the whole water cycle. This animal

1:18:36

eats that animal. It dies. It

1:18:39

fertilizes the plant. Like the whole, you know, a sustainable

1:18:41

ecosystem where everything works and it goes all around in

1:18:43

a circle and everything is healthy. And there's growth,

1:18:45

but not too much and not too cancerous. And it's

1:18:48

not like everything is replaced by a monoculture and

1:18:50

only one company is left standing and all that good

1:18:52

stuff. Right. But right now

1:18:54

the technology is advancing in a way that if

1:18:57

it's not, if we don't do

1:18:59

something about it, uh,

1:19:01

the individual parties involved are not

1:19:04

motivated to make a sustainable ecosystem.

1:19:06

Let's say, I mean, that's kind of what the DMA is about

1:19:09

in the EU and these AI companies definitely

1:19:11

are not motivated to try to make sure they have

1:19:13

a sustainable ecosystem. They just want to make money. And

1:19:16

if they can do it by taking the world's music and

1:19:18

selling the ability for you to make songs that sound like it

1:19:20

without paying anything for the music that they ingested, they're going to

1:19:22

try to do that. Yeah.

1:19:27

It's, I don't know. It's all just so weird

1:19:29

and gross. And it's, it's hard because I don't

1:19:32

want to be old man who shakes fist at clouds.

1:19:35

Right. And it

1:19:37

seems like AI for

1:19:39

all the good and bad associated with it

1:19:41

is a thing. It's certainly a flash in

1:19:43

the pan for right now, but I

1:19:46

get the feeling that where blockchain

1:19:50

and Bitcoin and all that sort of

1:19:52

stuff was very

1:19:55

trendy, but anyone with a couple of brain

1:19:57

cells to rub together would say, eh, that's

1:19:59

a fad, or it's certainly not going to

1:20:01

work the way it is today. I

1:20:03

think there's a little of that here, but I

1:20:05

get the feeling that this is going to stick

1:20:08

around for a lot longer. And I think

1:20:11

that there needs to be some wrangling

1:20:13

done, some legal wrangling. And I

1:20:16

get the move fast and break things mentality of

1:20:18

these startups that are doing all this, but I

1:20:21

don't know. It just feels kind

1:20:23

of wrong. Like again, I'm not nearly

1:20:26

as bothered by it as some of our peers

1:20:28

are, but it just doesn't feel right. And

1:20:31

it definitely doesn't feel sustainable. Like practically, like if,

1:20:33

if, like regardless of how you feel about right

1:20:35

or wrong, if you say, if we just let

1:20:38

them do this, like, and

1:20:41

these, you know, these models get better and better and

1:20:43

produce more and more acceptable content, you can

1:20:45

see that it's taking, again, regardless of

1:20:47

how this lawsuit ends up with the whole

1:20:49

record labels, you can see that it is

1:20:51

taking value away from human beings making

1:20:54

music and pushing that value to models

1:20:56

making music. But those models are absolutely worthless

1:20:58

without that human generated music, at least initially,

1:21:00

right? Again, maybe in the future, there will

1:21:02

be models trained entirely on model generated music,

1:21:04

but then you have to trace it back

1:21:06

to where that model got trained. Like in

1:21:08

the end, these models are trained

1:21:11

on human created stuff and

1:21:13

there's, there may not be

1:21:15

enough officially licensed human created stuff to train them

1:21:17

on at this point. I

1:21:19

think that's, you know, I think we, we

1:21:22

want these tools. They are useful for

1:21:24

doing things. Even if you think, oh, they make

1:21:26

terrible music. Sometimes people need terrible

1:21:28

music, right? Sometimes people just need a little jingle. They

1:21:30

can describe it. They want it to be spit out.

1:21:32

Right. By most people's definitions, all of my music is

1:21:34

terrible music. They

1:21:37

do useful things like, unlike, you know, cryptocurrency,

1:21:39

which does a very, very small number of

1:21:42

useful things and is not general purpose.

1:21:44

The AI models do tons of useful things.

1:21:46

Apple's building a bunch into their operating systems.

1:21:48

You know, people use them all the time.

1:21:50

They do tons of useful things, right? We

1:21:53

should find a way for them to

1:21:55

do those things without

1:21:58

destroying the ecosystem. I think

1:22:00

we can find a way for that

1:22:02

to happen. If you look at the

1:22:04

awful situation with Spotify and record labels

1:22:06

and music artists, that's a pretty bad

1:22:08

version of this. And yet still,

1:22:10

it is better than Spotify saying, we're going to stream

1:22:13

all these songs for free and not

1:22:15

pay anybody. I wish I could find that article for

1:22:17

the notes. I'll try to look it up. But

1:22:19

even that is better than the current situation

1:22:22

with AI, which is like, we're

1:22:24

just going to take it all for free. Come sue us. And

1:22:27

they say, okay, we are suing you. And they'll

1:22:29

battle it out in court. And like, either way

1:22:31

this decision goes with the music, they could go

1:22:33

bad in both directions. Because if they say, oh,

1:22:35

you're totally copying this music, all AI training is

1:22:38

illegal. That's terrible. That's

1:22:40

bad. We don't want that, right? And

1:22:42

if they say, no, it's fine. It's transformative. You can

1:22:44

take anything you want for free. That's also bad. So

1:22:47

both extremes of the potential decision that a

1:22:49

court could make based on this lawsuit are

1:22:51

really bad for all of us for the

1:22:53

future. So that's why I hope we find

1:22:55

some kind of middle ground. Like again, with

1:22:57

Spotify, they came up with a

1:22:59

licensing scheme where they can say, we want to

1:23:01

stream your entire catalog of music. Can

1:23:04

we figure out a way to exchange money

1:23:06

where you will allow that to happen legally?

1:23:09

And they came up with something. It's not a great

1:23:11

system. They came up with, again, if I can

1:23:14

find that article, you can read and see how bad it is. But they

1:23:16

didn't just take it all for free. Right. And

1:23:18

they also didn't, the music labels didn't say, okay, but every time

1:23:20

someone streams one of these songs, it's 150 grand. Right.

1:23:23

That's also not sustainable. So obviously they're

1:23:26

staking out positions in these lawsuits and they're trying to

1:23:28

put these companies out of business with big fees or

1:23:30

whatever. But yeah, it's

1:23:32

like, it's scary. It's scary when Titans clash.

1:23:35

And I do worry about how the results of these cases

1:23:37

are going to be. But I think, I think

1:23:40

we have to have these cases or, and I

1:23:42

know this is ridiculous in our country, or we

1:23:44

have to make new laws to address this specific

1:23:47

case, which is different enough from

1:23:49

all the things that have come before it, that we

1:23:51

should have new laws to address it. It

1:23:53

would be better if those laws weren't created by

1:23:55

court decisions, but our ability

1:23:57

and track record for creating

1:24:00

technology-related laws for new technology

1:24:02

is not great in this country.

1:24:04

So there's that. Yeah.

1:24:06

And then it continues because

1:24:08

Figma, a popular, um, I

1:24:11

don't know how to describe this, like a user interface,

1:24:14

uh, generation tool, um, yeah,

1:24:17

design tool. Thank you. Uh, they pulled

1:24:19

their AI tool after criticism that it blatantly

1:24:21

ripped off Apple, uh, Apple's weather app. So

1:24:23

this is the Verge, by Jay Peters. Figma's

1:24:26

new tool, Make Designs, lets users

1:24:28

quickly mock up apps using generative AI. Now

1:24:30

it's been pulled after the tool drafted designs

1:24:32

that looked strikingly similar to Apple's iOS weather

1:24:35

app. In a Tuesday interview with

1:24:37

Figma CTO Kris Rasmussen, I asked him point

1:24:39

blank if Make Designs was trained on Apple's

1:24:41

app designs. His response: he couldn't say for

1:24:43

sure, as Figma

1:24:46

was not responsible for training the AI models that

1:24:48

it used at all. Who knows who trained it? It's

1:24:50

just our model. We don't know. Do you know

1:24:52

who trained? I don't know. Does anyone know who

1:24:54

trained it? It's just, we just found it on

1:24:56

our doorstep and this is a model. Will the

1:24:58

real trainer please stand up? Uh,

1:25:01

quote, we did no training as part of the

1:25:03

generative AI features, Rasmussen said. The features are, quote,

1:25:05

powered by off the shelf models and a bespoke

1:25:07

design system that we commissioned, which appears to be

1:25:09

the underlying issue. So if you commissioned it, then

1:25:11

you should know. We had someone else do it

1:25:13

and they gave it to us and we just

1:25:15

took it and we're like, we didn't ask him

1:25:17

any questions. It's fine. Whatever, whatever

1:25:19

you got, just give it to us, it's probably

1:25:22

fine. Uh, the key AI models that power

1:25:24

Make Designs are OpenAI's GPT-4o

1:25:26

and Amazon's Titan Image Generator G1. According

1:25:28

to Rasmussen, if it's true that Figma didn't

1:25:30

train its AI tools, but they're spitting out

1:25:32

Apple app lookalikes anyway, that could suggest that

1:25:34

OpenAI or Amazon's models were trained on

1:25:37

Apple's designs. OpenAI and Amazon

1:25:39

didn't immediately reply to a request for

1:25:41

comment. This is seriously like the

1:25:44

Spider-Man pointing at Spider-Man

1:25:47

image. It's just, it's not my fault. It's their fault.

1:25:49

Well, it's not my fault. It's their fault. Oh, no,

1:25:52

no, no, no, no. It's not my fault. It's their

1:25:54

fault. It was that company. I think it was open

1:25:56

AI or whatever, the Sora model, the thing that

1:25:58

makes movies essentially. Someone who was

1:26:00

responsible for that was asked in an interview,

1:26:02

was your model trained on YouTube? They

1:26:05

didn't give an answer. Like, maybe, I don't

1:26:08

know. Listen, if you run an AI company,

1:26:11

figure out how and where your

1:26:13

models were trained. I don't know, maybe

1:26:15

you trained them on good things, bad things, whatever,

1:26:18

but have an answer. Don't say, we don't know.

1:26:21

Someone else did it. We like, this

1:26:23

seems like table stakes. You

1:26:25

should know where and on what your

1:26:27

model was trained on. Not like granular, like every single individual

1:26:30

thing, although ideally that would be great, but there's too much,

1:26:32

I get it, right? But when someone says, hey, did you

1:26:34

train on YouTube? You should be able to answer that with

1:26:36

a yes or no. Right, not

1:26:38

weasel about it. And this one, was this trained on Apple's

1:26:40

apps? I mean, anyone looking at it

1:26:43

is gonna be like, well, if it wasn't, this is the

1:26:45

world's biggest coincidence because it looks just like Apple's app. As

1:26:48

Gruber pointed out, right down to the really weird

1:26:50

like line chart that I never really understood until

1:26:52

I saw it explained in Apple's weather app, right?

1:26:54

It was obviously trained on Apple's stuff, but

1:26:57

you have to have an answer, right? If you don't

1:26:59

have an answer, say, I don't know, but I'll find

1:27:01

out for you and then come back. But

1:27:03

like the bar is real

1:27:05

low here. Anyway, same

1:27:08

situation, different thing. Images, songs,

1:27:10

text, UIs, a

1:27:15

mock-up tool that makes UIs, it's

1:27:17

based on a model. That model

1:27:19

is worthless without being trained on

1:27:21

a bunch of UIs. Where are you gonna get enough UIs to

1:27:23

train it? From the world of

1:27:26

UIs that we take essentially without permission? Is

1:27:28

that okay? If we sell that as part

1:27:30

of our application, is that okay? I

1:27:33

mean, I wrote a

1:27:35

big post about this, what, in January? Excuse

1:27:37

me: "I Made This." Yeah, and

1:27:39

we talked about it on the podcast before

1:27:41

and I took a while to write

1:27:43

this because, actually, speaking of Nilay Patel, I was listening

1:27:46

to the Decoder podcast and there was an episode where

1:27:48

he was debating with somebody about the New York Times

1:27:50

lawsuit at the time, like New York Times was suing

1:27:52

some company that had trained its AI on the New

1:27:54

York Times and they said, you

1:27:56

can't do that. And going back and forth about like, well, the

1:27:58

model is just doing what a person would do, it's learning and

1:28:00

blah, blah, blah. Is the person the same

1:28:03

as the model? Does the model have the same

1:28:05

rights as a person? And I was trying to

1:28:07

write up something related to that. And as usual,

1:28:09

writing helps me clarify my thinking, but it is

1:28:11

a fairly complicated circuitous route to sort of really

1:28:13

dig down into that thought to

1:28:16

get to what's at the heart of it. And

1:28:19

I wrote this thing and I think I did get to the heart

1:28:21

of it as far as I was concerned, but it's

1:28:24

complicated. So every time I try to like summarize it

1:28:26

on the podcast, I find myself like tongue tied and

1:28:28

I'd rather, you know, just quote from the paragraphs. Like

1:28:30

I think if you read the post, my thoughts are

1:28:32

in there, but a lot of people have read it

1:28:34

and like no one has commented on it. So maybe

1:28:37

I'm doing a poor job communicating it, but I

1:28:40

was coming at it from the other angle. We talked all about

1:28:42

training data in this section of the show here. I was coming

1:28:44

at it from the angle of like what

1:28:47

was then one of the hot

1:28:49

topics, which is say I use one of these

1:28:51

tools. Say I use the Figma tool to generate

1:28:53

a UI. I use the song tool to generate

1:28:55

a song or whatever. That

1:29:00

thing that I made, what

1:29:02

is the legal, ethical,

1:29:04

moral, practical ownership

1:29:07

deal with that? If

1:29:09

I use Figma to make that

1:29:11

auto create UI thing and it makes me

1:29:14

a UI and I put

1:29:16

that in my app, do I

1:29:18

own that UI? If I make a song with

1:29:20

the song making tools, do I have the copyright

1:29:23

on that song? There's been legal cases about this. And I think

1:29:25

the only ruling we have now is something like if you make

1:29:27

it with any AI generator tool, you don't have the copyright on

1:29:29

it or whatever. But

1:29:31

the reason I got to that is because I was dealing with the whole

1:29:33

like, oh, training is just like what

1:29:35

a human would do. They read all these articles in

1:29:38

the New York Times, then you ask the human the

1:29:40

answer, and they read all those articles, and they have

1:29:42

the knowledge from reading those articles, and they give you

1:29:44

an answer. Well, that's just what our AIs are doing.

1:29:46

I'm like, yeah, but a human is a human, and

1:29:48

AI is an AI, and is that really what the

1:29:50

root of the thing is? And I kept chasing

1:29:52

that down, chasing that thought down, and got to sort

1:29:54

of the thing that confers ownership. When

1:29:58

you make something, it's yours. You write something on

1:30:00

your blog, you have the copyright to it because you

1:30:02

created it. It's so clear, right? What

1:30:05

if you draw a picture on a piece of paper? Okay, you

1:30:07

got the copyright on the picture, right? What

1:30:09

if you use Photoshop to make a

1:30:11

picture? Well, now you use this software tool written

1:30:13

by a bunch of other people, just plain old

1:30:15

Photoshop, not like AI generally, like Photoshop 3.0, right?

1:30:18

With layers now. You

1:30:20

use Photoshop, but you didn't write Photoshop. A bunch

1:30:22

of people wrote software to make Photoshop, and you

1:30:24

then paid Adobe for, then they gave you that

1:30:26

software product. You use Photoshop to make a picture,

1:30:29

but still we say, well, you made that picture.

1:30:31

You have the copyright on it. You are the

1:30:33

creator. You own it, right? Then

1:30:36

we say, all right, but what if you can't draw?

1:30:40

What if you tell somebody, like, I can't draw. Here's

1:30:42

what I want. I want this picture, or whatever example I gave,

1:30:45

a polar bear riding a

1:30:47

skateboard, but I can't draw. So I

1:30:49

asked somebody else, say, can you draw me a picture of

1:30:51

a polar bear riding a skateboard? So someone goes and they

1:30:53

draw a picture of a polar bear riding a skateboard. At

1:30:55

that point, the person who drew it owns it. Maybe they

1:30:57

use Photoshop, maybe they don't. They own it because they created

1:30:59

it. They drew it, right? But then you say, okay, this

1:31:01

was a work for hire. I'll give you 10 bucks. And

1:31:03

our contract says, I give you 10 bucks. You give me

1:31:06

the polar bear drawing. Now I own the polar bear drawing

1:31:08

because I paid you for it. That is

1:31:10

a market for creative works. Someone

1:31:12

was an artist. I can't draw. They

1:31:15

could. They drew it. They asked

1:31:17

for money. I gave them money. The

1:31:20

copyright is now mine, right? And

1:31:22

the act of creation is clear. The person who drew

1:31:24

it, they created it. I paid money for it. They

1:31:27

sold me their creation. Now I own it. All, you

1:31:29

know, normal, right? Now

1:31:31

I say, make me a picture of a polar bear

1:31:33

on a skateboard. But I don't say it to

1:31:35

an artist. I say it to an

1:31:37

image generator. It's the exact

1:31:39

same thing as I did before. Before

1:31:42

when I did it, it was clear that I don't own

1:31:44

anything until I pay for that, right? Now

1:31:46

when I do that exact thing, but instead of typing

1:31:48

it into an email to an artist, I type it

1:31:50

into an image generator and

1:31:52

I get an image back. Who

1:31:55

created that image? I didn't

1:31:57

create it. But if you're going to say I didn't create the

1:31:59

one that the artist drew for me. Because you just told the

1:32:01

artist what to draw, but you didn't create it. Well, if I

1:32:03

didn't create that one, I certainly didn't create this one because I

1:32:05

literally did the same thing. I just typed the text in a

1:32:07

different text field. It could literally be the same text. It could

1:32:09

be an AI prompt, emailed to an

1:32:12

artist or sent to an AI. So I'm not

1:32:14

gonna say that I am the creator of that.

1:32:17

The AI model can't be

1:32:19

the creator because computer

1:32:22

programs can't own things. They

1:32:24

don't have rights. Computer

1:32:28

programs are made by people who have rights, just like

1:32:30

people who wrote Photoshop. They have the rights to Photoshop

1:32:32

and so on and so forth. But the people who

1:32:34

wrote Photoshop have no rights to the things that people

1:32:36

made with Photoshop, despite Adobe's little snafu

1:32:38

with their license agreements recently, which they

1:32:40

clarified. But anyway. So

1:32:43

I didn't make that picture of the polar bear.

1:32:45

The large language model didn't make it. Who

1:32:50

owns that picture of the polar bear based on

1:32:52

the act of creation? Where is the act of

1:32:54

creation there? How did that model

1:32:56

create the polar bear? Well, it created the polar

1:32:59

bear picture because it had been trained on tons

1:33:01

of other images that maybe were

1:33:03

or weren't licensed. But still I'm looking around of

1:33:06

like if ownership is conferred by the act of

1:33:08

creation and there's no act of creation here, what

1:33:10

the hell are we, what's going on here? Who

1:33:12

owns the picture of the polar bear? Every

1:33:16

time I dig down into some kind of like,

1:33:18

oh AI's allowed you to do this and you're

1:33:20

allowed to train, it's just what people do or

1:33:22

whatever and computers aren't people. I always go through

1:33:24

looking for how we confer ownership of stuff

1:33:26

like this, how we confer ownership of intellectual

1:33:29

property, how we exchange money for intellectual

1:33:31

property, how the market for intellectual property

1:33:33

works. And none of the

1:33:35

existing systems make any sense in a world where

1:33:38

I can say the same thing to a human and

1:33:41

a generator that is

1:33:43

clearly not me creating anything and yet I do

1:33:45

get a picture out of it that came from

1:33:47

somewhere and there's no like, there's no human act

1:33:49

of creation. It's an indirection, right? And

1:33:51

so I think we need new ways to think

1:33:53

about a new laws for that type of indirection

1:33:56

to say, what is the chain

1:33:58

of ownership here? It's kind of like, not

1:34:00

quite the same thing, but remember the whole

1:34:02

thing where the monkey took

1:34:04

a picture of itself with the camera? Do you remember? Oh, yeah.

1:34:06

It was like a camera set up in the jungle or whatever

1:34:08

and a monkey comes up to it and snaps a picture of

1:34:10

himself and the photographer is like, well, it's

1:34:12

my camera, so I own the copyright to the picture.

1:34:14

I'm like, well, doesn't the monkey own the copyright? Because

1:34:17

it took the picture, right? And it's like, but the

1:34:19

monkey can't own the copyright. It's not a person, right?

1:34:21

And believe me, a monkey is way closer to a

1:34:23

sentient being than an LLM, right? It's

1:34:26

a real living thing. No one's going to argue

1:34:28

in court that a monkey is not alive. And

1:34:30

they're going to say, well, does it have legal

1:34:32

rights? Well, I would say a monkey has more

1:34:34

legal rights than a large language model, which is

1:34:36

just a bunch of numbers in memory. And

1:34:39

so this is the kind of conversation we're having. And

1:34:42

honestly, this would be so much easier to

1:34:44

have if we had actual artificial intelligence, as

1:34:46

in sentient artificial beings. But we don't. That's

1:34:49

just science fiction. Large language models are not anywhere close

1:34:51

to that. That would be so much easier because you'd

1:34:53

be like, well, conscious beings have rights and we need

1:34:55

the, you know, whatever they always have names of this

1:34:57

in sci-fi movies, the AI consciousness act

1:35:00

of 2732 that gives rights to the AIs to avert

1:35:02

a global war and plunge

1:35:07

us into the matrix apocalypse. You know what I mean? Like,

1:35:10

it's so much easier when you say, well, people

1:35:12

have rights and computer programs that are basically people

1:35:14

have rights and it's straightforward, but

1:35:16

we're nowhere near there. So now we're arguing about monkeys,

1:35:19

whether they have the copyright to pictures,

1:35:21

and we're arguing about huge matrices

1:35:23

of numbers, whether they can

1:35:25

create anything or you're saying like, oh, you're saying

1:35:28

basically like the people who wrote Photoshop own every

1:35:30

picture that's made with it, because, like, well, no, the

1:35:32

LM doesn't own it. And the person who wrote

1:35:34

the prompt doesn't own it. But you know who

1:35:36

does own it? OpenAI, because they wrote the

1:35:38

program that crawled all the pictures in the world

1:35:40

that trained the model that you paid to use.

1:35:44

None of those answers are satisfactory

1:35:46

anyway. Like it doesn't feel right.

1:35:49

It doesn't seem right. It doesn't seem sustainable. And

1:35:51

yet we do need some kind of answer here,

1:35:54

even if the answer here is that anything again,

1:35:56

like that one wall precedent we had is like,

1:35:58

if you make something out of AI, you don't

1:36:00

own the copyright. It is not copyrightable. Nobody owns

1:36:03

it. It's garbage. It's slop. It's a thing that

1:36:05

exists, but nobody can claim that they own it

1:36:07

So it is free for anybody to take and

1:36:09

do whatever they want with but you

1:36:11

certainly can't like sell it to someone because you didn't

1:36:13

own it. It's very confusing.

1:36:16

I know that I haven't made this any clearer. You can try

1:36:18

reading my post to see if it becomes any more clear but

1:36:20

really this is this is a dizzying

1:36:23

topic if you think about it for any amount of

1:36:25

time and I think a lot of

1:36:27

people are doing a

1:36:29

lot of feeling about it, which makes perfect sense

1:36:31

and, honestly, it is more

1:36:33

straightforward to feel things about it than it is to

1:36:35

think about it, because thinking about it gets you into

1:36:37

some weird corners real fast. It's

1:36:40

just, it's a mess. It's

1:36:44

a mess, and I don't know what the right

1:36:46

answer is right like it's it's so gray from

1:36:48

top to bottom And I just I don't know

1:36:51

I just don't know well And I think we're

1:36:53

gonna have to be fighting this and working this

1:36:55

out for a while I mean

1:36:58

look at how much disruption

1:37:00

to existing businesses, existing

1:37:02

copyright law and existing norms was

1:37:04

caused by the web and then the

1:37:06

rise of other things on the

1:37:08

internet. Like, this is just, this is

1:37:10

how technology goes.

1:37:13

There are massive disruptions to

1:37:16

what has been established, what many

1:37:18

people have held dearly.

1:37:20

There's massive disruptions to that when new tech comes

1:37:22

around sometimes. And sometimes

1:37:24

it takes a decade or two to really

1:37:27

settle out and work out. What are the

1:37:29

norms? What should the laws be? What does

1:37:31

copyright mean in this new world? Things like

1:37:33

that? Like, that takes a long time to

1:37:35

work out sometimes. The rise

1:37:38

of these AI techniques and models is

1:37:41

potentially as disruptive to

1:37:44

existing business models

1:37:46

and norms and perspectives as

1:37:48

the web was when it first came out a

1:37:50

thousand years ago, so I really think we're in

1:37:52

for a while of just

1:37:55

not knowing. There's gonna be a

1:37:57

lot of damage and destruction

1:37:59

along the path to get from

1:38:02

where we are now to kind of where

1:38:04

things settle out. It will destroy

1:38:06

a lot of businesses and it

1:38:08

will, you know, make it hard for a lot

1:38:11

of people to do what they've been doing. It

1:38:13

will also create a bunch of new businesses and

1:38:15

create a bunch of new value and new opportunities,

1:38:17

just like any other massive disruption. I think this

1:38:19

is, this is a very large disruption and it

1:38:22

is, it's mostly only going

1:38:24

to start to

1:38:26

become visible of like, you know, what the other side

1:38:28

looks like just after a bunch of time has passed

1:38:30

and we've gone through a bunch of messiness and we're

1:38:32

just, we're in such early days, it's really hard to

1:38:34

know where we're going to end up right now. I

1:38:37

feel like this is going to be in some

1:38:39

respects, not all, but in some respects, even more

1:38:41

disruptive than the initial web, because the initial web

1:38:43

was kind of like text.

1:38:45

We have laws governing that. It was

1:38:48

a massive shift of wealth. Obviously newspapers

1:38:50

go out of business. Craigslist gets

1:38:52

rich. You know what I mean? Like, like, but

1:38:54

we saw that giant shift in paper magazines, like

1:38:56

the shift of publishing, right, and web search and

1:38:58

doing all that or whatever. But during that entire

1:39:00

thing, people were upset and it was

1:39:02

a big turmoil because it was like, these things used

1:39:04

to be huge. Every city had 25 newspapers. A newspaper

1:39:07

reporter was a big job. And, you know, it was

1:39:09

like, and all of a sudden all that money's going

1:39:11

elsewhere to these dot-com things or whatever. But during that whole

1:39:13

process, there was mostly agreement

1:39:15

that like, newspapers own what

1:39:17

they publish, websites own what they publish. Like

1:39:19

we have existing copyright laws for this. There's the

1:39:21

whole Google search index thing that we can figure

1:39:24

out and, you know, fair use on the

1:39:26

internet and stuff. But in general, it was just

1:39:28

a massive shift of power and money from

1:39:30

older industries to newer ones, mostly

1:39:32

following along the shape of laws

1:39:35

and ideas and morals and

1:39:37

ethics and societal understanding about

1:39:40

the written word, mostly in the early days of the web,

1:39:43

especially before social media really came and mixed that up

1:39:45

a little bit, right. With the whole aggregation of humans

1:39:47

all talking to each other and quoting things and linking

1:39:49

out or whatever. Right. That in

1:39:52

hindsight, that seems much less disruptive than

1:39:54

AI stuff, which is like, it's

1:39:56

a free for all. No one knows anything. No one knows

1:39:58

what's legal. What's not. What's sustainable? What's

1:40:00

not? What should we do? What can we

1:40:03

do? What are people doing? How valuable is

1:40:05

this? How useful is it? Like just

1:40:07

so many questions and we like all the laws that

1:40:09

we have that seems like they could apply this and

1:40:12

some of them do apply. It's like, yeah, but there's

1:40:14

these huge areas where it's like here be dragons on

1:40:16

the map and they draw the big dragon and the

1:40:18

thing is like, nobody knows what's there. And

1:40:20

there's a lot of money behind it. And a lot of people

1:40:22

running in that direction. And it's not even clear where

1:40:24

or how this will shift the power. Like

1:40:26

on the internet in the early days, it

1:40:29

was pretty clear: paper newspapers, power's going away

1:40:31

from them and towards websites. Like that trend

1:40:33

was visible to anybody with a clue and

1:40:35

it was just a question of how fast,

1:40:37

how hard, you know, whatever. Here, is this

1:40:40

going to shift power massively to the record labels because

1:40:42

they own all the music, for example, or is it

1:40:44

going to destroy them because everything they have is now

1:40:47

worthless because AI models can be trained on it and

1:40:49

it's a perfect substitute for what they previously made and

1:40:51

no one wants anything? Like, you can't even tell which

1:40:53

direction it's going to go at this point. It's so

1:40:55

early and I just don't think that was true of

1:40:57

the web. So this is an exciting

1:41:00

time to be alive in many

1:41:02

ways, especially if you're in any

1:41:04

industry, any creative industry that involves

1:41:06

intellectual property that AI touches at

1:41:08

all. And at this point, that's

1:41:11

nearly all of them. Right. And

1:41:14

right now, what it does

1:41:16

is not, you know, not

1:41:18

particularly amazing, but it is good enough

1:41:20

for so many use cases. And this

1:41:22

stuff generally doesn't get worse over time.

1:41:25

Thank you to our sponsors this

1:41:27

week, One Password and

1:41:30

Photon Camera. And thank you to our

1:41:32

members who support us directly. You can

1:41:34

join us at atp.fm slash join. Members

1:41:37

get a bunch of perks, including ATP

1:41:39

overtime. This is our weekly

1:41:41

bonus topic. That's an extra segment

1:41:44

that only members get to hear.

1:41:46

ATP Overtime this week is

1:41:48

going to be about a rumor

1:41:50

reported by the information and Mark

1:41:52

Gurman about some

1:41:54

changes and plans to what Apple is going

1:41:56

to be working on for the next Vision

1:41:58

Pro and kind of what they

1:42:00

can maybe do to make the next Vision

1:42:03

Pro cheaper and how they're going to possibly

1:42:05

do this and everything. That's what we're talking

1:42:07

about in ATP Overtime this week. Join

1:42:09

and listen: ATP.FM slash join. Thanks, everybody.

1:42:11

And we'll talk to you next week. Now

1:42:17

the show is over. They

1:42:19

didn't even mean to begin

1:42:21

because it was accidental. Oh,

1:42:25

it was accidental. John

1:42:28

didn't do any research. Marco

1:42:30

and Casey wouldn't let him

1:42:32

because it was accidental. It

1:42:35

was accidental. And

1:42:38

you can find the show notes at

1:42:40

ATP.FM. And

1:42:43

if you're into Mastodon, you

1:42:46

can follow them at C A

1:42:49

S E Y L I

1:42:51

S S. So that's Casey Liss M A

1:42:54

R C O A R M E N T,

1:42:57

Marco Arment. S

1:42:59

I R A C U S A

1:43:02

Siracusa. It's accidental.

1:43:07

They didn't mean to. Tech.

1:43:18

Not so real time follow up on my

1:43:20

earlier statement about Apple

1:43:22

Silicon Macs not being able to use PCI

1:43:24

breakout boxes. That is not true. You can

1:43:26

use Thunderbolt PCI breakout boxes. Obviously you can't

1:43:28

use GPUs. It's just not GPUs. Yeah. But

1:43:30

you can't use GPUs internally either. That's the

1:43:33

thing. Yeah. So still Apple should

1:43:35

have put the Mac Pro in the configurator. Or

1:43:37

I suppose they could have said, Hey, use PCI

1:43:39

cards by buying it, except you... You can use PCI cards.

1:43:41

Well, you can buy a Mac studio and also this

1:43:44

third party product that we don't even sell or

1:43:46

you could buy a Mac Pro, which is a product

1:43:48

in their lineup. I think two

1:43:51

things are simultaneously true. Number one,

1:43:53

they should keep making the Mac Pro because

1:43:56

it does have uses. And number two, absolutely.

1:43:59

Nobody should buy the Mac Pro, effectively.

1:44:01

Like, anybody who's going to a

1:44:03

page on apple.com saying, what Mac should I buy?

1:44:06

None of those people should buy it. No,

1:44:08

they should. That should be... look, the

1:44:10

whole point of this is it's a path that

1:44:12

leads to all of our products. And maybe there's

1:44:14

only one very lonely overgrown path that leads to

1:44:16

the Mac Pro, but it's got to be there.

1:44:19

I would say number three, your product chooser should let

1:44:21

you choose from any of the products, depending on which

1:44:23

things you answer. Put as many scary questions in there

1:44:25

as you want. There's just got to be a path

1:44:28

that lands in the Mac Pro. Because otherwise, what they're

1:44:30

saying with this is, no

1:44:32

one should buy this product. And I don't think Apple believes that.

1:44:34

If you ask them, they'd say, well, some people should. Like, OK,

1:44:36

great, but you have a tool that lets people choose, and it

1:44:38

has every single Mac you sell except for that one. That just

1:44:40

seems like a bug to me. Someone should report it, and they

1:44:42

should fix it. You should report it. I

1:44:45

just want a Mac Pro that's worth buying. I

1:44:48

mean, that's a bug. Maybe

1:44:50

they're working on that. We'll see. So

1:44:52

I feel like we covered this in

1:44:54

the past, but what are you waiting for? What

1:44:57

would make it worth buying? Am I waiting

1:44:59

for anything in particular? I don't know. Because

1:45:01

again, the gaming situation on Apple Silicon

1:45:03

Macs is entirely unclear. If I did buy

1:45:06

a Mac with a big beefy

1:45:08

GPU, bigger than a Mac Studio GPU, that

1:45:10

would be a speculative purchase. It would not

1:45:12

be like my current Mac Pro, which I

1:45:14

literally knew I could run Windows games on

1:45:16

and do, and they work fine. And I

1:45:18

literally boot it into Windows. That's not speculative.

1:45:20

That's a thing, right? If

1:45:22

I decide, hey, I want a bigger than

1:45:25

Mac Studio GPU in an ARM Mac, I

1:45:27

am crossing my fingers that at some magical point

1:45:29

in the future, I will be able to

1:45:31

do interesting gaming things on it. I

1:45:34

don't know if I'm going to make that speculative purchase. I don't

1:45:36

know if Apple's going to make a Mac with a better than

1:45:38

Mac Studio GPU in it. And

1:45:40

maybe they make it, and it's just too

1:45:42

rich for my blood, and I can't spend

1:45:44

that much money on something speculative. Like I

1:45:46

said, my default is an M4 Mumble Mac

1:45:49

Studio is potentially the computer

1:45:51

I will replace this with whenever they release that like next

1:45:53

year or towards the end of this year or whatever. But

1:45:57

I would like to see, show me something.

1:45:59

Show me the Mac Pro. Show me something that's not

1:46:01

a Mac Studio in a giant cavernous case, right?

1:46:03

That's what I would like to see from them and then I can

1:46:05

decide is it worth it for me to get

1:46:07

that? Because it's not a slam dunk. Like... well,

1:46:09

it's not as big a slam dunk as 2019 was because,

1:46:11

again, that was just not speculative. But it's

1:46:14

just kind of wishful thinking at this point to think

1:46:16

you're gonna be running Windows ARM games natively... you

1:46:18

know, you're gonna be booting Windows for ARM on your,

1:46:20

you know, Apple Silicon Mac

1:46:23

Pro, or you're gonna be running Windows-

1:46:25

caliber games in macOS

1:46:28

because Apple will have gotten all the AAA game developers

1:46:30

on board. That is all just a twinkle

1:46:33

in someone's eye right now. It is not a real thing.

1:46:35

I just... I feel like...

1:46:37

and I'm

1:46:39

gonna say this, and I know and I

1:46:41

understand why it's not appealing to you, but

1:46:43

I feel like so many of your

1:46:45

problems... which, well, maybe not even problems, but so

1:46:47

much of your life would be so much better if

1:46:50

you would just get a damn

1:46:52

Mac Studio and a damn Windows PC

1:46:54

And I get it. I don't want to

1:46:56

run Windows anything, I don't, and I know

1:46:58

you are even worse than me in this

1:47:00

capacity. But, like, that would make so many

1:47:03

things so much better in your life. I would probably have a

1:47:05

gaming PC if I had a place in the house for it,

1:47:07

but I don't. So, I mean, I

1:47:09

hate to break it to you, but I really don't think that

1:47:11

there is ever going to be a Mac Pro that does the

1:47:13

things that your current Mac Pro does. And I mean, that may

1:47:15

be true. Like, I'm rooting for it. But, like, right now the

1:47:18

outlook doesn't look so great. Yeah, I

1:47:20

would definitely not hold your breath on that. I

1:47:23

mean, like, the thing is, it's actually kind of... if

1:47:25

I had, like, two years ago, like, predicted

1:47:28

how this would go, actually, I'm kind of

1:47:30

surprised at how much motion there is here:

1:47:33

the Copilot+ PC, how hard Microsoft is

1:47:35

pushing into ARM PCs after doing such a

1:47:37

bad job with Windows RT, right? Apple

1:47:40

with its whole Game Porting Toolkit... like,

1:47:42

both those parties, both Microsoft and Apple,

1:47:45

are actually surprising me with how

1:47:47

hard they're trying to make my

1:47:49

dream happen. They're just not

1:47:51

succeeding, right? But they're

1:47:54

trying more than I thought they would, right? I

1:47:56

did not... I didn't think they'd be like... both...

1:47:58

on both sides, I have been pleasantly

1:48:00

surprised by the additional effort that they

1:48:02

are putting in. I think everyone kind

1:48:04

of is. It's just like... I just...

1:48:06

they're just not really doing

1:48:09

it. Alright, but I give

1:48:11

them kudos for the effort. If

1:48:13

I had to pick one thing, like, I

1:48:15

would... I would wish that Microsoft would commit

1:48:18

to a transition to ARM, but that's not what

1:48:20

they want to do. They seem to

1:48:23

think that they're going to have a... they're

1:48:25

going to support x86 and

1:48:27

ARM forever off into the future

1:48:29

which I think is a dumb strategy but that

1:48:31

seems to be what they're doing and that doesn't

1:48:33

help me and that doesn't help Windows games get

1:48:35

ported to ARM. All that does is bifurcate their

1:48:37

market and say, well, all the AAA

1:48:39

games will still be on x86 with Nvidia cards

1:48:42

and ARM will just be for people's laptops, and Microsoft may

1:48:44

be perfectly happy with that but it doesn't help me over

1:48:46

here with Apple Silicon. I mean

1:48:48

what PC games are you playing with regularity

1:48:50

right now? Destiny. I don't know if you know

1:48:52

this but Destiny runs on PC. But

1:48:55

that's the thing, like, is there no

1:48:57

other appliance that you can buy to run Destiny?

1:48:59

Can't you do it on PlayStation? Destiny runs at

1:49:01

higher resolution and higher frame rates on gaming PCs.

1:49:03

I don't really play it on my Mac Pro;

1:49:05

I play it on my PlayStation 5 for a

1:49:07

variety of reasons but it does run

1:49:09

better as defined

1:49:11

by resolution and frame rate even on my

1:49:13

Mac Pro. The PlayStation maxes out at 60

1:49:15

frames per second, right? And I can

1:49:17

get higher than that depending

1:49:20

on settings, and you can go way higher. You can

1:49:22

go... I've played it. I actually have played Destiny on

1:49:24

my PS5 at 120 frames per second on my TV

1:49:26

but it has to lower the quality substantially and I

1:49:28

generally don't play Destiny on my TV because it'll burn

1:49:30

it in, right? But I did try that just to

1:49:32

see what it was like. 120

1:49:35

frames per second is good. All the, like, the

1:49:37

Destiny streamers who are out there playing Destiny, they

1:49:40

occasionally have their frame rate displayed in the corner. They're

1:49:43

triple digits, always hundreds of frames per

1:49:45

second, sometimes pushing 200. It

1:49:47

makes a difference. It looks and feels

1:49:50

smoother, especially in PvP. That's, you

1:49:52

know... And even if I'm playing

1:49:54

on a controller because at this point sadly I'm better with

1:49:56

a controller in Destiny than I am with mouse and keyboard

1:49:59

and also controllers are way better for my RSI, so

1:50:01

I'd be doing it anyway. But yeah, Destiny

1:50:03

is one choice. And games come out all the

1:50:05

time and they come out for PC. They

1:50:08

don't come out for the Mac until three years later when Apple

1:50:10

puts it in the keynote, right? So there's

1:50:13

past games, there's future games, there's my gigantic

1:50:15

Steam library that I still haven't played through.

1:50:19

I mean, I'll get a PlayStation 5 Pro, I'll

1:50:22

get a PlayStation 6, I do like

1:50:24

consoles, they're great. Maybe someday the gap

1:50:26

between PC and console will be

1:50:28

diminished. But even now I would say it's more diminished because

1:50:30

60 frames per second on PS5 is such a change from

1:50:32

30 on the PS4 that I

1:50:34

feel like the gap has narrowed. Because Destiny players

1:50:36

were playing at 100, 200

1:50:39

frames per second back when I was playing 30. Now

1:50:42

they're playing at 100, 200 frames per second and I'm playing at 60, right?

1:50:45

I'm gaining on them. So maybe at some point I'll be like,

1:50:47

you know what, I don't need a big GPU and I'll just

1:50:49

get a Mac Studio and be happy with it. And

1:50:52

that's looking like the most likely situation

1:50:55

right now. But we'll see. I

1:50:57

mean, to be clear, as much as I'm giving you a hard time,

1:51:00

I want you to have what you want.

1:51:02

I can make an argument, even I can

1:51:05

make an argument for the Mac Pro, for

1:51:07

a really beefy Mac Pro that's useful for

1:51:09

people that work outside of a music studio.

1:51:12

I'm not saying that your desires are wrong, as much

1:51:14

as I'm giving you grief about it. I'm not saying

1:51:16

your desires are wrong or unreasonable. I

1:51:18

don't think Apple will be

1:51:21

achieving them, but I don't think they're

1:51:23

unreasonable. It's exciting that they did it with 2019, because

1:51:26

again, I've said before, I don't, despite

1:51:28

my gaming things, this is not a... this is not

1:51:30

a rational purchase. It's the same way that you don't need

1:51:32

a Ferrari to get to work faster. People just like fast

1:51:34

cars, they like fast cars. I just like

1:51:36

powerful computers because I like powerful computers. It's exactly

1:51:38

the same thing. Trying to justify a Mac,

1:51:41

me trying to justify a Mac Pro is like someone trying to

1:51:43

justify a Ferrari. It's like, well, I need a car this fast

1:51:46

to get to my work. No, you don't. Nobody

1:51:48

does. But people want them because they're cool.

1:51:51

Right. And that's fair and

1:51:54

that's totally fair. But I feel

1:51:56

like, to my eyes, we're

1:51:58

starting to cross from, oh, it's

1:52:00

kind of adorable that John is still rocking

1:52:02

his Mac Pro to, like, man,

1:52:05

I kind of want you to move on to a Mac

1:52:07

Studio. 'Cause I think you might enjoy it a lot more,

1:52:09

you know? Well, I mean, I'm not buying an M2

1:52:11

one at this point. Well, that's fair. No, that's fair. This

1:52:13

is not the time to buy a Mac Studio. So it

1:52:15

isn't. I'm hanging in there for the M4 one.

1:52:17

Yeah. I think when the next one comes out, I think

1:52:19

that's your move. I can't, I just cannot

1:52:21

see a future in which they

1:52:23

make the Mac Pro that you want. And

1:52:26

so you might as well get the Mac

1:52:28

Studio, which is the Mac Pro without slots.

1:52:30

Like, that is the new Mac Pro. I

1:52:32

can't say it enough. And with a wimpier

1:52:34

GPU. But they... like, the

1:52:37

Mac Studio is the Mac Pro. They should

1:52:39

have called it Mac Pro. That is the

1:52:41

Apple Silicon Mac Pro. They should not have.

1:52:44

No. Can you imagine the aneurysm he would

1:52:46

have had? It doesn't make sense. They

1:52:48

sell... I think the Mac Pro, it's way bigger, but

1:52:51

it's the same computer. It just has a built-in

1:52:53

PCI breakout box. I know. It's still got

1:52:55

the slots. It's still... anyway, but we'll see

1:52:57

how good... And by the way, by the

1:52:59

time I do get this, like, my computer

1:53:01

is essentially five years old now already.

1:53:04

This is a pretty good run for a computer that

1:53:06

I bought at the, you know,

1:53:08

just before the, the, you know, processor

1:53:11

transition, right? Which is, you know, unfortunate for the... we

1:53:13

already said when it happened, like, oh, my poor Mac

1:53:15

Pro or whatever, but I love this machine and

1:53:17

I've already gotten five years out of it, which granted is half of

1:53:19

what I got out of my last Mac Pro, but, you know, processor

1:53:22

transition, right? So if

1:53:24

I ditch this machine at six years old, that's

1:53:26

longer than any of your laptops lasted, right? It's

1:53:29

a pretty good run. Hey, we're just excited

1:53:31

if Marco makes it six months, much less six years.

1:53:33

He's been pretty good with the 16-inch. I think it's

1:53:36

almost two years old now, right? No, it's the M3

1:53:38

Max. It's a... it's the black one. Sorry.

1:53:41

I mean, to be honest, lately I haven't

1:53:43

been much better, so I shouldn't be casting stones

1:53:45

in this glass house, but generally

1:53:48

speaking, Marco is much more frequent on

1:53:50

his purchases. So, I mean, like, no matter

1:53:52

what, like... I feel like I've gotten

1:53:55

a good run out of this Mac Pro, and I'm enjoying

1:53:57

it for, you know, as long as... like, I'm excited that

1:53:59

Sequoia runs on it. That's

1:54:01

cool. Next year, probably not, right? So it's really

1:54:03

putting a deadline on this. Like I said, I'm

1:54:05

willing to run this with last year's version of

1:54:07

the operating system for some period of time if

1:54:10

I have to wait, right? But, you

1:54:12

know, well, we'll see what happens. Like I'm, you know,

1:54:14

I keep my cars for a long time and keep my Macs for a

1:54:16

long time.
