Walk a Little Faster

Released Thursday, 22nd February 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

So are we all returning our vision pros

0:02

as the news

0:04

stories seemed to indicate is the big wave This

0:06

is what you decide to do to start the

0:09

show? I was saving this and

0:11

now you're just gonna drop this bomb right and put right up front

0:14

Yeah, did you already return yours? No,

0:16

of course But I was

0:18

trying very hard to make it sound believable. I'm a terrible

0:20

liar, so I don't think I succeeded, but

0:22

I was trying very hard No,

0:26

I didn't return mine. I mean, truth be told, now we

0:28

are getting into a topic that I don't think you intended but

0:30

it is

0:32

exceedingly overpriced. It is exceedingly

0:34

heavy. I don't think

0:36

my particular nose construction is terribly

0:38

compatible with it. And I'm stubbornly

0:40

and obstinately refusing to

0:43

use the probably more comfortable to

0:46

two-sided strap. I'm still on the crank

0:48

strap I feel like my particular

0:50

nose construction is that all of the weight really

0:52

wants to rest on my nose and part of

0:54

the problem is I've heard other people not

0:57

complain as much about the nose thing, but

0:59

complain about the fact that the

1:01

vision Pro really wants to sit lower on your

1:04

face than I think I would choose to naturally

1:06

put it And so if I put it where

1:08

I think is most comfortable and not on like

1:10

I don't know my nose Ridge or whatever I'm

1:13

sure there's an anatomical term for it But um,

1:16

if I set it where I want it to sit, then

1:18

it's like please move the headset down It's too high or

1:20

I forget the messages. But if you have a

1:22

vision Pro you've seen it and if you've seen any of

1:24

the videos you've probably seen it That's the vision Pro equivalent

1:26

of your hand is covering up the face ID camera on

1:28

your iPad Pro. Yep, yep, yep. Exactly it.

1:30

I could not think of a better analogy. And so

1:33

I have to have it ride a

1:35

little bit lower than I want And

1:37

so for me all of the weight

1:39

tends to sit on my nose unless

1:41

I really crank the crank strap Really

1:43

tight in which case then I can

1:45

transition some of that weight onto my

1:47

forehead or sometimes my cheekbones. And

1:50

that's better. But yeah, I

1:52

was in a room at the library again today,

1:55

And yeah after like an hour and a half two

1:57

hours, it was hurting my schnoz a

1:59

little bit. And comically, Erin

2:01

hadn't had a chance to try it until this past

2:04

weekend. We've just been so very busy. And

2:07

for her, all of the weight

2:09

was on her cheekbones. And immediately I

2:12

knew I should have put the dual strap or whatever it's

2:14

called on there for her. But we just

2:16

wanted to go plow ahead with what I was trying to show

2:18

her. And it was so uncomfortable

2:21

for her that she

2:23

ended up having her

2:25

pointers as load-bearing fingers under the

2:27

Vision Pro to keep it from

2:29

slamming into her cheekbones. I

2:32

don't think Erin has exceedingly

2:34

prominent cheekbones. I think they're

2:37

regular-ish cheekbones. Well, this is part of

2:39

the difference... Did you see people linking around

2:41

that there was a thread on Reddit where

2:43

the people were compiling basically what the light

2:45

seal sizes mean? Oh, I did not see

2:47

that. So there's the two digits and then

2:49

the N or the W at the end.

2:52

They've figured out through various sleuthing and trial

2:54

and error and stuff that it basically encodes...

2:56

Like, each digit is separate. It's not like

2:58

23 millimeter versus 33 millimeter. Like

3:01

it's... The two and the one or whatever, like

3:03

those are two different indicators. They can both change

3:06

up or down. And one

3:08

of the factors... I forgot the specifics, but

3:10

one of the factors is like whether it

3:12

like sits high or low on your cheekbones

3:14

or something like that. And so chances are

3:16

this is just, you know, for Erin to

3:18

comfortably wear it, she would most likely need

3:20

a different light seal. Which I would totally

3:22

buy. That being said, the scan that

3:25

she did said the same as me. That doesn't

3:27

mean that that's, you know, the right answer. Just

3:29

the automated scan says she and I have the

3:31

same one. But it very well could be that

3:33

if we had all of them arrayed out in

3:35

front of us, I mean, hell, maybe even not

3:37

mine would be different. But I concur

3:39

that, you know, it's very likely that maybe a

3:41

different light seal would work better for her. But

3:43

certainly if she were to buy her own based

3:45

only on the, you know, experience she had by

3:47

scanning her face, she would end up with the

3:49

same one I've got. I think one of the

3:52

things that's hurting the Vision Pro, you know,

3:54

initial sales and reactions is

3:56

they've made this system so clever,

3:59

so complicated, of how to fit it,

4:01

and it seems like whatever the app

4:03

is saying should fit you maybe

4:06

is not always accurate. Or maybe they just

4:08

are not good at presenting alternatives to people

4:10

or whatever it is. It's a highly, you

4:12

know, fit-dependent device, and

4:15

I was thinking like why do we

4:17

not hear about this so much with the

4:20

Quest headsets? I mean obviously part

4:22

of it is I think we are applying stricter

4:24

standards to Apple because everyone does and their stuff always

4:27

gets more scrutiny But I think part of it might

4:29

also just be like There's

4:31

such a weight difference like Apple chose

4:33

to make a very high-end headset it's

4:35

a very heavy headset compared to its

4:37

competitors and maybe therefore it is

4:40

more sensitive to, like,

4:42

different fit adjustments. And I was thinking, like, is

4:44

it a mistake to have

4:47

made the vision Pro in such

4:49

a way that the light seal, or whatever

4:51

the fit mechanics of it are, are

4:54

not adjustable on the device? Now,

4:56

obviously, that would introduce more mechanical complexity, probably a little

4:58

bit more weight as a result of that, but like

5:01

would it have been a better choice? I know

5:03

Apple would never do this But would it have

5:06

been a better choice to have like some adjustment

5:08

knobs or whatever, like, on the actual light

5:10

seal, to have it be somewhat adjustable? I

5:12

don't know, but it seems like there's

5:15

a lot of areas where Apple's

5:17

approach to this is you find your

5:19

perfect fit or more often we automatically

5:21

find it for you, here it is,

5:23

period. And, like, if your

5:25

perfect fit is a little bit different, they're just like,

5:28

no, this is your perfect fit, period. And there's

5:30

just no... there's no alternative. I

5:32

wonder if that's something that they'll tweak over time. Or

5:35

a third-party opportunity, though. It's kind of like how

5:37

third parties sell tips for the AirPods Pro, right? Like

5:39

all the, you know, foamy tips or plastic tips,

5:41

or tips where you take a mold of

5:43

your ear. Because as far as I

5:46

know there is no weird DRM or parts pairing with

5:48

the light seal. It's just fabric

5:50

and plastic and a couple little magnets.

5:52

I would imagine that if this product

5:54

ever becomes popular enough that it can

5:56

sustain a third-party ecosystem for light seals,

5:58

it's going to appeal to a

6:01

lot of people. There'll be third-party things

6:03

you can snap in there that are very

6:05

differently shaped, maybe are adjustable, you know?

6:07

Like... I think the rigidity of

6:10

the main screen part is tough

6:12

to change at this point in the technology

6:14

curve because there's a lot of stuff going

6:16

on in there that really needs to

6:19

be carefully aligned. But the replaceable light seal

6:21

is actually a good design. Or,

6:23

if not Apple doing this, a third party can say,

6:26

I will sell you a thing for less

6:28

than two hundred dollars that goes between you and

6:30

the main unit, and maybe you can find one

6:32

that fits better. I mean, certainly

6:34

the Quest line does have many third-

6:37

party, like, you know, head gaskets, or whatever

6:39

you'd call them. Head gasket sounds like

6:41

it's in a car engine. Anyway, on your VR

6:43

headset, that part is much easier to replace.

6:45

Very

6:48

true. But, you know, I do think

6:50

that... I don't know how I got myself

6:52

into this, but I believe I did it to myself.

6:54

I do really like

6:57

this device. I wish it were cheaper, I

6:59

wish it were lighter, but it is really

7:01

cool and since I do feel like it

7:03

is a compulsory purchase for my

7:06

job, both in terms of ATP and

7:08

Callsheet, I feel like I kind of had

7:10

to spend the money, and I will whine about the

7:12

money I spent until the end of time.

7:15

Have we met? Hi, this

7:17

is Casey. And I will whine about

7:19

it until the end of its life, or

7:21

its successor's. But I do really love

7:23

the device and you know it is incredible

7:25

for media consumption. I think you and I

7:27

will never see eye to eye as to whether

7:29

or not it's good as a secondary monitor

7:31

for your Mac, or a replacement monitor for your

7:34

Mac, more on that later, but it is extremely

7:36

extremely cool. And no, I

7:38

don't plan... my return window has run

7:40

out at this point, or it's

7:42

running out or has run out, but I

7:44

don't plan to return it, and I do

7:46

really like it. It is not a perfect

7:48

product by any stretch, and I don't know if

7:50

I go so far as to say I love

7:53

it quite yet, but I do really, really like

7:55

it, and I'm enjoying having it at

7:57

hand and using it. Yeah, I think

7:59

it's cool, but I'm

8:01

treating mine, I'm ending up using it

8:04

more like a dev kit than

8:06

like a product that's going to really have a

8:08

huge place in my life so far because I

8:10

mean first of all, look, Apple

8:12

reaped what it sowed. There

8:15

are no apps. There's no

8:17

games. There's very little content. Apple

8:20

did this 100% to themselves. The

8:23

developers are just largely not

8:25

there, so there's not

8:27

much for me to do with it besides watch

8:30

movies and the reality is

8:32

I don't watch that many movies and when

8:34

I do watch movies, I'm rarely watching them

8:36

alone. Yeah, yeah, yeah. I'm not

8:38

sure I'm going to have a lot of use for it. That's not to say

8:41

that no one else has use for this product. Just me, I don't

8:43

have a ton of use for it. I was

8:45

really hoping the Mac screen angle of

8:47

it would be very useful

8:49

to me, especially now as I'm in this weird

8:51

housing transitional period where I really want a big

8:53

monitor but I can't really fit one in this

8:55

rental house and there's all sorts of uses

8:58

that I thought I might be able to squeeze out of

9:00

it that ended up not really working that

9:02

well for me. So for me, it's

9:04

really just a dev kit and it's a dev kit for a

9:07

branch of my app that I'm not really working

9:09

on yet because I don't have time because I'm

9:11

working on the iPhone version. And I

9:14

think this is going to be kind of a story we keep hearing, something

9:16

like that where it's going to be

9:18

difficult again over time. It's going to

9:21

be difficult for developers to justify investing

9:23

time into the Vision OS

9:26

platform if there's no customer base

9:28

for it and it's a chicken and egg problem. And

9:30

if there's not a lot of apps and content on

9:32

it, it's going to be difficult for a lot of

9:35

people to justify buying one, especially with the other hard

9:37

to justify factors like the price and the

9:39

single person experience and some of the version

9:41

one challenges and limitations. Maybe

9:44

I'm totally wrong and maybe this is selling

9:46

like gangbusters way above Apple's expectations and maybe

9:48

they can keep thinking that they don't need

9:50

developers. But the reality is what

9:53

Apple has shown in their actions around

9:55

App Store policy over and over again

9:57

is they believe that they

10:00

grace us with a platform full of users

10:02

and we should kiss their feet and thank

10:04

them and give them a third of our

10:06

money or whatever, because these are

10:08

their customers they're bringing to us. Effectively,

10:11

therefore, we are not really bringing

10:13

value to the platform; they are

10:15

bringing the platform full of users to us, and we

10:18

should thank them for that. And I think what

10:20

we're seeing here is maybe this is demonstrating

10:22

the value of developers to the platform. If

10:25

not a lot of people are finding much to

10:27

do with it, that's

10:30

because there's not a lot of apps and

10:32

content for it, and that's because there's not

10:34

a lot of third-party development for it.

10:36

And so maybe, I mean, I

10:39

know Apple will not change their App Store policies,

10:41

but maybe at

10:43

least this helps show them a little

10:45

bit like kind of where it hurts

10:47

because they just did a huge launch

10:49

of this platform, and I think

10:51

it might be maybe a little

10:53

below expectations. This is just me

10:56

saying, but I kind of have a feeling like

10:58

I think they thought there would be

11:00

a lot more apps than there are.

11:02

And I think they thought there'd be a lot

11:04

more developer interest than there has been. Maybe they

11:06

thought there'd be a lot more content deals from

11:08

the big content publishers, and that's not really

11:10

happening as quickly, or yet, either. This

11:13

platform is starving for apps and

11:15

content. It's just it's it's it's

11:18

just not there yet. And.

11:20

I hope it's coming. But. What if

11:22

it's not? Like. Apple's gonna have

11:24

to just make this work largely on

11:26

their own because they've done such an

11:28

effective job of alienating everyone else in

11:31

the industry, unless there are broad

11:33

changes to App Store policy, like, you

11:35

know, sea-level changes. I'm not sure

11:37

what I would have done differently, but I

11:39

do think and I can't help but wonder.

11:42

If seeding dev kits to a

11:44

handful of indies would have gone

11:46

a long way, especially if it

11:49

was in concert with being able to talk

11:51

about it, you know? Because I think the problem

11:53

is, if they'd seeded, let's say, Underscore, for

11:55

the sake of discussion, then they surely

11:57

would have said, well, here's a dev kit,

11:59

you tell no one.

12:03

And then that's not really accomplishing much. Yes,

12:05

then that gets an even more polished version

12:08

of Widgetsmith in the store on day

12:10

one than was already there, which is good.

12:13

But I think the better approach would have been to

12:15

hand underscore one and say, go to town, man.

12:17

Talk about it. Get people excited if

12:19

you can. Not an edict. You know what I mean?

12:22

He'd be going bananas,

12:25

doing all of this work and having all this

12:27

fun, and underscore would talk about it

12:29

and underscore would be effusive about it, not because

12:31

he's full of garbage. He's not at all, but

12:33

because he's just a person who finds the good

12:35

in things. And so I think, you

12:38

know, you seed it to Underscore, you seed it to

12:40

James Thomson or whatever, and

12:42

you start to build a little more enthusiasm

12:44

in the indie community. And I think, and

12:46

I mean, admittedly, as I'm saying this, I'm

12:48

like, well, of course you think that because you're in

12:51

the indie community. And maybe that's true. But I don't

12:53

know, I feel like it would have done a lot

12:55

to build enthusiasm amongst anyone other than Disney and like

12:57

Unity or whoever it was. I don't even know if

12:59

they got dev kits. I just have to assume. But

13:01

you know, whoever it is that got the dev kits,

13:03

it certainly didn't seem to be indie from what I

13:05

can tell. And I, I feel like that

13:08

was a missed opportunity. And it could have been that

13:10

they wanted to and they just couldn't produce them in

13:12

time or something held it up. I don't know. But

13:14

it sure seems like if you were doing these

13:17

labs, which admittedly were pretty controlled, by

13:19

the way, I went to a lab, but I can't say anything else.

13:21

You know, even despite these

13:23

controlled labs, I feel like if

13:25

you had those labs, you must have had the

13:27

quantity of devices at a stage

13:29

in which they were complete enough that you could

13:32

have seeded a handful of, like, trusted indies

13:34

as well as the Disneys of the world.

13:36

And, and I wish they did and they

13:38

didn't. And so like you said, they're reaping

13:40

what they sowed, and there isn't a lot

13:42

on there. There's really not. And I

13:45

think part of this is exacerbated by

13:47

your average iPad app is really not great

13:50

on the vision pro in part because all

13:52

the iPad apps are shown in light mode

13:54

and all the vision pro apps, all

13:57

the native apps are not literally in dark mode,

13:59

but effectively in dark mode. And part of

14:01

it is because these iPad apps are designed

14:03

for touch targets that are much smaller than

14:05

eye targets. And I found

14:08

that with almost every iPad app,

14:11

it's often prohibitively difficult

14:13

to grab the right target without

14:15

using a cursor of some sort.

14:18

And Slack is to me the epitome of this,

14:20

and I think I've already brought this up several

14:22

times, but like changing between different slacks, which is

14:25

something I do constantly in that app is very,

14:27

very difficult. And I don't necessarily fault Slack for

14:29

this, but it's just that the targets are too

14:31

small, and then it becomes difficult to use Slack,

14:33

and then it's like, well, I'm just going to

14:35

get my work done on my computer then. And

14:39

so all of these iPad apps are kind of eh,

14:41

and there's not many Vision Pro apps because none of

14:44

us had them. And so now what? And it's like,

14:46

it's exactly what you said, Marko, like, where do you

14:48

go from here, Apple? What are you going to do?

14:51

What we've seen is like, I think the launch

14:53

of the Vision Pro kind of

14:55

is that DevKit program. Obviously,

14:57

a lot of people are buying them for

15:00

their own uses, their entertainment use,

15:02

a lot of it just kind of status

15:04

or YouTubers playing with it, but whatever it

15:06

is, a lot of people buying them as

15:08

early adopters. But I

15:10

also think a lot of the early purchases

15:12

are companies and developers who are wanting to

15:14

start experimenting with it, get on board with

15:16

it, or try to look at porting their

15:18

apps or whatever, and this

15:21

is just – I think this is their

15:23

DevKit program largely, and it just so happens

15:25

that you'll also see them like people

15:27

watching movies in first class on airplanes as well.

15:30

The way it is now with there being almost

15:32

no software and almost no content, it's

15:35

not like a failure per se. It just

15:37

really hurts the argument to buy it, and

15:39

it really hurts the experience of owning it

15:41

when – I think

15:43

people are predisposed to assume

15:47

any new tech product is a fad that

15:50

will fail and then laugh at it. That's

15:52

a very common thing in media

15:54

and tech commentary culture. I

15:57

think if everyone who buys the Vision Pro at

15:59

first, and it's just a lot of people at first, ends up

16:01

not using it very much a few weeks later

16:03

because they kind of ran out of stuff to

16:05

do on it. That's not

16:07

great for the reputation

16:09

of that product and its launch. None

16:12

of this should come as a surprise to Apple. They

16:15

saw coming up to the launch,

16:17

they knew how many apps there

16:19

weren't. They knew which apps

16:21

were being built native and which ones weren't.

16:23

They knew Netflix didn't have their native app

16:25

submitted to them or whatever. All

16:27

the things that are missing, they knew that going into

16:30

it. So it isn't like this is a

16:32

surprise to Apple. Again, I

16:34

hope that they are stepping on

16:36

the gas behind the scenes in terms of their

16:38

own content efforts. They're gonna have

16:40

to do most of this on their own. They're

16:42

not gonna get a lot of help from third

16:45

party developers or third party content makers on this.

16:47

They have to be making a ton

16:49

of the 3D content. They have to

16:51

be making a ton of the environment

16:53

content, any kind of experiential, virtual

16:56

travel stuff. They have to be

16:58

the ones to kickstart that themselves

17:01

because no one else is gonna do it with

17:03

these numbers and with Apple having really

17:06

alienated so many people over the last decade.

17:09

And it's funny too, we should

17:11

probably move on from this, but I think

17:13

what's kind of unfortunate about it is even

17:15

though I'm kvetching a little bit

17:17

about everything, this is an amazing device. Leaving

17:19

aside the physical comfort, which is a big deal, leaving

17:21

aside the cost, which is a big deal, if you can

17:24

get past that or just forget it for a minute,

17:26

this is a truly incredible device. And

17:29

the 3D stuff, like consuming a 3D movie

17:31

in it is very cool, but the immersive

17:33

content of which there is very little, but

17:35

we're hearing more and more rumblings that there's

17:38

more coming. In fact, statements even that there's

17:40

more coming. The immersive

17:42

content is, what is

17:44

the Tim phrase, it's blow away. It

17:46

really just knocks your socks off. Was that a Forstall-ism? I think

17:48

it was a Forstall-ism. Maybe it was. I think you might be

17:50

right, actually, now that you say that, I think you're right. It

17:53

has caught on since then, it has spread. I

17:55

think you might be right. But anyways, the immersive

17:57

stuff is just, it's unlike anything I've ever

17:59

seen or experienced. It's tremendous.

18:02

And for me, and I'm not saying it's true

18:04

for you, Marco, or anyone else, but for me,

18:06

I really like the Mac Virtual Display thing and

18:08

Universal Control. It works pretty darn well. It's not

18:11

perfect, but it works pretty darn well. And

18:13

so, and briefly using this on

18:15

a train a few weeks ago

18:17

was amazing. And so this is

18:19

a truly incredible, incredible device. And

18:23

even though we've kind of accidentally enumerated

18:25

some of the crappy parts of it,

18:27

it is incredible. And whether or not

18:30

it's the future, it is a

18:32

future that I am on board with and is super

18:34

neat. And I don't want to lose sight of that

18:36

because I think we're coming across as two grumpy old

18:39

men, which is accurate. But there's a good side to

18:41

this as well that we're not giving out. We're not

18:43

shining enough light on. Like it is incredible. And if

18:45

you are lucky enough to be able

18:47

to have one, it is very, very

18:50

cool. And I really think that there's a lot of potential

18:52

here. It's just a question whether or not we'll realize it.

18:54

And I think it's going to have a slower start

18:57

than anyone thought. Like, you know, because we were just

18:59

saying a few weeks ago, like this is going to

19:01

be, they're going to sell as many as they can

19:03

make. It's going to be back ordered for, you

19:05

know, the whole year, it's going to

19:07

be back ordered, like, you know, it's going to be

19:09

supply constrained or whatever. I just looked and I can

19:11

pick one of these up tomorrow or I can

19:13

have it shipped to me next week. That's

19:16

not good for the

19:18

sales figures, I think. Speaking

19:21

of things with a lot of potential

19:23

that may or may not have been

19:25

realized, two different things. First of all,

19:27

the 2015 movie, Steve Jobs. And second

19:29

of all, our new member special about

19:31

that movie. We recorded this month's members

19:33

only special about Steve Jobs, the 2015

19:36

movie with Michael Fassbender and Kate Winslet and a

19:38

bunch of other people. This is

19:41

a movie about Steve Jobs, what, four years after

19:43

he passed away. And so we, like I said, we did

19:45

a member special on it. If you are not a member,

19:47

John, what do you need to do in order to become

19:49

a member? You're waking me up

19:51

from my slumber to pitch the membership program? Yeah, man. Hey, you

19:54

could have bought a Vision Pro. You could have been a part

19:56

of this. You opted out of the conversation, sir. Well, you just

19:58

got through telling me why I shouldn't go get one. I

20:00

was trying to bring it back around. That was the

20:02

whole point. But if you want to get one, it's

20:04

really easy. It sounds like there's no apps for it.

20:06

I also want to watch Major League

20:08

Soccer games that have already taken place, because Apple just

20:11

announced they're providing that content. Anyway. Yeah, I put you

20:13

to sleep before I talked about the good parts apparently

20:15

because there are good parts for sure. ATP.FM slash join

20:17

if you'd like to become a member. Not all of

20:19

our member specials are about movies, but some of them

20:21

are and this one is. Great

20:24

sales pitch. What are you going to

20:26

get for me? Accurate. It is accurate. It is accurate.

20:28

No, it was a lot of fun watching this and

20:30

talking about it and I don't want to give anything

20:32

away. But yeah, remember that if you go to ATP.FM

20:35

slash join, you can join on a

20:38

monthly or yearly basis. You can also

20:40

go to ATP.FM slash, did I

20:43

say common a minute ago? Whoops. Anyway,

20:45

ATP.FM slash gift, if I'm not mistaken,

20:47

to gift yourself or someone else a

20:49

membership. Hint, hint, hint. But

20:51

yeah, we had a lot of fun recording this one. And

20:53

if you become a member for any amount of

20:56

time, you can go back in the history books

20:58

and listen to any of our member specials. And

21:00

you can do that as long as you are

21:02

a member. So you can check that out. We've

21:04

done one a month for almost a

21:06

year now, I think, or something like that. I don't

21:08

have a count in front of me, but we have

21:10

a fair bit of member specials in the can at

21:13

this point. So check it out: ATP Movie Club, Steve

21:15

Jobs. And we'll put links in the show notes, the

21:17

relevant information atp.fm slash

21:19

join. Let's start

21:21

some follow-up at 30 minutes in.

21:24

Chase writes, regarding the blur in the

21:26

vision pro when you turn your head.

21:28

I'm pretty sure that this is just

21:30

typical sample-and-hold display blur. This

21:33

also affects televisions, and it's where the idea

21:35

of motion resolution comes from. Impulse displays

21:37

like CRTs and plasmas have

21:39

much higher motion resolution.

21:42

To combat this, LCDs can

21:44

use backlight strobing and OLEDs

21:46

can use black frame insertion

21:49

and Blur Busters, which apparently is a

21:51

website, I learned today, has a really

21:53

good resource where you can read about

21:56

blur of all kinds, a

21:58

lot of headsets, like the Vision... excuse

22:00

me, the Quest 3, use very low persistence

22:03

to get much better motion resolution. The downside

22:05

is in brightness. The upside of the micro

22:07

OLED displays and pancake lenses is that it

22:09

allows the displays to be very close to

22:11

the eyes and having the weight closer to

22:13

the head is better for comfort. The downside

22:15

of pancake lenses is that they swallow much

22:17

more of the light coming off the

22:19

displays than Fresnel. I think it's Fresnel? Fresnel,

22:21

yeah. It's those lenses that are flat but

22:23

they look like they have a bunch of

22:25

concentric circle ridges on them. Okay,

22:27

so, or other aspheric lenses, whatever. So

22:29

even if the displays are 5000 nits, once you have

22:32

color filters, polarizers, and the pancake lenses, the

22:34

brightness we see can still end up very

22:36

low. The other consumer headset using pancake lenses

22:38

and micro OLED displays is the Bigscreen

22:40

Beyond, which I had never heard of.

22:43

It is very dim, especially if you turn

22:45

down the brightness to get acceptable persistence. Chase

22:47

continues, I believe Apple is pushing persistence further

22:50

than they should in order to get more

22:52

brightness back because they want to push HDR

22:54

as a thing on the Vision Pro. On

22:57

this topic, for the

22:59

motion blur in motion, I wish I had known to look for

23:01

that when I had my demo because I would have. I didn't

23:04

notice it, but clearly Marco has and I've heard

23:06

it from other people as well. I

23:09

I do wonder if, I still wonder if this is what they're talking about. So the sample-and-hold thing: this happens on OLED TVs as well. The deal with OLEDs is you light up a pixel and it stays whatever color you made it until you change it, and it changes color really, really fast. That sounds great. This is a great display technology. What's the problem?

The problem is, if you watch something like a 24-frames-per-second movie, it will show a frame, and the whole TV will show that frame, just the exact frame, exactly the way it is, until the next frame comes. And again, you might be thinking, that sounds like what it's supposed to do, right? Well, not really, because think about what a movie projector does, or what a CRT television does. Both of those things will show the frame by basically blasting it onto the screen, like, boom, here's the frame, and then the frame will either fade away or quickly be replaced by black. Like a plasma: if you watch it in slow motion, it blasts color at the screen and then it just fades away. Sometimes plasmas would blast some of the color, then the second part of the color, and both of those will fade until the next frame appears. And a movie projector would show one frame of film, but then there'll be nothing as the next frame slides into view, and then it will blast that frame on. So what it's really showing you is bright light, bright light, bright light, bright light, and in between the bright light there's either total blackness or a fade to black.

And I think what our brain does during these intervals is say, okay, well, there's, like, a train going across the screen: bright light, oh, there's the train, and then there's nothing, or blackness, and then a second picture of the train appears, and now it's moved a little bit to the right. And our brain goes, oh, in between, when I saw that first picture of the train, and then there was blackness, and then I saw the second picture of the train, I guess it must have moved between those two points.

With sample-and-hold on an OLED, where it just shows the train in the first position and holds it there for 1/24th of a second and then immediately shows the train in the new position, what it looks like to us, and you will see this on an OLED television if you have it set up quote-unquote correctly, is stuttery. It looks like it's moving in segments, like chunk, chunk, chunk, chunk. It's like, why doesn't it look smooth? I watch the same movie and the train smoothly moves from left to right, but suddenly when I watch it on my OLED TV, it's stuttering or something. And it's not stuttering. It is, if you have it set up correctly, carefully showing 1/24th of a second, and then 1/24th, and then another. But the thing is, it never goes black between the frames. It instantly changes from frame number one to frame number two, instead of showing frame number one for a tiny fraction of a second, then showing blackness, and then showing the next frame.

So this has been a thing for headsets since they've been rolled out, and one of the innovations of Oculus, you can see John Carmack talking about this and everything, is, like, we need displays that can blast that frame really brightly for a tiny, tiny fraction of a second and then fade to black and do nothing until the next frame is ready, because we want the brain to essentially fill in the blanks. Because if we show the frame the whole time until the next one is ready, even though we can do that with OLED screens, it looks jerky, because your brain doesn't get a chance to fill in any of the intermediary spots. Like, if you think about the train moving: you see the train in position one and the train in position two, but if you black out in between them, your brain will fill in the train in position 1.1, 1.2, 1.3, 1.4. Your brain will fill those in for you. Those frames don't exist, but your brain will fill them in. But if you never do that, your brain will say, the train is in position one, still in position one, still in position one, oh my god, it's in position two, what happened to in between? It is still in position two. And that appears jerky.

This is a big thing with OLED televisions, which is why some people say, I have to turn on motion smoothing, I can't have it set up, quote-unquote, correctly, because it looks wrong to me, because when there's a slow panning shot, I see every one of those 1/24th-second frames and it looks jerky to me.

So that's what I would expect you would see in the headset if that was a problem: Marco would be saying, I turned my head and everything looks jerky.

26:58

But that's not what people are saying. They're saying it looks blurry. Maybe it's the same thing. Maybe it's a misinterpretation. Again, I wish I had known to look for this. I didn't notice either jerkiness or blurriness, but my eyes aren't great, and it was just a half-an-hour demo. So I'm interested to see how this develops. But this story about Apple pushing brightness rings true to me, because things are very bright in there. And obviously, you can get more brightness by holding that image longer and not fading to black between frames, or not fading to black for as long.

27:26

long. I'm curious, John. Because

27:28

when I went from plasma to

27:30

OLED, I noticed this too where

27:32

I remember watching The Office,

27:35

this regular TV show shot, I

27:37

assume on film or whatever. But

27:39

I remember panning shots. I'd noticed

27:41

immediately the difference. Oh, wait, motion

27:43

looks bad on OLED. Everything else

27:45

looked great, but motion looked worse.

27:48

And my OLED, it's probably now

27:50

seven or eight years old, so

27:52

it's nowhere near cutting edge

27:54

now. But I'm wondering, do modern OLEDs

27:56

today, are they better with things

27:58

like black frame and charging? mine was

28:00

one of the first ones that supported it, but

28:02

it just is not fast enough of a TV.

28:04

Like I tried it and it just looked terrible.

28:06

I could almost see the black frames. It was

28:09

not smooth enough. Is it

28:11

better now? So modern TVs are

28:13

So modern TVs are better at black frame insertion than your old one was, but still not good enough that I think you would ever want to use it. Because here's the nature of OLEDs: they change really fast, and I think that makes it harder to do black frame insertion. The way they do it now, to try to make it better, is they double or triple the frame rate so that the black frames are faster. But OLEDs' quote-unquote problem is that they change so fast. Like, with LCDs, there's all these tricks we have to play to make the pixels change from one color to another really fast. With OLEDs, you don't need to play any tricks. They change really fast. So that means they're also changing really fast to blackness, which means you have only a brief time that the light exists, and then immediately it's completely 100% black. And that's why you felt like you could see the blackness: because it's not like it smoothly fades like a CRT, like a dying star. That's not how it works.

So if we crank the frame rate up enough, can it fade to black between each frame? Yeah, it's really fast. But anyway, like I'm saying, can you just insert, like, you know, a faded frame and then another faded frame? And, like, yeah, I mean, there'd be a heck of a frame rate. But yeah, what you're getting at there is alternate solutions, which is basically what they do. But just to finish the black frame insertion: there are ones now that are fast enough that you can't see the flickering, but they do hit the brightness. And for the most part, you don't want to make your TV dimmer, especially OLEDs, right? Because up until recently, OLEDs have not been, you know, people would consider them not bright enough for a very bright room. Now they are very bright, but then their competition is brighter still.
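The brightness hit John mentions follows directly from duty cycle: average luminance scales with the fraction of each frame period the pixel actually spends lit. A quick sketch (the numbers are illustrative, not any specific TV's specs):

```python
def bfi_average_nits(peak_nits: float, lit_fraction: float) -> float:
    """Average brightness of a black-frame-insertion display.

    With BFI, each frame is lit for only `lit_fraction` of the frame
    period and black for the rest, so average luminance drops
    proportionally to the duty cycle."""
    return peak_nits * lit_fraction

# Flashing each frame for half the frame period halves average brightness:
print(bfi_average_nits(800, 0.5))  # 400.0
print(bfi_average_nits(800, 1.0))  # 800.0, i.e. full sample-and-hold
```

That proportional loss is why aggressive BFI and a bright picture are hard to have at the same time.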

29:40

Anyway, what they actually do, the actual solution to the stuttering problem, is that the good televisions have essentially added a form of motion smoothing that is, like, the most delicate form. You don't detect it as a soap opera effect, but it smooths that out just enough for it to not look jumpy to you. So it's basically motion smoothing turned down to the lowest possible setting, lower than you would ever imagine, and that basically cures the problem. What that is doing, instead of making faded frames, is essentially interpolating between frames, but just barely enough to make it work. It doesn't take much. 24 is close. If the motion picture is 24, or even 30 frames, which is probably what The Office was, you don't need much more frame rate above that, so they don't need to fill in a lot for it to just smooth out.

I still watch mine in straight-up 24-frames-per-second mode. I can still see it on slow panning shots in movies sometimes, but it doesn't bother me as much as it bothers some people. But yeah, that's one of the reasons you want a fancy TV: they will have a setting that is, in effect, motion smoothing for people who hate motion smoothing, like turn it up to one, or low, or super low, or whatever. And sometimes they also separate the different aspects of motion smoothing, so you don't have to apply both of them at the same time, or you can set some different values. That is essentially the solution to dealing with 24-frames-per-second content if you are sensitive to it, or 30 frames per second in the case of The Office, I imagine.
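To make the interpolation arithmetic concrete, here's a small sketch of how many in-between frames a panel has to synthesize to carry low-frame-rate content at a higher refresh rate. This is my own illustration of the frame-count math only; real TVs use motion-compensated interpolation, which is far more involved than anything shown here:

```python
def interpolated_frames_per_source(source_fps: int, panel_hz: int) -> int:
    """How many synthetic frames must be inserted after each source frame
    so that source_fps content fills a panel_hz refresh rate evenly."""
    if panel_hz % source_fps != 0:
        raise ValueError("panel refresh must be a multiple of the source rate")
    return panel_hz // source_fps - 1

print(interpolated_frames_per_source(24, 120))  # 4 synthetic frames per real one
print(interpolated_frames_per_source(30, 120))  # 3
```

A "lowest possible setting" smoother nudges those synthetic frames only slightly away from the nearest real frame, which is why it avoids the soap opera look.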

31:04

Speaking of display technology and screens and trade-offs and the Vision Pro, as we were speaking about a minute ago, there's been this great series ever since the Vision Pro was announced. There's a blog called KGOnTech by somebody named Karl Guttag, and Karl goes through and is very knowledgeable about VR and displays and hardware and optics and how this stuff works. It kind of helps show the various trade-offs involved in the Vision Pro's screens and optics, and what it does, why it is limited in certain ways, what the other headsets like the Quest do, sometimes the same way, or certain choices they make differently, and why they choose differently, and what the trade-offs are there. It's very, very interesting.

So Karl had this article the other day about some of the trade-offs around the resolution inside the Vision Pro. There was a shot in the iFixit teardown video, the second iFixit teardown, where they actually show the raw image on the actual little tiny screen panel, and the image looks like it's being viewed through a fisheye lens. It's, you know, very curved; it's warped, with a whole bunch of resolution spent on the middle of it, and towards the edges it warps out. The screens have to kind of warp the image to make up for what the lenses in front of them are going to do as they project the image around, so it looks like a giant view around your eye, to give you that whole immersive effect. Since the image on the screens has that fisheye warping effect, whatever is in the center of the screen has way more resolution than what is in the periphery of each eye.
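A toy way to picture that center-weighted resolution is a pixels-per-degree curve that peaks on-axis and falls toward the lens edge. The falloff shape and every constant below are invented for illustration; Apple has not published the Vision Pro's real optical numbers:

```python
def pixels_per_degree(eccentricity_deg: float,
                      center_ppd: float = 34.0,
                      falloff: float = 0.002) -> float:
    """Toy model: angular resolution is highest where you look straight
    through the lens and decays quadratically toward the edge.
    All constants are illustrative, not measured Vision Pro values."""
    return center_ppd / (1 + falloff * eccentricity_deg ** 2)

print(pixels_per_degree(0))             # 34.0 at the center of the lens
print(round(pixels_per_degree(40), 1))  # 8.1 out toward the periphery
```

The exact curve doesn't matter for the argument; what matters is that it's monotonically falling, so anything you inspect with eye movement alone sits on the cheap end of it.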

32:47

So I think this kind of helps explain, first of all, some of the optical effects that you see when you're using a Vision Pro, but also I think this might be part of my problem with the Mac screen mode. Whatever you're looking at in the Vision Pro when you are looking straight ahead has way more pixels per degree than whatever is on the edge of the screen. Now, if you turn your head, obviously you're going to turn the high-resolution part of the screen towards what you're looking at. But if you are looking towards the corners of the screen only by moving your eyes, and not by turning your head, then what you are looking at has way less resolution than what's in the middle of the screen. And there are also other optical trade-offs that you get when you're looking near the edges.

So, for instance, when you're looking towards the middle, both eyes can see what you're looking at in their respective screens. Their screens don't overlap perfectly. The left-eye screen can see a little bit further on the left, and the right-eye screen can see a little bit further on the right, than their opposite screens can. If you're looking past the overlap area, like if you're looking far to the left, only your left eye might be seeing that in its screen, so that's even less information it's getting. And it's in the lower-res, warped-edge-of-the-screen, optical-trade-off pipeline.

So I think maybe my problem with the Mac screen mode is that you don't have to think about that kind of stuff when you're using a physical monitor. A physical monitor has the same resolution across the whole thing, and you can just move your eyes and not move your whole head, and you will see pretty much the full resolution. I mean, yeah, your eyes aren't super perfect in all ways either, but, you know, I think they have fewer trade-offs than the Vision Pro screens do.

34:33

That's what I said last week: when you move your eyes, your field of view moves with your eyes. So even though your eyes only see things in sharp focus directly in the center, you can move that center. But when you're in the headset and you move that center by moving your eyes, the screens don't care. They don't move. Imagine if that high-res center of the screen followed your eyes as you moved them around, like maybe the screens are motorized and they stay in front of your pupils or something. That's what happens in reality. In reality, you move your eyes, and your field of view is just as janky, probably even jankier than this screen, which is why foveated rendering works as well as it does.
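A foveated renderer exploits exactly that property of the eye by tracking the gaze and spending detail only where it lands. The thresholds below are invented for illustration; this is the general technique, not visionOS's actual policy:

```python
def shading_rate(gaze_eccentricity_deg: float) -> int:
    """Side length of the square pixel block shaded with one sample.

    1 means full resolution at the gaze point; larger blocks mean
    proportionally less GPU work in the periphery, where the eye
    resolves far less detail anyway. Thresholds are made up."""
    if gaze_eccentricity_deg < 5:
        return 1
    if gaze_eccentricity_deg < 20:
        return 2
    return 4

print([shading_rate(e) for e in (0, 10, 30)])  # [1, 2, 4]
```

The renderer gets away with this only because the system knows where your eyes are pointed; a fixed panel with a fixed high-resolution center has no such luxury.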

35:08

But you get to move it wherever you want. And so you can take the dead-center, highest-resolution part of your eyes and point it at the Apple menu. But if you keep your head dead straight and you point your eyes at the Apple menu inside the Vision Pro, yeah, that Apple menu is probably going to look pretty janky, because it's in the corner of the screens, and the screens didn't move when your eyes did.

35:28

I've been trying to figure out, ever since I got the Vision Pro, why is it that everyone else says the Mac screen is sharp and it's not that sharp for me? And I think this might be one of the reasons. Like, I think that the people for whom it's working well, maybe they are just moving their heads more. Because they're multi-monitor people like Casey. Right. It's all coming together. Or maybe they are making the virtual window smaller in the Vision Pro field of view. No, that's not what I'm doing. But maybe, are you turning your head because you're used to having, like, your three 5K monitors, turning your head in real life, so you're used to it? I guess. I mean, I haven't really thought about it that much, but yeah, it stands to reason that's true. But I mean, again, I was using this earlier today, and I thought to myself, what is Marco talking about? I really don't.

They supposedly improved this in the 1.1 beta, like the Mac screen sharing is a little bit sharper for people, so we'll see when that update comes. And just to be clear, I'm not trying to imply that, Marco, you're full of it or lying or anything. It's just so funny to me that this device is so personal, in a way that I don't think any of us have ever really dealt with. It's based on your own body's abilities and physiology, is that what I'm looking for? You know what I mean? Yeah, and habits, like habits like turning your head or not. Right. And so I think I made a big stink about this last episode, and I'll just briefly say again, I'm not trying to imply, Marco, that you're wrong or you're lying or anything like that. It's just so weird to me that your experience does not match mine on what is effectively identical hardware, and it's just a funny quirk of this brave new world we're entering. It turns out our eyes and brains are not identical hardware. Maybe that's the problem.

37:05

It can all be solved by screens that are four times as big with eight times the resolution. No problem, I'm sure we'll be right there. Hey, you know, we got retina Mac monitors eventually. You've just got to be patient. I think there are always going to be certain trade-offs. There are, for example, certain principles of optics that are going to make some of this stuff more difficult, but certainly the higher resolution they can get those screens, the more you can paper over a lot of those problems, or dramatically minimize them. Kind of like on the iPhone screens, where I think most modern iPhone screens essentially use the PenTile sub-pixel pattern, where you don't even get an R, a G, and a B sub-pixel for every quote-unquote pixel on the screen, right? If you zoom in on the screens, they use this pattern where, I forget which one it is, but one of the sub-pixels is shared with neighboring pixels, and you're like, how can that look good? What a garbage screen. They couldn't even give every pixel an RGB? That must look terrible. And the answer is: no, you can't tell, they're so small. Nobody notices, nobody cares, because they're so small.

Yeah. Even, like, I mean, please, designers out there, cover your ears for just this moment. When I draw icons and think about font weights and icon stroke widths for my app, you used to have to think about, all right, you've got to make everything exactly, like, 1.5 points or three points or whatever it was, so it would perfectly line up on pixel boundaries. Integer multiples. Yeah, and so it would look good on retina screens and non-retina screens, and it would perfectly align with everything. And these days, I don't think about that anymore. Once we went to 3x-density iOS screens, which I believe happened, I believe that was all with the iPhone X and forward, I have yet to find any stroke width that I choose to use that looks blurry or bad compared to other ones. So now I just do this manually. I'm like, all right, I want this icon to be semi-bold, I want this one to be medium. I just do that and kind of let the system do what it wants with thicknesses, and they are not always perfect integer multiples, and it turns out it's totally fine, because we have such incredibly high density on those screens now.
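The pixel-boundary arithmetic Marco used to do by hand is easy to sketch. This is my own illustration of the point-to-pixel math, not any framework's API:

```python
def stroke_in_pixels(points: float, scale: int) -> float:
    """Convert a stroke width in points to device pixels at a given
    display scale (2 for classic retina, 3 for newer iPhone screens)."""
    return points * scale

def lands_on_pixel_boundary(points: float, scale: int) -> bool:
    """True when the stroke maps to a whole number of device pixels,
    so it renders without soft, anti-aliased half-pixel edges."""
    px = stroke_in_pixels(points, scale)
    return abs(px - round(px)) < 1e-9

print(lands_on_pixel_boundary(1.5, 2))   # True: exactly 3 device pixels at 2x
print(lands_on_pixel_boundary(1.25, 2))  # False: 2.5 px would look soft
```

At 3x, any misalignment is at most a third of a point of blur, which is small enough on those pixel densities that, as Marco says, you stop noticing.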

38:59

So I think with Vision Pro, I think you're right. Way down the road, I don't think this is coming soon, but way down the road, maybe 10 years from now, when they can double the resolution of the screens or more, I think a lot of these problems will get a lot less noticeable. But until then, these are going to be trade-offs that we live with, and that's just the reality of the technology we have so far.

Tony DiTaranto writes: adding to the categories of ATP listeners with jobs in every profession, I am a professional choral conductor and long-time listener to the show. This is my favorite corner, it really is. I had a funny thought while listening to John describe window management in visionOS in episode 574, specifically about how our eyes normally function as input devices and not output devices. In the course of my work, and in my professional training at music school, I've learned an important thing that most people who aren't conductors don't think about. When giving cues and conveying information, it's important what you do with your arms and hands, but it's arguably more important what you do with your eyes. Eye contact and directing your gaze is an even stronger form of communication and control than arm and hand gestures. From when I first studied conducting technique in college, I had to train my gaze as much, if not more, than my gesture to achieve the desired result from the group. I just wanted to provide this as an example of how, although I'm no Superman, I feel like I use my eyes as an output device for my work.

I always imagined conductors just see all the people in the chorus or the orchestra as tiny instruments to be controlled by their eyes. Well, there was a line from the movie, wasn't it? They play the music, I play the orchestra, or something like that. No, spoilers for Steve Jobs. Come on. ATP.fm/join.

40:39

David Shob writes: what frustrated me is that, oh, this is with regard to Fitts's law, I'm sorry. Whoever put this in: not enough context, John. Well, you know, you've got to read ahead sometimes. Anyway, David Shob writes, with regard to Fitts's law: what frustrated me is that dropping files on the macOS Dock breaks Fitts's law. If you drag a file to the Dock, you can drag it past the icons, which, according to David Shob, is ridiculous.

It is ridiculous. It's a long-standing thing. It's kind of like how, when you put a folder alias in the Dock, you can't drag things into it. It's one of those things that could be fixed in numerous ways. For example, you could just disallow dragging folder aliases into the Dock if you're not going to support it. But it hasn't been, because apparently nobody cares. But yeah, dragging things to the Dock: we get away with it because, in the grand scheme of things, the Dock is pretty big on most people's monitors. I bet most people don't have their Dock smaller than the menu bar. But yeah, try it. If you have a folder in the Dock, take a file, try to drag it into the folder, and go past the folder all the way to the screen edge, and you'll notice the folder is like, nope, nothing's on top of me anymore. It's crappy. I might have even filed a feedback on it years ago. Maybe I'll file one again. I'm sure other people have, but I don't think Apple cares. But it's sad. They should.
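For reference, Fitts's law models pointing time with an index of difficulty that grows with travel distance and shrinks with target size. A target backed by the screen edge acts as if it were much deeper, which is exactly the advantage the Dock's drag-past behavior throws away. A quick sketch of the standard formulation:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's law: ID = log2(D/W + 1).

    `distance` is how far the pointer travels; `width` is the target's
    size along the travel axis. Lower ID means a faster, easier hit."""
    return math.log2(distance / width + 1)

print(index_of_difficulty(700, 100))  # 3.0 bits for a 100 px target
# Letting the screen edge act as extra target depth makes the same drag easier:
print(round(index_of_difficulty(700, 400), 2))  # ~1.46 bits
```

When the Dock accepts a drop even if you overshoot to the edge, the effective width becomes huge and the ID drops; when it doesn't, you're back to precisely hitting the icon, which is David's complaint.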

41:48

Apple, this is wildly unrelated: Apple extends their modem licensing deal with Qualcomm through March of 2027. So to recap, Apple bought, what was it, Intel's modem business a few years back. Cell modem business. Right. And they said, without saying it, oh, we're going to make our own modem so we don't have to continue to pay Qualcomm. And then they realized, oh, this is harder than we thought, and they continued to pay Qualcomm. And now they've apparently realized, no, it's still harder than we thought, and so they are extending their deal even further. Reading from MacRumors: Apple has extended its modem chip licensing agreement with Qualcomm through March 2027, Qualcomm said today during its first earnings call of 2024. Apple's existing agreement has now been extended for two years, so we can expect to see Qualcomm modems in the next several iPhone generations. Yeah. Great for Apple.

42:34

I still think they really should make that modem, and maybe think about integrating it into the SoC or near the SoC. But first, they've got to make one that works. Good luck. Keep in mind also, just because they have a licensing deal with Qualcomm through 2027 doesn't necessarily mean that Apple is not going to ship its own modems before then. It probably means that. But this could be Apple saying it is going to need Qualcomm's chips through March 2027 for some parts of their lineup. So it's possible that, you know, next year one of the iPhones gets it, or some variant of it, whether it's the iPhone SE or –

Yeah, but they're going to want to start this slow, I think. It's not going to debut in the pro phone, probably. I think they're probably going to be wary and put it out in the SE first or something. I don't know what they're going to do, but given how hard it's been, I don't think this is a bet-your-flagship-product-on-it situation. I don't think they can make enough of them at that scale. I don't know, I guess TSMC would make them for them or whatever. But yeah, I think given how this has gone, a gradual rollout of Apple's modems is a good idea. Remember, last time they used Intel modems in their phones, and, what was it, you could either get the Intel modem or the Qualcomm one, and nobody wanted the Intel one because it wasn't great. And that was kind of on their flagship phone. Yep. I had the Intel one. That was the AT&T iPhone 7. Yeah. Not a good one. It's tough. So I hope Apple does take this slow. Keep extending that deal, lock in whatever deal you have to get with Qualcomm. I know these two companies hate each other, but you kind of need cell modems for cell phones. Indeed.

44:08

Going back many episodes now, probably five or ten episodes, we were talking a lot about patents, and an anonymous person wrote in to say: Apple employees are heavily incentivized to file patents. It's one of the few ways to make additional income at Apple outside of your normal job responsibilities. Apple has an entire department and an online portal to streamline the process. I can't find the exact numbers right now, but I believe employees receive at least $1,000 if a patent is accepted and approved. The caveat, of course, is that Apple owns all of the intellectual rights to the invention.

I don't mean to sneeze at $1,000. $1,000 is a lot of money, but, like... That seems chintzy, right? Yeah. For Apple, that does seem a little... It seems lower than I would have guessed. Well, that's Apple's M.O. The whole rep is that they don't pay as much as their competitors, because everyone should just be happy to be working there, because it's Apple. But yeah, $1,000 is like, why bother? Give me one share of stock.

44:58

We are sponsored this episode by Celtrios. This is a shmup, a shoot-em-up game. They actually sponsored us a while ago, and they're back with even more updates. So here's the idea. It's a shoot-em-up: tiny ship, big weapons, waves of foes and power-ups and obstacles, and all of this is brought to you with no ads, no in-app purchases, no subscription, and nothing locked. You can play Celtrios over and over again as long as you want, with lots of customization options to keep it fresh. And Celtrios is available exclusively for Apple platforms: Mac, iPhone, iPad, and Apple TV.

So Celtrios has 13 different stages. You can start from whichever one you want, and you can also, of course, resume play later if you quit the app. You have a million different possibilities when you're configuring your ship: dozens of abilities, and then, of course, randomizing options, full-screen options, a huge number of ways to play, both mechanically and also just in how your ship is configured. It's great. Celtrios also has a huge, high-quality soundtrack with over 40 minutes of music, and Celtrios keeps expanding. There have been over 75 free updates to it so far, and the entire thing was made by an independent developer, so of course this is right in my heart, you know. Celtrios supports one or two players at a time. On the Mac you can use various input methods; on iOS, the second player uses a game controller, but hey, it works. So it's just a great shoot-em-up, or shmup, kind of game. I love this genre of games. It's so fun. It's like the old arcade games taken to the extreme with all the modern capabilities. It's wonderful.

So get Celtrios if you love traditional shmups, or if you just want a quality game that's fun to play again and again with none of the usual annoyances of other modern games. On the Mac, Celtrios is available on Steam with a free demo, or head over to the Apple App Store to get Celtrios for iOS and tvOS. Thank you so much to Celtrios for sponsoring our show once again.

46:55

Simon writes in regards

46:57

to the magnetically attached Apple

46:59

Watch bands that are rumors magnets

47:01

screw with compasses. Apple

47:03

watches have compasses ergo Apple is unlikely

47:06

to add magnetic attachments for Apple Watch

47:08

bands so I mean

47:10

I'm not an expert in magnets but that makes

47:12

sense to me. If

47:15

you're waiting for extremely powerful magnets

47:17

to be connecting your Apple Watch

47:19

band maybe that won't

47:21

work well with the compass feature of

47:23

the watch. Time will tell. I think

47:26

working around magnetic strength limitations is

47:28

not that hard because again I'm

47:30

not a scientist but I'm pretty

47:33

sure magnetic strength falls off dramatically

47:35

with distance and if

47:37

you have fixed magnets inside the Apple Watch

47:39

that are always in the same position I

47:42

would expect it would not be super

47:44

hard to just calibrate that out from

47:47

the sensor. Third-party watch bands might throw a

47:49

monkey wrench into that. Well it depends like where are

47:51

the magnets? Are the magnets in the body or are

47:53

they in the band? I would imagine that they're in

47:55

both places kind of like the MagSafe cases you know

47:58

like the ones that work well the cases also have

48:00

magnets in them and third party ones tend to be stronger

48:02

than Apple's in my experience. Maybe. But

48:05

I think if they're in the watch body,

48:07

which I mean, actually, does the

48:09

charging disc on the bottom, are there

48:11

any magnets on the watch side of

48:13

that? There probably are I would imagine.

48:15

There's electricity, so there's also magnetism. That's

48:17

complicated. Yeah. Anyway, so

48:20

I think they could design around this. I don't think it

48:22

would be that big of a deal. We'll

48:25

see. This is still just a rumor. I

48:27

haven't seen anything concrete on the watch

48:29

strap thing since we talked about it

48:31

except for vague notions of watch strap

48:33

magnets. Okay. Well, so give me a drawing. Give

48:36

me something. We don't have anything yet. Fair

48:38

enough. All right. And then finally, for follow

48:40

up this week, more on the line or letter

48:42

of credit from Brian Coffee. As

48:44

someone who also works in commercial banking, specifically problem

48:47

commercial loans, I need to clarify the comments on

48:49

the requirements to get a letter of credit. Drawing

48:52

a letter of credit. Yes, Apple could draw on the

48:54

letter of credit, but in the agreement, they must have

48:56

a good and valid reason. The typical use for a

48:58

letter of credit is for international shipping where you ship

49:01

the goods before getting paid. The letter of credit ensures

49:03

the overseas recipient won't stiff you. The shipper who got

49:05

stiffed must go to their bank to work with the

49:07

shippee's bank to claim that the shipper never

49:10

got paid. And with regard to getting

49:12

a letter of credit, a letter of credit is a credit product

49:14

of the bank. Once the letter

49:16

of credit is drawn upon, it instantly becomes a loan.

49:18

A company can get an unsecured letter of credit if

49:20

they're producing enough cash flow to show they're quote unquote

49:23

good for the money. If the letter

49:25

of credit ever gets drawn upon, if a potential

49:27

borrower doesn't have strong enough cash flows to support

49:29

this, there are alternatives. Smallish businesses could put up

49:31

their homes or retirement accounts as collateral. You could

49:33

also be 50% secured or really any percent secured

49:36

depending on the strength of your cash flows. All

49:38

this is to say, you don't absolutely have to

49:40

put a million in cash in the bank, but

49:42

you do have to show a bank that you

49:44

could reasonably come up with a million if

49:46

you had to and prove it. So

49:49

it's not exactly handing a bank a million bucks

49:51

or a million euros, but it's not that far

49:53

away either. Yeah, I mean, like any bank

49:56

thing, if you can convince some bank to do something for

49:58

you, then fine. But yeah, I feel like the bank

50:00

basically wants to know if you have

50:02

something that we can get a million dollars from

50:05

even if we need to like repossess your home

50:07

or whatever. So it's

50:09

not as bad as you must have a million dollars in

50:11

cash but I feel like you still kind of have to

50:13

have some way to get a million dollars for a bank

50:15

to agree to this because that's kind of the whole deal.

50:17

Unless you have a really friendly bank and it's like we

50:19

like your face. We think it's fine.

50:21

Yeah, we'll front you a million bucks because

50:24

we like your face. That sounds reasonable. Yeah.

50:26

All right, let's move on to some topics

50:28

and of course we have a little bit more

50:30

vision pro to talk about and John, you seem to be

50:33

very enthusiastic to talk

50:35

about personas, baby. So what you got?

50:37

It's kind of like when we talked

50:39

about eyesight last week. The

50:41

reason this feature exists is obvious and I

50:44

think is not really going to go away.

50:46

So personas are the little fake computer people

50:48

that you use to represent yourself when you're

50:50

on a FaceTime call and the reason for

50:52

them is obvious. Being on

50:54

a video call in a Zoom meeting for

50:56

your work in a FaceTime or whatever, it's

50:59

so common today. It's a very common

51:01

part of using computers. But if you got weird ski goggles

51:03

strapped to your face, how

51:05

do you get yourself into a video

51:07

call? Do you let your Mac's webcam

51:09

show your weird ski goggly face? I

51:12

think people would find that off-putting even

51:14

with the creepy eyesight things on it.

51:16

So Apple's solution is, hey, we'll

51:19

make a little computer version of you and when

51:21

you're in your FaceTime call or your Zoom meeting,

51:23

the little computer puppet of you will talk. And

51:26

that problem is not going

51:28

to go away until we're actually wearing

51:31

glasses that just look like regular glasses, which

51:33

is many, many, many years in the future

51:36

if it ever comes in any form. So

51:40

you're going to need some way to show

51:42

your face in these meetings. I guess the

51:44

other solution is, oh, we're having a Zoom

51:46

meeting. Why isn't your video turned on? Oh,

51:48

I'm wearing a headset so you can't see

51:50

me. I'm not sure

51:52

that's necessarily going to fly. And you wouldn't

51:54

want it to because there's a lot of bandwidth, a

51:56

lot of communication that happens from your

51:58

facial expressions. The solution

52:00

to this is these weird creepy computer

52:04

models and as

52:07

weird as they are I think

52:10

they're gonna Apple is gonna keep plugging away at

52:12

this as you know they've taken a lot of

52:14

flack for this during the rollout because they do

52:16

look kind of creepy and scary I

52:20

don't think Apple is going to be

52:22

scared away nor should they because I think

52:25

you know video conferencing is not going away

52:27

and it's going to

52:29

be a long time before these things on our face

52:31

don't look like ski goggles and kind of hide everything

52:34

about our most things about our face so I

52:36

think they just need to keep plugging away at it, as embarrassing as

52:38

it is now. And as for how

52:41

embarrassing it is like we like last episode

52:43

I think we when we tooted about it

52:45

we showed Casey's persona and like the the

52:47

graphic for that episode they

52:49

look silly, but I think you

52:52

have more of an

52:54

appreciation for them if you have

52:56

any experience with this type of thing before and most

52:58

of my experience with this type of thing comes

53:01

from video games and similar tech where they

53:03

would you know put your face in

53:05

the game make a skin for your player character or

53:07

whatever and the things that have

53:10

been built into games have been so much worse

53:12

than what Apple did. They really

53:14

are one of the best instances I've

53:16

ever seen of this particular technology and

53:19

we'll put a link in the show notes

53:21

to someone demonstrating this the thing I think

53:23

that's most impressive about Apple's stuff, well, two

53:25

things one that they're able to make something that looks

53:27

as good as it does. I know, you hearing

53:29

that, you're like, what do you mean? They look terrible, they look like a

53:31

death mask. It

53:35

could be so much worse, right? And yes, they all

53:37

blur the edges to hide their sins or whatever but

53:39

they do an amazing job with just a

53:41

face scan that you can do not a professional scan

53:44

like go look at things where they have like a

53:46

famous Hollywood actor in a video game and they have

53:48

that person go into a full motion

53:50

capture studio and have them stand there with like

53:52

balls on their shoulders and lasers shooting them

53:55

from a million angles, like an incredibly controlled

53:57

environment they spend the entire day getting

53:59

their face scanned and they put them in the

54:01

game and they look awful. And this is, oh just

54:03

hold the ski goggles in front of your face for

54:05

two seconds, turn to the side, and Apple's doing

54:07

a better job. So yeah that's impressive but the second

54:09

thing is it's demoed in this video how

54:12

well it tracks the

54:14

expressions you're making with your face.

54:17

So you're wearing ski goggles and you're raising

54:19

your eyebrows and you're blinking and you're twisting

54:21

your mouth and you're sticking out your tongue

54:23

and you're smiling and you're frowning and you're

54:25

furrowing your brow and somehow

54:28

Apple's able to detect all those things. Some

54:30

of those things are happening inside the headset,

54:32

some of those things are happening outside the

54:34

headset. It is phenomenal what they do.

54:36

Now does the little puppet look like you when

54:39

it's doing that? You

54:41

know it's got fake, everything's got the same perfect

54:43

fake teeth and everyone's got the same weird artificial

54:45

tongue and it's not you know

54:47

if you can make the W shape with your tongue the

54:49

avatar is not going to right but like it

54:52

does a really good I think it does

54:54

a good enough job kind of like a

54:56

really well articulated puppet would do of

54:59

letting the people who are on the zoom

55:01

call with you or whatever know what expression

55:03

you're making. Are you you know are you

55:06

happy about that? Are you skeptical? Are you

55:08

angry? Are you not paying attention? Like I

55:10

feel like they do an amazing job of

55:12

matching the movements

55:15

of the various parts of your face and reflecting

55:17

those. No it's not perfect it's not capturing all

55:19

the subtleties of your acting performance it might not

55:21

even really look like you but kind

55:24

of like I was you know talking about the cartoon eyes

55:26

sometimes just sort of a not

55:30

a cartoon eyes but like a decomposed

55:33

less granular version of you captures a lot

55:35

of it. It's the reason animation can look

55:38

so good. Animation doesn't look photorealistic but the

55:40

right lines in the right places can be

55:42

very expressive in animation and I feel like

55:44

that's what Apple's going for. This isn't exactly

55:47

you but if they catch

55:49

enough that they put the right lines in

55:51

the right places they can convey most of

55:53

the information your face is expressing and again

55:55

I'm really impressed that some of that expression

55:58

is underneath the goggles and

56:00

some of it is outside and they put it

56:02

back together into a cohesive whole. So I'm

56:05

not, I don't relish seeing these personas talking

56:07

to me, but I think this

56:10

is just one of those hard problems that Apple and

56:12

anybody who wants to do what Apple is doing has

56:14

to be resigned to tackling over the next several decades

56:17

because it's not going to go away again until we

56:19

just get plain old glasses. Until then, people are going

56:21

to want to be in meetings and people are going to

56:23

want to see their faces and they're going to want

56:25

to be able to use their faces again, as we talked

56:27

about with our choral conductor, as an output

56:29

device because that's part of the way we communicate

56:31

with other people. We want to be able to

56:33

scowl at someone meaningfully and have that, have an

56:35

effect on them in the meeting. Yeah.

56:38

And I want to build on what

56:40

you were saying earlier, like as a

56:42

technical achievement, it is

56:44

stunning how good these

56:46

are at, at expressing

56:49

what your face is expressing. Cause remember, you've

56:51

got cameras on the inside that are figuring

56:53

out when you're blinking because the persona reflects

56:55

that, that figure out when you're raising your

56:57

eyebrows because the persona reflects that. I

57:00

don't recall if it reflects where your eyes are

57:02

looking, but I think it does. Yeah, it does

57:04

pretty sure. And it's certainly, you can turn your

57:06

face left and right and up and down. And

57:09

then your smile, like that's happening outside the

57:11

device and granted there are cameras, you know,

57:13

pretty much everywhere on this thing, but still

57:15

there's, you know, it's outside the device that

57:18

you're smiling or sticking your tongue out like you had said.

57:20

And whether or not like leaving

57:23

aside the creepiness factor, the uncanny valley, which again, just

57:25

like I was saying in the beginning of the show,

57:28

like that's a big thing to

57:30

just push under the carpet. But leaving

57:32

that aside, the technical achievement is really

57:34

just phenomenal and stunning how good, how

57:36

good it is. And, you know, I

57:38

had a, I do a

57:41

monthly FaceTime call with James Thompson. We had one

57:43

this morning and because we're both idiots, we jumped on

57:45

the call, unbeknownst to either of us, but we, I

57:47

think we both kind of assumed it. You know, we

57:49

jumped on the call, you know, started off the call

57:51

in our Vision Pros with

57:53

our personas and so on and so forth

57:55

and had a good laugh about it.

57:57

And, you know, chuckled about how ridiculous we both look.

58:00

And then we hung up and got on the computer

58:02

like we usually do. But I mean, I think if

58:04

we had stuck with the personas, it would have been

58:06

awkward for a few minutes and this is a very

58:08

common refrain from people who've done it. Um, you know,

58:10

it would have been awkward for a few minutes and

58:12

then it would have felt pretty normal all in all.

58:14

And, and I think that's how

58:16

it typically ends up. Like I'm not saying

58:18

it's not weird. I'm not even necessarily saying

58:21

it's not creepy, but I

58:23

don't know. It, it, you settle into

58:25

it. And again, as a technical exercise,

58:27

it is beyond compare. Did

58:29

you look at the Charlie Chapman video, by the way, the

58:32

one I was referring to, did you have a chance to

58:34

look at it? I think I skimmed through

58:36

it super quickly if memory serves. Some, some of

58:38

the things he does, for example, are like puffing

58:40

air, uh, underneath his,

58:42

like, uh, into his, uh, behind his

58:44

lips and into his cheeks. And

58:47

also talking out of like the side of his mouth. Like

58:49

that's what I'm talking about. Where it's not just like

58:51

a puppet where it's like, I can tell when you're

58:53

opening your mouth and you said a T sound, so

58:56

I'll make it look like you're making a T sound

58:58

with your tongue. Like you can do weird stuff with

59:00

your face that is not normal. Like, like, you know,

59:02

puffing your lips up and, and, you know, talking weird,

59:04

like, and it's, I mean, is it tracking it exactly?

59:06

No, but they're accounting for the fact that you might

59:08

do that and they're mapping it to whatever their little

59:11

puppet model is of your face going

59:13

far beyond just simply making it so that your mouth

59:15

moves when you talk. Please, everyone should definitely look at

59:17

this video. I mean, you know, and again, ignore the

59:19

fact that his whole head is all fuzzed out and

59:21

he looks like a weird death mask

59:23

of himself. I

59:26

mean, I think you are

59:28

correct about how incredibly impressive

59:30

this accomplishment is, but it's

59:33

still not good enough. And I think this is

59:35

kind of, this is largely the story of Vision

59:37

Pro in general right now, most

59:39

products, like most new groundbreaking

59:41

products, you tend to have

59:44

mostly like routine, you know, stuff

59:46

that has been done before plus

59:49

like one or two big new challenges.

59:52

That's not the Vision Pro. The Vision Pro

59:54

is a very small number of

59:56

things that have been done before. Like here's an

59:58

M2-based, you know, iPad-based OS in

1:00:00

a computing environment with windows and stuff and

1:00:04

then they tackled ten different

1:00:06

massive challenges at least, and

1:00:09

they have achieved remarkable things

1:00:11

they're way ahead of the industry

1:00:14

in so many ways with the vision Pro But

1:00:17

the problem is they're not

1:00:19

selling the vision Pro and pitching the

1:00:21

vision Pro to people in the VR

1:00:23

industry They're not selling and

1:00:25

pitching the use of personas to

1:00:28

only gamers They've

1:00:30

tackled these massive problems and

1:00:32

they've done very respectable jobs

1:00:34

in their solutions to them

1:00:37

but the problem is they're still

1:00:39

not where most

1:00:41

people want them to be and they probably have

1:00:43

some kind of like resentment whenever people

1:00:45

criticize it on some level Like I bet there's

1:00:48

people in Apple who were like, how

1:00:50

can you criticize the personas because look at how

1:00:52

amazing they are compared to the state of the

1:00:54

art and that's true

1:00:57

But when you're presenting them as an

1:00:59

alternative to people, then

1:01:01

people are gonna hold them to a much

1:01:03

higher standard. That's why they took so much flack for the

1:01:05

Studio Display camera being so cruddy. Yeah,

1:01:08

like like, you know the

1:01:10

vision Pro in in many ways it's pitching

1:01:12

itself as like an

1:01:14

alternative to various aspects

1:01:17

of reality that people are

1:01:19

very good at noticing the differences between the real

1:01:21

stuff and the fake stuff. And they

1:01:24

have achieved remarkable stuff and yet it is

1:01:26

still not good enough for what most people

1:01:28

expect as like the basics. Oh, you're gonna

1:01:31

show a virtual version of me. Okay, it

1:01:33

needs to look like me like it needs to be a stand-in

1:01:36

replacement. And it's sort

1:01:38

of, you know, it's in the ballpark for a lot of

1:01:40

people but it's not you know a stand-in replacement and in

1:01:42

many cases it looks creepy and weird and you know still

1:01:45

even with 1.1. So

1:01:47

I think this is gonna be you

1:01:49

know, just part of the Vision Pro's uphill battle over

1:01:51

the coming years is, like, even

1:01:54

though they have achieved remarkable things. They still

1:01:56

need to push it even further than what

1:01:58

they've already done to

1:02:01

match what most people's expectations are

1:02:03

who are not VR

1:02:05

industry pros. Yeah, I

1:02:07

think they picked mostly the right challenges because there are so

1:02:09

many different challenges they could have chosen and you might look

1:02:11

at this, I think a lot of people have been asking,

1:02:14

why did Apple even try to do this? And the answer

1:02:16

is because this is one of the problems they eventually need

1:02:18

to solve. And if you don't start working at it now,

1:02:20

don't expect you're just gonna snap your fingers sometime 10 years

1:02:22

from now, it'll be perfect. You gotta make the janky version

1:02:24

first, right? So they do it, but

1:02:27

they can't ignore this problem. Like they can't

1:02:29

ignore it unless they see a time horizon

1:02:31

of just wearing like clear glasses instead of

1:02:33

these goggle things. Otherwise, this is just gonna

1:02:35

be out there as an issue and so

1:02:37

they better start plugging away at it.

1:02:39

And I bet this was like a big time sink

1:02:41

and a big, you know, like a lot of technology

1:02:43

went into this and, to your point, Marco, a lot

1:02:45

of technology and time went into it, into a feature

1:02:47

that we know regular people are gonna look at and

1:02:49

go, ugh. I

1:02:52

look at it and go, ugh. But it's like, you

1:02:54

just got it, like this is a problem that has

1:02:56

to be solved. If you wanna make this product, there

1:02:59

are a small number of problems that you basically just have to

1:03:01

solve and you're not gonna be able to do a great job

1:03:03

on them, even if you do better than anyone else has ever

1:03:05

done before, but you better start cracking on it because next

1:03:07

year you make a better version, next year, like they

1:03:09

can't ignore this one. They can't, like even more so

1:03:11

than eyesight. Eyesight where you can say, okay, we'll make

1:03:14

the cheaper one without eyesight or we'll use cartoon eyes

1:03:16

or whatever. Like you could maybe sweep that one under

1:03:18

the covers, but people are on video

1:03:21

calls all the time and I don't think it's

1:03:23

acceptable to them to say either

1:03:25

you just can't show your face or

1:03:27

your face is gonna have these giant ski goggles on

1:03:29

it. And all of those are less acceptable than even

1:03:31

this janky thing. So they just have to be like,

1:03:33

okay, we're gonna put ourselves out there

1:03:35

and we're gonna say we gave it our best shot because

1:03:37

we recognize this is a problem that we need

1:03:40

to solve and next year we hope we'll do

1:03:42

better. And it's gonna be a while, but like

1:03:44

I said, it is impressive

1:03:46

from a technical perspective what they've achieved and I

1:03:48

think they struck a reasonable balance in the beginning.

1:03:50

People are like, why didn't you just use Memojis? I think

1:03:53

Memojis would be worse. I know they

1:03:55

have Memojis, I know this whole SceneKit versus

1:03:57

RealityKit political internal API thing that they have

1:03:59

going on there. But setting that aside, Memojis

1:04:01

are not as expressive as personas are. Again,

1:04:03

watch the video we'll link in the show.

1:04:05

It's of Charlie Chapman showing you different facial

1:04:07

expressions. Memojis are cartoons that

1:04:10

are not, what I was

1:04:12

using for the example of animation where you draw just

1:04:14

the right lines to be expressive. Memojis are not that.

1:04:16

Memojis are bad, rigid-headed,

1:04:18

Chuck E. Cheese pizza-time-band,

1:04:21

whatever things. Not to, you know, slag on the

1:04:23

Memoji team. What they did is amazing too. Memojis

1:04:25

had to walk so personas could walk

1:04:28

a little faster. But personas are

1:04:30

better able to communicate you across

1:04:37

a computer while you're wearing the goggles

1:04:39

than Memojis would be. And what

1:04:43

people want is personas, but not bad. Right?

1:04:46

And so you got, I guess you have to start, you know, you

1:04:48

could either just never ship or you

1:04:50

could ship what you have, which is personas, which

1:04:53

are, you know, better than Memojis, but still not

1:04:55

good enough. So I give Apple

1:04:57

an E for effort here. And

1:04:59

if they have to choose where to add the

1:05:02

resources to pursue better things, I

1:05:04

would say, put more resources

1:05:06

into personas than eyesight

1:05:09

going forward. Yeah,

1:05:12

I don't know. I stand by eyesight

1:05:14

as well, and we don't need to

1:05:16

belabor this, but I get why people

1:05:19

are turned off by both personas and

1:05:21

eyesight. But I genuinely think this

1:05:23

would be a far worse

1:05:25

product without both of those

1:05:27

components. We

1:05:30

are brought to you this episode by Squarespace,

1:05:32

the all-in-one website platform for entrepreneurs to stand

1:05:34

out and succeed online. Whether you're just starting

1:05:37

out or managing a growing brand, Squarespace makes

1:05:39

it easy to create a beautiful website, engage

1:05:41

with your audience, and sell anything from your

1:05:43

products to your content to your time, all

1:05:45

in one place and all on your terms.

1:05:47

Let me tell you how great it is

1:05:50

building a website with Squarespace. I've told you

1:05:52

over the years how whenever I

1:05:54

or someone in my life needed a new

1:05:56

website for something, especially businesses, I would always

1:05:58

point people to Squarespace first. And they would,

1:06:00

every time, they would try it and they would

1:06:03

realize, oh, this is, I'm done, this is all

1:06:05

I need. This just happened again. Friend of mine

1:06:07

runs a, runs a small business and she had

1:06:09

some very, very high quote for some custom work.

1:06:11

And I said, hey, why don't you try it

1:06:14

on Squarespace first? And she didn't know about this.

1:06:16

And next time I saw her, she was like,

1:06:18

oh my God, you just saved me thousands of

1:06:20

dollars because she was able to go, and she's

1:06:22

not technical, but she was able to go to

1:06:25

Squarespace and build her own site without me telling

1:06:27

her how to do it as the nerd, like

1:06:29

you literally just point people to Squarespace. And

1:06:31

it's so easy. Even non-tech people have no

1:06:33

problem figuring it out and then doing it

1:06:36

themselves. So you're actually not only saving yourself

1:06:38

as the nerd, you're saving yourself work and,

1:06:40

you know, time that you could be, that

1:06:43

you might think they would need your help, but

1:06:45

then you're empowering them to build their own site.

1:06:47

And every feature, she kept, she was, she was

1:06:49

asking me like, oh, can you change this? Yes.

1:06:52

Can you change the template, put in your own

1:06:54

stuff? Yes. Does it support buying things, scheduling things?

1:06:56

Yes. All of those, does it support image galleries?

1:06:58

Yes, of course. Like so much on Squarespace

1:07:00

is built right in and it's so

1:07:02

easy. Anybody can do it. I strongly

1:07:04

suggest you check out Squarespace and if

1:07:06

people in your life need websites, point

1:07:08

them to Squarespace too. Go to squarespace.com

1:07:10

to start a free trial. When

1:07:13

you're ready to launch, go to squarespace.com/ATP,

1:07:15

and you will save 10% off

1:07:17

your first purchase of a website or domain.

1:07:20

So once again, squarespace.com free trial. When

1:07:22

you're ready to launch, use squarespace.com/ATP for

1:07:24

10% off your first purchase.

1:07:26

Thank you so much to Squarespace for

1:07:28

sponsoring our show. There's

1:07:34

news today that Apple

1:07:37

is already defending iMessage

1:07:39

against tomorrow's quantum computing

1:07:41

attacks. I'm sorry, what?

1:07:44

So Apple and security professionals

1:07:46

have probably known about this for a long time,

1:07:48

but I don't often have to think about this,

1:07:50

but one of the things that security professionals are

1:07:52

thinking about is, hey, there

1:07:55

will one day be a quantum computer

1:07:57

and what happens if you, you know,

1:08:00

somehow, probably, nefariously capture a bunch

1:08:02

of encrypted traffic today, but

1:08:04

what if you save it off for years

1:08:06

and years and years? And

1:08:09

eventually, you finally get

1:08:11

your hands or build that quantum

1:08:13

computer, and you can go back

1:08:15

to all of that years and

1:08:17

years and years of data that

1:08:19

you captured with our comparatively weak

1:08:22

encryption that we use here in 2024. But,

1:08:25

well, suddenly your quantum computer can

1:08:27

just decrypt all of that, right?

1:08:29

That's how it's going to work. And nobody really

1:08:31

knows if that's true or not. It certainly stands

1:08:33

to reason that it's true. And

1:08:36

Apple is already starting the process

1:08:38

of defending iMessage against tomorrow's quantum

1:08:40

computing attacks. So they had a

1:08:42

blog post about this, which I'll link in the show notes,

1:08:44

and The Verge covered it, and I will read from The

1:08:46

Verge a little bit. Apple's security team

1:08:49

claims to have achieved a breakthrough that, quote,

1:08:51

advances the state of the art of end-to-end

1:08:53

messaging, end quote. With the upcoming

1:08:55

release of iOS 17.4 and macOS 14.4 and the

1:08:57

equivalent iPadOS and watchOS, the

1:09:01

company is bringing a new cryptographic

1:09:04

protocol called PQ3 to iMessage that

1:09:06

purports to offer even more robust

1:09:08

encryption and defenses against sophisticated quantum

1:09:10

computing attacks. So now from

1:09:12

Apple's blog post, today we are announcing

1:09:14

the most significant cryptographic security upgrade in

1:09:16

iMessage history with the introduction of PQ3,

1:09:18

a groundbreaking post-quantum cryptographic protocol that advances

1:09:20

the state of the art of end-to-end

1:09:22

secure messaging. PQ3 is the first messaging

1:09:25

protocol to reach what we like to

1:09:27

call Level 3 security, which I'll explain in

1:09:29

a second, providing protocol protections that surpass those

1:09:31

in all the other widely deployed messaging apps.

1:09:33

To our knowledge, PQ3 has the strongest security

1:09:36

properties of any at-scale messaging protocol in the

1:09:38

world. PQ3 employs a

1:09:40

hybrid design that combines elliptic

1:09:42

curve cryptography with post-quantum encryption,

1:09:44

both during the initial key

1:09:47

establishment and during rekeying. Thus,

1:09:49

the new cryptography is purely additive,

1:09:52

and defeating PQ3 security requires defeating

1:09:54

both the existing classical ECC

1:09:56

cryptography and the new post-quantum

1:09:58

primitives. That bit

1:10:00

was interesting because basically

1:10:03

they're covering their butts to say, look, what if

1:10:05

we screwed this up? What if we came up

1:10:07

with this new, uh, you know, quantum, uh, you

1:10:09

know, post quantum encryption, but we are, we have

1:10:11

a bug in the implementation. We're just rolling this

1:10:13

out and oops, we got something wrong and there's

1:10:15

some kind of bug or buffer overflow or whatever,

1:10:18

uh, making this additive to the

1:10:20

existing encryption, hopefully makes it

1:10:22

so that if they really screwed this up

1:10:25

and oh, it's trivially easy to crack this post quantum

1:10:27

encryption because of a bug in Apple's software. Well, once

1:10:29

you crack it, what you're left with is

1:10:32

the stuff that was encrypted the way it's currently

1:10:34

encrypted now. Like, it's layered on top

1:10:36

of it is my understanding, which I think is

1:10:38

really smart thing to do to sort of,

1:10:40

you know, we all kind of wish we could do

1:10:42

this. Like you kind of want to, I want to have

1:10:45

the old way there as a fallback, even if you totally

1:10:47

screw up the new way. I'm not entirely sure if that's,

1:10:49

that's true, but my reading this paragraph makes me think that

1:10:51

it might be, and it's a clever idea, which is like

1:10:54

belt and suspenders. Uh, we're not getting rid

1:10:56

of the old encryption. We're just encrypting it

1:10:58

one more time, even better. Yep.
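(The "purely additive" hybrid design described here can be illustrated with a toy key-combination sketch: derive the message key from both the classical and the post-quantum shared secrets, so an attacker has to recover both. This is an illustration of the general idea only, with placeholder secrets, not Apple's actual PQ3 construction:)

```python
# Toy illustration of hybrid key derivation: the final key depends on
# BOTH a classical (e.g. ECDH) shared secret and a post-quantum KEM
# shared secret, so breaking only one of them reveals nothing useful.
# The secrets below are hard-coded placeholders, not real key exchange output.
import hashlib
import hmac

def combine_secrets(ecc_secret: bytes, pq_secret: bytes) -> bytes:
    """Derive a session key bound to both secrets (HKDF-extract style)."""
    return hmac.new(b"hybrid-kdf-salt", ecc_secret + pq_secret,
                    hashlib.sha256).digest()

ecc_secret = b"classical-ecdh-shared-secret"   # placeholder value
pq_secret = b"post-quantum-kem-shared-secret"  # placeholder value

session_key = combine_secrets(ecc_secret, pq_secret)

# An attacker who somehow learns the post-quantum secret but not the
# classical one still derives a completely different (useless) key.
attacker_key = combine_secrets(b"wrong-guess", pq_secret)
print(session_key != attacker_key)  # True
```

In other words, even a total break of the new post-quantum layer leaves the attacker facing the classical layer, which is the belt-and-suspenders property being praised above.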

1:11:01

Yep. And so Apple as mentioned has

1:11:03

come up with its own, uh, four

1:11:06

plus stage, I guess five plus stage

1:11:09

level system. Uh, they've

1:11:11

defined classical cryptography, which is not quantum

1:11:13

secure. There's level zero, which is no end-to-end

1:11:15

encryption by default. And by the way, you

1:11:17

can kind of tell it, it's kind of

1:11:19

weird that like Apple's marketing department really doesn't

1:11:21

ever really like to name competitors

1:11:24

at all ever. Like even though we know who

1:11:26

they're talking about when they alluded to something, but

1:11:28

whoever's doing the security blog has no problem

1:11:31

naming names. So you got to get the

1:11:33

level zero thing, level zero, no end-to-end encryption.

1:11:35

They just named names, Skype, Telegram,

1:11:37

WeChat, whatever that QQ is. Level

1:11:40

one end-to-end encryption by default, which

1:11:42

includes LINE, Viber, WhatsApp, Signal previously,

1:11:45

and iMessage previously. Level

1:11:47

two. Now we're in the

1:11:49

post-quantum cryptography, or PQC, which

1:11:51

also has end-to-end encryption by default.

1:11:54

So level two is PQC key establishment

1:11:56

only, which is Signal with PQXDH, whatever

1:11:58

the hell that means. And

1:12:01

then level three, which is PQC key

1:12:03

establishment and ongoing PQC rekeying, which is the

1:12:05

new iMessage with PQ3. And then

1:12:07

in the future, not given a level, but

1:12:09

I guess would be level four, PQC

1:12:12

establishment plus ongoing PQC rekeying

1:12:14

plus PQC authentication. And then

1:12:16

there's potentially even more after

1:12:18

that. So this is really

1:12:20

cool. I'm happy that Apple is working on this,

1:12:23

even though I don't think anyone gives a crap

1:12:25

about what I say in iMessage, but I

1:12:27

do think that that's really cool that this is

1:12:29

something that they're actively working on hopefully and

1:12:32

presumably so long before it's ever going to

1:12:34

be necessary. Interesting. And the, the, the fact

1:12:36

that they have like the level four written

1:12:38

there, if you read the big document to

1:12:40

the blog post, they explain like that they

1:12:42

essentially chose not to do the

1:12:46

thing that makes this level four, because it's

1:12:48

just like level three, except for they add

1:12:50

the PQC authentication. And they explained

1:12:52

why they chose not to do that. They said like,

1:12:54

look, we don't we

1:12:57

don't want to do that right now. And we don't

1:12:59

think it's necessary right now because it happens, it happens

1:13:01

in the moment, so it's not one of those things.

1:13:04

The authentication is like, you know, establishing authenticity

1:13:06

of who you're communicating with before you start

1:13:08

communicating and that has to happen each time.

1:13:11

And so what they say is like, this is not

1:13:13

like you could save this for later and then decrypt

1:13:16

it later because you're not, there's no useful information exchanged

1:13:18

yet. So saving that,

1:13:20

saving that, you know, that exchange is useless to

1:13:22

you because that exchange has already happened. So even

1:13:24

if you crack the encryption with a quantum

1:13:26

computer 10 years from now, it's pointless. There's

1:13:28

no data there. And that conversation happened long ago

1:13:30

and those keys are all useless. Right. And

1:13:33

so they said, well, until someone can get a quantum

1:13:35

computer that can intercept your traffic and then in real

1:13:37

time crack it, we don't have to worry about that.

1:13:39

So that's why they push it off to the future.

1:13:42

And the other thing they talk about in implementation tradeoffs

1:13:44

is the post quantum

1:13:46

cryptography stuff. The data is bigger, like the

1:13:48

keys are bigger and whatever, whatever other info

1:13:50

they have to exchange is significantly bigger than

1:13:53

their old encryption scheme and their

1:13:55

old encryption scheme sends new keys

1:13:57

like every single message, but

1:13:59

to like basically tamp down on the

1:14:01

bandwidth use of, you know, if you're

1:14:03

going to tack something onto every iMessage that's sent, that

1:14:06

adds up real fast because a lot of iMessages are

1:14:08

sent per hour, per day, per minute, per second, right?

1:14:11

So they say with the post-quantum stuff,

1:14:14

they only are going to do like the re-keying

1:14:16

periodically, and they do it based on an algorithm

1:14:18

of like if you're on a crappy connection, we

1:14:20

won't try to shove these new keys down as

1:14:22

fast as we normally do. If

1:14:24

you're on a faster connection, we'll send them

1:14:27

more frequently. And what they basically said is

1:14:29

we're trying to narrow the window that an

1:14:31

attacker could do something. So if

1:14:33

an attacker someday cracks this, the only thing that they'll

1:14:35

be able to see is the

1:14:38

brief period before we re-keyed. They

1:14:40

cracked just that little segment. So I don't know how big

1:14:42

that is. They didn't actually say whether it's like two lines

1:14:45

of text or five minutes or whatever, but

1:14:47

that's another trade-off they made. And that's another

1:14:50

fun thing of like seeing security people write

1:14:52

something instead of marketing people, because

1:14:54

they'll tell you, here were the engineering trade-offs and

1:14:56

here's why we made them. Instead

1:14:59

of just saying, this is the best and no one

1:15:01

else has anything like this and it's super secure, or

1:15:03

they let engineers write it and they'll tell you about

1:15:05

the trade-offs. So I thought this was super interesting. I

1:15:07

highly recommend everybody read the blog post we link, because

1:15:09

it might seem like it's kind

1:15:11

of got a little bit of technical jargon, but they do a

1:15:14

really good job of explaining it well enough

1:15:16

for you to follow what they're saying, and it's pretty

1:15:18

cool stuff. All right,

1:15:20

it's been a little while, so let's do some Ask ATP.

1:15:23

And let's start tonight with Ian Malkuszewski, who

1:15:25

writes, what is your advice on how to

1:15:28

best communicate to non-tech people the value and

1:15:30

benefits of native Mac-assed Mac apps? I work

1:15:32

at a small under-10 people company where everyone

1:15:34

works on a Mac, but some are new

1:15:36

to the platform and most of their software

1:15:39

experience is using Electron apps and other apps

1:15:41

that are at best, so-so citizens of the

1:15:43

platform. We're currently hiring a developer to make

1:15:45

an app to help us with an internal project. I want to

1:15:47

be able to make the case to hire someone who knows how

1:15:49

to make good Mac software. If I put

1:15:52

an app like Fantastical next to Outlook, most

1:15:54

of my teammates just see two calendar apps

1:15:56

with cosmetic differences and shrug off the idea

1:15:58

that there's anything notably different between them. I'd love

1:16:01

any advice on how to make the case

1:16:03

to non-technical people that Mac-assed Mac apps have

1:16:05

a real user-facing benefit beyond just feeling better.

1:16:08

Hot take, I don't know that you really want

1:16:10

a Mac-assed Mac app in this context. Like if

1:16:12

you're just writing stuff for your own team of

1:16:14

10 people, I wouldn't spend the time

1:16:16

personally. And I know that's probably going to make everyone

1:16:19

shudder and hate me, but there are

1:16:21

bigger problems and more important problems to solve than

1:16:23

making a Mac-assed Mac app. But that's my opinion,

1:16:26

wrong as it may be. Marco, correct

1:16:28

me. I don't think I'm going

1:16:30

to disagree with you on this. So it

1:16:33

depends so much on the nature of the app that

1:16:35

you're going to build. So

1:16:38

Ian said this is an internal app. I

1:16:40

think it will be challenging for

1:16:43

the higher-ups to justify what it

1:16:45

would take to make a really

1:16:47

good, quote, Mac-assed Mac

1:16:49

App. And what that means is basically

1:16:52

native Mac code,

1:16:55

native Mac controls, kind of the

1:16:57

standard Mac UI design paradigm, things

1:16:59

like that as opposed

1:17:02

to things like Electron. And

1:17:04

I think for most software,

1:17:07

let alone most internal use

1:17:09

software, it's very difficult to

1:17:11

justify that kind of investment on the Mac because

1:17:14

first of all, Ian

1:17:16

mentioned wanting to hire someone who could do that. That's

1:17:19

difficult. There's not a lot of

1:17:21

programmers out there who are qualified

1:17:23

to make this style

1:17:25

of high-quality, native, kind

1:17:27

of traditional style

1:17:30

Mac App. It's a very small talent pool.

1:17:32

Including inside Apple. Yeah, Apple can't even make

1:17:34

them anymore. Sorry, but it's true. It is.

1:17:36

So that's problem number one is could you

1:17:38

even find someone to do this? Problem number

1:17:40

two is would you be able to pay

1:17:42

them what they are probably worth? And

1:17:45

then problem number three is can you convince

1:17:47

the higher-ups in your company that that's

1:17:49

worth doing? And I think

1:17:51

the only way that is

1:17:54

really easily done is if

1:17:56

the higher-ups in your company are Mac

1:17:58

nerds and also... and not good business

1:18:00

people. Because what you're ultimately looking,

1:18:03

and by the way, and I'm both of those

1:18:05

things. So if

1:18:07

it was my internal app, I would absolutely

1:18:09

do this. But the problem

1:18:11

is when you're talking about having

1:18:14

that style of app with

1:18:16

the realities of today and

1:18:18

the markets and the tech needs around it today,

1:18:22

it's more of an indulgence

1:18:24

than something that you can make a good business case

1:18:26

for. So

1:18:29

if you're able to convince them,

1:18:31

hey, indulge me in having

1:18:33

this thing that we're gonna build be very

1:18:35

nice in these ways that you don't care about, but

1:18:38

I do, then good for you,

1:18:40

that's great. I think you're in for an

1:18:42

uphill battle. And then also, even

1:18:44

if you can get someone to build it, and

1:18:47

you can get the higher ups to agree

1:18:49

to indulge you in this, what

1:18:51

happens down the road when you have to change

1:18:53

it or update it? How

1:18:56

hard is it going to be to get someone

1:18:58

in to do that down the road?

1:19:00

Because it's already hard enough now. So

1:19:04

I think it's gonna be a tough sell. If

1:19:06

they just don't have enough experience dealing with giant corporate

1:19:08

bureaucracies, let me tell you how to do this. So

1:19:10

the first thing you need to do is establish things

1:19:16

that are not in this question that I don't

1:19:18

know the answer to. For example, this seems to

1:19:20

imply that you're gonna hire a developer to make

1:19:22

an app for an internal project and that you

1:19:25

only need a Mac app. If

1:19:27

that's really true, confirm that and say,

1:19:30

just so we're clear, we're not planning

1:19:32

on making a Windows version of this app later. It's

1:19:35

an internal app, are we ever gonna need a Windows version,

1:19:37

do we need a Linux version? And

1:19:39

if you can get past that hurdle of clarifying

1:19:41

the requirements and they say, no, no, no, we're

1:19:43

never gonna make a Windows version, never gonna need

1:19:45

a web version, this is gonna be a Mac

1:19:47

app, it's an internal thing, which is only ever

1:19:49

gonna run on Macs, there's never gonna be another

1:19:51

version, then you're set, because then

1:19:54

what you can do is not say

1:19:56

what Marco just said, because that will discourage them. What you

1:19:58

have to say then is... Okay, if it's gonna

1:20:00

be a Mac app, the

1:20:03

reason we want to hire some, you

1:20:05

know, an experienced Mac developer is because

1:20:08

the straightest path to make a

1:20:10

Mac app is to use Apple's

1:20:13

frameworks in a straightforward way. No

1:20:15

weird custom stuff. No I've come up with

1:20:17

a framework of my own for doing GUIs.

1:20:20

Just use and you'll have to like hash this out, whatever

1:20:22

they want you to use, SwiftUI do they want you to

1:20:25

use AppKit, whatever it is they pick. Whatever path you go

1:20:27

down. Don't pick SwiftUI on the Mac. Find

1:20:30

someone who will do that in the

1:20:32

most straightforward way possible. And the pitch

1:20:34

is, when this person

1:20:36

disappears, I want anybody to be able

1:20:38

to look at this and say, oh

1:20:40

this is a straightforward SwiftUI app, straightforward

1:20:42

AppKit app that doesn't do any weird

1:20:45

custom stuff that has no custom controls,

1:20:47

that Apple's documentation explains how to do

1:20:49

it, that it's really easy to find

1:20:51

example code documentation, you know, anything like

1:20:53

that it is straightforward and they're going

1:20:55

to be done faster. This was the

1:20:57

old pitch with like the NeXT stuff

1:20:59

when it was, you know, NeXT and

1:21:01

Objective-C and everything. You

1:21:04

can make a highly functional

1:21:06

app with fewer lines of code and

1:21:08

in less time because the frameworks do

1:21:10

so much for you. That's

1:21:12

the pitch you make. Your goal

1:21:15

is I want to get an

1:21:17

experienced Mac developer in, and the pitch is

1:21:19

we need someone who, we can't

1:21:21

get someone who's like, I don't even know what API is

1:21:23

Mac software. I'm just going to use like the, you know,

1:21:25

Quartz 2D drawing API and draw my own GUI because I

1:21:28

don't know what this whole AppKit thing is. It's confusing to

1:21:30

me, but I'm a good programmer so I'm going to make

1:21:32

my own UI framework like Loren Brichter out of OpenGL

1:21:34

or whatever. You know what I mean? Like that is not

1:21:36

what you want. And so that's how I would pitch it,

1:21:39

I would say. And they may never sort of

1:21:41

get to know the culture and the idioms, but at

1:21:43

the very least you should get someone in there. For example, you decide

1:21:45

it's going to be AppKit. Get someone

1:21:47

who knows AppKit. I know it sounds dumb, but

1:21:50

like I've seen corporate hiring and it's like, I

1:21:52

can figure it out. And then, right, and you

1:21:54

know, if someone comes in, you're like, oh, I've made a Mac

1:21:57

app and they show you an Electron app. And it's like, did

1:21:59

you make a Mac app, though? Are you really

1:22:01

a web developer? Are you using like React Native

1:22:03

and Electron and all these other things? That's not

1:22:05

what we're looking for. And that I

1:22:07

feel like will be harder to wrangle because

1:22:09

if you're making some kind of

1:22:11

internal tool app and someone uses Electron and it

1:22:13

takes 500 megs of RAM when you just launch

1:22:16

the thing, like that's another case against it, right?

1:22:18

Just find a Mac developer

1:22:20

who can make a straightforward simple thing that

1:22:22

will be done quickly, have lots of functionality,

1:22:24

and be easy for any future Mac

1:22:27

developer to understand. I'm not saying it's a slam

1:22:29

dunk case, but that's your best shot. All

1:22:32

right, sorted. Julian Gamble

1:22:34

writes, if you could ask Apple for one

1:22:36

new API to help your apps this year,

1:22:38

what would it be? Julian's guesses are Marco

1:22:40

for Overcast, WatchOS, an API to make syncing

1:22:42

files like podcast files work on demand and

1:22:45

reliably on schedule and in general much easier.

1:22:47

We can stop there. I don't even write

1:22:49

a watch app in a minute. I'm ready.

1:22:52

I don't think an API can change physics in the size

1:22:55

of the watch's battery. And I know Marco says they should

1:22:57

loosen up a little bit, but in the end, an

1:22:59

API that did that would burn your battery

1:23:01

pretty badly. Okay, first of all, this would

1:23:04

not be my pick for this question. But

1:23:06

just for the sake of argument, what I

1:23:08

would want in this area would be if

1:23:11

a user has initiated a download

1:23:14

while the app is in

1:23:16

the foreground, let me start

1:23:18

a background download that

1:23:20

begins immediately and

1:23:22

uses Wi-Fi if it has to. Because

1:23:25

right now it will, you know, wait for a

1:23:27

while and maybe do it later when it's on

1:23:29

the charger, or it'll use the Bluetooth connection to

1:23:32

the phone, which is lower power. And

1:23:34

if a user while using the

1:23:36

app in the foreground initiates a

1:23:38

download that signifies pretty clear

1:23:40

user intent. I want this to happen

1:23:43

like kind of now. I

1:23:45

make a pretty strong argument for that, but

1:23:47

honestly, that is not my biggest problem on

1:23:49

WatchOS. My biggest problem on WatchOS

1:23:51

is every few days when I get an

1:23:53

email from a customer saying, why don't I

1:23:55

support the double tap gesture on the Series

1:23:57

9 and Ultra 2? Yes. The

1:24:00

answer is there is no API to do

1:24:02

that. Apple released the Double Tap feature in

1:24:04

the fall, and they said,

1:24:06

hey, third-party apps, you can just

1:24:08

let this do the default response

1:24:10

on notifications, and that's

1:24:13

it. So literally,

1:24:15

there is no API to respond

1:24:17

to Double Tap, and it's

1:24:19

such a glaring omission that my customers

1:24:21

assume that I'm the one being negligent,

1:24:23

not Apple. So that would

1:24:25

be my number one request on WatchOS.

1:24:28

But that isn't even my number one

1:24:30

request overall. My number one

1:24:32

request at this moment is

1:24:36

for SwiftUI's

1:24:38

list to

1:24:40

have feature parity with UI

1:24:42

table view. Now,

1:24:44

that is not a small thing. No,

1:24:47

it is not. However, that would

1:24:49

be a huge improvement to the

1:24:52

most of the coding that I'm doing this

1:24:54

year. Julian said this year.

1:24:57

That's what would help me is make SwiftUI

1:24:59

list have more capabilities. For instance,

1:25:01

one that I ran into most recently is

1:25:05

the drag-to-reorder mechanic in it

1:25:07

does not support multiple items.

1:25:10

In UI table view, you can pick up multiple items

1:25:12

as you drag your finger around with the second finger

1:25:14

and then drop them all in one spot. The

1:25:18

equivalent in SwiftUI does

1:25:20

not support that. The API appears to be

1:25:22

written to support it, because when it tells

1:25:25

you drop here, it

1:25:28

passes you a set of

1:25:30

indexes to drop there. So

1:25:32

the API seems to be

1:25:34

built to support multiple things

1:25:36

being dropped. But there is

1:25:38

no physical implementation of that.
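The multi-item move that the API's signature implies — a set of source indexes plus one destination offset — can be sketched in plain Python. The helper name `move_items` is invented for illustration; it mirrors the shape of Swift's `move(fromOffsets:toOffset:)` as described above:

```python
def move_items(items, from_offsets, to_offset):
    """Move the elements at `from_offsets` (a set of indexes) so they
    land, in order, just before the element originally at `to_offset`.
    Mirrors the shape of Swift's move(fromOffsets:toOffset:)."""
    picked = sorted(from_offsets)
    moving = [items[i] for i in picked]
    remaining = [x for i, x in enumerate(items) if i not in set(picked)]
    # The destination shifts left by one for each removed element above it.
    dest = to_offset - sum(1 for i in picked if i < to_offset)
    return remaining[:dest] + moving + remaining[dest:]

# Pick up "b" and "d" together and drop them at the front.
print(move_items(["a", "b", "c", "d"], {1, 3}, 0))  # ['b', 'd', 'a', 'c']
```

The reordering math is the easy part; the missing piece Marco is pointing at is the interaction layer — picking up additional rows with a second finger mid-drag — which UITableView provides and SwiftUI's List currently does not.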

1:25:41

It's just one case of many where

1:25:43

I keep running into areas where SwiftUI

1:25:45

list is oddly limited

1:25:48

in ways that UI table view

1:25:50

is not. And UI table

1:25:52

view is a super important API

1:25:55

for iOS. Almost every app uses

1:25:57

table views in some way. UITableView

1:26:00

has been added to over time like crazy

1:26:02

because it's such a huge

1:26:04

part of interaction in iOS, there's tons of

1:26:06

features that UITableView supports. So if UIList kind

1:26:09

of started from scratch and kind of did

1:26:11

the basics and has been very slowly

1:26:14

and carefully adding little

1:26:16

bits and pieces, oh you want to customize this

1:26:18

inset? Okay here's one way to

1:26:20

do that. You want to customize whether this border shows

1:26:22

up over here? Okay fine we'll give you another small

1:26:24

way to do that. But there's

1:26:27

some big areas like the multi-select drag

1:26:29

and drop that it just has not

1:26:31

enough feature parity and that is making

1:26:33

my life difficult as I'm trying to

1:26:36

work on the SwiftUI rewrite for the

1:26:38

biggest part of my

1:26:40

user base, like the iOS app and

1:26:42

the table views within it. Those

1:26:44

are major areas and I just keep hitting walls

1:26:47

that just aren't there yet in SwiftUI. It

1:26:50

rocked my world when I

1:26:52

want to say it was like four or five years ago,

1:26:54

it was maybe more than that. But

1:26:56

somebody described all

1:26:59

of professional iOS development as turning

1:27:01

JSON into table views. And

1:27:03

I was like mother of god,

1:27:05

that is exactly right. So if that doesn't mean

1:27:08

anything to you, so JSON is a plain text

1:27:10

way of transmitting data. Typically when you

1:27:12

get something from a web server, not as a

1:27:14

person but as an app, you're going

1:27:18

to get data back in JSON format,

1:27:20

JSON. And like Marco

1:27:22

said, table views run the majority

1:27:24

of all iOS apps. So you

1:27:26

could summarize iOS development as turning

1:27:28

JSON into table views for money. Web development too

1:27:31

these days. Not table views I guess

1:27:33

but same thing. Yeah exactly. You hit a JSON API,

1:27:35

you get the result, you lay it out in an HTML page. Yep

1:27:37

exactly. So anyway, that rocked my

1:27:39

world even though it was five years ago. I still

1:27:41

think it's hilarious and accurate. John,

1:27:44

continuing with Julian: John, for Front

1:27:46

and Center for macOS,

1:27:48

an API to enable preserving icon

1:27:50

arrangement in folders as per classic

1:27:52

macOS pre-Mac OS X. That's

1:27:55

Julian's guess. That's a misunderstanding. What would be

1:27:57

required to get that? It's not an API

1:27:59

that's missing. It's the Mac Finder application

1:28:01

would have to behave differently. There's no

1:28:03

API that could roll out that would make

1:28:06

that work, unfortunately. Back to what Marco

1:28:08

was saying about List, I was telling him before, when

1:28:10

we were talking about this in Slack, they

1:28:12

should count as blessings because a List was

1:28:14

so under-featured and buggy for me in SwiftUI

1:28:17

and macOS that I can't even use List

1:28:19

in my thing that has a List of

1:28:21

things that are reorderable, and I had to

1:28:23

basically roll my own List. It's like having

1:28:25

to roll your own UI table view because

1:28:27

table view is too janky for you. It

1:28:29

could be worse, but yeah, it could definitely

1:28:31

be better. What I actually want for

1:28:33

an API to help make my app better

1:28:36

this year, I actually filed feedbacks on this

1:28:38

in the fall of last year, a whole bunch

1:28:41

of them. I'll

1:28:43

just read off the titles of them because I tried to separate

1:28:45

it into, I don't know why I bother, but I tried

1:28:48

to separate it into things that I hope would be, like

1:28:50

if I put them all in one feedback, they'd say, oh,

1:28:52

we're not doing all this crap. I tried to break it

1:28:54

down, like maybe they'll pick one of them. The first is

1:28:57

add a modern window list API. An

1:28:59

API in macOS that lets me list all the windows

1:29:01

on the system. There are existing APIs

1:29:03

that do that, but they're super old. Most of

1:29:05

them are deprecated and they're just horrible. Well, in

1:29:07

your computer, they would just crash. When

1:29:10

I say list the windows, I don't mean get

1:29:12

the windows content like screen

1:29:14

capture kit. I don't mean control them like

1:29:16

accessibility APIs, which are all kind of old

1:29:18

and cruddy. I mean just literally list them.

1:29:21

List them, which apps own them, what are their

1:29:23

sizes, what are their positions. It's so simple, you

1:29:25

can do it with existing APIs, but

1:29:27

Apple really doesn't want you to, and there are a bunch

1:29:30

of caveats and a bunch of stuff that's deprecated. Second

1:29:33

one was add a system-wide window layering and

1:29:35

visibility API. So you can list

1:29:38

all those windows and you know where they are and

1:29:40

how big they are and which app owns them. Wouldn't

1:29:42

it be great if you could tell them to come

1:29:44

forward, to go to the back, go behind some other

1:29:46

window? Again, still, I have no idea. These APIs have

1:29:48

no idea what's in the windows. They can't see the

1:29:50

window contents at all. They don't want the window contents.

1:29:52

They want a bunch of anonymous rectangles that are owned

1:29:55

by processes, right? An API where you could

1:29:57

tell them what to do. And the

1:29:59

Mac window layering APIs are extremely limited.

1:30:01

People are shocked to hear that you can't

1:30:03

do something as simple as take one window

1:30:05

and change its layering because those

1:30:07

windows belong to other applications and my application

1:30:09

has a very limited ability to screw with

1:30:11

the windows in other applications. Again,

1:30:14

modulo the accessibility APIs, which are a whole other can of worms

1:30:16

that I found other bugs with. And

1:30:20

let's see, once I get

1:30:22

even more ambitious, add support for window manipulation

1:30:24

extensions. So the idea,

1:30:27

this is kind of more ambitious, but the

1:30:29

idea that when you're

1:30:31

doing operations with the window server, like moving windows

1:30:33

around or clicking on them to bring them to

1:30:35

the front or whatever, that there

1:30:37

would be a plugin system where you could affect

1:30:39

that interaction. Once again, still having no idea what's

1:30:41

in any of these windows. All you know is,

1:30:43

hey, a window is being moved. It's owned by

1:30:45

this application. It's in this size, it's in this

1:30:47

position. I'm about to

1:30:49

move it to here. Is there anything you'd like

1:30:52

to do to modify that operation? Like say snapping

1:30:54

to a grid or giving you a chance to

1:30:56

draw a bunch of guides like you're in a

1:30:58

graphics application, you could do so

1:31:00

many cool things with this. Apple is currently not

1:31:02

doing them. Third party applications try to do them,

1:31:04

but it's so hard because macOS fights you at

1:31:06

every step. So anyway, my answer to this is

1:31:08

basically like a modern, Swift-

1:31:12

savvy, not

1:31:14

something that's in core foundation, not something

1:31:17

that's an ancient carbon API, but like

1:31:19

a modern API that lets you

1:31:22

participate in the window management

1:31:24

system in macOS. Changing window

1:31:26

layers, knowing where they all

1:31:28

are and being able to do

1:31:30

things as they're moved around. That would be a

1:31:32

dream, not just for me and for my apps,

1:31:34

but I think if Apple provided those APIs, all

1:31:37

those apps that are out there now that try

1:31:39

to do this with like the accessibility APIs and

1:31:41

the giant ones that we have now would become

1:31:43

so much better, so much

1:31:45

more full featured. I've said this in past programs,

1:31:48

stage manager. If Apple

1:31:50

did my wish list of

1:31:52

APIs here, stage manager should have

1:31:54

been something that a third party could have implemented.

1:31:56

Like if you have all the APIs, that a

1:31:58

third party had the idea. Like I think

1:32:00

it would be cool if Windows work like

1:32:02

this. A third party should have been able to

1:32:05

implement stage manager. As we know, third party

1:32:07

could absolutely not implement stage manager. Only Apple could

1:32:09

do it. And now only Apple can ever

1:32:11

make it better. And only Apple, we have

1:32:13

to wait for the next idea that Apple has

1:32:15

in five years. Third parties are out there

1:32:17

with lots of cool ideas about window management on

1:32:19

the Mac. They can't implement them because Apple

1:32:21

does not provide robust enough APIs. And like I

1:32:24

said, you can do pretty much all of

1:32:26

these with zero access to the contents of any

1:32:28

of these windows. So it is privacy preserving,

1:32:30

but it is annoying for Apple to implement

1:32:32

because they may think this is not important. But I'm

1:32:34

like, Apple, your heart doesn't seem into

1:32:36

this window manager thing. Every once in a while, some team

1:32:38

manages to sweep something out and it gets added to a

1:32:41

giant pile of Mac window management stuff. Let

1:32:43

third parties do it. We'll figure out what works.

1:32:45

Just copy whatever the most popular app is in

1:32:47

five years. That's what I want.
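The "just list the anonymous rectangles" request above can be sketched as a data model. Every name here is invented for illustration — there is no such API — but it shows the point John keeps making: the records carry an owner, a position and size, and a stacking layer, with zero access to window contents:

```python
# Hypothetical shape of a modern window-list API record -- invented
# for illustration. Note there is no field for window contents at all:
# it's just anonymous rectangles tagged with an owning application.
from dataclasses import dataclass

@dataclass(frozen=True)
class WindowInfo:
    window_id: int   # opaque identifier
    owner: str       # name of the owning application
    x: float         # position and size of the rectangle
    y: float
    width: float
    height: float
    layer: int       # stacking order; higher = closer to the front

def windows_owned_by(windows, owner):
    """All of an app's windows, front-most first -- no contents needed."""
    return sorted((w for w in windows if w.owner == owner),
                  key=lambda w: w.layer, reverse=True)

desktop = [
    WindowInfo(1, "Finder", 0, 0, 800, 600, layer=2),
    WindowInfo(2, "Safari", 100, 50, 1024, 768, layer=5),
    WindowInfo(3, "Finder", 40, 40, 640, 480, layer=9),
]
front_finder = windows_owned_by(desktop, "Finder")[0]
assert front_finder.window_id == 3
```

Everything in John's wish list — listing, re-layering, and manipulation hooks — operates on records like these, which is why he argues it can be privacy-preserving.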

1:32:50

Fair enough. Julian continues. Casey, this is a tough

1:32:52

one. Has to be Apple and not another

1:32:54

company. For Vision OS, an API in Apple

1:32:56

TV for call sheet to pull out the

1:32:59

current movie or show name and be able

1:33:01

to prompt with the movie show info required.

1:33:04

Yeah, kind of. But really, the one thing I

1:33:06

would kill for right now is I want to

1:33:08

have a way to ask any of the Apple

1:33:11

TVs on the same network, what are you playing

1:33:13

right now? And I get

1:33:15

why that isn't a thing because somebody like

1:33:17

Facebook would use it for nefarious purposes. But

1:33:20

maybe you could have. I mean,

1:33:22

there's so many other freaking user prompts,

1:33:25

like Windows Vista style. Why not prompt

1:33:27

some security prompt that says, hey, is

1:33:29

it cool if call sheet looks at

1:33:31

what you're watching? And I would love

1:33:33

that. And it'll never happen, but that

1:33:35

would be what I would want. Please

1:33:37

and thank you. Why don't you

1:33:39

just do what everybody else does, which is

1:33:41

just have microphones or visuals. Yeah, right, exactly.

1:33:43

Just detect what's playing and look it up

1:33:46

Shazam style to figure it out. Yeah, that's

1:33:48

what I should do. Winnie

1:33:50

Lewis writes, could Marco please best

1:33:52

first favorite Phish concerts? I wasn't aware

1:33:55

Phish had a lore. And

1:33:57

I'm curious where to start. Do you need me to explain best first favorite Marco?

1:34:00

We should explain to the listener anyway. Yeah, please. So

1:34:02

Best First Favorite is a thing from one of my

1:34:04

other podcasts, Reconcilable Differences. My co-host Merlin came

1:34:06

up with it. It is

1:34:08

the idea that when you're trying

1:34:10

to discuss a thing, like

1:34:13

the band Phish or a television show or

1:34:16

a set of movies or whatever, you

1:34:18

are challenged to come up with which one

1:34:21

of these things you think is the best, the

1:34:23

best Beatles album, for example. Which one you think

1:34:25

someone who is new to the Beatles should

1:34:27

listen to first. So what Beatles album should

1:34:29

I start with? And what is your favorite

1:34:31

Beatles album? And they may all be the same thing,

1:34:33

or they may all be different. So Best First Favorite

1:34:35

Phish concert is what's the best Phish concert, what's your

1:34:37

favorite Phish concert. And if someone is new to Phish

1:34:39

and you had to tell them this is the concert

1:34:41

you should start with, which one is it? And

1:34:44

I don't really have a good answer. So

1:34:46

the reason Winnie Lewis asked this question is

1:34:49

shortly after New Year's

1:34:51

this year, because Phish did a really fan

1:34:55

service amazing thing for

1:34:57

their New Year's Eve concert that if

1:35:00

you've been a Phish fan for a very long time, this

1:35:03

was an especially big one that played upon a whole

1:35:05

bunch of stuff in

1:35:07

Phish lore from forever ago that fans

1:35:09

really enjoyed. And I don't really

1:35:11

know how to tell you to get started in

1:35:13

Phish in a way that you would appreciate that.

1:35:15

That's kind of like saying, oh, here

1:35:17

we have a podcast with over 500 episodes. How

1:35:21

do I go about getting all of the old

1:35:23

references? You could go listen to all 500 episodes

1:35:25

of our show. And

1:35:28

by that point, you will understand all the

1:35:30

references. That's a bit of a

1:35:32

commitment. And I wouldn't necessarily recommend that most

1:35:34

people do it, just because that's quite a

1:35:36

lot to tackle. But that New Year's one

1:35:38

could be your favorite though, or it could

1:35:40

be the best. Phish has

1:35:42

40 years

1:35:44

they've been playing together. So no

1:35:47

one is going to go into a band with a

1:35:49

40-year history and

1:35:52

get all the context to understand all

1:35:54

the different lore. I don't even

1:35:57

know a lot of it. And I've been

1:35:59

a very diehard. fan of the

1:36:01

band since, like, 2007-ish, 2008-

1:36:03

ish. Much of the old Phish

1:36:05

lore goes over my head, even, because I wasn't

1:36:07

there in the 90s when all

1:36:10

the, you know, fandom was really building up. So

1:36:13

I'm actually not even qualified to

1:36:15

answer this question and I have

1:36:17

I have purchased the live download

1:36:19

of every show they've done since

1:36:22

2009. You can answer favorite for

1:36:24

sure. Answer favorite. I can't do that either.

1:36:26

I don't frequently listen

1:36:28

to old shows. I'll

1:36:31

pull one up occasionally and in fact one

1:36:33

of the reasons why I want to make a

1:36:35

jam band listening app as I discussed

1:36:38

in previous episodes at some point that you know

1:36:40

the app that will have an audience of four

1:36:42

people, and so I should never make this

1:36:44

app but one of the features that I want

1:36:46

out of this app is kind of

1:36:48

a way to like deep mine

1:36:51

my collection of old Phish concerts

1:36:53

for like gems of shows and

1:36:55

I have a couple of ideas on how that could be made

1:36:57

more interesting and things like that but the reality is what I

1:37:00

usually am listening to is the last few

1:37:03

months worth of live shows so

1:37:05

of course that's a rolling window. They're

1:37:07

literally doing a show right now. I cannot wait until

1:37:09

tomorrow morning I can download it and listen to it

1:37:12

usually that's what I'm doing is I'm listening to whatever

1:37:14

the, you know, the last few months of shows are. I'll kind

1:42:16

of go through, and as I'm going through

1:37:19

a show I will give star ratings in

1:37:21

you know the Mac music app

1:37:23

formerly called iTunes. Anything that

1:43:25

I rate at a certain level I kind of

1:37:27

go back and revisit more often and then

1:37:29

I have a playlist in iTunes slash music

1:37:32

called Best of Phish. Any

1:43:34

real standout songs, not shows,

1:43:37

I will add to that list. It's not a

1:37:39

huge list but then whenever I'm somewhere

1:37:41

if I want to shuffle my Best of Phish playlist

1:37:44

like, then I know I'm gonna get some real, you

1:43:46

know, rocking standouts. But I

1:37:48

don't even I am NOT

1:37:50

qualified to tell you which entire

1:37:52

performance is best or

1:37:54

my favorite. And I would say, for

1:37:56

first, you're

1:37:58

best off, for your first Phish

1:38:01

concert, just finding any of

1:38:03

them. Like, if

1:38:05

you go, so the service that they released them

1:38:07

through is called LivePhish. You

1:38:09

can buy a whole show worth of MP3s

1:38:11

there for, I think it's 10 bucks for

1:38:13

a whole night. You can

1:38:15

also try, they have their own streaming service that spotlights

1:38:18

certain shows. You can stream the whole catalog from it.

1:38:20

They have a free trial on that streaming service. So

1:38:22

you can even just try, like sign up for the

1:38:24

streaming service for a month and just play

1:38:26

some shows. I think it's 12 bucks

1:38:29

or whatever. We're not talking about money here. That's

1:38:32

what I would suggest. Get into it

1:38:34

that way because that's the way most fans get

1:38:36

into Phish is maybe

1:38:39

you'll hear one of their studio albums or two.

1:38:41

You can go to whatever music streaming service you

1:38:43

already have and listen to some of their studio

1:38:45

albums, but what the band really

1:38:47

is about is live shows. So

1:38:49

getting any live show

1:38:51

exposure will give you an idea of what

1:38:53

this band actually is. Some of

1:38:55

them are available on YouTube for free. Some of them

1:38:57

you can get on certain streaming services for free. Most

1:39:01

of them you're going to have to go to LivePhish to get

1:39:03

because most of them are not released through the official streaming service channels.

1:39:05

And then the second thing is if you're into this, go

1:39:08

to a concert. Many people,

1:39:10

if you don't get into Phish through listening to

1:39:12

the albums first, you usually

1:39:15

get into Phish because someone brought you to a concert

1:39:17

and you enjoyed it. So

1:39:19

that's what I would suggest. For

1:39:21

first, go to those streaming services and

1:39:23

listen to whatever you can. Live generally

1:39:26

is preferred to studio albums. And

1:39:29

go to a show if that's your jam, so

1:39:31

to speak. And that's

1:39:33

it. Unfortunately, I don't have a good answer to what

1:39:35

are my things because my things are always shifting around.

1:39:37

I don't have one show I go back to all

1:39:40

the time. I'm constantly just listening to whatever is recent,

1:39:42

whatever I can get. This is very

1:39:44

on brand for Phish, for

1:39:46

the band and for you in particular to

1:39:48

essentially not be able to name best first

1:39:50

or favorite because it's all just music man.

1:39:53

That's not what it is. But that is

1:39:55

the result. Thank you to our

1:39:57

sponsors this week, Celtrios and Squirt. And

1:40:00

thanks to our members who support us

1:40:02

directly. You can join us at atp.fm

1:40:04

slash join. And we'll talk to

1:40:06

you next week. And

1:40:37

if you want to join us at

1:40:40

C-A-S-E-Y L-I-S-S

1:40:43

M-A-R-C-O A-R-M-E-N-T, S-I-R-A-C-U-S-A

1:40:57

So I went on

1:41:05

a small adventure. I

1:41:14

mentioned a while ago and

1:41:17

a couple listeners have called me out on it recently. I

1:41:19

mentioned that I may or may not

1:41:21

have recently bought a higher resolution camera

1:41:24

in preparation for Vision

1:41:26

Pro content. I

1:41:28

was tempted for a while to go with a

1:41:30

higher resolution camera system and I did. And

1:41:34

what pushed me over the edge happened

1:41:38

around the time that Casey visited me in

1:41:40

New York in the fall. There

1:41:43

was a reason Casey came to New York in the

1:41:45

fall. We got pizza together and did a couple other

1:41:47

things. That was the only reason was the pizza.

1:41:49

For some reason during that trip I

1:41:52

was inspired that maybe I should

1:41:54

start capturing higher resolution

1:41:57

photos especially in panoramas

1:41:59

and... Was it because of that really janky

1:42:01

JPEG that your phone took of you on the beach at night

1:42:03

trying to get your car unstuck? No! We

1:42:08

talked about it on the show, it was just how

1:42:10

terrible that thing was. It was like, man, you know

1:42:12

what I need? I need a higher resolution camera. No,

1:42:17

so anyway, there was an event

1:42:19

that Casey and I attended in

1:42:21

November. And anyway,

1:42:24

so I decided I needed to capture

1:42:26

more high resolution content. Coincidentally,

1:42:28

not at all related to that, I

1:42:30

got a Vision Pro recently, and

1:42:32

I was able to view my

1:42:35

panoramas and my other photo and video content

1:42:37

in the Vision Pro, and

1:42:40

was disappointed in the resolution of them

1:42:42

when viewed at that scale. Because

1:42:44

of course, you know, you're looking at, I'm

1:42:46

looking at like phone captured panoramas from old

1:42:49

iPhones from like five, six, seven years ago

1:42:51

being displayed in this virtual, like, you know,

1:42:54

hundred foot tall view I'm seeing inside the

1:42:56

Vision Pro. So I decided,

1:42:58

let me see what I

1:43:01

can do with higher resolution stuff. And

1:43:04

I wanted for a while to

1:43:06

get into the Fuji

1:43:09

GFX line. Last

1:43:11

year I mentioned how I

1:43:14

had fallen in love with cameras

1:43:16

again because I discovered Fuji cameras.

1:43:19

And I just love the

1:43:22

way Fujis render color, especially

1:43:24

just the out-of-camera JPEGs that

1:43:26

I was able finally to

1:43:29

take pictures that I loved without

1:43:31

having to mess with them

1:43:33

in Lightroom and everything first.

1:43:35

I was just super thrilled

1:43:37

with just the straight out-of-the-camera

1:43:39

performance of Fujifilm's cameras.

1:43:42

And this kind of was inspired because

1:43:44

Tiff wanted the X100V for

1:43:47

her birthday last year. Yes, I know they just announced

1:43:49

the sequel to that, like, yesterday. I'm very much

1:43:51

aware of that, thank you very much. So I

1:43:53

got Tiff an X100V for her birthday last year.

1:43:56

I got to try it a few times, fell in love with

1:43:58

it, got myself an X-T5. The

1:44:00

X-T5 is an amazing camera in

1:44:03

so many ways. It is by

1:44:05

far my favorite handling

1:44:07

and controls I've ever had

1:44:09

on a camera. Like the way it just has a

1:44:11

whole bunch of knobs, but for all the main stuff

1:44:13

to adjust, you don't have to go into menus, you

1:44:15

don't have to like hold buttons and turn a wheel

1:44:17

and hope something changes. No, it's just knobs. It's wonderful.

1:44:20

And I love the X-T5. The only thing with the

1:44:22

X-T5 that's a little bit of a downer is that,

1:44:24

well there's two things. Number

1:44:27

one, Fuji's autofocus system is not as good

1:44:29

as Sony's. We talked about this, I don't

1:44:31

want to go too far into this. The

1:44:33

autofocus is not as good as Sony's, and

1:44:35

also because they are APS-C sized crop sensors,

1:44:37

they don't have the very high resolution or

1:44:39

the very low light abilities

1:44:41

of full frame sensors. Well,

1:44:44

it turns out, Fuji makes a

1:44:46

larger sensor camera system. They jumped

1:44:48

right over full frame, have never made one as far

1:44:50

as I can tell. Instead, they

1:44:52

make medium format digital cameras. And yes,

1:44:54

there's some asterisks on what that means,

1:44:57

but generally this is what is accepted

1:44:59

as digital for what medium format means.

1:45:01

So this is basically the next step

1:45:03

up beyond full frame

1:45:05

in sensor size. So you have

1:45:08

the little crop sensors that many of the small

1:45:10

mirrorless cameras use, then you have full frame, which

1:45:12

is what the really nice cameras use, and then

1:45:14

above that you have medium format. The

1:45:17

sensor is just giant, and

1:45:19

I've been eyeing this for a while because I'm like, man, if I

1:45:21

could get amazing resolution and

1:45:23

low light performance with

1:45:26

Fuji's colors, that would be the

1:45:28

best of everything. The problem

1:45:30

is they're super expensive. Most of

1:45:32

the GFX cameras are in the

1:45:34

$5,000 and up range. Oh

1:45:38

my word! And

1:45:40

the lenses are also very expensive. And

1:45:42

don't forget, the cameras are usually pretty

1:45:44

big. Yes. Well,

1:45:47

it turned out in the

1:45:50

holiday season this past winter, there

1:45:53

were two factors that led to a

1:45:55

substantial discount on one of the cameras.

1:45:58

One was, it was the holiday season, and everybody

1:46:00

was doing sales and BS like that. Another was

1:46:02

that Fuji was about to

1:46:05

release a new model. The GFX 100 II

1:46:07

I believe and that pushed

1:46:10

down in the lineup and in price the

1:46:12

GFX 100 S. That's the

1:46:14

one I got because it was on super

1:46:17

sale and I knew it was about to

1:46:19

be replaced, but the GFX 100 II,

1:46:21

the new one that came out, was

1:46:23

adding things that I really don't

1:46:25

need or care about and

1:46:28

was at a price point I would never have gone for.

1:46:30

But what it did was

1:46:32

push the GFX 100 S down

1:46:35

to surprisingly affordable relatively

1:46:37

speaking price points and

1:46:40

because I'm not a professional photographer I don't

1:46:43

need a bunch of giant expensive lenses.

1:46:45

I got the smallest lens in the system which

1:46:48

is its version of

1:46:50

a, you know, 40-ish millimeter

1:46:52

pancake. It's quite large but

1:46:54

it is the GFX version of that. It's

1:46:56

the 50 millimeter f 3.5. So I've been

1:47:00

playing with this I've been shooting with this for

1:47:03

the last you know two months or whatever it's been since I've

1:47:05

gotten it. It is amazing

1:47:08

and it is kind of

1:47:10

like, like many things in technology, it

1:47:13

is a massive set of trade-offs. The

1:47:16

camera as John said a minute ago is

1:47:18

very large. Now it

1:47:20

is very large in terms of like if you

1:47:22

look at today's mirrorless cameras they

1:47:25

are very compact compared

1:47:27

to what like good SLRs used to

1:47:29

be. If you actually compare

1:47:31

the GFX 100 S with

1:47:33

this lens to the

1:47:36

previous era's DSLRs that were of

1:47:38

similar professional use cases like for

1:47:40

instance the Canon 5D line. This

1:47:42

camera is almost exactly the same

1:47:44

size class as that. It's very

1:47:46

very similarly sized and weighted to

1:47:48

a Canon 5D with like a

1:47:51

you know a medium aperture

1:47:53

lens on it. So it's

1:47:56

big by today's camera standards

1:47:59

but if you go back even just 10 years ago, it

1:48:01

was considered a normal sized camera

1:48:04

for that time range for professionals to

1:48:06

handle. And I am incredibly happy

1:48:08

with it in most ways. It

1:48:11

is an incredibly slow camera, like

1:48:13

just, so I

1:48:16

should say, it takes 100

1:48:18

megapixel images. Oh my grief.

1:48:21

So do Android phones, Marco. They're probably the

1:48:23

same thing, right? Totally, yeah. Android phones take

1:48:25

the 100 megapixel images? Why did you bother

1:48:27

getting this thing? And the

1:48:30

size of the sensor, like when I had

1:48:32

to open the camera up and

1:48:34

mount the lens, and I got to see that sensor, it

1:48:37

is ridiculous how big it is. It's

1:48:40

so massive. But the result

1:48:43

is you get not only

1:48:45

incredible resolution, but just

1:48:48

as going from a crop

1:48:50

sensor to full frame comes with a

1:48:52

substantial increase in light sensitivity.

1:48:54

And so you get like much lower

1:48:56

noise and higher resolution and better color,

1:48:59

even in low light. You

1:49:01

have that same jump again, going

1:49:03

above full frame into this medium format

1:49:06

sensor. So full frame is already great

1:49:08

compared to the small sensors. And this

1:49:10

is even that additional step above that.
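The step-ups described here can be roughly quantified from published sensor dimensions. A back-of-the-envelope sketch (total light gathered scales with sensor area; the real-world advantage also depends on pixel design and sensor generation, so treat these numbers as ballpark):

```python
import math

# Published sensor dimensions in mm (ballpark figures):
# APS-C (Fuji X series), 35mm full frame, and Fuji GFX "medium format".
sensors = {
    "APS-C": (23.5, 15.6),
    "full frame": (36.0, 24.0),
    "GFX medium format": (43.8, 32.9),
}

full_frame_area = 36.0 * 24.0

for name, (w, h) in sensors.items():
    area = w * h
    # Express the area ratio as photographic stops of light-gathering.
    stops = math.log2(area / full_frame_area)
    print(f"{name}: {area:.0f} mm^2, {stops:+.2f} stops vs. full frame")
```

This puts APS-C about 1.2 stops behind full frame in total light gathered, and the GFX sensor about 0.7 stops ahead of it, which lines up with the "additional step above full frame" framing.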

1:49:14

And so it is amazing. I can shoot

1:49:16

ridiculous things handheld. I can crank the ISO

1:49:18

up to like, you know, ISO 24,000 and

1:49:22

it still looks amazing. Like it's, I'm

1:49:24

very happy with this, but I thought

1:49:26

like, hey, what if I look

1:49:28

at some of these pictures in the Vision Pro? And

1:49:30

that took me a while to finally hook up and,

1:49:32

you know, figure out, oh, panoramas, there

1:49:35

is no like metadata that says this photo

1:49:37

is a panorama. But

1:49:39

if you crop any photo

1:49:41

down to be very wide,

1:49:44

if you give it a very wide aspect ratio, I

1:49:46

think a little bit wider than 16 by nine at the

1:49:48

minimum, it will display it in

1:49:51

the panorama section and it will enable a

1:49:53

panorama-like view mode in the Vision Pro.
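The rule described here is observed behavior, not documented API, and the "a little bit wider than 16 by nine" threshold is a guess from experimentation. As a heuristic it amounts to nothing more than an aspect-ratio check:

```python
# Sketch of the observed (undocumented) Photos behavior: an image seems to
# be treated as a panorama once its aspect ratio is wider than roughly 16:9.
# Both the threshold and the behavior itself are assumptions, not spec.
PANORAMA_MIN_ASPECT = 16 / 9

def treated_as_panorama(width_px: int, height_px: int) -> bool:
    """Guess whether Photos would file this image under Panoramas."""
    return width_px / height_px > PANORAMA_MIN_ASPECT

print(treated_as_panorama(11648, 8736))  # native 4:3-ish frame -> False
print(treated_as_panorama(11648, 4368))  # cropped to 8:3 -> True
```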

1:49:56

Not exactly the same though. It

1:49:58

basically gives you... it doesn't give you, like,

1:50:01

the full 180 view. It kind

1:50:03

of just gives you, like, a larger in-

1:50:05

front-of-you view. But

1:50:07

it was enough for me to see the effect. There

1:50:10

were two key takeaways for me with

1:50:12

this camera and with trying to use a

1:50:14

professional camera to shoot Vision Pro content. Number

1:50:17

one, the resolution

1:50:20

does matter a lot. The

1:50:22

content of those pictures looked

1:50:25

way better than any

1:50:27

panorama my phones have ever shot.

1:50:29

It wasn't even close. Now that

1:50:32

is not surprising. It's not a fair comparison.

1:50:35

These are massive sensors with amazing optics in

1:50:37

front of them. So of course

1:50:39

you would expect that. So I'm not

1:50:41

saying that the iPhone camera is crap, like it's

1:50:43

just a totally different beast that I'm talking about

1:50:45

here. So that's expected and

1:50:47

that's fine. However what I

1:50:49

also learned is that the

1:50:53

lens I have for the camera being a

1:50:55

roughly 40 ish millimeter

1:50:57

equivalent focal length is

1:51:00

totally wrong for this kind of use because

1:51:02

it's just not nearly wide enough. The

1:51:05

panoramic display in the Vision Pro assumes

1:51:08

that you have a very wide field

1:51:10

of view. Again it wasn't even showing

1:51:12

it in the full width. It was kind of cropping

1:51:14

it in and giving me just like a bigger regular window. I'm

1:51:17

not sure that might require some kind of phone

1:51:19

only metadata to give it like the full 180

1:51:22

view. I think if you just made it really really

1:51:24

narrow it might do it. Oh maybe.

1:51:26

The key thing to know about the iPhone panorama

1:51:28

is it's not just one capture. You have to

1:51:30

like rotate your phone or whatever and so it's

1:51:33

many many captures which makes it

1:51:35

a way wider field of view in the horizontal axis

1:51:37

or whichever direction you're moving. I mean, you follow a

1:51:39

little arrow or whatever. So it's kind of a shame

1:51:41

that Fuji doesn't, or does it?

1:51:43

Does Fuji have a panorama mode where essentially you

1:51:45

take your giant medium format camera and you slide

1:51:47

it around the horizon just like you would do

1:51:50

with your phone. It takes multiple hundred-megapixel captures

1:51:52

and stitches them all together just like a phone

1:51:54

because that's what you want it for. I mean, I

1:51:56

know you think I just got like a you

1:51:58

know a five-year-old millimeter lens

1:52:00

or some really wide-angle thing that might,

1:52:02

you know, kind of do the equivalent, but

1:52:04

I don't think, I think you'd get the

1:52:06

best results if you literally did like a

1:52:09

panorama by stitching together multiple exposures from your

1:52:11

big medium format into an actual panorama, and

1:52:13

that would have even more megapixels in it.
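Rough arithmetic backs that up. Assuming, purely for illustration, five overlapping GFX 100S frames swept across the horizon with 30% overlap between neighbors (the frame count and overlap are made-up values, not anything Fuji specifies), the stitched result dwarfs any single exposure:

```python
# Hypothetical sweep: the frame size is the GFX 100S's published resolution;
# the frame count and 30% overlap are illustrative assumptions.
frame_w, frame_h = 11648, 8736
frames = 5
overlap = 0.30  # fraction of each frame shared with its neighbor

# Each extra frame adds only its non-overlapping width to the panorama.
pano_w = frame_w + (frames - 1) * frame_w * (1 - overlap)
megapixels = pano_w * frame_h / 1e6
print(f"~{pano_w:.0f} px wide, ~{megapixels:.0f} MP total")
```

Under those assumptions the panorama comes out at nearly 400 megapixels, several times the camera's native 102 MP frame.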

1:52:15

Yeah, I think that's the way to go,

1:52:18

and I think ultimately this is

1:52:20

probably the kind of thing that would be done

1:52:22

in software. I would assume like Lightroom or something

1:52:25

probably has a feature like this where you can

1:52:27

create, where you can stitch together like multiple exposures

1:52:29

into one big panorama. Sometimes they'll do, some cameras

1:52:31

will do it in the camera. Some cameras, I

1:52:33

think Sony would like force you to use some

1:52:36

janky third party Sony app to do it, and

1:52:38

some of them will let you export the exposures

1:52:40

and do it in your own app, but it's

1:52:43

per camera brand. It's another area where if Apple did it,

1:52:45

it would be much, you know, Apple did do it, and

1:52:47

it's way simpler. What do you do? You

1:52:49

just move the phone and it just does it, but lots

1:52:51

of camera companies do have some way to do this. I'm

1:52:53

just not familiar with Fuji's solution. I would

1:52:55

assume that it would definitely not involve the

1:52:58

camera because the camera... It has nothing in

1:53:00

it, yeah. Well, I mean, the camera has

1:53:02

a lot in it, but like it's already

1:53:04

very sluggish to capture and process these giant

1:53:06

images of this giant sensor. There is no

1:53:08

way it has the power. Compared

1:53:10

to the iPhone in terms of how much computing

1:53:13

is in there versus how much computing is in an

1:53:15

iPhone. Oh yeah, but even, I mean, geez, just

1:53:17

imagine like how much memory it must take to

1:53:19

stitch together photos from that sensor. I

1:53:22

mean, because the raw... I don't

1:53:24

have the number off hand, but I think the raws are like 200 megs each.
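That ballpark checks out: the GFX 100S produces 11648 × 8736 frames, and at 16 bits per photosite an uncompressed raw works out to roughly 200 MB (actual RAF files vary with compression, so this is a simplifying estimate):

```python
# Raw size estimate for the GFX 100S. The 11648x8736 resolution is the
# camera's published spec; "uncompressed 16-bit" is a simplifying assumption.
width, height = 11648, 8736
bits_per_photosite = 16

pixels = width * height
raw_megabytes = pixels * bits_per_photosite / 8 / 1e6
print(f"{pixels / 1e6:.0f} MP -> ~{raw_megabytes:.0f} MB uncompressed")
```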

1:53:29

There's a lot of data being used by,

1:53:31

you know, being generated by the sensor. It's

1:53:34

ridiculous like how many pixels you have

1:53:36

to process. So I would expect this

1:53:38

to be, you know, no small feat

1:53:41

for software and hardware to stitch this

1:53:43

together. But I think it's

1:53:45

interesting. But ultimately, even

1:53:47

with the amazing, you know, hardware

1:53:49

of that camera, it

1:53:51

is not suitable to do that job

1:53:53

of stitching together for panoramas for the

1:53:55

Vision Pro. What you ultimately want for

1:53:57

the Vision Pro panoramas is just

1:54:00

better iPhone cameras and better iPhone processing. That's

1:54:02

it. Because the iPhone is doing

1:54:04

the capture in ways that no camera

1:54:06

will ever do in terms of being

1:54:08

able to stitch things together so perfectly.

1:54:10

So easily, maybe with imperfect input,

1:54:13

maybe in varying light levels across the

1:54:15

frame. If you happen to sweep across

1:54:17

the sun, how do you deal with

1:54:19

that? With the camera, you can fix

1:54:21

the parameters. But then you've got

1:54:23

to kind of expose for the sun and make everything else

1:54:25

darker. It's just this whole thing. The iPhone

1:54:28

takes care of so much of that for us

1:54:31

and makes it so easy. And then

1:54:33

once you're in the Vision Pro looking at it,

1:54:35

you don't have to worry about did I properly

1:54:37

match the perspective

1:54:41

with how I process this photo so it

1:54:43

will display correctly in the Vision Pro. Doesn't

1:54:45

matter. When you do it with an iPhone,

1:54:47

it's always properly matched. It always handles that

1:54:49

for you. So ultimately, I think

1:54:52

we're still going to just be limited by, in practice,

1:54:54

by what the iPhone can do for

1:54:58

the sharpness and resolution and possibly depth of

1:55:00

what we're looking at in the Vision Pro.

1:55:03

Surprised you didn't get one of those white monoliths in

1:55:05

the Least You with

1:55:53

some fancy cameras. Those are probably

1:55:55

bigger and scarier than the discreet

1:55:58

little white pillars with two little

1:56:00

black dots on them. And maybe

1:56:02

they're more like the

1:56:04

cameras we were talking about when we talked about

1:56:06

3D movies on the last episode where it's two

1:56:09

gigantic expensive film cameras arranged in a

1:56:12

really weird way through a prism or

1:56:14

a beam splitter so that they can

1:56:16

both get the perspective they need. You

1:56:19

know, because the cameras are so big, they can't get

1:56:21

that close to each other. But either way, I find

1:56:24

that content much more compelling than I would even an

1:56:26

infinite resolution panorama because the panorama still just

1:56:29

looks to me like a giant painting that

1:56:31

I'm looking at as opposed to actually being

1:56:33

in the mountains, you know? Oh

1:56:35

yeah, totally. The panoramas in the Vision

1:56:37

Pro, to me. Yeah, it looks like I'm

1:56:39

in a planetarium. Like, okay, there's a big

1:56:42

static image. Okay. I got it.

1:56:44

Planetarium is usually better than that. But you know what I mean.

1:56:46

It looks like you're in a dome and it's painted on. That's

1:56:50

great for certain things, but it's

1:56:52

nothing like a 3D environment. So

1:56:55

Canon just released, they

1:56:57

kind of released this double

1:56:59

eyed lens intended for capturing

1:57:01

VR180 format, which is what

1:57:03

the highlining thing and, was it the

1:57:05

shark? That's what VR180? The rhinoceros. Yeah. Anyway,

1:57:08

so Canon just released a lens for that

1:57:10

and they have a special mode with one

1:57:12

of their highest end new mirrorless cameras. I

1:57:15

believe it's the R5C that

1:57:17

can do it with the special lens

1:57:19

and the special subscription software that you

1:57:22

need from Canon to stitch the images

1:57:24

together. Wait, what is it

1:57:26

doing for you? Panoramas? You mean? No, I mean

1:57:28

3D video. Sorry, I switched gears. But from a

1:57:30

single camera with a single sensor or from two

1:57:32

cameras? Yes, single camera, single sensor. They just released

1:57:34

a lens that has two eyes on it and

1:57:37

I guess it... And they split up a sensor,

1:57:39

like one gets the right half, one gets the

1:57:41

left half? Yeah, but somehow they're doing 8K video

1:57:43

per eye at 60 frames per second. But I

1:57:45

haven't had a chance to look too much into

1:57:47

that yet. But anyway, so if I was going

1:57:49

to actually start creating like 3D content,

1:57:53

I would probably look at that. But what

1:57:55

I learned from this is like, I

1:57:58

love this camera for lots of other reasons. But

1:58:00

it is not what

1:58:02

I need if I want to create 3D content.

1:58:04

That's kind of a separate beast. Yeah,

1:58:07

I do wonder if you could get away with

1:58:09

just way worse cameras, but two of them. Like,

1:58:11

the 3D effect will hide a lot of sins,

1:58:14

because you're just so wowed that it

1:58:16

looks 3D. As long as you don't move your head too much,

1:58:18

right? But yeah,

1:58:20

cuz, like, I keep thinking of those, those

1:58:23

camera stands in the studio. They didn't look

1:58:25

that big. There's no way they're close

1:58:27

to big enough to even house a single one

1:58:30

of the 8K cameras that Apple's recording

1:58:32

the Major League Soccer games with.

1:58:34

So they're just, they're just, not that they're like webcams,

1:58:36

but they looked way smaller

1:58:38

than you would think, given that there are

1:58:40

two lenses and presumably two cameras inside these little

1:58:42

pillars. They just kind of look like posts. And

1:58:45

I thought those looked pretty good. But maybe I was just

1:58:47

fooled by the fact that they were 3D and I didn't

1:58:50

notice how pixelated everything was. No,

1:58:52

the 3D stuff I think was pretty good, both,

1:58:54

you know, the immersive and just regular 3D. I

1:58:56

thought all of it was very, very well

1:58:58

done. And I want more

1:59:01

of it. I was just listening to somebody talk

1:59:03

about this, I can't remember who it was, but,

1:59:05

oh, me? No, no, I mean it. Why, I

1:59:07

thought it was somebody else that was just saying it,

1:59:09

or maybe it was Dithering, I don't remember. But you

1:59:11

know, the 3D, the immersive stuff, I shouldn't

1:59:13

say 3D, the immersive stuff, was

1:59:15

so, so good. Or, I

1:59:17

say 'was' as though it's past tense. I mean,

1:59:19

it's still there. It's just, I've experienced it once,

1:59:21

in videos you've seen. Exactly. But

1:59:24

the immersive stuff is so

1:59:26

incredibly, incredibly, incredibly cool, and

1:59:29

I just want more of it I want all

1:59:31

of it I want all of my stuff to

1:59:33

be immersive and I know that's never gonna happen

1:59:35

But I want it and I hope Apple really

1:59:38

does just hammer on the gas

1:59:40

in order to, you know, get more

1:59:42

of this and we heard about the MLS

1:59:46

stuff and we are hearing rumors

1:59:48

about the slam dunk contest, but

1:59:50

I want all the immersive video

1:59:52

I want all of it, please and thank you. They

1:59:54

should commission, like, someone to do, like, nature documentaries,

1:59:57

you know, kind of like they did when

1:59:59

4K first came out. You saw a

2:00:01

lot of Planet Earth-type things. Like, oh, do you have

2:00:03

a new 4K TV? Watch

2:00:05

this 4K content. But the problem is they

2:00:07

sold way more 4K televisions much quicker than

2:00:09

Apple is going to sell these headsets. So

2:00:12

it might be a lot. And Apple has

2:00:14

the ability to bootstrap this because they do

2:00:16

have the rights to some sports franchises. So

2:00:18

they're doing the Major League Soccer thing. And

2:00:20

they do have a studio that makes television

2:00:22

shows and movies. And it's not going to

2:00:24

be economical. And they're going to lose money

2:00:27

on it. But if you want to solve

2:00:29

the chicken-and-egg problem and you can't

2:00:31

convince anyone else to make content for

2:00:33

the 200,000 of your closest friends with

2:00:35

their headsets, Apple, you just pay to

2:00:38

make it yourself. And the good thing is if you make

2:00:40

evergreen content like a Planet Earth-style nature thing, that's

2:00:42

not going to age that badly. In five years when

2:00:45

more people have these headsets, they'll still want to watch

2:00:47

that Planet Earth thing. Hopefully planet Earth won't have changed

2:00:49

that much by then. It's

2:00:51

evergreen. And sports aren't evergreen, unfortunately.

2:00:53

And hey, if you catch a particularly dramatic

2:00:55

game or something, that might make some evergreen

2:00:58

content. And if not, there's always more

2:01:00

sports. So Apple needs to practice recording it

2:01:02

in a format that looks good in their headsets

2:01:04

and just keep doing that going forward.
