Weird Forking Scenario

Released Thursday, 15th February 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00

Are you trying to do like a Vision Pro layout

0:02

or what? Just don't worry about it.

0:04

I needed John to be here. I waited longer

0:06

than normal expecting Captain Late to be here and

0:08

I'm wrong. Oh my god, it's you. Wait,

0:12

what is behind me? I don't even

0:14

know what that background is. Maybe, does Zoom

0:16

add that or does Apple add that? I

0:18

think Zoom's doing it. I don't even know if

0:21

there's a way to change what's behind me. So

0:24

I am currently using AirPods. I am recording my real

0:26

mic and I will change as soon as I have

0:28

John freak out about how ridiculous I

0:30

look and then I will turn all this off and rejoin on

0:32

my computer. Okay. But

0:35

it was worth it for the fun of it. Oh look, there's

0:37

my, there's-ish my hand? Nope, nope, there it is. There it is.

0:40

You do the trademark Casey thumbs up. I mean, the problem

0:42

is your smile doesn't get big enough. What

0:46

are you gonna do? Ain't nobody perfect. Hi John. Hello. What's

0:49

going on? Last night,

0:51

I did a podcast and I

0:53

did my normal Zoom recording. Hello, scary face.

0:56

And uh- That's all you get.

0:58

It didn't record anything. The

1:01

thing listeners that you have to learn about

1:04

trying to surprise or shock John

1:06

Siracusa, you will never

1:08

get the reaction you want. We

1:10

know this from being friends with John for what, 13

1:13

years or whatever it's been? We know this.

1:15

A long time, yeah. But whatever

1:17

reaction you want to get out of John- You

1:19

will not receive it. Like

1:21

I believe you got what, about one and a

1:23

half words of a reaction? As

1:26

I said, I've seen a lot of personas now. We

1:28

hadn't seen Casey's persona. Anyway, that's

1:31

not to say that I'm not ever surprised about

1:33

things. Marco was very carefully trying to say you

1:35

won't get the reaction you expect. So maybe I'll

1:37

be way more surprised than you expected or way

1:39

less. There you go. We

1:43

have some follow-up. Apparently John does not

1:45

know how to do calculations. I presume

1:47

mental calculations or perhaps arithmetic beforehand. But

1:50

one way or another, you screwed up

1:52

your PPD. So tell

1:54

me what PPD is and how'd you screw it up, sir? The

1:57

problem wasn't with calculations because I wasn't doing these calculations.

2:00

It was with the little PPD calculator,

2:02

which is pixels per degree. We linked

2:04

the calculator in last episode's show notes

2:06

and it had a bunch of sliders and

2:08

text fields. Apparently, I messed up

2:10

one of the sliders or text fields when I

2:12

was doing the iPad Pro PPD because last episode

2:14

I said it was 28 PPD. If

2:17

I have my iPad, my 11-inch iPad

2:19

Pro sitting on a pillow on my

2:21

lap in the bed, measure the distance,

2:24

size of screen, resolution, enter it all in,

2:26

and I got 28 PPD, I had one

2:28

of the sliders just incorrectly set. The

2:31

correct number is 67 PPD. So

2:34

that kind of changes things because last episode I

2:36

was like, wow, the Vision Pro is 34 PPD,

2:38

which is pretty good for a headset. And

2:41

the iPad that I watch my TV shows on is only 28,

2:44

and that's less than the Vision Pro. Turns out,

2:46

not so much. Turns out, the Vision Pro is 34-ish, iPad Pro

2:48

11-inch, 67, 4K TV, 5 feet away, 76, Pro Display

2:54

XDR, 2 feet, 100.
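
For the curious, the arithmetic behind a pixels-per-degree calculator like the one linked is short. Here is a minimal sketch, assuming a flat screen viewed head-on and measuring along the horizontal axis; the function and parameter names are illustrative, not the calculator's.

```swift
import Foundation

// Minimal pixels-per-degree (PPD) sketch: the screen subtends an angle
// of 2 * atan(width / (2 * distance)); divide the pixel count by it.
func pixelsPerDegree(horizontalPixels: Double,
                     screenWidthInches: Double,
                     viewingDistanceInches: Double) -> Double {
    let halfAngleRadians = atan((screenWidthInches / 2) / viewingDistanceInches)
    let totalDegrees = 2 * halfAngleRadians * 180 / Double.pi
    return horizontalPixels / totalDegrees
}

// Rough check against the numbers quoted above: a 65-inch 4K TV
// (3840 px across, roughly 56.7 inches wide) viewed from 5 feet.
print(pixelsPerDegree(horizontalPixels: 3840,
                      screenWidthInches: 56.7,
                      viewingDistanceInches: 60))
// ≈ 76, in line with the "4K TV, 5 feet away, 76" figure.
```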

2:56

So it's making me feel a little bit better

2:58

about my future OLED iPad Pro purchase. Tell

3:01

me about Field of View and eye movement, if you don't

3:03

mind. This was something that occurred

3:06

to me after recording last week's episode.

3:08

We were talking about the Field of

3:10

View of the Vision Pro, and how

3:12

it compares to the headsets, and so on. And

3:15

also, Marco mentioned, like, sometimes

3:18

when he was looking at targets to like

3:20

the left and right side of the Field of

3:22

View, it was having trouble with the eye tracking,

3:24

whatever, at the extremities. And

3:26

it occurred to me that those two things combined,

3:28

that when we talk about Field

3:31

of View and how narrow it is in the headset,

3:33

as opposed to like, you know, just your eyes out

3:35

in the world or even some other headsets, it's

3:38

compounded by the fact that when you do what

3:40

Marco was describing, which is like, keep your head

3:42

still, but turn your eyes to like, you know,

3:44

in the Vision Pro, target button that's in the

3:46

upper left corner of your Field of View. When

3:50

you do that, you know, it kind of

3:52

makes sense. It's at the extremes of the edges

3:54

of the screen. You're looking through the

3:56

edges of the lenses that are in the Vision Pro.

3:59

I can understand how... eye tracking might be more difficult

4:01

there. But when you do that in real life,

4:03

and you have your head staying still, and you

4:05

move your eyes to the left to see something

4:07

in the upper left corner of your monitor without

4:09

moving your head, guess what? Your

4:12

field of view moves with your eyeballs, but

4:14

this does not happen in the Vision Pro.

4:18

So you see what I'm saying? When you shift

4:20

your eyes to the left, your whole field of

4:22

view is always centered on where your eyes are

4:24

pointed. But when you shift your eyes to the

4:26

left in Vision Pro, the field of view does

4:28

not move with your eyeballs. If

4:30

you're not moving your head, the screens are

4:32

in the same place. And

4:34

that's obvious if you think

4:37

about it, but it really does make the narrow

4:39

field of view feel more

4:41

narrow because as you shift your eyes, the field of

4:43

view doesn't shift with them. And I'm not saying it's

4:45

easy to do that. What are they gonna do,

4:47

have little motorized screens that travel around? That would

4:49

be very difficult to do, but it does make

4:52

the field of view feel even narrower. And it is

4:54

also why a lot of people who have used Vision

4:56

Pro for a longer time now get

4:59

into the habit of, or suggest that other people

5:01

get into the habit of moving their head more,

5:04

both to avoid Marco's issue, which is like, you know, the eye

5:06

tracking seems like it's the best kind of around the middle-ish of

5:08

the screen. And also because if

5:11

you do want to, for example, take in a

5:13

window that you have floating to your left, merely

5:15

glancing your eyes over there is not going to

5:17

reveal any more of that window. It would in

5:19

real life because the center of your field of

5:21

view would shift, but in Vision Pro, you actually

5:23

have to turn your head to move

5:25

the little screens so they know to change what they're

5:27

displaying. That's a good point. And I also wanted

5:30

to bring up, I was talking to somebody about this, and

5:32

I think I know who it was, but they shall remain

5:34

nameless, but I was talking

5:36

to somebody about this, and I believe it

5:38

was that last episode that you seemed very

5:40

disgruntled about the idea of moving your head

5:43

really in any direction, but I think particularly

5:45

laterally, in order to use

5:47

my fantasy, which is actually kind of

5:49

reality, magical world where

5:51

you've got panels of windows all around

5:53

you, and you seemed, don't

5:55

let me put words in your mouth, if

5:58

I'm mischaracterizing what you said, I apologize. It

6:00

seemed like, John, you were very perturbed about the idea of moving your

6:02

head a lot. And I was thinking about this and talking to somebody

6:05

that we know. Do you not move

6:07

your head when you're looking at that humongous

6:09

XDR? Like, do you really keep your head

6:11

dead center, locked in straight ahead? Because I've

6:13

got three 5K displays here because I'm a

6:16

weirdo and because Marco sent me one.

6:19

I'm pitching my head laterally constantly,

6:21

like all the time. Granted, not up and down,

6:23

only

6:25

laterally. But I am always moving my head. Do you

6:28

not do that with your XDR? I mean, I'm sure

6:30

I do move my head, but considerably less. I'm definitely

6:32

a one monitor in front of me kind of person

6:34

because if I feel like, oh, I

6:36

have to look over there, which I guess there's some

6:38

minimum amount of head moving that makes me feel like

6:40

that. What I want to feel like is that everything

6:42

is there in front of me. Now, obviously, the bigger

6:44

the monitor gets, it's not all in front of you.

6:46

And the part of your vision that is in focus

6:48

is very small anyway. But I can

6:51

flick my eyes over to various... I

6:53

feel like I can take in my whole XDR. Obviously, I can't

6:55

because of the way human vision works. But I feel like I

6:57

can very easily flick my eyes to any corner. And

7:00

when I do that, I'm sure my head moves, but

7:02

doesn't move a lot. Whereas if you have

7:04

a second monitor, you have to make a choice

7:06

one way or the other,

7:08

which is like, do I make it so the

7:10

seam between the two monitors is directly in front of me?

7:12

Or do I put one monitor directly in front of me

7:14

and then one monitor to the side? And if you have

7:17

a big monitor in front of you, the monitor to the

7:19

side is in the head-turning zone because you really have to

7:21

rotate. Especially if you want to see the

7:23

upper left corner of the monitor that is to the

7:25

left of the large monitor that's directly in front of

7:27

you, you're turning your head a lot and you'll feel

7:29

it. So I prefer obviously the one

7:32

big monitor. I don't know what the limit is. It's

7:35

not 32 inch, right? I'll tell you when I

7:37

get to it. I bet if I put my 65 inch

7:39

television on my desk, that would be past the limit and

7:41

I'd be turning my head just to look at the Apple

7:43

menu. But so far, from

7:45

anything that Apple has shipped that I've used with my computer,

7:48

32 inch fits within my field of view.

7:50

So when people are trying to compare

7:53

the Vision Pro and how it

7:55

might compare to regular computer

7:58

screens like this or how you might be able to... use

8:00

it as a virtual computer screen. The

8:03

sharpness and density, like John was saying

8:05

a minute ago about the PPD, the

8:07

density of a good computer monitor

8:09

is just way higher and it's

8:12

way sharper than the virtualized windows

8:14

that you create within the Vision

8:16

Pro environment. And so what

8:18

it feels like when you're making Vision

8:20

Pro windows, everything feels like

8:22

it is larger and further away usually than

8:24

how you would normally set up a computer

8:26

monitor. You can pull the windows virtually closer

8:28

to you in Vision Pro and you can

8:30

shrink them down so they match the size

8:32

and the position and the scale, but

8:35

the resolution is just not there. The displays

8:37

in the Vision Pro are not yet high

8:39

resolution enough to be able to simulate the

8:41

same density we get from computer displays that

8:43

are right there in front of us in

8:45

the real world. So you

8:48

kind of can't directly compare. So if

8:50

you wanted, for instance, if you wanted

8:52

to have the resolution of the 32

8:54

inch Pro Display XDR be

8:56

reasonably usable in the Vision Pro,

8:58

you would have to make the

9:01

window much larger than the

9:03

XDR actually appears in real life. And

9:05

you probably then either push it back from you a

9:08

little bit further in the distance, which then of course

9:10

shrinks resolution kind of even further because there's only

9:13

so many pixels on the physical displays, so that's

9:15

not really gaining any resolution, it's just changing the

9:17

perspective. Or you bring it really

9:19

close to you, in which case it's really

9:21

big and you have to turn your head

9:23

more. If you actually want to minimize head

9:25

turning as you're using a computer display, the

9:27

best way to do that is not in the

9:30

Vision Pro, it's by using a regular high

9:32

DPI external monitor. Yeah, you know, I

9:34

think another, like not bone

9:36

to pick, that's very antagonistic, but another

9:38

thing that I was reflecting on after

9:40

our last episode that I

9:43

just don't think I agree with is that, you know,

9:45

the Mac Virtual Display, I think

9:47

it's perfectly fine. Like I don't really argue anything you

9:49

just said, Marco, but for my eyes, which I will

9:51

be the first to tell you, my eyes are not

9:53

great. I think I have like 20/25 or 20/30 or

9:55

something like that

9:58

vision with my contacts in, which is pretty good.

10:01

which is basically the only way I live, but

10:03

they're not perfect. So consider your source here. When

10:05

I say that I think the Mac Virtual Display

10:07

is pretty darn crisp, and I think part of

10:09

that may be because I can blow it up

10:12

to be hilariously large if I so desire. And

10:15

again, I'm not arguing that the

10:17

effective resolution isn't lower than an XDR or

10:19

even a 5K machine, but I don't know.

10:21

I've used Mac Virtual Display in the Vision

10:23

Pro for a couple hours at a time,

10:25

and I didn't find it off-putting

10:27

or frustrating at all. It was perfectly

10:30

serviceable, if not an

10:32

improvement in terms of my ability

10:35

to get things done over my

10:37

14-inch display, my MacBook Pro's

10:39

onboard display. It's not an improvement in

10:41

terms of fidelity, to your point, but it certainly

10:43

felt like an improvement in terms of my ability

10:45

to get things done because I had so much

10:47

more real estate than my little 14-inch

10:50

MacBook Pro has. So I

10:53

don't know how to phrase this concisely. I'm not trying

10:55

to say you're wrong by any means, but I don't

10:57

know. My experience was a little bit different, I guess,

10:59

is the best I can say. Well, but we're actually

11:01

talking about two different things, and I think this is an

11:03

important distinction. What you are saying

11:05

is you can use it as a Mac Virtual

11:07

Display, and it works perfectly fine. You

11:10

even said the word serviceable. It works. You

11:12

can do it. It'll improve your productivity over a built-in

11:14

Mac screen, maybe. You are correct. And

11:17

I've spent a little more time with it since last week's episode.

11:20

I've tried different light seal shapes and foam

11:23

cushion shapes, and I've tried with and

11:25

without the reading glasses. And

11:27

there was actually a tip. Somebody put this on Reddit,

11:29

and people linked us to it. In

11:32

the IPD adjustment thing, like when you put it on and it has

11:34

you hold on to the crown, and it goes, vroom, and it moves

11:36

the things in. In that

11:38

screen, if you tap the other button,

11:40

like the capture button, it can scoot

11:43

them manually back out. So

11:45

as far as I can tell, this is a

11:47

single direction adjustment. But

11:49

it does allow some degree of manual IPD

11:51

changes. And this person on Reddit had said

11:53

that this made a huge difference for them

11:56

in how sharp and clear it was to

11:58

use, and in eye strain. Of course,

12:00

I gotta try this. I tried a little

12:02

bit, I tried a lot. It didn't really

12:04

make any noticeable difference for me. I

12:07

have gone back to not using the

12:09

Zeiss Reader inserts, to just using it straight like

12:11

the way I had it in the lab. Sorry, I

12:13

went to a lab. That's all I can say

12:15

about that. But I still find the Vision

12:17

Pro sharp enough that I'm

12:19

pretty sure I'm not having eye problems by

12:21

not seeing it sharper. But

12:24

the Mac screen sharing is still not

12:27

nearly as sharp as a real Mac monitor. Now, this

12:30

is a totally separate discussion from whether

12:33

you can use it and whether it has utility

12:35

and whether some people can be totally fine using

12:37

it for many hours at a time. That's a

12:40

separate discussion. My claim is

12:42

that it is not as sharp as a

12:44

real Mac monitor. The kind of

12:46

sizing and positioning and focus distance

12:48

issues make it less practical for

12:50

me. And if given the choice,

12:52

I would take a regular Mac monitor any

12:54

day. I also tried – people have reported,

12:57

if you have the developer strap, which we'll get to in a second, it

13:00

provides, I guess, a faster connection to the Mac

13:02

that it's connected to. And that apparently, Mac screen

13:04

sharing works better with the developer strap. I

13:07

tried it and honestly, I noticed no difference. So

13:09

I don't know if that's a thing or not. I

13:11

could tell no difference. And finally,

13:14

I tried editing this podcast on it last

13:16

week. And while it was

13:18

interesting to have that Mac screen space, I

13:21

didn't notice this when just doing basic email

13:23

and web browser and stuff like that kind

13:25

of productivity. But when I

13:27

started editing the podcast in Logic, I

13:29

immediately noticed lag. Just

13:31

like moving the mouse around because I'm doing lots

13:33

of fast mouse movements and fast keyboard and everything

13:35

when I'm using stuff in Logic. So the

13:38

lag was actually kind of a deal breaker for me

13:40

in addition to the fact that it's just – Almost

13:42

like you're using screen sharing. Yeah, it's also very awkward

13:44

trying to wear studio-sized headphones while using

13:46

the Vision Pro. That also proved to

13:48

be a problem. But there are

13:51

multiple issues with the Mac screen sharing that make it

13:53

noticeably less good than using an

13:55

actual Mac screen. And

13:58

some of those are probably just inherent to

14:00

the technology. Some of those will probably be fixed

14:02

in the future or improved in the future with

14:04

higher resolution screens. So if you're looking for something

14:06

that's going to directly replace a

14:08

Mac screen, this won't do that, but

14:11

this can serve as a Mac screen

14:13

with some compromises. And for many people,

14:15

that will be totally fine and worth

14:17

the tradeoff. But it's not

14:19

a direct replacement. I think that's

14:22

the key, the way you ended that. That,

14:24

yeah, it isn't a direct replacement. I guess

14:26

that's true. However, it is more

14:29

than serviceable, and I know I said that

14:31

before, but it to me has been pretty,

14:33

pretty good. Now, I've occasionally noticed lag. I've

14:35

actually noticed more pointer lag, where I think

14:37

it's a little confused if I'm trying to

14:39

control a Vision OS window or the Mac

14:41

window. I've noticed a little bit more pointer

14:43

lag than I've noticed display lag, but I'm

14:45

not editing stuff in Logic or whatever. But

14:47

I think your point is fair, that it

14:49

is not better than having a dedicated monitor,

14:52

but if you are ever somewhere other than

14:54

your desk and you would like to have

14:56

more screen real estate, I think that this

14:58

is more than just acceptable. I think it's

15:00

pretty darn good. Again, consider your source. My

15:02

eyes are not great. So it very well

15:05

could be that if I had Marco's eyes

15:07

that I would look at this and go,

15:09

ugh, this ain't great at all. But to

15:11

my eyes, which in general, as

15:13

much as I'm making fun of myself, generally speaking,

15:15

in day to day use of my

15:18

eyes, I can't think of a better way to

15:20

phrase this, I don't feel like things are generally

15:22

blurry. But I want

15:24

to make it plain that my eyes are

15:26

not stupendous. And so I think that the

15:28

fidelity is fine, the crispness is fine. I

15:31

think it works reasonably well. Again, I

15:34

mean lag could be an issue if you're editing in Logic, but for

15:36

the sorts of things that I do, I think the lag is fine.

15:38

I think this is really, really good. And I

15:40

was debating if I wanted to bring this up,

15:43

but I might as well do so. I actually

15:45

did take the Vision Pro to a local library.

15:47

I did this on Monday

15:49

morning. I booked

15:51

a little conference room sort of thing,

15:53

which did have a glass wall behind me, but

15:56

I booked a conference room for a couple hours, which was like

15:58

a two-person study room, I guess I should call

16:00

it. And I had my back

16:02

to the outside wall. So the only thing

16:04

that anyone would be able to see is

16:06

like the weird headband behind me. And

16:09

I did work. I wrote code for a

16:11

couple of hours. And it was

16:14

great. Like it was absolutely great. I then

16:16

got booted from my conference room because my

16:18

time was up and I needed to spend

16:20

a little time in like the regular desks

16:22

and chairs and cubicles area. And I did

16:24

not have the gumption to put the Vision

16:26

Pro on at that point. But for

16:29

the time that I was somewhat

16:31

secluded and not completely conspicuous, I

16:33

thought it was wonderful. It was so much

16:36

better than my rinky dink, like what is

16:38

it, like 12 or 13 inch monitor that

16:40

I bring with me as like a second

16:42

display. It was so much better than that.

16:44

So again, I'm not trying to say that

16:46

anything you've said, Marco, was incorrect, wrong or

16:48

inaccurate. All I'm trying to say is for

16:50

me and my uses, it's been great. It's

16:52

been really, really good. Yeah. And

16:54

I think what you said at the beginning of that

16:56

is pretty important. You were talking about like,

16:58

you know, your eye quality, for the

17:00

lack of a better way to put it. If

17:03

you are accustomed to not

17:06

that sharp of vision, you

17:08

might not see the difference. And that's not an

17:10

insult. Like that's just

17:12

the reality. If you are accustomed to

17:14

sharp vision and you're accustomed to the

17:16

sharpness of Mac screens, when

17:19

you see the virtual screen, like one

17:21

of the effects I get is I

17:23

almost feel like I'm getting eye strain

17:25

because my eyes are trying to focus

17:27

harder to resolve the detail they expect

17:29

to be there, but that isn't actually

17:31

there because I'm accustomed to seeing

17:33

a certain level of sharpness on the physical

17:35

Mac displays. And so when I'm using something

17:37

in Vision OS, probably in the Mac

17:39

screen sharing mode that is a little bit

17:42

soft, my eyes think they're not

17:44

focusing correctly and they try harder to focus on

17:46

it. Similar to what I was describing last week

17:48

about like when I try to focus on stuff

17:50

that's in the soft depth of field

17:52

areas of a 3D movie. Like I'm thinking I

17:54

should be able to focus on this and it

17:56

kind of hurts my eyes to look at a

17:58

defocused area of the video that I think I

18:00

should be able to focus on. So it's that same kind

18:02

of effect when looking at the Mac screen. If

18:05

you have pretty sharp vision

18:07

in the physical world, that,

18:09

I think, makes it more noticeable to use the

18:11

Mac screen this way and to see its flaws

18:13

and to potentially maybe cause some eye strain. Marco,

18:16

about your lag when

18:18

editing the podcast, you just have to wait

18:20

for the Vision OS native version of Logic,

18:22

which based on the iPad schedule should be

18:24

here in 10 short years. So hang in

18:27

there. I mean, the screens are probably a

18:29

higher resolution by now. Yeah. Just to be

18:31

clear for people, like, oh, he's saying this lag is screen

18:33

sharing with the Mac. He's using a Mac program to edit

18:35

the podcast, so he's screen sharing with his Mac. Presumably, if

18:37

it was a native Vision OS version of the thing, there

18:39

would be considerably less lag if the app would actually be

18:41

running on the Vision Pro, which has an M2 and it

18:43

would be fine. Oh, I would expect no lag if it

18:45

was native. Right. Related to what you

18:47

were all saying about screens, this is also one final

18:50

note on the whole field of view and everything. And

18:52

you were kind of both touching on it. Even

18:55

before Vision Pro came out, there were lots of people speculating

18:57

about, well, you

19:00

have such and such size monitor on your laptop

19:03

or on your desk, but once you get the

19:05

Vision Pro, imagine you could make it 100 feet

19:08

tall in front of you. We heard

19:10

a lot of that both before the Vision Pro

19:12

was in anyone's hands and now after when people

19:14

have it. They still make statements like that. And

19:17

what both of you were touching on is the edges

19:19

of that. But it made me think about, what

19:22

does it mean to have a really big screen

19:24

in front of you? Obviously,

19:27

one aspect of it that we've discussed at length is,

19:29

OK, well, how many pixels can I see? Because when

19:31

you're doing stuff with Mac screen sharing

19:33

or something, what it comes down to is, look,

19:36

toolbars take up a certain number of pixels. And

19:39

if I want to see more stuff on my screen, I

19:41

need to have more pixels because I don't really care if

19:43

I can make something 100 feet if it's 640 by 480

19:46

pixels because there's just not enough information density there.

19:49

But it also got me thinking about things

19:51

like watching movies. Oh, you know,

19:53

well, you could watch movies on your little laptop

19:55

on a plane. But when I'm on the plane, I

19:58

can put a 20-foot screen in front

20:00

of me. And there are

20:02

a couple of aspects of what does it mean to

20:04

look at a big screen, especially when thinking about things

20:06

like movie screens. When you're in a movie theater, let's

20:08

say the screen is 100 feet diagonal or something. The

20:10

screens are really big. It's a really big movie theater.

20:12

It's not a dinky movie theater. It's a big movie

20:14

theater. That screen is really

20:16

big. How that manifests in

20:19

our viewing is, one, how much of your field of view does

20:21

it take up? And if you're in the front row, it takes

20:23

up all of your field of view because you can't even see

20:25

the whole screen without turning your head. And if you're in the

20:27

back row, it takes up less, but it takes

20:29

up a certain amount of your field of view. But

20:31

field of view is not the only thing

20:33

that makes a screen big. If it was,

20:35

we could take our phones and jam them

20:37

up to our eyeballs and be like, wow,

20:39

my phone screen is huge. It's taking up

20:41

my whole entire field of view

20:43

because it's touching the bridge of my nose,

20:45

right? Field of view is

20:48

not the only thing that determines

20:50

a, quote unquote, big

20:52

screen. The second thing is, how far

20:54

away is it from you? And

20:56

in the movie theater, if you're watching some

20:58

gigantic IMAX screen, that's hundreds of feet, right?

21:01

Hundreds of feet diagonal. It's just a massive

21:03

screen. It's multiple stories tall. It's

21:05

also not touching the bridge of your nose. It's

21:07

probably pretty far away because if it wasn't, you

21:09

wouldn't be able to see anything. Again, if you

21:11

were sitting in the front row and you craned

21:13

your neck and you can't even see the entire

21:15

screen, depending on how well the theater is laid

21:17

out. Inside Vision

21:20

Pro, many things conspire to make it

21:22

not a good match for anything that

21:24

I've just described. Obviously, the physical

21:26

reality is there are screens like less than an inch from

21:28

your eyeball or whatever, but that's not how it feels because

21:31

of the lenses.

21:33

So the first thing is field of view. We

21:35

know the field of view of the entire Vision

21:37

Pro is like 100 degrees. The ideal movie viewing

21:39

thing is like 40 degrees of field of

21:41

view, so you're fine. You should be able to get

21:43

something that has the same field of view as

21:45

the biggest movie theater screen you've ever seen where

21:47

you have a good seat in the theater. So

21:50

I feel like we're covered, especially for a static thing like field

21:52

of view. Pixels we already know, it's

21:55

not quite adequate to give the kind of fidelity

21:57

we expect for a Mac monitor, but it's not

21:59

awful either. Then there's distance. One

22:02

of the things that makes that 100 foot screen

22:04

feel like it's 100 feet is the fact that

22:06

it's really far away from you. That it fills

22:08

a lot of your field of view, but also

22:10

when you try to look at it, you have

22:12

to focus, I don't know, 50 feet away or

22:14

whatever, like you have to focus the distance from

22:16

the middle of the giant theater to the screen.

22:18

And that's never going to happen in the current

22:20

Vision Pro because every single thing in there is

22:22

1.3 meters from you.
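
As a worked example of what that fixed focal distance implies: a flat virtual screen that fills a given horizontal field of view at 1.3 meters must be 2 × 1.3 × tan(FOV/2) wide. This is a minimal sketch of that one line of trigonometry; the 1.3-meter figure is the one quoted here, and the function name is illustrative.

```swift
import Foundation

// Width (in meters) a flat virtual screen needs in order to fill a
// given horizontal field of view when focused at ~1.3 meters.
func virtualScreenWidthMeters(fovDegrees: Double,
                              focalDistanceMeters: Double = 1.3) -> Double {
    let halfAngleRadians = (fovDegrees / 2) * Double.pi / 180
    return 2 * focalDistanceMeters * tan(halfAngleRadians)
}

print(virtualScreenWidthMeters(fovDegrees: 40)) // ≈ 0.95 m, a "good seat" movie view
print(virtualScreenWidthMeters(fovDegrees: 90)) // ≈ 2.6 m, approaching the headset's FOV
```

So even a window that spans most of the headset's field of view is, as far as your eyes are concerned, only a couple of meters wide and a couple of meters away.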

22:25

So no matter how much of your field of

22:27

view you make the television screen, the

22:29

movie or whatever, no matter how big you make it, even

22:32

if you make it like I'm sitting in the front row

22:34

and I can't even see the whole screen and it's just

22:36

overwhelming me, it's still going to be 1.3

22:39

meters away as far as your eyeballs are concerned because

22:41

you'll be focusing 1.3 meters away to be able to

22:43

see what's on the screen. This

22:45

is not to say that it's bad or good

22:47

or indifferent. Sometimes having it 1.3 meters

22:49

away is probably better than having it 50 feet

22:52

away. But it explains

22:54

why when I was in there, I experimented

22:57

with like, can I make this video really big to

22:59

make it feel like I'm watching a big screen? And

23:01

I could make it big and I could make it

23:04

fill my field of view, but it

23:06

never felt like I was watching an IMAX screen. And

23:08

it's because the IMAX screen is not 1.3 meters

23:10

from my face. So I

23:12

don't know what the solution to this is, other than, you know, obviously

23:14

we talked about a headset that has

23:17

a variable focal distance or whatever. But keep

23:19

this in mind when you're thinking about what

23:22

you want out of a big screen experience. When

23:24

you're talking about the Mac, you probably want more

23:26

pixels, which this thing can't

23:28

really deliver comfortably for you. When

23:30

you're talking about a movie screen, if

23:32

you like the feeling of sitting in a giant movie theater, you're

23:34

not going to get that when your eyes are focusing 1.3 meters

23:36

away. But in

23:39

Casey's case, or if you're on an airplane or

23:41

whatever, you can definitely get a larger screen 1.3

23:44

meters away than you comfortably can in a

23:46

physical environment. Whether that means that you're not carrying

23:48

your XDR with you to the library to get the

23:50

view that Casey was getting, or bringing it

23:52

onto a plane or whatever. How

23:55

do we turn off a Vision Pro? I don't

23:57

even remember talking about this last episode. Was this

23:59

a point during my demo when the person was

24:01

twisting the power connector to reboot the Vision Pro.

24:03

Oh yeah yeah. So when the person did that

24:05

I teasingly said, oh you know Apple never puts

24:07

power buttons on their things because wouldn't that be

24:09

admitting they need a power button. And

24:11

then I suggested to the person when they were twisting

24:14

the little thing and taking it off, I'm like why

24:16

don't you just try holding down the crown

24:18

and the button at the same time. And they told

24:20

me no that's not how it works. And

24:23

then I had a

24:25

week of people sending me messages saying you

24:27

should have just told the person to hold down

24:29

these two buttons because that will show the little

24:32

shutdown slider that you know from iOS and

24:34

lets you turn it off. So anyway there is an

24:36

Apple support document explaining how you can turn off

24:38

the Vision Pro. I think it's just

24:41

called how to turn off the Vision Pro and

24:43

apparently you can do any of the following. Number

24:45

one, press and hold the top button and the

24:47

Digital Crown, and it'll show like the little shutdown

24:49

slider. Number two, go to Settings, General, Shut Down, and

24:51

then drag the slider. Number three, say dingus

24:54

turn off my Apple Vision Pro. And

24:56

finally, take off Apple Vision Pro, place it on a secure

24:59

surface like a table or desk, then disconnect the power

25:01

cable. It's amazing that they tell you that disconnecting the

25:03

power cable is one of the ways you can turn it

25:05

off. I mean that's true I guess but that's kind

25:07

of a let's say ungraceful shutdown

25:09

because all the other ones give software time to

25:11

do you know and a proper clean shutdown. Disconnecting

25:14

the power does not do that. It's just gonna

25:16

lose power. It's like, oh, it's the Apple TV reboot

25:18

procedure. Just yank out the power cord. Which is

25:20

kind of interesting too because like no other iOS

25:22

based device has ever had this right? Yeah just

25:25

TV. Just TVs are the ones. Apple TV, that

25:27

like I remember when I first got the Apple

25:29

TV, I was like, surely there's a support document

25:31

telling me how to like you know reboot it

25:33

or shut it down and it's like just yank

25:36

the cable. Although interestingly I understand that you grabbed

25:38

this from the Apple support document not trying to argue with

25:40

you but to my recollection when

25:42

you press and hold the capture button and Digital

25:44

Crown for a couple of seconds, then that

25:46

brings up a force quit menu. So maybe

25:49

you have to mash it down for even

25:51

longer. Yeah I think both UIs

25:53

are in the same thing. Why don't you just

25:55

try it? Alright hold on I gotta strap in

25:57

again hold on. Strap in. This

26:00

is not fun to do with headphones on. I

26:02

told you. Well, while Casey's doing that, I

26:04

will bring up another point, which is this document

26:06

says you can do all these things, but

26:08

I am suspicious about whether all

26:11

of them are equivalent. I

26:13

do believe that disconnecting the power will turn it

26:15

off, because there will be no more electricity, and those

26:17

capacitors will discharge eventually, and as far as we

26:19

know, there's no backup battery. So turning off, disconnecting

26:22

the power should turn the thing off.

26:25

Every other one of these things, I'm suspicious that

26:27

it's like in a deep sleep mode, and it's

26:29

not really off, you know what I mean? It

26:31

doesn't say that that's the case here, but the bottom

26:34

line is when power is still attached, I always, I

26:36

mean, Macs have done it for ages, I always wonder

26:38

what it's like. I'm mostly off, but I'm kind of

26:40

a little bit on, and occasionally I'll wake up and

26:42

check for new email and stuff, which is the thing

26:44

that Macs have done for ages, so

26:46

I give this a little bit of side eye.

26:48

Alright, real time follow up, real time follow up.

26:50

So I'm going to press down starting now, and

26:54

now I've got a force quit menu. And

26:56

I'm going to press down starting

26:58

now. Why

27:01

are we doing? No, no, no. Okay, there we got

27:03

slide to power off. So it was an additional one

27:05

to two seconds. I will say also like hot tip

27:07

about powering down the Vision Pro. However

27:09

you choose to do it, definitely power it down

27:11

if you're going to be not using it for

27:13

a while and it's not plugged in, because it

27:15

drains its own battery. If you just leave it

27:17

like on a countertop, not plugged in, it'll

27:20

be dead by the next morning. Mine

27:22

wasn't dead by the next morning, but it was

27:24

like half the battery had been used

27:26

up. It was like the AirPods Max, just like

27:28

the AirPods Max is downloading your photos from your

27:30

iPhoto library, so is your Vision Pro. AirPods Max

27:32

can't display them, it just likes to download them.

27:35

Photoanalysisd has to run. I

27:38

actually do think that's one of the things that melted

27:40

my battery, because I had done this like the second

27:42

day I had it, you know, I had left it

27:44

somewhere, I forget where, and it was plugged in, or

27:46

plugged into the battery pack, but the battery pack was

27:49

not plugged into anything. When I got to it the

27:51

next morning, it was at like 50% or something like

27:53

that. Yeah, it's downloading your messages, it's syncing your notes,

27:55

it's doing all the things. If you have a long,

27:57

long-suffering

27:59

Apple ID, let's say.

28:02

It's got a lot of data, and this thing's going

28:04

to try to download it, and yeah, it's going to

28:06

eat your battery. All right,

28:08

let's talk about the developer strap. We

28:11

all ordered, was it the day of release,

28:13

I believe, Marco? It was like the day

28:15

after. It was sometime soon afterwards, but you

28:18

could kind of tell, like, maybe they just

28:20

didn't want people to really talk about it,

28:22

because I think they kind of buried this

28:24

announcement. Yeah, I think I'm pretty sure it

28:26

was the day of release, but if not, like you

28:28

said, it was the next day. So

28:30

to recap, this is a $300 strap. So

28:36

it's the white pieces on the

28:38

Vision Pro that connect

28:41

the Vision Pro to the back strap,

28:43

and in certain cases, the top strap as

28:45

well. The light

28:47

seal or shield, the light shield

28:49

and the little light

28:51

shield cushion, those do not touch the strap, those

28:53

touch the Vision Pro itself, but the

28:55

back strap and top strap, if applicable,

28:58

connect to these white straps, and the white

29:00

straps also have the ear pods, audio pods,

29:02

whatever they're called. It's the right stick. That's

29:04

right. And so the right hand stick, if you

29:06

get the developer, the $300, did I mention $300? The

29:10

$300 developer strap. Are

29:13

you a little upset about the price, maybe? Like

29:16

I mean, honestly, I do to a degree,

29:18

I do get it, but whoa, golly. So

29:21

this is the $300 strap that in the

29:23

same spot that on the left hand side,

29:25

you plug in power to the Vision Pro,

29:27

it has a very similar design, like little

29:29

nubbin, if you will, and hanging

29:31

off of that nubbin is

29:33

a USB-C receptacle so that

29:36

you can plug USB-C in on

29:38

this and USB-C in on your computer,

29:40

and then you can do things like

29:42

have better screen sharing, allegedly. I mostly

29:44

agree with you, Marco. I haven't really

29:46

noticed a big difference on that. But

29:49

one way or another, you can have better screen

29:51

sharing, allegedly, and you can also do much easier,

29:53

faster, better, etc. development because you're not relying on

29:55

Wi-Fi. I got one of

29:57

these, I ordered it immediately because I was still worried

30:00

about inventory and things like

30:02

that. It turns out that was, I think, for

30:04

naught. But nevertheless, I ordered it immediately. It came

30:06

in on the Monday that I had left to go

30:08

to New York, so I didn't

30:10

get a chance to play with it until this past Monday,

30:12

when I brought it with me to the library knowing I

30:14

was going to be doing Vision Pro development, and I thought

30:16

to myself, you know what, I'm gonna leave this thing sealed,

30:18

and it was, it was actually in the shipping box at

30:21

this point, I'm gonna leave it sealed, and hopefully I won't

30:23

need it. You know, hopefully it won't be a big deal.

30:26

And I connected my Vision

30:28

Pro to my computer via Wi-Fi, and

30:31

it did the, I forget exactly what it's called,

30:33

Marco, you probably remember, but the like, downloading symbols

30:36

or preparing for development, whatever it is, dance. And

30:38

it's at 0%, at 2%, at 4%, at 6%. And

30:44

after literally like half an hour of

30:47

this, I immediately opened

30:49

the developer strap, the $300 developer strap, and

30:51

said, the hell with this, I'm

30:54

gonna have to open this thing up. And when

30:56

I opened it, I was under the impression

30:58

that this was a USB 2.0 device; the

31:01

only

31:03

thing it does is apparently a little bit

31:05

of magic with screen sharing, allegedly, and

31:08

it lets you do, you know, debugging and

31:10

whatnot via the cable, and apparently, John, that's

31:12

not right. So what's going on here? Yeah,

31:15

I think this is still just people speculating, but

31:17

they're pulling it up in the system information

31:19

app in macOS, and you can see it

31:21

listed under the Thunderbolt slash USB 4 bus.

31:25

You can see the Apple Vision Pro listed under there

31:27

once you connect it with an actual Thunderbolt cable.

31:29

This is leading people to believe that this thing

31:31

is Thunderbolt capable, even if none of the software

31:34

that we have now is taking advantage of it.

31:36

And one of the things that lends credence to that is

31:38

if you look at the standard little

31:40

white stick that plugs in there, and you take it out,

31:43

and you see the widest Lightning connector Apple has ever made,

31:45

that has 10 contacts on it, and

31:47

those contacts are only on one side,

31:49

so it's a very odd, asymmetrical, one-sided

31:52

curved Lightning connector. The

31:54

developer strap has 14 contacts, or 28,

31:57

14 on each side. So

32:01

that's a lot more contacts. Even if it's just four

32:03

more, that's substantial. And the fact that they have

32:05

them on both sides makes me think that this developer

32:07

strap is surely equipped, electrically

32:09

speaking, to do more than USB 2.0

32:12

speeds. We'll see if that

32:14

speed is unlocked in the future, but it sure

32:17

looks like maybe you might get more for

32:19

your money, more for your $300 than

32:21

USB 2.0 speeds if

32:23

and when new versions of Vision OS

32:26

and or Mac OS and or Xcode

32:28

are released. Can you

32:30

imagine if this thing could, and I mean

32:32

granted it's dongle town all over again, but

32:34

can you imagine if you could plug in

32:36

like an HDMI in to this thing, the

32:39

same ones that I've gotten and many people

32:41

have gotten for their iPads and for their

32:43

Macs, especially what's the name of the

32:45

app that's really good

32:47

for HDMI input on the iPad that the Halide

32:49

people do, I'm drawing a blank now. Shoot,

32:52

I'll have to try to remember to put it in the show notes, but anyways,

32:55

yeah, you imagine having an HDMI dongle and then plugging

32:57

into that HDMI dongle, I don't know, like a

32:59

Nintendo Switch or something like that. That would be neat.

33:02

Is that possible? Who knows? Probably be

33:04

an HDCP, whatever, you know, handshake violation

33:06

and you just get a black screen,

33:08

so don't worry about it. Yeah,

33:11

but I don't know, it would be cool if it

33:13

was more capable than just doing, you know, the developer

33:16

strap stuff as it is today. And I'm not, I

33:18

mean, I am grumbly obviously about the fact that it's

33:20

$300, but

33:22

nevertheless, it is very convenient. And I know

33:25

I haven't done watch development seriously, you know,

33:27

I've dabbled as we talked about many years

33:29

ago, but golly, I would pay $3,000 for

33:31

one of these for

33:34

an Apple Watch. Oh my God, I would, no question,

33:36

like if there was some kind of Apple Watch developer

33:39

strap, even if it was also $300 or more, I

33:43

would buy it in a second, because even

33:46

though the wifi debugging to the Apple Watch

33:48

has gotten way less crappy than it used

33:50

to be, it is still really crappy compared

33:52

to any kind of wired debugging like on

33:54

a phone. So yeah, no question, like, and

33:56

that's why I bought this too. It

33:58

is clunky to use, in the sense that

34:01

it is now two cables. There's a

34:03

cable coming out of each end and

34:05

that's really great. Yeah, that's not great.

34:08

However, if you are doing like

34:10

active debugging or like a fast build and

34:12

run cycle on the Vision Pro from Xcode,

34:14

it is really nice to have that be

34:16

as fast as it can be. And

34:19

that is why I bought it because I knew,

34:22

I always, like I've

34:24

upgraded my entire Apple Watch solely

34:26

because Underscore told me it would build the app

34:28

a little bit faster in this build and run

34:30

cycle. That's how much it

34:32

matters when you're actually actively debugging and actively

34:35

building and running an app in like a

34:37

tight loop of, alright, change this, fix this,

34:39

run again. Every second matters

34:41

for both your productivity and honestly

34:43

your mood. And so for

34:45

me it's very high value to try to

34:48

shorten that loop and try to make sure

34:50

there's as little friction as possible when I've

34:52

got to do that cycle. And even

34:54

though the cable situation is stupid, the

34:57

price is stupid. The

35:00

fact that it was not built into

35:02

the battery cable that itself has communication

35:04

protocols and the USB-C port on the

35:06

battery is the stupidest of them all.

35:09

However, I still did gladly buy

35:11

it and use it because

35:14

the debugging cycle is just that much

35:16

better and it makes that much of a difference in my

35:18

life. Yeah, I really wish

35:20

that you could optionally, and I

35:22

get why Apple doesn't do this because there's 104 reasons

35:25

why it would be clunky, but I

35:27

wish you could power this thing through

35:29

the developer strap because... Yes, somehow, give me

35:32

one cable. Right, that would be tremendous. And

35:35

I mean, USB-C can carry power. That

35:38

is a thing USB-C can do, but unfortunately not

35:40

here. But I mean, the battery, we just mentioned

35:42

last week that the battery puts out a voltage

35:45

that is not supported by any of

35:47

the USB power delivery specs, I believe. It's

35:50

weird because it seems like

35:53

the hardware was designed without

35:55

ever talking to the Xcode team. Designed

35:58

in a vacuum with no... No one ever

36:00

considering, hey, what about cable debugging? I mean,

36:02

I'm sure they were using

36:05

Xcode to do all the development of the Vision

36:07

Pro. That's a good – so how – like,

36:09

okay, I don't want to harp on this too

36:11

long, but just how did

36:13

this not get integrated into the main battery

36:15

cable somehow? That blows my

36:17

mind. Yeah. Real-time follow-up, the app

36:20

that you were thinking of apparently according to the chat

36:22

room is Orion. Does that ring a bell at all, Casey? Yes,

36:24

that's it. Thank you.

36:26

Yep, yep, yep. Thank you. There

36:28

was just a little bit of an intro, asking about debugging on the

36:30

watch, like, so the rumors are that there is

36:32

and was a thing that you would connect to,

36:34

like, the little diagnostic port behind the strap for

36:36

watches internal to Apple to do essentially wired debugging

36:38

on the Apple Watch. And then the

36:40

rumor was that future Apple Watches – I don't know

36:42

if that means current or still-to-come ones – used

36:45

a sort of high-frequency wireless interface to do

36:47

that debugging that was better than Wi-Fi, but

36:49

it was kind of like a direct point-to-point

36:52

wireless interface with some really high

36:54

frequency. So what we're saying is that

36:56

we've always heard that inside Apple, the

36:58

build-and-run cycle that Marco was just complaining about for

37:01

the Apple Watch is better inside Apple, but

37:03

that betterness supposedly has not

37:05

yet trickled out to the regular developers. We

37:10

are brought to you this week exclusively

37:12

by ATP members. Please consider

37:14

joining and becoming a member today. So here's

37:16

what you get. The biggest benefit, the most

37:19

commonly used, you get an ad-free version of

37:21

the show. You get basically your

37:23

own private RSS feed. It's compatible with most podcast

37:25

apps, and you can just paste it in, off

37:28

you go, you get an ad-free version of the show. You

37:30

also get bonus episodes. This is exclusive

37:32

content for members. We do all sorts

37:34

of stuff about once a month or

37:36

so. We do things like

37:38

rank certain tech products or

37:41

review movies or try food, stuff like that. It's

37:43

a lot of fun, honestly. I really enjoy the

37:45

member content, and our members tell us they really

37:47

do too. You also get a bootleg

37:49

feed. This gives you, if you want to listen

37:51

to the show this way, you can in addition

37:53

or instead, this gives you unedited live streams of

37:55

the show. It's published right after

37:57

we finish doing the live stream, so it's published

37:59

usually the night before the

38:01

episode comes out, ahead by almost a day, and then

38:03

it's totally raw and uncut. So you get to

38:06

hear, like, you know, if we flub an

38:08

intro and have to redo it, you hear that; you

38:10

hear Casey swearing without the beeping it

38:12

out, stuff like that. You'll hear bonus

38:14

stuff at the beginning and end, or picking titles,

38:16

and, you know, trying to figure out the format,

38:18

or kind of irrelevant discussions

38:21

before and after the show that we end up cutting;

38:23

all that's included in the bootleg. It's a lot

38:25

of fun being a member. You can do all

38:27

this for just eight bucks a month, and it

38:29

is by far the best way to support the

38:31

show. So please consider doing it. We would really

38:33

appreciate it. ATP.fm/join.

38:35

Once again, eight bucks a month:

38:37

ATP.fm/join.

38:39

Thank you so much for your consideration, and

38:41

now back to the show. Matt

38:47

Rigby writes to us: many quote-unquote 3D

38:49

films, including the recent Star Wars trilogy, are

38:51

actually 3D conversions. That is

38:53

to say, the films are shot in

38:55

2D on a single camera, and then

38:58

rotoscope artists individually cut out each element

39:00

of every shot, frame by frame (otherwise

39:03

known as rotoscoping), then map these elements

39:05

as textures onto rough 3D objects and

39:07

render those objects in 3D space. Holy

39:10

fart knockers. I can't believe that that's what

39:12

people do to make these 3D movies. That

39:14

sounds terrible, but Matt links to

39:17

realorfake3d.com, that is literally

39:19

the URL, realorfake3d.com.

39:22

Also linked in last week's show notes,

39:24

because that's what I was trying to get at when I was

39:27

talking about the 3D movies, and Marco had watched the Star

39:29

Wars ones, and the 3D wasn't done very well there. And

39:32

the real-or-fake-3D hostname

39:34

is, you know, obviously extreme,

39:37

basically saying the thing you just read, Casey. That's

39:39

fake 3D, because it was like, well, the

39:41

movie was shot in 2D and then we make

39:43

it 3D. It's also called, like, post-conversion or

39:45

whatever. And so the realorfake3d.com

39:47

site basically lists movies: like, look, if you

39:50

want to know if the movie you're going to

39:52

watch is real or fake 3D, look

39:54

it up on here, and you'll know what it

39:56

is that you're getting, the implication being that you would

39:58

presumably want to avoid the quote-unquote fake 3D

40:00

in favor of the real one. So

40:04

I talked to our friend and

40:06

illustrious Industrial Light & Magic visual

40:08

effects artist Todd Vaziri, who has

40:10

worked on many Star Warses and

40:13

many Star Treks and other movies you may have

40:15

heard of about the topic

40:17

of 3D, and in particular the whole

40:19

thing about real and fake 3D. And

40:23

he had an interesting take on it. I'm going

40:25

to try to summarize it here because I didn't

40:27

record our conversation and it was in

40:29

audio instead of email so I can't quote passages

40:31

from it. His take was that shooting

40:34

quote unquote real 3D with

40:37

two cameras sitting next to each other, you're filming stuff

40:39

with two cameras in 3D so you get a right

40:41

eye and a left eye thing of it, is

40:44

kind of a pain in the butt. Now, it's a pain in

40:46

the butt for some obvious reasons. You have two

40:48

cameras, they take up more room, they're heavier, it's a pain to

40:50

deal with two cameras to make sure they're all working and

40:52

everything. You can't get those two cameras into the same places

40:54

that you can get one camera into, you have less flexibility

40:57

there, right? Also when you're filming

40:59

with two cameras you have to make a bunch

41:01

of decisions when you're

41:03

filming that you don't have to, when

41:06

you're post converting you can change your

41:08

mind about stuff like that. So for

41:10

example, what we were just saying, the

41:12

IPD, the interpupillary distance, it's called the

41:14

interaxial distance in the realm of 3D

41:16

filming. When you film with two

41:18

cameras you pick that distance by putting the cameras that

41:20

distance apart from each other. And

41:22

it's not easy to change that after the fact,

41:25

whereas when you film in 2D and they do

41:27

that 3D conversion, you're choosing when

41:29

you do the conversion what

41:32

you want that distance to be. When

41:34

you do the conversion, like later after the entire

41:36

film is put together, but when you're shooting in

41:38

3D you're kind of baking in that distance in

41:40

every one of your shots that you make.
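
The textbook relationship behind that: with parallel cameras, a point at depth z lands on the two sensors with a disparity of roughly baseline × focal length ÷ z, so the interaxial distance chosen on set scales the stereo effect of every shot. Below is a minimal sketch of that standard stereo-geometry formula; it is generic, not anything specific to the productions discussed, and the names are illustrative.

```swift
import Foundation

// Standard parallel-camera stereo geometry: the on-sensor disparity of
// a point at a given depth is (baseline * focalLength) / depth.
// Doubling the interaxial baseline doubles the disparity, which is why
// the distance picked at shoot time is baked into the footage.
func disparityMillimeters(interaxialMM: Double,
                          focalLengthMM: Double,
                          depthMM: Double) -> Double {
    return interaxialMM * focalLengthMM / depthMM
}

// A subject 4 meters away, 50 mm lenses, an eye-like 65 mm interaxial:
print(disparityMillimeters(interaxialMM: 65, focalLengthMM: 50, depthMM: 4000))
// ≈ 0.81 mm of disparity on the sensor
```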

41:43

And that's important because there's a whole bunch of guidelines

41:45

for doing 3D filming that you want to try

41:48

to not violate which is like don't

41:50

change that interaxial distance massively from one

41:53

shot to the next. Because if you're

41:55

cutting between them it'll make people's eyes

41:57

bug out. It's like, whoa! now

42:00

my eyes are three feet apart, now they're two inches apart,

42:02

now they're three feet apart, now they're one inch apart. You

42:05

don't want to bounce that back and forth in the

42:07

same way that you wouldn't want to bounce back and

42:09

forth lots of things in the 2D world. So you

42:11

have to have a lot of planning and be careful

42:13

and be precise. You do

42:15

have a lot of repair to do when you

42:17

film in 3D because you have to sort

42:19

of make the image from each of

42:21

the cameras match up in a pleasing way when

42:23

viewed in 3D, which involves unwarping the lens distortion

42:26

and making it so that when you actually watch

42:28

it with 3D glasses or in a headset or

42:30

something that it doesn't look weird.

42:33

If you get lens flares, you'll get different

42:35

lens flares in each camera because they're in

42:37

different positions and trying to reconcile two different

42:40

lens flares that you're showing in 3D is

42:42

weird because we're all used to

42:44

seeing one lens flare because the lens flares actually

42:46

happen inside the lens. I mean you're shooting

42:48

with one camera, you get one lens flare and that's

42:50

what we're all used to seeing in movies. But

42:52

when you shoot with two cameras, you get two lens flares

42:55

or maybe one but not in the other one depending on

42:57

where the lights are and we're not

42:59

used to seeing that so it's weird. That's

43:02

very, it makes

43:04

filming very difficult and it makes you have to sort

43:06

of do like a Hitchcock style where you have everything

43:08

planned out, you know exactly what you want, you shoot

43:11

only what you need and you can't

43:13

change your mind easily with all this stuff. Whereas

43:15

post conversion, you shoot it in 2D using all

43:17

the techniques and technologies that we've always had for

43:19

2D and then later someone comes along and says

43:21

now I have to figure out how to make

43:24

this into 3D and they can slice and dice

43:26

it and you know it sounds like a lot

43:28

of work and it is a big pain but

43:30

you can choose each individual

43:33

frame of film how you want things to look so

43:35

you'll know that okay we know that shot A comes

43:37

after shot B comes after shot C so I'll make

43:40

sure I don't bounce around the interaxial distance there

43:42

because the film is done, it's already

43:44

put together you don't have to guess. The people who

43:46

are filming it have to, not guess

43:48

but like say well I hope this shot comes after this shot

43:50

comes after this shot but if we decide to change it around

43:52

it might be jarring because that shot we shot yesterday and the

43:54

cameras are closer together than they are now, it's

43:57

kind of a pain in the butt. And as

43:59

for the movie that I saw in my demo, I

44:01

was asking like, why

44:04

would that look like bad

44:06

3D or fake looking 3D to me?

44:09

It's a CG rendered movie and I kept saying they

44:11

have all the depth information and people thought what I

44:13

was saying is that somehow that there was like that

44:15

I was going to get infinite depth information in

44:17

the like in the movie itself

44:20

as opposed to just a right eye and a left

44:22

eye thing. What I'm saying is like when you're 3D

44:24

rendering it, the rendering software when it's generating the image

44:26

knows the distance of all the pixels. So

44:28

there's no reason that it would, that things should

44:30

look like they are 2D cutouts. Todd

44:33

didn't know any details about the Mario movie but he

44:36

said it's completely plausible that someone could have a CG

44:38

movie and to save money or time they

44:40

would render out either the whole thing

44:43

in 2D and then slice it up and add

44:45

fake 3D to a CG movie or render it

44:47

out in layers and have those be composited together

44:49

and part of that is again for cost and

44:51

annoyance reasons.
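To make that slicing-and-layering idea concrete, here is a minimal Swift sketch of faking stereo from flat layers. Everything in it, the Layer type, the grayscale pixel format, the disparity scale, is invented for illustration; it is a toy model of the technique being described, not the Mario movie's pipeline or any studio's actual code.

import Foundation

// One flat "cutout" from a 2D render, tagged with an assumed depth.
struct Layer {
    var image: [[Float]]   // grayscale pixels, image[row][col]; 0 = transparent
    var depth: Float       // assumed distance from the viewer, in meters
}

// Shift a row-major image horizontally; positive disparity moves content right.
func shifted(_ image: [[Float]], by disparity: Int) -> [[Float]] {
    image.map { row in
        (0..<row.count).map { x in
            let src = x - disparity
            return (src >= 0 && src < row.count) ? row[src] : 0
        }
    }
}

// Synthesize a left/right pair from flat layers. Every pixel in a layer
// shares one disparity derived from its depth relative to the convergence
// distance; layers composite far-to-near so near ones paint over far ones.
func fakeStereo(layers: [Layer], convergence: Float, scale: Float) -> (left: [[Float]], right: [[Float]]) {
    let h = layers[0].image.count
    let w = layers[0].image[0].count
    var left = [[Float]](repeating: [Float](repeating: 0, count: w), count: h)
    var right = left
    for layer in layers.sorted(by: { $0.depth > $1.depth }) {
        // Zero disparity at the convergence distance, positive behind it,
        // negative in front of it.
        let d = Int(scale * (1 / convergence - 1 / layer.depth))
        let l = shifted(layer.image, by: -d / 2)
        let r = shifted(layer.image, by: d / 2)
        for y in 0..<h {
            for x in 0..<w {
                if l[y][x] > 0 { left[y][x] = l[y][x] }
                if r[y][x] > 0 { right[y][x] = r[y][x] }
            }
        }
    }
    return (left, right)
}

Because every pixel in a layer gets the same disparity, each layer reads as a flat card floating at a single depth, which is exactly the paper-cutout look complained about later in this discussion.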

44:54

If you're doing a computer animated movie like a Pixar movie, you

44:56

can do it the quote unquote real

44:59

way where you render two different perspectives.

45:01

You have two virtual cameras in your

45:03

virtual world and you render from two

45:05

different perspectives but when you do

45:07

that you quickly find oh it turns

45:09

out that now one of the cameras can see

45:11

around back behind a piece of geometry that I

45:13

thought was hidden in the 2D version of the

45:15

movie and now I can see some place where

45:17

we didn't fill in a texture or like Todd's

45:20

example was like there's a walk animation of someone doing

45:22

a walk thing and they go out of view and

45:24

once they go out of view the walk animation stops

45:26

because you don't need to animate it when they're not

45:29

in view but the other camera spots when their legs

45:31

stop moving and they just start sliding along, right? So

45:34

you have... it's harder. It's like building

45:36

a set, right? Oh, now your set's going

45:38

to be viewed from two slightly different perspectives

45:40

so be careful you're not basically messing

45:43

up your movie by trying to do it with two

45:45

virtual cameras and of course two virtual cameras means twice

45:47

the rendering time because you're not just rendering from one

45:49

camera you need twice the CPU power or twice the

45:51

amount of time to render each frame.
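As a sketch of what the two-virtual-cameras setup amounts to, here is a minimal Swift example using Apple's simd types. The function name and the plain look-at construction are illustrative assumptions, not any studio renderer's real API.

import simd

// Build left/right view matrices by offsetting one virtual camera by half
// the interaxial distance along its own right axis. Everything downstream
// (animation, culling, shading) then runs once per eye.
func stereoViewMatrices(position: SIMD3<Float>,
                        forward: SIMD3<Float>,
                        up: SIMD3<Float>,
                        interaxial: Float) -> (left: float4x4, right: float4x4) {
    let f = simd_normalize(forward)
    let r = simd_normalize(simd_cross(f, up))   // camera-right axis
    let u = simd_cross(r, f)                    // re-orthogonalized up

    // Classic look-at view matrix for a camera at `eye`.
    func view(from eye: SIMD3<Float>) -> float4x4 {
        float4x4(rows: [
            SIMD4<Float>( r.x,  r.y,  r.z, -simd_dot(r, eye)),
            SIMD4<Float>( u.x,  u.y,  u.z, -simd_dot(u, eye)),
            SIMD4<Float>(-f.x, -f.y, -f.z,  simd_dot(f, eye)),
            SIMD4<Float>(0, 0, 0, 1),
        ])
    }
    let half = r * (interaxial / 2)
    return (left: view(from: position - half), right: view(from: position + half))
}

Each matrix then drives a complete render pass of the whole scene, which is where the doubled per-frame cost comes from.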

45:54

This is why tons of quote-unquote fake

45:57

3D happens in movies, and it

46:00

makes sense, but it also kind of explains the

46:02

thing that I don't like about a lot of

46:04

those is it does look like someone

46:07

cut out three pieces of paper, the foreground,

46:09

the mid-ground, and the background, and they're sliding

46:11

past each other in a way that doesn't

46:14

look convincingly 3D to me in the way

46:16

that the cameras in the

46:18

Alicia Keys studio look convincingly 3D as if I

46:20

was there, because the cameras were spaced similarly

46:22

to my eyes and they

46:25

were shooting a real thing that was really there and that's all there

46:27

was to it. Friend of the show,

46:29

Joe Rosensteel, wrote on Six

46:31

Colors and it's a members-only post that we're

46:33

going to apparently steal some of, so I

46:35

hope we have permission. I'm blaming John. These

46:38

are excerpts from Joe's summary of his own

46:40

post, so I would recommend subscribing to Six

46:42

Colors and read the entire article, which is

46:44

much longer, but here is Joe, he sent

46:46

this through email. It's him trying to condense

46:48

and summarize some of the major points. A

46:50

couple of important definitions off the top. Interaxial

46:52

is the distance between two stereo

46:55

cameras. The distance between the human

46:57

eyes is fixed at about 65 millimeters, but

46:59

the distance between cameras can be anything.

47:02

And secondly, convergence. This is

47:04

where the two

47:06

images converge. When they have positive parallax,

47:09

they recede into the screen and when

47:11

they have negative parallax, they stick out

47:13

of the screen.
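Those two definitions can be tied together in one standard stereography relation, a simplified textbook model of a parallel two-camera rig (not necessarily how any particular production was set up). With interaxial separation t_c, focal length f, and convergence set at distance Z_c by shifting the two images horizontally, the parallax d of a point at distance Z is:

d = f \, t_c \left( \frac{1}{Z_c} - \frac{1}{Z} \right)

Points at Z = Z_c land exactly on the screen plane with zero parallax, points farther than the convergence distance get positive parallax and recede into the screen, and closer points get negative parallax and stick out of it, matching the definitions above.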

47:15

So Joe writes, with that in mind, everything you

47:18

see with stereoscopic media, 3D stuff, is going

47:20

to be different because you can't just set

47:22

up two cameras 65 millimeters apart and call

47:24

it a day. When I used to work

47:27

on stereoscopic movies, we would define interaxial and

47:29

convergence values, not just per shot, but per

47:31

element of a shot. Because where the objects

47:33

really were would have been boring to look

47:36

at. Films are about directing the audience's view.

47:38

A big part of depicting

47:40

depth and directing the viewer's eye in 2D

47:42

requires adjusting focal distance and aperture. Elements that

47:45

are extremely out of focus imply depth in

47:47

2D and direct the eye. In stereoscopic films,

47:49

the more something is out of focus, the

47:51

more it loses any detail that your brain

47:53

can use to see disparity between the different

47:55

images shown to each eye. And thus, positive

47:57

or negative parallax. Extremely out of focus elements

48:00

mush themselves back toward the depth of the

48:02

screen regardless of them being far away or

48:04

extremely close. The two points that we

48:06

just went over here are fascinating. The first

48:09

is defining different

48:11

interaxial and convergence values

48:13

for multiple things in the same shot.

48:15

Basically, it's almost as if,

48:17

okay, when we shot the foreground characters,

48:19

the cameras were two feet apart. But

48:21

then for the table that's behind

48:23

them, the cameras were six inches apart,

48:26

like adjusting the parallax for the individual

48:28

things. Which is obviously

48:30

not how eyes work. Our eyes aren't suddenly moved two

48:32

feet apart when we look at one thing in a

48:34

movie and back together when we look at another thing. They're always

48:36

the same distance apart. The thing is, you can't just

48:39

take two cameras, or rather, the

48:41

way these 3D movies have been done hasn't

48:43

just been to take two cameras at human eye

48:45

width on a bar and stick them and point them at

48:47

something, because that is deemed either

48:49

not interesting, or, as he

48:52

notes, they're using those

48:54

two tools, the interaxial distance

48:56

and the convergence, to direct the

48:58

audience's eye toward something. Which is the thing,

49:00

I think, that distinguishes 3D movies,

49:02

which I tend not to like, from

49:04

the Alicia Keys and shark swimming towards

49:06

you things. Those are straight up two

49:08

cameras the width of your eyes apart, and

49:10

so it feels like you're there, right?

49:12

Whereas in the 3D movie, the

49:16

hand of the artist, of the director,

49:18

is more prominent, because they are directing

49:20

your eye. To direct your eyes,

49:22

they're doing things that don't

49:25

exist when you're looking at something real, like, again,

49:27

multiple items in the shot using different

49:29

camera distances apart. Whether it's real 3D

49:32

and they

49:34

shot them separately and composited

49:36

them later, or it's fake 3D

49:38

and they just, you know, separated them differently

49:40

when they were slicing elements up,

49:42

that to me looks weird. And the second thing

49:45

is, what about focal distance? Well, things that

49:47

are out of focus tend to just look like,

49:49

well, your eyes can't tell the difference between

49:51

them, so they just sort of converge

49:54

on the center of the screen. They just sit at the depth of

49:56

the screen, even though maybe

49:58

the interaxial distance is huge and they're

50:00

supposed to be way far back in the screen. As soon

50:02

as you blur them, people start to perceive them as being

50:05

exactly at screen level, which is not

50:08

what you want. And I also kind of feel that when I

50:10

watch 3D movies where it's like, okay, well, they use a shallow

50:12

depth of field here and the

50:15

3D things look 3D, but that blurry

50:17

thing, it's blurry in the film because,

50:19

you know, it was out of focus

50:21

when they filmed it, but it should

50:23

feel like it's 10 feet back, but it feels to

50:25

me like it's right next to the foreground characters because

50:27

it feels like it's at the depth of the screen.

50:30

So Joe continues: the 3D method used,

50:32

animated, post converted, or native stereo, doesn't

50:34

really make a film good or bad.

50:37

There's a tendency to say that all post converted films

50:39

are bad or fake, but that's

50:41

not universally true because post conversion can allow

50:43

for a greater degree of control over the

50:45

end result if it's done well. Isn't that

50:47

what Todd just said? Conversely, native stereo and

50:49

animated films are not universally more 3D because

50:51

they captured full left and right eye views,

50:54

like if they just set it near to human

50:56

vision, pushed everything behind the screen plane, and didn't

50:58

dial in the depth of field to increase what's

51:00

in focus, etc. So these came in

51:02

independently, and I don't know exactly when

51:04

you had your conversation with Todd, but

51:06

I think pretty much concurrently Joe and

51:08

Todd said basically the same stuff. Yeah,

51:10

Joe also works in the VFX industry,

51:12

and it really clarified for me why

51:14

I don't like 3D movies because, I

51:17

mean, I guess they could be done well or not

51:19

well, but first of all, the Star Wars ones and

51:21

the fake 3D, that always bothers me for like the

51:23

paper cutout thing, like the foreground characters feel like they're

51:26

closer, but they feel like they're being projected onto a

51:28

flat screen and they're close to me, and I actually

51:30

asked about that. I'm like, did they ever, especially

51:32

for the foreground characters, does he ever

51:34

do anything to make it so like when they're

51:36

post converting a 2D thing, the foreground characters don't

51:39

look like the paper dolls, right? And

51:41

apparently sometimes they take like a rough 3D model

51:43

of a head and they

51:45

map, essentially, the texture of the

51:47

2D filmed guy's head onto

51:49

that, so his ear is closer to you

51:51

than his nose when he's turned to the side, you know

51:53

what I mean? But that, I

51:56

just look at that, I'm like, just shoot it with two

51:58

cameras, man, but again, all the complexity.

52:00

And the second thing is, I think

52:02

there has to be a distinction between what looks good in a headset and

52:04

what looks good on a movie screen. The reason

52:06

I'm so wowed by the stuff in the headset is

52:08

because I'm looking at screens, two

52:10

screens that are eye-width apart, and

52:13

the video I'm looking at was shot with a

52:15

camera, where the two cameras were basically eye-width

52:17

apart. So it's straight up, it feels like I'm in

52:19

the water with the shark, feels like I'm in

52:21

the studio with Alicia Keys. That is very different

52:23

than sitting in a theater seat looking at a

52:26

screen and then having the people

52:28

who made the movie decide where they

52:30

want to direct your attention with

52:32

some extremely unrealistic, but hopefully pleasing and

52:34

interesting and exciting 3D work. And I

52:36

personally really don't like that second thing,

52:38

but I really like the shark. Fair

52:40

enough. All right, we've gotten a

52:43

little bit of news with regard to the European

52:45

Union's Digital Markets Act. This

52:48

is the genesis of all the oddness that's going

52:50

on with the App Store in the EU, but

52:52

we're not talking about the App Store right now.

52:54

We're talking about iMessage, and iMessage was one of

52:56

those things that the DMA people were

52:58

wondering whether or not it classifies as a,

53:01

what is it, a core platform service, which

53:03

is their term of art to mean we're

53:05

gonna regulate the snot out of you. And

53:07

so reading from the Verge, Apple's

53:10

iMessage is not being designated

53:12

as a quote core platform service

53:15

quote under the European Union's Digital

53:17

Markets Act. The European Commission announced

53:19

today, this is yesterday, the decision

53:21

means the service won't be hit

53:23

with tough new obligations, including a

53:25

requirement to offer interoperability with other

53:27

messaging services. The Commission also opted

53:29

against designating Microsoft's Edge browser, Bing

53:31

search engine, and advertising businesses as

53:33

core platform services. Although iMessage

53:35

has avoided the burden of complying with rules

53:38

that come with the official DMA designation, the

53:41

period of regulatory scrutiny coincided with

53:43

Apple announcing support for the cross-platform

53:45

RCS messaging standard on iPhones. Meta,

53:47

meanwhile, has seen two of its

53:49

messaging platforms, WhatsApp and Messenger, designated

53:52

as core platform services under the DMA

53:54

and has been working to make them

53:56

interoperable with third-party services. Womp womp. I think

53:58

it's like a booby prize, like, for all the losers: Microsoft

54:00

Edge, Bing, the search engine that nobody

54:02

uses, iMessage. You're not even big

54:05

enough to be regulated, sorry. I'm

54:07

sure Apple likes it, but it's kind of, you know.

54:09

Well, I mean, this might have also been the result

54:12

of Apple lobbying for it in some way. I mean,

54:14

because keep in mind, the DMA

54:16

is not defining these

54:18

standards in a vacuum. The DMA

54:20

targets specific companies with specific products

54:23

and services, and then rationalizes

54:25

it with how it draws the

54:27

line. Yeah, yeah, like it targets them by

54:29

picking an arbitrary number. If you

54:31

have more than this exact number of customers as of

54:34

whatever date, and they just look up who has them

54:36

on that date, and they just... Exactly.

54:38

So like, you know, so for whatever

54:40

reason, it isn't that iMessage just doesn't

54:42

qualify, it's that they drew the lines

54:44

to not include iMessage. Yeah. Which

54:47

I think is fair, actually, because it isn't as

54:49

dominant as the ones they are regulating, and

54:51

certainly Microsoft Edge is not dominant, neither is

54:54

Bing, so congratulations, and I'm sorry,

54:56

I guess. And then Riley

54:58

Testut has written in to

55:00

us with regard to Apple's third-party

55:02

marketplace system for the Digital Markets

55:05

Act. So Riley is

55:07

the author and creator of Alt

55:09

Store, and so Riley has a

55:11

lot of experience with what is

55:14

probably the most official, even though

55:16

it's very, very, very unofficial, third-party

55:19

App Store for the iPhone today. So

55:21

Riley writes: I've

55:23

been poring through the MarketplaceKit documentation for the past week and

55:25

a half, and there are some nuances I've learned from implementing this

55:27

for Alt Store. First of all, any

55:29

developer can choose to distribute their apps to alternative

55:32

app marketplaces regardless of where they live, once they've

55:34

agreed to the new business terms. Only developers building

55:36

app marketplaces need to be based in the EU,

55:39

or have a legal subsidiary in the EU. To

55:41

start using marketplaces, you must first request a

55:43

security token from an alternative marketplace, which

55:46

will allow you to add that marketplace in App

55:48

Store Connect. Once you've added a marketplace, you can

55:50

then choose which apps you want to distribute with

55:52

it. You can distribute any of your apps to

55:55

any combination of marketplaces, including the App Store. Users

55:57

will have to delete an app before installing the same

56:00

app from another marketplace, though. When you're

56:02

ready to distribute your app, you submit it

56:04

to Apple through Xcode like normal and wait

56:06

until notarization finishes. Once processed, developers can automatically

56:08

submit notarized apps to marketplaces through Apple, or

56:10

they can manually download the notarized, quote, alternative

56:12

distribution package, quote, or ADP, and send it

56:14

directly to the marketplace themselves. It's up to

56:16

the marketplaces to choose how they want to

56:18

receive their apps. That's the most interesting thing

56:20

in this email because before we were saying,

56:23

oh, everything has to go through Apple and

56:26

it does have to go through Apple, but

56:28

Apple can deliver it to the third party store,

56:30

but they can also just give it back to you and

56:32

say, you know, you can do this last part. I don't

56:35

know what that buys you other than more hassle because you

56:37

do have to go through Apple. And so it's not like

56:39

you can bypass them. But if you wanted,

56:41

you can say, Apple, don't send it to the

56:43

store, send it to me. And then I'll send

56:45

it to the store. And I guess the marketplace

56:47

would have its own upload portal thing where they

56:49

accept them. I don't know what the advantages would

56:51

be, but it's interesting that that flexibility does exist.

56:54

Riley continues, I fully agree that third party

56:56

marketplaces only really make sense for apps that

56:58

can't exist on iOS right now, but not

57:01

just for the obvious content reasons. For example,

57:03

besides the fact that my app Delta isn't

57:05

allowed in the app store because it's a

57:07

Nintendo emulator, it also is entirely monetized through

57:09

Patreon by providing pre-release access to beta versions

57:11

to my patrons. This business model is forbidden

57:13

by the app store despite it being a

57:15

proven way to monetize software in other markets,

57:17

such as indie video games. For this reason,

57:19

I've actually added deep Patreon integration to Alt

57:21

Store to encourage other indie developers to monetize

57:24

apps this way, of which Alt Store

57:26

takes no commission because I

57:28

genuinely believe it's a better system for

57:30

smaller developers. Now the

57:32

other thing with the DMA is

57:34

that you are required to have

57:36

a million euro line of credit.

57:38

And what I think all of

57:40

us took that to mean was you have

57:42

to have a bank say, yeah, we will give

57:44

you up to a million euros if you ask

57:47

for it. Like we've already pre-approved you, we

57:49

will do it if necessary.

57:51

And we had a couple of pieces of feedback

57:53

about this, but Bobby Parati writes, I work in

57:56

commercial finance. Your discussion of the DMA and the

57:58

required million euro, quote, standby

58:00

letter of credit, quote, makes me want to clarify

58:02

what that actually is. That's money that must be

58:04

held essentially in escrow by your bank. It's not

58:06

a line of credit like you might assume. It is

58:08

more akin to a minimum

58:10

deposit. I think a lot of people assume it

58:12

means you would be okay as long as you're

58:14

approved for that amount of credit from a bank

58:16

like a home equity line. But I can get

58:18

a home equity line of credit, never draw on

58:21

it and not be inconvenienced much at all as

58:23

long as I have home equity. A standby

58:25

letter of credit actually means that the bank is locking those

58:27

funds up so the beneficiary, in this case Apple,

58:29

can take from it if certain conditions are

58:31

met. It's your cash but held

58:33

unable to be used for anything else. A

58:36

good way to think about a standby letter of credit

58:38

is basically a check that the beneficiary, Apple, can cash

58:40

at any time. So much for small

58:43

consortiums of indie devs, which will probably

58:45

have trouble getting that kind of money

58:47

together in order to control their own

58:49

distribution destiny. So I really wonder what

58:51

Riley is going to do about this.

58:55

Maybe Bobby's understanding is incorrect, but our

58:57

understanding certainly sounds like it was incorrect. He

59:00

sounded pretty sure, because I went back and forth with him on a

59:02

lot of that. I asked one more clarification which is like, okay,

59:04

do you actually have to have that money? Because you can write

59:06

a check and not have the money for it and only when

59:08

the person goes to cash it do you find out, oh, you

59:11

can't actually pay for the thing. He

59:13

said, yeah, not only do you have to have that

59:15

money, but in pretty much all cases that he's

59:17

aware of, the institution that gives you

59:19

that standby letter of credit demands that you give

59:22

them the money. The same bank that's giving you that letter

59:24

of credit, you have to give them the 1

59:26

million euros. So you've got to have that money

59:28

for realsy reals, give it to them, then they

59:30

will give you that standby letter of credit and

59:33

they will hold that money and the money is

59:35

basically sitting there saying if Apple ever wants to

59:37

take this, they can take it for whatever reason

59:39

it says in their marketplace

59:41

contract or whatever. So you can't get by

59:44

saying, oh, we're good for it or whatever.

59:46

No, you've got to have that in cash

59:48

and you have to give it to the

59:50

institution who then gives you this standby letter

59:53

of credit. So I don't know, maybe

59:55

Alt Store has a million euros hanging around and they're going to

59:57

sail past this, but yeah, it's

1:00:00

more of a burden than we thought it was. Much more.

1:00:03

Not even close to what we thought it was.

1:00:05

So thank you, Bobby, for writing in and telling us,

1:00:07

since we clearly don't work in commercial finance. And

1:00:10

I think this basically tells you the

1:00:12

kind of entities that we should expect to

1:00:14

actually jump through the hoops to run an

1:00:17

alternative app store in the EU. It's

1:00:19

not going to be small companies and

1:00:22

small developers. It's going to be probably

1:00:24

a very small number

1:00:26

of pretty large entities. So

1:00:31

let's talk Vision Pro. We talked a

1:00:33

lot about this last week. Let's

1:00:35

do some more. And I think

1:00:37

we left off last week. Our heroes were

1:00:39

about to discuss what it's like to let

1:00:42

other people try the Vision Pro. So John,

1:00:45

it seemed like you had thoughts about this, or you

1:00:47

perhaps wanted to direct conversation, or am I misreading you

1:00:49

entirely? You're misreading. You were going to tell us. So,

1:00:51

Marco, you wanted to tell a story of letting other

1:00:53

people try the Vision Pro, so you tell your story.

1:00:55

I didn't let any other people try the Vision Pro

1:00:57

because I was just at an Apple store and it was just me. So

1:01:00

what's going on, Marco? Well, so I've

1:01:02

had a bunch of friends try this in

1:01:05

the last, whatever, it's been a week or two. I

1:01:08

think enough people have pointed out now, the

1:01:10

guest mode that you can put it in from

1:01:12

Control Center to let someone else put it on

1:01:14

without your Optic ID, basically. It's

1:01:17

fine. I

1:01:19

would say if Apple wants to give the

1:01:21

guest users a good impression of what it's

1:01:23

like to use a Vision Pro, you

1:01:25

should probably make guest mode a little bit better.

1:01:28

It's fairly clumsy to get

1:01:30

started, and it's extremely unforgiving.

1:01:33

As many people have pointed out, if the

1:01:35

wearer in guest mode lifts

1:01:37

the Vision Pro off their face for even a

1:01:39

split second, it resets it completely and kicks it

1:01:41

back into your mode. So even if they lift it

1:01:43

up, rub their eye, or adjust the fit a

1:01:46

little bit too much or something, once

1:01:48

it's off their eyes, they're out. You

1:01:50

have to put it back on yourself, log back

1:01:52

in with either Optic ID or the passcode, go

1:01:54

back into guest mode in Control Center, and turn

1:01:56

it back on for them to put it back

1:01:58

on. This is made especially

1:02:01

inconvenient because every time someone

1:02:03

puts it on in guest mode, they have to

1:02:05

go through the entire eye setup. So first it

1:02:07

has them hold the crown to align the display

1:02:10

as we discussed earlier. Then it has

1:02:12

them go through the whole intro of like, look at the dots

1:02:14

and pinch your fingers and then make it brighter. Look

1:02:16

at the dots again and pinch your fingers. So it

1:02:18

takes a while and it's kind of repetitive

1:02:21

and cumbersome. So the

1:02:23

guest mode experience is not something

1:02:26

that you're going to want to do frequently and

1:02:28

I think it's important that if you're demoing for somebody else

1:02:30

that you warn them, don't take it

1:02:32

off your face in the middle because it will reset

1:02:35

it and have them have to start all over again.

1:02:37

I wonder if that's related to, so Optic

1:02:39

ID is essentially like Touch

1:02:41

ID or Face ID, but for your eyeball.

1:02:44

And I know from experience using a shared Mac

1:02:47

in our house, and you probably know if you've

1:02:49

done this on any kind of shared Mac, even

1:02:51

a laptop, there's a limit to how many touch

1:02:54

ID fingerprinty things you can store on a Mac.

1:02:56

And that limit I believe is determined by essentially

1:02:58

the secure enclave and the hardware. So it doesn't

1:03:00

matter how big your SSD is, doesn't matter what

1:03:02

version of the OS you're using, whatever

1:03:04

number of fingerprints it is, it's like seven or eight,

1:03:06

I don't know how many it is. That's it for

1:03:09

the whole system. And

1:03:11

so like, for example, I want to have like my

1:03:14

fingerprint work on both my wife's account and mine

1:03:16

and vice versa, so we don't have to type

1:03:18

in each other's passwords. But

1:03:20

you run out real quickly because if the kids have their

1:03:22

own fingerprints on their accounts, you run out. So

1:03:24

they're not even saving the Optic ID for guests, so that

1:03:26

if you give it to a guest and they try it

1:03:28

and they take it off and the next day they want

1:03:30

to try it again, it doesn't

1:03:32

like recognize them as a guest that had seen before and

1:03:34

boot them back into their guest mode or anything like that.

1:03:37

It just doesn't even save their Optic ID. So

1:03:39

I wonder if, A, they're storing the Optic

1:03:41

ID in the Secure Enclave because it

1:03:43

is biometric data presumably, and B, apparently

1:03:45

they're only storing your Optic

1:03:47

ID. One, you know, or two,

1:03:49

I don't know, one for each eyeball, whatever. And

1:03:52

that's it. Guests don't get anything

1:03:54

saved about them. Every time the Vision Pro sees

1:03:56

this person, it's like, I have no idea who

1:03:58

you are. You're a guest. The

1:04:00

other major limitation I've run into

1:04:02

is that one of the best

1:04:05

assistive tools for if you're going to be

1:04:07

showing someone how to use Vision Pro is

1:04:10

you can airplay what they are seeing to a

1:04:12

nearby Mac or other screen. So you can, I

1:04:14

have my laptop nearby, so I will say, alright,

1:04:16

mirror the screen of what they're seeing to my

1:04:18

Mac and then I can see what they see

1:04:20

and I can kind of guide them. Okay, go

1:04:23

to this section of the Apple TV app to

1:04:25

go find the 3D videos or whatever, you can

1:04:27

kind of walk them through what they're seeing and

1:04:29

what you want to show them. The

1:04:31

problem is that breaks the

1:04:33

DRM assumptions of the video

1:04:35

player. So

1:04:38

if you have screen sharing enabled,

1:04:41

they cannot watch any video content that

1:04:43

is DRM protected, which is all video

1:04:45

content basically that you would want to

1:04:48

show them. Everything from Apple TV+, everything

1:04:50

from Disney, it's all DRM locked and

1:04:52

so if it's

1:04:54

AirPlaying, it basically breaks

1:04:57

whatever DRM requirement there is, because you're copying the

1:04:59

screen and so not only can you not

1:05:01

see it on the Mac, they

1:05:03

can't see it on the internal displays either.

1:05:05

So they can't watch 3D video content in

1:05:07

the demo mode if you can see what

1:05:09

they can see and that is

1:05:11

a huge limitation in part because it just kind of

1:05:13

sucks. Also because as far

1:05:16

as I could tell when I did these demos, you

1:05:18

can't turn off the screen mirroring because

1:05:20

they don't have access to Control Center.

1:05:23

In guest mode, there's no Control Center. So

1:05:26

if you want to show them, the only way I could find was

1:05:28

to take it off, reset

1:05:30

guest mode, turn off screen sharing, go through

1:05:32

the whole process again. What you

1:05:34

could do is like when I was demoing for some

1:05:36

friends, we were AirPlaying to the

1:05:38

TV, like to the Apple TV I guess I

1:05:40

should say that was in the living room and

1:05:43

when you're on an Apple TV anyway, you can

1:05:45

hit the back or menu or what have you

1:05:48

button to effectively cancel screen sharing. Now if you're

1:05:50

screen sharing to a Mac, I don't know how

1:05:52

that would work. I've only ever done that like

1:05:54

once or twice, so you may not have

1:05:57

the same option, but it does work

1:05:59

pretty well with an Apple TV, where you can just

1:06:01

basically cancel the screen sharing. Oh, I should try

1:06:03

that. I didn't think to try that. But

1:06:05

anyway, so that's, it just, it shows the like,

1:06:07

you know, like this is Apple

1:06:10

TV showing Apple's content on

1:06:12

two Apple devices. It

1:06:15

totally breaks. Yeah, that's because

1:06:17

of the stupid, like I said, the HDCP,

1:06:19

whatever it is. Yeah. High-bandwidth Digital

1:06:21

Content Protection. I put a link to it in the show notes earlier.

1:06:24

That standard has all these things about, like, you

1:06:26

know, making sure that you don't

1:06:28

siphon off the video through a side channel so

1:06:30

you can record it secretly; it can only

1:06:32

be displayed on the screen that it has handshaked

1:06:35

with through the stupid secure DRM protocol. Again, as

1:06:37

a reminder, all this is to make sure

1:06:39

no one ever ever is able to pirate

1:06:41

video. And we know of course, this

1:06:43

solved the problem of video piracy and now it is impossible

1:06:45

to pirate video. Thank you, copy

1:06:48

protection. You did your job great. No, what

1:06:50

it actually means is that A, everything is available for pirating

1:06:52

and B, you're going to want to pirate it because the

1:06:54

legit copy you bought, you can't even watch because it blacks

1:06:57

out all your screens. And

1:06:59

the thing is, I really wish, I don't

1:07:01

know, maybe I'm missing the point of how,

1:07:03

you know, copy protection works, but I

1:07:06

really wish that perhaps it would

1:07:08

be possible to just black out the

1:07:11

square of content that the user

1:07:13

was seeing on AirPlay. So in

1:07:15

the device, in the goggles, then

1:07:17

they're seeing everything you would expect

1:07:19

to see. But the AirPlay

1:07:21

mirroring, you're getting blackness for the, you know,

1:07:24

the square of content or if you're doing

1:07:26

something immersive, perhaps the entire display is black

1:07:28

or it's like a checkerboard pattern or something

1:07:30

like that. I really wish you could

1:07:32

at least do that because what you've said Marco

1:07:34

is exactly accurate, like leaving aside whether or not

1:07:37

you can turn off AirPlay. The first time I

1:07:39

did this with somebody, you know,

1:07:41

they go to go into, I think

1:07:43

it was Disney Plus we were trying at the time and they

1:07:45

were like, well, it's not working. What are you talking about? And

1:07:47

then I look at the TV and I'm like, oh, you're right,

1:07:50

it's not working. And it took me a few beats before I

1:07:52

realized, oh, I bet you

1:07:54

anything, this is DRM. And so then,

1:07:56

you know, canceling AirPlay seemed to do

1:07:58

the trick if memory serves. And only

1:08:00

nerds would know that because there's no error message. It just shows

1:08:02

it as black, just black screen. Yeah, the same thing as when

1:08:04

you take a screenshot on your iPad trying to take a screenshot

1:08:06

of a TV show, which I do all the time. And I

1:08:09

am always reminded, oh, yeah, this doesn't work. And I think the

1:08:11

reason why you can't do it, you were suggesting Casey, it's like,

1:08:13

oh, why don't they just show it to the person but not

1:08:15

show it to me? Then you got your

1:08:17

copper protection. I'm assuming it has to do with the

1:08:19

fact that essentially once you do

1:08:21

the mirroring, you have broken the chain of trust.

1:08:23

There's no way to do a three-way chain of

1:08:26

trust. So now nothing is trusted.

1:08:28

You have this weird forking scenario,

1:08:30

and you have this. This

1:08:33

is not on Apple insofar as Apple is just

1:08:35

following these stupid industry standards that we have, that

1:07:03

Apple kind of has to follow to work with

1:07:06

all of the other services. Even if Apple

1:08:42

didn't want to do this with its own streaming service, which

1:08:44

it does. But even if it didn't want to, it has

1:08:46

to work with all the other streaming services. So they have to

1:08:48

essentially implement this in your hardware, and everything has to be

1:08:50

certified. So this is all just so you can watch

1:08:53

content that's out there. And it infects

1:08:56

every part of their system as well because their

1:08:58

whole video chain and system is built on it.

1:09:00

And it's so incredibly dumb. So

1:09:02

hopefully they'll do something to fix this. I

1:09:04

mean, again, especially with Apple's own apps and

1:09:06

own streaming platforms, and own OS and device,

1:09:09

they should be able to fix it for that. Fixing

1:09:12

it for any other streaming apps, if they ever

1:09:14

exist on Vision Pro, ha ha, will

1:09:16

be more tricky. Yeah,

1:09:19

so anyway, showing people the

1:09:21

3D video proved to be tough because

1:09:23

the DRM thing is annoying. And

1:09:26

again, it's Apple's content on

1:09:28

their own streaming service, on their devices.

1:09:31

They know someone. Maybe they can talk to and work

1:09:33

this out. Otherwise, and

1:09:36

I do suggest for Apple, the immersive

1:09:38

3D video should be easier to find

1:09:40

in the TV app in the Vision

1:09:42

Pro. Oh my gosh, yes. Well, I mean,

1:09:44

it's true. Anything in any kind of

1:09:46

streaming service where it's like, what about the

1:09:48

thing I want to find? It's like,

1:09:50

never mind that. Have you seen these giant

1:09:52

things that we're advertising for the first

1:09:54

two full screen folds until you get

1:09:56

down to... Ugh. And it affects Vision

1:09:59

Pro too. Like anytime you're like, hey, here's

1:10:01

a video playing app, surely it will be easy

1:10:03

to find the things that I watch frequently. No,

1:10:05

exact opposite. It will be intentionally hard to find

1:10:08

the things that you want because they always want

1:10:10

to shove something new in your face. Never mind

1:10:12

what you constantly watch. Never mind anything about what

1:10:14

you want or your favorites or your frequency. It's

1:10:16

all about what do we have to push on

1:10:19

you, which is so dumb for Vision Pro where

1:10:21

there's so little content anyway,

1:10:23

they should just be making it all easy to find. But

1:10:25

again, it's based on the same code base as

1:10:27

the TV app and all their other platforms and

1:10:30

it sucks everywhere. Also, you

1:10:32

said a second ago, there's so little content. That

1:10:35

part has kind of surprised me. I

1:10:38

would have expected with the

1:10:40

launch of Vision Pro, I would have expected there

1:10:42

to be more of Apple's 3D and

1:10:45

immersive content than there actually is. There's actually very

1:10:47

little of it. It's like a few demos basically,

1:10:49

or like one episode of something. It's like 12

1:10:51

minutes here and there. There's

1:10:54

not much content yet. Obviously,

1:10:56

I'm sure Apple is gonna stage it out over the

1:10:59

course of the year as they sell more Vision Pros

1:11:01

and whatever else. But Apple has

1:11:03

a lot of power here because they

1:11:05

are a video producer

1:11:07

and they have shown that

1:11:09

they will make custom recording

1:11:12

gear and record perfectly immersive

1:11:14

stuff that's custom tailored to Vision Pro. That's

1:11:17

great. They need to be doing

1:11:19

a lot more of that because I think

1:11:21

it's going to be a while, if ever,

1:11:23

before they get large support from other producers

1:11:25

of video. Therefore, they should step

1:11:27

up more and produce a lot more stuff

1:11:29

for this than what we're seeing so far.

1:11:31

Hopefully, that's in the pipeline. But I was

1:11:33

kind of surprised and a

1:11:35

little bit disappointed that there wasn't more

1:11:37

immersive content available at launch. Well,

1:11:40

it's a chicken egg thing because even Apple's

1:11:42

own creative wing is saying, wait

1:11:44

a second, you want us to spend how many

1:11:46

millions to make a show that is only possible

1:11:48

to be watched by 500,000 people on the planet?

1:11:52

That's all the people who have the capability of watching it.

1:11:55

Let me show you how much money this is per person that

1:11:57

you're asking us to spend. Oh, you don't understand,

1:12:00

it's to drive people to buy the thing.

1:12:02

I'm like, yeah, but right now they haven't bought it

1:12:04

and you can't make more than this many per year.

1:12:06

And so I can see that conversation being difficult. They're

1:12:08

not going to make a For All Mankind in

1:12:11

headset 3D that can be

1:12:13

watched by you and 200,000 of your

1:12:15

closest friends. Because that is not, and

1:12:17

they're like, oh, it's for the future, it's for the future when

1:12:19

we sell 10 million of these things. Like, yeah, but you want

1:12:21

us to make it today. And I

1:12:23

can imagine that being difficult for them to square.
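To put rough numbers on that tension, with purely illustrative figures (nothing Apple has disclosed): a prestige show budgeted at 50 million dollars, watchable only on an installed base of 500,000 headsets, works out to

\frac{\$50{,}000{,}000}{500{,}000\ \text{headsets}} = \$100\ \text{per potential viewer}

while the same budget spread across a streaming service with 50 million subscribers is a dollar a head. That is the per-person math the hypothetical creative-wing pushback would lean on.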

1:12:27

I think they should make more of it, because I think it's

1:12:29

the most compelling thing in the entire headset. But I bet what

1:12:31

Apple is thinking is instead of

1:12:33

that, instead of like doing

1:12:36

what I think would be unprecedented, like

1:12:38

trying to essentially make Alicia Keys swimming

1:12:40

shark caliber of content that I don't think

1:12:42

has ever been made before and sort of

1:12:45

like a long form full television thing, like

1:12:47

with that resolution and those cameras for like

1:12:49

a regular TV show, and

1:12:52

figuring out how to do that, because I don't think anyone

1:12:54

knows how to do that well at this point. That

1:12:57

is much more experimental than the easier thing,

1:12:59

which is we should just do sports like this. We need

1:13:01

to get a good sports contract. And as Gruber said in

1:13:04

those things, when Apple lost out for the bid for the

1:13:06

NFL thing, he was kind of disappointed.

1:13:08

But now that he has Vision Pro, he's

1:13:10

angry about it. Because that is a gimme. You

1:13:13

saw in the demos how good sports looks, and you don't

1:13:15

have to make that content. They run around in the field

1:13:17

and they make it all for you. You just need to

1:13:19

point cameras at it. And you can have the cameras be

1:13:21

eye-width apart, and you have 17 of those cameras,

1:13:23

and you put them in weird places, and that is

1:13:25

a winner, and that is a big draw, and you

1:13:27

have to pay, well, you have to

1:13:29

pay way less money to make it. You just have to pay

1:13:31

money up front to get the rights to be the one who

1:13:34

has the cameras there. So I think

1:13:36

that is an easier first path for Apple to go

1:13:38

with this. Like, how do I make compelling content for

1:13:40

Vision Pro? Any kind

1:13:42

of popular sport. Film it in

1:13:44

3D, whether it's the NBA, or Major

1:13:46

League Soccer, or whatever. That

1:13:48

seems like an easy first move. And I bet Apple wants

1:13:51

to do that and is going to do that. Yeah,

1:13:53

also concerts, other events. It seems

1:13:55

like live events in 3D, that

1:13:59

seems like a big... market, including

1:14:01

sports and other things. Honestly, anything

1:14:03

that isn't creepy. Hey,

1:14:06

sharks swimming in water, real popular. We're

1:14:09

all kind of snarking, but we're also all

1:14:11

serious. I cannot

1:14:13

overstate, I haven't done a

1:14:15

demo of the Vision Pro since the first weekend I

1:14:17

had it because we've been just exceedingly busy the last

1:14:19

week and a half, whatever it's been,

1:14:22

but I think it was the first day that I had it

1:14:24

that I did a handful of demos for a couple of friends.

1:14:27

The thing that unquestionably sold everyone

1:14:29

the most was that sizzle reel,

1:14:32

which I think we talked about quite a bit last

1:14:34

week, of all the different immersive stuff. The

1:14:37

tightrope walking lady, the sharks,

1:14:39

the soccer game, the rhinoceros,

1:14:41

the Alicia Keys. I

1:14:45

know you can imagine what

1:14:48

it would be like to be

1:14:50

watching something, but as you

1:14:52

twist your head, your perspective changes. It's

1:14:55

an obvious and easy thing to

1:14:57

imagine, but when you're

1:14:59

actually doing it and when you're

1:15:02

seeing the incredible fidelity of the

1:15:04

thing that you're watching, this isn't

1:15:06

some 480p crappy

1:15:09

recording because it just can't handle anything

1:15:11

more. It's not flickery and dim and

1:15:14

weird like the 3D movies with the glasses that

1:15:16

you watch. I cannot

1:15:18

overstate how incredibly impressive this

1:15:20

stuff is. Naturally,

1:15:23

anything that they want to put in

1:15:25

this immersive environment, I'm game

1:15:27

to at least try it. Yeah, the Alicia Keys,

1:15:29

as Mike had said, singing at

1:15:32

you is a little bit weird,

1:15:34

but it was also freaking cool.

1:15:36

It was so cool. I

1:15:38

actually have to go back and watch the whole thing, but I think

1:15:40

I had said last week, I skipped through several minutes of it and

1:15:42

I just kind of zig-zagged

1:15:44

around. It was phenomenally cool.

1:15:46

To build on what Marco was saying a minute ago,

1:15:48

I would pay all the

1:15:50

money to have a really good Dave

1:15:52

Matthews or Mute Math or whatever concert

1:15:55

that's been recorded with these white

1:15:59

obelisks of three 3D cameras, I

1:16:01

would give all the money. And I

1:16:04

just cannot overstate how impressive this is.

1:16:06

As impressive as you imagine it might

1:16:08

be, like double or triple that,

1:16:10

because that's how good it is. So

1:16:13

the video demos, again, if we get past the

1:16:16

DRM and having them navigate to the

1:16:18

Apple TV app, it is a

1:16:20

very impressive thing. I will also say before I

1:16:22

forget that the, of the

1:16:24

two straps that come with the Vision Pro, the

1:16:27

fancy one with the crank and the single

1:16:29

headband around the back, the Solo Knit Band, is

1:16:32

far better for demo purposes than

1:16:35

the nice comfortable Dual Loop Band, because it's

1:16:37

so much faster and easier to adjust it.

1:16:39

You know, the Dual Loop Band is like,

1:16:41

all right, you get these two Velcroed straps,

1:16:43

you got to, you know, strap, pull it, strap it down.

1:16:45

Like, that's very impractical for

1:16:47

demo purposes. The hastily assembled one is

1:16:50

less practical than the one that they

1:16:52

clearly designed from the beginning. Surprise.

1:16:55

Yeah, yeah, so you want to be using the Solo Knit Band,

1:16:57

with the single loop that goes around the back. You want

1:16:59

to use that for demos if you have it. First, I

1:17:02

had Tiff try it. She

1:17:05

could not possibly be less

1:17:07

interested. And

1:17:10

you know, nerds out there,

1:17:12

many of you have people

1:17:14

in your life that you try to demo

1:17:16

technology for, and maybe they

1:17:19

will humor you and support you in

1:17:21

your love for technology, but

1:17:23

you can kind of tell they're kind

1:17:25

of doing you a favor. Their

1:17:27

heart's not really in it. That's

1:17:29

how this scenario was. You

1:17:34

know, she was not impressed by the

1:17:37

fit, was not impressed by the

1:17:39

weird, what she called the nose cape.

1:17:43

I didn't notice that until like, I don't

1:17:45

know, the second or third day I had

1:17:47

it when it accidentally flipped downwards. So what

1:17:49

Marco's talking about is, there's like

1:17:51

this very thin, completely like

1:17:53

flapping in the breeze material that

1:17:56

sits directly on top of your nose, which

1:17:59

by default is kind of flipped upwards,

1:18:01

so it's black against the black inside of

1:18:03

the Vision Pro and you don't really notice it.

1:18:05

But it has give to it, because

1:18:08

it's just a piece of fabric, and so it flipped down

1:18:10

once and I was like, what the hell? Oh,

1:18:13

I didn't even know that thing was there. Like, it

1:18:15

took me a day or two before I even realized

1:18:17

what the heck that was. But yes, nose cape is

1:18:19

a very good word for it, or term for it.

1:18:21

Yeah. Yeah, so she wasn't

1:18:23

super pleased with the physical side of it. It was

1:18:26

you know, heavy on her. I mean, it's not fitted

1:18:28

to her. Did you get her a different light

1:18:30

shield? No, and that's a fair thing.

1:18:32

Like, you know, obviously everyone that I'm having try

1:18:34

this is trying my size of everything on it. So

1:18:37

anyway, she hated

1:18:40

having to go through the

1:18:42

eye tracking dot pinching introductory

1:18:45

thing. That was not fun. I

1:18:48

have found also many people who try it on

1:18:50

do not intuitively get the IPD adjustment thing,

1:18:52

where you have to, like, double tap the

1:18:54

crown to confirm. It's like, hold

1:18:56

it down and then double tap it. Like,

1:18:58

that's especially tricky since the instructions are presented to

1:19:00

you probably in double vision,

1:19:02

because it hasn't adjusted yet. And

1:19:04

so it wants you to look at like a diagram and

1:19:06

understand what it wants you to do, but you're seeing double

1:19:08

at that point. Well, actually, for whatever reason, maybe

1:19:10

it's just me, but even when it's in

1:19:13

its, like, unset state, I find it

1:19:15

fairly clear in that mode. I mean, you can

1:19:17

close one eye, obviously. Yeah, anyway,

1:19:19

the funny thing is also like she's

1:19:21

an amazing tester of my app. She has

1:19:23

a special talent. I can

1:19:26

hand her something that works perfectly, and within

1:19:28

a second she will find a way to

1:19:30

break it, which is actually wonderful as a

1:19:32

software developer. Like, that's a great quality for

1:19:34

your spouse to have, because it's a wonderful

1:19:36

first stage of QA. She puts on

1:19:38

the Vision Pro, and this is still when I had the 1.0

1:19:42

software, and I now have the 1.1 beta on it. So

1:19:44

I don't know if this is fixed yet, but she put

1:19:46

it on, and it

1:19:48

basically immediately locked up and had to be rebooted. Oh, cool. This

1:19:52

is still very, you know, very 1.0 kind of

1:19:52

is still very you know, very 1.0 kind of

1:19:55

days Anyway,

1:19:57

so she gets through it she

1:19:59

basically basically said, okay, yeah, it's cool, but why would

1:20:01

I want this? Wait, wait, wait.

1:20:04

Even after she saw the shark? She

1:20:06

actually bailed out pretty quickly.

1:20:10

So I had all

1:20:12

the testers try the Encounter Dinosaurs quote-unquote

1:20:15

app, which is more of like a brief 3D

1:20:17

demo. This is the thing you've heard

1:20:19

about on other podcasts where like you hold your finger out

1:20:21

and the butterfly lands on it. So I had five different

1:20:24

people try this. All five of them

1:20:26

put their finger out to have the butterfly land on it. It doesn't

1:20:28

tell you to do that, but it kind of looks like you can.

1:20:30

And so you try it and oh look, the butterfly land on my

1:20:32

finger. All five people did that. At

1:20:34

some point, a large dinosaur comes into your

1:20:36

field of view, which can look somewhat intimidating.

1:20:39

Two people that I had tried, including Tiff,

1:20:42

as soon as the big dinosaur showed up, they were just

1:20:44

like, no, I'm out. And just like took my head off.

1:20:47

That was the end of the demo. I'm done.

1:20:50

Did you show them the dinosaur before you showed

1:20:52

them Alicia Keys and the tightrope walker? Yes. I

1:20:55

know. I know, that may have been a mistake. I couldn't find them in the Apple TV app. Anyway,

1:21:00

so yeah, two people like noped right out

1:21:02

of the headset. As soon as the big

1:21:04

dinosaur showed up and the other three all

1:21:06

tried to pet the big dinosaur. And

1:21:10

you pressed the dinosaur.com, yeah. Yeah,

1:21:13

so anyway, that's roughly how it went.

1:21:16

Everybody was fairly impressed with the 3D

1:21:18

video content. Everybody was impressed by the

1:21:20

dinosaur thing. I do wish there was

1:21:22

like a little bit more, like one

1:21:24

more 3D like interactive experience to show

1:21:26

people. Again, this will probably come with

1:21:29

time I assume. But it is kind

1:21:31

of, again, I wish there was

1:21:33

a little bit more of a demo because after you watched

1:21:35

a couple of sample things and it's like, okay, well that's

1:21:37

kind of it. Like you can open up notes

1:21:39

or my email if you want to see how that

1:21:41

kind of stuff works. It's a little bit awkward though.

1:21:44

You open up photos, oh, here's a panorama. You can

1:21:46

do that kind of stuff but I

1:21:48

do wish there was a little bit more demo content

1:21:50

available. But again, this will probably grow over time. Did

1:21:52

you, you could have taken some spatial video of Adam

1:21:54

with your phone and put it in there. Like I

1:21:56

said, I did. I'm kind of surprised that Tiff wasn't

1:22:00

convinced by like the birthday scene

1:22:02

and like the spatial

1:22:04

video of people, like seeing the

1:22:06

possibilities for your own content like that? Not

1:22:09

really. Maybe

1:22:13

she was upset with the eye tracking and the nose

1:22:16

cape, but it didn't

1:22:18

really sell her. And then finally I got some

1:22:21

interesting input from Adam. So this is my 11

1:22:23

year old son. He is a heavy

1:22:26

user of the Quest series of VR devices,

1:22:28

had a Quest 2 for a while, recently

1:22:30

got a Quest 3. He barely cared first

1:22:33

of all about using the Vision Pro because

1:22:36

there's no games. Like for him, VR

1:22:38

means games. Obviously not a

1:22:41

lot of people are gonna be buying a

1:22:43

nearly $4,000 VR headset to play games on

1:22:45

it as the primary purpose, but

1:22:47

it's interesting like you know from a kid's point of

1:22:49

view how this is totally irrelevant. It's like a Mac Pro to a

1:22:51

kid: like, why would I want that? He barely

1:22:53

cared. However he did try it on,

1:22:55

he did a dinosaur demo and everything.

1:22:57

He instantly noticed that the pass-through is

1:23:00

better than the Quest 3's pass-through only

1:23:04

when stationary, but it's actually

1:23:06

worse in motion. I have

1:23:08

since tried his Quest 3 and

1:23:10

he's exactly right, he nailed it. Motion

1:23:13

in the Vision Pro in general gets

1:23:15

very blurry. I've even noticed that like

1:23:17

even when using the virtual Mac screen,

1:23:19

even when just doing computing in Vision

1:23:22

Pro, if I move my

1:23:24

head a little tiny bit, I notice

1:23:26

the motion blur and it's not

1:23:29

ideal. It's not like a massive deal killer, but it

1:23:31

is something that you notice and it is yet again

1:23:33

one of the ways that I

1:23:35

kind of felt a little eye strain when trying to

1:23:37

use the Mac Monitor mode over

1:23:40

just using a real Mac Monitor. There

1:23:42

is that motion blur. Yeah, I heard a lot of

1:23:44

people talking about that and I do wonder could you

1:23:46

tell whether it is like manually

1:23:48

created motion blur? So for example

1:23:50

Destiny has a

1:23:53

setting in the settings menu that says do you

1:23:55

want us to do motion blur and if you

1:23:57

have it checked, whenever the camera moves,

1:23:59

they will artificially create motion

1:24:01

blur by blending together frames because that's what

1:24:03

you're used to seeing from, you know,

1:24:05

cameras like film cameras or video cameras or

1:24:08

whatever when you move them around But

1:24:10

you can turn that off and say no don't

1:24:12

pretend you're a film camera Don't artificially create motion

1:24:15

blur, just show me the frames, which looks less

1:24:18

like what we expect from our life of watching

1:24:20

filmed content. But if you're playing an FPS

1:24:22

game, I find you

1:24:24

can see things better, so I turn it

1:24:26

off. So I do wonder: is Apple adding

1:24:28

motion blur intentionally, computationally,

1:24:30

by blending frames together to make it look

1:24:32

more like how

1:24:34

we expect it to look? Because otherwise,

1:24:37

like, I can't imagine that

1:24:39

it's anything else, because they're OLED screens; I imagine

1:24:41

the response rate has to be insanely fast, like

1:24:43

every OLED. So it's a little bit mysterious to

1:24:45

me, but I heard this exact same complaint in

1:24:47

many different reviews. I'm just wondering

1:24:49

if it's on purpose or not.
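(To make concrete what "blending frames together" amounts to, here is a minimal sketch in Python. The function name and its persistence parameter are invented for illustration; this is generic frame averaging, not anything Apple or Bungie has documented.)

    import numpy as np

    def blend_frames(prev_frame, curr_frame, persistence=0.5):
        # Weighted average of the previous rendered frame and the
        # current one. Higher persistence leaves more of the old frame
        # behind, which reads as motion blur when the camera moves.
        # Feeding each result back in as prev_frame gives an
        # exponentially decaying trail, the cheapest common trick.
        return persistence * prev_frame + (1.0 - persistence) * curr_frame

Yeah, I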

1:24:51

don't know. And I mean, like, the fact

1:24:53

that it isn't just motion blurring your pass-through

1:24:56

content, but it's also motion blurring the content

1:24:58

of windows. Yeah, well, Todd Vaziri would tell

1:25:00

you: check your motion blur. Everything has motion

1:25:02

blur, even lens flares. So yeah, they would

1:25:04

motion blur the pass-through video, the

1:25:06

windows, whatever; they would motion blur everything,

1:25:08

because that's, again, the expectation of how it would

1:25:10

look: when you see a TV

1:25:12

show and they pan the camera, you get

1:25:14

motion blur. Yeah, also, for whatever it's worth,

1:25:16

one of the reasons why I never really

1:25:18

spent a lot of time with the Quest

1:25:21

2 Is that I would get

1:25:23

a little bit motion sick after a fairly short time

1:25:25

and I when I tried the Quest 3

1:25:27

fairly briefly I

1:25:30

had the exact same problem It is

1:25:32

obviously a huge upgrade over the Quest 2 But

1:25:34

it is not good enough for me to avoid motion

1:25:37

problems, which I again I don't usually have in the rest

1:25:39

of life But for some reason

1:25:41

Quest 2 VR was not good for me, Quest

1:25:43

3 VR is also not good for me. Vision

1:25:45

Pro, I do not have that problem at all. I

1:25:47

feel zero motion problems in Vision

1:25:49

Pro. So a lot of it might have to do with

1:25:51

the, whatever it was called, there was just a podcast

1:25:54

that we'll remember to link in the show notes so you

1:25:56

can find it, talking to

1:25:58

the CEO of the company that Apple bought

1:26:00

back in 2017, and they made an

1:26:02

AR thing with pass-through, and their whole

1:26:04

shtick was: the cameras

1:26:06

in Vision Pro and most headsets are not where

1:26:08

your eyeballs are so their perspective is different than

1:26:11

your eyeballs so they have to do computational stuff

1:26:13

to sort of remap the camera's

1:26:15

view with like an awareness of what

1:26:17

shape the world is so that

1:26:20

it looks like you're looking through your eyeballs and

1:26:22

not like your cheeks which is where the actual

1:26:24

cameras are in Vision Pro and that mapping I

1:26:26

think is either not done as

1:26:28

well or maybe even not done at all on

1:26:31

things like the Quest because pass-through is not their

1:26:33

emphasis you know it's more of a game-playing machine

1:26:35

and that could be making you sick because imagine

1:26:38

if your eyes saw out of the center of your cheeks

1:26:40

and you moved your head around your brain would be like

1:26:42

I'm not seeing what I expect to see and you know

1:26:44

get the disconnect between what you see and what you feel.

1:26:47

That's possible. I mean, I tried,

1:26:49

I used the Quest 3

1:26:51

for about maybe 20 minutes, and part of

1:26:53

that was I tried like

1:26:55

a full-screen game, and it didn't seem to

1:26:57

be any different; it was bad there too,

1:26:59

so I don't know. Yeah, anyway.

1:27:02

So, speaking of eyes and eye placement,

1:27:04

this leads me to my last point about

1:27:06

the demo experience, which was EyeSight, the

1:27:08

display of my eyes on the outside. I

1:27:12

have now used it enough around my family that they have

1:27:14

seen it. To

1:27:16

most people who have seen it, it

1:27:18

is creepy as hell. And

1:27:22

so that's interesting. What's also interesting

1:27:24

is, during one

1:27:26

of the demos I handed it to a

1:27:28

friend, and somehow it stayed logged

1:27:30

in as me. I don't know how this happened,

1:27:33

but somehow it, like, accidentally

1:27:35

displayed my eyes on

1:27:38

their head while they were actually using it, and I got

1:27:40

to see my own eyes. Oh, that's actually

1:27:42

kind of convenient, albeit a weird

1:27:44

security violation. Yeah, so I got to see my

1:27:46

own eyes on someone else's head as they

1:27:48

used it. Let me tell you, that is

1:27:50

a strange experience. I

1:27:53

do not recommend that experience.

1:27:55

Anyway, the EyeSight though. So I was

1:27:57

sitting at my kitchen island using

1:28:00

the Vision Pro for a while with my MacBook Air, testing

1:28:03

that out for a while, getting some computational stuff

1:28:06

done over the weekend. And Adam was hanging

1:28:09

out nearby on his computer, down the island

1:28:11

further, and he looks over and he's like,

1:28:14

"'Daddy, how are they doing that with your eyes?'

1:28:17

I was like, "'What do you mean?' He

1:28:19

said, "'How can I see your eyes? I thought you were

1:28:21

looking at the screens?' He

1:28:23

was totally fooled. He

1:28:26

thought it was real, so it

1:28:28

worked in the sense that it fooled

1:28:31

another person who didn't

1:28:33

realize that it was a simulation

1:28:35

from screens. So... He's

1:28:37

only 11. Yeah, so it didn't fool any

1:28:39

of the adults, but it

1:28:42

did fool someone. And so I feel like it

1:28:45

is possible to make this feature better enough

1:28:47

in the future if they want to, to

1:28:50

maybe fool adults on a regular basis, but

1:28:53

I still don't like it. I

1:28:55

see why they did it. We'll

1:28:58

be talking about this to death over the next three

1:29:01

years before they finally kill it, but I see

1:29:03

why they did it to try

1:29:05

to make this product less antisocial than it

1:29:07

really clearly is, but I

1:29:09

still don't think it's going to be great. But

1:29:13

there does seem to be enough room for improvement that

1:29:15

maybe they can make it passable

1:29:17

so the adults won't think it's

1:29:19

too creepy, and then they can

1:29:21

just get rid of it in a few years when they realize

1:29:23

it's not worth the weight and battery savings. Well, so here's the

1:29:25

thing about getting rid of it. Obviously the end goal is how

1:29:28

about just make clear glasses where they can see your actual eyeballs.

1:29:30

Like that's what they would like to make, but we don't have

1:29:32

technology for it. But we do have transparent OLED screens, but we

1:29:35

don't have the confluence of technology to be available

1:29:37

to make something that light, that high fidelity with

1:29:39

that bright of screens, yada, yada, yada, but the

1:29:42

end stage will presumably be, they

1:29:44

see your actual eyes, and you don't have to do all

1:29:46

this trickery, right? Getting rid of it, we

1:29:48

obviously think for weight and cost reasons, if you have

1:29:50

to make a low cost version of this, that's an

1:29:52

easy way to save money. But, as

1:29:56

weird as eyesight is, and as janky as it is,

1:29:58

and I do think it's pretty janky, because I saw

1:30:00

a lot of people doing it in the Apple store. And

1:30:03

it's dim. The

1:30:05

lenticular lenses only show a couple different images from

1:30:07

different angles, so they can't cover them all. So

1:30:09

sometimes your eyes don't look like they're in the

1:30:11

right place, depending on what angle you're on. But

1:30:15

it serves an important function, to

1:30:17

make it so other people are aware when you

1:30:19

can see them. That's

1:30:22

important. That's important for the question of how socially

1:30:24

acceptable this is, because we don't like

1:30:27

seeing people with their eyes totally blacked out the same way

1:30:29

it's considered kind of rude if someone's wearing really dark glasses

1:30:31

all the time. And when you're talking to them and you

1:30:33

want to have a serious conversation, you just want to say,

1:30:36

take off the sunglasses. You can see their eyes. It's just

1:30:38

an instinctive thing that we have. It's not the end of

1:30:40

the world. Sunglasses exist, and we don't hate everybody who wears

1:30:42

them. But wearing dark sunglasses

1:30:44

during an important conversation is considered

1:30:46

rude for a reason, right? Or

1:30:49

wearing dark sunglasses indoors, or at night, as the

1:30:51

song goes. So the

1:30:54

function, I think, they

1:30:56

can never really get rid of; the need for it

1:30:58

will always be there. Until we can see your eyes, the need

1:31:00

for it will always be there. How it's

1:31:03

implemented, there is some flexibility.

1:31:05

So even if they

1:31:07

don't ditch it entirely for a cheaper and

1:31:09

lighter model, you can imagine a

1:31:12

much, much simpler version of eyesight that shows two

1:31:14

big cartoon eyeballs. In fact, Apple has patents related

1:31:16

to this exact thing, and maybe they even prototyped

1:31:19

it and thought it was dumb. But

1:31:21

boy, you can make that way lighter if you do

1:31:23

two monochrome E ink screens on the outside of the

1:31:25

goggles that look like googly eyes that don't even pretend

1:31:27

to look like your eyes. Or even, I

1:31:29

think it was in their patent, like a

1:31:32

text display that says, I can currently see you,

1:31:34

or whatever. You know what I mean? Why not

1:31:36

just put actually googly eyes on there? They're much

1:31:38

lighter and cheaper. Yeah, but the thing is, you

1:31:41

want it to be switchable, because it's

1:31:43

trying to communicate to people, when can you see me and

1:31:45

when can you not see me? I

1:31:47

kind of wish they had this for AirPods, where they can tell

1:31:49

when audio is playing in them and when audio is not playing

1:31:51

in them. And so I think that

1:31:54

the utility of that feature will always exist. So that's the question

1:31:56

of how important is it. Is it important enough for you to

1:31:58

pay X amount more for it? Is it important enough for you to add

1:32:01

Y amount of weight? But I

1:32:03

don't think we'll ever get to a point where we say there

1:32:05

is no utility being able to tell when people can

1:32:07

see me. There's always utility in it. It's just a

1:32:09

question of what is the correct trade

1:32:11

off to get that functionality? And I think you

1:32:13

can get a lot of the benefit. Not the

1:32:16

emotional, I can see your eyes benefit, but at

1:32:18

the very least the binary, can this person see

1:32:20

me or not benefit? You

1:32:22

can get that with way less weight and way

1:32:24

less cost than they're currently doing. And I do

1:32:26

wonder every time I see this, are

1:32:29

these CGI eyes that much

1:32:31

better than monochrome googly eyes? I mean,

1:32:33

they're a little bit better, but I think monochrome

1:32:35

googly eyes would be easier to see at a

1:32:37

glance. When I was seeing people do their demos

1:32:39

in the Apple store and you get close enough

1:32:41

to them to be like in the person range

1:32:43

or whatever, so you can see their eyeballs and

1:32:45

they can see you when they're doing the pass

1:32:47

through. Sometimes if you're not

1:32:49

at the right angle and there's so many like specular

1:32:52

highlights on that stupid shiny thing, you can't even see

1:32:54

what the heck, you

1:32:56

can't see the dim image on the screen through

1:32:58

the lenticular stuff, whereas if

1:33:00

they were monochrome, high contrast googly eyeballs,

1:33:03

or at least I could see them from every

1:33:05

angle and know when they're totally immersed with the

1:33:07

blue wavy stuff now and when they can actually

1:33:09

see me. So I think unlike

1:33:11

the touch bar, which, in my opinion, needed to

1:33:13

die and be rethought, I think

1:33:15

the things that EyeSight is trying to do

1:33:18

are worth continuing to try to do

1:33:20

until we can see your actual eyeballs.

1:33:23

I'm just not convinced that the way they're trying to

1:33:25

do it in the very first Vision Pro is

1:33:29

the right path to be traveling down with the lenticular

1:33:31

lenses and the really dim eyes and stuff like that.

1:33:33

So we'll see what they do for version two and

1:33:35

if they drop it from one of them, we'll see

1:33:37

how much that model is frowned

1:33:40

upon because it doesn't have that feature,

1:33:44

but I'm not as anti-EyeSight as other

1:33:46

people because I definitely see the

1:33:48

point of this feature and I think that point is always

1:33:51

going to be relevant. Yeah, I'd actually like

1:33:53

to build on what you said. I am

1:33:55

pro EyeSight. It's not perfect by any stretch

1:33:57

of the imagination, but for all the reasons

1:33:59

you said... I think it's

1:34:01

absolutely worth it. Like I think it is

1:34:03

useful to get that visual cue whether

1:34:06

or not the other person is paying any attention to you

1:34:08

and to get that visual cue whether

1:34:10

or not that person is in an immersive environment.

1:34:12

Like I think these are all really useful things

1:34:14

Yeah, it looks janky as the both of you

1:34:17

have said. Yeah, it's not as bright as it

1:34:19

should be Yeah, occasionally it looks like your eyes

1:34:21

are not where they're supposed to be But I

1:34:24

think this is the best that we can

1:34:26

do right now. And if

1:34:29

Apple can make this better, I don't think that this is a

1:34:31

bad path to go down. Now, maybe there are

1:34:33

other better paths. I'm not saying that this

1:34:35

is definitely the winner But

1:34:37

I do think that they've gone down the

1:34:40

right path I do think this juice was

1:34:42

worth the squeeze and I do think that

1:34:44

it makes the device that much more appealing

1:34:47

for regular people

1:34:49

and that includes me like I think I would

1:34:51

like this device less if it didn't have eyesight

1:34:54

even knowing that EyeSight is janky and weird. In fact,

1:34:56

I would argue in some ways it's almost better that

1:34:58

it's janky and weird, because then we can all have a

1:35:00

good laugh about how janky and weird it is. Well, if

1:35:02

you think about it, this is another sad reality of

1:35:02

Apple today, with its restrictive policies

1:35:05

about what can and can't be produced. In fact, I

1:35:07

just saw someone get a Vision Pro app rejected because what

1:35:09

they made looked too much like the macOS Dock

1:35:12

or something, so of course Apple rejected it. Anyway,

1:35:18

if, if Vision Pro, we

1:35:20

travel back in time and it's the Mac of

1:35:22

the late 80s and early 90s, there

1:35:25

would be APIs that people would either

1:35:27

discover or Apple would publish (most likely

1:35:29

people would discover) for controlling that front

1:35:31

screen, and we would have Talking

1:35:34

Moose eyeballs out on the front, Yoda

1:35:36

eyes, because they would

1:35:38

hack it, you know. They

1:35:40

would find the APIs for finding where your eyes

1:35:43

are pointing, and people would figure out

1:35:45

how to use that screen. And we would have

1:35:47

tons of fun third-party apps doing different kinds of

1:35:49

cartoon eyeballs. And guess what? All those silly apps

1:35:51

made by indie developers, probably distributed for free just

1:35:53

for fun, would be a

1:35:56

perfect lab for us collectively as a

1:35:58

community to figure out How

1:36:00

does it work? Are cartoon eyeballs good? Should we

1:36:03

try photo-realistic? How do these screens work, right? That

1:36:05

kind of sort of laboratory of allowing people to

1:36:07

try things and then Apple gets to watch it

1:36:09

all happen and then pick the winners and incorporate

1:36:11

them into the OS is how the Mac got

1:36:14

to where it is today. And all

1:36:16

of Apple's post-Mac platforms have been essentially

1:36:18

denied the opportunity to allow that to happen. And

1:36:20

the only people who can come up with ideas

1:36:22

are Apple because they keep all those APIs to

1:36:24

themselves. And if you try to submit an app

1:36:26

with private APIs they'll reject it. And even if

1:36:28

you try to submit an app that doesn't use

1:36:30

private APIs but looks kind of like the Dock, they

1:36:32

reject that too, because they're like, we haven't thought of that

1:36:34

yet, so no, we don't want you, third-party developer, to

1:36:36

ever try anything like that. And that really annoys me

1:36:39

because I would like to

1:36:41

see fun things on that front screen even if it is

1:36:43

a scrolling text message that says I can currently see you.

1:36:45

I can currently see you... you know, like, who knows what the

1:36:47

right choice is. Obviously Apple prototyped a whole bunch of them,

1:36:49

because again you look at those patents which means they did

1:36:52

all that stuff internally. What they shipped is

1:36:54

the current eyeballs but I'm willing to believe

1:36:56

that there are other

1:36:58

ways to communicate some or all

1:37:00

that information better and more cheaply and

1:37:02

with less weight. Do you think though

1:37:05

like you know you mentioned AirPods earlier and how nice

1:37:07

it would be if people could tell whether you could

1:37:09

hear them or not. But I think

1:37:11

that also is a kind of interesting counterpoint

1:37:13

to this even being an achievable

1:37:15

goal because we've had AirPods now for

1:37:17

a while. I think

1:37:20

people still don't know when

1:37:22

and whether you can hear them with

1:37:25

AirPods and it still makes people feel

1:37:27

weird and what we learn is that

1:37:29

the correct kind of societally

1:37:31

polite social interaction model is if

1:37:34

you're going to stop and talk to somebody

1:37:36

while wearing AirPods you should take them out.

1:37:39

Even if you could hear them you should

1:37:41

take them out just so that there's no

1:37:43

ambiguity so they know you can

1:37:45

hear them and that you're not listening to something else.

1:37:47

I think the same thing is going to be true

1:37:49

of Vision Pro. Maybe you know

1:37:51

some people might be aware of this

1:37:54

weird eye display on the outside and

1:37:56

what this indicates versus not indicating but

1:37:58

for most people... If someone's

1:38:00

coming up to you and wanting your attention

1:38:02

or to have a conversation with you, the

1:38:05

right move is to take off the Vision Pro,

1:38:07

not to try to teach society, oh, this

1:38:09

means I can see you. Well,

1:38:12

but I think there's a big difference between the ears and the

1:38:14

eyes because there's nothing to indicate

1:38:16

whether ears are accepting sound or not, other than like

1:38:18

you say, oh, I see things in your ears, that

1:38:20

means you can't hear me. But we all know that's

1:38:22

not true, because especially if you're not wearing AirPods

1:38:25

Pro, having earbuds doesn't mean you can't hear anything. But

1:38:27

we all know when someone's looking at us because we

1:38:29

can see their eyes pointing at us. That's why Apple's

1:38:32

choice to go for realistic eyes removes

1:38:34

the need for you to understand what the

1:38:36

googly eyes mean or know that a green

1:38:38

light means that the camera is on, right?

1:38:40

Like they don't require any of that. They

1:38:42

just require what your son did, which is

1:38:44

like, hey, I see your eyes. That probably means you

1:38:47

can see me. That requires

1:38:49

no kind of training. But there's no

1:38:51

expectation that you can ever look at somebody and

1:38:53

know by looking at their ears whether they can

1:38:55

hear you. That's just not that. But the eyeballs

1:38:57

tell you. So with the eyeballs, there's a clear

1:38:59

solution. So like just show the

1:39:01

eyeballs. And again, the solution being how about

1:39:03

having clear glasses where they can literally see

1:39:05

your eyeballs. Just show the eyeballs. Cartoon eyeballs

1:39:07

may have a little bit higher learning curve.

1:39:11

And as for the AirPods, I think what society

1:39:13

has determined based on my experience with AirPods is

1:39:15

that everyone assumes that you can always hear them.

1:39:18

That's my experience both in and out of my house.

1:39:21

I have AirPods on my ear every time I

1:39:23

take a dog walk. And not a single time

1:39:25

has anyone even considered the fact that there might

1:39:27

be a podcast playing. They just start talking to

1:39:29

me. And this also happens inside my house, but my

1:39:32

family's here. And

1:39:34

I'm amazed. I'm like, these are not small white

1:39:36

earbuds. You can see them. There's no hat covering

1:39:38

them. And they're like, I just assume

1:39:40

you can hear everything I can say. And then I have to

1:39:42

quickly go up and pinch the thing so I can actually hear

1:39:44

what they're saying and pause the podcast or whatever. Ears

1:39:48

are a much more difficult situation because there

1:39:51

is no sort of obvious way

1:39:53

to indicate anything. It would have to

1:39:55

be learned. But eyeballs, there's

1:39:57

an obvious way. We just haven't been able

1:39:59

to. pull it off that well. And I think

1:40:01

that's probably why Apple didn't do text or

1:40:04

funny symbols

1:40:06

or cartoon things. And ideally,

1:40:09

Apple would like to make that image as

1:40:11

realistic as possible so that someone thinks, I

1:40:14

can faintly see your eyes through the really

1:40:16

dark ski goggles you're wearing outside for some

1:40:18

reason, weirdo. By the

1:40:20

way, there is totally a way they could do it with AirPods.

1:40:22

They just haven't. What they need to do is put

1:40:25

an OLED color screen on the outside of

1:40:27

each AirPod. When

1:40:30

you're in transparency mode, just have it show

1:40:32

a simulated image of an ear. And so

1:40:34

it just disappears. Of the inside of your

1:40:36

ear. I guess

1:40:40

the problem is even just seeing someone's ear, you don't

1:40:42

know whether they can actually hear you or not because

1:40:44

they have those earplugs that are shoved way down in

1:40:46

your ear canal. Or you could be

1:40:48

hard of hearing. And it's another thing

1:40:50

with like, you can kind of

1:40:52

tell that with people who can't see you because if

1:40:55

they can't see you, they're not going to point their

1:40:57

eyes at you. Which is like the sign that if

1:40:59

you see someone's eyes move to you and they're looking

1:41:01

at you, you assume they can see you because if

1:41:03

they couldn't see you, they wouldn't know where to point

1:41:05

their eyes. You know what I'm saying? It's just like,

1:41:07

there's much less to learn there. Whereas

1:41:09

seeing the gross waxy inside of people's ears, I'm

1:41:12

not sure if that's much of an indicator of

1:41:14

anything. But it would be good if

1:41:16

you had, like, if you had an earwax problem, they could

1:41:18

put a little camera in there and you could use the

1:41:20

AirPods as one of those... what do doctors call that tool

1:41:22

they stick in your ear? Is that a

1:41:24

something-scope? Yeah, probably. It's a something-

1:41:26

scope for sure. All

1:41:30

right, so do we want to talk about Fitts' Law? Yeah,

1:41:33

this is something that came up on Dithering,

1:41:36

John Gruber, Ben Thompson's podcast. And

1:41:38

they were talking about Fitts' Law, which they

1:41:40

insisted on pronouncing "Fitts's Law," because the person's

1:41:43

name is F-I-T-T-S, and the correct

1:41:45

way to possessivize that is F-I-T-T-S apostrophe

1:41:50

S, which you pronounce as "Fitts's Law." But

1:41:52

I'm sorry, I'm old and I've been saying

1:41:54

"Fitts' Law" my entire life, and the

1:41:57

Wikipedia page even says that it's often cited

1:41:59

as "Fitts' Law." That's how I'm gonna say it.

1:42:01

Anyway, Fitts' Law, for those who weren't Mac users in

1:42:03

the 80s, is a thing that

1:42:05

says that the ease of targeting something

1:42:07

with a mouse is proportional to

1:42:09

the size of the target, which kinda makes sense

1:42:11

if you have a big target. Anytime

1:42:14

people need to get the mouse into this area

1:42:16

and it's a big giant area, it's real easy for them

1:42:18

to get the mouse into it. And if the area is

1:42:20

like two pixels by two pixels, it takes them much longer

1:42:22

because they move the mouse over two, but then overshoot, then

1:42:24

they gotta back up and you adjust and adjust and finally

1:42:26

get it into the two little pixel target. But if it's

1:42:28

a really big area, like a quarter of a screen, real

1:42:31

fast, people can move the mouse cursor, mouse pointer into

1:42:33

it real quickly. This is research from user interface from

1:42:36

the 80s back when the Mac was new and they

1:42:38

were trying to figure out the best way to define

1:42:40

interfaces. And the reason it comes up in

1:42:42

the context of the Mac is one of the things the

1:42:44

Mac interface had from day one is the menu bar at

1:42:46

the top of the screen. And this

1:42:48

is always cited as a great example of

1:42:50

Fitt's Law because you can just jam your

1:42:52

cursor up against the top of the screen,

1:42:54

this is before multiple screens. Anyway, jam your

1:42:56

cursor up against the top of the screen

1:42:59

and you don't have to care when the mouse

1:43:01

cursor stops. It will hit the top of the

1:43:03

screen and the cursor won't go off the edge.

1:43:05

And so essentially the menu bar has infinite height

1:43:07

from a targeting perspective. When you plug the numbers

1:43:09

into the Fitt's Law, you're like, okay, the menu

1:43:11

bar is this many pixels wide. How many pixels

1:43:13

high is it? Don't put in

1:43:15

34 pixels or whatever high the menu bar is.

1:43:17

It's infinity pixels high because all the person has to

1:43:19

do is slam the mouse cursor up to the top

1:43:21

and then they just need to worry about the X

1:43:23

position because the Y position is taken care of for

1:43:25

them with one flick of the wrist. That's

1:43:28

the canonical example of Fitts' Law.
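(As a reference point, the law is usually stated quantitatively. In its common Shannon formulation, the movement time MT to acquire a target of width W at distance D is, with a and b as empirically fitted constants:

    $MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)$

A target with effectively infinite height, like the menu bar pinned at the screen edge, collapses the difficulty along that axis, which is exactly the flick-of-the-wrist effect just described.) And it's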

1:43:32

always cited to show, like, the value of the

1:43:34

screen edges, like the dock being on the edge and how

1:43:36

you can slam the cursor to the bottom of the dock.

1:43:38

And even though it looks like there's a tiny little gap

1:43:40

between the bottom of the screen and the dock, it's still

1:43:42

clickable area because they wanna take advantage of Fitts' Law. If

1:43:45

the dock wasn't like that and it's like the bottom

1:43:48

pixel of the screen was not clickable, that would make

1:43:50

the dock harder to target for people. So this came

1:43:52

up in the context of Vision Pro, both

1:43:55

with your eyeballs and with cursors saying, well, there's

1:43:57

no menu bar in Vision Pro. And

1:44:00

so that's maybe one of the reasons that all the

1:44:02

targets need to be a little bit larger because there

1:44:04

are no screen corners to flick a cursor into. And

1:44:08

there's no menu bar at the top to slam your cursor up

1:44:10

against. And it also came up in

1:44:12

the context of eyeballs. And

1:44:15

saying, does Fitts' Law apply to eyeballs?

1:44:17

Bigger targets are easier to look at

1:44:19

or whatever. That might have to do

1:44:21

with the accuracy of

1:44:24

being able to look. There's an accessibility control

1:44:26

where you can enable a cursor that supposedly

1:44:28

shows where your eyeballs are. But I bet

1:44:31

that is also smoothed out because the uncertainty

1:44:33

about where your eyes are looking and how

1:44:35

they dart around is surely

1:44:37

even noisier than the cursor that they will show

1:44:39

you. But they have

1:44:41

to kind of guesstimate and smooth the way you're looking, right?

1:44:43

So bigger targets give you a bigger margin of error and

1:44:45

that makes sense. But

1:44:48

the key difference between your eyeballs and

1:44:50

your hands and arms when controlling a

1:44:52

mouse or a trackpad is

1:44:54

that your limbs, because of what

1:44:57

we use them for in daily life, are

1:44:59

accustomed to having something

1:45:01

that stops them. So if you're reaching

1:45:03

for a doorknob, you're going to

1:45:05

fling your hand in the direction of the doorknob and

1:45:07

you're going to start slowing your hand down as it

1:45:09

approaches where you think the doorknob is. But you also

1:45:11

know that once you start getting close to the doorknob

1:45:13

and you start to feel it, the

1:45:16

thing that will eventually stop your hand is the

1:45:18

doorknob itself. You're reaching for a light switch. You're

1:45:20

putting your hand on the wall. You kind of

1:45:22

know where the wall is. Again, you slow your

1:45:24

hand down as it approaches the wall, but you

1:45:26

have the full expectation that eventually your fingertips are

1:45:28

going to touch the wall and then you'll know

1:45:30

where the wall is and you'll complete the motion.

1:45:32

Then you'll feel the panel on the light switch

1:45:34

and you'll find that your limbs bump

1:45:37

into things. Gently, you hope, but

1:45:39

you can rely on them

1:45:42

finding something and that thing

1:45:44

stopping them. The

1:45:46

menu bar functions like that in the virtual

1:45:48

world. You drive your arms upward and it

1:45:50

doesn't actually stop your arm. Your arm goes

1:45:52

up on the mouse pad, but the cursor,

1:45:55

your virtual finger, does stop. But

1:45:57

your eyes have a different job as you wander around

1:45:59

the world. When your eyes

1:46:01

dart from one place to another, looking over

1:46:03

there, looking to see someone coming

1:46:05

up your driveway, looking back at the TV, there

1:46:08

is nothing in the physical world that is stopping

1:46:10

your eyeballs. Your eyeballs always

1:46:12

have to stop on their own. If

1:46:15

you dart your eyeballs up to the menu bar, the

1:46:18

infinite target of the menu bar does not

1:46:20

stop your eyeballs. Nothing stops them, except for

1:46:23

your skull and the limits of your muscles or

1:46:25

whatever. So the job your eyeballs have done

1:46:27

for the entire time, our entire species has

1:46:29

existed, and all mammals that have eyeballs and

1:46:31

everything, they have to be

1:46:34

able to move to a position and stop

1:46:36

on their own. Whereas our limbs have

1:46:38

always been able to rely on essentially

1:46:41

making contact with something, whether it's

1:46:43

the ground, the wall, the light

1:46:45

switch, pulling a fruit from

1:46:47

a tree, whatever it is that you're doing, your

1:46:50

limbs have always had something that stopped them.

1:46:52

And so it's interesting that Vision

1:46:54

OS is an environment in which

1:46:57

Fitts' Law for the primary pointing device

1:46:59

of your eyeballs is

1:47:01

essentially irrelevant because your eyeballs

1:47:03

are really, really good at

1:47:06

going somewhere quickly and stopping on their

1:47:08

own. And they don't need the help

1:47:10

of a screen edge or another thing

1:47:12

to slam against like

1:47:14

our hands and limbs do. I don't

1:47:16

know if this has any consequences for the interface. It

1:47:19

presumably has consequences for when

1:47:21

you use a mouse, for example, inside

1:47:23

Vision Pro, because then

1:47:26

you're not using your eyeballs. But now your

1:47:28

cursor needs something to slam against. I assume

1:47:30

when you do it in the virtual screen on the

1:47:32

Mac, if there is no Vision OS window above you,

1:47:35

it will stop at the top. Well, it'll stop at

1:47:37

the top. It'll stop at the top as long as

1:47:39

your gaze remains on the virtual

1:47:41

display. If I'm not mistaken, I mean, I could try this

1:47:44

out if I really care. But suffice to say, to the

1:47:46

best of my recollection, as

1:47:48

long as you are focused somewhere

1:47:50

on the Mac virtual display window,

1:47:52

you are limited to keeping your

1:47:55

mouse in that display. Now that

1:47:57

works both ways, though, in

1:47:59

that if you glance to, say, your left to

1:48:03

look at Slack or something like that while you're still

1:48:05

mousing about, well, your cursor is going to try to

1:48:07

jump over to that Slack window, even if it's a

1:48:09

native Slack window, you know, a native Vision OS

1:48:12

Slack window. And so that occasionally can

1:48:14

be a little bit frustrating,

1:48:16

and, well, that's a little dramatic, needless to say,

1:48:18

but a little bit off-putting maybe. You know,

1:48:20

I'm trying to mouse to the upper right-hand

1:48:22

corner of this virtual screen, which, you know, I

1:48:24

may have made quite large in my Vision OS

1:48:26

world, but then I glance to the left to

1:48:28

look at, I don't know, whatever, I

1:48:30

glance to the left to look at the Vision

1:48:32

OS native Slack, and the next thing I know my

1:48:34

cursor is in the Slack window, because as far

1:48:36

as, you know, Vision OS is concerned, well,

1:48:38

that is the active surface right now, and it's

1:48:40

trying to use Universal Control to pull the mouse

1:48:43

into what I'm looking at. Which does make sense,

1:48:45

but it's not exactly what you would expect. You don't

1:48:48

expect your cursor to just jump, you know, I don't

1:48:50

know, a thousand pixels to the left all of a

1:48:52

sudden just because you moved your head and looked somewhere

else. Yeah, another one of the disparities that Vision OS

1:48:54

brings up that a lot of people have been talking

1:48:56

about in their reviews and we talked about last time

1:48:59

with like having to continue looking at something and not

1:49:01

glance off somewhere else until you've completed the click operation

1:49:03

for example and people in

1:49:05

generalizing that to the idea of taking

1:49:08

something that is traditionally input device

1:49:10

our eyeballs we use them to take in

1:49:12

the world around us they are an input

1:49:14

device and overloading

1:49:17

it and saying guess what eyeball you're now

1:49:19

also an output device you now also determine

1:49:21

the position of the cursor in a virtual

1:49:23

world. Our eyeballs, unless you're Superman, are not

1:49:26

output devices; they do not shoot lasers from

1:49:28

them, you can't affect the world with them,

1:49:30

and where you look with them doesn't

1:49:34

affect future operations by, for example, your arms. It's like,

1:49:36

well, I looked up to the right and then I

1:49:38

snapped my fingers and the thing I was looking at

1:49:40

burst into flames. No, that doesn't happen anywhere, but in

1:49:43

Vision OS it does. So we

1:49:45

are being asked to both use them

1:49:47

as an input device which is why we're glancing all

1:49:49

over the place to scan things or whatever but also

1:49:51

they are, I wouldn't call it an output device, it's kind

1:49:53

of the reverse, but like, what we

1:49:55

call the mouse, we call the mouse an input device,

1:49:57

but that's from the perspective of the computer: it

1:50:00

provides the computer with input. So our eyes are

1:50:02

both an input device for our brain, and

1:50:04

also they are an input device for the

1:50:06

computer and an output device for us. And

1:50:09

that is not something that we're

1:50:11

used to. Tell me about

1:50:13

Command-Tab. I don't think people were talking

1:50:15

about, oh, I'm in Vision OS and I'm hitting Command-Tab

1:50:18

and I wish it worked and maybe in the next

1:50:20

version it will and it doesn't do expected things. And

1:50:22

this made me think about window layering

1:50:24

in Vision OS. We talked about it before, Marco was like,

1:50:26

you do not want to have a bunch of overlapping windows. It's

1:50:28

a big mess. I tried it a

1:50:30

little bit when I used Vision Pro. I tried it more

1:50:32

in the simulator to get a feel for it. And

1:50:35

I was kind of surprised at how, I

1:50:37

guess I didn't notice this before. I had used the simulator

1:50:40

for ages before, but I guess I hadn't done, you know, a

1:50:42

torture test: here I

1:50:44

am on a Mac Pro, how many windows can I open? So I

1:50:46

went in the Vision OS simulator and I'm like, how

1:50:49

did they implement window layering here?

1:50:51

And so I just started opening a bunch of

1:50:53

windows. It's my core skill set apparently. And

1:50:56

I wanted to see how it would

1:50:58

handle things. And

1:51:00

so some interesting things we already knew

1:51:03

about, as we discussed before, many, many shows

1:51:05

ago: the Vision

1:51:14

OS will try to maintain the

1:51:17

same visual size, like the field of view of the

1:51:19

window. So essentially when you push the window far away,

1:51:21

it will make it bigger as

1:51:23

you push it farther away so that it

1:51:26

fills the exact same field of view. So if it's like,

1:51:28

if it's 15 degrees of your field of view and you

1:51:30

push it back five feet, it will still be 15 degrees

1:51:32

in your field of view, which means the window will be

1:51:34

larger. You can override that, you

1:51:36

can make it not do that, right? But that's

1:51:39

one of the behaviors they suggest for your windows.
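(A quick worked version of that geometry: keeping a constant angular size means the window's physical width must grow linearly with distance, since width = 2 * distance * tan(angle / 2). A minimal sketch in Python; the function name and the 15-degree example are just for illustration.)

    import math

    def width_for_constant_angle(angle_degrees, distance):
        # Physical width a window needs at the given distance to keep
        # subtending the same visual angle (units match distance).
        return 2 * distance * math.tan(math.radians(angle_degrees) / 2)

    # A 15-degree window at 1 meter is about 0.26 m wide; pushed back
    # to 5 meters it must grow to about 1.32 m to look the same size.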

1:51:41

So right away, pushing windows farther away from you

1:51:43

and pulling them towards you, they maintain the same

1:51:45

visual size. In some ways, that's just like the

1:51:48

Mac. When I have a stack of 100 windows and

1:51:50

I bring the back one to the front, it

1:51:52

doesn't change size. It becomes quote unquote the front

1:51:54

most window, it draws in front of the other

1:51:56

windows, it gets the big drop shadow, but it

1:51:59

doesn't change size. And, ditto, if I

1:52:01

bury that window underneath 100 windows, it doesn't shrink,

1:52:03

because it's not getting farther away. This is what

1:52:05

I was getting at last time about, like, on

1:52:07

the Mac. We have

1:52:09

lots of windows, but we conceptually consider

1:52:11

them to essentially be, like, pieces of

1:52:13

paper. Like, they're all pretty much at the

1:52:16

same depth. And yeah, it's magic, because you can pull the

1:52:18

one from the bottom up to the top. But

1:52:20

if I was to look from the side, I would

1:52:22

say, this is a stack of paper. And all the paper

1:52:24

are touching each other. There's no space between them, right? Which

1:52:26

is why the magical metaphor of, like, I click on the

1:52:28

one in the back, comes forward, like, it works for us.

1:52:30

It's like, ah, it's just kind of like I took that

1:52:32

piece of paper out and flipped it in front of the

1:52:34

other ones. But I did it real quick, and you didn't

1:52:36

see it. So the metaphor works for

1:52:38

us. In Vision OS, if you make a big

1:52:41

mess and have a bunch of windows, and some of them are far away

1:52:43

and some of them are close up, and you have this huge stack of

1:52:45

windows, which is the thing that you can do, and

1:52:48

one of the windows is, like, it's way in the back.

1:52:50

Like in the simulator, I was pushing it, like, you can

1:52:52

push them through the back wall, but I was trying to

1:52:54

stay inside the room. One of them is way against the

1:52:56

back wall, and then, like, 17 windows between

1:52:58

me and that window. And I

1:53:00

want that window to, quote unquote, come to the

1:53:02

front. And I click

1:53:04

on or pop on or whatever the hell, I'm in the simulator, so

1:53:06

it's weird, activate the window

1:53:08

that's way in the back. What

1:53:11

happens kind of surprises me. What

1:53:14

doesn't happen is that window does not suddenly leap

1:53:16

to the front in 3D space. No, it stays

1:53:18

pinned against that back wall. It stays 10 feet

1:53:20

away from me, right? It

1:53:23

also doesn't just start drawing on top of

1:53:25

the other windows, which would look kind

1:53:27

of weird, but it's a thing it could do. What

1:53:30

it does is it draws in front

1:53:32

of everything, but then it fades

1:53:34

out all the windows that are ostensibly

1:53:37

in 3D space in front of

1:53:39

it, so that you can

1:53:41

see the window that's against the back wall by

1:53:44

essentially making ghosts out of all the windows

1:53:46

that would be blocking the view, which is

1:53:48

really weird. Like it doesn't move the window.

1:53:51

It doesn't make it bigger. You don't

1:53:53

see it animate forward and suddenly it's two feet

1:53:55

away from you and then it animates back, but

1:53:57

it wants to essentially bring it to

1:54:00

the front. And this, in the context

1:54:02

of Command-Tab: like, what would it mean to

1:54:04

Command-Tab? You'd be Command-Tabbing, you're like, oh,

1:54:06

suddenly the frontmost active window is the

1:54:06

window that is currently buried behind seven windows, that's

1:54:08

five feet away from me. How does

1:54:10

that become frontmost? And they don't

1:54:12

walk that window up to you, go doo doo doo

1:54:14

doo doo, here comes the window, it's walking through all

1:54:16

the other windows, and now that window is two feet

1:54:18

in front of you. Which they could do, because of the

1:54:20

whole size-maintenance thing: the window would slowly shrink as

1:54:22

it moves towards you, but you wouldn't notice, because it's

1:54:24

moving closer to you, so it maintains the same visual size,

1:54:27

but you'd see, like, the drop shadow, for example, of

1:54:29

that window now two feet in front of you

1:54:31

rather than back against the back wall. But instead they draw

1:54:36

that window in front of everything else and fade everybody

1:54:36

out like they're ghosts. So what it means is that when you

1:54:38

have a lot of windows open and you pick one

1:54:40

of them, and it is not literally, physically the

1:54:42

frontmost, the other windows become ghosts.

1:54:47

The other windows fade away and you can't see

1:54:49

them; they become obscured, not just the part that

1:54:51

it's drawing over, but even the edges of them get all

1:54:53

fuzzy or whatever, and it's super weird. It's kind of like

1:54:55

if you had a stack of a hundred text edit windows

1:54:58

and you pulled the one in the back

1:55:00

to the front and instead of that window

1:55:02

just drawing in front of them all the

1:55:04

other windows faded away and became ghostly and

1:55:06

that one drew in its current position in

1:55:08

the back but with the ghost windows faded

1:55:10

out in front of it I

1:55:13

don't know if this is

1:55:15

the correct approach, but this is apparently what

1:55:17

Vision Pro does now, and it explains Marco's

1:55:19

warning last time, which was like, you don't want

1:55:21

to run a bunch of windows, because that

1:55:24

metaphor and design

1:55:27

has no precedent in the

1:55:29

2D space. It's what they

1:55:31

decided to do in 3D, and I guess maybe

1:55:33

they tried all the other ways and they were

1:55:35

worse but it is weird and

1:55:37

it does make it so that having lots

1:55:40

of windows open is much less tenable because

1:55:42

it won't

1:55:44

move them. Essentially, when you say

1:55:46

"bring to front" in Mac parlance, it will

1:55:48

never actually bring that window to the front.

1:55:50

It just sort of, it's

1:55:52

like plowing; it's like a particle beam

1:55:55

that blows away all the other windows and fades

1:55:57

them out and disintegrates the matter so that you

1:55:59

have a clear shot at that window that

1:56:01

is five feet away from you on the back

1:56:03

wall. And then when you pick a different window,

1:56:05

all those dematerialized windows come back into being and

1:56:07

stop being ghosts and start drawing themselves again. And

1:56:10

I find it extremely weird and

1:56:13

not for me, at least in the simulator, a comfortable way

1:56:15

to manage a lot of windows.
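(To restate the two strategies side by side, here is a minimal sketch in Python of the difference between Mac-style bring-to-front and the fade-the-occluders behavior described above. The dictionary keys and function name are invented for illustration; this paraphrases the observed behavior and is not any real visionOS API.)

    def activate(windows, target):
        # windows: list of dicts with "name" and "distance" (meters
        # from the viewer); nearer windows draw in front of farther
        # ones. A Mac-style activation would reorder the draw order;
        # the behavior described here instead leaves every window in
        # place and ghosts whatever sits in front of the target.
        for w in windows:
            w["ghosted"] = (w is not target
                            and w["distance"] < target["distance"])
        return windows

Yeah,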

1:56:18

when I was at the library using this thing,

1:56:20

I put myself in the position where I had

1:56:22

a couple of windows layered on top of each other and I was seeing

1:56:24

that ghosting and whatever. And that was the first

1:56:26

time because I was using Mac Virtual Display at the same time.

1:56:29

And that was the first time that I

1:56:31

had the presence of mind

1:56:34

to hit Alt-Tab, excuse me, Command-Tab, wow,

1:56:36

my Windows is showing, to hit Command-Tab

1:56:39

and try to tab

1:56:41

between the windows. And of course, that didn't

1:56:43

work at all. And it

1:56:45

took me a second to realize what I had just done and why

1:56:47

it was wrong. But outside of

1:56:50

a bunch of windows on top of each other

1:56:52

in 3D space and trying to move between them,

1:56:54

I can't say I've ever reached for Command-Tab for

1:56:57

any other reason. But that is the one really

1:56:59

good way and reason to use it. Unfortunately, it

1:57:01

doesn't seem to do anything. But

1:57:03

if it did, what it would do is fire

1:57:06

that particle beam and plow its way through all

1:57:08

the other windows without moving any of them so

1:57:10

you have a clear line of sight on the

1:57:12

one window that is essentially going to draw in

1:57:14

front of all the other ones, even though it

1:57:17

is still behind them. And it's literally

1:57:19

behind them. You can get up and walk over

1:57:21

and stand in the space between the windows. It's

1:57:24

spatial computing, but they

1:57:27

haven't figured out a way. The

1:57:29

fake metaphor I just said of the paper stack

1:57:32

or whatever, that's not based in

1:57:34

reality, but it's close enough. The stack

1:57:36

of paper analogy, if you had a bunch of papers out and

1:57:38

you wanted the one in the middle, you take it out from

1:57:40

the middle of the pile and you put it on top. You

1:57:43

can imagine that's what's going on with all these pieces of paper

1:57:46

that are windows on your thing. But

1:57:48

if you had a bunch of 5

1:57:50

foot by 3 foot magic glass things floating in your living room

1:57:52

and they were all stacked and some of them are against the

1:57:54

back wall and some of them are in the middle and some

1:57:56

of them are real close to you and you wanted to get

1:57:58

at the one in the middle. I

1:58:01

mean, I suppose you could have it fly

1:58:04

towards you and pass through the other ones and now that

1:58:06

one is the front post and then it could fly in

1:58:08

the back but how do you maintain those positions? Would

1:58:10

you want it to fly there? Would you want the other ones

1:58:12

to fly out of the way and part like the Red Sea

1:58:14

so you can see that one? Or do

1:58:17

I guess you want all the other ones to become

1:58:19

weird ghosts so you can see through them to the

1:58:21

one in the back? It's really weird that their spatial

1:58:24

computing thing is like, I have no respect

1:58:26

for the spatiality of this world. Yes,

1:58:29

you can position windows, but when you ask to see one

1:58:31

of them, I am not going

1:58:33

to move things spatially to make

1:58:35

your view better of that thing. I

1:58:37

am just going to dematerialize, partially

1:58:40

dematerialize the things that are blocking

1:58:42

your view so that I can

1:58:44

draw that one in front of the other windows and it

1:58:46

doesn't feel that weird to you, but honestly it's pretty weird.

1:58:50

Thank you so much to our members

1:58:52

who supported this entire episode. You can

1:58:54

join us at atp.fm slash join. There's

1:58:56

lots of benefits to being a member.

1:58:59

Please consider joining us once again atp.fm

1:59:01

slash join. Thank you so much for

1:59:03

listening and we will talk to you

1:59:05

next week. So

2:00:09

a few times during the episode I mentioned that I had

2:00:11

gone to the library to do some work and I've also

2:00:13

been working on, I got a

2:00:16

little sidetracked doing some, adding some features

2:00:18

to regular or plain old call sheet

2:00:20

which just got released, which by the

2:00:23

way if you're interested in how tall actors are

2:00:25

and or your name is Merlin Mann, go get

2:00:27

the latest update because where possible I show how

2:00:29

tall actors are and Merlin seems very excited which

2:00:31

I'm very happy about. What is your data source

2:00:34

on? Wikidata, actually,

2:00:36

so the same thing that Wikipedia uses

2:00:38

or I don't know the relationship between the

2:00:40

two but it's part of the Wikimedia Foundation as far as

2:00:42

I know, and yeah, Wikidata has some

2:00:44

actors data a lot I would say but not

2:00:46

everyone but anyway that's not the point.

2:00:50

The point is outside of that distraction I've been doing

2:00:52

a lot of vision pro work because now that I

2:00:54

have the vision pro and now that I have my

2:00:56

hilarious $300 developer strap I've been

2:01:00

putting both to good use and trying to

2:01:02

work on the vision pro native version of

2:01:04

call sheet and this

2:01:07

is you know as an aside

2:01:09

we don't need to unpack this right now because it

2:01:11

could take hours already running long but running

2:01:14

a branch that you're not doing a good

2:01:16

job of keeping up to date with main

2:01:19

and then trying to bring it back in line

2:01:21

with main like a month or two later. Not

2:01:24

fun my friends, not fun to the point that

2:01:26

I actually abandoned like I still have it but

2:01:28

I abandoned my initial vision pro branch the same

2:01:31

one that I used when I went to a

2:01:33

lab. I have abandoned that

2:01:35

and I'm basically manually replaying a lot of those changes,

2:01:37

in part because I've got different opinions about what

2:01:39

I should do and in part because even

2:01:42

though it's only been a couple of months

2:01:44

there's been such a divergence between main and

2:01:46

this branch that it's just it's a mess

2:01:48

it's an absolute mess. Anyway

2:01:50

I keep getting myself distracted. The point is, what was

2:01:53

my experience like, doing, you know, writing code and trying

2:01:55

to get work done both at the library and at

2:01:57

home because I found

2:01:59

that even when I'm at home, even though

2:02:01

I've got 15Ks,

2:02:03

if you will, of screen

2:02:05

here, it's actually much

2:02:07

easier, and there's a

2:02:10

lot less friction to

2:02:13

write code and run it in the Vision Pro when Xcode

2:02:16

is also in the Vision OS world. And so

2:02:18

I've been using Mac Virtual Display for that. The

2:02:21

developer strap, like I had said on the show, even

2:02:24

though I am not in love with the price, it is

2:02:26

worth it if you're doing any real development because it seems

2:02:28

to work much, much, much better. When

2:02:30

I was at the library, I had a very weird

2:02:33

thing though. So I have, I want to say it's

2:02:35

an Anker; it's out of reach from

2:02:37

where I'm sitting, but I have one of those chargers

2:02:40

that I use. I don't use an official Apple

2:02:42

charger. I think it's an Anker charger that has

2:02:45

one, like, I don't

2:02:47

know, maybe 100-watt USB-C port for

2:02:50

a computer, like a 30-watt

2:02:52

port or thereabouts for an iPad or phone or

2:02:54

what have you, and it also happens to have

2:02:56

a USB-A port. And what

2:02:58

I was doing at the library was I had

2:03:00

plugged MagSafe from

2:03:03

the 100-watt slot to

2:03:05

the computer, a

2:03:07

just general USB-C connection from the

2:03:09

30-watt to the battery for

2:03:11

the Vision Pro, and then of course I

2:03:13

had a different

2:03:16

USB-C cable going from the Vision Pro developer strap

2:03:18

to my computer. No hubs or anything like that,

2:03:20

that's all it was. And

2:03:22

I had my AirPods in, and when

2:03:24

I finally decided to commit to using

2:03:26

the developer strap, I

2:03:29

was getting this incredibly

2:03:31

odd feedback, like a

2:03:33

very high-pitched humming sound that I found

2:03:35

was only the case when I had

2:03:37

the computer and the battery pack plugged

2:03:39

in. And if I unplugged the MagSafe

2:03:41

or if I unplugged the battery pack, it went

2:03:43

away. I don't think this has happened since, so

2:03:45

I don't know if my library happens to have

2:03:47

very dirty power or something like that, but

2:03:50

it was the weirdest thing, and I noticed it

2:03:52

several times at the library. That

2:03:54

was weird thing number one. Weird thing number

2:03:56

two, so I

2:03:58

really enjoyed working... in a

2:04:00

fully immersive environment in part because the

2:04:02

room I was in was wide but

2:04:04

shallow. So it was probably,

2:04:06

I don't know, 10-ish feet, so a couple of

2:04:08

meters, a little bit more than a couple of

2:04:11

meters wide and like less than a meter, less

2:04:13

than three feet deep. Or maybe it was a

2:04:15

little more than three feet. I don't know, it

2:04:17

was not a lot. It was wide but not

2:04:19

very deep. And when you're

2:04:21

trying to put windows around when you're

2:04:23

not immersed, you're running into the wall.

2:04:26

It'll do it but it just looks

2:04:28

weird. And so being immersed was

2:04:30

way, way, way better. I am

2:04:32

a pretty darn good touch typist. Many, many moons

2:04:34

ago, you and I did, or the three of

2:04:36

us did a typing race thing,

2:04:38

I think on the air or maybe we did it

2:04:40

off the air and compared notes after the fact. But

2:04:43

I'm a pretty good typist. I'm no Jason Snell, but I'm

2:04:45

pretty good. I don't need to

2:04:47

look at my hands when I type. That

2:04:50

being said, when you're fully immersed, finding

2:04:52

your keyboard is harder than you think. Yes.

2:04:54

I said that last week. You've got hands and arms but

2:04:57

you don't have a keyboard. Right. It's actually

2:04:59

kind of frustrating how difficult it is to find

2:05:01

the keyboard. I don't realize how much I need

2:05:03

to glance at the keyboard until I'm trying to

2:05:05

do that and then I realize, oh man, I

2:05:08

actually do glance at it a lot. It's

2:05:10

like those people who buy keyboards with

2:05:12

keycaps that have nothing on them, to show off.

2:05:14

Well, how about you can't even see the whole

2:05:17

keyboard? No, yeah, because the problem is to

2:05:19

align myself, by feel, with where my fingers

2:05:21

even go. That's what I find myself, when I'm

2:05:23

in Vision Pro, kind of missing sometimes. Yeah, I

2:05:25

feel like that's the upgrade. Like, so, they

2:05:27

have, obviously, the Vision Pro detects your hands and

2:05:29

your arms. Apple detecting its

2:05:32

own keyboards, I feel like that is a

2:05:34

solvable problem. Yeah, yeah, because this

2:05:36

was on the laptop keyboard and actually the only

2:05:38

other keyboard that I use is the whatever

2:05:41

104 key whatever it is with touch ID

2:05:43

here at home and those are the only

2:05:45

keyboards I use. So yes, it should have

2:05:47

been able to detect it like it's a

2:05:49

fine, first-worldiest of first-world problems, but

2:05:51

I couldn't find the frigging keyboard. This happened

2:05:53

not infrequently, and yes, I'm aware of the

2:05:55

little bumps on what is it the F

2:05:57

and J keys. You got to find the keyboard

2:06:00

enough to find the bumps first though.

2:06:02

Exactly, exactly. I could not have put

2:06:04

it better myself. I think you are

2:06:06

slightly kidding, but no, that's exactly right.

2:06:08

Also, quick aside, John, didn't the bumps

2:06:10

used to be on D and K

2:06:12

or something like that years ago? Yeah,

2:06:14

Apple has them in different positions than

2:06:16

other keyboards do, and I think they've

2:06:18

changed over the years, but yeah. Because

2:06:20

I vaguely remember when I was a kid using

2:06:23

Apple keyboards drove me nuts because they were under

2:06:25

my middle fingers instead of my pointer fingers, like

2:06:27

the little lumpies or whatever. Anyways, couldn't

2:06:29

find the damn keyboard. AirPods were having a little

2:06:31

bit of feedback, which again, I don't think I've

2:06:34

heard since. But one of the things is,

2:06:36

and I don't know if I ever linked to this in last week's

2:06:38

show notes, but I put up a blog post (I think I

2:06:40

mentioned it last week; I don't know if I linked to it)

2:06:42

shortly before

2:06:44

the Vision Pro came out, like literally a couple of days before

2:06:46

where I was talking about how, hey, this would be really neat

2:06:48

if I could have this whole

2:06:51

array of windows around me: native

2:06:53

visionOS Messages, native visionOS Slack,

2:06:56

and Safari from

2:06:58

visionOS, and all the different various sundry

2:07:00

windows all around me. And I can do

2:07:02

that, and it works pretty well. However,

2:07:06

as many people have said, and

2:07:08

I am not the only one,

2:07:11

the

2:07:13

iPadOS-native apps, and in this case I'm picking

2:07:15

on Slack, but it's not just Slack:

2:07:18

iPadOS-native apps kind of suck

2:07:20

on visionOS. And the thing of it

2:07:23

is, I don't know if it's something

2:07:25

on Apple's side or the way the apps

2:07:27

are designed or both. And again,

2:07:29

I'm not the first to say this, but finding the

2:07:31

touch targets is really difficult, particularly I

2:07:33

found on Slack in the upper left,

2:07:36

I think I mentioned this last week,

2:07:38

in the upper left where you choose

2:07:40

which workspace you're in, you know, say

2:07:42

Relay FM or something else, it's

2:07:44

really hard to get visionOS to actually

2:07:47

activate that thing with your eyes. Now, with

2:07:49

that said, with Universal Control, it's not so

2:07:51

terrible because you can just mouse right up

2:07:54

there.
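
(A minimal SwiftUI sketch of the developer-side fix, assuming a hypothetical custom control like the Slack workspace switcher: standard controls get the gaze highlight automatically on visionOS, but small custom views have to opt in, which may be one reason iPad apps feel hard to target with your eyes.)

    import SwiftUI

    // Hypothetical stand-in for a small custom control like Slack's
    // workspace switcher. Without an explicit hover effect, looking at
    // it gives no visual feedback on visionOS.
    struct WorkspaceSwitcher: View {
        var body: some View {
            Image(systemName: "square.grid.2x2")
                .padding(16) // pad the hit area toward Apple's ~60pt target guidance
                .contentShape(.hoverEffect, RoundedRectangle(cornerRadius: 12))
                .hoverEffect(.highlight)
                .onTapGesture {
                    // switch workspaces here (placeholder)
                }
        }
    }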

2:07:56

But the Slack app on visionOS, I am really looking forward to, and

2:07:58

I don't even know if they've announced anything. But I'm

2:08:00

really looking forward to getting that as a visionOS-native app,

2:08:02

because I think it'll be much better.

2:08:05

But yeah, overall, a really great experience. It's

2:08:07

a little teeny bit of a bummer,

2:08:10

compared to when I'm at home, what I'm losing when I'm going

2:08:12

from my 15 Ks of real estate, and at least having

2:08:14

more than one window. It's a

2:08:16

bummer to bring that down to one, and I think we

2:08:18

talked about it last week: you know, there's rumblings that maybe

2:08:21

Apple can do two windows on the same, or,

2:08:23

you know, two... Anyway,

2:08:25

it seems like, especially given the

2:08:27

supposed revelations about the developer strap and

2:08:30

the potential of higher bandwidth there that

2:08:32

it could be something that comes to

2:08:34

a later version of the OS if

2:08:36

only for people with the developer strap. Yeah.

2:08:39

Yeah, definitely. But

2:08:41

yeah, I would love to have

2:08:43

a second virtual display. But

2:08:45

all that being said, like, you know, there's

2:08:47

some things I would definitely tweak about this, but it's

2:08:50

pretty nice. I've really been enjoying

2:08:52

it, and, to

2:08:54

Marco's point during the episode, I

2:08:57

wouldn't necessarily choose to give up, you know, my

2:08:59

15 Ks of real estate; I wouldn't necessarily choose

2:09:02

to give up my standing desk in my situation

2:09:04

at home. But I do

2:09:06

like, quite a bit, going

2:09:08

somewhere else to do work, but

2:09:10

of course, of course, that raises the question: if I'm

2:09:13

sitting on Mount Hood in the library, why

2:09:16

couldn't I sit on Mount Hood at home,

2:09:18

or at my desk? And

2:09:20

I don't really have a good answer for that,

2:09:22

other than the tea ceremony, you know,

2:09:25

compared to vinyl, the tea ceremony of going somewhere,

2:09:27

which I kind of miss. Like, I'm very thankful and

2:09:29

lucky that I don't have to do that every

2:09:31

single day, but I like to,

2:09:33

at least once a week, go somewhere:

2:09:36

Wegmans, Publix, a library,

2:09:38

whatever. And

2:09:41

this, it's, in some ways, so much

2:09:43

nicer and better now, because I feel like

2:09:45

I'm bringing, you know, like an LG UltraFine

2:09:48

4K display with me without

2:09:50

actually having to carry very much. But

2:09:52

the flip side of that is, it's

2:09:54

almost not even necessary anymore, which is

2:09:56

a weird and odd feeling. Like, I'll

2:09:59

probably still do it, even if I do

2:10:01

use the Vision Pro wherever I'm going.

2:10:04

But it seems a lot less necessary now than

2:10:06

it has ever been before. So anyway, I just thought it

2:10:08

was interesting to discuss. It's all about getting out of the

2:10:10

house, right? Like when you're at Wegmans, someone

2:10:13

can't yell your name and ask you to come do something.

2:10:16

Well, it's only Erin, but your point is

2:10:18

still fair. And I think leaving aside who

2:10:20

is at my house, you're exactly right. For

2:10:22

me anyway, I really do like being able

2:10:25

to get out of the house and go

2:10:27

somewhere and just change my scenery and have

2:10:29

the act of going somewhere. Now, that's because

2:10:31

I believe in superior computers that you can

2:10:34

move very easily and don't need to worry

2:10:36

about carrying multiple pieces. Marco, I believe you

2:10:38

are also an enlightened individual that believes in

2:10:41

these weird, funky things. Multiple pieces? You're bringing

2:10:43

a headset with you. Ah,

2:10:45

well, yeah. But I don't have a 15-pound monitor though, thank

2:10:48

you very much. But nevertheless, I believe,

2:10:50

Marco, you are also an enlightened person that

2:10:53

believes in these funky things called laptops. No,

2:10:55

he doesn't. He's using a desktop laptop. No,

2:10:59

I mean, honestly, like, for the purposes, you

2:11:01

know, obviously, I share some

2:11:03

of your need for getting out of the house

2:11:05

sometimes because we work for ourselves in our houses. And

2:11:07

so it is nice to get out in the

2:11:09

world and work or be somewhere else, you know,

2:11:11

on some kind of regular basis just to get yourself

2:11:13

out of the house. I think

2:11:16

using the Vision Pro to do

2:11:18

that does kind of ruin the

2:11:20

point, but I think the answer is not stay

2:11:22

in your house more. I think the answer

2:11:24

is go out with a laptop and don't

2:11:26

bring the Vision Pro sometimes. Like, that's the

2:11:28

answer because, like, I'm gonna, like, you know,

2:11:30

go work in a coffee shop or something.

2:11:32

Part of the joy of it is interacting

2:11:35

with the world, being out

2:11:37

there, you know, seeing people, saying hello to people

2:11:39

when they come in, if you've seen them before.

2:11:42

Focusing your eyes on distances other than 1.3 meters. Right.

2:11:46

So, part of the appeal is to

2:11:48

be a little bit in the world.

2:11:50

Now, if you're using Vision Pro

2:11:53

out in a place like a coffee shop,

2:11:55

first of all, you're already covering your eyes

2:11:57

and immersing yourself, etc., even if you're

2:11:59

in pass-through mode. You are projecting the

2:12:01

anti-social version of yourself. As

2:12:04

we've mentioned, the Vision Pro speakers are

2:12:06

very open, and so if you

2:12:09

need any kind of audio as

2:12:11

part of your work, if you're watching or listening to something, or

2:12:14

if you are trying to edit audio or whatever, you're

2:12:16

gonna need AirPods. Now you're covering up

2:12:18

your eyes and your ears and sealing

2:12:20

yourself off even more. I

2:12:22

feel like at that point, you are not only being

2:12:24

extremely anti-social to the people around you and to the

2:12:26

business that you're in, but also you are then losing

2:12:29

quite a bit of the value of being there in

2:12:31

the first place. For me,

2:12:33

in that context, a laptop optionally

2:12:36

with headphones is much

2:12:38

better because at least then your eyesight

2:12:40

is totally unencumbered. You can see everything.

2:12:42

People can see you, they know they

2:12:44

can see you, they know you can

2:12:46

see them as discussed earlier.

2:12:48

I feel like you're getting more of the environment

2:12:51

that way, even if you have AirPods in for

2:12:53

whatever reason you need that. So again,

2:12:55

I see the appeal very much to Vision Pro

2:12:57

for things like immersive entertainment. If you're gonna watch

2:12:59

a movie, bring it on a plane, I think

2:13:01

there's a lot of arguments for that. But

2:13:04

working in a coffee shop or working

2:13:06

out in public somewhere

2:13:09

for the sake of getting out in

2:13:11

the world and getting out of your house, I

2:13:13

don't think it's working for your purposes

2:13:16

there. I think it's actually working against your purposes there.

2:13:18

Yeah, don't underestimate, like I said, don't

2:13:21

underestimate the value, especially if you're working on

2:13:23

a programming problem or whatever, of

2:13:25

looking out the window. I can look

2:13:27

out the window with Vision Pro; it's got pass-through. I

2:13:30

mean, looking out the window and focusing your

2:13:32

eyes 50 feet away at

2:13:34

the tree across the street while you think about a

2:13:36

problem. I know you can do the thousand-yard stare

2:13:39

inside the headset, but really, if

2:13:41

you're working on a computer for a long period of time,

2:13:43

I feel like it does help to focus

2:13:45

your eyes on a different distance. Even, forget about

2:13:48

the headset, even just sitting in front of your

2:13:50

Mac's monitor, don't just stare at the

2:13:52

monitor two feet in front of you for eight hours,

2:13:54

you will have a bad time. Get

2:13:56

up and walk around, look over your monitor. I used to do

2:13:58

this at work: look over your monitor, out the

2:14:00

windows that hopefully are in your office and

2:14:03

out in the distance; look down the hallway;

2:14:05

look at your neighbor seven cubicles away and

2:14:07

wave. Like, it's just good to focus your

2:14:09

eyes at different distances to relax and to

2:14:11

have an environment where you can think about

2:14:14

things. And that remains one of the weaknesses

2:14:16

of a headset with a fixed focal length,

2:14:19

unless you're doing the defocus-your-eyes,

2:14:21

thousand-yard-stare-at-nothing thing, which

2:14:23

I suppose you can do in the headset. It's

2:14:25

difficult to remind yourself to

2:14:27

focus on different distances to avoid eye

2:14:30

strain. And honestly, you

2:14:32

know, in my programmer head, looking

2:14:35

off at something in the distance, like a

2:14:37

tree across the street, is somehow connected to

2:14:39

solving programming problems, in some complicated wiring that

2:14:41

goes on when you've been a programmer for

2:14:43

20-something years.
