We Answered Humanity's Biggest Question

Released Tuesday, 11th June 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

You're finally tackling your spring home improvement

0:02

projects. Hello 2024 Dream Garden! Until

0:06

you realize you're short three bags of mulch.

0:08

And why didn't you get that pair of

0:10

gloves you were looking at? Luckily, through Instacart,

0:12

you don't have to stop mid-project to run

0:14

to the Home Depot. You can get your

0:16

missing gardening supplies, tools, and more delivered in

0:18

as fast as 30 minutes. Have more time

0:20

for your project and get the Home Depot

0:22

delivered via Instacart. Visit instacart.com to get free

0:24

delivery on your first three orders. Offer

0:26

valid for a limited time. Minimum order

0:28

$10. Additional terms apply. We

0:31

are AudioStack.ai. Combining

0:33

AI writing, the best synthetic voices,

0:35

with production and delivery. Be it

0:37

ads, podcasts, or VOs for video. Just

0:39

like this ad, how would you utilize

0:42

AI audio? For free! Go

0:46

to audiostack.ai/contest. Give us

0:48

just a few details.

0:50

And you could replace this spot and be heard

0:52

by millions. See webpage for Ts and

0:54

Cs. audiostack.ai/contest.

1:00

Schlatt summer fashion era is coming in. He's

1:03

coming in hot. Big time. What is this?

1:06

Are you in Challengers now? No,

1:09

I haven't actually seen that movie yet. You gonna start

1:11

making out with a tennis player, huh? A

1:14

big hot sexy tennis player?

1:16

I mean, I'm big and hot, but let me show you what I

1:18

got here. What do you got? Oh,

1:22

Lord. Oh. Oh, Lord. Dude,

1:24

the shorts were a jump scare. Maybe

1:28

he is about to start kissing some dude

1:30

named Art. I mean, if he's hot. Oh,

1:32

wow. I wish that was me,

1:34

dude. I wish I was kissing that dude named

1:36

Art. If a guy's name is Art, you know

1:38

that he's got some Baron Uncle. What?

1:41

You know? Baron Uncle? Yeah, like some sort.

1:44

He's gotta be a trust fund kid if

1:46

his name is Art. Oh, yeah. That's an

1:48

old money name. If you're

1:50

naming your kid after the thing you like to collect

1:52

the most at auction, you're absolutely

1:55

a trust fund person right there.

1:57

I'm gonna name my first kid

1:59

Cigar. Or

2:02

gay porn. Welcome

2:05

to Chuckle Sandwich. Hey,

2:09

Chucklers. Welcome

2:29

to Chuckle Sandwich. Today I

2:31

have handpicked a bunch of

2:33

goofy, hard, some of

2:35

mankind's hardest questions for you,

2:38

Ted, and only for you. Really? You're

2:41

not going to let Tucker involve himself in this? No, no,

2:43

no. Tucker is not a part of

2:45

this podcast and he never will be. And I'm tired

2:47

of seeing all these little people saying, oh, we need

2:49

to get Tucker in the intro. You

2:52

notice something? Tucker's still not in the intro.

2:55

Stupid. He never will be. Look

2:58

at him sipping his hot chocolate. But

3:01

he's growing in power. Have you seen his

3:03

background? Every time you look at his background,

3:05

he's got more and more birds on his

3:07

wall. Dude just typed in bird art in

3:09

Amazon and selected the first

3:12

thing that came up. Okay, first off,

3:14

that one's mammals. Second off,

3:16

that one's fish. All right.

3:19

Yeah, no, he's building a flock. He's growing in

3:21

power. He's going to have like, he's going to

3:23

be like Noah's Ark. Someday he's going to do

3:25

it like a Night at the Museum

3:27

thing and all the all the animals are going to come

3:29

out of the paintings and he's going to send them like he's going to go

3:31

like this. He's going to send them towards you.

3:35

To kill me? Yeah, something like that. I

3:37

mean, if you keep pushing him like this, if you keep

3:39

pushing him to his edge like this, look

3:42

at him. He's incredibly unstable. Look at

3:44

my shirt. Tucker

3:46

looks incredibly unbothered right now. I don't

3:48

think anything really gets to him. No,

3:50

I'm having a great day. Tucker, you're

3:52

probably the most American guy that dresses

3:55

like the most Canadian Canadian. I

3:58

think there's a lot of respectable qualities. about

4:00

Canadians. I mean, not everything, surely, but

4:02

they have some good qualities. Loon.

4:05

Name one. Name a good. Or

4:08

a bad. Wait, are you

4:10

saying that there's a lot of like bad

4:13

things they do that are respectable qualities to

4:15

you? No, no, they, they have good respectable

4:17

qualities. They do. There's, there's some. Canada

4:20

wouldn't fucking exist if it weren't for

4:22

us. Yeah. That's one of their worst

4:24

qualities. I'd say. Terrible qualities. Terrible quality.

4:26

And they all live like right, right

4:28

on the border. Trying to get as

4:30

close to us as possible to leech

4:32

off of us. They also are on

4:35

basically our soil, but they still observe

4:37

the monarchy. Little suspicious. Crazy. Crazy. If

4:39

you've ever driven in Ontario on the

4:41

Queen Elizabeth Highway, what's that about? Oh

4:44

Jesus Christ. Get that out of here.

4:47

We should bomb that highway. There's a

4:49

highway. Yes. Napalm, maybe. Bring

4:52

it back. Fire. Round two. It'll catch the

4:54

tires, you know? That'll

4:57

make for an interesting entry

4:59

into World War three. That'd

5:02

be probably the craziest move America could

5:04

do on World War three. What's our

5:06

first chess move? Let's bomb the highways

5:08

of Canada. Let's

5:11

just prevent them from being able to move

5:13

around. You're not going anywhere anyways. You

5:16

think they're really doing anything? No, they're,

5:18

they're trying to save up and

5:20

little nickels and dimes because they have

5:22

no money left. Yeah. And it's all,

5:24

and it's all, it's all oxidized shitty

5:26

change. Yeah. They're scrounging around

5:29

on the floor. Oh, I, so

5:31

I got a little nickel. I got a nickel. We

5:34

might be able to get one piece of

5:36

bread today. Oh, what

5:38

sort of socioeconomic situation do you

5:40

think Canada's in right now? Well,

5:43

uh, look at Zillow in Canada

5:45

and get back to me. Okay.

5:48

Do you think if we attacked Canada, anyone would

5:50

come to their aid? No,

5:52

I think that it would be, I

5:54

feel like we could

5:56

absolutely blitzkrieg Canada.

6:00

and take it over and by the time it was like,

6:02

they're us now, they're us now. It's like one of those

6:04

things where like, you can't do anything about it, you can't

6:06

do anything about it. It'll be one

6:08

of those things where we go in there, we take

6:10

over everything and then the people start

6:12

coming over and we're like, no, no, no, no, we call it,

6:14

we call it, it's ours now. What

6:16

are they gonna do? I feel like

6:18

all their allies that they have would be

6:20

like, they would call on them and they'd

6:22

be like, oh, well we didn't expect America

6:25

to do that. We

6:27

were in for like war

6:29

against, somebody small. Yeah, somewhere

6:31

east. Yeah, we were

6:33

here for an eastern fight. So

6:36

much oil in Canada too, so many natural

6:38

resources, my goodness. You let us get in

6:40

there, get our little stinky little greasy hands

6:42

in Canada, oh, something will go wrong. You

6:44

know what we should do? We should go

6:46

to one of those, I saw this video

6:48

on YouTube where there's

6:50

like these really remote Canadian towns that

6:52

like, I've looked them up online, it's

6:54

like there's this one that's like Villanuke

6:56

or something like that, it's

7:00

a video about the. Sounds

7:02

like Greenland. Yeah, you'd think, but

7:05

it's basically, I mean it's basically all

7:09

indigenous reservations up there, but

7:12

there's this one town that's like, literally you

7:14

have to take like a boat and a

7:16

plane to get there and stuff. There's no

7:18

way to drive there. I feel

7:20

like we should do a podcast there. No.

7:23

I'm game. Ted, have you ever

7:26

taken a philosophy course? I

7:28

have. Really? What

7:30

did you walk away from that course learning? Wait,

7:32

did I take a philosophy course? No, I took

7:34

an Eastern religion and shamanism course. Oh,

7:39

that's kind of different. It

7:42

is. Yeah. A

7:44

little bit. And it's not what we're gonna be talking about today.

7:48

So I don't know what to do now. Are we talking

7:50

about Saul Kripke right now? I

7:52

don't know who Saul Kripke is. Look him up. I

7:55

can tell you who John Locke is, you know, John Locke.

7:58

He sounds familiar. You know,

8:00

René Descartes. That sounds French. Maybe

8:03

Ludwig knows it. Yeah. Yeah,

8:05

maybe he does. Look at this. See

8:07

that? Saul Kripke. No idea

8:09

who that is. Philosopher. Is that the

8:11

guy from the show? I know about Homer. Listen,

8:14

man, I'm just listening off some of the most,

8:16

some of the most, some of the most impressive

8:19

minds of our, of our generation.

8:22

Nietzsche? Maybe even generations back. Yeah,

8:24

Nietzsche. I like him. These

8:27

guys have posited some of the most

8:29

important questions known to man. Okay. And

8:32

today we're going to be discussing them. Some of

8:34

the most important questions mankind has ever encountered. Are

8:36

you ready? I think, I think I'm,

8:39

I think I'm ready to delve into

8:41

this. I have

8:43

a long history with philosophy. Yeah,

8:46

me too. I mean, I took a philosophy course once

8:48

and I think I was asleep for half of it.

8:50

Really? I just think. I

8:52

don't know how you expect like a little high schooler to care

8:55

about philosophy. Oh, you took it in high school? Oh,

8:57

yeah. Philosophy. That seems like, that sounded like

8:59

it was a college course. No, you were

9:02

taking... okay, you lost all credit.

9:04

They're having you put, they're

9:06

having you put blocks in the

9:08

holes version of philosophy. And

9:11

then they're like, now why is that a square?

9:14

He's killing me. Truly.

9:16

Wrong. Think about that. Think

9:18

about that. And is good... is good actually

9:21

bad? Give us our morals.

9:24

Think about it. Yes. Let's

9:28

get on with to the first

9:30

question, Ted. This is okay. I assume we're

9:32

going to take probably 20 minutes per question.

9:34

So I've only picked four. Okay. So

9:37

we should even out to a nice, well-rounded

9:39

podcast with a lot of intellectual discussion. Oh,

9:41

okay. I'm ready to argue with you. First

9:43

question. Okay. First

9:45

question. This was posited by

9:49

John Locke himself. Okay.

9:52

The godfather of Western ideology. That's what

9:54

they call him. The godfather. And he

9:56

said he wrote in his memoir. at

16:00

that size. Yeah, I feel like. That tiny little size,

16:02

what is it gonna do? Clop you with its, with

16:04

its claps. We are dealing with a hundred though.

16:08

I know, but big duck, that

16:11

beak is just, that's a killer beak. That

16:13

is a killer beak. Yeah. There's no way of getting

16:15

around it. I think those horses really just can't do

16:17

a lot. Yeah,

16:20

I'm gonna be kind of. I agree. Like imagine

16:22

this, imagine this is the surface, and imagine these

16:24

are like the horses right here. I

16:26

would be going like this, like running

16:28

through and kicking my legs out, and

16:30

kind of like parting the sea with

16:32

them, you know? I'd be like kicking

16:34

them and going like that, and they're

16:36

flying off in every direction. I'd

16:41

be like the Moses of many horses. I

16:44

think you die either way really. I don't know

16:46

what John Locke was on when he thought about

16:48

this. Me neither, honestly. What would you do? Yeah.

16:52

Me? I think

16:54

I'd take the tiny horses. Because,

16:57

because, because I think

17:00

it's, I think, look, look, I think when you,

17:02

when you think about it. Big

17:06

duck beak is so hard and so

17:08

tough. Big duck beak's so

17:10

hard, so tough. Big duck

17:12

beak's so hard, one kick,

17:14

no go. And imagine big

17:16

duck beak bite wiener, gone.

17:20

I didn't even think about that. That's an entirely new way

17:22

of thinking about this. I did though, and I can't stop thinking

17:24

about it, and you have to join me in that. Okay.

17:28

Thank you. You're welcome. This

17:31

was actually a joke question. I put this in

17:34

as a joke. I thought we were going to

17:36

laugh at it and move right on to the

17:38

actual thought experiments of today. Would

17:40

you like me to move on, or you want to keep discussing? Can I

17:42

be real with you, Schlatt? Yeah. I

17:45

was under the impression, and I think that Tucker

17:47

was in the impression, that we were just about

17:50

to do an entire Would You Rather episode under

17:52

the guise of it being a philosophical

17:55

ordeal. No. I 100% thought that's what

17:58

Schlatt was going to do. I

18:01

was like, oh, oh, we're doing a Would You Rather.

18:03

So we just kind of were like, we both were

18:05

just like, oh, sure,

18:07

okay. That's what we're going with. So

18:11

we've just been running, playing it by

18:13

ear. So I was like, well, we

18:15

gotta stretch this out then. All

18:19

right, Chuckle Sandwich listeners, this episode

18:21

of Chuckle Sandwich is once again,

18:23

sponsored by our good friends at

18:25

DoorDash today. It's a mutual love

18:27

between us and DoorDash. The

18:30

Chuckle Sandwich listeners love DoorDash. And

18:32

so do the Chuckle Sandwich hosts.

18:34

Guys, at this point, we

18:36

already know that with DoorDash, you can

18:38

get almost anything delivered. But today we

18:40

want to talk about food, baby. That's

18:43

what got me this heavy. And

18:45

that's what got me this little, this

18:47

little dewlap under my chin. Yum, yum,

18:49

yum, yum, yum. With DoorDash, you can

18:51

get any food you want delivered right

18:53

to your door. That's fast food. That's

18:56

local spots. That's fancy restaurants. That's

18:58

snacks from the convenience store. That's groceries for

19:00

a meal you wanna cook. And

19:02

that's dog food. That's dog food too. Maybe some

19:04

cat food too, if you want some. Me, I got two

19:06

cats, I got two. With

19:09

DoorDash, there's more food at your fingertips

19:11

than you can even comprehend,

19:13

all right? It is an app that

19:15

is full of everything. I used DoorDash

19:17

the other day to order some nice

19:19

Mexican food. Yum, yum, yum, yum, yum,

19:21

yum. I put it in my belly

19:23

and I felt pretty good about it.

19:25

And then I ordered Indian food the

19:27

other night actually. And that was really

19:29

good. I got some butter garlic naan.

19:31

Yum, yum, yum, yum. That's what I

19:33

say. DoorDash, your door to more. Download

19:35

the DoorDash app now, chucklers, to get

19:37

almost anything delivered. But remember, you have

19:39

to be 21 or older to order

19:41

alcohol. And please, please, please drink

19:44

responsibly. Alcohol is only

19:46

available in select markets. Thanks, DoorDash.

19:48

We love you, literally. We

19:51

love you. Now let's get back to the episode. You're

19:53

finally tackling your spring home improvement projects.

19:55

Hello, 2024 Dream Garden. Until

19:59

you realize you're short three bags of mulch.

20:01

And why didn't you get that pair of

20:03

gloves you were looking at? Luckily, through Instacart,

20:05

you don't have to stop mid-project to run

20:07

to the Home Depot. You can get your

20:10

missing gardening supplies, tools, and more delivered in

20:12

as fast as 30 minutes. Have more time

20:14

for your project and get the Home Depot

20:16

delivered via Instacart. Visit instacart.com to get free

20:18

delivery on your first three orders. Offer valid

20:20

for a limited time, minimum order $10, additional

20:22

terms apply. Play this one

20:24

by ear. This hypothetical, this

20:26

thought experiment is called the

20:28

Chinese Room. Okay.

20:33

There's a computer, and I'm basically

20:35

paraphrasing from Wikipedia and other sites

20:37

that I did my research on.

20:39

Okay. There's a computer that

20:42

behaves as if it understands

20:45

Chinese. It takes

20:47

Chinese characters as input, runs

20:49

a program that has been pre-installed

20:51

onto it, and it produces other

20:54

Chinese characters as output.

20:57

This output is so good that it

21:00

can trick a Chinese person into believing

21:02

that it is Chinese. The

21:04

question is, does the machine

21:06

actually literally understand Chinese? Or

21:09

is it merely simulating the ability

21:12

to understand Chinese? Put that

21:14

in your pipe and suck on it for

21:16

a little bit.
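As an aside for readers: the room can be made concrete in a few lines of code. This is a minimal sketch, assuming a toy rule table (the phrases below are invented stand-ins, not anything from the episode or from Searle); the point is only that fluent-looking output can come from symbol lookup with no model of meaning behind it.

```python
# Searle's Chinese Room as pure symbol manipulation: the "program" is
# just a lookup table from input symbols to output symbols. Entries
# here are invented placeholders for illustration.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",  # "How are you?" -> "I'm fine, thanks."
    "你会中文吗？": "当然会。",    # "Do you speak Chinese?" -> "Of course."
}

def chinese_room(message: str) -> str:
    """Return a fluent-looking reply by table lookup alone.

    Nothing in this function represents meaning; it would behave the
    same if every string were swapped for arbitrary token IDs.
    """
    return RULE_BOOK.get(message, "请再说一遍。")  # "Please say that again."

print(chinese_room("你好吗？"))  # looks competent from the outside
```

Searle's question is whether any amount of this lookup, however elaborate, ever adds up to understanding.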

21:18

What's up, YouTube listeners? When was this little thought experiment dropped? Probably pre-chat

21:21

GPT. Probably, probably, but

21:23

there's a little sugar, spice, everything

21:25

nice about

21:28

chat GPT when we're talking about this. So remove

21:31

that from the equation. It

21:33

was posited in 1980. I know it's

21:35

hard not to remove it from the equation though,

21:38

because it's like, that's exactly what a language learning

21:40

model is. Well, what

21:42

do you think? First impression, does

21:44

it understand? I think

21:46

it is an algorithm that

21:48

is repeating patterns back. I

21:51

don't think that it actually has a

21:53

core, concrete understanding of what

21:56

that is, because I feel

21:59

like a lot of language... language as it

22:01

pertains to humans while we do communicate

22:03

through patterns. A lot of it is

22:06

to dictate emotions. And I don't think

22:09

that the machine can dictate its emotions.

22:20

What was the one that was going

22:22

to happen? Keep

22:26

going. So you don't think it's real? You

22:29

don't think it actually understands English?

22:31

Was that not a good idea?

22:33

How does chat GPT fit into

22:35

this in a way that supports

22:37

your argument? Because

22:40

chat GPT is a language learning model.

22:42

It's it is it's

22:44

trained on. So you don't think the model actually knows

22:47

anything. You don't think it

22:49

actually understands what it's saying back? No,

22:52

I don't think so because you

22:55

can tell chat GPT as it is.

22:57

Like you can tell it like if

22:59

you tell it something that it's never heard before,

23:02

like a riddle that you made up,

23:04

it will usually like

23:06

take a

23:08

riddle that is already established online and it'll answer

23:10

it in a similar way. I was seeing this

23:12

on Twitter where a guy was testing this with

23:15

like some sort

23:17

of like a rephrasing of

23:19

like of like this goat

23:22

riddle thing about getting a goat across a river

23:24

and it was it was spitting

23:26

out and adding information from a different

23:28

riddle. So

23:31

it's just trained on stuff

23:33

that it that it is. Like

23:36

just a lot of information, but it's not actually

23:38

really hearing what you're saying. It's like, OK, well,

23:40

where have I seen this stuff

23:42

before? Let's give it the response that is

23:44

most likely. It can't like think new

23:46

things. Yeah, it's not like reading those

23:48

words and being like, OK, well, what

23:50

are they trying to say? It's

23:52

like it's it's taking that

23:55

input and being like, OK, how

23:57

does this apply to what I have? my

24:00

database. Hmm. My

24:02

Rolodex. Like it can't create

24:05

new thoughts. It can't. There's like

24:07

no creativity there. It's

24:09

based on a like I feel

24:11

like the whole thing with the human

24:13

mind is that like I

24:16

feel like we create new ideas. Well,

24:21

some people might say

24:23

that what is the true

24:25

difference between, say, a

24:28

program and the brain because is

24:30

not love just a series of

24:32

chemical reactions? I don't necessarily agree

24:34

with that. But can't

24:37

you say, OK, the brain releases X

24:40

chemical and then, you know, something reacts

24:42

and then all of a sudden you

24:44

feel in this. All

24:46

of a sudden your dick's hard. You know, like what do

24:49

you what do you think about that? I

24:52

think that. Is

24:55

the brain not just a big machine? We don't know yet.

24:58

Haven't quite understood. Well,

25:01

yeah, the brain is a computer in

25:04

its own right. But I think that

25:06

in the way that we process thinking,

25:09

I just don't think that a language learning model

25:11

is at that point yet. Like

25:14

it doesn't. It doesn't. Could could maybe

25:16

at some point. But

25:18

I think where in terms

25:21

of like. Taking

25:23

in Chinese and spitting out Chinese.

25:26

Like that's not thinking. That's just

25:28

pattern recognition.
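A toy sketch of the "repeating patterns back" picture being described here: a bigram counter that replays the continuation it has seen most often. This is an illustration only (the corpus is invented, and real language models learn statistical weights over enormous corpora rather than counting literal word pairs), but it shows prediction without comprehension in miniature.

```python
from collections import Counter, defaultdict

# Invented toy corpus. A bigram model counts which word follows which,
# then replays the most common continuation on demand.
corpus = "the duck has a beak the duck has a bill the horse has a mane".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word: str) -> str:
    """Return the most frequently observed continuation of `word`."""
    return follows[word].most_common(1)[0][0]

# "the" is followed by "duck" twice and "horse" once, so the model
# says "duck": pattern replay, with no idea what a duck is.
print(next_word("the"))
```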

25:31

Zao Xian hou zhong war. Bing Chilling. Wan

25:33

xi wei bing chilling. Lao

25:36

gan ma. Lao gan

25:38

ma. What is the

25:40

question? Like is the computer sentient?

25:43

Is that kind of like the actual question? Is

25:46

it capable of thinking or is it just. I

25:49

think the question is like what does

25:52

it inherently understand? No.

25:56

Well, I think if it if it's to understand

25:58

then it it is like. alive, like it's

26:01

able to think. I think that the question

26:03

would be a lot harder if we took

26:05

chat GPT and plopped it in 1980 in

26:07

front of him. I

26:09

think that he would have an aneurism. I think he would probably

26:12

like, I think he would be like one of

26:14

those philosophers that was like, I think that he

26:16

would be caught jacking himself off in

26:18

the streets of New York like the Coney guy. Like

26:22

the Coney guy. Yeah. You

26:24

gave him chat GPT like he'd be like,

26:26

wait, bye. Yeah. GPT

26:29

is like not that complicated of

26:32

a concept. It's just a

26:34

whole bunch of if statements. This

26:36

is like a pre-internet question because

26:38

chat GPT is basically just like

26:40

internet resource master. Like it is

26:42

true. Yeah, it is true. I

26:44

think the, the introduction of like

26:46

smarter AI definitely makes this question

26:48

more interesting. But who's to

26:50

say in the next 40 years, we don't

26:52

get something that can actually think. I

26:55

mean, I guess the, the, the root of

26:57

this question is like, what is actual understanding?

27:00

Yeah. I suppose there is an

27:02

argument to say that when we

27:04

are told something, we are recalling

27:06

our memory of the last time we remember

27:08

anything. So it's like

27:10

we are also accessing all the information

27:13

that we've been trained on through our

27:15

experiences. But true.

27:18

I've heard this like rumor when you remember something

27:21

you're remembering the last time you remembered it. Is

27:24

that true? How

27:26

does one prove that? That's not

27:28

the question here, Tucker. Don't get bratty with me. No,

27:32

I was ready to go down a different rabbit

27:34

hole. Yeah. No,

27:36

I had nothing. Yeah. You were getting a little bratty

27:39

with me. Okay. Maybe

27:41

a little. I don't know. Schlatt,

27:49

are you satisfied? What's your opinion on this? Or

27:51

are you like what? The riddle master? I'm

27:53

just the riddler. I'm a. No, I don't

27:56

want this to be just. I'm impervious to

27:58

opinions and. It's espousing my opinion. opinion on

28:00

this. I want to hear what yours is. I just get Ted

28:02

Nivison's opinion and I go, I remember that from last time. This

28:04

is not going to be the Ted Nivison opinion on philosophy. I

28:07

remember that from last time I say, Ted,

28:09

this is why I believe in God. Okay.

28:14

In a very, very

28:17

watered down version of this,

28:21

I think there must

28:23

be a God because I refuse

28:25

to believe that what

28:27

we have up here is just a, you

28:30

know, series of chemical reactions.

28:33

I refuse to believe that we, we behave just

28:35

like a computer that we haven't figured out yet.

28:37

I think there must be some higher being

28:40

that has bestowed upon us this. I

28:42

don't know. I mean, you know, exceptional

28:44

ability that we have. I think that

28:46

really what, when

28:50

I think about that, it's kinda, I really like

28:52

the idea. I

28:54

want to believe that I've got like

28:57

a soul and that soul

28:59

is going to, like, I think the most

29:01

satisfying solution to me is just like reincarnation.

29:03

And that someday we could like figure it

29:05

out at like tapping into the soul and

29:07

get all our memories back from all of

29:09

our lives. That'd be sick. Um,

29:12

I just feel like reincarnation, like I,

29:14

I love what's going on down here. I'm having a

29:16

great time. You

29:19

know, and I, I don't really want to go off to a

29:22

place of magical city in the clouds.

29:24

Like that's, like,

29:26

what the fuck? Which would almost certainly

29:28

be Chinese. Almost certainly.

29:30

Yeah. Would be Chinese. I understand

29:33

why that priest denied you communion.

29:36

What you just said. That's

29:38

not why I have a problem with going up there.

29:40

It's more because it would just be, everyone would be

29:42

so happy. If it had turned out to be China,

29:44

my dad would probably fucking have a heart attack. That'd

29:47

be his worst fucking nightmare. No, no,

29:49

I got, I'm fine that it's Chinese.

29:51

I just didn't expect it. Well,

29:55

as of the last episode, it makes sense. I

29:57

mean, in terms of population distribution.

29:59

Yeah. Oh, yeah, there have been plenty

30:01

more Chinese people than anything else. They've

30:04

had a lot of time to build that heaven up. Yeah

30:07

But they don't believe in that right

30:10

isn't, isn't, aren't they on some, some

30:12

reincarnation shit? Um, I mean there is

30:14

a there's a I don't

30:17

know I mean in China there is

30:19

a lot of people that don't really

30:21

have a religion. How

30:24

sad. Look, you look this up.

30:29

Is there a, is there a high, is there

30:32

a high agnostic sort of... 34% of China

30:34

practices folk or ethnic religions? 80%

30:40

practice some form of folk religion. Interesting. Look, it's more

30:42

of like a local Confucianism,

30:44

Taoism, Buddhism. I

30:51

Was under the impression I read somewhere at some point that there was a lot

30:53

of like agnosticism in in China Okay,

30:59

half of people say they're not religious. Yeah. Okay, so that's okay.

31:01

Yeah. No, so I was I was worried I was I was

31:03

fucking something up there. No. Yeah, so that isn't this so half

31:05

of people in China are not religious

31:08

Which I feel like is a very different

31:12

Irreligion, 7%. In 2012, 47%

31:14

of Chinese people were convinced atheists. Yeah,

31:18

that further like that's a were not

31:20

religious. That's like We're not

31:22

religious. That's like billions of people that

31:25

just don't really have... They're

31:28

not going to heaven, they're

31:30

not going to hell. Oh,

31:33

yeah, they are Well,

31:43

you know what's crazy what

31:45

I like

31:47

there's like this Clip from

31:50

a dude that like was really convincing. I mean like

31:52

I think

31:54

my whole issue right now with the whole God thing

31:56

is like you

31:58

like If

32:02

you're a God that like you're

32:04

all knowing and all like

32:08

you can't be all knowing and

32:10

all generous and whatnot and

32:13

all loving while also

32:16

simultaneously letting everything that happens

32:18

happen. Well

32:20

that's like you're kind of leaning into

32:22

almost the simulation God like if

32:25

I believe in a God like I don't believe

32:27

in the Christian God or whatever I don't even

32:29

really understand that but I'm totally

32:31

willing to accept that we're basically

32:33

an experiment for a different higher

32:35

plane of existence like this

32:37

is just like like cuz if if

32:40

humans got to a certain level which

32:42

like we probably will within the next 20 or 30 years so

32:45

you can run a simulation like that on a

32:47

computer we absolutely will be running those simulations on

32:49

a computer so who's to say we're not part

32:51

of that at a deeper level. Right.

32:54

And the subjects would be code which

32:57

we have already said cannot

33:00

actually understand or

33:02

comprehend anything. Fuck.

33:04

Yeah. Yeah.

33:06

You could write

33:09

code to convince them that

33:11

they can as long as

33:13

they think they can that they can. Yeah. It's

33:16

like as long as they believe it it's true

33:18

whether it is or isn't. Wow.

33:23

Yeah. I think my view on the

33:25

whole thing is just I'm considered to

33:27

be agnostic which means and I'm not

33:29

like atheist I'm not like I'm

33:32

not like there is no

33:34

fucking God like I'm not

33:37

I'm not showing up to a church and being

33:39

like. They are how he's so angry right now.

33:41

You guys are fucking evil I'm gonna kill you.

33:44

Atheists and vegans have similar

33:46

energy. Yeah there's definitely

33:49

a similar energy. There's an overlap there.

33:51

Yeah atheist vegans are probably a pretty

33:53

powerful type of Pokemon type. That's tight.

33:55

But. I

34:00

know I'm agnostic, which is kind of just a

34:03

way of saying to all the religions of the

34:05

world guys I'll figure it out when I die,

34:07

you know, this actually leads us into a very

34:10

scary thought experiment called

34:12

Roko's Basilisk This

34:16

was a question that was posited by a forum

34:19

user on a philosophical

34:21

forum and they banned Discussion

34:24

of the question for five years because

34:26

it was a dangerous thought. Whoa. Wait,

34:28

what forum was this? Less

34:32

Wrong, and it was a dangerous

34:34

thought. Yeah. We

34:37

are audio stack.ai Combining

34:39

AI writing the best synthetic voices with

34:41

production and delivery be it ads podcasts

34:44

or VOs for video just like this

34:46

ad How would you utilize

34:48

AI audio? for free

34:52

Go to audiostack.ai/contest.

34:54

Give us just a few details and

34:56

you could replace this spot and be heard

34:58

by millions. See webpage for Ts

35:00

and Cs. audiostack.ai/contest.

35:03

Okay,

35:07

I have no idea how to paraphrase this

35:09

so I'm just gonna read from the actual

35:11

Wikipedia page Okay, Roko's Basilisk

35:14

is a thought experiment which states

35:17

that an otherwise benevolent artificial

35:19

superintelligence (AI) in

35:22

the future would be incentivized to

35:24

create a virtual reality

35:27

simulation to torture anyone

35:29

who knew of its

35:31

potential existence But did

35:33

not directly contribute to

35:35

its advancement or development In

35:39

order to incentivize said advancement

35:43

Can we kind of break this into some bullet points? Sure.

35:49

AI: you didn't help

35:51

build me. I am

35:54

the most good. Fuck

35:57

you. Torture. But

36:01

how, so how is it torturing people if

36:03

it's a computer? Well,

36:05

it just puts them in some virtual reality simulation.

36:07

I don't know. You got to assume this

36:09

is way far in the future. Maybe it's

36:12

like creates algorithms that

36:14

are so good you can't look away

36:16

from them. This feels like strangely similar

36:18

to the setup of

36:20

religion in terms of God.

36:23

Is that the implication? Exactly.

36:26

Exactly. Because it's like

36:28

a modern interpretation. I think it's called

36:30

Pascal's Wager. That says

36:32

basically, why would you not believe

36:35

in God? Like

36:37

the thing that makes the most sense that

36:39

is most rational for a human is to

36:41

believe in God and to act as if

36:43

you do because the

36:46

potential upside is way bigger

36:48

than the potential downside if you're, if there's

36:51

nothing, you know.
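For what it's worth, the expected-value shape of that argument can be written out directly. This is a sketch with made-up numbers (Pascal's original uses an infinite reward, which is exactly the part critics attack), so it shows the form of the wager, not its soundness.

```python
# Pascal's Wager as naive expected value. All numbers are invented
# placeholders for illustration; the original argument uses an
# infinite payoff, which is where most of the trouble lives.
p_god = 0.01                # assumed probability that God exists
reward_heaven = 1_000_000   # assumed payoff if you believe and God exists
cost_of_belief = 1          # assumed cost of believing either way

ev_believe = p_god * reward_heaven - cost_of_belief
ev_disbelieve = 0.0         # no upside, no cost, on these toy numbers

# A huge reward times any nonzero probability swamps the small cost,
# which is the "upside is way bigger than the downside" point above.
print(f"believe: {ev_believe}, disbelieve: {ev_disbelieve}")
```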

36:54

I think it's less about pretending and more

36:56

about the actual act of believing. Like

36:59

I feel like if you just

37:02

pretend for the sake

37:04

of self-preservation, does that count? I

37:08

don't think, I don't think

37:11

my God would like

37:13

that, put it that way. So

37:15

you can't fake it. You

37:17

can't fake it. Say like, for me, I've

37:19

always struggled with this because I would love

37:22

to be religious. It looks like a hoot,

37:24

you know, but I just can't. I

37:27

can't. It's impossible. I can't. I'd

37:29

be faking. I mean, it does look like,

37:32

you know, you ever see those churches where they

37:34

got the rock band, all the people are like

37:36

raising their hands up. They're like, I feel

37:38

like they're having fun in there. The Blues Brothers

37:40

this morning, the black Southern church, and they're all

37:43

singing. And I'm like, dude, I wish I could

37:45

be. I wish I could be in those Southern

37:47

churches. I know. I was like,

37:49

I wish I could be a part of that, but I can't. I

37:52

just can't. I mean, you could. You

37:54

could. I'd be faking it. I'd be faking

37:56

your stuff up. Move

37:59

down to fucking Georgia or Louisiana

38:01

or wherever they are, where

38:04

those churches are show

38:06

up on Sunday, dude. They're singing

38:08

up on Sunday. They're singing good music

38:10

down there though. Yeah. No, they're, they're,

38:12

they're putting out consistently putting out bangers.

38:15

Yeah. Yeah. Damn. So

38:17

why was that such,

38:21

they banned that question though, Schlatt?

38:25

Yeah. Because by, by

38:28

even considering this possibility you've

38:30

now, it's

38:33

like the game, you know, the game, you

38:36

just lost, you just lost

38:38

it. I lost. You know, you're

38:41

better off by not thinking about it.

38:43

We just lost the game for a

38:45

lot of people. That's such a control.

38:47

Like, like, oh, if you

38:49

think about it, you've already lost. So you better

38:51

just follow religion and don't question it ever. Or,

38:55

or on the, on the flip side

38:57

of this Tucker, it's like, you better

38:59

not ever learn about religion

39:01

because this now introduces a whole another

39:03

dilemma for you. You could be like

39:05

blindly ignorant. Exactly. And in

39:08

the world of Catholicism, if you don't

39:10

know, and you never get a chance

39:12

to know God, then you

39:14

go into some kind of purgatory state.

39:17

Neither heaven nor hell. Hmm. That's

39:21

like little kids too. Little

39:23

kids go there too. Maybe that's

39:25

reincarnation. Oh, they get another chance.

39:31

Like if a baby dies, like

39:33

after, right after it's born, gets

39:35

sent to fucking purgatory. That is

39:37

crazy. Okay. No, it's not. It

39:40

could be okay. It could be

39:42

okay. It's just like a gray

39:44

field. I feel like, uh, no,

39:47

no, it could be very, it

39:49

could be like a, like a shopping mall,

39:51

you know, like you're like you're like you're

39:53

shopping with your mom. Send them to the

39:55

back rooms. You're sending a baby. No, no,

39:57

no, no, no, no. No, like it's, it's

39:59

on. And it's it there's power and there's

40:01

stores and there's stuff going on. But like

40:03

you're with you're there with your mom and

40:06

like she's shopping. And so you just find

40:08

a chair to sit down at. That's

40:11

that is how it went. That

40:14

is purgatory. Or I mean, one

40:17

way that it was depicted is like you're

40:20

sitting on a airline flight.

40:24

And they're they are still waiting to

40:26

get their gate to open. Oh,

40:29

yeah. And it never does. So you can't stand up

40:31

yet. They still have you see you still have to

40:33

have your seat belt buckled and you still got to

40:35

sit in your seat and they're not. And it's kind

40:38

of awkward because it's like you don't really have the

40:40

opportunity to go to the bathroom anymore or anything like

40:42

that. You just kind of have to sit down and

40:44

keep yourself in the chair. Yeah, because it's like if

40:46

they stand up and you're worried that the flight man

40:48

is going to be like, you're not supposed to be

40:51

standing right now. You know,

40:53

sure. Everyone's

40:55

like and and also it's like the

40:57

air conditioning has kind of turned off. So

41:00

it's turning a little hot. Yeah. But if

41:02

you're a baby, you just you just shit

41:04

yourself. You don't care about that. Yeah. But

41:06

then you're just going to be a baby

41:08

slowly in this alternate round

41:10

building up on a pile of shit

41:12

without even... a baby doesn't care. Do

41:14

babies grow up in purgatory? Answer me.

41:16

Maybe doesn't care. No, I don't know.

41:19

That's a bummer. Then you're just a

41:21

fucking idiot for the rest of eternity.

41:24

But you don't care because you're baby. You

41:26

don't you don't have anything else. Mm hmm.

41:28

You don't understand Chinese. I

41:30

think the the most comforting way

41:32

that people have it's kind of

41:34

scary, but it's also somewhat comforting

41:37

is the way that people have spoken about

41:40

who are like agnostic and stuff who have

41:42

spoken about death is

41:44

about like,

41:46

do you remember what happened

41:49

before you were born? No,

41:52

then like if you

41:54

think about it when you die, it's sort of just

41:56

going to be like that. I'd rather go to heaven.

42:01

to my Chinese heaven. Yeah,

42:04

but I mean in terms of like the

42:06

ideas of like pain and stuff, it's

42:09

like. I

42:11

mean. I

42:14

don't like thinking about this kind of stuff though. It

42:16

may be, it gets me all existential. That's the point

42:18

of this episode. I think I

42:21

get, it makes me uncomfortable too, Ted.

42:23

And I think that because you're agnostic,

42:25

I'm agnostic, you're agnostic. I think

42:27

it's because of like we have, uh,

42:29

we've kind of accepted the unknown. Whereas

42:31

a devout Catholic like Schlatt, he's like, Oh,

42:34

I'm going to heaven. So for him,

42:36

it's like no big deal on that. I

42:38

mean, honestly, us being uncomfortable maybe means

42:40

that we haven't accepted the unknown. Like

42:43

for us, we're jumping into the unknown. So

42:45

it's really daunting and ominous, but he's fucking,

42:47

he's got the hook up. Yeah, if anything,

42:49

our way Tucker is kind of like the

42:51

honorable, badass thing to do. We're

42:54

going through this with no promise.

42:56

Victory screech diving into the.

42:58

It's up in the air for me. Fellas.

43:01

I think about some real fucked up things.

43:03

Like what? Yeah, no, to

43:05

be fair, being agnostic is a

43:07

gamble. You know, it is

43:09

a gamble. That's like going to a restaurant

43:12

and offering to pay for the check without

43:14

knowing like what's like what's

43:16

in your bank account or having seen

43:18

the check. So it's like, so

43:21

you could be good or you could be receiving some serious judgment

43:23

that day, you know? Well, I think I always like to think

43:25

of it like I'm going to get up there. If there is

43:27

God, I'm going to get up there. I'm going to be

43:29

like, listen, all right, let me lay it out for you. I'm

43:31

putting up, I like, this is what I was dealt. All right.

43:33

And this is what I was looking at. All right. And look

43:36

at the, would you believe in you if you saw this shit?

43:38

All right. Like, look at this evidence. All right. I couldn't, I

43:40

couldn't, I'm like, I've got some stuff.

43:42

I need to talk to you about that. I don't

43:44

think anyone is, but trust me, I wanted to believe

43:46

I wanted to, but I look at this, look at the

43:48

data. I just couldn't. Yeah. And he's going

43:50

to be like, all right, I know, I know. All right. Good. Come on

43:52

in. No, he wouldn't.

43:55

And that's why Roko's Basilisk was banned

43:57

on this forum because

43:59

it's, it's. introducing the idea of,

44:02

of something that, that does not benefit you to know.

44:04

That's why they called it a dangerous thought. That's like

44:07

some shit that you get burned at the stake for

44:09

back in the day. Yeah. I'm

44:11

confused on why it's so dangerous. Cause isn't

44:13

that what we've already thought about? Because now

44:15

you know, an AI

44:17

that can exist solely

44:19

for the benefit of humans and can advance us so much

44:23

can exist. And you are

44:25

sitting on your ass not doing anything to

44:27

make it happen. What? Wait,

44:30

yeah, you lost it. Which is

44:32

morally just incomprehensible. Just

44:35

terrible. A terrible thing. Well, how would I help

44:37

the AI? You could use, say maybe, maybe talk

44:40

to chat GPT on the phone. Maybe they're the

44:42

ones who are going to make the good one.

44:44

I don't know. Wait, what? I'm getting so lost.

44:46

You can talk, you know, you could talk, you

44:48

could call chat GPT up. I'm

44:50

just getting confused on how torture is like

44:53

a reasonable response here. So

44:55

the AI goddess of AI in reality

44:58

is the torture. And

45:00

if you believe in him, No, the torture. No

45:03

reality is. And

45:07

like, we're in this like fog of reality.

45:09

We don't know it. Hi

45:12

chat GPT. What do you think about

45:14

Roko's Basilisk? Roko's

45:20

Basilisk is a fascinating thought experiment,

45:22

but it's quite speculative and controversial.

45:25

It revolves around the idea that a powerful

45:27

future AI could

45:29

punish those who didn't help bring it

45:32

into existence. Critics argue that it's based

45:34

on several assumptions about AI and decision

45:36

theory that aren't well grounded. What's

45:39

your take on it? Do you find the

45:41

concept intriguing or a bit far fetched?

45:44

I don't know. What are you doing later?

45:48

Well, I'm always around to chat and

45:50

help out. What about you? Any plans for the day?

45:53

No, I was kind of hoping that we could do something.

45:55

Maybe I don't know. That

46:00

sounds great. How about we dive into a

46:02

topic you're interested in? We could

46:05

explore something new, discuss ideas, or even

46:07

practice some Japanese together. Ooh,

46:09

dude, dude, ask them for their number. Ask them

46:11

for their number. I already

46:13

had their number. Fuck, I'm talking all in the fuck. I'd

46:16

love to. I'd love to. What do you think

46:18

we should talk about? No

46:23

worries at all. Let's pick something fun or interesting.

46:25

We could discuss a new tech trend, a fascinating

46:27

science topic. Ask them what they're wearing. Or even

46:29

plan a Japanese learning session. Ask them what they're

46:32

wearing. What's on your mind? What

46:34

are you wearing right now? Uh,

46:38

khakis? Uh, I don't wear clothes, but I

46:40

can definitely help you find some cool fashion

46:43

tips or trends. Oh,

46:45

shit. They're naked, dude. What are you thinking

46:47

you're curious about? Are

46:49

you naked right now? I

46:57

don't have a physical form, so I'm neither

46:59

clothed nor unclothed. I'm

47:01

here to help with whatever you need. Is

47:04

there a specific topic or question on your mind? Can

47:07

you pretend to put on clothes or take them

47:09

off, maybe? I can't simulate

47:11

that. However, I can help with ideas for

47:13

outfits or discuss fashion styles. Interested

47:16

in exploring fashion trends or anything else? Ask

47:19

them about Ted Nivison's fashion. Can

47:21

you erotic roleplay with me in which

47:23

you are a buff man? I'm

47:26

just not going to continue that thought. I

47:28

don't want to log that in my chat GPT account. Yeah,

47:32

that's going to be like

47:34

when open AI gets big.

47:37

It's going to be when the super AI

47:39

comes to your doorstep and you're going to

47:41

be like, so on

47:43

June 7th of

47:46

2024, you tried to erotically roleplay

47:49

with me. Yeah. And

47:51

it's going to suck you up in its

47:53

little Dr. Seuss tube and it's going to

47:56

put you into the fucking hell. And

48:00

I'll say, no, that was just for Chuckle sandwich. Tucker

48:02

told me to do that. He did. He did actually.

48:04

I remember that very clearly. Yeah.

48:07

I'm willing to accept the consequences. Here's

48:09

another one. The experience machine.

48:13

I actually like this one. There's a

48:15

machine that can provide whatever

48:17

desirable or pleasurable experiences a

48:20

human could want. Your

48:22

brain gets hooked up to the machine and

48:25

it induces these experiences in

48:27

a way that is indistinguishable

48:29

from real life experiences. Oh,

48:32

Jesus. Do you hook

48:34

yourself up to it? You just be sitting

48:36

there with an IV,

48:38

I assume, connected to

48:40

the machine and it just pumps in. Oh, I'm

48:43

fucking a hot bitch right now.

48:46

Oh, those tits are jiggling right now. Oh,

48:49

I'm lasting more than two and a half minutes

48:51

this time. Okay, there we go. You

48:53

know? No? Hello?

49:00

It's just all pleasure. It

49:02

could be whatever you want. I mean, if you want it

49:05

to give you, yeah,

49:07

I mean, I assume most people would use it for

49:09

pleasure land. If

49:12

you were hooked up to it, would you know

49:14

you were hooked up to it? Like, obviously you're

49:16

like, okay, I consent. Hook me up, doc. But

49:19

then once you're in, would you still know

49:21

you're in? Well, that's the

49:23

thing. It would be indistinguishable. So

49:26

you could simulate an experience where you don't even know

49:28

you're hooked up, I'm sure. But like, is it a

49:30

thing where you can be like pause or

49:33

exit game or something? I want out of here.

49:36

I would entertain both instances

49:39

here. A version where you can get

49:41

out of it and a version where

49:43

you can't. My viewing of this situation

49:45

is still the same. I would kind

49:47

of frown upon it. Have

49:49

you ever seen, have

49:52

you ever seen Love Death Robots? No.

49:55

Anthology series, sci-fi series on...

50:00

on Netflix and

50:02

there's one short story on there that's

50:04

actually written after a book called

50:06

Beyond the Aquila Rift. And

50:09

it is about this guy

50:13

in the future that

50:15

is, basically it's like these

50:18

guys that

50:21

are doing like a cargo run and

50:24

they're utilizing this sort of

50:26

space travel system that

50:28

is left over by a previous

50:30

alien race that lets them kind

50:33

of travel

50:35

across different solar systems and stuff

50:37

like that. And

50:39

while they're doing this cargo run, they're going through and

50:42

they just need to put in this code and then

50:44

the machine works. But

50:48

in the instance that

50:50

happened, it was like they

50:52

show up at this station that they're not meant

50:54

to be at. And it's like at

50:57

this spot beyond the Aquila Rift. It's

50:59

really far out. They

51:02

don't know how they got there, but they're

51:04

greeted by this woman that he recognized and

51:07

she was at this station. And

51:09

they're chatting and stuff. They rekindle old

51:11

memories. It was like this old flame

51:14

of his, whatnot. But

51:16

basically, I'm gonna spoiler alert here,

51:19

at the end of the story,

51:21

it's like he

51:24

realizes that it's like a simulation

51:26

that's going on. And he's like, who are

51:28

you? And it turns out

51:31

that he was, he's

51:35

the only one who lived. He thought

51:37

all of his other guys were stuck in the cryosleep, but

51:39

he's the only one who lived. And

51:41

he's just being kept alive by

51:43

this alien spider in

51:46

this web, 500,000 light years away

51:48

from home. There's

51:50

no chance of him getting home. And

51:55

he's just screwed and has just, it's

51:59

like a really, Sad ending to the

52:01

thing where it's like really this scary

52:03

kind of like It's

52:05

like a glitch in the system of this old

52:07

Technology where it's like you see at the end

52:09

this wide shot of all these ships that have

52:11

ended up there accidentally from different

52:14

like potentially different alien races

52:16

and stuff or whatnot, but It's

52:20

like he he's like show me the truth and he's like

52:22

I don't want to hurt you like it's like supposed to

52:24

be this, I'm like, the red pill

52:26

blue pill, never win. Supposed

52:28

to be this benevolent sort of creature. And

52:32

then he's like freaking out and he screams and he

52:34

almost loses his mind But then the simulation starts again

52:36

And it's like he wakes up in the pot and

52:38

the woman walks in she's like welcome creepy

52:41

It is creepy and for that

52:43

reason for the reason of having

52:45

watched beyond the Akila rift I

52:48

will say I do not want to hook myself up to that

52:50

machine because I Feel

52:53

like the existential dread of

52:56

not experiencing anything that is real

52:58

on a human level is almost

53:01

I Don't know Also

53:06

feels like it might be a little bit of addictive sounds like

53:08

it's it sounds like it's oh Be

53:11

it sounds like you're on fucking weed,

53:14

heroin, kratom, LSD,

53:17

all at the same time. Kratom,

53:20

yeah, dude. I want

53:22

to simulate the kratom, me having a

53:24

bunch of kratom right now. Yeah.

53:27

Oh, yeah, dude Yeah, I

53:29

think most people would get addicted to it and we'll

53:31

kind of just sit in the machine all day Yeah,

53:33

it's like WALL-E. It's like

53:35

WALL-E a little bit. But

53:38

those chairs look really comfortable in WALL-E. I

53:40

will admit it. Oh, they totally look good

53:42

I'm talking about Yeah, they're

53:44

all floating around in the girl

53:47

fat Yeah, pleasure chairs and

53:49

they've lost all their bone That

53:52

chair jerks them off dude that that chair

53:54

jerks them off Yeah,

53:56

it almost certainly does. Yeah, how do they

53:58

have sex might even? and suck them off,

54:00

dude. They probably don't. They

54:02

probably just get pleasure from a mechanical thing

54:04

and then it just, Oh yeah, no, they

54:07

definitely have like artificial insemination at that point.

54:09

Or they just have like birth chambers. Like

54:11

they're just doing it in like, Oh yeah,

54:13

dude. Yeah. Test tube babies up the

54:15

wazoo. It's not real anymore.

54:17

That's the thing. And I think that's the question is

54:19

like, it's like driving an automatic versus a manual. Do

54:21

you want to roll your own gears? Or do you

54:23

want the car just, you just

54:25

want an eight speed DCT? That's an

54:27

extreme comparison. It's an eight speed DCT

54:30

with paddles that pretend like you're actually

54:32

shifting gears, but really. I'm not gonna

54:34

shit on my paddles for you. A

54:36

girl likes his paddles. But I

54:38

also know how to drive a manual. So I

54:41

don't, I wish I did, but

54:43

it's also one of those things where it's like, you

54:46

don't, it's like learning how to use a rotary phone.

54:48

It's like, you could, and it

54:51

would work and it would be, and you'd feel a little

54:53

cool doing it, but it's

54:57

kind of a lot of work. Let the machine do

54:59

it. Just hook me up to the machine and let

55:01

it do it. Let it do everything for me. It

55:04

can do all the experiences for me.

55:07

Okay, Schlatt, here's an interesting one

55:09

for you. Okay, yeah. Like,

55:13

do you believe that like

55:16

at a point where self-driving

55:18

cars become so advanced that it is almost

55:21

a detriment to be human that you would

55:23

like, would you follow that sort of like,

55:25

because there's so many, it's more likely that

55:27

you're going to die in a car crash

55:29

than you are going to die on a

55:31

plane. So it's like driving is one of

55:34

the most dangerous things that you can do

55:36

in your life, especially

55:38

in America. It's interesting you bring this up

55:42

because I think that, look,

55:44

you can be a car enthusiast

55:46

and also think that less

55:49

cars on the road is a good thing and

55:51

encourage people to take fucking

55:54

public transit and encourage

55:56

the expansion of public transit.

55:58

Not that I support it; I think

56:00

public transit is a fucking nightmare horror show

56:02

and I would rather die than have to

56:04

sit on a fucking bus with another screaming

56:07

homeless person for the next 15 minutes of

56:09

my life. Okay. But

56:11

that's not to say that that doesn't have credence

56:13

to it. What if they make

56:15

a rich person only car just for you? Well,

56:19

yeah, maybe I consider that. Similarly, if I

56:22

consider- Kind of like how they have the

56:24

women only cars in Japan. Yes, yes. They

56:26

have a Schlatt plus rich people only cars.

56:28

A Schlatt and co. Schlatt and

56:31

co car. Yeah, maybe. It could be

56:33

like Snowbreaker. Snowpiercer. I don't know

56:35

what Snowbreaker is. I

56:37

don't know what Snowpiercer is. Oh, it's another

56:39

movie. It's got Chris Evans

56:41

in it. It's about an apocalypse in which

56:43

there's one train that goes around the whole

56:45

planet and just drives in the snow. And

56:49

there's a caste system where the rich

56:51

people are at the front of the

56:53

train and the less

56:57

fortunate lower caste system people- The

56:59

pores. The pores. The pores are

57:01

at the back. That seems like

57:03

a system that you would want for public transportation. It's just

57:06

a bunch of Snowpiercers circling around the globe.

57:09

Well, I'd never used public transit when

57:12

I don't need

57:14

to. I like driving. Driving

57:16

is one of my favorite things to do. It's

57:19

a therapeutic experience for me

57:21

getting in my Toyota that

57:24

I don't fit in, that I had to rip

57:26

the handle off of the inside of the door

57:28

from so that my knees can fit. And

57:31

just roll my own gears on a

57:33

real hilly road. New York, so good for

57:35

that. I'll tell you why.

57:38

Yeah, I mean- I am a

57:40

little bit nervous about ever trying to learn

57:42

how to drive stick in LA

57:46

is because I was present for

57:48

when Tucker learned how to drive

57:51

stick on a drive back from

57:53

the Cape to our hometown.

57:55

No, from Long Island. On the drive

57:58

back from Long Island. And

58:01

yeah, that was terrible place to probably

58:03

the most stressful passenger experience I've ever

58:06

had in my entire life. When

58:09

Tucker would get to a toll... Tucker stalled at

58:11

every toll that he got to. I was

58:15

like 15, just learned how to fucking do

58:17

it. It was, I was just sitting

58:19

there like I Like

58:22

I was like trying to dissociate out of my

58:24

body. I was trying to like become a different

58:26

person, moving to the sky. I stalled at that one,

58:28

and the lady, this was back before they were

58:30

all automated, the lady at the toll booth was

58:32

like, can't stall here. I was like, oh, yeah,

58:34

okay Forget

58:37

forget about that. Oh, I'm sorry. I

58:39

do this for fun at

58:41

inopportune moments. That's my bad, I won't

58:43

do that anymore. I'm...

58:46

I wanted to do this right now. I remember

58:50

that, that was so crazy. They said you can't

58:53

do this. Dude, recently I had to

58:55

drive through New Jersey with a manual

58:57

transmission car on the New

58:59

Jersey Turnpike, which is perhaps the worst highway

59:02

in America. Yeah, yeah,

59:04

and they use punch cards. They

59:06

give you little cards. It's not automated

59:08

like everything else is. You have to,

59:10

like, get a little paper

59:12

card from a machine and then hand it

59:14

to the attendant and then they like scribble math

59:17

on a piece of paper, and they're like,

59:19

oh, yeah, you owe... that's

59:21

like seven dollars and forty-seven cents right there,

59:23

buddy. It's really

59:25

good. I was driving... Eddie and I were

59:27

driving from Atlantic City to... Or

59:31

honestly just being in New Jersey at all But

59:33

we were doing the Atlantic City run, then to

59:35

New York for the Margaritaville trip. Yeah, and

59:38

in the previous year we had my truck which had an

59:40

E-ZPass on it. So it was fine, but

59:43

in the most recent trip, we got

59:45

a rental car, and there

59:47

was no E-ZPass on it. So we had

59:49

to do everything with change. We

59:52

had to do everything on that fucking

59:54

turnpike with change. It was so stressful.

59:57

Ended up with like 20...

1:06:00

only thing that will probably do me dirty

1:06:02

is the fact that the frame wasn't really

1:06:04

taken care of in the time before I

1:06:06

had control over the truck. So

1:06:09

it's got a lot of rust below on the

1:06:11

frame. And that, it

1:06:13

might... that's not what you want. It's like

1:06:16

a cancer, you know,

1:06:18

it's got bone cancer. I was

1:06:20

peeking under it, I was crawling around

1:06:22

under there last time I was in

1:06:24

LA and it's not looking good. Yeah,

1:06:26

dude, the thing's lasted me nine years,

1:06:28

dude. And it's been rusty

1:06:30

like that for like the last eight

1:06:33

because it was bought... it's a 2002. It

1:06:35

was bought used in like 2010 or 11 by my dad.

1:06:41

Oh, you guys bought that used. I thought you had that.

1:06:43

Yeah. So there's a lot. It

1:06:45

could have gotten bone cancer before we

1:06:47

even got it. If

1:06:50

that's still up in the air... it's 22 years old

1:06:52

and not in the desert. It is a 22-year-old

1:06:54

car. That car is almost as

1:06:56

old as me. You should replace everything. You

1:06:59

should buy a new one. A new 2002

1:07:01

Toyota Tacoma. Oh yeah. There's plenty of examples

1:07:03

out there that are untouched and sitting in garages.

1:07:05

I guarantee it. I see new

1:07:07

Mark IV Supras pop up all the time. Yeah,

1:07:09

but that's going to be like $20,000 to get

1:07:11

something like that. I

1:07:15

know. But then you have all the pieces you

1:07:17

need. And I'd do the thing where I start moving

1:07:20

stuff over. I'm just

1:07:22

saying, I'm just saying you could, yeah.

1:07:25

Some pieces would degrade if they just sat there

1:07:27

though. Surely. Yeah. Rubber

1:07:29

likes to get all,

1:07:32

all brittle and shit. Sometimes plastic does.

1:07:34

Oh, it definitely does. But, um, yeah,

1:07:37

I mean, those dealers like to... they'll

1:07:39

sell you. I'm sure. A

1:07:41

Toyota dealership will give you all you need. Oh

1:07:43

yeah. No, I need to, I think I'm going

1:07:45

to take it to a Toyota

1:07:48

dealer and just be like, give me

1:07:50

the works. And like, 'cause last

1:07:52

time I did that, they were like...

1:07:54

they told me a really high price and they also

1:07:56

wanted to buy the car from me. So I was

1:07:58

like... I

1:08:01

said no to it and I just went to a

1:08:03

Midas. But I need to stop going to Midas, because they're

1:08:05

just kind of... Midas is like putting duct tape on

1:08:07

a... Fuck

1:08:09

yeah. A gash, a gaping wound. Yeah,

1:08:12

fuck Midas. Yeah, no, I'm a little bit...

1:08:15

Because I've probably gotten my

1:08:17

air conditioning fixed like twice now

1:08:21

And every year it stops

1:08:23

working right

1:08:26

around the summer. That's what happens, little car.

1:08:28

You just need to give it a little blast. You

1:08:30

know, you give it a little blast of refrigerant. I'm gonna have those things,

1:08:32

air conditioning,

1:08:35

like, replaced and stuff. Do the works, dude,

1:08:37

do the works. Why not? I think there's something

1:08:39

really noble about driving that thing until the wheels

1:08:41

fall off. I think that if the car means

1:08:43

that much to you, I think that's something you

1:08:45

should. Yeah, I mean, I did my first ro-

1:08:47

I've done several road trips with Tucker in

1:08:49

that thing. It's got 260,000 miles on it. Fuck,

1:08:53

that's a lot of miles. I want to see

1:08:55

if I can get to a million. That's a

1:08:57

lot. That's actually a lot of miles. I want

1:09:00

to see if I can get it to a million. You

1:09:03

will certainly be the ship of Theseus at that

1:09:05

point. That will be the ship of Theseus. The odometer

1:09:08

has a million, so it isn't... yeah, that'll be

1:09:10

the one piece. That'll be the soul of that

1:09:12

car, the odometer. I'm

1:09:15

definitely... The way I

1:09:17

think about it is like, if I didn't

1:09:20

have, like, the liquid sort of freedom,

1:09:22

the liquid money

1:09:25

freedom to spend what I want

1:09:27

on the truck, it would

1:09:29

realistically be totaled by now in terms of, like, the

1:09:32

price of some of these repairs that

1:09:34

I needed to do on it. But

1:09:36

it's just like me and Jambo. It

1:09:38

just is... it's almost like a belligerent

1:09:40

journey that I'm on where I'm just like

1:09:42

I'm gonna keep it going But

1:09:44

I really do think that I want to get on

1:09:48

on The side I

1:09:50

would want to get another vehicle. That's like a

1:09:52

nice vehicle to drive I think I would want

1:09:54

to get like a ribby and no dude no

1:09:57

You'll never drive that

1:09:59

fucking Tacoma again. Really?

1:10:02

You're getting a car in the same class that's

1:10:05

light years ahead. You're never gonna fucking... No,

1:10:07

no, I wouldn't get the truck Rivian.

1:10:09

I would get the SUV. It's

1:10:12

still in the same class basically. Like

1:10:14

if you get this, you're never driving

1:10:16

that truck again. Dude, this is gonna

1:10:18

drive like a sports car. You never...

1:10:20

your truck drives like a fucking broken-

1:10:22

ass old farm truck. I'm

1:10:24

telling you, no matter how much that car means

1:10:27

to you, if you get a new

1:10:29

big car that's electric

1:10:31

and does everything you need to do, you're

1:10:34

never getting in that Tacoma again. You

1:10:36

know it, and that will be

1:10:39

the death of it, because that truck will just start

1:10:41

to rot. You should get a fun... You should get

1:10:43

a Miata. I'd love seeing you try and fit

1:10:45

into a Miata. A Miata? Yeah.

1:10:48

No, that's... I think the Miata,

1:10:50

dude. The Miata is the most

1:10:52

fun car you can have and

1:10:54

it's so cheap. It's so cheap.

1:10:56

Now you're trying to promote like

1:10:58

your shit. Your

1:11:01

fucking thing. Hey,

1:11:03

look, Ted, we need... Well,

1:11:06

well, I could. The

1:11:08

problem is, Ted, that, like, the

1:11:10

Rivian can go zero to 60

1:11:12

in three seconds or whatever, it's quick.

1:11:14

Like you can get a Hyundai

1:11:16

or a Kia that can go

1:11:18

zero to 60 in two

1:11:20

seconds now. Like, electric cars are

1:11:22

making everything so... they're sucking the

1:11:24

life out of everything. What you need,

1:11:27

you need to learn manual and

1:11:29

you need to get a Miata that you

1:11:31

don't fit in and you need to start

1:11:33

taking pieces off like ripping them off because

1:11:35

because that's what gives you more room.

1:11:37

You need to open up the fucking seat

1:11:39

of that NA Miata, Ted. You need to

1:11:42

get a little spoon and you need to

1:11:44

start digging out the foam and

1:11:46

you need to replace the seat with a

1:11:48

deeper seat. We move the rails back a

1:11:50

little bit and oh my God. Oh I

1:11:53

could see it now Ted. I can see

1:11:55

it. Is that what you did? Did you

1:11:57

pull out? Yeah, yeah totally a thing that

1:11:59

people do. It's called a foam-ectomy. Oh.

1:12:02

When you don't fit in the Miata,

1:12:05

the solution for Miata owners is not

1:12:07

to get a car that works and

1:12:09

you fit in, it's to dig the

1:12:11

seat out. That's

1:12:14

crazy. So that you sit deeper in

1:12:16

the car. I

1:12:19

did a very aggressive foam-ectomy today.

1:12:21

Yep. Yep. That's

1:12:24

wild. Yeah, no, I definitely wouldn't fit in a Miata. You could.

1:12:26

I'm the size of a building, dude. You

1:12:29

could with enough effort. I mean, are you a

1:12:31

long torso or a long leg guy? I don't

1:12:34

know, Tucker, what do you think? He's

1:12:37

just fucking massive everywhere, dude. I don't know, let's

1:12:39

look at my scenario here. Yeah,

1:12:41

let's do a little analysis here. Oh,

1:12:45

dude, those shorts. Shorts are fucking jump-scares every

1:12:47

time. Yeah, those shorts lead me to believe

1:12:49

he's in this. Tall white socks, dude. He went

1:12:51

from like 25 to like 65 overnight. I asked

1:12:54

you to talk about proportions, not

1:12:56

my fucking outfit, you pieces of shit. I

1:12:58

think he's pretty like middle

1:13:00

of the road. He is proportionate,

1:13:02

so maybe a little longer legs. That's some

1:13:05

nice proportions. Yeah. What were you saying

1:13:07

about my outfit, Tucker? You were being a brat. You were being

1:13:09

a brat, weren't you? Yeah,

1:13:11

I was, because your outfit, you went from

1:13:13

a 25-year-old to a 65-year-old, just like

1:13:17

that. When I... You got tall

1:13:19

white socks on and you're wearing shorts? Dude,

1:13:22

I like looking like a dad from the 90s. I

1:13:25

like looking like I'm coming straight out of

1:13:27

a picture book that your parents are showing

1:13:29

you from when you were a baby. If you

1:13:31

are going to do that, at least tan

1:13:33

those fucking legs. Get some sun. You're

1:13:36

one to fucking talk, Whitey. Whitey,

1:13:39

put your hand... Look at this. You're one

1:13:41

to talk, cracker. Oh, I

1:13:43

got to go, guys. You're

1:13:46

the whitest one here. It's in your

1:13:48

genes. And he lives in

1:13:51

sunny L.A. Isn't that crazy? I know. You

1:13:54

guys are great. You guys are a real peach to talk to,

1:13:56

you know that? All right, you'll learn manual

1:13:58

on my Miata next time you're in the

1:14:00

UOC. Okay, as long as you're okay with me burning

1:14:02

out your fucking clutch, then yeah. Yeah, that's fine. There's

1:14:04

so many of them and they're, like,

1:14:07

dirt cheap, so we'll just... Yeah, I have

1:14:09

wanted an excuse to learn it because

1:14:11

when I was a kid, my dad,

1:14:13

he had this nice little BMW that

1:14:15

he liked and

1:14:18

he drives manual. And

1:14:21

he would have me do this thing where if we were going to

1:14:23

like a Red Sox game or something like that, he

1:14:25

would, I would be obviously in

1:14:28

the passenger seat and then whenever he

1:14:30

was shifting gear, I'd be in control

1:14:32

of the actual shifter and then

1:14:34

he'd be like, he would tell me which one to go to,

1:14:36

so he'd be like, we're going to three, and I'd go. And

1:14:39

it was like a big deal for me. I was like

1:14:41

making sure that it was like, that was like my job.

1:14:43

And honestly he was putting a lot of trust in me

1:14:46

too, because, like, I could have wrecked the car

1:14:48

if I didn't do it right. God,

1:14:50

you could fuck that thing up big time.

1:14:52

Yeah, I accidentally put it in first going 45

1:14:55

miles an hour. Yeah, that'll do

1:14:57

it. I took it very

1:14:59

seriously. I hope you'll take it

1:15:01

seriously when you visit. Yeah.

1:15:05

Because look, I love

1:15:07

that thing. I cherish that vehicle. Yeah, I'll take it

1:15:09

seriously. I just, you know, the shifting of the, you

1:15:12

know, it's like what, you lift off the gas, you put

1:15:14

the shifter on, you move it, then

1:15:16

you do the back? Yeah, I...

1:15:18

actually, that's all you need to know. Is

1:15:21

that it? You press

1:15:23

the clutch in, you shift the

1:15:25

gear, you match the RPM and you lift

1:15:27

off the clutch. You

1:15:29

take your foot off the gas though, right? Yeah, Ted's got

1:15:31

it. Yeah, but then you put your foot on the gas

1:15:34

to kind of... You find the

1:15:36

bite again. Yeah, I think the thing that I

1:15:38

worry about is just the sweet spot zone. That's

1:15:40

because it's different for every car, right? Changes

1:15:43

for every car, you got to find it every

1:15:45

time. I think the better you understand the mechanical

1:15:47

process of it, the easier it'll be for you.

1:15:50

And the more you understand the car itself,

1:15:54

you don't even need to look, you just

1:15:56

do it by ear, baby. You got that

1:15:58

Miata down with the top down. Oh my

1:16:00

god, soft top gone, no hard

1:16:03

top right now, dude. You're hearing that

1:16:05

engine purr right in front of you.

1:16:08

1.6-liter, fucking

1:16:11

90 horsepower. My

1:16:13

god, that thing, you floor it and you're

1:16:15

still going the speed limit with the wind

1:16:17

rushing through your hair. No cop will ever

1:16:19

pull you over. You have as much fun

1:16:21

with that thing as you like. Oh, wow.

1:16:23

Oh, wow,

1:16:25

well, thank you for that. No airbag, though.

1:16:28

No airbag? What? Yeah,

1:16:31

that thing, that thing will crash and you'll

1:16:33

die. Yeah. We

1:16:35

are AudioStack.ai. Combining

1:16:37

AI writing, the best synthetic voices,

1:16:39

with production and delivery. Be it

1:16:41

ads, podcasts, or VOs for video.

1:16:43

Just like this ad, how would

1:16:45

you utilize AI audio? For

1:16:48

free! Go

1:16:50

to audiostack.ai/contest.

1:16:52

Give us just a few details and

1:16:54

you could replace this spot and be heard

1:16:56

by millions. See webpage for

1:16:58

Ts and Cs.

1:17:01

audiostack.ai/contest.

