Apple's A.I. Embrace & Elon's Tantrum

Released Friday, 14th June 2024
Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

Start your summer road trip at Midas and get up to

0:02

$30 off your next repair service. Plus get a free closer

0:04

look vehicle check to make sure you're road trip ready. So

0:07

if you need a brake service and alignment check or tune

0:09

up, hit up Midas for up to $30 off. For

0:12

more details, request your appointment at midas.com.

0:15

TuneIn is the audio platform with

0:18

something for everyone. News. In order

0:20

to secure convictions in a court

0:22

of law, it is essential that

0:24

we conclusively... Sports. The clock at

0:26

four. Don't ditch. The

0:29

step back three. You bet. Music.

0:36

And even podcasts. Whatever you

0:38

love, hear it right here on

0:40

TuneIn. Go to tunein.com or download

0:42

the TuneIn app to start listening.

0:50

Aloha, namaste everyone and welcome to Impolitic

0:52

with John Heilemann, my new podcast on

0:55

politics and culture for Odyssey and Puck,

0:57

burgeoning newsletter empire covering the corridors of

0:59

power and influence in America, where

1:01

I am now the chief political columnist cranking

1:04

out a weekly dispatch that shares

1:06

its name, Impolitic, with the show. Column

1:09

arrives via email every Sunday night if you

1:12

sign up, and you can do that by

1:14

firing up your browser and going to puck.news

1:16

slash Impolitic and taking advantage of a

1:18

20% discount on a Puck subscription

1:21

that I'm offering to all you for

1:23

simply being such splendid people. Everyone

1:25

listening to this podcast is by

1:27

definition splendid, so you deserve a break today,

1:29

a discount on a Puck subscription.

1:31

Go get it. Some of you may

1:33

recall that I used to host a podcast called Hell and High

1:36

Water. This new show, this one

1:38

right here, Impolitic with John Heilemann, is just

1:40

like that one, only it's bigger and it's

1:42

better, dropping new episodes not

1:44

once a week, like the

1:46

old one, but twice a week, every Tuesday and

1:48

Friday, featuring deep, rich, candid,

1:50

and yes, impolitic conversations with the

1:52

people who shape our culture in

1:55

politics, entertainment, business, tech, sports, and

1:57

media, and the

1:59

A-list journalists and authors who chronicle

2:01

their lives and exploits, two of whom

2:03

are on the podcast today.

2:05

The first is Kara Swisher, the most

2:07

influential technology journalist of the digital age,

2:10

formerly of the Wall Street Journal, Recode,

2:12

and the New York Times, now editor-at-large

2:14

at New York Magazine, host of the

2:16

podcast, on with Kara Swisher, and co-host

2:18

of the Pivot podcast with Scott Galloway,

2:20

also author of the recent bestselling

2:23

memoir Burn Book, a tech love

2:25

story. The second hack

2:27

who's on the show, and I say that

2:30

lovingly, is Joe Klein, who for the past

2:32

50, count him 50 years, that's a full

2:34

half century folks, starting at the Real Paper

2:36

in Boston and continuous stints at Rolling Stone

2:38

and New York Magazine and the New Yorker

2:41

and both, both Time and Newsweek, has

2:43

been among the best and most celebrated political journalists

2:45

of our time, as well as the author of

2:47

one of the most famous political novels ever, Primary

2:50

Colors. Kara is

2:52

here to help us get our heads

2:55

around Apple's big, long-awaited, maybe transformational embrace

2:57

of artificial intelligence and its products, announced

2:59

this week and met with lusty enthusiasm by

3:01

Wall Street and pretty much everyone else, except

3:04

for Elon Musk, who

3:06

crapped all over the announcement for reasons

3:09

that Kara lays out on this episode

3:11

with her usual degree of restraint and

3:13

mealy-mouthedness. To

3:16

be clear, neither of those qualities are

3:18

Kara Swisher's qualities. Joe

3:20

Klein's visit to the pod is due

3:23

to a sadder circumstance, the death this

3:25

week of our former colleague and longtime

3:27

friend, Howard Fineman, 30-year veteran of Newsweek,

3:29

ubiquitous on-air presence on

3:32

MSNBC, Howard passing

3:34

after a long, brave and

3:37

self-pity-free struggle with pancreatic cancer.

3:40

So after you get the smarts

3:42

on the tech stuff from Kara, be sure

3:44

to stick around for Joe and my celebration

3:46

of Howard's life and our mourning of his

3:48

loss, along with our reflections

3:51

on the journalistic era that Howard represented

3:53

as well and as fully as anyone

3:55

we know. All of it, right

3:57

here on this episode of Impolitic with John

4:00

Heilemann is coming at you in three, two,

4:02

one. At

4:05

Apple, it's always been our goal

4:07

to design powerful personal products that

4:09

enrich people's lives by enabling them

4:11

to do the things that matter

4:14

most as simply and

4:16

easily as possible. We've

4:18

been using artificial intelligence and machine learning

4:20

for years to help us further that

4:23

goal. Recent

4:25

developments in generative intelligence and

4:27

large language models offer powerful

4:30

capabilities that provide the

4:32

opportunity to take the experience of

4:34

using Apple products to new heights.

4:37

Introducing Apple Intelligence, the

4:40

new personal intelligence system that makes

4:42

your most personal products even more

4:44

useful and delightful. So

4:47

that's Tim Cook, Apple CEO, rolling

4:49

out Apple AI. We're here with Kara Swisher.

4:51

Kara, thank you for taking

4:54

the time. When

4:57

you see Tim Cook do a thing like

4:59

that, a big advancement, like as someone who

5:01

knew Steve Jobs, covered Steve Jobs, do

5:03

you think, do you feel the tingle? I

5:06

thought Steve was masterful at doing those things. You've been to a

5:08

lot of them. It hardly

5:10

matters because the company's worth at least 10

5:12

to 15 times more than

5:14

when Steve Jobs was running it, right? If you

5:16

do things on those terms. So I always laugh

5:18

when they're like, oh, he's workman-like, oh, he's boring.

5:21

I'm like, well, look at what the company's worth today. The stock

5:23

has had a real run this past week because

5:26

of these AI announcements from June 11th.

5:28

It was sort of at a low, but now

5:31

it's moving quite substantially. But as a market cap

5:33

of $3.35 trillion, one

5:36

of the most valuable companies in the world. I

5:39

believe as of today, it's back in the top spot

5:41

as of this moment. Maybe, it could be. We should say we're

5:43

taping on Wednesday, yeah. In any case, they're

5:46

doing great under Tim Cook and he's

5:48

made a lot of very canny moves

5:50

that might not be as exciting and

5:52

obviously as groundbreaking as an iPhone. But

5:54

I would say the watch and the

5:56

AirPods and a bunch of things have

5:58

been really amazing businesses, as has

6:00

the services business, which has been impressive too. Yeah,

6:02

and it was actually kind of what I was

6:04

gonna ask you because I think he's been doing

6:06

it 13 years now. When he took

6:08

over the company, they were worth about 400 billion, as

6:11

we said, they're three and a quarter trillion now.

6:14

I mean, I feel like there's still a little Rodney

6:16

Dangerfield with him. I'm not saying from you or from

6:18

me, but that there's still this kind of, you

6:21

still hear that cliche of workmanlike, and I

6:23

think it feels like it's time for a reassessment.

6:25

People are almost gonna have to step back at some

6:27

point and go, man, Tim Cook

6:29

has done a hell of a job.

6:32

Everyone who really understands it does. I think

6:34

there's not a person that doesn't understand what,

6:36

both him and Satya Nadella, who took over

6:38

for Bill Gates and Steve Ballmer, and

6:40

I consider Steve a founder really in many ways.

6:44

But I think, he's of

6:46

course killed it too. These are two

6:48

executives who've really taken their companies to

6:50

new heights with products and expansion

6:53

and improving the culture and everything

6:55

else. I think Microsoft had a bigger climb

6:57

on that issue. But

7:00

I just feel like, I just look at the

7:02

numbers and I'm like, okay, they have a bigger

7:04

market share, they have a bigger business. They've

7:07

managed to keep the Apple, the iPhone

7:09

in the center of things, despite eventual

7:11

difficulties in that, but adding value to

7:13

it. It's just like this last week's

7:16

announcement of Apple Intelligence, which I

7:18

thought was very well done and

7:20

the stock market's reflecting that. So

7:23

let's just talk about that for a second, because

7:25

obviously there's two large things here. Apple,

7:28

which has all kinds of iconic quality,

7:30

and then there's AI, which has lots

7:33

of hope and lots of fear. Just

7:35

focus on what they've, on

7:38

how important this announcement is for Apple,

7:40

for the industry, for AI, for us. You

7:43

know, I think, what Apple does a

7:45

lot is, they're very fast followers,

7:47

right? As he noted, I think, or

7:50

maybe he noted to me, and I said

7:52

something everybody knows, they weren't first to the

7:54

watch, they weren't first to AirPods, they

7:57

weren't first to the phone even, right? They

7:59

hadn't, they... They weren't first to a lot

8:01

of things. Well, maybe they were early with the Graphical

8:03

User Interface. Yeah, pretty much. They often take

8:05

things, they don't have to be the pioneer

8:08

of things; very few things are they

8:10

the pioneer of. And so they tend

8:12

to iterate around things and they don't do it

8:14

if it's not good enough. And I think in

8:16

this case, one of the problems

8:18

with AI is revenues. There's a lot of valuation

8:21

when it comes to AI, but business, not so

8:23

much, right? It's sort of a, you know, except

8:25

for the people who sell the chips, Nvidia.

8:27

So it's, you know, there's not so much gold

8:30

in the gold rush, there's just a lot of

8:32

pickaxes sold, right? And that's who

8:34

gets really rich. And in that case, it's Nvidia.

8:37

But at this moment in time, but

8:39

what you have to look at is where

8:41

it's going, right? And where it's going. And I think one of

8:44

the things that Apple does very well is they're a fast follower,

8:46

because everyone was sort of, where are they in AI? Where are

8:48

they in AI? They have a lot

8:50

of strictures around AI because of privacy issues. They

8:53

have, one of their brand attributes is

8:55

privacy, both as a marketing attribute. And I

8:57

think it really quite deeply felt within the

8:59

company. And they're a hardware company. That's

9:01

the other thing. People always think of them as

9:03

something else, but they're actually a hardware company.

9:06

And services have certainly grown, but they're

9:09

certainly not a software company.

9:11

They have software, obviously. But

9:13

most of their revenues come from hardware

9:15

with increasing amount coming from services and

9:17

software. Mostly services more

9:20

than software. And so I feel

9:22

like that was fine the way they're doing

9:25

this, because now you can begin to see

9:27

how you might use AI in the real

9:29

setting versus, you know, querying

9:31

ChatGPT and getting an answer. It's

9:34

kind of by itself. And in this

9:36

case, it's critical that they have AI

9:38

capabilities within the system, because their system

9:40

is a perfect example of

9:43

how it could actually be useful as

9:46

a, I don't want to say a feature, but as an integral

9:48

part of the operating system, but not the

9:50

show, which I think everybody's making AI the

9:52

show. AI isn't the show, it's the electricity,

9:54

it's the interface, you

9:57

know. It's not what people think it's

9:59

going to be. I think personally. It's just

10:01

interesting. I think we both agree about this.

10:04

It's often the case that the people who innovate, who

10:06

really break the seal on some piece of innovation,

10:09

technical innovation, don't end up becoming the behemoth in

10:11

the industry. That's been a story of our business.

10:14

Intel is a kind of a rare exception to

10:16

that for a long time in chips, but a

10:18

lot of these companies are companies where some company

10:20

has the breakthrough and

10:22

then a larger company. Well, probably tried.

10:25

I think they really did innovate the phone. I think

10:27

the iPhone was the single most important device. Sure.

10:30

Sure. I think they're

10:32

very smart to look at how does it fit in

10:34

the system. How does it work for them? That's

10:37

the same decisions on picking partners,

10:39

right? How does it help them

10:41

to buy or build or rent,

10:43

in this case with ChatGPT,

10:46

and then protect their users who ... It's

10:48

just another reason to stay in the Apple

10:51

ecosystem. Sure. They have to

10:53

be, whether... Sometimes it works. The health stuff is kind

10:55

of okay and some of their workout

10:57

stuff, but music has worked out

11:00

rather well for them, right? They make stuff

11:02

that we covet. It's

11:04

like there's almost no one in tech who

11:06

understands design and makes things that

11:10

become objects of our desire and fashion. That

11:13

is a thing that ... The

11:16

whole universe around Android is still, I

11:18

would say, go ick. I

11:20

would go ick, right? Exactly. I

11:22

don't know about Android, but many years ago, I've

11:24

told this story when Microsoft

11:27

made the Zune, right, if you recall.

11:29

It was brown. It was kind of poopy brown color.

11:32

Walt brought a demo to Steve,

11:35

and very dramatically, he

11:38

placed it in Steve's hand and Steve dropped it as

11:40

if it was made of lava. I cannot touch

11:42

this ugly piece of shit, which

11:45

kind of reminds me of one time, very early

11:47

talking about Fred Rosen's office.

11:50

He's like, chrome, chrome everywhere, just

11:52

like people who love design are

11:54

just as painful. I

11:57

think one of the things they do do well is more

11:59

... Other than just being beautiful and I think

12:01

that's what people know them for is

12:03

– Beauty and functionality. They're very useful, right? Yeah.

12:06

Yeah. I think that's the stuff that has succeeded. And

12:08

it takes them a little bit in terms of

12:10

the watch. I didn't have the initial watches

12:12

because I didn't see the efficacy of it.

12:15

Now I find it indispensable for a number

12:17

of reasons, including finding my iPhone, the Deuce

12:19

of Time. That's really

12:21

what I use. I'm so thrilled to have

12:23

it. But it took me a minute and

12:26

then it made sense. And I think this

12:28

AI stuff is critical to things. That's

12:31

a good place for me to focus on. When

12:34

Tim talked about it the other day, he said it had to be

12:37

all the Apple things, powerful,

12:39

intuitive, integrated, most importantly, personal

12:41

and private. And then we saw

12:43

a bunch of flashy demos of what it might be

12:45

good for. When you looked at what they were saying,

12:47

here's what it's going to give you as a user

12:50

benefit. What did you look at and say, I can

12:53

see how I want to use that thing. I'm not being

12:56

skeptical. I'm just curious. Really, interestingly,

12:59

the idea of getting rid of apps.

13:02

I think it made me see a world where there

13:04

was no apps. You didn't fire

13:06

up the Google. You didn't fire up

13:09

the, including Apple apps that you might

13:11

use, all kinds of different Apple

13:13

apps, Find My, this and that. That it

13:15

starts to integrate just the way AI

13:18

pulls things out of search instead of going,

13:20

instead of taking a place where you can

13:22

find something, it finds it for you. This

13:25

brings the functionality right there in an integrated

13:27

way. They kept using the word integrated. I

13:30

don't think it was a mistake that

13:32

they were using integrated constantly. Because

13:35

a lot of things, you go to your phone and

13:37

you go, click Google Drive,

13:39

click files, click Safari,

13:42

whatever have you. I'm just looking at the front of my

13:44

phone: Google Maps. It would integrate

13:46

so that you'd go, I want to go to

13:48

a really good Chinese restaurant near my house. Can

13:50

you get, you know what I mean? Instead of opening

13:53

it up, there's lots of ways to search that. One

13:55

by going to Google, one by going to Maps, one

13:57

by, you know, all kinds of things.

14:00

it would bring it to you. So it

14:02

kind of destroys the app ecosystem in a

14:04

way. Especially the ones you use

14:07

regularly. You know, like maps and search

14:09

and things like that. It

14:11

feels like that ecosystem's ripe for destruction in the

14:13

sense that you think about your phone where you've

14:15

got 534 apps, you use seven of them, and

14:17

the seven, if you could get away from them,

14:19

you'd rather just not, right? Or they would integrate

14:21

with each other. So what

14:23

you do is you get an invitation in the mail

14:25

and you can sometimes put it in your schedule and

14:27

sometimes not. It would just be like, hey,

14:29

you just got an email, do you want to go to this party?

14:31

Yeah, I put it in my schedule. Like,

14:34

and that's the idea of a great, competent

14:36

assistant, right? And they've tried over the

14:39

years, because I've had this holy

14:41

war against Siri, because I think it's so stupid.

14:43

Siri is so bad and so useless. It's

14:45

always wrong, it can't find names for me.

14:48

It's sort of a, I wish they'd rename it. Have you

14:50

ever gotten as mad at it as Larry David

14:52

did on Curb Your Enthusiasm? Oh no, he was

14:54

great. But I kept at

14:56

it, I'm like, what is wrong with you? I

14:59

said, call mom and they're like, you have to have mom

15:01

in your contacts. I'm like, it is in my contacts. Like,

15:03

what is, I call them all the time. That

15:06

Siri could then start to tell you,

15:09

oh, your mom called you a couple of times, and this

15:11

is what she said in the message. She

15:13

seemed agitated, Kara, she seemed agitated. You've gotta get

15:15

back to her soon. Yeah, or would

15:17

you like me to send a note to her? Would

15:19

you like me to send flowers? Would you like me

15:21

to, you know, that's the kind of thing you could

15:23

see Apple being good at doing is integration, because

15:26

a lot of it is sort of all on the

15:28

phone, but not integrated, and you have

15:30

to constantly be doing one act,

15:32

then another act, and then another act.

15:34

And so I find that really interesting. I'm

15:37

looking for it at the moment where it'll say, you know,

15:39

your mom called. I think this time she's just kind of,

15:41

she's bored. You don't need to get back to her. Yeah,

15:43

you can, you can wait for 48 hours to get back

15:45

to her. If you gave it permission, right? And I think

15:47

that's where the power for Apple is here. And I think

15:49

that's why their stock is going up, is that very

15:52

few companies have permission from their

15:54

users to feel comfortable in the

15:57

ecosystem. And, you know, you feel

15:59

more... trusting of Apple

16:01

than you do of other things, right? I

16:03

think you have a good relationship also with

16:06

Amazon even though it is, you know, tracking

16:08

everything you do. You feel like it's a

16:10

good experience, right? So you trust

16:12

it. So then they can extend, would

16:14

you like your drugs, would you like

16:16

your pharmaceutical, you know, your medicines delivered

16:19

to you. Would you like your drugs?

16:21

That's a different business. But

16:24

would you like this to happen? And so

16:26

they start to extend services to you and

16:28

rather than go to one janky startup

16:30

after another, you would go to

16:33

them. And I would

16:35

actually pick them over a lot of companies.

16:37

There's a couple. You have, I guess, Uber, but

16:40

that's sort of a functional utilitarian relationship. Get me

16:42

the car. But why couldn't that say, would you

16:44

like me to call Uber to get a

16:46

car for you? When do you need it? I

16:48

see from your phone from your schedule, your Google

16:50

schedule that you're arriving this time, would you like

16:53

the car waiting for you? Right. Yeah. And then you

16:55

just say yes or no and that's, you know,

16:57

that's really the dream of course for many

16:59

years. So there's a, you know, as

17:01

you said, the market loves this. A

17:03

lot of the press coverage

17:06

has been solid. Yeah, it's good.

17:08

The loudest critic has been your

17:10

friend. I use that term, I

17:12

put quotes around it, your friend,

17:14

your long time

17:17

jousting partner, let's put it that way: Elon Musk, who

17:20

put it and I just I'm going to say these couple

17:22

things then you can tell me what you think is

17:24

going on with this because his immediate thing was, hey, if

17:26

they integrate this, he put it on X, he said,

17:28

or Twitter, whatever you want to call it, if Apple

17:30

integrates OpenAI at the OS level, Apple devices

17:32

will be banned at my companies. Unacceptable

17:35

security violation. Visitors will have

17:37

to check their Apple devices at the door, where they'd be

17:39

put in a Faraday cage and then he trashed

17:41

their relationship with OpenAI. So that's one thing.

17:44

Then literally within hours, he

17:46

drops this lawsuit that he's had against

17:48

OpenAI. I don't understand what's going on. So

17:50

just tell me what you think of the

17:52

criticism. First of all, the last part because

17:54

discovery was about to start and nobody

17:57

wants discovery like Elon Musk doesn't want

17:59

discovery. That's why he gave up, you know,

18:01

when he was fighting the

18:04

lawsuit to not buy Twitter. He

18:06

did it right before he had to talk in court.

18:08

He does not want discovery, this guy. I mean, who

18:10

knows what's there? I mean, you saw the journal story

18:13

today. I sure did. But

18:15

for those who haven't, The Wall Street Journal dropped

18:17

a doozy of a story this morning, Wednesday

18:19

with the headline, Elon Musk's boundary

18:22

blurring relationships with women at SpaceX.

18:25

The subhead is the billionaire founder had sex

18:27

with an employee and a former intern and

18:29

asked a woman at his company to have

18:31

his babies. So yeah. There's

18:34

discovery everywhere on this fella. And so

18:36

that's one. And he also doesn't have a case,

18:38

right? It was going to be

18:40

dismissed probably. And then he would have been

18:43

embarrassed. So instead of being embarrassed... What was the lawsuit? What

18:45

was the essence of the lawsuit? They were supposed to be

18:47

good hearted and they weren't. They didn't do what they said.

18:49

They, OpenAI. Yeah, they,

18:51

OpenAI. So he started it, he says he

18:53

did it. And by the way, let me tell you,

18:55

he was there. He was a critical member. He wasn't

18:58

the only one, but in Elon's world, he's a ready

19:00

player one on every single thing that exists on the

19:02

planet. And he put dough into it. He did. No,

19:05

listen, he is the very first person

19:07

who started talking about AI. I

19:09

know hands down the first person. The second

19:12

was Sam Altman and the third was Reid

19:14

Hoffman. So I

19:16

mean, I was there when they were forming it. And

19:19

so, you know, it was, it

19:21

was, and he actually talked about it extensively at one

19:23

of either the Code or the All Things D

19:25

conference about the dangers of AI very

19:27

early. So Elon was mad at

19:30

OpenAI because he didn't think they were doing business the

19:32

way he thought they were going. No, I think that's

19:34

a lie. I think he said, I think he

19:36

had a tantrum and left. Of course, what he says and

19:38

what is real differ; he's quite

19:40

a fabulist. And I'm using

19:43

that in a very loose term. I

19:45

mean, really very, very, very, very, that's

19:47

very low-level criticism, given. Yeah, he's

19:49

mendacious is what he is almost persistently.

19:53

And on lots of topics,

19:55

it's not just that one, of course. Oh,

19:57

full self-driving is coming next week. Oh, is it?

20:00

It's really interesting. He's very jazz

20:02

hands with everything he does. So

20:04

in this case, I think he

20:06

threw a tantrum because he always seeks more

20:08

control, right? And he made a move on

20:10

OpenAI. And unfortunately for

20:12

him, Reid Hoffman is up

20:15

to him, right? So Reid

20:17

and others at that company, including

20:19

Sam, were like, no, you can't

20:21

have this. You can't have every single cake

20:23

in the store, Elon, and you can't have

20:25

this one. And so what he's doing right

20:27

now around Tesla trying to get the 56

20:30

billion and more shares and more control, he's

20:32

done his whole career. He wants full control

20:34

of everything. And they said no. And so

20:36

he threw a tantrum. And actually,

20:38

I believe he actually threw a tantrum, yelling

20:41

or something at one of the meetings I read in

20:43

one of the books. And then

20:45

he left, huffed out. And what he

20:47

did is focus on a lot of

20:49

other things, some very important, Starlink and

20:51

Tesla and stuff like that, some very

20:53

strange, like Twitter. And

20:55

then he sees this happen. The

20:57

thing he had predicted, among

20:59

other people, let me just continue to say he

21:02

wasn't the only one, but he was early, the

21:04

thing he predicted happened, right?

21:07

And so he wasn't the focus of attention.

21:09

And very much like Donald Trump

21:11

or any number of narcissists and malignant narcissists

21:13

we have in our society, he has to

21:15

say, look at me. He's the Mr. Look

21:17

at me. And so that's why he entered

21:19

the picture here to do this lawsuit. So

21:22

he dropped this lawsuit yesterday, you

21:24

said a second ago. He didn't want discovery. It

21:27

goes back to the question, I just want to loop

21:30

back to the Apple thing, which is how we started

21:32

into this, was him attacking Apple for its relationship with

21:34

OpenAI. What do you make of that criticism and the

21:36

merits? You remember he was attacking when there was

21:38

a lot of crap on Twitter, and they

21:40

were talking about taking Twitter off the App Store.

21:42

And he was attacking Apple for being anti-free

21:44

speech. There was a little, another one of

21:47

his. There's so many of them. He's so

21:49

Trump-like in that regard. It's Tuesday, it must

21:51

be. Whatever he's yelling about,

21:53

immigrants or trans people. One

21:55

day was Apple, where he was attacking

21:58

them. And instead of attacking back, Tim

22:00

Cook is so clever and so charming

22:04

and very gracious, I would say. Does

22:07

a Southern thing very nicely from

22:09

Alabama. And he invited

22:12

him to the Loop and they walked

22:14

around in their beautiful garden where

22:17

you could eat almost anything, I guess. I don't

22:19

know, whatever. It's a beautiful garden there at Apple in

22:21

the Loop. And made him lunch and

22:24

petted him. And then what did Elon

22:26

do? He did a picture, here

22:28

I am at the Loop. And all you could think of

22:30

is, oh, little boy, you needed a hug, I get it.

22:33

You know what I mean? And that's what,

22:35

and Tim, I thought, I was like, that guy

22:37

is, I wouldn't have hugged him. I'd have punched him

22:39

in the mouth, but he didn't. He

22:42

hugged him. And so

22:45

Elon was backed off immediately because he

22:47

got the attention he needed. Again, very

22:49

much like Trump and the New York

22:51

people. They'll never love him, right? So

22:53

he hates them, but he loves them,

22:55

that kind of thing. And so in

22:57

this case, they picked OpenAI, which, "I

22:59

started OpenAI, you're not picking Grok?"

23:01

Which for the uninitiated, Grok is

23:04

the chatbot rival to ChatGPT

23:06

that Elon developed at his company,

23:08

XAI, and

23:11

that he's integrated into Twitter. Maybe

23:13

they will add Grok, by the way. Maybe

23:15

they will. They will probably add lots of

23:17

services and let people pick. But

23:20

in this case, just like Google was

23:23

picked for Maps,

23:26

ChatGPT was picked. They must have done a

23:28

survey. Sam Altman is

23:30

incredibly charming. He's not offensive.

23:35

And Sam is sort of going for the Steve Jobs

23:37

kind of personality, the kind of thing

23:39

Tim is used to. And

23:42

they were easier to deal with because they're smaller and

23:44

they don't have to deal with tantrums. And next week,

23:46

he's not gonna say, Kamala

23:49

Harris is a man, don't you know it? Like

23:51

that kind of something, whatever. Whatever comes out

23:53

of his crazy mouth at three in the morning.

23:56

They're basically just both: the lawsuit

23:58

originally against OpenAI was one

24:00

kind of tantrum. And his outburst about

24:03

Apple yesterday, or two days ago, was

24:05

also another form of nonsense. This

24:07

is all just Elon being Elon, basically. Well, it's

24:09

just a Janis Ian song. He was picked last

24:12

at basketball, and he can't stand it. You

24:15

know that song. Yes, yes. I

24:17

mean, honestly, he's like a Janis Ian song. And

24:19

I think he carries those traumas into the

24:21

day to day, which we have to all

24:24

suffer from by listening to them, or not, as

24:26

the case may be, which I'm doing a lot

24:28

less of. And that, Kara,

24:30

is a sign of your maturity and your

24:32

mental health. Those are two

24:34

qualities not typically associated with Elon Musk. We

24:36

need to take a quick break here for

24:39

the advertisers. And while we do that, I'm

24:41

going to take this opportunity to go and

24:44

find that song, that Janis Ian

24:46

song, which I believe actually is called "At Seventeen." It

24:50

was a huge hit for her, maybe

24:52

won a Grammy Award in the mid-70s. So

24:55

we're going to go find it, my

24:57

crack research team. Producer

24:59

Bob. We're going to go find

25:01

that track so we can listen to a little bit on

25:04

the other side. We'll come back after

25:07

these words from those sponsors. Here we go.

25:15

At Alma, we know the connection between you

25:17

and your therapist matters. But if you're already

25:19

feeling stressed and burnt out, the idea of

25:21

trying to find a therapist you really connect

25:23

with can be overwhelming. That's why Alma's focused

25:26

on helping you find the right therapist for

25:28

you. When you browse their online directory, you

25:30

can filter your search based on the qualities

25:32

that are most important to you. Then you

25:34

can book a free 15-minute consultation call with

25:37

any therapist you're interested in seeing. So you

25:39

can get a feel for whether they're the

25:41

right fit before you commit to a full-length

25:43

session. Alma also makes it easy for mental

25:45

health care providers to navigate insurance. That's why

25:48

95% of therapists in

25:50

their directory accept insurance for sessions. So

25:52

you can find care that's affordable without

25:55

stressing about the paperwork. You

25:57

want to talk to someone, but not just

25:59

anyone. Alma is. there to help you find the

26:01

right fit. Visit

26:05

helloalma.com/therapy60 to schedule a free

26:07

consultation today. That's helloalma.com/therapy60. And

26:31

there you have it, ladies and

26:33

gentlemen, the aforementioned lines from the

26:36

1975 Janis

26:38

Ian global mega hit "At Seventeen," a

26:40

classic ballad about teenage angst and

26:43

adolescent mean girls' cruelty that on

26:45

this show will be known forevermore, thanks to

26:48

Kara Swisher, as Elon's Anthem. Kara,

26:52

it's safe to say that you and Mr.

26:54

Musk have had a complicated relationship that

26:57

wasn't always as scratchy

26:59

as it is now. I really

27:01

admired a lot of this. As you know, I still

27:03

do. Yes. I mean, much to my

27:05

detriment, everyone's like, you liked him before. I'm like, I

27:07

did. Well, we got you. You liked him. I'm

27:10

like, I did. I'm saying I did. He

27:12

did. He really did. He really did. He

27:15

didn't invent it, but he definitely pushed. If you had to

27:17

pick one pioneer, he'd be it. I think

27:19

you have to be able to drive down

27:21

the street and see the cyber truck and

27:23

think it's just one of the most ridiculous

27:25

vehicles ever. It is. And learn

27:27

that it's sold about 3,000 of them and that

27:29

he's lost a ton of money and be like

27:31

really happy about that. And also at

27:33

the same time acknowledge that if not

27:35

for what he did with Tesla, the

27:38

electric car market would have been five years behind where it

27:40

is now. I think that's fair to say. No

27:43

question. He's as complicated as a

27:45

lot of the people you've written about over time. I

27:48

think he's moved from complicated to hateful, honestly.

27:50

That's the thing. It's really, it's not

27:52

complicated anymore. There's a lot of, like Sam Altman's complicated,

27:55

right? Sure. I don't disagree

27:57

with you. What I mean is, his

28:00

picture of being purely an asshole

28:03

is complicated by the fact that he has done

28:05

some things that are genuinely interesting and have been

28:07

genuinely good for the planet. That's

28:09

the complicated part. So

28:12

we're recording this on Wednesday, which

28:14

is the day before the

28:16

big shareholder vote at Tesla

28:18

over Elon's insane, like, you

28:20

know, stratospheric head exploding

28:22

45 billion. That's

28:24

45 billion with a B, his pay package.

28:27

So I'm not going to ask you to

28:29

make any predictions or leave any

28:31

hostages to fortune that could come

28:33

back to haunt you, Kara, but

28:35

even so. You

28:39

know, I don't know. That to me

28:41

is the most remarkable thing is it's

28:43

touch and go. It's a 50-50 kind of thing. And

28:46

in that way, he's lost, right? And

28:50

because before it was like, of course, they'll give him

28:52

everything he wants, but now they're maybe not so

28:54

much. Well, and that brings us back to

28:56

the piece that you mentioned earlier, that Wall

28:58

Street Journal piece, which has Elon. I

29:01

mean, it's really kind of incredible what is in it. It

29:04

says he got involved with an intern who

29:06

he then put on his executive staff. There's

29:09

a SpaceX flight attendant who claims that

29:11

he exposed himself to her and then

29:13

offered to buy her a horse, a

29:16

horse in exchange for sex. They've

29:18

got a former employee who says that Elon asked

29:21

her to have his babies. And

29:23

then another former employee who says she

29:25

had a month-long sexual relationship with

29:27

him while she directly reported

29:29

to him. There is no

29:32

CEO in America, Kara, who

29:34

could survive this piece in the

29:36

Wall Street Journal, except for Elon

29:38

Musk. That's correct. He's like Trump.

29:40

He's the Trump of tech, essentially. Here's the issue, though.

29:42

Why is he? Why is he surviving? Because

29:45

our whole Overton window was changed on this stuff. That's

29:47

why. But no one else can survive it. Donald

29:49

Trump certainly did. Did porn star hush money?

29:51

No other CEO. No other CEO. Okay.

29:55

All right. But he's a celebrity,

29:57

sure. But it doesn't matter. He controls everything, so it

29:59

doesn't matter. He of course he can survive

30:01

it because he runs everything and he has

30:03

control of everything. They're all private companies without

30:05

him. They're nothing. So of course

30:07

he can survive. At some point everyone's

30:10

like, oh, now he's going to get in

30:12

trouble over something, the gay thing or whatever negative

30:14

piece of shit he tweeted. And

30:16

someone's like, oh, now he's in trouble. I'm like, how?

30:19

Because who? Who? Where's the

30:22

great savior coming to save us and vanquishing

30:24

this, you know, this person? Because

30:27

it doesn't exist. He

30:29

has one public company. It's called Tesla. I'm

30:31

sure. I guess I wonder whether, at some point,

30:33

in the one publicly traded company that he controls,

30:35

he can go too

30:38

far and lose the faith of

30:40

those shareholders. Look, first of all, this board is

30:42

not a board. I don't know what they are. They're

30:44

making, I mean, I thought the more, you know, of

30:46

all the stories the journal's been doing, which is interesting,

30:48

the drug one was of course everyone knew it, like

30:50

everyone knew about that. And I'm glad they finally did

30:52

the reporting to get it down there and took the

30:54

risk to write it. It matched completely everything I've

30:57

heard. Now I didn't do the reporting they

30:59

did. But everything I've heard,

31:01

absolutely. Everything in

31:03

there was stuff I had previously heard, then

31:06

again, didn't do the reporting on. The

31:10

more devastating story was how much money

31:12

the board members were making off of

31:14

this guy, right? He's like the golden

31:16

friggin ticket for Steve Jurvetson or Robyn

31:18

Denholm or any number of people on

31:20

that board. Antonio

31:23

Gracias. Anyway, all of them

31:25

enable this guy,

31:27

sort of a little like Elvis. And

31:30

here, take another drug, Elvis. No problem,

31:32

Elvis. And I

31:34

don't think he's going to end well because of that. I think he

31:36

has a lot of people with nobody saying no. And

31:39

so there isn't someone to, this isn't

31:42

a board. And so when people say, he's going

31:44

to, this is going to get him, I'm like,

31:47

okay, who? I always like

31:49

explain to me how that's going to happen because I

31:51

don't see it happening. So

31:53

he has been able to operate with

31:55

impunity in those ways for all the

31:57

reasons that you just laid out. Look,

32:01

Twitter slash X or whatever you call

32:03

that platform now, it's like an 18th

32:05

century open sewer system at

32:07

this point. Oh, I see. I call it

32:09

a Nazi bar, but you can pick whatever you want. Take your

32:12

pick, right? And yet, I

32:14

see this thing in Axios the other day.

32:16

He's the most important, Elon Musk, the most

32:18

important business player in US politics right now.

32:21

Do you agree with that assessment? Yes, I do because they

32:23

cannot get off of Twitter, the political people. I

32:25

think they're very, I've always, listen, you know this,

32:27

Twitter has always been an incredibly small business, but it

32:30

had high amounts of influence among the

32:32

politicians. So among politicians, yes, they

32:34

like the Nazi bar. Among

32:36

most people, no, they do not. And including

32:38

packaged goods, advertisers, anybody. I was in an

32:40

ad event and I was like, who here

32:42

is going to invest? And none of them,

32:44

right? There was a big ad event, big

32:46

ad buyers. Absolutely

32:49

not. Like, are you kidding me? They don't

32:51

care that, you know, Rick Wilson and Marjorie

32:53

Taylor Greene are arguing. They don't want to

32:55

be part of that. Like it's not a

32:57

business. It's just a noisy place where political

32:59

people like to be. And you know, no

33:02

one's there but the political people, and they have low

33:04

standards and that kind of stuff. Sure.

33:07

But he also, you know, it's

33:09

a little bit, part

33:12

of it is obviously Twitter. Yeah,

33:14

but what's the influence precisely? Well,

33:16

that's kind of one of my questions is whether you think,

33:19

do you think there's, that

33:21

he has actual influence? I

33:23

think he has more influence abroad. And he's

33:25

always, you know, when he first bought it and it seems

33:27

so crazy from the numbers, you're like, he cannot.

33:30

I thought he might be able to do some things with it. The

33:32

stuff he's doing is not the direction I thought

33:34

he would go in. I thought he'd actually fix it and make

33:36

it better. But, you know,

33:39

someone who is very smart and knows

33:41

him for many, many decades really said

33:43

he's not buying it for the business.

33:46

He's buying it for the influence for his

33:48

companies in other countries. And

33:50

then I was like, oh, sure. I don't know

33:52

why I didn't think of it, but it made

33:54

perfect sense, whether it was Starlink or Tesla

33:56

or any of his companies, SpaceX in general.

33:59

You know, he has to work around the world,

34:01

whether it's China or India or you know,

34:03

and there's a lot of autocrats. Turkey,

34:05

I think he went and visited Erdogan

34:09

there, you know, wherever he

34:12

goes, when he was car guy,

34:14

they might've met with him or not, like

34:16

maybe we'll meet with you, but when he's

34:18

Twitter guy or X guy, they

34:20

do, he has an influence within their

34:23

countries. So I think he's much more,

34:25

with the political class, hugely important;

34:28

I would say entertainment, not at all, I

34:30

would say regular businesses, not at all. I'll

34:32

ask you one more Elon Musk question and

34:34

then I wanna talk a little bit about

34:36

politics and Silicon Valley more broadly, but to

34:39

go back to the AI thing, you

34:41

did an interview with Sam Altman on your

34:43

podcast where you talked about Elon, actually

34:45

let's actually take a listen to a

34:47

part of that podcast right now. So

34:50

Elon used to be the co-chair and you have a

34:52

lot of respect for him, so you thought deeply about his critiques.

34:54

What do you make of the critiques? When you

34:56

hear them from him, I

34:58

mean he can be quite in

35:00

your face about that. He's got his style. Yeah,

35:02

I know. I don't think that's a positive thing

35:04

about Elon. Yeah, I'd like to do that. I

35:06

think he really does care

35:09

about a good future.

35:11

He does. With AGI. That is correct. And

35:14

he's, I mean he's a jerk, whatever else

35:16

you wanna say about him, he has a style that is

35:18

not a style that I'd wanna have for myself. He's changed.

35:21

But I think he does really care

35:25

and he is feeling very stressed about

35:29

what the future is gonna look like. For humanity. For

35:31

humanity. So that was Sam

35:33

Altman with Kara Swisher, Sam

35:35

Altman from OpenAI saying, okay, Elon, he's

35:37

a jerk, but he's serious about his

35:39

concerns about AI. Like he has real

35:41

concerns. And I wanted to button this

35:44

Elon discussion with that in a way,

35:46

because whatever you think of his outburst

35:48

about Apple and whatever you think about

35:50

the way he's dealt with this lawsuit

35:52

with OpenAI, he has articulated

35:54

a set of concerns that a lot of people

35:56

who are not Elon Musk have. about

36:00

artificial intelligence. And we have an election

36:02

coming up. We have Russia, China, others

36:05

who have already weaponized information

36:07

against us in various ways. That AI will

36:09

be the new tool that we're gonna see

36:11

be kind of overrun by AI. So I'm

36:13

not worried about AI as much as

36:15

it's more about power. It's like, do I worry about

36:17

an atom bomb over a nuclear bomb? I'm more

36:20

worried about the nuclear bomb. But the atom

36:22

bomb was pretty bad, right? They've managed to do

36:24

a lot, you know, with just paper, by the

36:26

way, you know. And so, you

36:29

know, way back in the day, Hitler did

36:31

a very good job with just paper and

36:33

newsreels. And my biggest worry when you actually

36:36

go down to it is the people,

36:38

I'm worried about the people who use it

36:41

and the bad people. I'm always worried about

36:43

every single technology tool being taken

36:45

by bad people to create

36:47

havoc or dominance or whatever.

36:49

And so, you know,

36:51

am I worried about a machine gun less

36:54

than the person wielding it, right? Like, they're

36:56

sitting there, I don't love that it's

36:58

around. But I'm more worried about what

37:00

people decide to do with it. And that's why

37:02

I'm always calling for some global decision making on

37:04

certain things. Like we have with nuclear

37:07

weapons, like we have with cloning. Now there's

37:09

always gonna be an outlier who's like Pakistan

37:11

or whoever it happens to be or Iran,

37:14

who's gonna just be like, no, we're a rogue

37:16

nation. You know, that's a Tom Cruise movie, rogue,

37:18

whatever rogue the fuck he is. I love

37:20

that movie. But there's always someone like

37:22

that, right? And so I

37:25

think the safeguards and guardrails are important

37:28

around this technology. The other thing, I interviewed Mira Murati,

37:30

who's the CTO, a very young woman, I think she's

37:32

34, 35. I

37:36

would say the most prominent woman in tech right now, but she's one

37:38

of the most prominent people in tech right now. And

37:40

a lot of what she said several

37:42

times is like, do you think it's

37:44

gonna achieve artificial general intelligence, which it

37:47

hasn't done yet, right? Well, it depends

37:49

on what you decide that is. Everyone

37:51

has a different definition of that. And

37:53

a lot of times she kept going, I

37:56

don't know how it works. I don't know how it's

37:58

doing it. They don't know. And it reminds

38:00

you a little more of biology than

38:02

it does digital,

38:05

because they don't know. It's got biological

38:07

elements, right? Like, how is it growing?

38:09

What's it learning? How does it learn?

38:12

And I think they sometimes are surprised.

38:14

They're like, oh, look at that. That's the

38:16

shit that freaks me out. The biology. It's

38:18

the shit that freaks me out when they say that some

38:21

AI started learning new languages without being instructed

38:23

to learn the new language. It's like, OK,

38:25

that's the part that weirds me out. Yeah.

38:28

Here's a non-artificial intelligence. Here's a, I'm not

38:30

even sure there's any intelligence in this. I

38:32

just want to give you, I give you

38:34

a small piece, because we know no podcast would

38:38

be complete in our world without hearing a

38:40

little bit of Donald Trump. So here he

38:42

is, our former president, who used

38:44

to want to ban TikTok, but is now

38:46

on TikTok in this

38:48

clip being interviewed by MAGA social

38:51

media gadfly, Charlie Kirk. We're

38:53

on TikTok. What is your message to younger voters

38:55

right now? Well, the big message is vote the

38:57

Trump. We're going to make our country greater than

38:59

ever before. We had something going that was incredible

39:02

and just horrible what's happened. This

39:04

is the worst president in history,

39:06

Joe Biden. And you look

39:08

at it even as a representative, he goes over to

39:10

France and something happened that was not good. I don't

39:12

know what it is. But we're

39:15

going to make our country greater than ever before.

39:17

And I appreciate all your support. Thank you. And

39:19

you'll never ban TikTok. That's for sure. I will

39:21

never ban TikTok. Thank you. So

39:25

that's how Trump now makes formal declarations of

39:27

policy with Charlie Kirk on TikTok. But it

39:29

doesn't matter because it's already banned by Congress.

39:31

He can't do anything about it. That's the

39:34

one thing that they forgot. Charlie Kirk, the

39:36

crack reporter, Charlie Kirk forgot that. Well,

39:39

it's a mild sarcasm there. Charlie

39:41

Kirk, crack reporter. He's a terrible person. But

39:44

go ahead. Yes, I don't disagree with

39:46

that. But here's the question for you. You thought

39:48

Trump was on the right track back

39:51

in the day when Trump wanted to ban

39:53

TikTok. You sort of thought, you know, he's

39:55

probably he's probably headed the right direction here.

39:58

And now he's backed away from it. I

40:00

think you have an explanation for why: money. It's

40:03

now pure money. Money. You

40:05

know, Jeff Yass and some others

40:07

are invested in it and he doesn't... Trump

40:10

is another one of those things: if it's not my idea, it's

40:12

not a good idea, right? And it was my idea and

40:14

then they took it, like how dare they. You know he

40:17

had a... his idea was

40:19

correct, his methodology was as usual

40:21

incompetent, right? It wasn't a very

40:23

good deployment of what was a

40:26

worrisome issue, wasn't a thoughtful or

40:28

effective deployment about the issues of

40:30

China having so much sway

40:33

in that area, in a content company.

40:35

And I think it's a content company, I don't

40:37

think it's a social media site, I don't think

40:39

it's a technology company, I think it's an entertainment

40:41

news organization of a different

40:44

type. And so I think

40:46

he was directionally correct and

40:48

executionally incompetent. And so

40:50

that irritated me about him. I was like this

40:52

is actually a very important issue we need to

40:54

talk about, which is a foreign

40:57

adversary, who very clearly China is,

40:59

having its tentacles in any way it could.

41:02

And one of the arguments they were making

41:04

was, well, Kara, they have to

41:06

prove it, they're doing it. I'm like of course

41:08

they're doing it, we would do it. Like

41:10

if we had a similar

41:13

media site in China, we'd be

41:15

spying our little hearts out, you

41:17

know. Why wouldn't we? This is

41:19

what countries do. And so

41:21

you know they're already in

41:24

our infrastructure, they're in our software,

41:27

they're everywhere. Of course they're gonna, you know,

41:29

they have the spy things,

41:31

balloons, whatever. There's 500

41:33

million spy balloons via TikTok.

41:35

So I think we should, that

41:38

said, I think it just needs

41:40

to be monitored correctly, just like any

41:42

content company that is owned by a

41:45

foreign adversary would be. And so

41:47

I put it in the same category as a CNN or a

41:49

New York Times or whatever. If it

41:51

was anything else, it wouldn't even be

41:53

a question. Are you surprised at

41:55

all? I know you're not gonna be surprised,

41:58

but still, the

42:00

sudden shift in Silicon Valley.

42:04

A place that was never quite as

42:06

liberal as it was sometimes made out

42:08

to be in the mainstream

42:10

press. But now you see Trump getting

42:12

the warm embrace of these moguls,

42:14

David Sacks being a

42:16

critical player in this. So

42:18

tell

42:20

me what you think, what do you make

42:22

of the sudden hard-on

42:24

that Silicon Valley has for Trump in certain quarters?

42:27

Well, he's a loudmouth with money, right?

42:29

That's my feeling on him.

42:31

Yeah, he was like the second person.

42:33

In a buddy

42:35

movie, he's the other guy, right? Not the

42:37

main character, essentially. And

42:39

so he was always sort of hanging around

42:41

the much smarter Peter Thiel.

42:44

I've always said about Thiel

42:46

that he's a brilliant person, there's just no two

42:48

ways about it. You know, he just is

42:50

smart, and he thinks about big things. I

42:53

don't agree with a lot of his stuff, but

42:55

I read his own stuff and I

42:57

find it interesting. Same thing with a

42:59

Reid Hoffman or even a Max Levchin. Really

43:01

groundbreaking thinkers in many ways. And

43:04

so I think one of the things they

43:06

realized through Thiel, who pulled back some,

43:08

was that he was a true conservative. Like a

43:10

true one, you know? And then

43:12

there is a John Chambers, one of

43:14

the traditional conservatives, right? Or Larry

43:17

Ellison, you know, heavy on Israel.

43:19

They were always

43:21

like this. They

43:23

just come in

43:25

different flavors.

43:27

Marc Andreessen was never a

43:29

Democrat as far as I knew. He may have given

43:31

money to Democrats, but he was not

43:34

one. And then you have a lot

43:36

of libertarian-like people where

43:38

I didn't even know what their politics

43:40

were. I didn't think they had any.

43:42

Just leave me alone, that was their politics by

43:44

design: I'm the best person ever and everything

43:46

is in my self-interest. I remember

43:48

Gates saying, like, what do

43:50

we need them for? He said that to

43:53

The Washington Post when it came to lobbying:

43:55

why do we need Washington? And I was like,

43:57

oh, they have subpoena power and

43:59

they love to send subpoenas. And he would

44:01

eventually learn, to his regret. I called it

44:03

a city of ex-student body vice presidents with

44:05

subpoena power. I was like, you're fucked

44:07

because they have subpoenas. And

44:11

so I think what happened with Peter showed,

44:13

and I think he didn't spend very much

44:15

money and got a lot of influence. I

44:17

think he spent 30, $40 million to get

44:19

the influence he has, and they realized

44:22

politicians are cheap whores. And they're

44:24

like, oh, it doesn't cost very much to do this. And

44:27

we have opinions about things, like whatever

44:29

it happens to be, or woke,

44:32

or people are telling us what

44:34

to do. It always centers people telling people what

44:36

to do. They don't like being told

44:38

what to do because they remain

44:40

badly raised 12-year-old boys at some

44:42

point. And so

44:44

I think that's really, is they realized what an easy

44:46

game it was. And

44:49

so they're like, oh, why don't we influence it? And

44:51

it could help our business too, right? It

44:54

could help us. In Elon's case, he

44:56

really needs to embrace Trump because if

44:59

Biden wins, Elon's in for a world

45:01

of hurt at the SEC, around Tesla.

45:03

You can start to see that there's

45:05

investigations going on. I

45:08

think they'll probably look at his

45:10

national security credentials. I wouldn't be

45:12

surprised. I've heard it from national security.

45:14

They're worried about SpaceX going

45:16

public. He could still own it, but maybe

45:19

he can't run it, right? He's in for

45:21

a world of hurt in a next

45:23

Biden administration. And I have

45:25

to say that you look at a guy like

45:27

David Sacks. David Sacks, who is, for the record,

45:30

a Silicon Valley entrepreneur, former

45:32

CEO of something called

45:35

Yammer, pal of Peter

45:37

Thiel and JD Vance, who

45:40

was first a fan of Bobby Kennedy Jr., wanted

45:42

him to run for president, then a backer

45:44

of Ron DeSantis when he was running for

45:46

the Republican nomination. This week,

45:49

now suddenly, is hosting or

45:51

has hosted Donald Trump at

45:54

a splashy big dollar fundraiser

45:56

at his $20 million Pacific Heights

45:58

mansion in San Francisco. I'll

46:00

tell you, Paul Carr wrote a good

46:03

piece where he pointed out a story

46:05

I wrote about David Sacks many years ago.

46:07

He had written a book with Peter calling

46:09

rape belated regret, which was so offensive. It's

46:11

so offensive. It's just offensive on his face. Paul,

46:14

was it date rape? Was that what you said? Date

46:16

rape was belated regret. Yes, he had the idea of

46:18

belated regret. I was like, huh? That's just so offensive.

46:20

How did you come to that conclusion? Seriously.

46:23

Wow. But

46:25

then he apologized in a story I wrote,

46:27

and he was definitely irritated that he had

46:29

to apologize. I remember that discussion. Sure. But

46:32

he did it. He's like, I got to suck this one up,

46:34

right? I'm going to have to apologize. Right. And,

46:37

oh, that was the young me. I'm so sorry.

46:39

I never, whatever the quote was, Paul found it

46:41

and revived it.

46:44

And then he wrote, nothing says, I

46:47

really regret saying that, like having a

46:49

convicted sexual harasser

46:51

or whatever, someone found guilty of sexual

46:53

harassment at your house.

46:57

And on that note, Kara, I guess

46:59

it's time to bring this thing in for a

47:01

landing with a special kind of lightning round that

47:03

I'm introducing on the podcast today. We'll see how

47:05

long it lasts, but we want to start off

47:07

with you because I thought you would be good

47:09

for this. It's, like I said, a special kind

47:12

of lightning round where we aim for sort of

47:14

a combination of cultural news you can use and

47:16

reasons to be cheerful because we always like a

47:19

little optimism around these parts. So here goes. What

47:22

are you watching right now and liking? On

47:24

TV? I love those things. Gosh,

47:26

TV's so good. Give me a couple. Bridgerton,

47:28

I admit it. I like the ton. I

47:31

love Bridgerton. I've been watching,

47:33

I watch Hacks, which I thought was, Jean

47:35

Smart is literally one of my favorite people

47:37

and I love seeing her career thrive.

47:40

It's really amazing to see her thrive.

47:43

What are you reading and liking? I love

47:46

this book, North Woods. I

47:48

just thought, oh, it's by, it's on my

47:50

phone, Mason. The guy's name is Mason. It's

47:53

a story about a house and

47:55

its owners. You

47:57

find out the history of the house by the people who lived in it.

48:00

it over centuries and I find it very

48:02

touching and beautiful in a lot of ways.

48:05

I think I read the novel. It's

48:08

a novel, right? It's a novel. It's a novel by

48:10

Daniel Mason. Yes. Daniel Mason.

48:12

Amazing, amazing. Look, I just read, I'm

48:14

interviewing Griffin Dunne later this afternoon and

48:17

he has a book called The Friday Afternoon Club. Joan

48:20

Didion's in it and his father Dominick Dunne. I

48:22

just loved, I think he's a beautiful writer. I

48:24

really enjoyed it because I sort of like that

48:26

Hollywood set. Is there

48:28

any music you're listening to that you really like right now?

48:31

Music. I'm not as much

48:33

a music person. I like Taylor Swift. I'm so typical of

48:35

ladies, white ladies, white lady music style.

48:38

Never apologize for loving things. I know, I

48:40

know. I love country

48:42

music in all its forms. I

48:45

love country music. That's something I share

48:47

with George Bush. When he said he loved it,

48:49

everyone made fun of him. I'm like, no, no.

48:51

There are people who really love country music and I'm

48:53

one of them. I'm with you on that. That's having

48:55

a moment too, country music for sure. I've always had it. Is

48:58

there anything that you have watched, read or

49:00

listened to lately that everyone likes but you

49:02

absolutely hate? Oh

49:04

a lot of things. Gosh, Dune. I just

49:07

don't know. Dune. Dune's

49:09

on the Kara Swisher dead list. Okay. I

49:12

just am like, oh God, sand. Oh, Timothée Chalamet. I

49:15

like him and I think he sounds it. Apparently he

49:17

listens to Pivot, which, thank you, Timothée.

49:21

Timothée, Timothée. I

49:23

think he's a beautiful looking man for sure.

49:26

But I do and I just am like,

49:28

I'm exhausted by the entire affair. And

49:30

my final question, something,

49:33

anything that's happened recently in the news

49:36

at work, at home, in your personal life, on the

49:38

public stage, whatever, I don't care, that's made you feel

49:40

the most optimistic about the future. Oh

49:42

my kids. Having more kids at

49:44

my advanced stage. I always

49:46

joke that I'm a straight white man and I

49:48

got remarried. How

49:51

many did you have now? Four. Four

49:54

kids. The youngest one is? Two

49:56

and a half. Two and a half. Any more in the pipeline? I

50:00

think I'd be crazy. Well, maybe grandchildren. I hope not soon,

50:02

but I tell myself. Well, if you're a straight

50:04

white male, if you're a straight white anchor, you could

50:06

squeeze in like four or five more to go.

50:09

Yeah, no, I can't do the DeNiro thing. No,

50:11

no, I admire him. I admire them both I'll

50:13

be honest with you. That's another thing I agree with him on.

50:15

I'm like, lots of kids, great. I

50:18

just think my kids are amazing and they

50:20

all love each other, which was real.

50:22

You know, I have to say COVID brought them

50:24

together quite a bit. The only

50:26

positive thing about COVID: they spent a lot of

50:28

time together at a very young age, and

50:30

I just think they're great.

50:32

I had a back

50:34

and forth with a bunch of conservatives about that.

50:37

They were like, liberals don't

50:39

believe in the future. I'm like, I have four kids. Why

50:42

would I have kids if I didn't believe in the future? I'd not

50:44

have kids right so a lot of young people don't want to

50:46

have kids because they don't believe in the future But I do

50:49

so I think you know, it's

50:51

always good to go out of any podcast

50:53

with an implicit Whitney

50:55

Houston quote, you know, you believe the children

50:57

are our future. It's

51:02

like, you know who's the problem in this

51:05

society? People 30 to 50. They need to shut up and

51:08

get out of the way. 35 to 50. Oh,

51:11

calm the fuck down all of you like

51:13

that's my and I'm older than that So

51:15

I'm like stop it like young people are

51:17

if you spend enough time with young people

51:20

you feel much better about life Well,

51:22

that's what I'm gonna do right now. Kara Swisher, you're awesome for

51:24

coming on. I got to go and hang out with some young

51:26

people — or some old people. Don't do it with anyone between 30 and

51:28

50. They're available. Awesome.

51:32

Awesome. Awesome. I'm gonna take you up on

51:34

that Kara, but in the meantime, thank you

51:36

for coming on the program again And

51:38

while Kara bugs out, we're gonna take another quick break to

51:40

do some business, and when we come back, we'll be joined

51:42

by my old friend Joe Klein, political

51:46

reporter and magazine writer extraordinaire, who'll be joining

51:48

me to celebrate the life and work

51:50

of another person who

51:52

falls into that category, the late, great Howard Fineman,

51:54

who passed away this week after a long

51:56

and valiant struggle with pancreatic cancer. Joe and I

51:58

will mourn his loss and celebrate his

52:01

work right after this. Worried

52:08

about letting someone else pick out the

52:10

perfect avocado for your perfect impress-them-

52:13

on-the-third-date guacamole? Well, good

52:15

thing Instacart shoppers are as picky as

52:17

you are. They find ripe avocados like

52:19

it's their guac on the line. They

52:22

are milk expiration date detectives. They bag

52:24

eggs like the 12 precious

52:26

pieces of cargo they are. So

52:29

let Instacart shoppers overthink your

52:31

groceries so that you can

52:34

overthink what you'll wear

52:36

on that third date. Download the Instacart

52:38

app to get free delivery on your

52:40

first three orders while supplies last. Minimum

52:42

$10 per order, additional terms apply. You

52:45

don't just live in your home, you

52:47

live in your neighborhood as well. So

52:49

when you're shopping for a home, you

52:52

want to know as much about the

52:54

area around it as possible. Luckily, homes.com

52:56

has got you covered. Each listing features

52:58

a comprehensive neighborhood guide from local experts.

53:01

Everything you'd ever want to know about

53:03

a neighborhood, including the number of homes

53:05

for sale, transportation, local amenities, cultural attractions,

53:07

unique qualities, and even things like median

53:09

lot size and a noise score. homes.com.

53:12

We've done your homework. Treasure

53:16

verifiable facts. The

53:19

late Senator Daniel Patrick Moynihan, who spoke

53:21

to Colgate at commencement in 1977, had

53:23

a famous dictum. He

53:27

said, everyone is

53:29

entitled to his or her

53:31

own opinion, but not to

53:34

his or her own facts. Another

53:38

point, leave your comfort zone. Read

53:40

the other side, the other sides. If

53:43

you watch MSNBC, and I hope you

53:45

do, watch Fox too, and

53:47

vice versa. It won't kill you.

53:50

Scan websites as far away from your

53:53

own thinking as you can. Talk

53:55

to people whose views differ from your

53:58

own. As Nietzsche said, whatever

54:00

doesn't kill you will make

54:02

you stronger, except

54:05

possibly Glenn Beck. So

54:10

that was Howard Fineman at

54:13

his alma mater, Colgate, giving the

54:16

commencement address back in 2011. And

54:20

we're here with Joe Klein, who

54:22

was a colleague of Howard's back in

54:24

the days at Newsweek. And

54:26

Joe, I somehow did not

54:28

catch the news last night that

54:31

Howard had passed, and I woke up this

54:33

morning to it. And I

54:35

don't know why, but Howard is someone who — you

54:38

worked with Howard Fineman for how long? Start with that. I

54:42

worked with Howard for about five years.

54:45

Only five years. I think of this era —

54:47

when you guys were at Newsweek —

54:49

that would have been what, like '90? It

54:51

was '92 to '96. Right,

54:56

the first Clinton administration. Howard

54:58

had 30 years at Newsweek, and you only had those

55:00

five. So basically that's the overlap. I

55:04

think of that era and that time,

55:06

and that moment as

55:08

being the apogee of

55:10

the news magazine. The

55:13

time in Newsweek were, were they, I

55:15

mean I know that in earlier, before

55:18

I was politically conscious

55:21

and media savvy, the news

55:23

magazines were even greater, and more titanic forces

55:25

in some ways in American life. But in

55:27

terms of Washington politics, the

55:30

thing that news magazines did then that you,

55:32

Joe, John and Alter at Newsweek in that

55:34

time, it was sat in

55:37

that incredible space between, there

55:39

was genuine reportage, news

55:41

got broken, analysis was

55:43

delivered, and kind of

55:46

really high-level magazine

55:48

craft was present

55:50

in the writing. And you were one of the

55:52

people who did that in your way. Jonathan Alter

55:54

was another. But man, Howard was just — without

55:57

drawing any comparisons between anyone's skills — you

56:00

think about a guy who just was in the magazine almost

56:02

every week writing these incredible

56:05

mergers, blends of all those things that

56:07

I don't know if they really existed

56:09

before, they don't exist now. I

56:13

don't know what exists now because I don't know, you know,

56:15

I just don't read news magazines

56:17

now, but you're right,

56:19

that was a moment and in fact

56:21

that moment lasted for a while. You

56:23

know, we were all in the magazine

56:27

every week. Alter was in

56:29

the magazine every week, Howard was

56:31

in the magazine every week, I

56:33

was in the magazine, Mike Elliott

56:36

was in that magazine every week

56:39

and we felt as

56:41

if we were

56:43

right on top of the election. You

56:45

know, there are some elections where you feel you're kind

56:47

of on the outside of it, but

56:50

during the Clinton administration, that

56:52

first administration, we felt

56:55

that we were in control

56:57

of the news, that we were right

57:00

there, we were thinking along with them.

57:03

Just talk a little bit about Howard as a

57:05

colleague and a journalist.

57:08

What do you remember about the time when you guys were working

57:10

alongside each other? And also, you know, he is,

57:13

as the Times points out in its obit and

57:15

a lot of people have, you know,

57:17

he's a guy who really was on the

57:19

early cutting edge of understanding that cable TV

57:22

and punditry that went alongside the

57:25

kind of more straight reporting

57:27

and writing that he did and a lot of

57:29

us did, that that was going to become an

57:31

important adjunct to doing our jobs. So you

57:33

spent a lot of time in green rooms with Howard in

57:35

addition to a lot of time in

57:38

the offices of Newsweek. Well,

57:40

the first thing that Howard said to me when

57:43

I came to Newsweek was he said,

57:47

I'm glad you're here, I don't mind that your column

57:49

is going to be in the magazine every week, but

57:52

don't impinge on my TV time. He

57:55

knew at that moment. And

58:00

he was so good at that.

58:03

I mean, there was something about Howard. You

58:06

know, I'm not as fluent

58:08

as he is. You

58:12

know, I'm a writer. Howard was

58:14

not just a writer and not just

58:16

a reporter, but he was a talker.

58:19

And he could

58:22

summarize things cleverly and intelligently

58:26

on air. He

58:29

always seemed so smooth on air. And

58:32

he never seemed flustered. And I

58:34

don't know about you, Heilemann, but I

58:37

often feel flustered on air. Because

58:41

you're much more concerned than the rest of us, Joe,

58:43

with sounding smart. The rest of us realize that all

58:45

we have to do is try to keep it to

58:47

the end. But Howard was a reporter. I

58:49

mean, Howard and I came out of daily newspapers.

58:51

That was the first job I ever had,

58:53

was, you know, working for

58:56

a suburban daily. And Howard worked for

58:58

the Louisville Courier-Journal, which was a much

59:02

more prestigious place than I started

59:04

out. But

59:07

once that's in your blood, reporting

59:09

is in your blood, there

59:12

was a hunger in Howard

59:16

to know stuff. Yes. And

59:20

I had that same hunger. And,

59:25

you know, the interesting thing is

59:27

that we

59:29

were not often in the same place. We

59:33

were often on the road in different

59:35

places. And so I

59:38

got to experience Howard mostly on the

59:41

phone. You know, we would

59:43

be in morning meetings remotely.

59:46

And it

59:48

was, you know,

59:50

both of us, but Howard more

59:52

so than me, was

59:55

just looking for the

59:57

jugular. What is happening right

59:59

now? What are

1:00:02

they thinking right now? What's

1:00:05

going on, even

1:00:09

with Bob Dole,

1:00:11

famously? What's Bob

1:00:14

thinking right now? We

1:00:17

had different sets of sources. In those days,

1:00:20

I was wired into Pat

1:00:25

Moynihan, who had been my mentor, and

1:00:27

he and Bob Dole were

1:00:30

the co-chairs of the Senate

1:00:32

Finance Committee. And Pat

1:00:35

Moynihan's chief of

1:00:37

staff was Lawrence O'Donnell. The Lawrence

1:00:40

O'Donnell. He was a great source,

1:00:42

just a fabulous source. And Sheila

1:00:44

Burke was Bob Dole's chief of

1:00:46

staff. I

1:00:50

know Howard felt this way, but I always

1:00:52

felt this way too, which

1:00:54

was, wow, these folks will talk

1:00:56

to us, and

1:00:58

they will respect us if we respect

1:01:01

them. If we don't blow their

1:01:04

cover, it'll

1:01:06

be fine. And that

1:01:10

was something, I don't know

1:01:13

how much of that is going on now.

1:01:16

When I look at cable TV, I see

1:01:18

a lot of really good-looking people. Why did

1:01:20

they let you on, Heilemann? I have no

1:01:22

idea. I don't know. I'm grandfathered in, Joe.

1:01:24

I'm grandfathered in. Go ahead. No,

1:01:28

no, no. I was just going to say,

1:01:30

the thing about him, well,

1:01:32

Howard had a great head of hair. He

1:01:35

was very proud of it. In fact, I

1:01:37

had a- He had

1:01:39

a great look. He really had this

1:01:41

distinguished Eric Sevareid sort of look. It's

1:01:45

funny, I was looking at an email that I

1:01:47

had from him this morning, and literally,

1:01:50

a few months ago, he would watch TV, and he

1:01:52

would send me an email all the time, all the

1:01:54

way up until just a couple of months ago.

1:01:56

And when I stopped getting these emails, commenting

1:01:59

on various things I'd said on TV, I

1:02:01

realized we had to be getting close to the end.

1:02:03

But at one point he

1:02:06

said something to me in this email about six months

1:02:08

ago. I asked him how

1:02:10

he was feeling and he said, I know you're a busy

1:02:12

multimedia man, but so was I back in the day. And

1:02:15

I wrote back, I said, you were

1:02:17

a beast. And he said, hardly, but

1:02:19

I did wear cool knit ties, used

1:02:21

good shampoo and had a pathological fear

1:02:23

of not knowing what was going on.

1:02:26

And I think that's actually as good as a

1:02:29

summary of Howard as anything I've ever heard. Like,

1:02:31

you know, the concern for the knit tie, the

1:02:33

hair with the shampoo and the pathological fear of

1:02:35

not knowing what was going on. He was

1:02:37

as relentless and competitive as anybody I've

1:02:39

ever met in the business. Oh,

1:02:41

it wasn't just the ties though, John.

1:02:44

It was the tie shirt combination. And

1:02:47

the shirts were usually a shade of

1:02:49

blue or blue striped to

1:02:51

go with the knit tie. He

1:02:54

looked fabulous. I

1:02:56

mean, I think he put, I know

1:02:58

he had to put more into the way he

1:03:00

looked than I ever put into the way I

1:03:02

looked because I would just show up looking schlumpy.

1:03:04

Well, that's a low bar, Joe. I

1:03:07

mean, in terms of looks. But

1:03:11

I mean, he was also, like I said a

1:03:13

second ago, he was relentlessly competitive. That pathological fear — I

1:03:15

mean, he was very clear-eyed in his self-diagnosis. He

1:03:18

was insecure in some ways —

1:03:20

insecure not about his appearance or

1:03:22

his abilities, but insecure that

1:03:24

someone would know something he didn't

1:03:26

know. It just made him crazy in some

1:03:28

ways that there was some information out there.

1:03:30

And that's such a crucial element

1:03:32

of what a real reporter is about, even

1:03:34

if you're wearing a Turnbull

1:03:36

and Asser shirt and you

1:03:39

have the great pompadour or whatever. But Howard was out

1:03:41

there still burning shoe leather because he just did not

1:03:43

like the idea that anybody knew shit that he didn't

1:03:46

know. Could I tell

1:03:48

a dirty little secret about Howard though? Sure, please.

1:03:52

He was a wonderful guy. He

1:03:55

was a wonderful dad. He

1:03:59

was this horrible, hard-boiled classic

1:04:02

journalist. He

1:04:04

played one on TV, but in

1:04:07

real life, he was a wonderful husband

1:04:09

to Amy. He was a

1:04:11

wonderful dad. He was a sweet guy, although

1:04:13

he didn't want you to know that. He

1:04:16

didn't want to advertise it, but

1:04:19

he was. Yes. Well, I

1:04:21

would say also, he

1:04:24

taught a class at the University of

1:04:26

Pennsylvania at the Annenberg School up

1:04:28

until the time he got sick. He

1:04:31

had me up there to speak to the class once.

1:04:33

We had just a really great day of talking to

1:04:35

the kids for a couple of hours and then going

1:04:38

out and having a meal after. He was really

1:04:40

in his glory. He

1:04:43

loved interacting with young people. He loved

1:04:45

his time when he was at the

1:04:47

Huffington Post, throwing himself

1:04:49

back into the future again and having young

1:04:51

kids to mentor and being in that role of

1:04:53

the paterfamilias.

1:04:57

He was on some level,

1:05:00

and this is the thing I think that

1:05:02

you ... He wrote as he got sicker,

1:05:04

he didn't write that much obviously, but he

1:05:06

wrote a couple of really beautiful pieces. One

1:05:08

before he got sick, about the Tree of

1:05:11

Life synagogue shooting in the New York Times.

1:05:14

And another after

1:05:16

January 6th, he wrote a piece in

1:05:18

RealClearPolitics about what was

1:05:20

lost by the Capitol being shut down

1:05:23

and becoming heavily ... being

1:05:25

consumed in fencing and security. And they

1:05:27

were deeply romantic kind of pieces about

1:05:30

what we lost in our politics and

1:05:32

in our American life. He had been

1:05:34

very optimistic about the notion that America

1:05:37

was built on argument and that

1:05:39

argument was good for us, and he

1:05:41

constantly was starting to feel like this thing's now

1:05:44

out of control. The arguments are no longer

1:05:46

strengthening us, but they become so bitter and toxic

1:05:48

that they're weakening us. Yes,

1:05:50

well, you know, that

1:05:54

was another interesting thing about cable news

1:05:56

back then as opposed to now. He

1:06:00

was — he was on almost

1:06:04

despite his identity. He was a Jewish

1:06:06

man, right? He was a good-looking Jewish

1:06:08

man. He was a smart Jewish man,

1:06:11

but now he

1:06:14

would be considered — I

1:06:16

would be — I've been told that

1:06:18

I am, in publishing, male,

1:06:20

pale, and stale. And we've

1:06:23

gotten so much into the identity

1:06:26

of you

1:06:28

know the talking heads on TV rather

1:06:30

than the things that they say

1:06:32

and

1:06:34

not just the cleverness but the creativity

1:06:37

of what they say. People are hired to

1:06:39

have a certain point of view. You

1:06:41

know, I was

1:06:43

under contract to CNN during those years

1:06:46

and I was a disappointment to them because

1:06:48

they would have me —

1:06:52

they would have me pitted against a conservative

1:06:54

and as often as not I would agree with the

1:06:57

conservative. And so there wouldn't

1:06:59

be the kind of fireworks that they wanted

1:07:02

And Howard was unpredictable in that way

1:07:04

too. And I think that the thing

1:07:06

that you just pointed out — it

1:07:09

was crucial — is that he

1:07:11

was a traditionalist. You

1:07:14

can be a traditional liberal, you can be a

1:07:16

traditional conservative, or you can be a bomb thrower.

1:07:18

And he

1:07:20

had respect. I think

1:07:22

we all felt, wow,

1:07:27

you know, they're letting us

1:07:29

do this. They're paying us to do

1:07:31

this. We're walking, you

1:07:34

know, through the — I always got a thrill,

1:07:36

and I know Howard did too, walking through

1:07:38

the corridors of the Capitol. Which

1:07:40

is why when it was desecrated

1:07:43

on January 6th, he was so

1:07:45

personally upset about it. I always

1:07:48

had a thrill when

1:07:50

I was in the White House and

1:07:52

I know Howard felt the exact same way

1:07:57

You know, I don't know whether —

1:08:00

I would be shocked and disappointed

1:08:02

if young reporters didn't feel that

1:08:04

way still, but in

1:08:07

some ways, there

1:08:09

aren't that many young reporters anymore. I

1:08:12

wonder what you think about this, because this gets

1:08:14

at, you know, he wrote this book, Howard,

1:08:17

I think he only wrote one book, and

1:08:19

I shouldn't say "only." Anybody who published a book —

1:08:21

in my view, if you published one book, you've

1:08:24

done something consequential. The book was

1:08:26

called The Thirteen American Arguments: Enduring

1:08:28

Debates That Define and Inspire Our

1:08:30

Country. It came out in 2009, he wrote

1:08:32

it mostly in 2007. And

1:08:36

in 2020, he

1:08:38

was out on the campaign trail, and

1:08:41

he sent me an email that said, this

1:08:44

is my 10th New Hampshire, starting with Gary Hart in

1:08:46

1984. I think I'm seeing

1:08:49

the last reel of something in our politics, so I

1:08:51

want to stick around for the big finale. Unfortunately,

1:08:53

I think, you know, he would probably say, we

1:08:56

might be seeing the last reel in 2024, and he's not going to

1:08:58

be here for it, but this is 2020. And

1:09:01

I asked him what he meant by the

1:09:03

last reel, and he said, the

1:09:05

end of broad acceptance here and

1:09:07

abroad of American exceptionalism, the

1:09:09

idea that slavery, racism,

1:09:11

and ruthless economic and environmental

1:09:14

exploitation notwithstanding, we were a

1:09:16

once-in-a-planet chance to create a

1:09:18

great nation built on the best of human

1:09:20

nature and the human mind. So

1:09:23

when I wrote it in 2007, in my

1:09:25

entirely too optimistic book about America, that arguments

1:09:27

would keep us free instead of ruining us.

1:09:30

Now we're coming to accept the fact that

1:09:32

we're just another corrupt empire. This was Howard

1:09:34

Zinn's and the communist and fascist view all

1:09:36

along. But the inspiring fiction — and

1:09:38

by "inspiring fiction" I mean hope —

1:09:41

feels like it's slipping away. Do

1:09:45

you feel that? I'm just

1:09:47

thinking, Howard did

1:09:49

himself a disservice back

1:09:52

in the days when we were

1:09:54

working together, and he was

1:09:56

so focused on being a hard-

1:09:58

charging reporter. He

1:10:00

was a very thoughtful guy and a

1:10:02

beautiful writer, and

1:10:05

he didn't allow that to

1:10:07

show all that much, and you didn't

1:10:09

see it all that much because there wasn't space for it

1:10:11

on television. Yeah, but the

1:10:13

fact is that He

1:10:17

was a patriot. And you

1:10:23

know, it got to

1:10:25

the point in our lives where it became kind

1:10:27

of goofy and

1:10:30

soft — and I guess "wet"

1:10:33

was the word — to

1:10:37

describe yourself as a patriot. But Howard was

1:10:39

one, and I

1:10:42

hope he was wrong in that last blast.

1:10:47

I fear that there's absolutely

1:10:49

no evidence that

1:10:51

he wasn't right. But

1:10:55

I think,

1:10:58

in the end, the

1:11:01

respect that he had for the country, I

1:11:03

mean, I've said this in the past: the

1:11:06

biggest change in journalism

1:11:08

from when I started in 1969

1:11:12

to now is that we've

1:11:14

gone from having skepticism being the

1:11:16

default position of

1:11:19

a proper journalist to

1:11:22

cynicism. And Howard was

1:11:24

a skeptic. But

1:11:26

at times he would play the tough-

1:11:28

guy cynic on TV, but

1:11:31

he wasn't really. He wasn't.

1:11:35

He respected the institutions he

1:11:37

respected the institutions he worked

1:11:40

for I'm

1:11:44

getting sadder by the minute

1:11:46

here, Joe. No, I know what

1:11:48

you're saying, though. And I think it's funny because

1:11:50

Howard was, you know — there was no one who was

1:11:52

better suited to Chris Matthews's show, in terms of

1:11:54

the title, Hardball, than

1:11:57

Howard, because Howard was a student

1:12:00

of hardball politics. He was a —

1:12:02

he cast a gimlet eye and he saw

1:12:04

things with great clarity when he would see

1:12:06

the machinations and some of the ruthlessness and

1:12:08

power grabs and stuff. But there was never,

1:12:10

I never had thought even with all of

1:12:13

that understanding of all of the Machiavellian natures

1:12:15

of Washington and of campaigns and being totally

1:12:19

realistic about how ugly our business, the

1:12:22

business of politics could be. I never thought there

1:12:24

was a trace of cynicism

1:12:27

in Howard. That's

1:12:29

like, your point is exactly right. He managed

1:12:31

to be able to be completely clear that

1:12:33

a lot of people who practice politics are

1:12:36

cynical without becoming cynical himself. And I think

1:12:38

that's just a really hard thing to do

1:12:41

and a dying art in humanity,

1:12:44

not just journalism, but he really managed

1:12:46

to keep it alive for the whole

1:12:49

time of his career. Amen.

1:12:52

Amen, bro. Thank you for taking the few

1:12:54

minutes here to think about him and I'll catch you. Okay,

1:12:59

great.

1:13:07

Impolitic with John Heilemann is a podcast in

1:13:09

partnership with Odyssey. Thanks again to Kara Swisher

1:13:11

and Joe Klein for coming on and chopping

1:13:13

things up. If you dug this episode, please

1:13:15

follow Impolitic with John Heilemann and share

1:13:18

us, rate us, and review us nicely on

1:13:20

the free Odyssey app or wherever you have

1:13:22

it to ask in the splendor of the

1:13:24

podcast universe. I'm John Heilman, your cruise director

1:13:27

and chief political columnist for Puck, where you

1:13:29

can read my writing every Sunday plus the

1:13:31

work of all of my terrific colleagues by

1:13:33

going to puck.news/impolitic. That's

1:13:36

P-U-C-K dot news slash I-M-P-O-L-I-T-I-C,

1:13:43

and scoring the 20% discount on a puck

1:13:45

subscription that I'm offering to our listeners for

1:13:47

just being so damn special. You

1:13:49

know who's also special? Our executive producers

1:13:51

here at Impolitic, John Kelly and

1:13:54

Ben Landy, our senior executive booking producer

1:13:56

and dope queen, Lori Blackford, our chief

1:13:58

troublemaker and executive assistant, Ali Clancy, the

1:14:00

powers that be in the Odyssey Empire,

1:14:02

J.D. Crowley and Jenna Weiss-Berman, and

1:14:05

the one and only Hall of Fame

1:14:07

five-tool player Bob Kabador, who flawlessly produces,

1:14:10

edits, mixes, and masters Impolitic with

1:14:12

John Heilemann, as well as making the

1:14:14

world's most magical margarita. See

1:14:16

you next time everyone, and as always,

1:14:18

namaste.
