The Cult of Failing Upwards

Released Wednesday, 1st May 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:02

All Zone Media. Hello

0:06

and welcome to Better Offline, Cool Zone

0:08

Media's happiest podcast. I'm

0:10

your host Ed Zitron. Well,

0:23

as I've run through in the last two episodes,

0:26

managers have poisoned tech's ability to

0:28

innovate with a degenerative capitalism

0:30

known as the rot economy, pushing growth

0:32

at all cost metrics on companies

0:34

you love while isolating and removing those

0:36

that don't agree. And by the way, they're

0:38

the same people who actually build things

0:41

and make good products and use

0:43

them as well. Nowhere

0:45

is this more obvious than Meta,

0:47

a company with leadership completely removed

0:50

from any meaningful interaction with their products

0:52

or any value to society. Since

0:55

two thousand and nine, Facebook's core products

0:58

have reliably become more profitable at

1:00

exactly the same rate they decay, with every

1:02

founder behind every product that

1:05

Zuckerberg has acquired, including Instagram,

1:07

Oculus, and WhatsApp, leaving the

1:09

company and almost immediately talking about

1:11

how much they hated working there. According

1:14

to a New York Times piece from twenty eighteen, Kevin

1:16

Systrom, co-founder of Instagram, only

1:19

chose to quit the company after Mark Zuckerberg

1:21

became jealous of the app's success, taking the

1:23

spotlight away from that of Facebook

1:26

itself, an app he kind of stole

1:28

from the Winklevosses. Systrom

1:30

allegedly didn't really want to leave Facebook,

1:33

but felt that Zuckerberg was depriving Instagram

1:35

of resources and now and I

1:37

quote seemed to want Instagram to use

1:39

its momentum to help the big Blue app, which

1:42

is an annoying way of describing a situation

1:44

that feels like a convenient time to reveal

1:46

that this was a Kara Swisher piece. Despite

1:49

Swisher's bloviating, it took TechCrunch's

1:52

Josh Constine to reveal the real reason

1:54

that Systrom had left. Facebook

1:57

had replaced Instagram's VP of Product

1:59

Kevin Weil, who everybody loved,

2:02

with the former VP of Facebook

2:04

News in May twenty eighteen. You know that

2:07

great year for news, and that

2:09

man was named Adam Mosseri. He

2:13

would take over, and over the next

2:15

six years he would absolutely

2:17

destroy everything that Systrom

2:19

and his co-founder Mike Krieger

2:21

had built. According

2:24

to Constine's reporting, Systrom had

2:26

also clashed with Chris Cox, Facebook's

2:28

chief product officer at the time. Constine

2:31

described Mosseri as a Zuckerberg loyalist

2:34

who was and I quote, disappointed

2:36

that he didn't get the head of Facebook gig that

2:39

went to Will Cathcart, who now heads

2:41

up WhatsApp. Over

2:43

time, Mosseri and Zuckerberg moved

2:45

to erode Systrom and Instagram's independence

2:47

from Facebook, and eventually, I

2:49

guess it all became too much to bear. Systrom

2:53

was there to oversee the most damaging change

2:55

to Instagram, though, which was the introduction

2:58

of the algorithmic feed in June twenty

3:00

sixteen, two years before he left.

3:03

That horrified users who feared that they would

3:05

now not see posts from their friends, a

3:07

thing that almost immediately happened on

3:09

both Instagram and Facebook, which made a

3:11

major change to its newsfeed algorithm in

3:13

twenty fifteen. Wait,

3:16

wasn't that when Adam Mosseri... Oh

3:18

my god. Anyway, a

3:22

few months later, Instagram would try and

3:24

clone the functionality of Snapchat, a company

3:26

that has had quite literally one

3:28

profitable quarter in its history, with the release

3:30

of Stories. By the way,

3:32

that's exactly what it was called

3:34

on Snapchat. They just didn't anyway. This

3:37

move was illustrative both of the lack

3:39

of creativity within Instagram and Facebook,

3:41

but also of its future direction, with

3:44

Stories serving as yet another touchpoint for

3:46

advertisers, and yet another thing that

3:48

Mark Zuckerberg would rip off from people

3:50

who actually build things and have ideas.

3:54

Once Systrom left, Mosseri became

3:56

the head of Instagram, turning it into

3:58

one of the most profitable business units in history

4:01

while destroying its basic functionality as

4:03

an app that showed you photos and videos

4:05

from people you chose to follow, doubling

4:07

down on the algorithm's ability to interrupt and annoy

4:09

you and stop you from

4:11

seeing things that you want to see, pushing

4:13

ads and sponsored content seemingly

4:16

at random. But the one thing you can

4:18

rely on is it would do it a lot. Since

4:21

taking over Instagram, Adam Mosseri

4:23

has, with the direct approval and support

4:26

of Mark Zuckerberg, turned the app into

4:28

a glorified ad network, devoid

4:30

of any ability to innovate with products

4:32

like IGTV and Threads. By

4:34

the way, not the social network. It was a camera

4:36

app built to compete with Snapchat, which

4:39

has also been shut down. Neither of

4:41

them found traction, and every change

4:43

under Mosseri seems to be a direct

4:45

copy of either Snap or TikTok.

4:49

It's also important to remember and know what

4:51

Adam Mosseri is. Adam

4:53

Mosseri is not a creator. He's

4:55

not an engineer. He's not a founder. He's

4:58

a designer that found his way into

5:00

product management, escaping the doldrums

5:03

of actually doing things, into the beautiful

5:06

pantheon of wearing suits

5:08

and yelling at people. Okay, okay,

5:10

I don't know if Adam yelled at people, but he

5:12

definitely annoyed them. And in

5:14

late twenty twenty, he made arguably

5:17

the worst change to Instagram yet, launching

5:19

Reels, a fifteen second video format

5:21

for Instagram built to compete with the ascendant

5:24

rise of the extremely algorithmic

5:26

TikTok. Reels quickly

5:29

became the dominant form of content on both Facebook

5:31

and Instagram, flooding your feed with fifteen

5:33

and eventually sixty second clips

5:35

that automatically play as you scrolled by, each

5:38

one engineered or paid to

5:40

get in the way of things that you actually want to see.

5:43

And I really want to be clear, though there are

5:45

people who are going to say, well, surely,

5:48

surely ed the fact that

5:50

Reels was such a runaway success,

5:52

Well that's proof that it was good, right,

5:56

wrong, horribly wrong. Facebook's

5:59

algorithm controls everything

6:01

with Instagram and Facebook. Now, Systrom's

6:04

worry and Kevin Systrom, the founder of Instagram.

6:06

His worry was that Zuckerberg

6:09

was trying to just turn Instagram into an arm

6:11

of Facebook, kind of a feature

6:14

app, Like you went on Instagram to do things

6:16

with your Facebook account. This is

6:18

exactly what has happened. Instagram

6:20

is just Facebook, but with

6:22

more visual media. It has

6:24

the same logins, it has the same problems.

6:27

It also has no customer support of any

6:29

kind, like every social network. But don't

6:31

worry. Thanks to Adam goddamn Mosseri,

6:34

you can now pay fifteen dollars a month for

6:37

verification on Instagram and Facebook,

6:39

which will also get you access to customer

6:41

support. Great goddamn idea, Adam

6:44

burn in hell anyway, I'm

6:47

not the only one angry with Adam Mosseri. He's

6:49

also one of the least popular tech executives

6:51

in history. And I'm not kidding. I have

6:54

been reading the tech media very deeply. For

6:57

Jesus coming up on sixteen years bloody

6:59

hell. Anyway, I've never seen someone

7:01

this unpopular other than maybe Elon

7:03

Musk, and even then, Elon Musk,

7:06

who's a loathsome individual, has

7:08

far more admirers than Adam Mosseri,

7:10

who mostly spends his time apologizing, No,

7:13

seriously. Since taking over Instagram, he's

7:15

had to apologize for an update that made Instagram's

7:17

feed move sideways, and I'm not kidding about

7:19

that one. He's had to apologize for Instagram

7:22

censoring pro Palestinian content. He's

7:24

had to testify before Congress about

7:26

Instagram's harm to young people, and he

7:28

has had to tell people that Instagram is no longer

7:30

a photo sharing app. What

7:32

does the Graham eh? Anyway,

7:36

He's overseen so many deeply

7:38

unpopular changes that Kim Kardashian

7:41

and Kylie Jenner, who between them

7:43

have over six hundred million followers,

7:45

had to beg him to stop Instagram

7:47

from trying to be TikTok, to which

7:49

he responded that more and more

7:51

of Instagram is going to become video over time,

7:54

and claim that it had cut back on recommended

7:56

content, something I think we can all agree

7:59

was a blatant, fucking lie. I

8:01

find Adam Mosseri very annoying as well,

8:04

because when you watch his videos, he's always going,

8:06

hey, guys, yeah, so yeah,

8:08

So the reason that Instagram's bad now is

8:10

because, uh, and you can see him

8:12

in real time trying to come up with a reason

8:15

why it sucks. That isn't just well,

8:17

it makes it. It makes us so it may it may have made me

8:19

so rich. I have a house in Kensington, now I'm so

8:21

rich. It's so good. He's

8:24

just he's worm like. And

8:26

I don't want to get into personal insults.

8:28

I take that back. Adam Mosseri is not wormlike.

8:31

He is a coward. Though he is the

8:33

reason that Instagram sucks. Now he

8:36

is the architect of the destruction

8:38

of one of the most dominant places

8:40

on the web. These are his decisions.

8:43

Mosseri, like many of the most powerful

8:46

men in tech, is a glorified management

8:48

consultant, incapable of creating anything

8:50

of note. Mosseri has already

8:52

announced that Threads, Meta's dollar-store clone

8:55

of Twitter, is not for news and politics,

8:58

making news organizations kind

9:00

of hesitant to invest in a platform that was

9:02

made by a company that has a rich history of

9:05

screwing over news organizations. Also,

9:08

what the hell do I talk about on social networks,

9:10

then, Adam? No, nothing

9:12

that's happening in the government or the world?

9:15

Well oh wait, let me answer that. On

9:17

threads, what people talk about is whatever's

9:20

making them angry that day, without

9:22

much form or feature, and

9:24

they still talk about politics and news. It's

9:26

just not supported by the algorithm. It's just

9:30

it's the kind of thing that you would

9:32

make if you had no idea how

9:34

social networks worked, but you knew

9:37

product management. If you were like, alright, all

9:39

the people, what do we want to see out of a social

9:41

network? We want to see lots of clicks,

9:44

we want to see lots of scrolling. Yeah,

9:46

we definitely want people interacting and engaging.

9:49

That's what makes good social networking, right.

9:52

And I think we can all agree at this point that Twitter was

9:54

a mistake. It was like a text based platform. I

9:56

don't think Biz Stone and Jack Dorsey are particularly

9:59

gifted product people, but they were

10:01

smart enough to leave it the hell

10:03

alone. If you look at the Twitter files,

10:05

which by the way, are very funny because

10:08

Matt Taibbi sold his soul to

10:10

Elon Musk, the most deceptive man alive

10:12

other than Donald Trump. It rocks. Didn't

10:17

think the leopards would eat your face, did you, mate?

10:19

Anyway? The Twitter files,

10:21

all you can see in there is Twitter's executives

10:24

just being like, Oh, I don't want to touch it, mate, I don't

10:26

want to know. I don't want to if we mess with it it's

10:28

going to make it bad. It's going to break. Everyone will

10:30

be so angry if we do anything. We shouldn't ban this person,

10:33

we shouldn't change this. Threads

10:35

is this weird, hyperoptimized, hyper algorithmic

10:38

crapfest. And all the people on there are

10:40

people who write comments on Instagram.

10:43

It sucks, and there are some good journalists on

10:45

there. I'll pop in for that. But it's a bad

10:47

social network. And it's a bad social network

10:50

because it's made by a guy who doesn't build products,

10:52

unless you think of products as like financial

10:55

vehicles, in which case Adam Mosseri may be

10:57

the most gifted man alive. But

11:01

let's be honest. Nowhere

11:03

is Adam Mosseri's consultant mindset

11:06

more obvious than in his suggested

11:08

plan to deal with Instagram's hundreds of thousands

11:11

of underage users by creating

11:13

a new type of family centered account

11:15

in Instagram that would permit Meta

11:17

to upsell Instagram to children under

11:20

thirteen, a disgusting,

11:22

loathsome program that was planned

11:24

as an alternative to instituting stricter

11:27

registration procedures, according

11:29

to a lawsuit against Meta filed by the

11:31

Attorney General of New Mexico, Yeah,

11:33

Adam, I won't comment without a link because

11:36

I'm scared anyways. This

11:38

is the man running one of the most important tech

11:40

platforms in the world, a man bereft of morals

11:42

or qualifications or even ideas, a

11:45

walking, talking figurehead that

11:47

exists only to spout vague platitudes

11:49

about what social media can or will do, as

11:51

the profits of making it harder to communicate

11:54

with friends and family make him unfathomably

11:56

rich. He comes out every

11:59

so often babbling about how social

12:01

is important, how the changes he's made

12:03

that make Instagram worse are actually good,

12:05

and then disappears up his own asshole. One

12:08

time he responded to something I wrote in The Information,

12:11

and he responded with a bunch of typos,

12:13

which is very funny. But he also responded

12:15

saying that Facebook planned no

12:18

layoffs and Instagram planned no layoffs.

12:20

They laid people off six months later. It's

12:22

just this is the guy. These are the guys

12:25

in power now. Well.

12:27

Much of the blame for the state of Instagram and

12:29

Facebook can obviously be laid at the

12:31

feet of Mark Zuckerberg. It's important

12:33

to remember Mark sucks, and Mark

12:36

is the reason he did the original Facebook

12:38

after he stole it from the Winklevoss brothers. But

12:40

nevertheless, it's also important

12:42

to understand the sheer level of damage

12:45

that Adam Mosseri has done to the world. Instagram

12:48

is now a truly awful

12:50

product. It's terrible, and Mosseri's

12:53

only response to the pain and frustration

12:56

of his users is to tell them that

12:58

he intends to make it shittier. That

13:00

was his actual response when Kylie Jenner

13:03

and Kim Kardashian said, hey, stop

13:05

making Instagram like TikTok, he said, I'm

13:07

actually gonna make it more like TikTok, you fucking

13:09

assholes. Okay, that's not exactly what he said.

13:11

He just said there'd be more video. They've

13:13

claimed they're rolling things back, but it's all nonsense.

13:16

It's all lies. And this is the management

13:18

consultant mindset that dominates tech. They

13:21

trap users in these terrible experiences,

13:24

and they do so because they have giant monopolies,

13:26

and then they make their products worse and worse

13:28

and worse once they know that their users can't

13:31

or won't go anywhere else. And

13:34

what's insane about this is Instagram

13:37

could have probably made Facebook tens of

13:39

billions of dollars without sucking. There

13:41

are honest ways to do a business like this

13:44

if they really actually invested in

13:46

algorithms. And I kind of hinted this in previous

13:48

episodes and made it so it was just

13:51

very good at showing you things you like. Hell,

13:54

they'd make TikTok. Why

13:57

do you think TikTok has done well?

13:59

Because its algorithm, while extremely

14:01

weird and unsettling, is

14:03

actually very good at showing people things

14:06

they'd find interesting. It's invasive,

14:08

it's weird, it's bad. But guess what, Meta's

14:11

worth hundreds of billions of dollars. They're

14:13

putting tens of billions of dollars into the

14:15

metaverse. Still, Reality Labs is still burning

14:17

what ten fifteen billion dollars a quarter or

14:19

a year. It's an insane amount of money.

14:21

How about feeding that into the algorithm so your

14:23

experience doesn't suck ass. I'm

14:27

being vulgar, and I've been quite vulgar in this episode,

14:29

and I apologize, I really do. I shouldn't

14:31

swear so much. My mother tells me this,

14:33

my father tells me this, my shrink

14:35

tells me this. But anyway,

14:39

I just find this all so annoying. I

14:41

find it frustrating because the

14:43

bad guys keep winning and

14:45

the reason they keep winning is nobody points

14:47

at them and says how bad they are, or

14:49

at least they don't do it enough. Really,

14:52

people should walk out of Instagram, and we should

14:54

stop using Instagram and Facebook.

14:57

I think I use Instagram like once

14:59

every couple of days to look at my friend's very

15:01

fat dog, which I do enjoy. I

15:04

might just text him and say, can you just send me a picture

15:06

of your dog? But that's kind of weird. These

15:09

apps have become part of the social fabric,

15:12

and people like Adam Mosseri are aware.

15:14

Same with Mark Zuckerberg. They know exactly

15:17

how well they've done, and they have. They

15:19

are a success story. An evil success

15:22

story, but they're a success story.

15:25

You are on Instagram, you are on

15:27

Twitter, you are on email. You

15:29

are on these platforms because that's where

15:31

people exist online. There

15:35

are people that I can only speak to

15:37

on Instagram. They're just bad at text, but they love

15:39

Instagram, and it sucks. I

15:41

don't want to use these platforms, and I imagine you don't

15:43

want to either, which is why it's important

15:45

to talk about who messed them up. It's important

15:48

to say Adam Mosseri a hundred times

15:50

and keep saying it. That's

15:52

the only way that history will know

15:55

who has done this damage. And

15:58

these management consultant types

16:00

they're everywhere, They're

16:03

everywhere, and they're inspiring people to be

16:05

like them, to be ruthless assholes,

16:08

to be terrible product developers,

16:10

but excellent businessmen. And there is

16:12

a middle ground, a sustainable ground,

16:15

one which doesn't involve burning cash or

16:17

burning customers or in the case

16:19

of Instagram, shortly adding generative

16:22

AI to everything, to

16:24

Facebook, to Instagram. And by

16:26

the way, here's a crazy story for you.

16:29

Recently, a generative AI on

16:31

Facebook responded to a group saying

16:33

that it had a gifted child. And

16:36

you may think I'm mistaking something. No, no, it

16:39

said this whole thing about having a gifted child.

16:41

This is what happens when people who don't know

16:43

product, who don't build anything, who don't

16:45

understand anything, get the keys

16:48

to the kingdom. They fuck it up and

16:51

they'll keep doing so, and

16:54

their massive success only serves

16:56

to inspire generations of useless

16:58

founders to prioritize creating profitable

17:01

pain boxes over useful products. And

17:03

as we speak, the most quirked up loathsome

17:06

one of them all is rising to power. I'm

17:09

talking about Sam goddamn Altman. Now

17:21

if you don't know who that is. Sam Altman

17:23

is the CEO of OpenAI, which is

17:26

a very unprofitable revenue

17:28

generating company building software

17:30

to do something, and generative AI might

17:32

do something. You should go back and listen

17:34

to the episodes about that. But

17:37

I want to tell you how Sam

17:39

Altman got started, and I want

17:41

to let you know how shit

17:43

Sam Altman has been at his job in the past.

17:46

In two thousand and five, Altman, a Stanford

17:49

dropout, co-founded a company called Loopt.

17:51

That's L-O-O-P-T, that's what companies

17:54

were called back then, and they were a location

17:56

based social networking app that raised over

17:58

thirty million dollars from the likes of Y

18:00

Combinator and VCs like Sequoia

18:03

Capital and NEA. Seven years later,

18:05

Altman would flog Loopt to a publicly

18:07

traded financial services company called Green

18:09

Dot, best known for their prepaid cards,

18:11

for a remarkable forty three point four

18:13

million dollars, despite the fact that the app

18:16

didn't find traction or revenue.

18:19

Altman got quite rich from the Loopt deal,

18:21

despite the fact that a group of senior employees

18:23

urged the board on two separate occasions

18:26

to fire Altman for what they described as

18:28

deceptive and chaotic behavior. According

18:31

to The Wall Street Journal, Altman

18:33

would almost immediately become a partner

18:35

at Y Combinator, surprising a lot of people

18:38

after working there part time before

18:40

being made president by co founder Paul

18:42

Graham in twenty fourteen. Yet,

18:44

behind the scenes, according to reporting

18:47

by Elizabeth Dwoskin and Nitasha

18:49

Tiku of The Washington Post, Altman

18:51

was well known, and I quote, for

18:53

an absenteeism that rankled his peers

18:56

and some of the startups he was supposed to nurture.

18:59

Altman was also double dipping in Y Combinator

19:01

startups by investing through Alt Capital,

19:04

a venture firm founded with his brother Jack,

19:06

with one source quoted by the Post describing

19:08

Altman's tenure as the school

19:10

of loose management that's all about prioritizing

19:13

what's in it for me. Altman

19:16

became wildly rich during his tenure at

19:18

y Combinator, using his connections

19:20

and his cult of personality to make early

19:22

investments in companies like Gusto and Optimizely,

19:25

which was acquired in twenty eighteen for one

19:27

point one six billion dollars, and

19:29

Patreon and Asana and

19:31

Reddit probably made a couple hundred

19:33

million right there. Really depressing. In

19:36

twenty fifteen, he founded OpenAI, at

19:39

the time, a nonprofit organization dedicated

19:41

to building responsible artificial intelligence

19:43

applications. Yet

19:45

it's really important to

19:47

note that Altman is not, and was not

19:50

ever an engineer or a technologist. He

19:52

did code, but he was, in

19:54

every case, from what I can find, a figurehead

19:56

and a fundraiser that was able to convince

19:59

actual academics and engineers like Durk

20:01

Kingma and Wojciech Zaremba

20:04

and Ilya Sutskever to do the actual

20:06

work at OpenAI, while he sent masturbatory

20:08

emails to Elon Musk, who only ever donated

20:10

fifty million dollars of the one hundred million he

20:12

claimed he'd invested in OpenAI. You

20:14

know what the thing is, Sam, at

20:17

the very least, can you make sure that Elon Musk pays

20:19

up? Are you that much of a count anyway?

20:22

I really want you to remember that Sam Altman

20:24

was an absentee parent for the first few years

20:26

of OpenAI. He split his efforts

20:29

in actually a way that was very

20:31

similar to Elon Musk, across

20:33

multiple other investments and enterprises

20:36

like the two funds he'd built inside of

20:38

Y Combinator for him to run. In

20:40

twenty nineteen, according to reports at the

20:42

time, Altman would step down from Y Combinator

20:45

amid a series of changes at the accelerator,

20:48

a story that much of the industry

20:50

press just simply chose not to look into.

20:53

Though I will give Eric Newcomer some credit

20:55

for calling people out for this, kind

20:58

of in vain. Yet the Washington Post's

21:00

reporting revealed that Y Combinator founder

21:02

Paul Graham, best known for writing

21:04

extremely annoying tweets and very

21:06

long and quite boring blogs. That's my job,

21:08

Paul. Anyway, he flew from the United

21:11

Kingdom to San Francisco to personally

21:13

kick Sam Altman out, though he blamed his wife

21:16

for some reason, and anyway due

21:18

to Altman continuing to focus on his

21:20

own personal projects and press over

21:23

his thing at Y Combinator. This

21:26

is the story of the man whom

21:28

New York Magazine called the Oppenheimer

21:30

of our Age in a meandering

21:32

piece that frames Altman's vagueness about

21:35

AI as some sort of big secret,

21:37

a hidden truth, when I think the truth

21:39

might be far simpler. Sam Altman

21:41

is yet another fucking management consultant.

21:44

In a piece published in early twenty twenty one,

21:47

Sam Altman proposed the concept of Moore's law

21:49

for everything, referring to the principle that

21:51

the number of transistors on an integrated

21:53

circuit doubles approximately every two years,

21:55

leading to an exponential increase in computing power in

21:58

the process. And if I got that wrong, I

22:00

just I think it's okay anyway. Moore's

22:03

Law for Everything is, in essence a utopian

22:06

take on the impact of AI, noting

22:08

that as machines usurp the roles

22:10

of humans in the supply chain, the prices

22:12

of goods and services and thus the cost of living

22:15

will go down exponentially, something that has

22:17

been proven wrong by a lot of history.

22:19

The problem with this piece is it's kind

22:22

of like Sam Altman in that it's a deeply

22:24

complex bucket of nothing. It's

22:26

an extremely verbose screed

22:28

that says things like AI will lower the cost

22:30

of goods and services because labor is the driving

22:32

cost at many levels of the supply chain, and

22:35

suggests things like that we should tax

22:37

capital rather than labor by creating an American

22:39

equity Fund where companies are forced to give up a percentage

22:42

of their shares to a nationally incorporated venture fund,

22:44

an idea that makes sense if you're in the business

22:46

of flogging private companies to public companies.

22:49

Moore's law for Everything is a remarkably

22:52

telling piece in that it frames Altman's

22:54

worldview as one where the only thing

22:56

that can save mankind is startups,

22:59

and those should be funded in the

23:01

way that Sam Altman likes, and

23:03

this piece feels a lot like everything

23:05

in Altman's universe. It never actually

23:08

connects Moore's law to anything, which

23:10

I should add isn't so much of a law

23:12

as an observation, and has long ceased to be

23:14

relevant as the shrinkage of transistors has

23:17

slowed in recent years. In many

23:19

respects, this comparison

23:21

is actually eerily prophetic given the slow

23:23

down and even regression of improvement we've

23:25

seen in large language models like ChatGPT,

23:29

And much like ChatGPT, Altman

23:31

is capable only of loosely approximating

23:34

the output requested, because, at his

23:36

core, he lacks any kind of substance

23:38

or technical history that you'd need

23:41

to do so. Like Chat

23:43

GPT, he's a very

23:45

intelligent know nothing that, through deterministic

23:48

measures, completely detached from

23:50

the meaning of the underlying ideas, picks the

23:52

right words to say at the right time. And

23:55

by the way, this is the man

23:57

selling the artificial intelligence dream.

24:00

He's a salesman, and he's a salesman

24:02

capable of superficial connections between

24:05

ideas in a way that's initially satisfying

24:07

as long as you just don't think about it too much.

24:10

Altman's famed startup playbook, which was

24:12

published in twenty fifteen, is full of the kind

24:14

of obvious yet satisfying crap

24:17

that you'd expect. He extols

24:19

the virtues of being flexible yet rigid,

24:21

advises that you talk to your users and watch

24:23

them use your product, which is an exact

24:26

quote, and he says that you should

24:28

try to improve your product five percent

24:30

each week. These are

24:32

the kind of things that are very useful, genuinely

24:34

to an early twenties founder, and super

24:37

impressive to a mid fifties white venture

24:39

capitalist that doesn't remember the last time

24:41

they worked a job that wasn't ten hours a week

24:43

of investing in Chuntley the SaaS for dog

24:45

breeders. They're the tech equivalent

24:48

of Live, Laugh, Love. It

24:51

does, however, at one point,

24:53

betray Sam Altman's real mindset

24:56

that the only universal job description

24:58

of a CEO is to make sure that

25:00

the company wins. Almand's

25:04

material contributions to open Ai kind

25:07

of hard to nail down. While it's unfair

25:09

to judge someone entirely by their emails.

25:11

Those that I can find, such as the ones

25:13

from Elon Musk's lawsuit against OpenAI,

25:16

feel like they could be from any other managerial

25:19

huckster, and they feel kind

25:21

of as specious as Elon Musk's. And

25:25

by the way, you're able to compare those because

25:27

when Elon Musk sued OpenAI, Open

25:29

AI published a bunch of emails, and he and

25:31

Altman are the same guy. They're both just sitting there going,

25:34

yes, yes, the future will be very good. It'll be very important

25:36

that we have technology in the future. Yes, AI will

25:38

be able to grow. And Sam Altman's like, yeah, dude,

25:40

yeah, that's great. That's crazy man. It's

25:43

like Joe Rogan, except they're worth

25:45

billions of dollars. Altman

25:48

blathers on about governance structures

25:50

and how OpenAI needs to create

25:53

the first general AI and use it for individual

25:55

empowerment, which he defines as

25:57

the distributed version of the future that seems

26:00

the safest. Like I said, Musk

26:02

and Altman are very similar creatures, managers

26:04

wearing engineering costumes, and both are credited

26:07

as having expertise in AI without

26:09

actually appearing to have written a single

26:12

line of code in a decade. And let

26:14

me tell you something about most of

26:16

the guys who actually work deep

26:18

in AI. You know what, They've got academic

26:21

papers, they have actual published

26:23

things they've done. They're not afraid

26:26

to share their code. They're

26:29

not asking, as Elon Musk did when laying

26:31

people off at Twitter for people's most salient

26:33

lines of code, because they actually know how code works.

26:36

I, by the way, do not. I'm not going to

26:38

pretend I do. But I'm also not there

26:40

selling you the future of AI. From

26:43

what I can tell, Altman has, like Prabhakar

26:45

Raghavan, the villain of the last episode,

26:48

failed upwards. He

26:50

was a constant source of frustration at Loopt

26:53

due to his pursuit, and I shit

26:55

you not, of side projects,

26:57

with the Wall Street Journal reporting late last

26:59

year that Altman once diverted engineers

27:02

to work on an unnamed gay dating app.

27:04

As I previously noted, Altman was

27:06

fired from Y Combinator for his absenteeism

27:09

and I quote reputation for favoring

27:11

personal priorities over official duties,

27:14

and the Wall Street Journal reports that by early

27:16

twenty eighteen, a year before he was fired,

27:18

Altman was barely present at Y Combinator's

27:21

headquarters, spending more time at OpenAI,

27:23

which rankled longtime partners at Y Combinator

27:26

who began losing faith in him as a

27:28

leader. The

27:41

Journal's piece does reveal a little

27:43

more about why Altman was fired as

27:46

CEO of OpenAI, something that happened

27:48

last November and was very weird

27:50

if you didn't see it happen. He was fired

27:52

for like three days, and

27:54

then a bunch of managers

27:56

like Reid Hoffman and Brian Chesky of Airbnb,

28:00

Satya Nadella, the king of managers, the CEO

28:02

of Microsoft, got together and bullied

28:04

a nonprofit into putting him back. Super

28:06

happy story, But one of

28:09

the reasons that he was fired was because

28:11

Sam Altman is a pretty atrocious

28:13

manager. Founding

28:15

and sadly now former OpenAI

28:17

board member Ilya Sutskever described

28:20

to the board when calling for Altman's

28:22

removal, a long running pattern of

28:24

Altman's tendency to pit employees against

28:26

one another, or promise resources

28:28

and responsibilities to two different executives

28:31

at the same time, yielding conflicts. More

28:34

worryingly, the Journal reports that other

28:36

members of the board had heard similar concerns

28:38

from senior OpenAI executives.

28:40

By the way, anyone who

28:43

brought Sam Altman back is a

28:45

scumbag, we all know it. They

28:48

also feared that Sam Altman would use his influence

28:50

in Silicon Valley once fired, something

28:52

that almost immediately came true

28:54

when Sam Altman ran as I mentioned to

28:57

Brian Chesky of Airbnb, who then

28:59

called Satya Nadella of Microsoft, which

29:01

sparked a chain of events that restored Altman

29:03

as CEO of open Ai and led to

29:05

the removal of the non-believers like Ilya Sutskever

29:08

from the board entirely. This

29:11

this is Silicon Valley's king. This

29:14

is the guy that people think is

29:16

the Oppenheimer of our age. This

29:20

is the king of Silicon Valley now, a

29:22

multi billionaire who's actually a lobbyist

29:25

role playing as a founder, a diplomat

29:27

masquerading as a technologist, A charming,

29:30

capricious, abusive, and untrustworthy

29:33

man that has proven time and time

29:35

again that his only reliable trait

29:38

is that whatever happens must benefit

29:40

Sam Altman. This also

29:42

explains why so little of Sam Altman's

29:44

promises about AI makes sense and

29:46

why OpenAI has been so unashamed

29:49

in steamrolling and plagiarizing the entire

29:51

world. Altman has created

29:53

nothing other than wealth for himself

29:55

and other rich guys, helping elevate

29:58

and protect existing power structures and the

30:00

ideologies of men like Microsoft

30:02

CEO Satya Nadella and LinkedIn

30:04

founder and career manager Reid Hoffman.

30:07

And let's not forget OpenAI's new board

30:09

member Larry Goddamn Summers, and

30:12

it all kind of makes sense why generative AI

30:15

doesn't really help anyone other than

30:17

those who want to sell something. When

30:19

the center of attention at a company

30:22

isn't really on the product or the tech, but

30:24

the idea of what the product could do, very

30:26

little about the company's culture is focused

30:28

on building useful things for real people.

30:32

When leadership is dominated by managers

30:35

that haven't touched a line of code in decades or

30:37

talked to a normal person in decades, nobody

30:39

steering the ship has the ability to judge whether

30:41

software is good or useful, or valuable,

30:44

or does anything other than help

30:46

you raise venture capital, of course, and

30:50

this is the direct result of

30:52

Silicon Valley's corruption by the managerial

30:54

sect. While Prabhakar Raghavan

30:56

may be a decorated computer scientist

30:59

and academic, he arguably oversaw

31:01

the destruction of Yahoo, formerly one

31:03

of the Web's most dominant search engines, and

31:06

failed upwards into a managerial role

31:08

that allowed him to take over and now arguably

31:10

ruin Google's search product, chasing

31:13

away Ben Gomes, a hero and

31:15

a man responsible for actually building things.

31:18

Adam Mosseri was and always

31:20

will be a manager making calls about

31:22

products he's had no hand in building, and

31:24

has architected the outright destruction of

31:27

a social network used by billions of people.

31:30

And Sam Altman, a career

31:32

failure famous for making himself

31:34

rich and popular and upsetting and

31:37

hurting the people he works with, is

31:39

on course to become the most toxic manager

31:41

of them all. If left unchecked,

31:44

OpenAI will perpetuate one of the largest

31:46

thefts in history, looting the Internet and

31:48

using it to train models that have yet to prove

31:50

their necessity, other than as a symbol

31:52

that Silicon Valley has still fucking

31:54

got it, even though it

31:57

unquestionably doesn't when it comes

31:59

to generative AI yet, because

32:01

Altman, like every manager, is so

32:03

thoroughly divorced from actual production, he's

32:06

only succeeded in generating unsustainable

32:08

hype and making vague promises that

32:10

the people who do the actual work at

32:12

OpenAI likely know that they

32:15

can't keep. And it's

32:17

frustrating because, as I said before,

32:20

bad guys keep winning. There are people

32:22

in Silicon Valley making real products.

32:24

There are people out there who are doing

32:26

good things, but Silicon

32:29

Valley will continue to suffer as long

32:31

as we entrust the future to management consultants

32:33

and showmen who don't build things. Just

32:36

look at Humane, a company that raised

32:38

hundreds of millions of dollars to make a voice

32:41

activated AI powered pin that the

32:43

ultra popular YouTuber Marques Brownlee

32:45

called the worst product he'd ever reviewed.

32:48

One might wonder how a company would willingly

32:50

launch a seven hundred dollars product that overheated

32:53

within minutes of use and repeatedly failed

32:55

to answer basic queries. And

32:57

the answer is actually really simple. It was

33:00

founded by Bethany Bongiorno, a former

33:02

management consultant at PwC, and Imran

33:05

Chaudhri, a former director of design

33:07

from Apple, who refers to himself as an inventor

33:10

and innovator that was fired in twenty

33:12

seventeen for sending out an email about

33:14

his planned exit from the company that suggested that

33:16

Apple could no longer innovate. Well,

33:20

mate, look at the Humane Pin. Do you think

33:22

you innovated? Just to be clear, if you haven't

33:24

seen this thing, you should really look it up. It's really funny.

33:26

It clips on and you are meant to use

33:29

it to project a laser thing onto

33:31

your hand to make phone calls or take

33:34

photos. It's got a little camera in it. Here's

33:36

the problem. It overheats after a few minutes

33:38

because of the laser. And on top of that, the

33:40

queries don't work half the time. And when they

33:43

do, who really cares? It's

33:45

just ChatGPT, except seven hundred

33:47

dollars with a twenty four dollars a month subscription.

33:51

And this is what happens when you're insulated from

33:53

real people's problems, and when you don't participate

33:55

in the process that might actually solve them, you

33:58

become fundamentally disconnected from

34:00

any real value creation. Silicon

34:03

Valley is atrophying as a result

34:05

of lazy, disconnected vcs and

34:07

power players elevating men like

34:10

Sam Altman and incumbents

34:12

helping career consultants dictate the actions

34:14

of those who actually build software and hardware.

34:17

If the tech industry wants to escape

34:19

the public's ire, it should push back

34:22

against this managerial poison and

34:24

talk to real people with real problems

34:26

and focus on solving those before

34:28

creating yet another growth centric bullshit

34:31

machine. And if the Valley

34:33

really wants to change, it needs to

34:35

stop empowering those who have failed

34:37

upwards just because they say the right

34:39

things. It feels good.

34:41

I get it that we have a guy out there

34:44

who's saying, yeah, I can help

34:46

the Valley be worth money. But as

34:48

we speak, Nvidia's down ten percent by

34:51

the time this episode's out, I truly don't know where

34:53

it will be. But I'm worried. I'm

34:56

worried that the tech industry is going to

34:58

start sputtering because everyone's

35:00

put their eggs in the AI basket. But

35:02

I do have some hope. A

35:04

lot of the startups I talked to are only

35:07

slightly touching generative AI. They

35:09

don't seem firmly embedded in it. Maybe

35:12

this is just anecdotal, Maybe I just know good

35:14

people. I don't know, but my thought

35:16

here is the fact that the zero interest

35:19

rate generation has kind of ended, that venture

35:21

capitalists can't just get hundreds of billions

35:23

of dollars quite so easily, means

35:26

that they're not so quick to invest

35:29

in this stuff. But I

35:31

think a reckoning's coming, and

35:33

I don't know if it will be from people, but

35:35

I think it's going to be kind

35:37

of a larger effect. It's going to be one where

35:40

you see that these products just don't get adopted,

35:42

like I mentioned in the AI episode, and

35:44

I think you're going to see these big, nasty,

35:46

overfunded consumer AI companies

35:49

fall apart. But like I said before,

35:52

I'm afraid that OpenAI is going to start falling

35:54

apart, even if it is mostly part

35:56

of Microsoft. I'm afraid of the knock

35:58

on effects on the stock market. And I know, oh,

36:01

the stock market, only rich people play it.

36:04

No, that's people's pensions, as

36:06

regular people do actually invest in the stock market,

36:08

and regular people watch Jim Kramer as

36:10

he goes you need to invest in AI. That

36:13

man does not know a goddamn thing. By the way,

36:16

a lot of people who've responded to my AI

36:18

episode have said, yeah, it's good that AI's failing. It's

36:20

good that the tech industry is falling apart, and

36:23

it feels good to see bad people fail.

36:25

But the thing I need to caution you about is management

36:28

consultants are also really good at avoiding

36:30

blame. Prabhakar

36:33

Raghavan, he destroyed

36:35

Yahoo, or at least watched it happen, and

36:37

he's now the head of Google Search and he's destroying

36:39

that too. Sam Altman has messed

36:41

up so many times, and

36:44

yet here he is. He's the king of Open

36:46

AI. He gets these glossy stories. He can

36:48

be interviewed by anyone anywhere. These people

36:50

keep winning, but you want to know how

36:52

they get defeated? Sunlight. Talk

36:55

about them. Talk about Adam Mosseri,

36:58

talk about Sam Altman. Tell these stories

37:00

to your friends, say the name Prabhakar

37:02

Raghavan as much as you can.

37:05

Blame these people for their actions.

37:07

I can't say it will change much. But

37:10

at least the wider society will

37:12

know who to blame for destroying

37:15

the Internet. Thank you for listening.

37:27

Thank you for listening to Better Offline. The editor

37:30

and composer of the Better Offline theme song is

37:32

Matt Osowski. You can check out more

37:34

of his music and audio projects at Mattosowski

37:36

dot com. M A T T O

37:39

S O W S K I

37:41

dot com. You can email me at ez

37:43

at better offline dot com or check out better

37:46

Offline dot com to find my newsletter and

37:48

more links to this podcast. Thank you so much

37:50

for listening. Better Offline

37:52

is a production of cool Zone Media. For more

37:54

from cool Zone Media, visit our website

37:57

cool Zonemedia dot com, or check

37:59

us out on the iHeartRadio app, Apple Podcasts,

38:01

or wherever you get your podcasts.
