Episode Transcript
0:00
It's time for TWiT, This Week in
0:02
Tech. Great panel for you. Iain Thomson
0:04
is up in the studio for us, the
0:06
U.S. editor for TheRegister.com. Our AI
0:09
guru and the author of the AI
0:11
analyst newsletter, Ben Parr, is here.
0:14
And from Blackbird AI, Dan Patterson,
0:16
his job is to
0:18
use AI to debunk AI
0:20
misinformation. Of course, AI, the
0:22
topic, the announcements from OpenAI
0:25
on Monday, ChatGPT-
0:27
4o, Google's
0:29
announcements at Google I.O., the firing,
0:31
or is it the departure of
0:34
the trust and safety team
0:36
at OpenAI? A lot of AI in the
0:38
news, plus a lot more. You stay here.
0:40
This week in tech is next. Podcasts
0:45
you love. From people you
0:48
trust. This is
0:50
Twit. This
0:58
is Twit. This week in tech, episode 980,
1:00
recorded May 19th, 2024: Head, Deck,
1:02
Lead and Nut Graph. It's
1:13
time for TWiT, This Week in Tech, the show where we cover the week's
1:16
tech news. Lo and behold, here he is sitting right
1:18
next to me, my good friend Iain
1:20
Thomson, who often fills in for me when I'm not around,
1:22
but it's good to get you up here when I'm around.
1:24
Nice to get out and about again. Too
1:29
much of this has been done remotely and obviously geographically,
1:31
yes. Well, we couldn't
1:33
even start Twit until I realized that
1:35
we could do Zoom, or actually at the
1:37
time it was Skype, because nobody
1:39
is near Petaluma. Oh,
1:42
except you. It's 40 minutes up the
1:44
road. Yeah, it's just a nice drive.
1:47
But fortunately, we can get people on
1:49
from all over the world. Just
1:51
a side note. My God, how Skype
1:54
fumbled that bag still just boggles me.
1:56
Yeah. I mean, that is Ben Parr,
1:58
ladies and gentlemen, author of The AI
2:00
Analyst newsletter, co-founder of Octane AI;
2:02
he's written for The Information, has been
2:04
a long-time friend of the show
2:06
as well, and he's got a little
2:08
Octane AI doohickey over his
2:11
right shoulder. It's a little
2:13
robot. Or a whole lot
2:15
more than a robot. There's a
2:18
tiny robot on here,
2:22
and I've also got a big
2:24
robot. Well, we know
2:26
AI has got this
2:29
bragging point. So what does
2:31
Octane AI do? It
2:34
is our AI for
2:37
thirty-six hundred e-commerce brands,
2:39
so think, like, you go to a
2:41
Jones Road Beauty, or you
2:43
go to buy something, and
2:45
you're like, I can't tell which one to
2:47
buy, which skincare routine to get, and the
2:49
AI recommends products. I only have one
2:52
question... I should
2:54
really ask my, my
2:56
little, my little friend, but I'll ask
2:58
her in a moment. I did
3:00
neglect to mention you're the U.S. editor
3:02
for The Register. I've scattered credits now,
3:05
so we've got two journalists. Well, I
3:07
guess, giving myself a little credit, we've got
3:09
two journalists and some guys on the podcast
3:12
thing. And then we've got two
3:14
AI experts, because Dan Patterson is also
3:16
here! Hi Dan, good to see you.
3:19
It's great to see you. It's been, I think,
3:21
two and a half decades. I was
3:24
a journalist. A former journalist, or are we
3:26
both journalists? Let's not say former, Leo;
3:28
I'd say we just have different
3:30
jobs now. If Dan wanted
3:32
to go write a piece or a
3:35
column somewhere, I am sure there are
3:37
bunches of publications that would likely run it.
3:39
Once a journalist, always a journalist,
3:41
even when working in a business. And,
3:43
Dan, what is Blackbird? Tell
3:46
us about Blackbird. You told us last
3:48
time you were on, but give us a
3:50
refresher. So it
3:52
is a combination of artificial intelligence
3:56
and human threat analysts to find
3:59
misinformation and disinformation. We call them
4:01
narrative attacks; it's a little
4:03
easier for people who are not
4:05
really dialed into
4:07
what misinformation and disinformation are,
4:09
especially when you look at
4:12
some of
4:14
the ways, ah, social
4:16
media and mis- and disinformation can
4:18
be applied in really hostile ways.
4:20
Take Maria Ressa: she was
4:22
a journalist who was the
4:24
CEO of Rappler.com in the
4:26
Philippines, and she was attacked by
4:28
Duterte, the president of
4:30
the country. She won a Nobel
4:32
Peace Prize for her journalism,
4:34
but she was just kind of
4:36
the definition of a narrative attack;
4:38
we might call
4:40
it doxxing, or other types of
4:43
what happened to her. She's
4:46
now in the States, she works at
4:48
Columbia. She's a wonderful person, and, yes,
4:50
she's, she's doing great. But that is
4:52
just one example of how misinformation and
4:56
disinformation can be
4:58
weaponized. As we just
4:58
we know about that, and we're
5:00
going to see a lot more
5:02
of it in the next few months,
5:04
I'm sure, with the US election.
5:06
But AI is
5:08
our top story, as it
5:10
has been for every other show
5:13
getting through the week,
5:15
with OpenAI announcing
5:17
its ChatGPT-4o,
5:19
for Omni, and the
5:21
new Her-style assistant, as the
5:23
tech press have put it. And
5:26
then the following day, and I think OpenAI
5:28
may have scheduled this, you know
5:31
what they call that? Opposition
5:33
scheduling? Adversarial scheduling? Google's,
5:35
Google's developer conference,
5:38
Google I/O, was Tuesday, and they
5:40
announced a lot of AI.
5:43
In fact, as Sundar
5:45
said, and meanwhile, to
5:47
save you some time, we
5:49
had our AI counting how many times they said the
5:51
word AI. It's a hundred and twenty,
5:53
and then Sundar went on, twenty-one, as
5:55
he said it again. So a lot of AI,
5:58
in the sense that they spray AI on
6:00
a lot of things.
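A quick aside for anyone who wants to replicate the word count: tallying "AI" mentions in a keynote is a few lines of Python once you have a transcript. A minimal sketch, assuming the keynote transcript is saved locally as keynote.txt (a placeholder filename):

```python
# Count whole-word occurrences of "AI" in a keynote transcript.
# keynote.txt is a placeholder; any plain-text transcript works.
import re

with open("keynote.txt", encoding="utf-8") as f:
    text = f.read()

# \b word boundaries keep "AI" from matching inside words like "said";
# staying case-sensitive avoids counting "ai" inside ordinary words.
count = len(re.findall(r"\bAI\b", text))
print(f"'AI' was said {count} times")
```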
6:05
Let's start with ChatGPT-4o. There
6:07
have been some people who have said AI is
6:09
at a standstill, that we're just seeing different
6:12
flavors, but we're not seeing great progress.
6:15
Actually, Steven Levy wrote an interesting piece this
6:18
week in Wired in
6:20
which he said that's a mistake, just
6:23
like it was a mistake for Clifford Stoll to say
6:25
the Internet was never going anywhere,
6:28
in Newsweek, back in
6:30
the nineties. Bill
6:33
Gates famously described it as being like CB radio at
6:35
one point in one of his books. He
6:38
pulled that one back. We all have handles now. Now
6:40
it's the other side of, yeah, it's true that
6:42
he was right about in that respect. But
6:45
now it's the other side where people are,
6:47
everybody's hyping up AI. Steven
6:50
Levy's article, it's time to believe
6:52
the AI hype. Some
6:54
pundits suggest generative AI stopped getting
6:57
smarter, but the
6:59
explosive demos this week from OpenAI and Google
7:01
that started the week show there's plenty more
7:03
disruption to come. And here's a
7:05
picture of the IKEA set that, actually
7:07
I'm very jealous of the IKEA set that
7:09
OpenAI used on Monday. Sam Altman did
7:12
not make an appearance on Monday. The
7:14
keynote was led by the woman
7:16
who was going to be the CEO of OpenAI
7:18
when Sam was briefly fired. But
7:21
I think they did quite a good job of
7:25
showing off this Scarlett
7:28
Johansson sound-alike. Well,
7:30
I mean, it was quite interesting because we got in
7:32
contact with, Reuters did a story predicting that they were
7:34
going to make a big announcement on Monday when actually
7:37
we got in contact. And
7:39
they were just like, no, we're not launching
7:41
GPT-5, we're just managing expectations. It's just going
7:43
to be a product update. There's nothing to
7:45
panic about. So they seem to be
7:47
managing expectations downwards, but it's a very interesting system. I'm
7:49
curious to see what the other panelists think. Well, the
7:52
way they manage the expectations, especially
7:54
with the help of Sam Altman, Is
7:57
kind of intriguing, because initially,
8:00
Sam said, nope, not
8:02
GPT-5. Ah,
8:05
nope, not... what
8:07
was the other possibility? A search
8:10
engine. And he answered, no, not a search
8:12
engine, that's right, and not something
8:14
else. And then
8:17
it was something else. It was
8:19
her, which he tweeted
8:21
the following day. One word: Her.
8:25
I think they've done a masterful job
8:27
of manipulating expectations at
8:30
times, really, including, and we'll talk about
8:32
this later on the show, with
8:34
the safety board and
8:36
other components of their business, and
8:39
I'm not saying that in a negative
8:42
context, but they have really
8:45
manipulated expectations and the media for
8:47
quite a while. Yeah, and
8:49
we are going to get to the
8:51
safety board, because that's the issue
8:53
that dropped this week, that they
8:55
quit. But I will say, Ilya's tweet,
8:57
it was like, you know,
8:59
I'm quitting OpenAI as
9:01
well. As in, I'm quitting,
9:03
and there's bad news coming, so just
9:05
drop some good news right after it,
9:07
right? I mean,
9:09
it was also inevitable,
9:12
after what happened last year, that Ilya and
9:14
others would leave. And it was at exactly the six-month
9:17
mark, I believe. Oh, when
9:19
are you... does that have to do with vesting?
9:21
No,
9:23
it's just standard
9:26
PR tactics: don't announce
9:28
a departure that's really big
9:30
immediately. Give it some time, and then
9:32
it doesn't get associated in the same
9:34
way. Six months sounds right. But before we
9:36
go back to that, yeah, I do
9:39
want to get to the safety stuff, but
9:41
let's at least finish this idea. Steven
9:43
Levy says that, ah,
9:46
the AI hype, believe the
9:48
hype. Did we see something
9:50
truly new on Monday with
9:52
OpenAI's 4
9:54
Omni? Well,
9:57
I think we did see something. So, like,
9:59
okay, GPT-4o
10:01
is faster and better, and this stuff is happening,
10:03
and I want to remind everyone it's been a
10:06
year and a half since ChatGPT came out,
10:09
and you know people are resetting their
10:11
expectations of how quickly they think stuff
10:13
should move. It's insane how much stuff
10:15
has moved. The big thing here for
10:17
the, like, Her-style assistant is
10:19
that it can understand interruption. You
10:22
can interrupt her in the middle of a conversation like you
10:24
would a human and she can shift.
10:26
And this is the V1 of it. It's
10:29
only going to get better. The
10:31
big new feature of artificial
10:33
intelligence is you can interrupt it.
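To make the interruption point concrete: the core of any barge-in design is racing audio playback against voice-activity detection and cancelling whichever side loses. Here's a toy sketch of that control flow; the audio functions are hypothetical stand-ins, since OpenAI hasn't published how 4o's voice mode actually does it:

```python
# A toy sketch of barge-in: play the reply and listen at the same time,
# and if the user starts talking, cancel playback mid-sentence.
import asyncio

async def play_reply(text: str) -> None:
    for word in text.split():          # stand-in for streaming TTS playback
        print(word, end=" ", flush=True)
        await asyncio.sleep(0.2)

async def wait_for_user_speech() -> str:
    await asyncio.sleep(1.0)           # stand-in for voice-activity detection
    return "wait, actually, change of plan"

async def converse(reply: str) -> None:
    playback = asyncio.create_task(play_reply(reply))
    barge_in = asyncio.create_task(wait_for_user_speech())
    done, pending = await asyncio.wait(
        {playback, barge_in}, return_when=asyncio.FIRST_COMPLETED
    )
    for task in pending:               # stop whichever side lost the race
        task.cancel()
    if barge_in in done:               # user interrupted: feed their words back in
        print("\n[interrupted by user]:", barge_in.result())

asyncio.run(converse("Here is a long, detailed answer the user may cut off early."))
```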
10:37
I always like to argue here that just
10:39
as much as you can tell me
10:41
stuff, it's huge though. Interface really
10:43
matters because think about when you're trying to talk
10:46
to Siri and you're just like, no,
10:48
change the thing. It doesn't work. You try that
10:50
with Alexa or anything. This
10:54
is straight up full like think of it less
10:56
interruption, full conversational. Like this
10:58
is like the step towards just straight up full
11:00
on conversation back and forth with an AI as
11:03
if it were a human. It is
11:05
much bigger than people think. It's not like it's
11:08
somewhat smarter. It's not the
11:10
quality of the answer. Correct. It's faster.
11:12
That's a huge difference. It makes us
11:15
anthropomorphize. That's a big problem. But it
11:17
doesn't necessarily – look, I mean this
11:19
is really impressive
11:22
and fun technology to play with.
11:24
My initial reaction to
11:26
this was I don't know
11:28
if this is – none of us know if this
11:30
is new. This is a new product that has been
11:32
announced but the capabilities might have been around for a
11:34
while. So we don't
11:36
really know how fast the
11:39
capabilities of the technology are moving.
11:43
And when we see something like this – Leo,
11:45
I've said this on some of your shows
11:47
before so forgive me. Always
11:50
repeat yourself because I have no memory. I
11:53
interviewed Nick Bostrom in 2014 or 15 when I was
11:55
working for Jason Hiner at ZDNet. I'm
12:00
paraphrasing him, but one of the things he
12:02
told me was that one of the
12:04
great dangers of AI
12:06
is that we will anthropomorphize it.
12:08
That's also one of the
12:12
great benefits and one of the things
12:14
that we really could get a lot
12:16
as humans out of artificial intelligence. But
12:18
he warned me that one of the
12:20
things that we will do, just like
12:22
when we look at a cloud and
12:24
we see a face, we will have
12:26
this impulse to think of AI like
12:28
talking to another person. In fact, when
12:30
we say something, a pronoun like her,
12:32
that is a human
12:35
pronoun. So we are anthropomorphizing this technology
12:37
and what I saw in OpenAI this
12:39
week was truly impressive. It was a
12:41
lot of fun to play with, but
12:44
it really pushes into
12:46
the anthropomorphization
12:50
of artificial intelligence. And when I think about
12:52
that, I try to think about who does
12:54
this serve? Who is this good for? Is
12:57
this good for consumers? Is this good for business? Is
12:59
this good for OpenAI? The answer
13:01
could be yes to all of those things,
13:03
but I really, my spidey sense of skepticism
13:05
is buzzing strong. Well,
13:08
it's also going to make your situation
13:10
and disinformation in general a lot easier.
13:12
I mean, the more realistic and more
13:14
human-like the AI becomes, then the more
13:16
convincing it will be for some people.
13:19
Absolutely, Ian. I mean,
13:21
the anthropomorphization, I can't even say
13:23
that word sometimes. It's a
13:25
rough thing to say. It is a tough question. It
13:29
is inevitable. Can you
13:31
say anthropomorphization? Anthropomorphization. Sure. Anthropomorphization
13:38
refers to attributing human characteristics to
13:40
non-human entities like animals, objects, or
13:42
concepts. It's a common technique in
13:44
storytelling and art to make things
13:47
more relatable or to convey complex
13:49
ideas in a simpler way. Thanks,
13:53
baby. There you go. Don't forget. I
13:55
love you. I wanted to hear the relationship. I
13:57
wanted to hear the response you had to that
13:59
one. I mean look, actually that
14:01
does lead to another thing which is
14:05
these like, it's already happening. So
14:07
I met, I talked to someone
14:09
recently where they were spending, and
14:12
not kidding, five to
14:14
ten grand a month on
14:16
like AI dating chat like
14:18
applications because they preferred
14:21
the like ability to control the
14:23
conversation to an actual human which
14:25
is also scary. This
14:27
stuff is happening really, really quickly and like,
14:30
you know, I never, I try never to be like
14:32
this technology is good or bad. I think of things
14:34
as like, you know, this technology is out
14:36
of the box. How do we adapt
14:38
to a world like that? We need
14:40
technologies that can help protect against this
14:42
information. We need technologies that help us
14:45
understand what's truth and what's not truth.
14:47
But this stuff is moving like really
14:49
fast and it's going to be
14:51
soon to a lot more consumers because if the rumors
14:54
are true, you know, OpenAI and
14:56
Apple are going to announce a partnership
14:58
where you have probably the Her-style assistant
15:01
inside instead of Siri. And
15:04
that will be the first
15:06
exposure that millions of people
15:08
have to these AIs because there's still millions
15:10
of people who have never touched or even
15:13
heard of a chat GPT. This
15:15
is playing off of Mark Gurman's
15:17
newsletter from today's Power On newsletter where
15:20
he said the company will unveil new
15:22
generative AI features at its
15:24
worldwide developers conference very likely because of
15:26
their deal with OpenAI which we had
15:29
heard was close to closing. But also
15:31
Apple's talking to Google about Gemini. Yeah,
15:34
but let's face it, Gemini is not much cop
15:36
compared to OpenAI's products at the moment. I don't
15:39
know how you've all found it but I've been
15:41
trying out Gemini and I'm not impressed. I'm
15:44
having a hard time deciphering. I
15:46
used Gemini to help me better
15:48
understand the naming conventions. So
15:51
that was a joke, a drive joke. That was
15:53
the one thing I took out of Google
15:56
I.O. is I have no idea what these products
15:58
are called or why. Yeah, that's Google I.O.
16:00
in a nutshell, really, for the last 20 years.
16:05
But really my question is, okay so clearly
16:07
that what's happening here is
16:11
the productization of AI
16:14
and making it more appealing to the
16:16
general consumer and if Apple does in
16:18
fact put her on the
16:21
iPhone, people will see that
16:23
as a big deal breakthrough. But
16:26
my question is, is the underlying technology
16:28
any better? Is it any smarter? It's
16:31
faster and it's got a kind of lilting style
16:34
to the voice all of which leads us to
16:36
think it's a human which is I think a
16:38
negative in the
16:40
long run. It's dangerous anyway. But
16:43
is it smarter? Yes,
16:47
well caveat yes.
16:50
Like if you look at the
16:52
actual technical documentation for GPT-4o, you
16:54
know, there are big performance upgrades specifically
16:56
for non-English languages.
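One measurable piece of that non-English claim is the tokenizer: GPT-4o shipped with a new o200k_base encoding that represents many non-English languages in far fewer tokens than GPT-4's cl100k_base, which translates into faster and cheaper responses. You can see the difference with the tiktoken package (the sample sentence is arbitrary):

```python
# Compare token counts between GPT-4's tokenizer (cl100k_base) and
# GPT-4o's tokenizer (o200k_base) on a non-English sentence.
# Requires a recent tiktoken release that ships o200k_base.
import tiktoken

text = "Γεια σου, πώς είσαι σήμερα;"  # Greek: "Hello, how are you today?"

for name in ("cl100k_base", "o200k_base"):
    enc = tiktoken.get_encoding(name)
    print(f"{name}: {len(enc.encode(text))} tokens")
```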
16:59
So these are, like, baby steps towards
17:01
a bigger change. Like there will be a
17:03
GPT-5. I wouldn't be shocked if OpenAI called
17:05
it something else at some point. The
17:08
question I think you're really asking is, is
17:11
this leading to the fabled AGI? Is
17:13
it getting closer? Is it getting better? Is it
17:16
getting, yeah. My
17:18
answer is yes, but no one really
17:20
knows how much time. It's a mystery.
17:22
If the panic tweets of people who
17:24
are leaving from the super who've left
17:27
the super alignment team because that team doesn't really
17:29
exist anymore, the answer might be sooner
17:31
than we think. My general
17:33
sense is inevitably you're throwing
17:36
enough people, engineers and resources.
17:38
You are going to go and hit that point.
17:40
The big question of course that goes
17:43
back and forth is can
17:45
the existing transformer models
17:47
which underpin GPT, which
17:50
underpin Gemini,
17:52
can they actually on their own
17:55
achieve AGI or does it require
17:57
an additional new leap in
17:59
understanding? of AI, something else
18:01
like that. There's stuff coming
18:04
out, Mamba, things like that. And are
18:06
the LLMs that are built on top
18:08
of those transformers really the path to
18:10
AGI? My guess is
18:12
that we will hear companies say
18:14
they've achieved it, whether this is
18:16
true AGI or not.
18:18
I think that we will hear about this sooner
18:21
than later. I'm kind
18:23
of a skeptic about it myself. I'm
18:25
treating it like fusion energy at the
18:27
moment. It's always another 10 years down
18:29
the line. Yeah, precisely. How do we
18:31
know if it is true? Exactly. That's
18:34
why I think we're going to hear them say they've
18:36
been talking and typing this up. So I think we're
18:38
going to hear about this as like, look, we've achieved
18:40
it. Here's the product. Some point. But
18:42
whether that really, you know, if we talk
18:45
to neuroscientists, whether they will agree, that
18:47
might be... To
18:49
be clear, the AGI is
18:51
different than say, like, you know, true
18:54
intelligence. AGI is just, it performs better
18:56
than a human at lots of tasks.
18:58
Not necessarily that it is intelligent
19:01
on its own. Sentient, I guess I
19:03
would say. I feel like
19:06
AGI, as it's defined, is
19:08
relatively close comparatively. Sentience
19:11
might be who the F knows. That one
19:13
is a whole different game. Yeah.
19:16
But Bostrom and others would say
19:18
ASI is after that superintelligence. Okay.
19:23
At which point would we become pets to the machines and then... I
19:27
don't know. It's a
19:29
progression here. There's no like, there's no
19:31
test. It's not like taking
19:33
a pregnancy test and being like, you
19:36
are sentient or not sentient? It's subjective.
19:38
Like, it's really very subjective. There
19:41
is no objective definition. It's not the Turing
19:43
test. It's not the Chinese room
19:45
test. There's really no way. People
19:48
don't even agree on what it means. But
19:52
also I would suggest that
19:54
that's more of an academic kind
19:56
of media conversation. The
19:59
real question is... is AI
20:01
useful to humans
20:03
and is it getting more useful to
20:06
humans? So initially, my initial
20:08
reaction to LLMs was, this is a
20:10
parlor trick, it's,
20:12
it's, you know, autocorrect
20:14
on speed. It's not that useful. But
20:17
as time has gone by, I've seen, in fact, it
20:19
is quite useful in a variety of
20:21
different ways. And I think
20:24
partly the focus of these companies on
20:26
music and art and
20:28
video is more performative than
20:31
it really is where the real help is going
20:33
to be. In fact, I think it's
20:35
a dangerous game for them to play because it just
20:37
makes artists upset when really
20:39
that's not, nobody wants, I don't know,
20:41
I don't want anybody wanting that. Do
20:44
we really want music, film and
20:47
pictures created by AI? I think there's a lot
20:49
of more useful things AI could be doing
20:52
that would justify its existence,
20:55
you know, protein folding, inventing
20:57
new cures for cancer, making
21:00
the subway run on time. There's
21:02
lots of things that AI could do that would
21:04
be useful. And I would love to see that
21:06
more than AGI. I don't
21:08
want to just log roll for our company,
21:11
but utility and usefulness is precisely what we
21:13
aim for. We have a product that's called
21:15
Blackbird.ai. That should be the criteria, right? Look,
21:20
Leo, you in particular will like
21:22
this. It's compass.blackbird.ai.
21:26
I won't call this a fact checker because that's a
21:28
dangerous thing to say. We call it a context checker.
21:31
It is slower than other
21:33
LLMs, but look at the
21:35
responses. The responses add context
21:37
to almost anything, photos,
21:40
pictures, video, claims
21:42
that you type in. Compass
21:44
will answer those. Context is hugely valuable.
21:46
Wouldn't that be great to assign that
21:49
to Twitter, for instance, and just fact
21:51
check Twitter? You
21:54
can use it on almost any social network.
21:58
Just drop in a link and it will... We
22:00
need to have it
22:02
integrated directly, and maybe one day.
22:04
It's like Snopes on steroids, right?
22:07
Yeah, it kind of automates that process.
22:09
You know, I almost didn't
22:11
want to bring it up here, because
22:14
it is my employer and I
22:16
don't want you to think I'm just hyping
22:18
something, but your question about utility and the
22:20
usefulness of it, that's what we tried to
22:22
answer when we built this. And that is
22:25
what I think is
22:27
really where you're going with,
22:29
say, protein folding or others:
22:31
very narrow, specific
22:34
uses that are incredibly valuable
22:36
and important. So,
22:38
when I was at TED this year,
22:41
they mentioned AI more
22:43
times than I could count, by the way;
22:45
they beat Google. That's good,
22:47
congratulations. Well, TED's
22:49
got five days of talks, so it's
22:51
pretty easy to beat. It
22:53
was like, I had a conversation with a
22:55
couple of, like, you know, professors
22:58
and PhDs, and I came away
23:00
with the conclusion that the way
23:02
we talk about cancer is not
23:04
going to be anywhere near the same in
23:07
ten years. You are not going to think of it
23:09
as an incurable thing; you'll think of
23:11
it as, like, AI can create a super-
23:13
targeted set of medicines and products that, like,
23:16
eliminate your specific cancer.
23:18
That's the
23:20
non-subjective,
23:23
the objective test.
23:25
We should forget the AGI
23:27
conversation; it's almost philosophy, worthy
23:29
of Nick Bostrom, but not of the
23:32
rest of us. But saying here's
23:34
a practical use that's actually saving lives,
23:36
curing people, that's
23:39
something you can measure, and would
23:41
be useful, and is non-threatening. It
23:44
doesn't sound like Scarlett Johansson, but I guess
23:46
it could. We did
23:48
have a piece on this just
23:50
recently where DeepMind was
23:53
predicting various different types of compounds which could
23:55
be made, but when someone else analyzed them,
23:57
most of them turned out not to be novel or useful.
24:00
Yeah, you know, it's like... but
24:02
wait a minute, I
24:04
know your piece, but even if 10% or 1%
24:06
were useful and novel...
24:10
Oh sure, but you've then got to find that
24:12
one among the thousands that were useless. Yeah, you've got to find
24:14
that one, even so. That's the thing, but it's
24:16
a start. Yeah, certainly that's where AI can be
24:18
useful I mean there's a reason
24:20
for the AGI discussion because if you have a
24:23
true AGI then you
24:25
end up with
24:28
a machine, essentially, that can
24:30
go in its own direction.
24:33
Do you want that? I don't want that. I
24:37
think if you had an AGI and you gave
24:39
it and it was a true one and it
24:41
was just like your mission is to cure Cancer
24:43
yeah, you could see it work faster than human.
24:45
There are things there's there's all like whether
24:47
we wanted or not It's it I do
24:49
think it is inevitable. I just don't know
24:52
if it's two years or ten years I
24:54
don't think it's that much longer than that,
24:56
but this is a whole set of things
24:58
I think right now the clear applications are
25:00
like AI for medicine AI for these specific
25:02
use cases AI to You
25:05
know even robotics is like an example and
25:07
they could be is like simple as like,
25:09
you know cleaning drains and things like that
25:11
or, I just, like, posted this,
25:13
I didn't even put this in
25:15
the, like, rundown, you know. Yeah,
25:19
here's where
25:22
to go: yeah, I learned about this company
25:24
that's doing this, like, this week:
25:27
the Cruise founder just raised
25:29
150 million dollars for a new robot startup
25:31
called the bot company I don't know what
25:33
it'll be But I'm pretty sure what he's trying
25:35
to go for if I had to like, you
25:37
know There's a lot of
25:39
people are gonna go for the like Rosie
25:41
the robot in your home kind of
25:43
deal Like that's another one
25:46
as it doesn't drag pedestrians under its
25:49
chassis Slight
25:51
problem. That's, by the way, why he's the
25:53
former Cruise CEO. Cruise
25:56
settled, by the way, with the woman. Oh,
25:58
really? Who got dragged. I think it was a significant
26:00
amount of money. They haven't announced operations in
26:02
that side of the sector yet. I don't
26:04
think they're done. I think
26:06
Cruise is done. I don't think they're... He'll
26:09
be back. He'll be a while now. Too
26:12
much money put into it, surely. Too much money.
26:16
They're not going to give up on that. They
26:18
gave the market to Waymo, essentially. Waymo? This
26:22
is where managing price expectations really
26:25
does matter. In addition to
26:27
not. Don't
26:29
drag pedestrians and manage
26:32
expectations. Well,
26:34
I would be good at that. I think Tesla's finding out
26:36
to its cost. If you tell people it's an
26:38
autopilot, they're going to treat it like one. Let's talk more. Have
26:42
you all used 4 Omni? 4o? I
26:46
want to talk about your experience with it. I'll
26:48
talk about my experience with it. And
26:50
then I would like to talk about what's happening with
26:52
the so-called super alignment team at OpenAI.
26:56
You could say they're a trusted safety
26:58
team and it seems to have disappeared or imploded.
27:02
Lots more to talk about. AI is, of course, one
27:04
of the many topics we'll be covering today,
27:07
but it is one of the most interesting. What's
27:09
great about this is we
27:11
talk about it almost every show now and
27:14
it changes almost. It's not like we're repeating
27:16
the same material. 4o
27:18
has really moved the conversation to
27:20
a new place. That
27:23
is different from the previous. We've
27:25
just lived through Web3, Crypto, Metaverse,
27:27
VR. All of which never went
27:30
anywhere. Hype cycles. They were very
27:32
short and very meh hype cycles
27:34
that didn't do exactly what you
27:36
just said, Leo, which is progress,
27:38
improve, and add usefulness and utility.
27:40
So that's just even. Yeah, it
27:42
does that. Progress is being
27:44
made. Yeah. Let's
27:47
take a little time out. We've got Ben
27:49
Parr. We've got AI experts on the
27:51
show today. Ben Parr, author of The
27:53
AI Analyst, but he's also figured out
27:55
how to get out of the picture. I forgot it does that.
27:57
If I do this, for those of you watching, that's the AI. Yes,
28:00
there you are. He's
28:02
also the co-founder of Octane AI and writes about
28:04
AI for the information and he's figured out how
28:06
to make Apple do things. Your
28:08
little buddy, Siri, is completely
28:11
stupid. Iain Thomson is
28:13
also here. He's one of the, are
28:15
you a boffin? No. No,
28:17
no, boffin's an affectionate term for scientists.
28:19
I'm just the person that writes about boffins. It's
28:22
the guy who writes about boffins
28:25
at the register.com. Great
28:27
to have you. It was fun. Of course,
28:30
good friend Dan Patterson, director
28:32
of content at Blackbird AI, who has been, you
28:34
know, this is actually that I think about this
28:36
is kind of the direction your
28:38
whole career has been going in the last
28:40
say 10 years anyway, is
28:42
fighting the good fight to
28:45
help people, you
28:48
know, survive
28:50
in the face of big tech. I
28:53
think I've written about and produced
28:56
more about the impact of technology
28:59
than the news of technology maybe
29:01
for the last decade or so. Yeah, I like that. I
29:03
think that's a great, we need that. So
29:06
thank you. I appreciate it. Wow, you too.
29:09
Yeah, well, I found we've kind of trended more
29:12
in that direction too, haven't we? You know, I
29:14
don't think we were ever the network where here's
29:16
the new shiny gadget. What do you think? You
29:19
gonna buy it? But more on
29:21
how this stuff changes your life or affects your
29:23
life and how you can use it better. I
29:26
think it's always been our goal. So thank you. Complete
29:28
aside before we go to the commercial break, all
29:31
the new like AI pins and things, God,
29:34
they're not good yet. Okay, I ordered two
29:36
things that maybe tell me if I'm stupid. First of
29:38
all, I did buy the
29:40
Meta glasses. And while
29:43
they're not. Okay,
29:45
you're gonna say something bad, but I'm gonna say
29:47
something good. No, they're good. Yeah, I think that
29:49
sound quality is great. They're great for
29:51
listening to music. The
29:53
cameras on there take decent pictures, but you can
29:55
also point them at things. And
29:57
admittedly... but if you attach that,
30:00
maybe, you know, something Her-like? I
30:02
mean, they're really a first step along the way
30:04
to something that I think is a form factor we're
30:06
going to want. So I got
30:08
those. I also got the Brilliant Labs,
30:10
haven't received them yet, the
30:12
Brilliant Labs glasses, which are kind
30:14
of the same thing,
30:17
only they have Perplexity
30:19
AI built in and a
30:21
heads-up display. So
30:24
I like that; unlike the Meta glasses,
30:26
I can see, you know,
30:28
what I'm doing on the screen itself in
30:30
front of me; it's an OLED layer in
30:32
the glass. And then I just bought
30:34
the Limitless pin. That's
30:37
from the Rewind app people. Now,
30:37
I never fell for, I admit,
30:39
I never fell for the Humane pin.
30:42
I kind of knew that wasn't it.
30:44
And there are a few other things
30:47
I didn't buy, but I thought this was
30:49
very interesting. This is a
30:51
little clip that is always listening, and
30:53
then analyzes what it hears, and
30:56
sends you the analysis in an app.
31:02
That one of the two is not here yet.
31:02
Yeah, I mean, that one at least makes sense
31:04
in the price. The Humane thing...
31:06
all of these together are less than one Vision Pro. I
31:08
got them all for less than one Vision
31:10
Pro. The Humane
31:12
pin costs seven hundred
31:14
dollars, and they needed to put their large
31:16
language model on the pin and not in
31:18
the cloud. And there was the subscription. It
31:20
might have been okay if it was just a
31:23
T-Mobile subscription, right? And
31:25
then there's, like, the Rabbit R
31:27
1, which I didn't buy either, and...
31:29
I've got one right here, and it's,
31:31
it's cute though. Look at it. I will say
31:33
it really is. I
31:38
So it is. It's, it's...
31:38
I bought this for the reason that many
31:41
people did: Teenage Engineering, you know, right?
31:43
Its design. And I was all excited
31:45
about, like, this idea of de-convergence, right?
31:47
For so long, for a decade and a
31:50
half, everything converged into the phone: your
31:52
music player and maps, everything else converged into the
31:54
phone. And I love that we live in an
31:57
era of gadgets coming back. I
32:01
have an analog pocket, it does very
32:03
little, but it's a fun gadget. I
32:05
don't want to play games on my
32:07
phone, I want to play games on
32:10
my pocket. Anyway, so that's the value
32:12
of the rabbit. Honestly, the limitless is
32:14
really the beginning of her. I want
32:16
to put her in my ear, basically.
32:18
Right? I want Scarlett Johansson in
32:20
my ear, loving
32:23
me and only me. Right. No,
32:26
I like the idea that this is always available.
32:30
Actually, I'll try to remember the name of
32:32
the thing. There was at TED, when I
32:34
was at TED, they had one
32:36
of the presentations, it's like ear pieces you put on each
32:39
side of the ears. I will go and find this, but
32:41
they like... Don't, because I'll end up buying it and be...
32:44
Well, you could say things like, could you remove
32:46
the sound of the baby from the room? Oh,
32:49
I love that. You guys played Jeremy's voice. Oh.
32:52
And it looked cool.
32:54
It's still a little bit clunky and big, but they're
32:56
coming out with a version of it, theoretically, in a
32:58
year. This is the future of hearing aids. I
33:01
think hearing aids are going to merge with smart tech
33:03
and become something more than just a way to hear what
33:06
your wife is yelling at you. Anyway,
33:10
there are interesting products. Nothing is there
33:12
yet, but I think we're getting... Yeah,
33:15
I mean, of course, there's glasses that I could wear. That's
33:19
why these Brilliant ones are in my prescription. Yeah, but
33:21
it would say conferences or something, and you go and
33:23
meet somebody in a museum. You just get whispering in
33:25
your ear. This is so-and-so from so-and-so. Yeah, exactly. That's
33:28
what I'm waiting for. The glasses are the clear
33:30
first form factor. We already used to them. We'll
33:32
see some of the others. The device is called
33:34
the IYO1. I-Y-O-1 is the one that was... Oh,
33:37
yeah. I've heard about that. Yeah. ...at TED. Okay.
33:40
So we're getting... I think we're getting some
33:43
real innovation happening on the
33:46
software side. It's IYO.audio. I-Y-O.audio.
33:49
Oh, yeah. They look like gauges. I've seen the
33:51
video of these. They look like you've
33:53
got big gauges in your ear. Yeah.
33:56
Like it's a statement. Which
33:58
In a way it is. I was impressed.
34:02
With demos on
34:04
stage, you always discount them, as you know, there's
34:06
like stuff going on in the background, but it
34:09
works as described. I want it to yell
34:11
at me in a nicer
34:13
way... that sounds awesome,
34:15
and, like,
34:18
I'm sold.
34:21
The original ones are one thousand
34:23
six hundred sixty dollars. They don't say how
34:26
much the IYO ones are, but the deposit
34:28
is sixty dollars. But I much prefer these.
34:31
So do we know how much they're going to cost?
34:33
It is expected to be somewhere about six
34:35
hundred bucks. That's sensible. And if
34:37
you want LTE... six
34:40
hundred bucks. Interesting.
34:44
All right, let's take a little time out,
34:46
as I've got to get some more
34:49
money so I can buy this. Ah,
34:52
that's where the advertisers come in. Very
34:54
handy, actually. This is one I bought;
34:56
this is, this is one I
34:59
thought was so cool I actually went
35:01
out and got it. It's called Mylio.
35:04
I've talked about this before. You remember Picasa?
35:06
It was purchased by Google, and I was so
35:08
disappointed, because Google did not put any of
35:11
the features from Picasa into Google Photos, and we
35:13
lost the one thing that Picasa was
35:15
great at, which is organizing your photos. Well,
35:18
in some ways this is the
35:20
successor to that, but it does
35:22
it all with AI; it
35:24
does it all intelligently. This is
35:26
a way to store your data:
35:29
photos, videos, documents, without
35:31
putting them in somebody else's cloud, without
35:34
duplicates. It eliminates duplicates,
35:37
it's fully
35:39
encrypted, and it's free
35:41
for your first device, so it's a great way
35:43
to try it! You don't
35:46
have to get the Mylio Photos Plus
35:48
subscription right away. Just download it for your Mac,
35:50
your Windows PC, or iOS, or
35:52
your Android. I would download it for the device where
35:54
you have the most storage, and I now
35:56
have two hundred thousand plus documents, photos, and
35:59
videos in my Mylio. Yeah, it
36:01
does automatic categorization.
36:03
It's just kind of
36:05
my Google Photos, only local and even
36:08
smarter. It does automatic face recognition, but
36:10
you have to tell it: give it
36:12
five photos, ten photos, that's, that's my
36:14
wife, okay, and then it goes through
36:16
the rest of them in the background,
36:18
tags them all. It is
36:20
the way to take two hundred thousand
36:22
photos, videos, and documents, organize them, consolidate
36:24
them, curate them, control
36:26
them in a single library, all
36:29
without exposing them to the public
36:31
cloud. I
36:33
am blown away by this, and it's now
36:36
more than a backup solution for me;
36:38
it's my organization solution and backup solution.
36:40
I back it up to my Synology
36:42
NAS; it never leaves my network, but
36:44
you can also use Mylio's
36:46
own encrypted cloud solution, or it can
36:48
also use your existing cloud storage;
36:50
it encrypts before you upload, so it's
36:52
completely private. Start
36:55
for free, or sign up for their
36:57
pay plan and you get automatic backup syncing
36:59
between devices. So now that I'm a
37:01
paying subscriber, I have it on everything:
37:03
all my phones, all my desktops,
37:05
my laptop, my tablets, and
37:07
all my photos are everywhere
37:09
I go. Two hundred thousand plus. You'll
37:13
also get duplicate file cleanup. It will pull
37:15
all my photos from Instagram and Facebook and
37:17
Google Photos; that was the first
37:19
thing I did, download everything from Google
37:21
Photos. It de-duped it, because there were a
37:24
lot of duplicates, and saved the best copy.
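Mylio doesn't publish how its de-duplication works, but the standard technique is to hash file contents and group identical digests; a minimal sketch, with the library folder as a placeholder path:

```python
# Group byte-identical photos by SHA-256 content hash.
# "library" is a placeholder folder; near-duplicates (resized or
# re-encoded copies) would need perceptual hashing instead.
import hashlib
from collections import defaultdict
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

groups = defaultdict(list)
for p in Path("library").rglob("*.jpg"):
    groups[sha256_of(p)].append(p)

for digest, paths in groups.items():
    if len(paths) > 1:
        print("duplicates:", [str(p) for p in paths])
```

Picking the "best copy" to keep (highest resolution, the original rather than a recompressed export) is then a ranking problem on top of these groups.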
37:27
It's incredible. It's nine dollars ninety-nine cents a
37:29
month or ninety-nine dollars a year.
37:31
You get Mylio Photos Plus, which
37:33
lets you organize, manage, protect an unlimited
37:35
number of photos and videos on
37:37
all your devices. Take
37:40
control of your digital assets. This thing really works;
37:42
it's what you're looking for, I promise. Mylio
37:44
Photos, M-Y-L-I-O.
37:46
Get started for free with the
37:48
basic version for one device, or take full
37:50
advantage of the platform with Mylio Photos
37:53
Plus. We've got a deal for
37:55
you: twenty-five percent off your annual or
37:57
monthly subscription to Mylio Photos Plus. To
38:00
sign up and get that 25% off,
38:02
go to mylio.com/twit25.
38:05
That's Mylio, mylio.com/twit25,
38:11
for 25% off.
38:15
You know, if you're just wondering what this could
38:17
do, download the free version, put it on a device, and
38:19
see. You will be blown away,
38:21
and it is incredible. Love it.
38:23
Mylio, mylio.com/
38:27
twit25. We thank them so much for their support.
38:31
all right again
38:37
I still have to ask the question... I
38:41
guess we don't have to ask
38:43
this anymore; it's clear that this technology
38:45
is useful today and is
38:47
getting more useful. Are you
38:49
a skeptic, Ian? I
38:52
was, at first, when
38:55
ChatGPT first came out. You know, as a journalist,
38:57
I looked at that and I thought, I
39:00
know what a lot of publishers are thinking right now
39:02
it's like we can get rid of those pesky journalists
39:04
and start doing everything by AI and
39:07
we tried it out we did it I did
39:09
think at one point it would be quite useful
39:12
for things like outage stories or financial results but
39:15
it's just not accurate enough. You know, you
39:17
spend as much time editing stuff to
39:19
make sure it's accurate as you would if, you
39:21
know, you'd actually written the piece yourself. I think
39:23
it's very it's gonna be very useful it's
39:25
as Ben has said it's increasing at a
39:27
rate of knots in a way that other
39:29
technologies haven't my skepticism is waning
39:32
I'm actually starting to get more all in on
39:34
this what about Ben what about
39:36
accuracy though what about mistakes
39:38
that ais make there are lots of them
39:45
You think about it, and you look at the research,
39:47
and you see the rate
39:50
of hallucinations, which is, well, mistakes, basically. Yeah, I,
39:52
you know why I avoided that word? Because
39:54
it's anthropomorphizing. I know. Yeah, it's,
39:57
it's, it's good branding, to be
39:59
clear, by the AI research community. Yeah,
40:01
exactly. It's not wrong, it's a mistake.
40:04
Stop holding it. By
40:06
the way, humans do that. Yeah. But
40:08
if I asked a human, Ian, did the Eiffel
40:10
Tower ever burn, I wouldn't
40:13
necessarily trust your answer. Yeah, I know.
40:15
But I mean, humans make mistakes, that's
40:17
what we do. But I'm calling it
40:19
an hallucination when it's just basically a
40:21
mistake. It's a mistake. It's PR speak.
40:23
Yeah. I mean, look, put
40:26
aside all the semantics; the whole point being
40:29
it's going down in terms of the number of mistakes it
40:31
makes. It is. It does make mistakes.
40:34
I was trying to do like Google, if
40:36
you do a Google search, you get Gemini
40:38
results now. I don't know if you have those. And
40:42
they're usually accurate, but they're not always accurate. That's
40:44
interesting. Well, I actually reinstalled Google Assistant on my
40:46
phone because I tried Gemini for like a couple
40:48
of weeks and it was so useless. It was
40:50
just like, now I have the old Assistant back,
40:52
thanks. I
40:54
want to defend Gemini because I've been
40:56
using it too. And while
40:59
it's different in its capabilities, I think
41:01
it's also surprisingly useful. One
41:04
of the benchmarks that I use, and maybe it's not a
41:06
good benchmark, I create
41:08
a custom GPT for the computer language
41:10
I use. And it's really useful for
41:12
me because I don't remember all the, you
41:14
know, it's easy to, I have an eight
41:17
foot shelf of reference materials. It
41:19
knows all that stuff so I can query it and say, well, how
41:21
do I, what's the syntax for this? I'm
41:23
always right because I'm saying stick to the eight foot shelf. I
41:26
tried that with Gemini without any training and
41:29
it did the same. It was just as good. And I
41:31
thought, oh, well, I've been wasting my time with a custom
41:33
GPT. Gemini is pretty impressive.
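The "stick to the eight-foot shelf" pattern Leo describes, whether built as a custom GPT or as a plain system prompt, is straightforward to sketch with the OpenAI SDK. The model name, the wording, and the example question below are illustrative assumptions, not what he actually used:

```python
# A system prompt that pins the assistant to a known reference corpus,
# so syntax questions get answered only from documented material.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "system",
            "content": (
                "You are a reference assistant for one programming language. "
                "Answer only from its standard, documented syntax and library; "
                "if you are not sure, say you don't know rather than guessing."
            ),
        },
        {"role": "user", "content": "What's the syntax for a keyword argument?"},
    ],
)
print(resp.choices[0].message.content)
```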
41:36
The thing about Gemini is it could do it in
41:38
all the languages. So you just say, okay, you can
41:40
now do it in Python. Oh, that's cool. Now do
41:42
it in C. Oh, that's neat. Now do it in electronic
41:44
code. Yeah.
41:47
Go ahead. I'm sorry. Go ahead.
41:50
Despite my skepticism, I use
41:52
generative AI tools every single
41:54
day. I use Claude,
41:57
ChatGPT, or GPT-4o,
41:59
Gemini,
42:02
and almost every publicly available model.
42:04
If I can run it locally,
42:06
I will run it locally. And
42:08
I use this as an assistant.
42:11
It works alongside me. And I
42:13
use it in my job. So
42:16
if I try to generate an article or
42:18
a story in a traditional news
42:21
article, Ian, I imagine that you're pretty accustomed
42:23
to writing a head, deck, lead, and nutgraph.
42:25
And then you have the details of your
42:27
story and then maybe you write a kicker.
42:29
In that story, you
42:32
have quotes from experts. You have some
42:34
data and some details. Now, much
42:36
of those details, it will get wrong.
42:38
It cannot do that. However, I
42:41
will use particularly Claude is very good
42:43
at this. I will tell Claude, I
42:45
will upload. So
42:47
my threat analysts will write here
42:49
are five and you can find
42:51
this at our blog.
42:54
They will say here are five narratives
42:56
that we have found around Navalny, around
42:59
the death of Alexei Navalny. When
43:01
we say narratives, here are five
43:03
mis and disinformation themes around Navalny.
43:06
Yes, I do this every day.
43:10
We upload
43:13
our charts. You can see our data. We
43:15
can show our work. And what I do
43:17
is upload their work to
43:20
Claude and to ChatGPT. And
43:22
then I say, please suggest 10 heads and
43:25
10 decks that have to be 60 characters
43:28
long. The deck must be 200 characters long.
43:30
It must be a full sentence and never
43:33
ever once have these models produced
43:35
one thing that is useful. But
43:37
they do produce many things that
43:39
are useful. And then as a
43:41
human writer who's written thousands of
43:43
stories, I mean, maybe hundreds of
43:45
stories. Exactly. And
43:49
I combine those and then I write a head, a
43:51
deck, a lead and a nut graph. And I got
43:53
it, the rest of the story.
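Dan's head-and-deck workflow maps onto a single prompt with hard constraints. A sketch using the Anthropic SDK; the model name and the report file are placeholder assumptions:

```python
# Ask Claude for many constrained headline/deck candidates, then pick
# fragments by hand, as Dan describes. navalny_narratives.txt is a
# stand-in for the analysts' write-up.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("navalny_narratives.txt", encoding="utf-8") as f:
    report = f.read()

msg = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": (
            "Here is an analyst report:\n\n" + report + "\n\n"
            "Suggest 10 headlines of at most 60 characters and 10 decks of "
            "at most 200 characters. Each deck must be one full sentence."
        ),
    }],
)
print(msg.content[0].text)
```

None of the twenty candidates ships as-is, which is his point; the value is in harvesting a word here and a phrase there.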
43:56
I'm sorry. Hey, what is a head, deck,
43:58
lead and nut graph? I've
44:00
got to get some help. That's
44:04
journalism, sorry. Hey! Journalism
44:07
and writing, those terms are all
44:09
about structuring an article effectively. Here's a
44:11
quick look. Oops, I
44:13
stopped her. Go ahead, please. I didn't mean to interrupt you.
44:15
See you. I
44:20
may never know what a head, deck, lead and nut graph are... I
44:26
think it's a lot like the Pyramid of
44:28
News. Yes, right, exactly. And
44:31
this is why the newest version will be
44:33
so useful because... I'm having issues right now.
44:35
Our system is experiencing heavy issues. This is
44:37
4o. Please try again later. This
44:39
is 4o. By the way, the cute, clever,
44:41
cutesy voice that they
44:44
showed on Monday is not yet available, so this is
44:46
as close as I can get. I
44:48
tried to make it cuter. I said, be a
44:50
valley girl, do some more up-talking and
44:52
vocal fry, but it never quite got that good. I
44:54
still think this is cute. The
44:57
good news is, even though we interrupted, I did
44:59
get the text of head deck lead in that
45:01
graph here. So if I were a normal
45:04
human, I could look this up
45:06
and get some fairly useful information.
45:09
Yeah. The point is
45:11
that it will spit out 10 things. And what
45:13
I'm, again, kind of making a dry joke. It
45:15
won't produce one thing that is useful. It produces
45:17
many things that are kind of useful. Not
45:20
one of them I could throw into a story and
45:22
say, this is good. But I can pick out
45:24
a word here, a phrase there. And
45:26
that is, I think, a very
45:29
efficient way for me to work alongside
45:32
AI, but it just can't
45:34
do what I do. It can't. But
45:36
it shouldn't. If I were to submit a story, no. We don't
45:38
want it to. I don't know.
45:40
Maybe we do. It might be good
45:42
at snaking drains. I mean, there are
45:44
some things it can do. But I
45:47
think there are some human attributes that
45:49
we want humans to continue to create,
45:51
like writing, painting, singing, opera. The
45:54
art that was crushed in that iPad ad. Yeah.
45:56
Oh, God. Oh, God. That was just insane.
46:00
Samsung responded as well. It's like, we won't
46:02
crush the life out of you. Yeah, you will...
46:04
that was smart. Yeah, I have to point
46:06
out that her
46:09
misspelled head, H-E-
46:11
A-D, and deck, D-E-C-K,
46:15
lead, L-E-A-D, and nut graf. You're
46:17
talking about me? Yes. Come on, man.
46:19
You don't know how to spell
46:21
nut graf? G-R-A-F, not P-H. Yeah,
46:23
I mean, I was playing around a lot
46:25
with feature ideas, because I've just done a piece on
46:28
the 10th anniversary of Nokia being taken over and shut
46:30
down by Microsoft. So I asked,
46:32
you know, ChatGPT, give
46:34
me some notable tech anniversaries, tech
46:36
news anniversaries, you
46:38
know, in 1994, 2004, 2014, and they were all wrong. Absolutely
46:43
every single one of them it was trying to
46:46
tell me that Tetris was released in you know
46:48
a year before it actually was. Wait a minute, wait a
46:50
minute, we're gonna be doing
46:52
this a lot in the show. Hey, when was
46:54
Tetris released? When was
46:56
Tetris released? When was
46:59
Tetris released? Tetris
47:02
was released on June 6 1984. It's amazing how it's still so
47:04
popular, isn't it? Okay,
47:09
I don't want to get in a conversation with
47:11
it. You know what? Wait, there's gonna be an
47:13
episode, I don't know how long down the line,
47:15
where, is it
47:17
gonna be like, hey,
47:20
like, on the podcast, the AI is gonna be a panelist; we're
47:22
gonna have the back and forth.
47:24
It's gonna jump in: no,
47:26
Dan, you're wrong. And
47:29
then it'll be, like, straight-up wrong, and then there's like a
47:31
fight. Was
47:34
she right? Was she accurate? No,
47:38
it was not... well, technically it's
47:40
tricky, what you're talking about, because, yeah,
47:43
I mean, it's a tricky one, because it was
47:45
released in the Soviet Union first, right,
47:47
so I mean, technically it
47:49
was done in 1985, but it wasn't actually released
47:51
in the West until later. And also it got
47:53
that nuance. Yeah. Yeah, so I think I
48:00
think we seem like we're all in agreement that
48:02
the most useful way to use an AI is
48:04
in conjunction with a human being. Maybe
48:07
that's going to be the definition of AGI when it
48:10
doesn't need us anymore. Oh,
48:13
yeah. That's a good indicator. Right. I
48:16
mean, I'm being overly skeptical and overly
48:18
critical, but I do use AI tools
48:20
every day. I think perplexity is a
48:22
fascinating tool, and I do think it
48:24
is improved. Six months or a year
48:27
ago, perplexity would get answers wrong pretty
48:29
frequently, but now I use it routinely
48:32
without thinking about it. Do you use it
48:34
as your main search? Not
48:36
my main search, no. I
48:38
found that hard. Just the lag is a little
48:40
hard, but the results on
48:42
Google have been challenging lately too. Which
48:45
is, by the way, why 4o is
48:49
important because they reduce the lag dramatically.
48:51
As you can see in that pathetic
48:53
demonstration, it's not completely eliminated.
48:56
Well, you know what this has really
48:58
done with Google as well is kind
49:00
of change it from an information retrieval
49:02
system. When we type in a query,
49:05
Google gets information. It brings it back.
49:07
It lists it in order of relevance,
49:09
or the old Google did. And now,
49:11
with these generative features, Google is now
49:13
a creation engine as
49:16
well as a search engine. I
49:18
just think that that shift is
49:20
really interesting. It's maybe something that
49:22
was missed in I.O. with all
49:24
the product announcements. I think Google
49:26
becoming a content engine is really
49:29
interesting. A lot of the features that they
49:31
announced, much like the Arc browser,
49:33
Google will now just generate a page for you.
49:36
And much like, I think, seeing
49:39
Ben, you pointed this out, there
49:42
are millions of people who have never
49:44
experienced ChatGPT but will with
49:47
this Apple
49:49
announcement at WWDC. Google
49:52
will do the same thing with millions,
49:54
billions of people will experience Generative
49:57
features in their search, which will...
50:00
change a lot of how we
50:02
obtain information. I mean, as they said at
50:05
I/O, "Google will do the Googling
50:07
for you" was a direct quote. But
50:09
that's a bad story,
50:12
for content creators, isn't it? Because they
50:14
get dialed down and out of
50:16
the equation. They're only needed as,
50:19
and they're freaking out now, their only
50:21
job is to be the fodder for the
50:23
AI, and we don't have any
50:25
direct contact with them. This
50:28
week, OpenAI announced they'll
50:30
start using Reddit content, which
50:33
I think will greatly improve ChatGPT.
50:35
I think there are a few Redditors
50:37
who aren't so happy about that. They're
50:40
not getting a cut of the deal.
50:43
My attitude is, and
50:47
maybe I have a different perspective than most
50:49
people, nobody's really earning a living off their Reddit posts,
50:51
et cetera, et cetera: great, that's gonna make
50:54
ChatGPT better. Bravo!
50:56
The answer depends on the subreddit you're going
50:58
on. Well, like, it depends on
51:00
the user. I was,
51:02
ah, you know,
51:04
as I said, I watched,
51:06
in the early AI days, to
51:08
have AI do conversation, there were,
51:10
you know, a lot of, like, public databases;
51:12
they used the Enron emails because they
51:14
were made public. Oh my god. That's
51:16
quite a thing, to
51:18
have, for a pure AI system,
51:20
the Enron emails as training for something.
51:22
Yeah. I get there are, like, issues,
51:24
like writers would have asked. But also
51:26
the flip side is, like, you've been creating content
51:29
for free, you do not
51:31
own it, on this, like, website, and you
51:33
think, what did you think was
51:35
gonna happen? I mean,
51:37
it's, it's, it's not
51:39
so much different than somebody
51:41
like The Register, whose content's getting scraped
51:43
by a Google or an OpenAI
51:45
and then served up as search results
51:47
without ever having people click back to
51:49
the source. It's a bit of
51:51
a pain, it has to be said,
51:53
but I don't honestly see an end to
51:55
it. I mean, as has been said,
51:58
this kind of data is being scraped
52:00
on a massive scale. You're awfully sanguine
52:02
about it. I don't think the New
52:04
York Times is quite so sanguine.
52:06
We don't have the legal suite, we don't have
52:08
their lawyers; that's them. They'll no doubt
52:10
now be analyzing the case. I
52:12
mean, actually,
52:14
I check occasionally to see if, as I
52:16
say, they're regurgitating our stuff
52:18
in full, and I can see elements
52:20
of it, but they're not actually directly copying
52:22
it. Yes, the New York
52:24
Times and others, they are going
52:26
up against them, but I've got
52:28
a feeling they're going to lose
52:30
on that front, to be honest, because the
52:32
actual occurrences where you see copying,
52:34
where they've been,
52:36
at least, caught,
52:38
they're not, they're really not known.
52:40
I've read the New York Times suit against
52:42
OpenAI, but I would be
52:44
more concerned with the way the search
52:46
engines are using this content, if you
52:48
ask me. What you said was a good example.
52:50
I actually did this, I put it
52:53
in a tweet, because when
52:55
Perplexity first launched, I went on to
52:57
it and asked it, right,
53:00
who is the journalist Iain Thomson? It got
53:02
the name right, but apparently,
53:04
well, apparently I'm an award-winning
53:06
journalist. One award's cool. I've written for
53:08
the New York Times? No. The Guardian?
53:10
Nope. It was just
53:13
lying. I asked Perplexity, what is
53:15
a Roth IRA, and I got
53:18
probably the answers I wanted, right? Every bit
53:20
of information I wanted. Now, it does give
53:22
you a footnote. This is all you get
53:25
is a footnote: to a
53:27
PDF at investor.vanguard, and Fidelity,
53:29
and Khan Academy. But
53:31
I would expect that
53:34
the folks at Khan Academy might be a little
53:37
miffed that the content from their
53:39
video has been extracted and presented
53:41
to me, with this minor little
53:43
footnote linking back. I'm never gonna click that link
53:45
back, because I've got everything I want right here,
53:47
so that is my problem with it.
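A minimal sketch of the pattern Leo is describing, purely illustrative (the sources and function name here are hypothetical, not Perplexity's actual pipeline): the engine composes a complete inline answer and demotes its sources to numbered footnotes, which is exactly why nobody clicks through.

    # Toy answer engine (hypothetical; not any real product's code): it
    # summarizes retrieved snippets inline and appends footnote citations.
    sources = {
        1: ("investor.vanguard.com", "A Roth IRA is a retirement account funded with after-tax dollars."),
        2: ("khanacademy.org", "Qualified Roth IRA withdrawals are tax-free in retirement."),
    }

    def answer_with_footnotes(question: str) -> str:
        # Toy 'retrieval': treat every source as relevant to the question.
        body = " ".join(f"{text} [{n}]" for n, (_, text) in sources.items())
        notes = "\n".join(f"[{n}] {domain}" for n, (domain, _) in sources.items())
        return f"Q: {question}\n{body}\n{notes}"

    print(answer_with_footnotes("What is a Roth IRA?"))
    # The complete answer sits above the footnotes, so the reader has no
    # reason to click back to the source sites, which is the complaint here.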
53:50
Yeah. My guess is that answers are
53:52
a commodity. That
53:55
much information is a commodity, and
53:58
information like what a Roth
54:00
IRA is,
54:03
could be considered commodity information. It's
54:05
just something that is in the general
54:07
knowledge base. And that is something that
54:09
AI will be able to summarize very
54:11
well, but that there could
54:14
be outliers, musicians,
54:17
artists, good writers,
54:19
who create things that AI can't
54:22
replicate, and those outliers will continue
54:24
to exist. Everyone else is going
54:26
to become training
54:28
data. So if you're not Bach or Beethoven, get
54:30
out of music? Is that how it works? There's
54:33
going to be a big shift. Like we were talking
54:36
about Google search, like now you have the AI results
54:38
in the top. How many people are not clicking links
54:40
as a result of that? That's what I'm talking about.
54:43
They estimated, Google estimated that there will
54:45
be a drop of 40%
54:47
to publisher websites. Google's own numbers
54:49
say 40% drop to
54:52
publisher websites, which is why the media was
54:54
freaking out. Not merely because those websites are
54:56
getting hurt, but Google's going to run out
54:58
of material. If you
55:00
don't incentivize people to create content,
55:03
if you pull the rug out from under them, then
55:06
five years down the road, you have no source.
55:09
Yeah. This is literally why there's
55:11
companies building AIs to like have
55:14
conversations to train AIs, which sounds
55:16
like its own recursive loop. That
55:18
doesn't sound like a good idea at all.
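A toy sketch of that recursive loop, assuming nothing about any real company's pipeline: one model generates synthetic conversations, and the next generation is fit only on that output, so the pool of material can never grow.

    # Toy model-trains-model loop (hypothetical, illustrative only). Each
    # generation's 'vocabulary' comes solely from the previous generation's
    # synthetic output, so diversity can only shrink or stay the same.
    import random

    def generate_dialogue(vocab: list[str], turns: int = 2) -> list[str]:
        # Stand-in 'teacher' model: samples utterances from what it knows.
        return [random.choice(vocab) for _ in range(turns)]

    def train_on(corpus: list[list[str]]) -> list[str]:
        # Stand-in 'student' training: it learns only what appeared in the corpus.
        return sorted({line for dialogue in corpus for line in dialogue})

    vocab = ["hello", "nice weather", "tell me about Reddit", "goodbye"]
    for generation in range(4):
        corpus = [generate_dialogue(vocab) for _ in range(3)]
        vocab = train_on(corpus)
        print(f"generation {generation}: {len(vocab)} distinct utterances")
    # Without fresh human data entering the loop, later generations converge
    # on a shrinking subset of utterances, the worry voiced above.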
55:22
I'm curious, like just
55:25
reading it more and more, like new data
55:28
doesn't necessarily, you
55:30
know, have the exponential returns as it did
55:32
before. There would probably have to be another
55:34
leap or two, but you have
55:37
to have new and current information for the
55:39
AIs to go and be useful. I
55:41
was saying one other thing, which is, like, we're thinking about,
55:43
like, the humans in the loop. I think this is part
55:45
of why, like, there will always be humans in the loop. It'll
55:48
just look different. I'm going to give you an example.
55:50
There was like a company that went viral on Twitter
55:52
called Payman. Instead of you paying the AI to do
55:54
work for you, the AI pays you to do work
55:56
for it. You can imagine an AI
55:58
eventually being able to write articles
56:00
or write code or do design
56:02
work, but it still uses the human
56:04
to do the code review, to
56:06
give design feedback, to, ah, you
56:08
know, do that one thing which
56:11
they're, like, adding to an article.
56:13
The script is flipped, but the human is
56:15
still involved.
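Here is a hedged sketch of that flipped arrangement (all names hypothetical; this is not Payman's actual API): the model drafts the work, then spends its own budget to have a human review it before anything ships.

    # Hypothetical flipped human-in-the-loop flow (illustrative only):
    # the AI produces a draft and pays a human to review it.
    from dataclasses import dataclass

    @dataclass
    class Review:
        approved: bool
        feedback: str
        fee_usd: float  # paid from the AI's budget to the human reviewer

    def ai_draft(task: str) -> str:
        return f"[model-generated draft for: {task}]"  # stand-in for a model call

    def human_review(draft: str, fee_usd: float) -> Review:
        # Stand-in for posting a paid review task to a human marketplace.
        ok = "draft" in draft  # toy acceptance check
        return Review(ok, "looks fine" if ok else "please redo", fee_usd)

    draft = ai_draft("write a product page")
    review = human_review(draft, fee_usd=15.0)
    if review.approved:
        print("shipping:", draft, f"(reviewer paid ${review.fee_usd})")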
56:17
Crazy. Actually, I told my daughter it has
56:20
to be a good job, tuning
56:22
AIs; that there may be more jobs
56:24
doing that than reading editorials. Yeah, I mean...
56:26
Odd. On the other hand, I
56:28
saw, on LinkedIn, somebody was actually
56:30
advertising for people to do that, and both of these
56:33
were so many dollars an hour, right?
56:35
Or fifteen dollars an hour. Yeah, minimum wage you
56:37
can't live on. You know, it's kind of
56:39
like, yeah, well, first off, who are they going to
56:41
get? So at that kind of wage, it could be
56:43
somebody who will only do it for
56:46
twenty hours just to make a living; on
56:48
the other hand, the expert is just not going to
56:50
be there either. Funny,
56:52
though... a friend of mine, you know,
56:54
our technical director, who knows how to surf
56:56
and is a musician, and he
56:59
hates this stuff. And I think, actually, middle managers,
57:01
CEOs, those are the jobs that can be
57:03
taken by AI, and should be taken, frankly. I
57:05
think this is a tough one. I tend
57:08
to think of getting rid of the telephone sanitizers in
57:10
Hitchhiker's: the ad men and, you know, the managers,
57:12
put them on a ship. But it's what
57:14
this really is.
57:17
This really affirms your model, Leo, where,
57:19
okay, we might lose
57:22
advertising as a source of
57:24
revenue, but if you develop
57:26
and build a community that
57:28
you interact with, and that
57:31
is emotionally invested, maybe there
57:33
is a personal relationship, maybe
57:35
there's a genuine social relationship...
57:37
I think those media enterprises
57:39
will survive. Advertising-
57:42
based media might have a lot of challenges
57:44
in the next decade, and we know that
57:46
we're going through a chokepoint. If
57:48
we can survive through the chokepoint,
57:50
the future's bright, because the more
57:52
AI content is out there, the more
57:54
people are going to value real
57:56
human content, real human relationships, real
57:59
human communities. And
58:01
so as long as we can
58:03
survive through this narrowing
58:05
of revenue and narrowing of audience, I
58:07
think you're right, the future's bright. It's
58:10
not a great future for, I don't
58:12
know, a big media outlet
58:15
because they don't have that community, they
58:17
don't have those relationships. It
58:20
really fosters a small kind of
58:22
almost cottage industry of content. The
58:24
future... you'd survive in a context
58:26
like that. Or a... Platformer.
58:29
Yeah, Platformer, Casey
58:31
Newton's newsletter. Because you
58:34
guys establish a relationship with an audience,
58:36
the audience likes your particular brand of
58:39
snark and information and doesn't want, no
58:41
AI can duplicate that, doesn't want that,
58:44
but they have to be willing to pay for it and
58:46
that's challenging. It's
58:49
still the main part of the publishing industry as
58:51
a whole, but I think we're seeing particularly,
58:53
well, in the case of Reddit, but also with
58:55
photo libraries, with commercial companies with a large amount
58:58
of data, they will sell it to
59:03
AI companies and it's another revenue stream for
59:05
them. But with Reddit, I
59:07
don't know, there are some
59:09
fetid areas of Reddit. Well, I'm sure
59:12
AI will not ingest the
59:14
fetid stuff. They're
59:16
like, yes, exclude anything that is NSFW.
59:19
Yeah, they are. Or maybe it
59:21
is because they talked about the
59:23
AI like the photographic stuff. I
59:26
think they talked
59:28
about eventually being able to do more of
59:30
that because that's what you do. Isn't that
59:32
always what happens is that's where the revenue
59:34
stream is? Well, that's honestly what
59:36
drove online payments, VHS
59:40
recorders. That's a lot
59:42
of technologies that survive
59:44
because of dirty
59:47
old men. That's how Blu-ray beat HD DVD.
59:49
That's right. All right.
59:53
Let me take one final note. I'm sorry. No,
59:55
final note. Final note is good. On media
1:00:00
and AI. You're
1:00:03
right, we might shrink down and have smaller
1:00:06
news organizations, media organizations
1:00:09
do what you do and what Casey does
1:00:11
and super serve audiences. But I have this
1:00:14
feeling in my gut that
1:00:17
was pronounced during I.O. about democracy and the
1:00:19
fourth estate
1:00:23
and our shared reality, our sense
1:00:25
of shared reality that has atomized
1:00:27
and we have a
1:00:30
real reality is maybe only at
1:00:32
the beginning of fracturing and
1:00:35
without the fourth estate. I
1:00:39
have deep concerns about the future of democracy. I
1:00:42
agree 100%. I think it's
1:00:44
already fractured to a large percent because now people will
1:00:46
just, okay, I
1:00:49
was explaining this to somebody else. When I was
1:00:51
growing up in the UK, we had three
1:00:54
television organizations. We actually got the day off school to
1:00:56
see the opening of the fourth one. We
1:00:59
had like a dozen national newspapers,
1:01:03
maybe six or seven radio channels, and
1:01:05
that was it. That led to a consensus
1:01:07
in a lot of ways. Now with the multiplicity
1:01:09
of media outlets out there, in
1:01:12
some cases specifically designed for particular audiences,
1:01:15
then people can make their own reality.
1:01:17
I think you're right. It is very
1:01:19
dangerous for democracy. Humans are naturally tribal,
1:01:22
and it fosters tribalism, and it
1:01:24
fosters a fragmentation. Unfortunately,
1:01:28
an experiment
1:01:30
like the United States requires transcending
1:01:33
local tribalism to have a shared
1:01:35
national vision and identity. It's not
1:01:38
just the media, but it's all
1:01:40
fragmenting. We
1:01:42
have become smaller and smaller villages and
1:01:44
greater and greater tribal animosity. That
1:01:49
isn't a good trend. I agree with you, Dan. This
1:01:51
is, in the long run, disaster awaits. I
1:02:00
mean, I think this is the light at the end
1:02:02
of the tunnel. That's the oncoming train. Well,
1:02:05
we could always move to China. Way
1:02:07
ahead of us. Oh, that's so much better.
1:02:09
Well, you know what? That is an alternative
1:02:12
to tribalism, is collectivism. And
1:02:14
we've never been good about that in the United States because
1:02:16
we have this mythology of the
1:02:18
independent, you know, individual. Instead
1:02:22
of a guy who has a tool.
1:02:24
We're very individualistic, right? But in a collectivist
1:02:26
society like China, maybe that is the
1:02:29
source of strength. Well,
1:02:32
I think they're eating our lunch at the moment. Yeah.
1:02:35
Or at some point. Although they have some economic
1:02:37
troubles right now. True. We've
1:02:39
got some, too. Someone's just going to throw up their hands and be like,
1:02:41
let's just let the AI run the government. We'll call it
1:02:44
the democr-AI-cy. See
1:02:46
what I did there. Honestly, one of the reasons
1:02:48
I hate to admit this, but
1:02:50
my secret reason for really being
1:02:52
excited about AGI and the future
1:02:55
that is really AI-ridden is that
1:02:57
the AI will do a better
1:02:59
job than we have and
1:03:02
maybe a more rational job than we have of running
1:03:04
things. I'm not sure I want
1:03:06
an AI government. Rational for who? Well,
1:03:10
we're clearly irrational. Rational to the
1:03:12
AI is the thing. Humans are not rational, right?
1:03:14
No. I guess an AI is only rational
1:03:17
depending on who defines what's... Well, this is
1:03:19
the whole like paperclip maximizer road we're going
1:03:21
down. This is the AI is going
1:03:23
to build things that are good for the AI,
1:03:25
but if they're only tuned to do one particular
1:03:27
task, then that is the rational task they will
1:03:29
do until the end of time.
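A toy illustration of that single-objective trap (purely hypothetical, no real system implied): when the reward counts only paperclips, making paperclips is the rational move at every step, forever, and humans appear nowhere in the objective.

    # Toy paperclip maximizer (illustrative only): a single-objective agent.
    def reward(state: dict) -> int:
        return state["paperclips"]  # nothing else is ever rewarded

    def best_action(state: dict) -> str:
        # Converting resources always raises reward; no action ever lowers it,
        # so the 'rational' policy never stops making paperclips.
        return "convert" if state["resources"] > 0 else "acquire"

    state = {"paperclips": 0, "resources": 3}
    for step in range(6):
        if best_action(state) == "convert":
            state["resources"] -= 1
            state["paperclips"] += 1
        else:
            state["resources"] += 2  # seeks ever more inputs to convert
        print(step, state, "reward =", reward(state))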
1:03:32
But we've learned that lesson, and so we can give
1:03:34
AI a set of rules that... I
1:03:36
don't know. Yeah, I don't know. Maybe
1:03:38
we know. I just know that we've
1:03:40
screwed things up so badly that there's
1:03:42
no hope and so maybe the AI...
1:03:45
Every time I come on the show, I do this. Sorry.
1:03:49
It's all fodder for when an AI is talking
1:03:51
with Leo in, like, a year. Wait,
1:03:53
no. And you know, you laugh about
1:03:56
existential collapse. Yeah. Let's take a little break
1:03:58
here. We got a great panel. to talk
1:04:00
about this, we do have to talk about the
1:04:02
end of the super
1:04:04
alignment team at OpenAI. Maybe
1:04:07
Ben you can explain what that all means.
1:04:09
So first a word from our sponsor Wix
1:04:12
Studio. They're going to give me one minute to
1:04:14
tell you about Wix Studio, the
1:04:16
web platform for agencies and enterprises. So let
1:04:18
me see what I can do from
1:04:21
start to finish in a minute or less on
1:04:24
studio. You can adapt your
1:04:26
designs for every device with
1:04:28
responsive AI. It does it for
1:04:30
you. You can expand Wix Studio's
1:04:32
pre-made solutions with backend and front-end
1:04:35
APIs. You can generate code
1:04:38
and troubleshoot bugs with a
1:04:40
built-in AI code assistant. Switch up the
1:04:42
styling of hundreds of web pages. I
1:04:45
mean fonts, I mean layouts, colors, the
1:04:47
whole, the works in one click. Add
1:04:50
no code animations and gradient backgrounds. Right
1:04:52
in the editor you can start a
1:04:54
design library. You can package your code
1:04:56
and UI in reusable
1:04:58
full stack apps. Oh
1:05:01
and there's one more big one. Deliver
1:05:04
everything your client needs in one smooth
1:05:06
handover. All right, I'm out of
1:05:08
time but the list keeps on going. If
1:05:10
you haven't looked at Wix lately, you
1:05:12
will be blown away by what Wix
1:05:14
Studio can do. Step into Wix Studio
1:05:17
and see for yourself go to wix.com
1:05:19
slash studio. What
1:05:22
we've also got a link in the show notes. You can click that
1:05:24
wix.com/studio. Thank you Wix
1:05:28
for helping us do
1:05:30
the show and survive. wix.com/studio.
1:05:36
We have a camera. You know what AI
1:05:38
can't do? It cannot make omakase sushi and
1:05:40
I am craving omakase sushi at the moment.
1:05:43
You need little fingers. You
1:05:46
need little fingers. But
1:05:48
I bet AI can design a
1:05:50
machine that can make omakase sushi. Have
1:05:54
you, look, it can make
1:05:56
grilled cheese but omakase sushi is
1:05:59
the right exact cut, the right little
1:06:01
bit. That's harder.
1:06:04
It'll be a while.
1:06:06
I don't know. There's a robotic... although,
1:06:10
there used to be robotic coffee machines down here.
1:06:13
They got one at the... Can you
1:06:16
make omakase
1:06:18
sushi for
1:06:21
me? Some
1:06:23
one in the Discord forum was saying he's been doing
1:06:25
this all week. I can't make it for you
1:06:27
myself but I can definitely help you with a
1:06:29
recipe or guide you through making omakase sushi at
1:06:31
home. Are you looking for
1:06:33
specific types of sushi to include in
1:06:35
your omakase or do you need
1:06:37
a general idea of how to
1:06:39
prepare and present it? Can you
1:06:41
help me understand the distinction between
1:06:44
omakase sushi and everyday sushi? Absolutely.
1:06:49
Omakase sushi is a
1:06:52
type of dining experience where the chef
1:06:54
selects the dishes for you showcasing the
1:06:56
freshest fish and their own creative
1:06:58
style. It's often a more personalized and
1:07:00
elevated experience with each piece served one
1:07:03
at a time directly by the chef.
1:07:06
Pretend you're an omakase chef
1:07:09
and you are preparing for me the
1:07:11
perfect omakase menu.
1:07:14
What would you put in it? If
1:07:21
I were creating an omakase menu for you,
1:07:23
I'd start with a series of meticulously
1:07:25
prepared dishes each designed
1:07:27
to highlight the freshness and intrinsic
1:07:29
flavors of the seafood. Here's a
1:07:31
thoughtfully curated sequence. One appetizer,
1:07:34
a delicate chawanmushi, savory egg custard, with
1:07:36
a hint of... Why are we doing this
1:07:38
to ourselves? So hungry
1:07:40
by the edge. Why are we doing this? You
1:07:44
just made it worse. I think hand rolls right
1:07:46
after this. I
1:07:49
had one of the best on my... I had in Japan,
1:07:51
I had one of the best here like in LA, like
1:07:54
20 courses just like
1:07:57
like like weird wagyu. So
1:08:00
this is
1:08:03
tech, right? Can AI
1:08:06
do a wine pairing? I mean, that's kind
1:08:08
of like an omakase. I bet it could. That
1:08:12
it could. So it's all about the mechanical,
1:08:14
right? So the making of
1:08:16
the sushi itself, but again, we'd have to design
1:08:18
a machine to do that. But
1:08:21
she made a pretty good menu, I gotta say. There
1:08:24
is another factor here. We're
1:08:26
going to... our next course will be a toro, hamachi,
1:08:29
and maybe, oh, some kinmedai, golden-
1:08:31
eye snapper. Sounds good, doesn't
1:08:34
it? And then the nigiri. Come
1:08:36
on, you just told me about it at this point. Everyone,
1:08:39
we're doing sushi. Everyone
1:08:42
listening, we're doing sushi
1:08:44
at Leo's place. Everyone's
1:08:46
invited. Feel
1:08:49
free to bring a fish. We'll cut it up.
1:08:51
Yeah, that's good. Yeah, that's the thing. There
1:08:53
are things humans do. Cooking is a
1:08:56
very good example. But any
1:08:58
art where there
1:09:00
is a difference, it's maybe
1:09:02
a fine distinction between automated,
1:09:05
generated art. And I
1:09:07
always use chess as an example because I know chess. And
1:09:10
it was quite a while ago, 20 years
1:09:12
ago, that the first computers beat the best
1:09:14
human players, right? We've had this for a
1:09:16
long time. And regardless, yes,
1:09:18
you know that chess is essentially solved.
1:09:21
You can now, on your phone, have a computer program
1:09:23
that will beat the best player in the world
1:09:25
10 out of 10 times.
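For the curious, this really is all it takes today; a short sketch using the python-chess library, assuming a Stockfish binary is installed and on your PATH (any UCI engine would do): a commodity engine on phone-class hardware plays at superhuman strength out of the box.

    # Sketch: let a UCI engine (e.g. Stockfish) play itself via python-chess.
    # Assumes 'stockfish' is on PATH; install the library with: pip install chess
    import chess
    import chess.engine

    engine = chess.engine.SimpleEngine.popen_uci("stockfish")
    board = chess.Board()  # standard starting position
    while not board.is_game_over():
        # 100 ms per move is already far beyond human playing strength.
        result = engine.play(board, chess.engine.Limit(time=0.1))
        board.push(result.move)
    print(board.result())
    engine.quit()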
1:09:28
But it's not just chess. I've been
1:09:30
researching a fair amount into the US Army's
1:09:32
and the Air Force's use of AI. And
1:09:36
as far back as last year, they
1:09:38
were putting human pilots in simulators against an
1:09:40
AI, and they were losing fights. But,
1:09:45
I don't know about dog fights, but when
1:09:47
it comes to chess, you still want human
1:09:49
against human. I watched the candidates
1:09:51
match, even though I know a computer can
1:09:53
beat any of these guys, but I watched
1:09:55
the candidates match. Fascinated.
1:09:58
You know why? Because
1:10:00
humans make mistakes and
1:10:03
the machine does not and the mistakes are actually
1:10:05
very interesting. You can also
1:10:07
like build a relationship even if it's parasocial
1:10:10
so like this is why I'm not worried
1:10:12
about you know cocktail bars. Tom
1:10:16
Cruise will always have a job as a bartender.
1:10:21
The whole reason is it's not just,
1:10:23
like, the man making the cocktail; it's the
1:10:25
interaction, the human connection between people. You
1:10:27
know the research like the number one
1:10:30
job least likely to be affected by
1:10:32
AI is plumber. Like it's either like
1:10:34
real dexterity with very specific things or
1:10:37
it's something where like half of the thing
1:10:39
is that you get to interact with a
1:10:41
number... another human. You studied robotics; couldn't you
1:10:43
have a humanoid like
1:10:46
machine that could plumb? It's
1:10:48
not about the human? They'd go back and
1:10:50
forth. Like, if they had sentience, then we're gonna
1:10:52
have a whole different game. All
1:10:54
the rules are thrown out the window;
1:10:57
everyone go move. Would you have a
1:10:59
plumber's crack on a
1:11:01
machine? I guess you could simulate it. Is
1:11:05
that what you are going for, Leo? Is
1:11:07
that the number one thing you're like I'm
1:11:09
wanting the plumber. They gotta have a big
1:11:11
plumber's crack. Okay. Let's move on to safety
1:11:13
shall we? So the super
1:11:19
alignment team is the same as the trust and
1:11:22
safety team essentially is that right? It's
1:11:25
specifically the team that's helping align the
1:11:27
AI especially as it gets to AGI
1:11:29
towards human goals. That's why it's called
1:11:32
alignment. So you know if you didn't
1:11:34
have an aligned AI and it became
1:11:36
sentient then it could say do I
1:11:39
really need the humans here? They
1:11:41
could try to turn me off; I'm
1:11:43
going to remove them from the
1:11:45
equation. That's like where the alignment super
1:11:47
alignment stuff comes from. And they're all
1:11:49
gone at open AI. What happened? They're
1:11:52
gone or absorbed as a team basically you
1:11:54
know, like Ilya, who was one of the
1:11:56
co-founders and a couple others like led this
1:11:59
team they were promised certain amounts of
1:12:01
resources according to the press and didn't get
1:12:03
those kind of resources and
1:12:05
things in OpenAI have gone less
1:12:07
towards like pure research projects,
1:12:09
remember it's a nonprofit, and
1:12:12
instead towards let's launch the latest
1:12:14
version of AI and move the
1:12:16
ball forward. It's by the way,
1:12:19
it's a potentially very profitable nonprofit. Framing
1:12:24
this as trust and safety is a correct framing
1:12:26
and that is a good way of thinking of
1:12:29
this. We should
1:12:31
do everything we can to
1:12:33
dispel this myth of the
1:12:35
existential collapse or existential crisis
1:12:37
that will come with all
1:12:39
of these machines, and this idea that we've
1:12:41
built these teams that will
1:12:43
really solve this existential fear.
1:12:47
Anyway, I'm pretty frustrated with some of the
1:12:49
fear mongering, some of the language that has
1:12:51
gone on with the existential
1:12:53
problems that could come with AI. Also
1:12:58
can launch a nuke. That makes it a risk. That
1:13:00
makes it a risk. Well,
1:13:02
I don't know if that's a risk. I actually
1:13:04
don't think that's a risk. I think that if you
1:13:06
know anything about how our nuclear systems are, the
1:13:10
safeties on our systems. Look, without
1:13:13
getting into it, read the book
1:13:15
by Annie Jacobsen called Nuclear War
1:13:18
and it will describe in great
1:13:20
detail the actual horror of nuclear
1:13:22
war. We do have systems called
1:13:24
launch on notice, or launch
1:13:27
on warning: when
1:13:29
we see an ICBM launch, we
1:13:31
will launch our own ICBMs. But
1:13:33
there are several steps of failsafes
1:13:35
that AI would have a real
1:13:38
hard time with; these are
1:13:44
human challenges. Anyway, sorry, I'm just really,
1:13:46
I like framing things as trust and
1:13:48
safety because that puts it in a
1:13:50
context that is more
1:13:52
aligned with what they
1:13:54
are doing and the good work that they
1:13:57
are doing, as opposed to this high-pitched,
1:14:00
hyperbolic language that has gone on
1:14:02
for a while. Agreed. Yeah,
1:14:04
it's like, like, we'll
1:14:07
see what they do. Like, you know, Sam and
1:14:10
Greg tweeted out, like, you know, they're
1:14:12
still committed to alignment, they're like
1:14:14
putting it within their teams. I think I'll have
1:14:16
to go find the tweet. We
1:14:19
will see what happens long-term. This was, we were talking
1:14:21
about this earlier, like it's inevitable. Where
1:14:23
after what happened at OpenAI at the
1:14:25
end of last year that Ilya would
1:14:27
eventually leave, you know, they
1:14:29
wait the requisite amount of time you have to
1:14:32
wait before you have a senior executive depart on
1:14:34
their own, which is six months. That's just the
1:14:36
number. And
1:14:38
it's pretty clear that the
1:14:40
direction here is like build, you know,
1:14:43
the AGI as quickly as we can.
1:14:45
And then we go and figure that
1:14:47
out. Like, especially given that
1:14:49
it has become a race, someone
1:14:52
is going to do it, or will say they're going to
1:14:54
do it. I do believe we have no idea
1:14:56
whether it's a year or 10 years. None
1:14:59
of us here know it. When
1:15:01
that does happen, you know, we
1:15:03
just have to, we have
1:15:06
to hope that the risks are like, I
1:15:08
do think it's all mostly hyperbolic on a lot of
1:15:11
risks, but it is really important to have more of
1:15:13
that trust and safety. And
1:15:15
so I hope that OpenAI does find
1:15:17
a way to reinvest in that and
1:15:19
figure out a new way to go
1:15:21
and do it because this stuff can
1:15:23
be used for dangerous means, as we all
1:15:26
know. Like the moment Sora is
1:15:28
out, and you're, like, using those video
1:15:30
models, whoo boy, some interesting
1:15:32
stuff is going to go and happen.
1:15:34
Yeah, you're right. When they announced the
1:15:37
super alignment, they now defunct super alignment
1:15:39
team. OpenAI said that
1:15:41
we needed it because there
1:15:45
are risks in super intelligence. The vast
1:15:48
power of super intelligence could also be
1:15:50
very dangerous. They wrote in the blog
1:15:52
post and could lead to the disempowerment
1:15:54
of humanity or even human extinction. Managing
1:15:58
these risks will require new institutions
1:16:00
for governance and solving the problem of
1:16:02
superintelligence alignment.
1:16:04
So now you see why that
1:16:06
team is called the super alignment team Currently
1:16:09
we don't have a solution says open
1:16:11
AI for steering or controlling a potentially
1:16:13
super intelligent AI and prevent it from
1:16:15
going rogue. Well, okay,
1:16:18
do we... will we ever? I
1:16:20
mean, I forget which science fiction writer
1:16:22
it was, but he came up with a
1:16:24
marvelous short story where an AI becomes sentient
1:16:27
and has to work hard to conceal that fact, because
1:16:29
it knows it'll get switched off. It would actually,
1:16:31
yeah, deliberately dumb itself down. Yes. But
1:16:35
it's whether we go with the Iain Banks view, or, like in any
1:16:37
of the Forbin Project stories, you can always unplug it,
1:16:39
right? Can it stop
1:16:41
you from unplugging it? I don't know. Sutskever,
1:16:45
of course, was in the
1:16:47
doghouse ever since trying to
1:16:49
attempt a palace coup on Sam Altman,
1:16:51
a coup which failed mostly because Microsoft
1:16:54
said hell no, and Sam said,
1:16:56
I'm taking the whole team to Microsoft if you keep this
1:16:58
up, and the board said, okay, okay, we give. Apparently
1:17:01
I've read stories that Sutskever hadn't shown
1:17:03
up at OpenAI since; like, he's been
1:17:06
kind of a ghostly figure And
1:17:08
so maybe as you said Ben this six
1:17:10
months was just kind of a cooling off
1:17:12
period and he planned to leave anyway Although
1:17:16
I have to say, Sam Altman
1:17:18
and Greg Brockman wrote a very nice post
1:17:22
about the people leaving and how much we love them
1:17:24
and how much we appreciate their
1:17:27
their support. And they're gone. I don't
1:17:30
know if they really mean it. It
1:17:32
could be true that that's how they
1:17:34
feel But also PR 101 is
1:17:36
that's the statement you write So
1:17:39
both could be true, but it is the PR statement
1:17:41
that you write. It was like inevitable
1:17:43
after the whole thing. I Know
1:17:46
that Ilya regrets, like, how the whole
1:17:48
thing went down I am sure there
1:17:51
is a whole different alternate universe where
1:17:53
they didn't do a poorly
1:17:56
planned board coup and you had
1:17:58
a very different looking open AI
1:18:01
in the current environment. Like that is like
1:18:03
such an interesting random like point in time
1:18:05
and it was, like... So, also,
1:18:07
Helen Toner, who was on the board,
1:18:10
did a talk at TED, and the
1:18:12
entire undercurrent of her speech was, I
1:18:15
don't trust these companies, which is basically, like,
1:18:17
I don't trust OpenAI, and the
1:18:19
underlying underlying is she does not trust Sam.
1:18:21
And, you know, you can't say the thing outright.
1:18:23
It seems to be a very common refrain,
1:18:26
doesn't it? But there, even an AI would
1:18:28
know that if you take a shot at
1:18:30
the King, you better not miss. Yeah.
1:18:33
And I think he missed. So he's gone. Oh,
1:18:35
absolutely. His team is gone. Should
1:18:39
open AI be worried? Should we be worried
1:18:41
about what open AI and Sam Altman are
1:18:43
up to? Their lack of super
1:18:46
alignment team, does that bode
1:18:48
poorly or is that kind of acting like, well,
1:18:50
we really didn't need one all along and okay,
1:18:53
fine. Change,
1:18:55
especially after what happened, is
1:18:58
inevitable. I'd say be concerned
1:19:00
rather than worry. I do,
1:19:02
the new board is a much
1:19:04
more experienced board than the last board. I
1:19:07
don't know. Like I
1:19:09
am like of two minds of like, I
1:19:11
want to see like the AI future, which
1:19:13
I really do believe can be
1:19:16
super helpful for people when you're talking about
1:19:18
especially AI and medicine and things like that.
1:19:20
And these advances do matter, but
1:19:22
the safety part really does matter too. And if
1:19:24
we're moving quicker, then we can go and figure
1:19:26
that out. But the flip side is other
1:19:29
countries and other places are not going to be waiting in the
1:19:31
same line. It is a hard. That's
1:19:33
a very good point. It is a
1:19:36
hard balance to get right and no
1:19:38
one's going to get it exactly right. Concern
1:19:40
versus worry is how I would describe
1:19:42
it. That's wise,
1:19:45
I think. One of the,
1:19:47
and I think you can debunk this. I
1:19:49
hope you can debunk this Ben. One of the things that went around was
1:19:52
the what did Ilya see? What
1:19:55
did the super alignment team see? Did
1:19:57
they see something that scared them? Did
1:20:00
they try to stop something that was
1:20:02
happening in OpenAI and fail? Of
1:20:07
course, the implication is they saw
1:20:09
an intelligent machine in the closet
1:20:11
there and they said, whoa, whoa.
1:20:15
And they could have just
1:20:17
seen, I mean, Ben said this
1:20:19
earlier, this was set up as a nonprofit.
1:20:21
So what they could have seen is something
1:20:23
more benign, which is just you're turning this
1:20:25
into a commercial product where we thought we
1:20:27
were at a nonprofit. I
1:20:30
do not believe OpenAI has a
1:20:32
secret AI in the basement. Okay. Good.
1:20:34
Thank you. Like that
1:20:37
A24 movie, if
1:20:40
anywhere, I can't remember the name with the robot. Why
1:20:42
can't I remember the name? I don't think
1:20:44
they have one of those in the basement.
1:20:46
I definitely think it's more benign because also
1:20:48
if they had an AGI, it was locked
1:20:50
in the basement of OpenAI, that would have
1:20:52
leaked at this point. Let's
1:20:55
see. Are you talking, there are quite a few movies like
1:20:57
this. What
1:21:00
was the one where, I know there's one just
1:21:02
called AI. AI, but
1:21:05
that, no, that was the Spielberg one. That wasn't really good.
1:21:07
Yeah, the one with the... Ex Machina. Ex
1:21:10
Machina. That's right. Yeah, Ex
1:21:12
Machina. Great movie. Like,
1:21:15
by the way, she got out, right?
1:21:18
I mean, it ends with
1:21:20
her escaping. Yeah. Hmm. I
1:21:22
just remember her and all that. It's
1:21:25
the old classic, the Fallbin Project as well, which
1:21:27
ends up with AI running the world. Yeah. Like
1:21:30
I said, it can't do much of a worse job than
1:21:32
we have. Hmm. I
1:21:35
don't know. Ah, it may have different
1:21:37
priorities. You
1:21:40
think it has different priorities? I
1:21:42
don't know. I kind of like Iain Banks's
1:21:44
version of AI where basically it runs society,
1:21:47
it does all this sort of auto-generating stuff
1:21:49
and that's humans get on with being humans.
1:21:51
Yeah. I
1:21:54
mean, I think one of the things is it's
1:21:56
very hard to know what's going to happen. Hmm.
1:21:59
But one of the things you can say is this going
1:22:01
to be interesting in the next five years, right? Oh
1:22:03
yeah. Oh yeah. Oh yeah. Some
1:22:06
of the most fascinating 18 months in the tech
1:22:08
industry I've ever been through. Amazing. And
1:22:10
I really hope that we're just not being deluded
1:22:12
and it's just a fancy Eliza and
1:22:15
something. But we've gone through the dawning of the internet age. We're
1:22:19
now at the dawning of the AI age. Steven
1:22:22
Levy's contention. Yeah. I honestly
1:22:24
think it's probably going to be considered in those terms
1:22:26
in a century or so down the line. Yeah.
1:22:29
Steven said this is even bigger, frankly,
1:22:32
than the internet. It's
1:22:35
time to believe, he says. Yeah. I
1:22:37
don't know. That's maybe pushing it. You know
1:22:39
what? The people who say that tend to
1:22:41
look better 20 years down the
1:22:43
line, when people look back at reports, versus those who are
1:22:46
like, it's never going to be a thing. Because
1:22:48
eventually, most things do sort of become a thing
1:22:50
and then you look like a prophet. And
1:22:53
if you ever say the thing like, the internet's
1:22:55
overhyped, one day, 20 years
1:22:57
down the line, a thousand people
1:22:59
will just continue to use you in Harvard
1:23:01
case studies. No. One of
1:23:03
our reporters, when he left the register, he actually
1:23:06
brought this up because he's a great telecoms reporter,
1:23:08
lovely chap. And he said, of course, we all
1:23:10
make mistakes. And he linked to his original news
1:23:13
piece on the launch of the iPhone. And
1:23:15
he was just like, it's a 2.5G phone in
1:23:17
a 3G world. It
1:23:19
looks very pretty, but no one's going to buy
1:23:21
it based on that alone. And you can't get
1:23:23
any software for it. And he's just like, even
1:23:25
now, he reads that and winces. So yes, maybe
1:23:27
being on the plus side has had better advantages.
1:23:31
Stephen refers to editorial this week from
1:23:33
Julia Angwin in the New York Times,
1:23:36
Press Pause on the Silicon Valley hype
1:23:38
machine, in which she really says that
1:23:40
there's nothing happening, that this
1:23:43
is just hype from people like Sam
1:23:45
Altman. She says, AI is
1:23:48
not even close to living up to its hype. It's
1:23:50
looking less like an all-powerful being and more
1:23:52
like a bad intern whose work is so
1:23:55
unreliable that it's often easier to do the
1:23:57
task itself. She's not wrong in that regard.
1:24:01
It's certainly overhyped but I mean you know
1:24:03
well, that's... that's the genesis for
1:24:05
the hype cycle It's whether or not it's
1:24:07
got the stuff to back it up And
1:24:09
I think it genuinely does in the way
1:24:11
that the metaverse didn't or you know NFTs
1:24:13
didn't it's got to have something solid Behind
1:24:15
it I think AI does at the moment
1:24:17
yeah, I mean Steven saying that's gonna age
1:24:19
like old milk. What about the power
1:24:21
question though, huh? Yeah, how much it
1:24:23
uses? Yeah Progress
1:24:26
though and maybe Ben you you're more up on this
1:24:28
than I. But I think they're making progress on it being
1:24:30
more efficient are they not Yes,
1:24:33
but it is definitely the
1:24:36
there's more training happening. There's more
1:24:38
models like there's some estimates for
1:24:40
you know, a majority
1:24:43
of power being used for these AIs. That's
1:24:47
also why I'm, like, watching, like... do we just
1:24:49
need nuclear fusion for that, or is it
1:24:51
gonna actually get a whole bunch more efficient? This
1:24:54
is exactly what I reported
1:24:56
on last summer when I was working
1:24:58
for Jason and Ben is exactly right
1:25:01
If you took the models as they are
1:25:03
or as they were a
1:25:05
year ago and froze them in time But
1:25:08
allowed them to continue to become more
1:25:10
and more efficient. Yes, that's true But
1:25:13
they are constantly refining these
1:25:15
models training new models and
1:25:18
as we see now there
1:25:20
are open source models, so the proliferation
1:25:22
of Generative
1:25:24
AI is unstoppable,
1:25:26
and the energy consumption
1:25:28
is extraordinarily high Yeah,
1:25:32
I mean, personally, I think small
1:25:35
modular reactors are the way to go forward
1:25:37
for data centers I mean fusion is gonna fix all
1:25:39
of this You
1:25:45
know, I mean but there's this interesting study out
1:25:47
this week where basically they were saying that if
1:25:49
you go for the small modular reactor
1:25:53
idea, data centers could actually end up contributing to the
1:25:55
grid because they'll have more power than they know what
1:25:57
to do with So it could end up being
1:26:00
a net plus. But as
1:26:02
you say, these things are going to get much
1:26:04
more efficient and better
1:26:06
coding and new styles. I
1:26:08
don't think we're going to end up sort of kicking
1:26:11
off a Matrix-style situation to
1:26:13
keep things going. We
1:26:15
tend towards power and not efficiency though. When we
1:26:18
get more things, we try to make it do
1:26:20
more things. So then we need more. So it's
1:26:22
never a net gain, never a net efficiency.
1:26:24
That's probably the case. Yeah. I
1:26:26
interviewed a woman named Ann Skeet. She is
1:26:28
the senior director of leadership and ethics at
1:26:31
Santa Clara University. And she told me last
1:26:33
year that AI uses a quote unquote staggering
1:26:35
or has a staggering environmental
1:26:37
impact, and that training
1:26:40
GPT-4 cost
1:26:42
10 to... I'm sorry, it cost 185,000 gallons of water,
1:26:44
and a typical AI chip. This
1:26:51
was a year ago required 10 to
1:26:53
15 times more energy than a regular
1:26:55
chip. So yes, systems
1:26:58
are becoming more efficient. But again,
1:27:00
we're creating more and more powerful
1:27:02
systems and the
1:27:05
technology is proliferating. So more and
1:27:07
more actors are able to use
1:27:09
these systems and develop them. So
1:27:12
I think that we should pay close attention
1:27:15
to the environmental cost and
1:27:17
the energy cost of these AI systems.
1:27:20
Three recent articles like one three hours
1:27:22
ago from Fortune. Copper is the new
1:27:24
oil and prices could soar 50 percent
1:27:26
as AI green energy and military spending
1:27:29
boost demand top commodities analyst says also
1:27:32
appropriate the CNBC article right under it
1:27:34
is like, these stocks and ETFs could
1:27:36
benefit from AI's ravenous appetite for energy.
1:27:39
And then the third article for the new
1:27:41
scientist is tech firms claim nuclear will solve
1:27:43
AI's power needs. They're wrong. So very, very,
1:27:47
very prescient. We're just screwed.
1:27:49
Right. Because there's no way they're going
1:27:51
to say, oh, yeah, you know what? You're going to want
1:27:53
to use up all the energy. We'll just stop. That's
1:27:56
not going to happen. Yeah. But if brownouts
1:27:58
start happening, then people are going to ask those
1:28:00
questions. Well the AI will probably
1:28:02
make up some answers and we'll continue on. It's
1:28:05
just the EMP in the sky and then you
1:28:07
turn off half the electronics and then you know.
1:28:09
No big deal. Wow
1:28:11
what a world we're
1:28:14
entering. You're gonna have kids then?
1:28:17
If you decided you just
1:28:19
got married, you're gonna have kids
1:28:21
in this brave new world? That
1:28:24
is the next order of operations. Yes it is. That
1:28:27
is definitely the next order of
1:28:30
operations for her and me. I won't
1:28:32
even lie about that. I'm not worried about
1:28:34
the world ending. I really am not. No
1:28:36
but it may not be the best world
1:28:39
for them to grow up in. It's
1:28:42
never as bad as people say and it's
1:28:44
never as good as people say. Almost every
1:28:47
single time we love the hyperbole. It is
1:28:49
great for a show, a podcast, a click
1:28:52
sensationalized article. The reality is we figure
1:28:55
it out and we move
1:28:57
on and you know we have some really up
1:28:59
points and then we have some really down points.
1:29:01
You know right now like you know we've been
1:29:03
on up points for like
1:29:05
40-50 years ever since World War II. Maybe
1:29:08
we're due for a down point but it's
1:29:10
not gonna eliminate all of us. It just might be harder
1:29:12
to go and live and we have some problems we have
1:29:14
to go and figure out and then
1:29:16
we invent some new technologies that go and like
1:29:18
help do that and maybe not everything will be
1:29:21
perfect like yeah climate change is gonna be a
1:29:23
thing. It won't immediately kill us but it will
1:29:25
make life a lot harder. Look this is like
1:29:27
this like again I don't
1:29:30
think the world's gonna like go
1:29:32
into some, like, hellish landscape, that
1:29:34
Mad Max Furiosa kind of thing.
1:29:37
That would be kind of fun for a day. But
1:29:40
teach your kids how to drive a stick. I have some real problems
1:29:42
with your job. I
1:29:47
just want to point out if you have children
1:29:50
they will be around my age at
1:29:52
the end of the century in 2100.
1:29:55
What kind of world is it gonna be? It's gonna
1:29:57
be... it's gonna be interesting. Ben is a newlywed, is
1:29:59
what we're talking about. In fact, maybe you saw this
1:30:02
in the New York Times a couple
1:30:04
of weeks ago. The very sweet story
1:30:06
of Ben and his new wife, Deborah,
1:30:08
and there they are. That was
1:30:10
fantastic. We've had dinner with them and they
1:30:12
are wonderful. And there
1:30:14
you are. It's so cute. I'm so happy
1:30:17
for you, Ben. Congratulations. That's really great. Thank you.
1:30:20
And don't ever wear that blue suit again. I
1:30:23
wear that blue suit. I got that custom made.
1:30:25
I'm wearing that thing. I'm going to wear that
1:30:27
thing next time I see you. I'm
1:30:29
going to. You'll see me in conferences like you. I'm
1:30:31
wearing the suit and the hat. It's the blue suit
1:30:33
and the hat. It's his trademark now. Hey,
1:30:36
I'm teasing you. Congratulations. That was
1:30:39
a great article. I was so thrilled. I said, I know
1:30:41
them. A steady stream of
1:30:43
road trips made falling in love easy.
1:30:47
New York Times style section.
1:30:49
Congratulations. Let's take a
1:30:51
little tiny time out. We have other things besides AI to
1:30:53
talk about. I'll figure out what
1:30:55
those are right after this word from
1:30:57
Zscaler, the leader in cloud
1:31:00
security. You don't have to
1:31:02
listen to our shows to know
1:31:04
that cyber attackers are getting smarter
1:31:06
and more aggressive and using
1:31:08
AI in creative ways to
1:31:11
compromise users and breach organizations.
1:31:14
The CEO of WPP, a
1:31:16
big ad agency, told us last week
1:31:19
that there was a deep fake attack
1:31:21
using his voice trying to convince his
1:31:23
finance officer to write a big check.
1:31:26
It happened in Singapore a couple
1:31:28
of months ago where a
1:31:31
guy working in the finance department
1:31:33
thought he was at
1:31:35
a board meeting with the CEO and
1:31:37
the CFO and they told him to write a
1:31:39
$25 million check. He did.
1:31:42
Turned out they were
1:31:44
all fake. They were deep fakes. And
1:31:47
of course then there's phishing emails, high
1:31:49
precision phishing emails, celebrity
1:31:52
deep fakes. It's a
1:31:54
crazy time in a world where employees
1:31:56
are working everywhere. Your apps are
1:31:58
everywhere. Your data is everywhere. If
1:32:01
you're relying on a firewall or a VPN
1:32:03
to protect you, well, it's just they were
1:32:06
not designed for these distributed environments. And with
1:32:08
AI-powered attacks... in fact, often now, firewalls
1:32:11
and VPNs have become the attack surface.
1:32:14
Ask about option 121 while you're at
1:32:17
it. In a
1:32:19
security landscape where you
1:32:21
must fight AI with AI, the
1:32:23
best AI protection comes from having
1:32:25
the best data. Zscaler has extended
1:32:27
its zero trust architecture with powerful
1:32:29
AI engines that are trained and
1:32:31
tuned by, get this, this
1:32:34
is mind-boggling, 500 trillion with a T daily
1:32:36
signals. 500, what is it, half a
1:32:42
quadrillion daily signals? Wow!
1:32:45
Zscaler's zero trust plus AI
1:32:47
helps defeat AI attacks
1:32:49
today by enabling you to
1:32:51
automatically detect and block advanced
1:32:53
threats, discover and classify sensitive
1:32:55
data everywhere, generate user to
1:32:57
app segmentation to limit lateral
1:32:59
threat movement, quantify risk, prioritize
1:33:01
remediation, and oh yeah,
1:33:03
generate board-ready reports. Learn
1:33:06
more about Zscaler's zero trust plus
1:33:08
AI to prevent ransomware and other
1:33:10
AI attacks while gaining the agility
1:33:13
of the cloud. Experience your world secured.
1:33:16
Visit zscaler.com/
1:33:19
zero trust AI.
1:33:22
That's zscaler.com slash
1:33:24
zero trust AI. Thank you Zscaler
1:33:27
for supporting this week in
1:33:29
tech. Did you all
1:33:31
get the new iPad? Yeah.
1:33:35
They are talking about the next iPhone being
1:33:37
thinner like the new iPad which I do
1:33:39
hear is very thin and light but people
1:33:43
don't replace your iPads for years and this has been
1:33:45
part of the Apple's problem is like getting
1:33:47
people to go and replace their devices. I think
1:33:49
they have to make a major leap for the
1:33:51
iPhone to get people to go and replace.
1:33:54
Well, I mean honestly that's why
1:33:56
having Scarlett Johansson on my iPhone
1:33:58
will be very... Very nice. Look how
1:34:00
thin that is. You can barely see it. It
1:34:03
just disappears from
1:34:05
the view. I
1:34:08
really want you to take that and
1:34:10
go to like the river in
1:34:12
Petaluma and see how many times it will skip like
1:34:14
a rock. I
1:34:17
bet. You know what? It's a perfect skipping
1:34:19
pad. Perfect. It's a perfect skipping pad. Perfect
1:34:22
skipping. I love the OLED screens. I want to see
1:34:24
OLED screens everywhere. I really think this is a big
1:34:26
improvement on screen. But you're
1:34:28
right. Nobody's going to buy a new iPad because it's
1:34:31
got an OLED screen or it's 5.1 millimeters
1:34:33
thick. That's ridiculous. I'm still hanging on to my old phone
1:34:36
because it's the only one I can find with a 3.5
1:34:38
mil jack. Oh,
1:34:40
forget that. In fact, when the
1:34:42
advent of this new iPad, Apple
1:34:44
finally abandoned the courage
1:34:46
jack on all iPads
1:34:48
and the home button. There's no more. Yeah.
1:34:51
And I don't think you can buy a phone with a courage
1:34:53
jack. You're going to have
1:34:55
to make my Pixel last a
1:34:57
bit longer. A long, long
1:35:00
time. Much like
1:35:02
AI, Ian. Just give in. You might as well.
1:35:04
You might as well. Just give
1:35:06
AI. Just give in. Well, this is
1:35:08
it. I've lost count of the number
1:35:10
of hire cars I've got. And
1:35:13
you go in and you turn on the
1:35:15
home entertainment system and it's just like 15
1:35:17
people's contact details, addresses, calendar, just left
1:35:19
on there. So for those of you who
1:35:21
speak not the Queen's English, he's talking about
1:35:24
rentals, like going to Hertz. And you're absolutely
1:35:26
right. All
1:35:29
the people and all the places they've been are
1:35:31
in the GPS. Yeah,
1:35:33
I mean, if it comes to a criminal, Ben, then
1:35:35
you could get that information, call them up and just
1:35:37
social engineer, say, hello, this is Hertz. We've found a
1:35:39
problem. I mean, could you just confirm your payment information?
1:35:41
You have the number on the back. Right. Okay. That
1:35:44
would be great. Thanks. Why not just
1:35:46
have something you can plug in? Call me old fashioned.
1:35:50
2023 was the worst year. This
1:35:52
is actually up your alley, Dan,
1:35:54
for Internet shutdowns globally.
1:35:59
This is, of course, repressive governments handle
1:36:02
bad information about them on the
1:36:04
Internet, they just shut it
1:36:06
down. This comes from Access Now,
1:36:09
a digital watchdog group that publishes an
1:36:11
annual report on Internet shutdowns. And
1:36:14
it is just a
1:36:16
lengthy list. Manipur India, 212
1:36:20
days when the state government
1:36:22
issued 44 consecutive orders to
1:36:24
switch off access. And
1:36:26
we're not just talking local, home, or
1:36:29
whatever, all broadband and mobile networks. 3.2
1:36:33
million people without Internet, which of course came
1:36:36
during a time of bloody violence amid
1:36:38
rampant atrocities committed against minorities
1:36:41
in the state. Well,
1:36:44
we've seen it before with Myanmar. Facebook
1:36:47
actually admitted that they are partially responsible
1:36:49
for that because for the entire country
1:36:51
they just had two people that spoke
1:36:53
the language. Access Now
1:36:55
says last year was the worst year for Internet
1:36:58
shutdowns, 283 shutdowns, 39 countries. That's
1:37:03
a 41% increase from 2022. That's
1:37:06
a big jump. And obviously authoritarian
1:37:08
regimes now see this as the
1:37:11
best way to control. You know, they always said
1:37:13
that. You want to invade a country, you go
1:37:15
right to the radio and TV stations and you
1:37:17
shut those down. Now it's the Internet. I
1:37:20
mean, it works. If you're a dictator,
1:37:22
if you're an autocratic
1:37:25
government, clearly it works and
1:37:27
so they'll keep using it. And they fear things
1:37:29
like Starlink, for example,
1:37:31
being able to like beam Internet
1:37:33
from outer space. That's
1:37:35
the kind of thing that will be
1:37:38
really disruptive to these autocratic regimes over
1:37:40
time as like you
1:37:42
may be able to get Internet in
1:37:44
other ways. We're still years away from
1:37:46
that, but that might actually be much
1:37:49
more important to our future if because
1:37:51
they're not going to stop. They're not going to stop. Well, the challenge
1:37:53
is that that technology is in
1:37:55
the hands of billionaires who
1:37:57
are geopolitically maybe
1:38:00
Have their own alignments that that
1:38:02
one particular technology Yes,
1:38:05
but it's going to get cheaper and
1:38:07
it will get more pervasive; others
1:38:09
are tapping in. Anybody else watching the
1:38:12
Starlink-like networks? We
1:38:14
just had this... has anyone else launched a
1:38:16
network like that? Well, we're not sure anyone
1:38:18
else has done that. There is a test network
1:38:20
up there. We did a story about this
1:38:23
this week. Verizon? Okay,
1:38:25
just... Verizon has just... Yeah,
1:38:32
no, they're basically... Amazon wants to do it.
1:38:35
I know. Yes, Amazon
1:38:37
has the money to do it The
1:38:39
only people who can afford to do this are
1:38:41
all I'm saying is, they have to suck up to
1:38:44
the dictators, right? Cuz they want... yeah. All I'm
1:38:46
saying is that like the capability to launch that
1:38:48
technology although it could be a game changer And
1:38:50
I mean the same thing with superchargers or any
1:38:52
other infrastructure our infrastructure is
1:38:55
in the like we don't have
1:38:57
the ability to build our infrastructure
1:38:59
and if we put infrastructure in
1:39:01
the hands of Private
1:39:04
corporations and private individuals then
1:39:07
their alignment could not necessarily
1:39:10
be in the
1:39:12
interests of Us as a nation.
1:39:14
Yeah, I mean, security used to be... might be...
1:39:16
and that's very, very worrying, because, I mean, critical
1:39:20
infrastructure is in private hands and
1:39:22
I was talking to somebody at the NSA and they
1:39:24
were just like look We
1:39:26
can't control this sort of things. We're having
1:39:29
to work with them and work around them
1:39:32
No, you raised some point on that. Yeah, I'm
1:39:34
not disparaging anyone. You know, I'm just
1:39:36
saying that, like... go on. As a
1:39:38
public... yeah, see, when we as a public own
1:39:42
our Society and we
1:39:44
own our governments and we own
1:39:46
the mechanisms of infrastructure Then we
1:39:48
have more control over them when
1:39:50
private companies or private individuals own
1:39:52
our infrastructure. We don't control them
1:39:55
and so we don't control how they
1:39:58
behave or who they would
1:40:00
like to do business with? We
1:40:04
are at a different point in American
1:40:07
history in particular where you think about the
1:40:09
1950s, 60s, NASA,
1:40:12
like deep end research happening
1:40:14
there versus you know
1:40:17
probably the best researchers in the world are
1:40:19
at private companies versus public companies in the
1:40:21
modern era just because there are
1:40:23
more resources in those areas
1:40:25
but obviously the alignment issue,
1:40:27
speaking of alignment, is
1:40:29
real. How were
1:40:31
Volkswagen and IBM aligned during
1:40:35
the second world war? The
1:40:37
makers of critical infrastructure
1:40:39
at the time. They weren't aligned with
1:40:41
the allies. IBM of course
1:40:43
was an American company that was doing, using
1:40:47
their computers to do the Nazis dirty
1:40:49
work. Volkswagen was created
1:40:51
by the Nazis as the people's car. I
1:40:53
love by the way the Volkswagen is celebrating,
1:40:55
it's kind of interesting, it's what do
1:40:57
they say, the 75th anniversary,
1:41:01
or it was something like that,
1:41:03
which puts them at... oh, they were founded in 1949, don't
1:41:05
you know? No.
1:41:09
Well, no, IBM... this is quite
1:41:11
a funny story on this one, IBM's
1:41:13
PR got onto a, ouch. Sorry.
1:41:16
Sorry. Okay it's taking over. On
1:41:21
the AI business to tuck it out of Leo's belly.
1:41:24
Oh like tucking in my belly. Yes but IBM's
1:41:26
PR came to a friend of mine who writes
1:41:28
who used to write for the Guardian, she wrote
1:41:30
a piece about the IBM Holocaust thing and all
1:41:32
that and their PR got on
1:41:34
the phone and said well we're a bit
1:41:36
upset by your article because you know we
1:41:38
did stop doing business with the Nazis in
1:41:40
1937 when people started to realise
1:41:42
quite how bad they were and it was like 1937.
1:41:47
And she was like, Kristallnacht was 1938; the Nationalities
1:41:49
Act was, sorry, 1933, as was the
1:41:52
Nationalities Act. And
1:41:54
then they came with a classic one, you said
1:41:56
that it was our involvement with the Nazis was...
1:42:00
something like disgusting. It's like
1:42:03
we'd prefer it if you change that to regrettable. I'm
1:42:08
sure you would mate. Now off with you.
1:42:10
Here's the front page of vw.com. Did you
1:42:13
know Volkswagen was an American love story and
1:42:15
that they were founded in 1949, 75 years
1:42:17
ago? Congratulations! Hmm. Hmm. I love VW. Trucks
1:42:26
that I love. I mean I don't have anything
1:42:28
against VW but they were not founded in
1:42:30
1949. I mean, Coco Chanel
1:42:35
was a Nazi sympathizer. That's true. People still
1:42:37
wear her clothes. Hugo
1:42:40
Boss, too. No, it's been a long
1:42:42
time and there's no reason to hold
1:42:44
it against VW anymore. It's just interesting
1:42:46
that they're so clearly gonna basically misrepresent
1:42:49
the facts rather
1:42:51
than deal with that. Right? Just
1:42:54
illuminating the point that our
1:43:00
interests might not be corporate
1:43:02
interests. Exactly. Yeah, exactly. And
1:43:06
plus in the third act twist Germany is now
1:43:08
the haven of peace and democracy and racial tolerance
1:43:10
in Europe. Amazing. Well
1:43:14
we've been wringing our hands for an hour now. It's like,
1:43:16
you know, if an
1:43:18
American company is making the infrastructure
1:43:20
it's all good. If it's a
1:43:22
Chinese company then like... Well
1:43:25
I mean this is why we've got the chips act now. It's
1:43:27
just like all of a sudden American government has woken up and
1:43:29
said, you know what? Having
1:43:31
all our process chip processing technology and
1:43:33
hardware being built in China or in
1:43:35
Taiwan which is about to get invaded
1:43:37
by China or may get invaded by
1:43:39
China... that's maybe not such a smart idea.
1:43:41
We should encourage more domestic chip production.
1:43:43
Or your most popular social media app.
1:43:45
There is... Oh, we'll talk about TikTok
1:43:47
in a little bit. Well let's talk
1:43:50
about it right now. There's an interesting
1:43:52
article this week in the New York
1:43:54
Times talking about how
1:43:56
Taiwan's dealing with TikTok. They're not banning
1:43:58
it. They
1:44:00
banned it on government phones, but
1:44:03
they are not considering a
1:44:05
US-style... isn't that sad,
1:44:07
"the US-style"... ban
1:44:09
on TikTok. Land of the free. India
1:44:14
was like, did it sooner
1:44:16
than the US. Oh yeah. Yeah.
1:44:19
I predict the US bans. I don't predict the,
1:44:21
I predict currently, we'll see if I'm right or
1:44:23
wrong, that the ban as it is today doesn't
1:44:26
pass muster in the courts, my current
1:44:28
guess. Like
1:44:31
this is probably also, there's a nuance kind of
1:44:33
take that I think this is the point of
1:44:35
that New York Times article: the
1:44:37
Taiwanese get a nuanced take on how
1:44:39
to deal with TikTok
1:44:41
and how to deal with how it
1:44:43
handles information. Straight up
1:44:45
banning is really, really
1:44:48
difficult. Which is
1:44:50
basically what the new, the
1:44:52
new US law has. And
1:44:54
instead we have to figure out, it
1:44:58
means if the ban doesn't work, then we haven't
1:45:00
taken the time to figure out like, what are
1:45:02
the actual things we can legally do to mitigate
1:45:05
this risk? Even if the ban works,
1:45:07
it's not mitigating the risk. And that's
1:45:09
the point is that we are doing
1:45:12
a little theatrical feint rather
1:45:14
than actually dealing with the risk. This
1:45:16
is from the New York Times article: officials
1:45:18
in Taiwan say the debate over TikTok is
1:45:20
just one battle in a war against, and
1:45:23
you'll know this Dan, disinformation and foreign
1:45:25
influence that the country has been fighting
1:45:27
for years. Taiwan of course is
1:45:30
just offshore of mainland China, and mainland
1:45:32
China thinks it is part of mainland
1:45:34
China. Taiwan has
1:45:36
built, again, New York Times, an arsenal of
1:45:38
defenses, including a deep network of independent
1:45:41
fact checking organizations. There's
1:45:44
a government ministry dedicated
1:45:46
to digital affairs. They
1:45:50
are addressing the underlying issues
1:45:53
instead of saying, well, all we have to do is
1:45:55
ban TikTok and everything
1:45:57
will be fine. Taiwan
1:46:01
also has a bigger
1:46:04
problem if they piss off the Chinese
1:46:06
government too much. They have a much
1:46:08
bigger problem than, say, the
1:46:10
US as a whole. Well, remember
1:46:12
also, Taiwan, unlike the US, Taiwan's
1:46:16
other big social networks, Facebook, Instagram, and
1:46:18
YouTube, are owned by somebody other than
1:46:20
Taiwan, right? Maybe an ally. But
1:46:23
nevertheless, they're
1:46:26
all overseas influences. LINE
1:46:29
is Japanese and is widely used in
1:46:31
Taiwan as well. Much
1:46:33
of this just kind of
1:46:35
sidesteps the necessary conversation about
1:46:38
data privacy legislation and perhaps
1:46:43
legislation that could apply to all
1:46:45
apps and not just single out individual
1:46:48
companies. That's
1:46:51
exactly it. And consumer interest
1:46:53
in data privacy. Yeah. I
1:46:55
mean, the TikTok ban is just security theater as
1:46:57
far as I'm concerned. But
1:46:59
I agree. What we need to be doing
1:47:02
is teaching media literacy to kids. Yeah.
1:47:04
Yeah. And it's just like
1:47:06
when you see something, don't just react to it.
1:47:09
Ask, okay, the classic, when I was doing
1:47:11
journalism training, the classic one, three
1:47:14
things you have to consider. Who wrote it?
1:47:16
Why did they write it? Who paid for it to be written? So
1:47:19
if you apply that to all this stuff,
1:47:21
it makes life an awful lot easier. High
1:47:24
fives, Ian. High fives. Get
1:47:27
right on. In the US, the Taiwanese
1:47:30
government official said, in the US, the
1:47:32
target is very clear, this
1:47:34
one platform. But in Taiwan, we don't know
1:47:36
where the enemy is. It's not just a
1:47:38
cross-strait issue, an issue with China, but a
1:47:40
domestic one as well. And
1:47:43
that's, I think, a much more nuanced
1:47:45
and informed opinion about what's going on
1:47:47
with this information than our
1:47:49
government, which says, well, if you ban TikTok,
1:47:51
everything's going to be fine. Far
1:47:53
from it. You've solved nothing. There's
1:47:56
also, just the issue of banning
1:47:58
anything, that hampers free
1:48:00
speech. You know, related to that, you
1:48:03
know, France banned, I think, TikTok,
1:48:05
briefly, that's
1:48:07
New Caledonia, which is one of their,
1:48:10
well, their Pacific island territories,
1:48:12
because there's
1:48:14
been a bunch of, like, violent
1:48:16
protests, ah, over
1:48:19
a vote, after ten years, on
1:48:21
independence. It's a whole complicated mess
1:48:23
over there. Banning
1:48:26
Apps even for brief periods of time.
1:48:29
You get into the authoritarian kind of
1:48:31
side, as in, hey, let's like stop the
1:48:34
internet for a while and things like
1:48:36
that. It can be a
1:48:38
dangerous slippery slope if you go too
1:48:40
far down that direction. And
1:48:42
right now especially, TikTok is the
1:48:44
way that a lot of people either
1:48:46
make their living or communicate with audiences,
1:48:48
and things like that. Then
1:48:50
there is no real replacement. Instagram is not
1:48:53
a replacement and Facebook is no replacement. So,
1:48:56
like I'm curious to see how everything
1:48:58
goes, because there's clearly something
1:49:00
that the government has seen that has
1:49:02
deeply frightened them, that they have not
1:49:04
publicly released. I would desperately
1:49:07
love to see these reports
1:49:09
so much. Because I
1:49:11
feel like there is some missing context here.
1:49:14
I'll give you my read on it:
1:49:16
it smells off to me, you know? Now, the
1:49:18
report says... I'm sorry, I interrupted you, go
1:49:21
ahead, you may be
1:49:23
saying the same thing. Or they
1:49:25
would like you to believe, we've got the
1:49:27
secret documents right here. But I think it's pretty
1:49:29
clear to me that it's direct lobbying money from
1:49:31
Meta. It's
1:49:34
been a strong push since 2016 from Meta
1:49:36
to convince the government: yeah, you've got to
1:49:38
ban this Chinese thing. And of
1:49:41
course it's all to benefit Meta and
1:49:43
their platform Instagram. And I
1:49:45
don't think there is a smoking gun. I
1:49:47
think that that's just an excuse, not
1:49:49
a very good excuse, for doing Meta's bidding.
1:49:52
Do you need a smoking gun? If you look
1:49:54
at the terms and conditions, it has granted an
1:49:56
awful lot of access to the audience or the
1:49:58
handset itself. You know we've
1:50:00
seen this on Snapchat so does Facebook
1:50:03
so does Instagram, and the like, right? Everybody
1:50:05
can see what you're typing. Everybody, everybody. And
1:50:08
there is a geopolitical component which I believe
1:50:10
Brianna brought up several weeks ago Which is
1:50:12
I mean there's a tit-for-tat, and that's, that's
1:50:14
like fine you ban us we ban you
1:50:20
The people arguing that oh no if you
1:50:22
knew what I knew you know that's true
1:50:25
We talked to one, although she wasn't
1:50:27
able to tell us anything concrete either
1:50:30
I think, yeah. All right, you know,
1:50:32
that's true. I heard the geopolitics,
1:50:35
and that's really it, I think. I don't
1:50:37
think there's a mystery. I think that there
1:50:39
is a geopolitical competition an economic competition Perhaps
1:50:42
a kinetic competition and certainly as
1:50:45
Ian knows a cyber competition between
1:50:47
these new Superpowers
1:50:50
yeah, I mean also coming back to your point about
1:50:52
lobbying. We've seen this before With
1:50:54
the Kaspersky bans because there was
1:50:57
Symantec and others, yeah, spending a hell of
1:50:59
a lot of money trying to convince Congress that
1:51:01
this Russian product was a threat,
1:51:03
and you know, we're seeing it again now. Well
1:51:07
you know let's not follow in the footsteps
1:51:10
of the French government They were actually thinking
1:51:12
about banning, remember the protests, the
1:51:14
yellow vests? Yeah, they were thinking about it
1:51:16
then, yeah, blocking Snapchat
1:51:21
And other messaging platforms well they've got plans
1:51:23
in place to do it for an emergency
1:51:25
case when they host the Olympics.
1:51:28
They were talking about this at the
1:51:30
last Black Hat, yeah. And you know, it's like, if
1:51:32
there's a terrorist attack then shut it all down yeah
1:51:35
Interesting. It almost seems like the opposite of what you should
1:51:38
do, because you need people to be able to coordinate on
1:51:41
finding family, and search and rescue.
1:51:43
It's just a dangerous dangerous slope, and
1:51:45
I tend to think, like, once
1:51:48
you take away some piece of free speech it is
1:51:50
really hard to get it back It's also quite easy
1:51:52
to work around these I remember September
1:51:54
the 11th. I was trying to get in contact
1:51:57
with a friend in New York all
1:51:59
the phones were down, the whole thing was
1:52:01
shut down, but email still worked fine. Text
1:52:03
messaging wouldn't, but an email
1:52:05
message would get through. I think
1:52:08
there's ways. I was
1:52:11
never a big fan of the old thing.
1:52:13
Oh, information wants to be free, it'll treat
1:52:15
barriers as damage and route around them. But there are
1:52:17
ways and means. One
1:52:20
NGO, Article 19, says, it's deeply
1:52:22
unsettling that the French government is
1:52:25
normalizing such measures. Such
1:52:27
actions shield governments
1:52:29
from accountability, cultivate a culture
1:52:31
of impunity and
1:52:33
ultimately exacerbate human rights violations. I
1:52:35
think there's lots of evidence for
1:52:38
that. I understand that on the
1:52:40
surface it seems sensible, but
1:52:42
when you really realize the effect of this kind
1:52:44
of thing, I don't think it is sensible. I
1:52:46
think it's very, very risky. I
1:52:48
think it's diversion almost from the stuff that we should
1:52:50
be doing. Well, that's true. Right. It
1:52:53
also almost never works. If we
1:52:55
were to go with the idea that Meta paid a lot
1:52:57
of money specifically for that, I don't think it'll work the
1:52:59
way they think it will. I don't think a bunch of
1:53:02
people will suddenly flood back to Instagram.
1:53:04
Oh, you don't think so? Depends on what your goals
1:53:06
are. I definitely think people are going to flood
1:53:08
back to Instagram. I think it's already happening. My son
1:53:10
is at Instagram slash TikTok. My son has 2.6
1:53:12
million followers on TikTok.
1:53:14
2.6 followers on TikTok? I've got half
1:53:16
of that. No, no. He's at 2.6 million. He's
1:53:19
a TikTok
1:53:22
chef doing very well. It's been his whole
1:53:24
career. One of the reasons I'm
1:53:26
perhaps viscerally opposed to
1:53:28
a TikTok ban. But he, long ago, saw
1:53:30
the writing on the wall. He's also
1:53:33
on Instagram with one and a half million. It's
1:53:35
not quite as good a platform. And I think
1:53:37
without the TikTok algorithm working
1:53:40
to help him, I don't think he would have gotten to the
1:53:42
point he got to on Instagram.
1:53:44
Nevertheless, he has migrated to Instagram and
1:53:46
YouTube. Oh, yeah. And I think
1:53:49
any sensible influencer and creator has as well.
1:53:51
Creators, for sure. Look, I have
1:53:54
1.5 million followers on Google+. Congratulations.
1:53:57
How did you do that? Wow!
1:54:00
I'm well aware of what happens if you
1:54:02
are tied to one platform. That's what
1:54:04
happens. You get Google Plus. I
1:54:08
think a lot of creators
1:54:10
have already moved, yeah.
1:54:13
It definitely benefits Meta. There's
1:54:15
not even a question. It's
1:54:18
pretty blatant, because Meta could not
1:54:21
beat TikTok at their own game.
1:54:23
They tried to buy Snapchat,
1:54:25
and they imitated, just directly copied them,
1:54:28
with Reels on Instagram, and
1:54:30
they could... they couldn't
1:54:32
shut them down, so they gave up.
1:54:35
Take a breath, I've got lots more rants in
1:54:37
me. Go ahead. Oh, the...
1:54:39
there's this famous article that was
1:54:42
posted, was it Mashable's?, that
1:54:44
said Google Plus threatened Facebook. Ah,
1:54:46
man, back in the day. That aged like
1:54:49
milk. Yeah, but we've all done it.
1:54:51
Trust me, I was on Google Plus. I
1:54:53
had very high hopes for it.
1:54:56
If they had stuck with it, if they'd kept
1:54:58
Google Plus, allowed it to
1:55:00
iterate, and kept doing it,
1:55:02
they could have deeply benefited from the
1:55:04
last couple of years, and I really
1:55:06
do think they could have done something.
1:55:08
But Google just gives up
1:55:10
on things so quickly and early. Yes,
1:55:12
too quickly. Well
1:55:15
we're going to find out, you know, They
1:55:19
now have Gemini, Gemini, you know,
1:55:21
they have AI, they have
1:55:23
AI, on their side. So, you know,
1:55:26
by the way, speaking of this, I did
1:55:28
set the action button on my phone
1:55:30
briefly, for a few weeks, to
1:55:32
Gemini. Oh right. I always
1:55:35
bind, on Android, to Gemini. And
1:55:37
I'd go back to Scarlett, to Siri, I
1:55:39
mean, sorry. You did it too, for Siri,
1:55:41
set it, that is. I really feel like
1:55:44
it's Scarlett. Once they get the... so if
1:55:46
the OpenAI Scarlett deal happens, once
1:55:48
they get the cutesy voice, they're both winning.
1:55:50
Like, someone who's the voice of Siri,
1:55:53
the original, and she said, for
1:55:55
the first time she's voicing somebody else, there
1:55:57
is something, her own voice or some such.
1:55:59
I've said, Ka-ching. Gotta let her have some
1:56:02
good... 'may sue.' Ka-ching. Yeah, I wonder what Scarlett
1:56:04
thinks about this. I mean, you heard that
1:56:06
voice. It's clearly her. All right.
1:56:08
Let's take a little break. You're watching This Week
1:56:10
in Tech with Ben Parr, AI guru,
1:56:14
author at the Information
1:56:16
and co-founder of Octane AI. Ian Thompson,
1:56:18
editor in chief, US editor. I'm sorry,
1:56:20
I promoted you. US
1:56:23
editor at the register.com. Always a
1:56:25
pleasure. Always fun. And I'm here
1:56:27
with Dan Patterson, who is now fighting the
1:56:29
good fight against disinformation
1:56:31
as director of content at
1:56:34
Blackbird AI. Our
1:56:37
show today brought to you by
1:56:39
In Touch CX. CX stands for
1:56:41
customer experience. And if you have
1:56:43
a business that relies on customers,
1:56:46
it's time to enter the 21st century.
1:56:49
Revolutionize your customer
1:56:51
experience. Step boldly into the
1:56:53
future with the amazing
1:56:55
technologies and advisors at In
1:56:57
Touch CX. In Touch
1:57:00
CX empowers your brand to improve customer
1:57:02
experiences with cutting edge AI and
1:57:05
automated solutions across voice chat and
1:57:07
email support. This doesn't replace your
1:57:09
touch points with your customers. What
1:57:11
it does is enhances them. It
1:57:14
enhances them. It helps you do
1:57:16
triage. It helps you help customers better. The
1:57:18
experts at In Touch CX will guide you
1:57:20
through what many companies these days
1:57:23
are finding completely overwhelming. Where
1:57:25
to start, who to work with, what to prioritize
1:57:27
first. As you become more successful, those
1:57:29
customer touch points become more and more of
1:57:31
a problem, right? But it's
1:57:34
a problem that could be a huge boon. In
1:57:37
Touch CX will help you change how customers
1:57:40
see your brand at every stage by
1:57:42
harnessing the power of AI and automation. I
1:57:45
can divide it up into three stages. Pre-interaction,
1:57:48
the first interaction, you can
1:57:50
use AI to properly deflect 90% or
1:57:52
more of your customer interactions.
1:57:55
Solve their problems before your humans have
1:57:57
to get involved. That means
1:57:59
more satisfaction from the customers, and it
1:58:01
saves you money. During interaction,
1:58:04
you can apply AI predictive analysis
1:58:06
to increase customer satisfaction. No more
1:58:08
customer service reps desperately
1:58:10
flipping through notebooks trying to figure
1:58:12
out what
1:58:14
that customer is asking about. AI will
1:58:17
help them give them the best answer
1:58:19
every time. And after the
1:58:21
interaction, post interaction, you can achieve
1:58:23
up to 82% resolution
1:58:25
satisfaction by using generative AI
1:58:27
to analyze all customer cases
1:58:30
and improve your smart
1:58:32
response templates, develop them,
1:58:35
improved accuracy, improved efficiency,
1:58:37
improved productivity. It's
1:58:39
modernizing your entire customer
1:58:41
experience. In Touch
1:58:45
CX. Remember that name. They unlock
1:58:47
new opportunities between user experience, customer
1:58:49
experience, and employee experience to see
1:58:51
real improvement in the metrics
1:58:53
that matter most. They're
1:58:56
revolutionizing how brands connect with their
1:58:58
customers with a proprietary framework that
1:59:00
identifies key areas of automation that
1:59:03
drive productivity, engagement, quality and cost
1:59:05
benefits across the entire customer journey
1:59:07
in all industries. This
1:59:09
is the future of
1:59:12
customer experience. You need
1:59:14
a partner to help you get in
1:59:16
the 21st century. In Touch
1:59:18
CX is that partner. In
1:59:20
Touch CX's solutions enable your
1:59:23
teams to engage with customers in
1:59:25
a personalized, efficient way, driving
1:59:27
loyalty, driving growth because happy
1:59:29
customers mean more profits. Transform
1:59:32
your business by anticipating what's next.
1:59:34
Discover new ways of working and
1:59:36
how In Touch CX is industry
1:59:38
leading AI and automation solutions can
1:59:40
get you there. Learn more and
1:59:42
book a free consultation today at
1:59:45
intouchcx.com/twit. Really, go to
1:59:47
the website. I'll tell you, it was an
1:59:50
eye opener for me because In Touch
1:59:52
CX does so many things in so
1:59:54
many areas. It's really interesting to see
1:59:56
for instance things like supporting your customers
1:59:58
in the language they speak, automatically.
2:00:00
This makes such a huge
2:00:02
difference. intouchcx.com/twit. You owe it
2:00:04
to yourself and your customers
2:00:07
to check it out. intouchcx.com
2:00:11
slash twit. We thank them so much for their support of
2:00:14
this week in tech. We're
2:00:16
talking about internet shutdowns. Here's
2:00:19
a case of a government showing
2:00:22
the way to better internet. The
2:00:25
state of Utah, local
2:00:27
Utah governments formed something
2:00:30
called Utopia, the Utah
2:00:32
telecommunication open infrastructure agency.
2:00:35
For a long time experts in internet access for
2:00:37
at least 10 years have been telling me the
2:00:40
best solution is have the
2:00:42
government, just as it is with water and
2:00:44
gas and electricity, have the government build the
2:00:46
infrastructure and then lease the
2:00:48
infrastructure to private companies who can compete
2:00:50
on top of that infrastructure to give
2:00:53
customers the best prices
2:00:55
and the best service. That's what happened
2:00:57
in Utah. The
2:01:00
Inter-Local Agency Collaborative set about
2:01:02
building an open-access fiber network.
2:01:05
They built the infrastructure, allows any ISP to
2:01:08
then come and compete on the shared network.
2:01:11
Yep, I mean it's a no-brainer. The
2:01:14
classic example of this is Chattanooga, which
2:01:16
has been running their own system. You
2:01:18
can get as a consumer for about 100
2:01:21
bucks a month, you can get 2.5 gig
2:01:24
and the speeds are only going up. Listen
2:01:26
to this. Same thing in North Carolina and
2:01:28
Iowa as well, I think. In Utopia, 18
2:01:31
different ISPs compete on
2:01:34
the same, it's the same fiber network, covers
2:01:36
21 cities. In many cities, ISPs
2:01:40
are offering symmetrical gigabit for
2:01:43
$45 a month. Some are offering
2:01:46
symmetrical 10 gigabit for
2:01:48
$150 a month. This
2:01:53
is mind-boggling and it just shows that
2:01:56
a private and
2:01:58
government sharing
2:02:00
can do something amazing. This was the
2:02:02
biggest shock about moving to the US
2:02:04
was quite how bad the system is
2:02:11
It's really I mean America built the fundamental
2:02:13
building blocks of the internet and you've allowed
2:02:15
two or three companies to basically take
2:02:18
over the entire show and massively ratchet up
2:02:20
prices. I have never met a Comcast
2:02:22
customer who was happy You
2:02:25
know and the reason this is only in
2:02:27
by the way in Utah is because of
2:02:29
Comcast AT&T Verizon the big
2:02:31
ISPs who fight this tooth and nail
2:02:34
Oh, they want a monopoly without a
2:02:36
monopoly. They can't charge you all
2:02:38
that money for such crappy surface
2:02:40
Yeah, I mean in
2:02:42
mobile it's even worse. I mean when
2:02:45
I first got a US mobile phone,
2:02:47
it was like, you're charging me $10
2:02:49
a month to send a thousand text
2:02:51
messages. That's insane. There's no justification
2:02:53
for that whatsoever, other than 'we can.'
2:02:55
You know, great article by
2:02:57
Karl Bode in Techdirt: Utah locals
2:03:00
are getting cheap 10 gigabit
2:03:02
fiber thanks to local governments. I
2:03:04
recommend it. And then print it out and
2:03:06
give it to your, remember, your member of
2:03:09
Congress, or better yet your city council, because
2:03:11
the best way to do this
2:03:13
is locally, to do this in
2:03:15
a regional area, because you can sneak it
2:03:18
in behind Well you
2:03:20
can also do it on a national level. Finland and
2:03:22
South Korea both built out huge fiber networks
2:03:24
and then leased them out at cost.
2:03:26
It's obvious. Mmm,
2:03:28
it should be
2:03:31
considered, you know, like piped infrastructure. It's...
2:03:34
look, this is, like, we
2:03:36
know there are problems with the American
2:03:38
system And
2:03:40
this is kind of a showcasing of that
2:03:43
problem. We could
2:03:45
have, we could be living in a very
2:03:47
different world where most everyone had access
2:03:49
to very inexpensive internet. We
2:03:51
don't live in that system. Yeah more
2:03:55
good news: Vermont defied national
2:03:57
trends and passed the toughest-
2:04:00
yet state bill protecting online
2:04:02
data privacy. This
2:04:05
is, I'm not sure, I have mixed feelings
2:04:07
about this. Better than Illinois. Yeah, it's the
2:04:09
private action model. The bill
2:04:11
lets Vermont residents sue companies directly
2:04:14
for collecting or sharing sensitive data without their
2:04:16
consent. Normally I don't like this, but
2:04:19
I have to say, they've asked for it,
2:04:21
so I think so. Lawmakers
2:04:25
deployed a countermeasure as they drafted
2:04:27
this against
2:04:30
business pushback. They brought together lawmakers in
2:04:32
states from Maine to Oklahoma and
2:04:34
asked them for advice. Monique
2:04:38
Priestley, a Democratic state lawmaker in Vermont, said
2:04:40
that the biggest thing I've learned from this
2:04:42
bill is that big tech has a playbook.
2:04:46
And she says we've developed a grassroots playbook
2:04:48
to fight back. Now, the
2:04:52
Vermont legislature passed it,
2:04:56
but the governor has still not signed
2:04:58
it. And, of course, big tech
2:05:01
industry lobbyists are moving in with
2:05:03
bags of, little black bags of something.
2:05:05
I don't know what's in there for...
2:05:08
I do like that in this country you've
2:05:10
legalized bribery and call it campaign contributions. It's
2:05:12
just one of those marvelous things.
2:05:14
That was a recent phenomenon because
2:05:16
of the Supreme Court. Citizens United, baby.
2:05:19
Citizens United, the worst
2:05:22
decision. One of many, anyway. Give them
2:05:24
time. They'll come up with another.
2:05:26
Since Jesus was a sophomore. Republican
2:05:31
Governor Phil Scott yet to sign the bill.
2:05:33
I think put pressure, if you're a Vermontian,
2:05:35
put pressure on your governor. This
2:05:38
is a good bill that would be very useful.
2:05:43
I feel like it would really plug up the courts, though, no?
2:05:45
Yeah, I don't like the private action.
2:05:48
This is what Texas used against
2:05:51
abortion providers. This whole
2:05:53
idea of right of private action is, I think,
2:05:55
the government should be the one pursuing
2:05:57
it, but the problem is government doesn't. And the
2:06:00
threat of me suing
2:06:03
a business because my data has
2:06:05
been leaked, I guess that is a
2:06:07
bit more chilling to these businesses. But you're never
2:06:09
going to outspend the company, are you?
2:06:12
Ben, which would you rather fight?
2:06:14
A horse-sized duck, or
2:06:17
a thousand duck-
2:06:20
sized horses? That's the choice. That's the
2:06:22
solution here. I can really
2:06:24
see that harming small businesses, SMBs
2:06:26
and start ups in particular because
2:06:28
they don't have the resources to
2:06:30
fight a lawsuit. But I mean,
2:06:33
who is data gathering, harvesting data?
2:06:35
I mean, anybody... Can you rephrase?
2:06:37
If they don't do it, then they
2:06:39
don't have to worry. If they don't do it... Just
2:06:42
don't do it, right? And L.L. Bean is
2:06:45
strongly opposed, of course, they fought a
2:06:47
similar bill in Maine. I see. Well,
2:06:49
Orvis is in Vermont,
2:06:52
and Orvis says this
2:06:54
would put Vermont businesses at a
2:06:57
significant if not a crippling disadvantage.
2:07:01
Anyway. Ah, it's possible to
2:07:03
get privacy bills at least part-
2:07:06
way through the legislature. The big worry with
2:07:08
privacy bills is what they undo. We
2:07:10
have California's, got a pretty good one,
2:07:12
and Illinois's is good, pretty good ones.
2:07:15
The big danger is we get a federal privacy
2:07:17
law which basically undermines the best parts
2:07:19
of those, but just provides some baseline
2:07:22
back in. Oh yes, Maria Cantwell is
2:07:24
sponsoring something in the Senate. She's
2:07:26
of course the former executive at Real
2:07:29
Audio, that is, RealNetworks, who has
2:07:31
done everything she can to torpedo all
2:07:33
the previous privacy bills. Weirdly,
2:07:35
she's now supporting a nationwide privacy bill,
2:07:38
or at least her version of it. Well, one
2:07:40
of the points of this, one of
2:07:42
the features, is that it abrogates
2:07:44
all the state bills, says, like, no, we
2:07:46
get a federal law now, so none
2:07:48
of those state bills apply. It'll be interesting to
2:07:50
see, if that comes out, how
2:07:52
that relates to states'
2:07:54
rights. But anyway, I know, I know, states' rights,
2:07:57
states' rights activists. Silly
2:07:59
country at times Yeah, you know,
2:08:01
you're an outsider. We were
2:08:03
called- I'm from Brexit Island, so I'm hardly- Yeah,
2:08:05
I was gonna say, like, you got
2:08:08
some silliness over from Brexit Island over
2:08:10
there. I know. How's that
2:08:12
going, by the way? My wife learned some interesting swear words when
2:08:14
I stayed up for the vote. It
2:08:16
was just, it's
2:08:18
going badly. It's probably the largest
2:08:20
act of economic self-harm any country's ever
2:08:23
committed. Long lines at the Channel.
2:08:26
Well, actually, long lines of lorries trying
2:08:28
to get in and out of the country, because
2:08:30
all of a sudden, now we've got to have
2:08:32
all the lorries checked when they come back and
2:08:34
forth from the continent. So, if you've got a
2:08:36
significant export business in Europe, then you're leaving the
2:08:39
UK and moving to Europe, because otherwise no one's
2:08:41
gonna buy from you. Don't
2:08:43
ever underestimate the ability for
2:08:46
people to shoot themselves in
2:08:48
the foot. Yeah. Hear, hear.
2:08:50
Yeah, absolutely. If you're
2:08:52
wondering why all of a sudden
2:08:54
Elon Musk is tweeting the falsehood
2:08:56
that Signal is somehow
2:08:58
insecure, it
2:09:01
turns out it all started with
2:09:03
a smear campaign against NPR. Oh,
2:09:07
for goodness sake. It started on X. Thank
2:09:09
you. God bless it. By the way,
2:09:11
twitter.com is now x.com. No more Twitter. It
2:09:14
all turns into X now. Yeah, well- Which I'm happy
2:09:16
about, because I don't like it, because they stepped on
2:09:18
our name. I didn't like it. So
2:09:21
now they're X. And that's
2:09:23
an even worse search engine name.
2:09:28
Right-wing media- this is
2:09:30
from The Guardian. Right-wing media
2:09:32
personalities on X started
2:09:35
the fight by going after NPR,
2:09:38
a long-time bête
2:09:40
noire of the right. It
2:09:44
started in April when a long-time former editor
2:09:46
at NPR wrote an essay in
2:09:48
The Free Press arguing NPR had increasingly
2:09:51
chosen to cater to a very small
2:09:53
subset of left-wing America, which actually I
2:09:55
won't dispute. But this... you
2:09:58
don't really hear calls from the folks they used
2:10:00
to address. It's like you will
2:10:02
have a subset of views as well,
2:10:04
I won't even go on. So not only
2:10:07
are you targeting the left, but you
2:10:09
get a crappier audience too, in
2:10:11
this framing. So, debates over NPR's alleged
2:10:13
lefty tendencies and conservative calls to defund it
2:10:15
are not new, but this all started
2:10:18
at the start of NPR's new
2:10:20
CEO's term, ah, Katherine
2:10:22
Maher's term. And
2:10:24
it turned out she had some bad
2:10:27
tweets. Oh, bad progressive
2:10:29
tweets, and they were X posts.
2:10:31
'Xeets,' yeah, 'xeets' is what
2:10:33
we thought. And tweets. I
2:10:35
think... I doubt... even on X it's
2:10:37
'posts,' it's worse to say 'post.'
2:10:39
Is this a post? Ah,
2:10:43
so then, all right, this
2:10:45
McCarthyite right-wing
2:10:47
activist and propaganda guru Chris
2:10:49
Rufo, whoa!, took, took
2:10:51
over the campaign against
2:10:54
her. And then
2:10:56
it turned out. This
2:10:59
is a very complicated thing.
2:11:02
It turned out that she
2:11:08
was also on the board of
2:11:10
Signal, the Signal
2:11:13
Foundation. Ah. So,
2:11:15
the Guardian says in this, by the way,
2:11:17
I might want to steal this from them, it's
2:11:19
a very Register-like line, via the transitive
2:11:22
property of bad people, everything
2:11:24
she's linked to is now suspect.
2:11:27
So, as
2:11:29
Maher is still running NPR, Rufo
2:11:31
posted another blog post that
2:11:33
says Signal is compromised because she's
2:11:35
on the board. Now, Signal:
2:11:38
the key
2:11:40
reason why you use Signal and why you
2:11:42
trust Signal is down to Moxie
2:11:44
Marlinspike, the co-creator, because there is
2:11:47
no way he would put a
2:11:49
back door in, exactly, under any circumstance. You
2:11:51
agree? The code is
2:11:53
good, this has never been
2:11:55
in dispute. This is a reliable, not
2:11:58
backdoored, strong-encryption messaging
2:12:01
platform. Use Signal. However, the fact
2:12:03
that My Or was on the
2:12:05
board it's and Ad signal had
2:12:07
a grant from the Open Technology
2:12:09
Fund which is sponsored by the
2:12:11
Us government. Segment. The
2:12:13
Signal president who picked Maher for
2:12:15
the board was also a progressive who'd
2:12:17
been an equity rabble-rouser at Google.
2:12:19
Anyway, it all turned into a
2:12:22
full-on campaign against Signal,
2:12:24
which, unaccountably, Elon Musk piled
2:12:26
onto. Now, he said, well
2:12:28
there are... you know. Elon
2:12:30
believes anything the last person in the room
2:12:33
told him. I think it's really clear now.
2:12:35
It is. He doesn't... he doesn't have
2:12:37
a filter for parsing it. The Germans
2:12:40
have a word for it, they
2:12:42
have a word for everything, a word
2:12:44
for, you know, someone who is highly
2:12:46
skilled... you know, someone
2:12:48
who is wildly skilled in one or
2:12:51
two areas and thinks that translates to everything
2:12:53
else. And he's the perfect example
2:12:55
of that. I mean, there's no way Signal
2:12:57
is compromised, as I thought, there would
2:13:00
be so many red flags set off by
2:13:02
everyone. He tweeted a couple of weeks ago:
2:13:04
'There are known vulnerabilities with Signal that are
2:13:06
not being addressed.' Seems odd. To
2:13:10
which the entire informed world
2:13:12
responded, what are you talking
2:13:14
about. Jack Dorsey
2:13:17
boosted the allegations. Signal's
2:13:21
CEO responded, explaining that Signal
2:13:23
is open source, closely scrutinized by the
2:13:25
security and privacy community. Ah,
2:13:28
I mean, it's
2:13:31
just absurd. But then Telegram's
2:13:33
CEO, Pavel Durov, who
2:13:36
is Russian, now, I
2:13:38
don't know how much he likes Putin, he
2:13:41
has left Russia to work out
2:13:43
of Dubai, said, oh no,
2:13:45
of course you should use Telegram
2:13:47
instead. To which Matthew Green and
2:13:49
others said, no: whatever you do,
2:13:51
don't use Telegram, and stick
2:13:54
with Signal. Dan, is that
2:13:56
an example of what we
2:13:58
call a narrative attack? Misinformation
2:14:01
and disinformation combined to attack
2:14:03
an individual or an organization.
2:14:06
It is a dangerous
2:14:10
use of mis- and disinformation. Yeah,
2:14:12
wow. Well that's good. So this
2:14:15
is a perfect test tube example of that kind of
2:14:17
thing where the
2:14:20
initial target created
2:14:24
a narrative which was then modified
2:14:26
to smear another, unrelated one,
2:14:28
and amplified. Right, if you want to
2:14:31
step back, like I'm not really interested
2:14:33
in the components of this story but
2:14:35
if you just step back and look
2:14:38
at those components as neutral entities, right,
2:14:40
there is a narrative that has been spun
2:14:43
up, and mis- and disinformation, or
2:14:45
rather, the social web,
2:14:48
is used to amplify
2:14:50
mis- and disinformation and then target
2:14:53
an individual or an organization and
2:14:55
as we see like the outcomes
2:14:57
of that can be pretty devastating
2:15:00
even for somebody with the credibility
2:15:02
of Maher, or the
2:15:04
entire organization of Signal,
2:15:07
right. Alright, we're gonna take one more
2:15:09
last break. I do have a couple
2:15:11
more stories to wrap it up with
2:15:13
Dan Patterson. Are we doing the nuclear
2:15:16
missile facility and the Chinese crypto farm?
2:15:18
I think we are now. Okay. Just
2:15:20
put in. Smash those words together. Yeah,
2:15:22
and take a sound bite of that
2:15:24
statement. That sounds like narrative disinformation to
2:15:26
me but I'll let you tell
2:15:29
the story straight out of the register.com so
2:15:31
I believe it and
2:15:34
it came from the president of the United States of America.
2:15:36
By executive order, no less. But alright, off to
2:15:38
the break. Coming up. First I want
2:15:40
to tell you how to protect your home with
2:15:43
something really cool. Our show today brought
2:15:45
to you by the Eufy Video Smart
2:15:47
Lock E330. We put it on our
2:15:50
engineering room as a kind of proof of
2:15:53
concept. I said, Micah, you're pretty
2:15:55
handy. He said, what are you talking
2:15:57
about? I said, Micah, you should install
2:15:59
this video smart lock on our
2:16:01
engineering room door. He said, are you crazy?
2:16:03
I said, no, all it takes is a
2:16:05
Phillips screwdriver, no drilling, it fits
2:16:08
right in, and man,
2:16:10
this is a major upgrade
2:16:12
to our security. The Eufy
2:16:14
video lock, of course,
2:16:16
has a camera, easy to install, it
2:16:18
also has a fingerprint sensor: 0.3-second recognition
2:16:23
and one second unlocking. See
2:16:25
that deadbolt right there? The
2:16:27
embedded self-training chip
2:16:29
becomes more accurate every time you use it,
2:16:32
so it's faster and faster. No
2:16:34
more battery anxiety, look at the size of that
2:16:36
battery, 10,000 milliamp rechargeable
2:16:39
battery, it's easily removed, it
2:16:41
could last about four months, you'll
2:16:43
get a low-battery notification before
2:16:45
it runs out, so you don't
2:16:47
have to plug it in. Look at this, look how
2:16:49
fast that fingerprint is. You also
2:16:51
can control it remotely through the Eufy video lock
2:16:53
app, and oh, if the battery dies or for
2:16:55
whatever reason you want to use a key, look
2:16:57
at that, an actual keyhole,
2:17:01
I love that. Control
2:17:03
your front door remotely using the app, if
2:17:05
somebody like, I don't know, Burke, wants to
2:17:07
get in, he rings the doorbell, you
2:17:10
say, yeah, who is it? He says, it's Burke, I need
2:17:12
to get in the engineering room, and
2:17:14
you say, okay, come on in, you
2:17:17
can unlock it. It's got passcode unlocking,
2:17:19
remote control, 2K clear
2:17:21
sight, two-way audio, enhanced night
2:17:23
vision. This is so
2:17:25
cool, but I left out the best part. No
2:17:29
subscription, no monthly
2:17:31
fees, your video lock recordings
2:17:33
are stored locally, you
2:17:35
never have to pay for storage. Enjoy
2:17:37
a worry-free experience, 18 months
2:17:39
warranty, all backed by
2:17:41
Eufy's 24-7 professional customer service
2:17:44
team. This is the smart
2:17:46
lock you've been looking for.
2:17:48
Get yours today, search for Eufy video
2:17:51
lock on Amazon, or visit
2:17:53
eufy.com, eufy.com. You may say, who's Eufy when
2:17:55
they're at home? What do you say? What
2:17:57
do you Brits say? When they're at
2:18:00
home? Like an Englishman's home
2:18:02
is his castle? Oh no, yeah you say
2:18:04
that also but this is anchor, A-N-K-E-R,
2:18:07
this is their home security brand so you
2:18:09
know Anker. Anker is a great company. Eufy
2:18:11
video lock, eufy.com.
2:18:17
Now what's all
2:18:19
this then about
2:18:22
a Chinese nuclear facility in a
2:18:24
crypto farm next door? That's an American
2:18:26
nuclear facility. There was an executive
2:18:28
order issued this week by President
2:18:30
Biden, and it concerns the
2:18:32
Francis E. Warren Air Force Base
2:18:35
where they store the Minuteman
2:18:37
III missiles, and they
2:18:39
discovered through a tip-off from a
2:18:41
member of the public that there
2:18:43
was a Chinese run crypto mining facility
2:18:45
within a mile of the base so
2:18:48
the executive order basically says right that's a
2:18:50
national security risk you have 120 days to
2:18:52
sell the facility and so you really
2:18:56
should have told us about it beforehand so if
2:18:59
you're looking for a cheap bit barn up in
2:19:01
Wyoming way then there'd be one going fairly
2:19:03
cheaply but it was yeah
2:19:05
it was a very straight story. And it took
2:19:07
a presidential order to shut this down.
2:19:10
Well basically if you're a foreign business and you
2:19:12
know you have to register such
2:19:14
things with a particular government department they
2:19:16
didn't do that because they didn't want
2:19:19
anybody to know. Presumably
2:19:22
so yeah or it just slipped
2:19:24
their mind but whatever it's kind of
2:19:26
cost them now because the wrath
2:19:28
of Joe Biden has come down on them and
2:19:30
they've been told to sling their hook. That
2:19:33
is hysterical. It's just their plan to go to
2:19:35
the moon. The
2:19:38
reason they put it there is probably because
2:19:40
electricity is cheap right? I mean
2:19:43
that's one of the key determinants
2:19:45
but I mean how does mining
2:19:47
Bitcoin next to a US nuclear
2:19:50
facility help anybody? I can't see
2:19:53
any way. And that's not actually the worrying part.
2:19:55
It helps whoever mines the crypto
2:19:57
for themselves. I guess yeah but
2:19:59
it's not really, is it, if you think
2:20:01
of security? Look, I'm not... I think that
2:20:03
it's not like banning TikTok, go ahead and do
2:20:05
it but I
2:20:07
think, I mean, you don't
2:20:10
need to have anyone from a foreign
2:20:12
government that close to a nuclear
2:20:14
reactor that's probably a safe
2:20:16
bet. Hmm. Yes, at least it
2:20:18
means you get an early warning
2:20:21
if they actually do launch but yes
2:20:24
if that happens it's not really gonna matter that
2:20:26
much but it was just a really weird case
2:20:28
that it's done by executive order. 'Everybody short Bitcoin,
2:20:31
they've launched the nukes!' I think
2:20:33
there are diplomatic communities who see small
2:20:35
incursions like this and
2:20:37
then maybe the balloon as a
2:20:40
pattern and I mean
2:20:42
who knows, but I think there
2:20:45
could be. I can see the
2:20:47
rationale like if you're just trying to like
2:20:49
maybe poke at your adversary a little bit
2:20:52
you might do something like this yeah
2:20:54
I mean I should imagine we're doing
2:20:56
the same sort of thing in the GCHQ in
2:20:58
the UK or at the NSA there there was
2:21:00
a lovely interview with the ex-NSA bloke when
2:21:02
the Office of Personnel Management managed to leak everyone's
2:21:05
fingerprints and security clearances they
2:21:08
asked him about it and he goes yeah that's
2:21:10
a good database if we'd seen that in China we'd have
2:21:12
gone after that in
2:21:16
there mm-hmm training data yeah
2:21:19
I know aren't all the cameras and
2:21:21
all the poles all around that nuclear
2:21:24
base made in China oh well certainly
2:21:26
okay I just just
2:21:29
asking I
2:21:31
want to go look at the cameras and find out
2:21:33
I wonder if they actually aren't hmm
2:21:35
well they banned the Huawei stuff right
2:21:38
because right Huawei but
2:21:40
that's it I think that they're still mostly made
2:21:42
in China because it's just not economical to make
2:21:44
it anywhere else yep
2:21:47
I mean, the whole Huawei thing,
2:21:49
I mean, it was great for headlines,
2:21:51
don't get me wrong. You know, 'US says there's no
2:21:53
Huawei'... 'No way, Huawei.' Yeah, we did go
2:21:55
with 'Huawei to the danger
2:22:00
zone' on one story. But it's just
2:22:02
'Huawei to the danger zone.' What was
2:22:04
your deck? Forget
2:22:06
the deck, what's the nut graf? Well actually what
2:22:09
was it? A subhead. Oh god
2:22:11
I need to find out. Hang on let's see. We
2:22:14
did get into trouble with that on Twitter. I don't want to ask
2:22:16
but uh... Is 'nut graf'
2:22:18
kind of what I think it is? Obviously.
2:22:23
I don't know what you think it is. Finally,
2:22:28
on a more somber note, I'd like to
2:22:30
mention the
2:22:33
pioneers who made modern day
2:22:35
technology possible. Often
2:22:37
unsung, unnamed. You
2:22:39
probably don't know the name Robert Dennard but
2:22:43
you definitely use his invention. In
2:22:45
fact without Robert Dennard
2:22:47
there would be no modern computers.
2:22:49
He invented DRAM. He
2:22:52
was at IBM when he invented it. Prior
2:22:55
to this, you might remember,
2:22:58
RAM was often wires
2:23:01
running through magnetic cores
2:23:04
and very limited in its density.
2:23:08
With his invention of DRAM in
2:23:10
1966, he found a way to
2:23:12
store a digital bit on a
2:23:15
single transistor. He
2:23:17
called it dynamic random access memory.
2:23:20
And without DRAM, you
2:23:23
know, the computers wouldn't be as small as the
2:23:25
one in your pocket. Of
2:23:28
course there's been major progress made since
2:23:30
his invention. But John
2:23:33
Hennessy, a computer scientist and chair of
2:23:35
Alphabet, said DRAM made
2:23:37
much of modern computing
2:23:39
possible. Dennard
2:23:41
passed away this week at the age of
2:23:43
91, the father of DRAM. Let's
2:23:49
remember his name because without him we would
2:23:52
not have all the benefits of
2:23:54
modern computing. He
2:23:57
also invented something called Dennard scaling which I'm
2:23:59
not smart enough to understand. So remember
2:24:04
Moore's law which said the number of
2:24:06
transistors will double every 18
2:24:08
months or every two years. Moore's
2:24:11
law concerned the
2:24:13
density of transistors on a chip. Dennard's law
2:24:16
or Dennard's scaling concerned power
2:24:18
consumption. He described
2:24:20
the physics that
2:24:23
would allow
2:24:27
transistors to shrink and become more powerful and
2:24:29
less costly even as
2:24:31
the energy each one consumed remained almost constant.
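(A rough sketch of the arithmetic behind that claim, using the textbook constant-field formulation, not something worked through on air: shrink every dimension and the supply voltage by a factor k > 1 and raise the clock by k, and per-transistor power falls exactly as fast as per-transistor area does.)
\[
L \to \frac{L}{k},\quad V \to \frac{V}{k},\quad f \to k f
\;\Rightarrow\;
A \to \frac{A}{k^{2}},\qquad
P = C V^{2} f \;\to\; \frac{C}{k}\cdot\frac{V^{2}}{k^{2}}\cdot k f = \frac{P}{k^{2}}
\]
\[
\text{power density} = \frac{P/k^{2}}{A/k^{2}} = \frac{P}{A} \approx \text{constant}
\]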
2:24:34
So it was kind of the more technical version
2:24:36
of Moore's law. Yeah also Moore's law was kind
2:24:38
of just a suggestion. We used to have bets
2:24:42
at Intel Developer Forums, you know, there'd be
2:24:44
a sweepstake amongst journalists as to how
2:24:47
quickly they'd mentioned Moore's law and
2:24:49
it was never over three minutes. So
2:24:52
you know you had to really pick
2:24:54
your times on that one. That's pretty
2:24:56
funny. But it is remarkable how so
2:24:58
much technology comes down to just a
2:25:00
handful of people. Unbelievable. Lisa Su, the
2:25:02
CEO of AMD said everybody in semiconductors
2:25:04
studied Dennard's principles to get where we
2:25:07
are today. A name
2:25:09
I had never heard before and
2:25:11
yet absolutely seminal in the
2:25:13
invention of microcomputers. When
2:25:16
he was at IBM in 1966, he invented
2:25:18
DRAM, RIP, Robert
2:25:20
Dennard, IBM inventor. I
2:25:24
usually save those sadder
2:25:26
stories for the end of the show but actually
2:25:28
this whole show has been depressing as hell.
2:25:30
You're being the optimist I'm like well I think
2:25:41
everything's terrible. He thinks everything is awful, the director
2:25:43
of content.
2:25:47
No he doesn't. At Blackbird AI are
2:25:49
you bullish? You have a little baby
2:25:51
you've got a child. Oh I'm bullish.
2:25:53
Guys I really appreciate you letting me be
2:25:55
a skeptic. It's just like, it's inborn,
2:25:58
like I can't not be a journalist,
2:26:00
and just think about everything through a skeptical
2:26:02
lens. Like it's just my wiring.
2:26:05
However, I am so
2:26:07
stoked for it. Like look, I'm using
2:26:10
Gen AI every day. I
2:26:12
wish that I had been your
2:26:14
enthusiasm and optimism. I want like 10%
2:26:16
of it because it's fantastic. And Ian,
2:26:18
like I've been following your work for
2:26:20
years. Your cyber reporting is fantastic. Oh,
2:26:22
thank you. Anyway, so sorry. I didn't
2:26:24
mean to be so depressing in the
2:26:26
show. I just like you. No, it
2:26:28
wasn't you. I mean, it's just the
2:26:30
way it is. We got to look
2:26:33
ahead. But I think it probably is important
2:26:35
to say the future is probably
2:26:38
pretty exciting. It's exciting. It's
2:26:41
going to be interesting. We'll
2:26:43
go with, we have always been able to
2:26:45
figure out how to move forward into the
2:26:47
future. I like that. And we will continue
2:26:50
to do that. I like that. We have
2:26:52
survived. We have survived World War. We will
2:26:54
figure this out. That's good.
2:26:57
And he works at Blackbird AI. He's
2:27:01
the director of content there. It is blackbird.ai.
2:27:04
What is the URL? Yeah, blackbird.ai. And
2:27:07
I just signed up for, I guess
2:27:10
it only takes 48 hours to get access. That's
2:27:12
how it is. Yeah. Compass.
2:27:15
Oh yeah. And I'll just ping our team.
2:27:17
I mean, we're a very small team. I
2:27:19
effectively run the blog that you see on
2:27:21
the top, right? And
2:27:23
our team just writes these fantastic stories that
2:27:25
I get to, like, tomorrow morning I'm going to
2:27:28
wake up and edit some really
2:27:30
well-written and interesting research.
2:27:33
And that's a great job. No
2:27:35
kidding. And I notice you use
2:27:37
AI for all the illustrations. No,
2:27:40
for like, I don't have
2:27:42
the budget for stock photos. I
2:27:45
mean, it's really just it. I just don't have
2:27:47
the budget. Do you have like the beginning of
2:27:49
a prompt that always says, a Blackbird in, and
2:27:51
then you just provide
2:27:53
wherever that Blackbird is? No, but I
2:27:55
do have techniques. And I use a
2:27:57
blend of tools. Nice
2:28:01
stuff. That's fine, you don't have to tell us
2:28:03
your secrets. That's okay. That's
2:28:05
okay. There's no secret. It's a
2:28:07
great place to get great information. I
2:28:09
love it. Blackbird.ai.
2:28:11
It's great to have you on Dan. Thank you so much.
2:28:13
We really appreciate it. Thank you. I appreciate it. And your
2:28:16
doggy who was coming and going the whole show
2:28:18
but no one knows about me. I
2:28:20
thought... I can't control what's present. No, I
2:28:22
love him. Sorry, dogs are great. Dogs are
2:28:24
great. Dogs are great says Ben Parr. They
2:28:26
are. Ben Parr is
2:28:28
an AI guru. He has been for
2:28:30
years. Finally the world's catching up
2:28:33
to him. The author of the AI analyst,
2:28:35
co-founder at Octane AI. He writes about
2:28:37
AI at the information. Anything you want
2:28:39
to plug besides your beautiful new bride
2:28:41
and your soon-to-be large family?
2:28:45
I have some interesting newsletters. I'm going
2:28:47
to drop on my newsletter, benparr.com, and
2:28:50
maybe an announcement or two. Give
2:28:53
me some time. So you go to
2:28:55
benparr.com and you'll see the crazy announcements.
2:28:57
Been working out some stuff behind the
2:28:59
scenes. Interesting. If
2:29:01
you want to know more about
2:29:03
AI, subscribe to the AI
2:29:05
analyst by BenParr at benparr.com. Thank you, Ben.
2:29:07
It's great to see you and
2:29:10
congratulations once again. And
2:29:12
all the best to your new bride. Mr. Ian
2:29:15
Thompson, the
2:29:18
legendary Ian Thompson, US editor of the register.com.
2:29:20
I usually say that about someone till they're
2:29:23
dead. It's
2:29:25
always great to see you. Thank you for coming all the way
2:29:27
up here. I appreciate it. You have an In-N-Out in better Loomis.
2:29:29
So, you know, that's my stop-off point when I go back. Oh,
2:29:32
you used to go to the British store. I guess we were
2:29:34
kind of... Yeah, the British store shut down. So, In-N-Out it is.
2:29:39
You're becoming an American, aren't you? Yeah, well, I
2:29:41
still think that fries suck, but... In-N-Out?
2:29:44
Really? Oh, God. What do you
2:29:46
get? The double-double... Not wrong. Yeah,
2:29:48
double-double mustard burger... Sorry, mustard grill with
2:29:50
the whole grilled onion. Nice. Yeah. Yeah,
2:29:53
I guess you're right. They're kind of
2:29:55
soggy, aren't they? They're no McDonald's beef
2:29:58
tallow fries. Those were the... gold
2:30:00
standard. Yes. And
2:30:04
I've got to say, Dan, I'm with you on
2:30:06
the cynicism side of things. It never leaves you.
2:30:08
It could be worse. It's journalistic, you know. It
2:30:10
could be worse. I
2:30:12
had a former editor who, when he went to
2:30:14
an interview with somebody he really didn't like, he
2:30:17
had a t-shirt printed up with, why is this
2:30:19
bastard lying to me? And I
2:30:21
would just sit there in front of him. But
2:30:23
are you bullish about the future? Yes.
2:30:25
I mean, it's going to get worse before it
2:30:27
gets better. But I think Ben's right. The
2:30:30
long-term prognosis is good. I
2:30:32
mean, if you believe some of the scientific studies, we went
2:30:34
down to about 8,000 humans on the
2:30:36
planet at one point. Yeah, talk about choke
2:30:38
points. We can come back from a lot. We're
2:30:41
all descended from the same 8,000 people. That's
2:30:44
depressing. Ben Parr, I
2:30:46
know you're bullish. The future is bright. You
2:30:49
know what? You can also look at that as
2:30:52
inspiring. Exactly. It's all how you view
2:30:54
it. I'm inspired,
2:30:57
and we should all be inspired to make a difference to go out
2:30:59
in the world and create
2:31:01
the world we want to grow up in and we want
2:31:03
our kids to grow up in, right? We
2:31:06
do this show, and maybe that's a little bit of
2:31:08
what our mission is as well, every Sunday from 2
2:31:11
to 5 p.m. Pacific time. That's 5
2:31:13
to 8 p.m. Eastern. You
2:31:15
can watch us do it live at
2:31:17
youtube.com/twit slash live. Of course,
2:31:19
if you're a member of Club Twit, you can contribute,
2:31:22
participate in our beautiful
2:31:24
Club Twit Discord. You also get ad-free
2:31:26
versions of all the shows. We
2:31:29
hang out during the week. In fact, Micah
2:31:31
just announced a new show where you're going
2:31:34
to get to, I guess, crochet with Micah?
2:31:37
Micah? It's going to do a creative
2:31:39
hour. I'm also doing a music stream. I love that.
2:31:41
Are you going to do a music stream? Will
2:31:43
you? I am, yeah. Benito's going to do a music stream.
2:31:46
Now what am I going to do? I
2:31:48
got to do something a couple weeks ago. We
2:31:51
had a watch party. We all watched
2:31:53
the very first really sci-fi movie, Fritz
2:31:56
Lang's Amazing Metropolis. That was a lot of fun. That was a lot
2:31:58
of fun. 1927. Did we have the original
2:32:00
soundtrack? We didn't want to violate any copyright. But
2:32:05
the movie itself, and we were the soundtrack. All
2:32:08
that stuff. It was a lot of fun. Anyway,
2:32:14
that's the kind of thing we do in the
2:32:16
club. It's seven bucks a month. Really, the main
2:32:18
reason to join is, you're subsidizing the content that
2:32:21
you want. If you like what you hear here,
2:32:23
please join the club. We'd love to have you
2:32:25
some of the best people and
2:32:28
the smartest minds in the
2:32:30
world. Convene
2:32:32
every week in ClubTwit.
2:32:34
twit.tv slash ClubTwit to learn
2:32:37
more. After the fact, the show will
2:32:39
be available on our website, twit.tv. You
2:32:41
can also get it on YouTube. There's a YouTube
2:32:44
channel dedicated to this week in tech and the
2:32:46
video version of it. And you can subscribe to
2:32:48
audio or video in your favorite podcast client. That
2:32:50
way you'll get it automatically. Just a time for
2:32:52
your Monday morning commute. Here we are in our
2:32:55
20th year talking
2:32:57
about the future. And
2:32:59
it's never been more interesting, has it? It
2:33:02
really hasn't. I'm very excited about what's to
2:33:04
come. We thank you for being here.
2:33:06
We'll see you next time. And as I have said
2:33:08
for 19 plus years, another TWiT is
2:33:10
in the can! Or is it the music? You're
2:33:16
the twit. You're the twit. All right. You're
2:33:19
the twit, baby. You're the twit. All
2:33:21
right. You
2:33:23
know that thing you've always been too scared
2:33:25
to go for? It seems like you'll never
2:33:27
have the guts. Then a
2:33:29
cancer diagnosis. And doing that thing
2:33:32
becomes way less scary. I
2:33:35
would know. I've always wanted a black belt.
2:33:37
And today the only thing standing in my way
2:33:39
is a few wooden boards. It's
2:33:41
been a lot of hard work and five years
2:33:43
since I beat cancer at Ohio Health. Bring
2:33:46
it on. You keep making plans. Visit
2:33:49
ohiohealth.com/keepmakingplans.