Episode Transcript
0:00
It's time for TWiG, This Week in Google. Jeff
0:02
Jarvis is here. Paris Martineau is
0:04
here. We're going to talk about, of course, our
0:06
top story. The New York Times suing Microsoft
0:09
and OpenAI, saying they're stealing
0:11
our content. But are they really?
0:13
I'll make a case for keeping
0:15
your hands off of AI. No
0:17
regulation at all. And
0:19
Anil Dash makes a case for the internet getting
0:22
weird again in 2024. And
0:25
why you might want to pay attention to how much
0:27
you're making on those app-based
0:29
sites like Airbnb, you're
0:31
now going to be reported to the
0:34
IRS. It's all coming up next,
0:36
on This Week in Google. Podcasts
0:38
you love, from
0:40
people you trust. This
0:44
is TWiT. This
0:50
is TWiG, This Week in Google, episode 749,
0:52
recorded Wednesday, January 3rd, 2024:
0:58
A Well-Regulated Chat Team. This
1:01
episode of this week in Google brought to
1:04
you by Babbel, the language learning app that
1:06
actually works. Babbel uses quick 10-minute
1:08
lessons designed by over 150 language experts. So
1:12
you can start speaking a new language in
1:14
as little as three weeks. Babbel's built
1:16
with science-backed cognitive tools like
1:19
spaced repetition and interactive
1:21
lessons. And it's created to help
1:23
you build real world conversations. So
1:26
you can learn how to order food, ask
1:28
for directions and speak to
1:30
merchants. It's easy. Babbel
1:32
also teaches, and I love this, the context,
1:34
traditions, and culture of the language that
1:36
you're learning. I'm learning Spanish right now.
1:39
I studied French earlier. It's
1:41
just fantastic from self-study app lessons
1:43
to podcasts, even live classes,
1:46
which is great for conversational speaking.
1:48
Babbel has a wide range of learning
1:50
experiences for learners of all ages based
1:52
on level and your time commitment. That's
1:55
one of the things I love about Babbel. It's just a few
1:57
minutes a day. In fact, 15 hours
1:59
with Babbel is equal to one
2:01
university semester. Plus all of Babbel's
2:03
14 language courses are backed
2:06
by their 20-day money-back guarantee.
2:09
Now I'm a lifetime member of Babbel,
2:11
because I love it so much, but
2:13
we've got a special limited-time deal.
2:15
If you're new to Babbel, right now
2:17
get 55% off your Babbel subscription. This
2:19
is only for our listeners. You have to
2:21
go to babbel.com/twig. 55%
2:25
off at
2:27
babbel.com/twig. That's babbel.com/
2:30
twig. Rules and restrictions apply. It's
2:33
time for Twig. This week in Google,
2:35
the show where we wear funny glasses
2:37
and talk about Google. Hello Paris Martineau.
2:39
You see I wore my specs. Yours
2:42
are handmade. I was gonna say, I don't
2:44
know what you're talking about. These glasses are
2:47
decidedly serious. Nothing funny about them. Nothing funny
2:49
about those. Jeff Jarvis is
2:51
not wearing funny glasses either. They didn't send me
2:53
the memo. I'm so hurt. Yeah, he's just wearing
2:55
your John Lennon, your usual
2:58
John Lennon glasses. Jeff is the Leonard Tow Professor
3:00
for Journalistic Innovation at the Craig Newmark Graduate
3:04
School of Journalism at the
3:06
City University of New York, Emeritus. We have to keep
3:08
playing that through August because
3:10
we pay for the singers. So
3:14
we'll keep playing that.
3:16
And then Paris is of course at
3:18
the Information where she does great
3:21
work. Still working on that big story that you can't tell us about? I
3:24
am. Okay, good. Watch
3:26
the internet, guys. When it breaks, you'll let
3:28
us know, right? I will. You'll
3:30
be the first to tell us. So I
3:33
don't normally, I usually wear contacts. I wear glasses,
3:35
but I usually wear contacts on the show. But
3:38
I thought, you know, I got some fun glasses and
3:40
I thought, I can, Paris and
3:42
I now can talk about our glasses. These,
3:45
these glasses which really make me
3:47
look like Elliott Gould in Ocean's Eleven or
3:50
maybe Larry "Bud" Melman, I'm not sure which.
3:53
You do seem like you're gonna do a heist. For
3:56
those of you listening: big black
4:00
glasses, kind of like Michael Caine might wear.
4:03
But these are from a company called Vinylize.
4:05
They're made out of old records. Oh,
4:09
that makes sense. I was about to say
4:11
they have some sort of reflection that you're
4:13
seeing. That's the groove. It's the life that
4:15
is like a record. It's the grooves of
4:17
the record as I move
4:20
my face around. Yeah, there they are. You
4:22
need to put a needle on it to see what
4:24
the little bits are. Oh, you do. Okay, so each
4:27
model has a different name. This is called Fleetwood. I'm figuring
4:29
it's Fleetwood Mac. It could be the
4:31
Rumors album. It could be Tusk. I
4:33
don't know. It's true. They
4:36
also have an AC/DC version, which
4:38
I figure is made from old AC/DC records. So
4:41
isn't that a good idea? So I'm doing
4:43
my part to keep the earth green. That's
4:46
super cool. By wearing an old Fleetwood Mac
4:48
album on my face. No
4:51
one would ever have had a use for that
4:53
Fleetwood Mac vinyl. There's nowhere else it could go.
4:55
We have two, not one, but two
4:58
vinyl record stores in Petaluma alone.
5:01
Oh, you would? Oh,
5:03
it's the hippest thing. You know, I have a... I
5:06
don't know. All right. Do
5:08
either of you guys have record players? No. Do
5:11
you? I think you would. I do, yeah.
5:14
You're a Brooklyn hipster. I mean, that goes without
5:16
saying. I got my wife one for Christmas two
5:18
years ago. She has all her albums downstairs, yeah. I
5:21
don't think she's played it once. She has.
5:23
I just don't want to ever have to move albums
5:26
again. That is such
5:28
a pain. Yeah,
5:30
I don't have that many. I
5:32
have more of a concern of moving all
5:34
of my books. Books are bad too. It's
5:37
just unnecessary. I stuffed everything, my albums, my
5:39
books and everything into an iPhone. When
5:42
I move, I put it in my pocket and I'm gone. No
5:45
boxes. You put your 27 phones
5:47
in your pocket and there you go. Well,
5:49
there's a disadvantage. Yes, yeah, having so many
5:51
phones is not a good thing. How
5:55
many phone numbers do you have? Well,
5:57
you know, it's funny. We did a little
6:00
cleanout during the holiday season. We went
6:02
down the list of our phone
6:05
numbers. I called AT&T and said,
6:07
yes, I want to close out my
6:09
account. Believe me, it's
6:12
not simple. We
6:14
went through all of
6:16
them, and it turns out I have no
6:18
AT&T SIMs or numbers or anything,
6:20
but I have accounts and I'm
6:22
paying money, senselessly, so there must be
6:24
something
6:26
wrong
6:28
somewhere. So
6:31
the next day, honestly,
6:34
they make it so hard. You have
6:36
nowhere else to go, and
6:38
at the store the chances are you can't do it
6:40
there; they suggest going
6:43
on the phone and calling, but, well, that's
6:45
torture. The store manager was good. He said,
6:47
I will walk you through it and I will expedite
6:49
it. So he gives us a number, says
6:51
okay, immediately press one. I said okay. "This
6:54
is Susan," the system starts talking, press one,
6:56
something like that.
6:58
It went right through to tech support, and
7:01
it was, well, so much credit to the manager
7:03
at the store. It was very, very
7:05
nice of them. He could have left us to it; we
7:07
explained that something went wrong with the
7:09
signal. Apparently they must have lost
7:11
a tower in Petaluma. It used
7:14
to be a very strong 5G signal, and now
7:16
not. So, what
7:18
do you mean? How many phone numbers did you have
7:20
on your Verizon plan? Let
7:23
me put it this way: one hundred.
7:25
I went to
7:27
get one more and they said
7:29
no, you can't have too many. I
7:35
don't know what the limit is, but there is actually a limit.
7:39
My gosh, you must have a huge family.
7:41
Do you have some in your car?
7:43
It's like they
7:47
lost out to a stretch limo full of
7:49
SIMs. My family.
7:51
What are you doing there? All
7:55
right. Did you have a nice
7:57
holiday? You were telling me there was a New
7:59
Year's Eve party. Actually, New Year's Eve
8:01
Eve. New Year's Eve Eve, very important. Yeah, a wonderful
8:03
Eve. It's the day before the day before
8:05
the new year, and it's a great holiday
8:08
to celebrate with your friends. Nice. Although
8:10
I guess not for the next couple
8:12
of years, because it's been on a weekend
8:14
recently. Yeah, it's been on the weekend, but
8:17
starting next year, New Year's Eve Eve will fall
8:19
on a workday, which is not as
8:21
fun. It really is fun. And
8:24
Jeff, did you do anything for the holidays? So I
8:27
was angry when the neighbors'
8:30
fireworks woke me up at night on New
8:33
Year's Eve. I'm an
8:35
old guy. Yeah. Did you go out
8:38
and shake your fist at them? No, nothing.
8:40
Thanks, too much energy. I just
8:43
thought it; I just thought that thought. So
8:45
the big story we talked a
8:50
little bit about it on Windows Weekly this week the
8:53
New York Times is
8:57
suing Microsoft and OpenAI,
9:00
saying, you bad people,
9:02
your
9:05
AI systems are engaged in widespread
9:07
copying, copying
9:09
our stories, and that's copyright
9:12
infringement. Tellingly,
9:14
the folks at the Times have
9:16
asked for a jury; they've demanded, in
9:18
the terms of the pleading, a
9:20
jury trial. Is
9:23
it their right alone to do that? Yeah,
9:26
I think so. I mean, the defendant,
9:28
I think, would have more of a right to say
9:30
no. I think that, well, I
9:32
guess the judge will rule, a judge
9:34
will rule, but I think
9:36
that it is typically the plaintiff
9:39
who demands it. I don't know, really.
9:41
No, because remember, Donald
9:44
Trump was complaining about not having a jury trial in the
9:46
New York case, and the
9:48
judge pointed out that his attorneys had
9:50
specifically not asked for a jury. So
9:56
yeah, I think either side probably can ask for it. Anyway, I
9:58
don't know, I'm not a lawyer. We should get Matthew Yellison.
10:02
Mike Masnick wrote about this and said,
10:04
the New York Times really should think
10:06
about what they're
10:10
asking for, because this could bite them in the behind.
10:16
The New York Times, he points
10:18
out, very commonly will
10:21
notice a story in another journal
10:24
and then launch their
10:26
own investigation from
10:30
it, sometimes without credit. And he says,
10:33
if you can sue OpenAI for taking
10:35
your material and training an
10:37
AI on it, what's to stop those
10:39
other journals from suing you? The
10:42
New York Times lawsuit, he says, against OpenAI would
10:44
open up the New York Times to all sorts
10:46
of lawsuits should it win. But
10:49
to me, honestly, that isn't
10:51
even the issue. I
10:54
really think that from my point
10:56
of view, and by the way, I got
10:58
in a little fight with Paul Thurrott, who said that,
11:01
representing creators, the
11:05
New York Times is fighting the good fight. Creators
11:07
ought to sue AI for using
11:10
their content. So
11:13
I went on the CBC to talk about
11:16
this. And
11:18
my argument, and Paris made fun of me saying,
11:20
well, you can't stay away from a camera for
11:22
more than a week, can you? Okay, it was
11:24
very funny because it was a Wednesday right when
11:26
we record this show that you were like, I'm
11:28
going to be live. I've got to have my
11:30
airtime. What am I
11:32
doing? So my
11:35
contention is that the machine has a
11:37
right to learn and
11:39
that it doesn't record
11:42
it. And we'll go
11:44
into, I think, the details about some of the Times
11:46
allegations about specific segments in a minute. But
11:49
if the machine can't learn, if the
11:52
machine can't do what we do, then
11:55
it's a problem. And also, on line
11:57
68 of the rundown, it is the heritage of our
11:59
industry. We go back to,
12:01
as I've mentioned on the show before, the newspapers
12:03
had scissors editors, whose
12:05
job it was to cut up newspapers and
12:07
reprint the stories. Copyright did not cover
12:09
newspapers at all at first, or
12:11
magazines; even when it did, it didn't
12:13
cover news per se, it only covered
12:15
special works of authors.
12:17
And so it is a bit,
12:20
I might say, it is, it is,
12:22
you're right, Leo, a tough issue,
12:24
but I think it is disingenuous of
12:26
the Times to act as
12:28
if they don't do this every damn day.
12:30
I love this paragraph from Mike's article: "In the
12:32
end, though, the crux of this lawsuit is
12:34
the same as all the others."
12:37
And he's talking about Sarah Silverman's
12:39
lawsuit, George R.R. Martin's lawsuit against
12:41
OpenAI. "It's a
12:43
false belief that reading something, whether
12:46
by a human or a machine, somehow
12:48
implicates copyright. This is false. If
12:51
the courts or the legislature decide
12:53
otherwise, it would upset pretty much
12:55
all of the history of copyright
12:58
and create some significant real
13:00
world problems." Although
13:03
Paul said that in the New York
13:05
Times complaint, which runs sixty-four
13:07
pages, they provided
13:09
ample examples of ChatGPT quoting them
13:12
verbatim. I saw, I thought they
13:14
did get it to, but Mike
13:16
Masnick pointed at another one I put
13:19
up online, at line
13:21
sixty-six, which has the demonstrations.
13:23
One of the examples is a
13:26
quote from a review
13:28
of Guy Fieri's New York restaurant.
13:30
And whoever
13:33
wrote this, I saw
13:35
it pointed out
13:37
on Twitter that the quote was
13:39
all over the internet, was quoted again
13:41
and again and again and again.
13:44
So it wasn't hard to get it from
13:46
one
13:48
place. And then, what,
13:50
what Masnick points out
13:53
is that
13:55
the way
13:57
the queries were done led
14:00
OpenAI down the path so that basically the only response
14:03
could have been the words that were next to the
14:05
words that were next to the words, in
14:07
that case. Yeah, that's
14:09
what I thought. Without reading the entire pleading, that
14:11
was my belief, that
14:14
the New York Times had carefully crafted
14:16
the prompts. Oh yes. And my issue
14:18
and what I told Paul is, no
14:20
one is, you're not asserting, I
14:22
hope, and the Times isn't,
14:24
that somebody would read or query ChatGPT
14:26
in lieu of reading the New York Times,
14:28
that they would say, oh, I don't have
14:30
to buy the Times because I can get
14:32
everything I want from ChatGPT, because
14:35
they're just gonna quote the Times. Well, they are
14:37
arguing that, insofar as it's the
14:40
same argument that was used against Google.
14:42
Exactly, they made it against Google. Well, and Mike
14:44
points out, I don't know if it
14:46
was in the complaint per se,
14:48
but they whine about, well, Wirecutter.
14:50
Mike's example is, I go to Wirecutter and
14:52
I ask what's the best bike, I get
14:54
the answer, that's all I need, the
14:56
brand and the model number, from search.
14:58
Yes that's true I don't go to the New York Times
15:00
to read the whole thing and yes
15:02
the New York Times doesn't then get the affiliate money
15:05
but God didn't give it to them. Sorry
15:07
guys you're you're involved in an
15:10
information ecosystem that you the New
15:12
York Times take advantage of every
15:14
single day where information once known
15:16
is free to use. The hot
15:19
news doctrine: the Associated Press years ago tried
15:21
to have a hot news doctrine, I remember this,
15:24
which was to say, there were two things
15:26
that went on. One was that they said
15:28
that there was a period of time, it was
15:30
never established, but about 12 hours or
15:32
day where if you broke the story when
15:34
Paris has her big story out it's
15:37
hers for a day nobody can even
15:39
mention it because it's Paris's right and
15:42
the courts didn't go for that; well, they did at first, then they didn't
15:44
go for it. And then when radio
15:46
came along newspapers tried to do the same
15:48
thing and they told radio that they weren't
15:50
allowed to discuss any story until 12 hours
15:52
after it happened. That
15:55
also went by the way and
15:57
but the same kind of sacred language
15:59
that's in the New York Times complaint,
16:02
which Mike makes fun of a great
16:04
life is, we're so special. We're the
16:06
New York Times. We're saving democracy. We
16:08
put so much money into it. Yeah,
16:10
yeah, yeah. So, I want to hear
16:12
what you have to say about it, Paris. Yes. I
16:14
have other thoughts, though. I
16:16
mean, I'm curious about your thoughts. I
16:20
think that the arguments that you guys have outlined are
16:22
correct. I think that
16:24
at first, you know, you see the argument
16:27
the Times is presenting as, oh, people shouldn't
16:30
be able to scrape our news and use
16:32
it to train these systems. But I think,
16:34
as we've just discussed, the
16:37
actual issue at hand is a lot more complicated
16:39
than that. And also, the idea that
16:41
anyone is going to be using
16:43
ChatGPT to get a line-by-line read
16:46
of a Guy Fieri review, and
16:48
that is going to undermine democracy.
16:50
A 12-year-old Guy Fieri review, which
16:52
was talked about like crazy because
16:54
it was so unfair to Fieri.
16:56
People screamed about it and quoted
16:59
it at length all over. Sorry,
17:01
guys. Go ahead. No. I mean, I think
17:03
that, I think, what
17:05
do you guys think is going to happen in the courts
17:08
with this? Here's my issue. First of all, I
17:10
think it's telling that the New York Times demanded a
17:13
jury trial, because they know a judge will,
17:15
as judges already have, throw this out on
17:18
the face of it. It's fair use.
17:20
Judges have already ruled again and again
17:22
that AI has the right to scrape
17:24
the Internet and generate its large-language models
17:26
from that content that's publicly available. And
17:29
it's not a violation of copyright.
17:31
I think that that's going to always be the
17:33
case with a judge. Jury might be different. And
17:36
I think it's one of the reasons this pleading is so
17:38
emotional: they're playing to a prospective
17:41
jury saying, oh, you know, you don't
17:43
want the New York Times to fail, do you? But
17:48
here's my big thing now
17:50
that I am an AI
17:52
accelerationist. I
18:01
think Paul Therat said, and I think
18:03
I agree, that really this is a negotiation ploy,
18:05
as it usually is. Yes. At
18:07
the Times just wants to sue them so
18:09
that OpenAI will give them some money. And
18:12
unfortunately, both Google and OpenAI have already
18:14
done this with other journals,
18:18
and as a result have created this
18:20
slippery slope. I don't
18:22
think they should ever do this, because
18:25
I think really this
18:28
is the worst kind of regulatory capture.
18:31
Yes. Yes. Because the future
18:33
of AI really, in my opinion, and we have a
18:35
Yann LeCun article we can talk about in Wired, an
18:37
interview in Wired, but I agree with
18:40
him where he says really the real future, the future
18:42
you want with AI is open. That's
18:44
what OpenAI was supposed to be. You want open
18:47
source AI. You want everybody to be able to
18:49
develop with it, use it, create stuff with it.
18:51
You don't want the big tech companies to be
18:53
the gatekeepers. You don't want Microsoft, Google,
18:56
Amazon, Apple, anybody to
18:58
own AI. You don't want them to be the
19:00
gatekeepers. You want AI to be everywhere. And
19:03
I truly believe that. I think that's
19:05
really important. So no matter what
19:07
the upshot of this case is, if the
19:10
New York Times wins, if
19:12
OpenAI ends up
19:15
paying them, it means that it
19:17
makes it harder for the open source
19:20
AI to build and succeed.
19:23
So it's going to have a bad outcome.
19:25
The only possible good outcome is
19:27
if the jury, and I don't
19:29
think so, but if the jury says no, this is fair
19:32
use, go away New York Times,
19:35
then that would open it up for everybody. I
19:37
mean, do you think though that companies like OpenAI
19:39
should be able to use anyone's
19:41
materials or anything? If it's publicly on the
19:44
internet, they can learn from it. They can
19:46
train from it. Not quote it, not
19:48
steal it. And this is the
19:50
thing. If it's behind a paywall and
19:53
it's not metered, should they be
19:55
able to scrape that?
19:57
Actually, that's an interesting one. If
20:00
OpenAI pays for one subscription to the New
20:02
York Times and then scrapes all the content.
20:04
Right, exactly. That's an interesting question. I
20:07
had a conversation, if I could add to this, I
20:09
had a conversation this week with Rich Skrenta, who started
20:11
Topix years ago, and I didn't know it. Rich is
20:13
now the Executive Director of the Common Crawl
20:17
Foundation. Which
20:19
Mike mentions in this, by the way. Mike mentions in
20:21
this. Yeah. And what Mike
20:23
also says in his story is that the New York Times, and I
20:25
just find this, there's a
20:27
larger issue for the ethics and morals of journalism
20:29
and society here. The New York Times, Mike
20:32
says, has demanded that Common Crawl take
20:34
off all of the content that it
20:37
got from the New York Times. Now
20:40
OpenAI and company have done
20:42
a, you know, a robots.txt for this
20:44
case, which is fine. But
20:47
the New York Times is talking retroactively,
20:49
about the past, asking to take
20:51
it off. What is Common Crawl? Is
20:54
it like archive.org? Is it like the Internet
20:56
Archive? It's different to this extent. All it
20:58
does is it scrapes huge amounts
21:00
of data from
21:03
the open web, open, open, open web.
21:06
Mainly it was intended for academics, so
21:09
that there was a source of study, which
21:11
is invaluable. And it's open
21:13
source and it's free and it's gigantic. He told
21:16
me how long it would take to download it.
21:18
It's like, you know, forever. Huge,
21:21
huge thing. Whereas Internet
21:23
Archive saves the actual pages and the images and
21:25
all that. This is text only. It
21:28
was for the purposes of academic
21:30
research. It's a foundation.
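For context on the opt-out being discussed: the mechanism OpenAI and Common Crawl both honor is a plain robots.txt file. A minimal sketch (GPTBot and CCBot are the two crawlers' published user-agent tokens; the rules are advisory, enforced only by crawler politeness):

```text
# robots.txt, served at the site root
User-agent: GPTBot    # OpenAI's web crawler
Disallow: /

User-agent: CCBot     # Common Crawl's crawler
Disallow: /

User-agent: *         # all other crawlers
Allow: /
```

Note that this governs only future crawls; pages already captured in an earlier Common Crawl snapshot stay there unless the foundation removes them on request, which is the retroactive removal the New York Times was asking for.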
21:33
Well, so along come
21:36
LLMs and, hello, Glorioski.
21:38
Look at that. What a great resource,
21:40
right? So they're using it and they're all happy to use it. But
21:44
now you have all these organizations like Reddit, which has
21:47
very publicly said, well, we don't want anybody to scrape us
21:49
because we think there's a pot of gold
21:51
there. The problem in the case
21:53
of Reddit is, well, you didn't make the stuff,
21:55
Reddit. Your users did. Right. And
21:58
so... So everybody thinks
22:00
that there's some pot of gold here and they all
22:02
own something. And I
22:04
think that we've got to have a
22:06
conversation in journalism and media about our
22:09
obligation to the public information ecosystem. If
22:11
everything ends up behind the paywall, which is their
22:14
right to do, that's fine; if everything remains unscrapable
22:16
or you're doomed to a court case, then the
22:18
only thing that's left out there for people and
22:20
machines to learn from without going bankrupt and
22:23
for open source efforts and for small
22:25
journalists and so on is crap and
22:28
propaganda and lies. I
22:30
mean, how are those original
22:32
creators of the news content
22:35
supposed to survive in a media
22:37
ecosystem where their work has
22:40
to be essentially fair use
22:42
and publicly available for free? No,
22:45
they can put it behind the paywall. It
22:47
just has to also be accessible by OpenAI.
22:50
No, no, I would say
22:52
the good compromise here is if it is behind a
22:54
paywall, OpenAI can't get it. It
22:56
should only be publicly accessible material. For
22:59
instance, your articles are behind the paywall at
23:01
the Information. They shouldn't be able to scrape
23:03
the Information. However, I
23:05
do think that OpenAI is different from
23:07
Books3. Books3 did not
23:09
obviously buy every book in Books3. Many
23:12
of those are pirated. And that's what Sarah
23:14
Silverman et al's objection
23:16
was. Well, you're reading our
23:18
books by pirating them. But
23:21
in this case, you really can subscribe to the New
23:23
York Times. The New York Times can put in terms
23:25
of use. That
23:27
would be a solution. Yes.
23:30
Yes. The bigger question is
23:32
that the business
23:34
model of news is badly
23:36
broken. And
23:38
I think that when we see ourselves in a
23:40
position where we think all of our value is
23:42
resident in this thing we call content, we're screwed.
23:45
Because content is a commodity. Because machines can now
23:47
make it. Unless your
23:49
content is special. It's not where the value is. At the
23:52
Information, what's special is that you do reporting
23:54
others don't do. And
23:56
people who find value in that, in
23:58
their jobs in many cases, and me
24:00
for my podcast, choose
24:03
to subscribe for a lot of money. I was glad
24:05
there was a sale on it. I got it cheaper.
24:07
Thank you very much, Jessica.
24:10
But, so, and if
24:12
you're really special and if you're really good, you can do that.
24:15
So much of journalism, look at the Guy Fieri
24:17
story. So much of journalism is about us copying
24:19
each other, to get our own pages and
24:21
our own page views and our own likes and our own
24:23
clicks and our own pennies. And
24:25
the amount of unique original
24:28
journalism that occurs, like the
24:30
information, is a lot rarer than
24:32
we admit. So we're trying
24:34
to support a whole infrastructure of copying
24:36
each other to get our own SEO
24:39
and clicks. So I think we've got
24:41
to have an honest audit of
24:43
where the real value is in journalism and where it isn't.
24:46
I almost feel like a lot of the New York Times, I
24:48
love the New York Times. I pay for the New York Times.
24:50
I criticize the New York Times because it's the best and I
24:52
want it to be better. But a
24:54
lot of the New York Times has nothing
24:57
to do with news, has nothing to
24:59
do with improving the democracy. It's fluff
25:01
to attract attention, which is an old
25:04
business model that's gone away. Sorry. I
25:08
also think there's a parallel here as a society,
25:12
and we've talked about this before, just as
25:15
in the early days of the internet, we didn't
25:17
tax it. We were reluctant to put
25:19
a lot of government regulation on it because we didn't know
25:21
where it was going. We wanted it to grow, and that
25:24
worked out. Admittedly, there have
25:26
been problems, but we know what those are. But
25:29
I think it was the right choice. I think we needed to do the
25:31
same thing with AI. I
25:34
really believe that AI has a lot of potential,
25:36
but we don't know exactly what it's
25:38
going to be. And I think
25:41
that it would be problematic if it's
25:44
old media trying to hold back new media.
25:47
And I think that would be problematic. And
25:50
Paris, do you
25:52
see any difference in learning
25:56
from materials just to
25:58
teach the machine to speak, as
26:00
a skill, versus
26:05
giving back answers
26:08
from current content that
26:11
are reliable because they're
26:13
current content? Can you see
26:15
that kind of distinction? I think I
26:19
see a difference there but I think
26:21
that ultimately the hesitation
26:23
I have around this generally comes from
26:25
the fact that we're ultimately talking about
26:29
for-profit companies that are building
26:31
models to then sell use
26:34
of that model to people.
26:37
I think that there should be a robust
26:40
public debate and I guess
26:43
legal debate around
26:45
whether or not those
26:48
companies can benefit
26:50
from other people's work. I
26:56
love that idea, because, and that's
26:58
by the way why OpenAI originally
27:00
was a non-profit, it wasn't tenable,
27:02
but if you wrote an exemption
27:04
for open source AI
27:06
work and said but if you're
27:08
going to charge for it, well then you've
27:11
got to get a license. I'm okay with that. In
27:13
fact, I think that's the best possibility because
27:16
I don't want Microsoft and Google and the
27:18
others to dominate AI. I think
27:20
that would be a horrible mistake. So
27:23
let me ask a related question. The
27:26
information report, I thought it was the information. Oh
27:29
I guess it wasn't but somebody reported, I think I
27:31
thought it was the information, that OpenAI's annual
27:34
revenue reportedly... Well it's the information. You're talking
27:36
about another story that actually aggregated our work
27:38
which is the very phenomenon we're talking about.
27:40
You're like putting it the wrong way. Right.
27:43
Here it is. OpenAI's annualized revenue
27:46
tops $1.6 billion and then the rest of the headline is as customers shrug
27:57
off CEO drama. That's
28:00
a great story. Maria Heeter, Amir Efrati,
28:03
and Stephanie Palazzolo wrote that. So your
28:05
colleagues. So that went up at what
28:07
time, Leo? That went
28:09
up 7 a.m. on New Year's Eve
28:11
Eve. Right. So
28:13
on New Year's Day at, let's see,
28:16
11 a.m., Silicon
28:21
Angle put up the story, which is the one
28:23
I accidentally put in the rundown, saying
28:26
the information reported. Yeah. Well,
28:28
we do that too, though. So I got to
28:30
point out. I mean, that's the whole show, basically.
28:32
The whole show. I, because of
28:34
that, I'm cognizant of that. I always, you notice,
28:36
I just read their names and
28:39
I quoted it. I
28:41
try to give credit back. But honestly,
28:43
that's another reason I have some concern
28:45
about this lawsuit, because we don't do
28:47
any reporting. I don't do any reporting. You
28:50
guys do. Certainly, you do, Paris. Paris does
28:52
it. I don't know what you do, Jeff. No, I don't.
28:55
But I don't do any. I
28:57
do zero reporting. Just out there.
28:59
Zero reporting. The entire extent of
29:01
my work is to
29:03
read all this stuff, digest it, and
29:05
editorialize on it, talk about it. That's
29:08
what we do. Like AI. So
29:10
I think I'm in the same boat as
29:12
OpenAI. You are. But here's my question,
29:15
though. Who's
29:17
paying OpenAI $1.6 billion? I
29:20
am. I give them $20 a month. But
29:23
geez, at $20 a month, how many people are doing this
29:25
a lot? Yeah, we're not
29:27
given honest answers. How
29:30
are they getting that much revenue? I
29:33
don't get it. I mean, they're getting it from, I guess,
29:36
I would assume, obviously, this is not
29:39
reporting based, but I would assume that
29:41
actual users like Leo are probably
29:43
a small percentage of that. Yes.
29:46
Corporate clients are a large percentage. What are they
29:49
getting? Well, I can tell you one thing that
29:51
I get that is worth it. Actually, because by
29:53
the way, Microsoft has put GPT-4
29:55
out for free on
29:58
iPhone and Android. You can get the
30:00
Bing chat app and do
30:02
everything. But what I like and
30:04
what I use and what is worth $20 a
30:06
month for me is the expert systems I've created
30:08
as GPTs. This
30:11
is by the way, I think in the long run, this is
30:13
what's going to happen is, you know, you
30:15
had this app store revolution with Apple's iPhone. I think
30:17
you're about to have an app
30:19
revolution with AI where people, because these
30:21
all have open APIs, you can license
30:23
it. I think that's where the majority
30:26
of money, by the way, comes from
30:28
is licensing the API. And
30:30
so I have a couple
30:33
of really useful expert
30:35
systems. And this is more just
30:37
for me. In that case, to learn this content
30:39
that you have uploaded, OpenAI didn't provide it,
30:41
you provided it. So this is how this works. And I'll
30:43
show you in the configuration. It's both. So
30:46
without an LLM, there's
30:50
nothing here to do, right? Right.
30:53
I'll go into my Common Lisper, which is the
30:55
name of this little expert
30:57
system I created. The
31:00
LLM, I could upload as I have all this
31:02
stuff, but it wouldn't be able to put it
31:04
together into an expert. So
31:09
there's both the corpus of
31:11
data, but it's on top
31:13
of a large model that
31:16
is generated by reading the New York
31:18
Times and The Information and whatever else it can
31:20
to create an LLM. But
31:22
what I've told the LLM is don't hallucinate.
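What Leo is describing here, a corpus of uploaded books sitting on top of a general LLM that is instructed to answer only from them, is essentially retrieval-augmented generation. A minimal Python sketch of the retrieval half follows; the chunking and keyword scoring are illustrative assumptions, not OpenAI's actual file-search mechanism:

```python
# Minimal sketch of the retrieval step behind a corpus-grounded GPT.
# The chunk sizes and overlap scoring are illustrative; OpenAI's real
# file-search implementation is not public.

def chunk(text, size=40):
    """Split a document into overlapping word-window chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size // 2)]

def retrieve(corpus, question, k=2):
    """Rank all chunks by naive keyword overlap with the question."""
    q_terms = set(question.lower().split())
    chunks = [c for doc in corpus for c in chunk(doc)]
    scored = sorted(chunks,
                    key=lambda c: len(q_terms & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def grounded_prompt(corpus, question):
    """Build the 'answer only from these excerpts' instruction Leo describes."""
    excerpts = "\n---\n".join(retrieve(corpus, question))
    return ("Answer ONLY from the excerpts below. If the answer is not "
            f"in them, say you don't know.\n\n{excerpts}\n\nQ: {question}")
```

An LLM call would then take `grounded_prompt(...)` as its input; the instruction to refuse out-of-corpus answers is what reduces, though does not eliminate, hallucination.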
31:25
Only give me answers that come from it. I now have 10,
31:28
11 books in here. I'm
31:31
doing this with Common Lisp, and one of the advantages
31:34
of doing it with Common Lisp is it's so old
31:36
that there's a lot of public domain PDFs
31:38
of classic books by Paul Graham
31:40
and Peter Norvig. The
31:43
entire Common Lisp spec
31:45
is available as a PDF online, a
31:48
number of great books. So I just put all those things. I
31:50
already have them as PDFs. I just put them all into this
31:54
GPT. And it's really been useful. I mean,
31:56
it's incredible. I can ask it
31:58
a question that I would normally go... and search the web for,
32:00
I would search, you know, Common Lisp
32:03
for loop and then have
32:06
to sift through the Google results. Now
32:09
I can actually ask it to explain
32:11
it to me. Here, I'll show you. Explain
32:16
loops. Now it
32:18
knows it's Common Lisp, so I
32:20
don't even have to say in Common Lisp, it knows that. And
32:23
so it's gonna give me some sample code,
32:25
it's gonna describe this. This is like having
32:28
a teacher and it's all expert information
32:30
that comes directly from these books. There's
32:33
no who's there. Did you buy all those books? Some
32:35
of them are public domain; Peter Norvig's book is public
32:37
domain, some I bought, yeah. But all of
32:39
them are in the public domain at
32:42
this point. I mean,
32:44
look at... Is there anything that stops you from offering
32:46
this to others? Well,
32:49
yes, interestingly, lately,
32:52
this is a
32:54
change in ChatGPT. Public
32:56
actions require valid privacy policy
32:58
URLs. So I have to
33:00
create a... What does that mean? I don't know, but I have
33:02
to create a... I stopped; I made it only-me now because
33:06
it's so it's only for me. But in theory,
33:08
you could, if you wanted to have a little
33:10
business, create expert system
33:13
GPTs. Let's say, let's
33:15
say I am a BMW repair
33:17
shop and I have every manual
33:19
for every BMW ever made. If I can get
33:22
that to an expert system, that's
33:24
a huge value. Now here, this is the question, this
33:26
is why the New York Times is worried. Can
33:29
I then sell that expert system? That's
33:31
what I'm asking, right? Yeah. I
33:34
don't know. What do you think the answer is,
33:36
Leo? Well, I'll get...
33:39
I don't know. The answer is, I have
33:41
an Emacs expert, which I
33:43
don't distribute because
33:45
one of the corpus,
33:48
one of the things in the corpus here is
33:51
a book by a great guy named
33:53
Mickey Petersen that I bought. I have a
33:55
PDF of. I was able to
33:57
upload it, Mastering Emacs, but it's not
33:59
public... domain, he sells it. So
34:02
I would feel funny about selling
34:07
this expert system. I
34:09
don't have a problem with using it for
34:11
myself. By the way, this is why open
34:13
source is also so important, is I should
34:15
be able to do this all on my
34:17
own system with my own LLM, with
34:20
my own database, my
34:23
own corpus of knowledge, not
34:25
sell it, but just for myself. That's
34:27
where you're going to get some really interesting things happening. Yeah,
34:30
somebody's saying in our Discord, take all
34:32
of Steve Gibson's notes and put them
34:34
into an expert system. In
34:36
fact, I started doing that. I put all
34:38
the transcripts of Steve's shows, all
34:41
his show notes into an expert system. Now
34:43
you have an expert Steve Gibson. Now here's
34:45
a really interesting question. That's
34:48
all content Steve has made as part of
34:50
our podcast. He's published
34:52
it publicly. It's
34:56
like everything Steve ever said or knows, does
34:59
Steve have rights to that LLM,
35:02
to that GPT? What's the difference between that?
35:05
It's that. All right, so let's just play with it for
35:07
a second. So I can go
35:09
to the web and I can read all that
35:11
stuff laboriously myself, which would take a very long
35:13
time. I could hire someone,
35:16
a librarian, to go do that for me.
35:18
Or I could hire a machine. Well, to
35:20
some degree, a search engine also. A search
35:22
engine, right? Exactly what I'm saying. Yeah.
35:25
Search for the word. Search through all of the
35:27
transcripts, which people do all the time. For a
35:29
term, I want to know more about ransomware and
35:31
get all of those references. Now,
35:35
that's not so different from a
35:37
ChatGPT, say, summarizing it
35:39
in a paragraph. That's
35:41
the only real difference is same content. I
35:45
don't know. I mean, I don't know. This is
35:47
a very interesting. This is the first time your analogy
35:49
between this and the accelerationist potential
35:52
of the internet has ever really clicked for me.
35:54
So I'll give you that. Say more. Say
35:56
more. I mean, I think just in the sense that what
35:58
you're describing is a tool
36:01
that can make
36:05
existing processes happen
36:08
at a much faster rate. More
36:11
than that. The technology already to, you know,
36:13
like go through all of Steve's
36:17
podcast episodes, read the
36:19
notes of what he said and get up
36:21
to date on certain topics already exists, it
36:23
would be laborious. But putting
36:25
all of that into a GPT, having
36:27
it summarize it for me, isn't
36:30
inherently all
36:32
that different. And here's the
36:34
real point, it's a society, a societal
36:37
good. Yes, it
36:39
is. If we can take the knowledge
36:41
of the world and make
36:43
it available, this is Google's mission
36:46
statement, right? To put
36:48
the information of the world at your fingertips.
36:50
If we could do it even better in
36:52
such a way that it's useful so that
36:54
you can query a GPT
36:57
about security and get
36:59
good responses from it, yes,
37:02
I understand the New York Times or Steve
37:04
Gibson or The Information or Paris Martineau might
37:06
say, but wait, that's
37:09
all, that's my knowledge. But
37:11
from a societal point of
37:13
view, that's a huge societal
37:15
benefit. And this is the
37:18
point of copyright in a nutshell, is to
37:20
give both sides a benefit, to give the
37:22
creator the right to
37:24
make money off of it, patent
37:27
same thing, but ultimately to make it into
37:29
the public domain at some point, right now
37:31
it's life plus 70, which is nuts, but
37:33
ultimately make it into the public domain so
37:35
that we can all benefit from it. And
37:38
I think that that's the thing I really want to
37:40
focus on. Even if you're
37:42
a creator, as part of that. Yes, you're a
37:44
creator. Once I've learned it, that is my
37:46
knowledge. You can't take it away from me. You can't
37:48
play the Men in Black pen against
37:50
my head. But here's another thing,
37:53
when I met Steven Johnson originally about Notebook
37:55
LM, what we speculated about is
37:57
why shouldn't the New York Times be offering that?
38:00
as a service on their own. They
38:02
should make a New York Times GPT.
38:04
Exactly. A publisher
38:07
of a book should make, this is something
38:09
that I said in What Would Google
38:11
Do: we have to update the book. And then in The Gutenberg
38:13
Parenthesis I recanted that and said no, let the book be
38:15
alone. But I'll go back again. A
38:17
book publisher should be able to put up a
38:19
book in such a way that you can query
38:22
it. There you go. And you can ask the
38:24
questions. But they don't do that because they
38:26
say, no, this is ours; you have to buy it in the format we
38:28
give it to you. Use it that way. There's
38:32
going to be no way to protect the book. I'm going to be a little bit of
38:34
a devil's advocate here. This is a good question
38:36
in the chat from Andrew saying the thing is if
38:39
we are out there building all of
38:41
our Steve GPTs, someday who
38:44
will be Steve? Oh, somebody always has to
38:46
be. If we're just letting AI do
38:49
X and Y and Z and everything for us, are
38:51
we not making any experts in these key fields
38:53
anymore? No, of course not. There's always going to be a
38:56
Paris Martineau and a Steve Gibson. You're
38:58
going to still have value because you'll still be able
39:00
to get value out of what you do because you're
39:02
creating the original content. People will still
39:04
subscribe to the information. Well,
39:07
Paris is worried. Well,
39:09
I understand and maybe I should be too.
39:11
I understand it's a little bit of, I
39:14
think they're less vulnerable. Frankly, the class of
39:16
commentary as a class of
39:20
journalism which is very important and valid, it's what we're
39:22
doing right here, is more of
39:24
who should be concerned. I mean,
39:27
who is going to be listening to this?
39:31
You have to create. Yes, you know what? It
39:34
would be trivial frankly and it
39:36
will happen in a couple of years to do this show
39:39
without any of
39:41
us. It won't
39:44
have our jokes about your weird
39:46
glasses. Probably will. That would be the easiest part.
39:51
How do you keep your glasses from sliding down
39:53
your nose? I don't. They slide down my face
39:55
all the time and it annoys me. The first
39:57
show Paris, I thought that was an affectation. I
40:00
thought this was the look. I
40:02
thought that was the parents' knowledge. It's just
40:05
I really can't. I need to get them tightened. It's a
40:07
problem. No, I got them tightened. I think it's just the
40:09
nature. Partly is I don't have a lump in my nose.
40:11
You see, I have a perfect profile.
40:14
Oh, poor you. There's nothing
40:16
to hold it up, you see? You see
40:18
what I'm saying? You should get one of
40:20
those lumps installed. Yeah, lump in the phone.
40:22
But when I wear glasses like you, Jeff,
40:24
with the pads, right, to press in on
40:26
the nose, that hurts. But
40:29
it keeps it from sliding down. Well, also,
40:31
these are incredibly, incredibly light. These are titanium.
40:34
Oh, the minor light. My
40:37
other ones are light, too. And
40:39
these because I don't know, because they are heavy, I guess.
40:42
So in The Gutenberg Parenthesis, for instance, which I've got
40:44
to plug once an hour. I
40:46
think that's twice now. I quote
40:49
the fight between Kevin Kelly and John Updike
40:51
in 2006. So
40:54
Kevin Kelly said just what you just said, Leo. Oh, you're
40:56
going to make it new ways. You're going to do new
40:58
things. It's going to be open.
41:00
It's going to be wonderful. And every book is linkable.
41:02
And everything is out of that, right? And Updike just
41:04
went crazy at Book Expo. And he
41:06
called that a pretty grisly scenario. He
41:09
said books traditionally have edges. The
41:11
electronic anthill. Where are the edges?
41:15
So booksellers, he said, defend your lonely
41:17
forts. And he said, I don't want to perform
41:20
for my lunch. I want to write for my lunch. And
41:23
I get that. I get it, too. That's
41:25
why we don't ask people
41:28
from the previous generation what
41:31
they think of the next big thing. Because
41:33
they never really understand what it's going to
41:35
be. And they have their natural aversion to
41:38
it, because it's a change, which is
41:40
going to get in the way of it actually happening. And
41:43
you know what? There may be
41:45
disruptions. There almost certainly will be.
41:47
The internet caused huge disruptions in
41:49
businesses, including the New York Times
41:52
business. And they're desperately
41:54
scrambling to find a way
41:56
to survive. Many newspapers have gone out of
41:58
business. But I wouldn't want to stop
42:01
progress to save newspapers any more than
42:03
I wanted to stop TV to save
42:06
radio. It's just or
42:10
stop cars to save buggy whips. Not
42:13
all progress is good. Exactly
42:15
the same way. Yes. They're going against open AI
42:17
now. Yes. Not all progress is
42:19
good but it is in the nature
42:21
of life and we have to move
42:23
forward. We cannot freeze this in aspic
42:26
and so to ask John Updike who I
42:28
deeply respect, by the way, guess what?
42:31
People still read his novels. They
42:34
love his novels. His novels
42:36
have not gone away. Books still have
42:38
edges. Absolutely. So the
42:40
problem wasn't that. You
42:43
know, I understand his concern, but he was from
42:45
another generation. He didn't know
42:47
any more than I did at the time
42:49
what was going to happen. And so to
42:52
ask him is to freeze progress in the
42:54
form that somebody from an earlier generation thinks
42:56
it should be frozen, and that's a clear
42:58
mistake. And that's my exact point with
43:01
OpenAI, or AI in general: we don't
43:03
know. And so it would be, we see
43:05
it with music, it would be wrong for us at
43:08
any point in this to say, oh no,
43:10
no, no, you can't let that happen, because
43:12
we just don't know.
43:14
OpenAI is, many have said this, the
43:16
Napster of content.
43:19
And Napster opened the door, and yes, all the
43:22
lawsuits came. What happened in the end is
43:24
the album got decommissioned,
43:28
de-orbited, right? Is that the word they used?
43:31
But there are far more
43:33
creators able to be heard right now,
43:35
far more than was ever the case. And
43:37
I want to point out Peter Gabriel
43:40
just released an album that
43:42
is an album that is in sequence and the
43:44
way he did it is he released
43:46
a cut at the full moon of every month in the
43:49
year 2023 in album order
43:52
and on December 1st released the full
43:54
album he preserved the album he found
43:56
a way to do it I mean
44:00
And by the way, I was really pissed because I
44:02
was having a hard time buying the album. I wanted to give him my
44:04
$17 on iTunes. And
44:07
Apple doesn't want you to buy albums. They want you
44:09
to subscribe forever to Apple Music. So they make it
44:11
very difficult to find a way to buy them. But
44:14
Jason Snell, who is a friend, the host
44:17
of MacBreak Weekly, and a fan of Peter
44:19
Gabriel, said, oh no, Peter sells all his
44:21
stuff on Bandcamp. Yeah,
44:25
Bandcamp is where they're at. You could go to
44:27
Bandcamp and you could still buy an album. So
44:31
here's a guy, he's 70-something. He's
44:34
an earlier generation musician who
44:36
believes in the form of albums and
44:39
has preserved it. I
44:41
think John Updike didn't go away, Peter Gabriel didn't
44:43
go away, but they should not be allowed to
44:45
say no. There are a lot more John Updikes and Peter
44:47
Gabriels now to be had who couldn't make it
44:50
through the gauntlet before. That's true, too.
44:52
Okay, I agree, but I will
44:55
also say the current state of
44:57
the music industry, which is streaming
44:59
based, is not very hospitable for
45:02
new artists who are trying to make a living. Getting
45:05
pennies from Spotify is not going to produce the
45:07
next case. Well, we agree to see what they're
45:10
doing in the podcast. You're right. I
45:12
agree. We should go to Bandcamp and sell your album.
45:14
Bandcamp is still messing up. And so in
45:16
the book that's coming out next year, I'm not
45:18
sure we're plugging yet, but
45:21
I argued that the mistake that
45:23
I made was that I gave too much
45:25
to companies. I gave too much to Twitter,
45:27
I gave too much to Facebook, and
45:30
then this is the lesson Leo taught me and it beat
45:32
into my head when it came to Mastodon, is
45:34
we've got to honor the open structures
45:36
of the Internet. And
45:40
it's not always going to be
45:42
the easiest way or the obvious
45:44
way or maybe even the most
45:46
profitable way. But
45:48
there is also to be considered the interests
45:51
of society. And
45:53
I think it is in society's interest that we
45:55
allow this stuff to develop.
45:57
We keep an eye on it. There
46:00
will be bad things coming out of AI. We
46:02
know that. But there could
46:04
be so many good things. It's very much like
46:07
the internet. There's so many good things possible
46:09
as well. So I
46:13
really want a smash cut of all of your
46:15
different takes on things from the last year. I think
46:19
that would be pretty good. In the words, we could get AI
46:21
on that. In the words of Henry
46:23
David Thoreau, a foolish
46:26
consistency is the
46:28
hobgoblin of small minds. Am
46:31
I wrong? Actually was
46:37
Ralph Waldo Emerson. But other than that,
46:42
you hallucinated.
46:45
Yeah, humans hallucinate a lot more than
46:47
machines. Hobgoblin. Yeah, that's a
46:49
good word, isn't it? It's really good.
46:51
I think there's more hobgoblin references. That
46:54
was fun. That was
46:56
really fun. It's a really
46:58
exciting area. And it's one of the reasons, as
47:00
I've said many times, we
47:03
have a job to do. And so this
47:05
is why I'm not worried about AI. Because
47:07
you and I have
47:09
a job to do, Paris. And
47:12
Jeff, I guess, have a job.
47:14
Until August? No, no. We have
47:16
a job to do, just
47:19
to understand this and explain it. And
47:22
also advocate. I mean, that's why I'm not
47:25
a pure journalist. We advocate. We
47:28
do our best. And I love
47:30
I've done this my whole career since
47:32
the early 90s. Because I love technology.
47:34
But I also feel very
47:36
strongly that we have that there are certain
47:38
paths we should avoid. So Beth, I've always
47:40
been a big believer in open versus proprietary.
47:42
And I've always flogged that. Now, that's my
47:44
point of view. Maybe people
47:47
don't agree with me. There's plenty of them. But
47:50
that's part of our job. And I think we'll have
47:52
a job to do. And AI cannot advocate. And
47:55
AI doesn't really; it can only give
47:57
you information, and often
47:59
not well. But
48:01
then, me too. Emerson
48:04
Thoreau. It's all, you know, the
48:07
same. Poor Michael Cohen. Michael Cohen. It
48:09
all is the same. Yeah, that's right.
48:11
Michael Cohen used Bard. He did it
48:14
too. To write his pleading. Well, you
48:16
know what? He's a doofus. He's a
48:18
doofus. But I also
48:20
could see how in Bard, Bard now appears
48:22
right next to the Google search bar, and
48:25
it takes the place of the old Wikipedia
48:27
information up there. And so you ask
48:29
a question in the search, and there is information right
48:31
there in Bard. And if you don't, if you're not
48:33
paying attention to the news, which he doesn't accept about
48:35
himself, I could see
48:37
the confusion. He's not his lawyer,
48:39
is another case. Is that a
48:41
good one? Yeah. The context is
48:43
Michael Cohen says that he unwittingly
48:45
passed along to his attorney bogus
48:47
artificial intelligence-generated legal case citations that
48:50
he got online before they were
48:52
submitted to a judge, which
48:54
is pretty funny, honestly. It is. It
48:57
is. Okay.
49:00
It's crazy, though, that they ended
49:02
up cited as part of written
49:05
arguments by his attorney. His attorney
49:07
didn't do his job, and then the attorney, there was all, it
49:09
was right before anybody else was listening. Well, I mean, if you're
49:11
the attorney for Michael Cohen, you've got a lot of things on
49:13
your plate. You're a busy guy.
49:16
Honestly, I mean, I don't know enough about
49:18
the job that these people do, but I
49:21
bet you there is some time pressure, and
49:24
you're working really hard to pull these
49:26
things together, and maybe you
49:28
should have checked. Well,
49:30
Michael doesn't have access to Westlaw anymore because he's not
49:32
a lawyer anymore. Oh, that's right. He isn't a lawyer,
49:34
is he? He's been disbarred. All right. Right.
49:37
Well, we'll talk about Bard, actually, when we come back. This is
49:39
this week in Google. The show where
49:41
we cover some... And Bard is a Google
49:43
product. Bard is a Google product. Yeah, here
49:45
we are. We're getting into Google before the
49:47
ad break. That's huge. How about that? We're
49:49
going to talk about Bard. It's getting close.
49:52
So if you want to use it in your trial,
49:55
stay tuned. But first, a
49:58
word from our sponsor. I
50:00
love these guys. Kolide. And why
50:03
do I love Kolide? Because they
50:05
love end users. When
50:08
you go through airport security, you know that
50:10
line where the TSA agents check your ID?
50:13
And the other line where a machine scans your bag?
50:17
Try to picture this, because this is a great
50:19
analogy for what happens in enterprise security. It's not
50:22
passengers and luggage. It's end
50:24
users, and it's their devices. Right
50:26
now, these days, most companies are pretty good at the
50:28
first part, where they check user identity.
50:31
Perhaps you're using Okta to do that.
50:34
That's great. It's strong authentication. The problem
50:36
is, there's the assumption you authenticate the
50:38
user, oh, their bags are fine. But
50:41
that's not necessarily the case. User
50:43
devices can roll right through that
50:45
authentication. They're not getting inspected. In
50:48
fact, right now, 47% of
50:51
companies allow unmanaged, untrusted
50:53
devices right in to access
50:55
their data. That
50:58
means an employee can log in from a laptop
51:00
that has, let's say, its firewall turned off, hasn't
51:02
been updated in six months. Or
51:05
worse, a laptop might belong to a
51:07
bad actor using employee credentials. They
51:10
may be able to get in, but the laptop got in
51:12
with them, and you are in trouble. Kolide
51:14
solves this device trust problem. It ensures
51:16
that no device can log in
51:19
to your Okta-protected apps unless the
51:21
device passes your security checks. Plus,
51:24
and this is great, too. You
51:26
can use Kolide on devices without MDM. Your
51:29
Linux fleet, contractor devices, you can't force
51:32
them to put MDM on their devices.
51:35
Every BYOD phone and laptop in the company,
51:37
you know the executive's going to bring that
51:39
phone in, but it still works.
51:43
kolide.com/twig. Watch
51:45
a demo, see how it works. k-o-l-i-d-e.com/
51:50
twig. You need this. Kolide. We
51:53
thank Kolide for supporting this
51:56
week in Google. You
51:58
want to see this? Bard is
52:00
inching towards launch. I
52:02
think that's an interesting choice of words from
52:05
9to5Google: inching. Apparently,
52:08
where it's imminent, right? This
52:11
is from APK Insight, where they take applications,
52:15
files, the APKs, and decompile them and
52:17
analyze them. And inside, they found a
52:20
little pop-up that's
52:23
an Assistant with Bard tab right there in
52:25
the Google app. This
52:28
is designed to fully replace the existing
52:30
Google Assistant on
52:32
supported Android devices. So for right now,
52:34
you've got your Pixel. And
52:37
you go, hey, you know who? It's
52:41
going to be Bard's going to be in there in some
52:43
way. The
52:45
company is also planning to prominently place the
52:47
Assistant with Bard experience on the Discover page
52:49
of the Google Search app. Here
52:52
is from 9to5Google
52:55
and APK Insight
52:58
the video. What
53:01
appears to be ways to quickly switch between
53:03
performing a normal Google search and
53:05
getting help from AI? See, Michael Cohen's going to love
53:07
this. Here's a
53:09
here. That's going to be huge for him. This is going to be huge.
53:13
So there's the Assistant. Hi,
53:15
I'm Assistant with Bard. Oh,
53:17
like that? Or here's like you're in,
53:19
what would you like to do today? Would you like to see
53:21
me type? Or you talk? Hello there. Can
53:23
you tell me what's Santa? I'm
53:26
sorry, what's on the image, please? I
53:28
don't know where Santa came in. It said Santa.
53:30
It did. It did say Santa. Yeah. And
53:32
then it's looking at the screen. And
53:35
this is Bard. APK
53:37
Insight says for now, it's not clear whether this
53:40
is intended to be a permanent fixture of the Discover
53:42
tab. Or maybe it's just
53:44
a one-time only little ad for checking
53:47
out Assistant with Bard. Then
53:51
there's a guy named Dylan Roussel who managed
53:53
to enable the actual pop-up window. I love
53:55
people. It's
53:58
hysterical. And
54:00
they dig in and show us
54:02
how that's going to look to use
54:05
Bard to submit questions. So we're getting
54:07
closer and closer. I have to say
54:09
I pay 20 bucks, as I mentioned,
54:11
a month to have ChatGPT on
54:14
my phone and on a
54:16
new iPhone I'm able to use their action
54:19
button to launch it and talk to
54:21
ChatGPT and have it respond back. I showed you that
54:23
a few weeks ago. Now
54:26
we're getting, I think
54:28
it's really interesting that Google looks
54:30
like it might want to add Bard to the
54:32
regular Google Assistant. So
54:38
do you also pay for Anthropic to do the
54:40
work of the guy? I think we must, right? We
54:44
use Anthropic's AI Claude, which,
54:46
by the way, my informant deep
54:49
within the industry said,
54:51
oh those guys at Anthropic they're jerks. He
54:54
said, oh you're a long walk guy? My
54:57
long walk guy. He said, you're a long
54:59
walk with Sam Altman? Yeah, yeah. He
55:01
said, those are the guys who are
55:04
so concerned about safety that they left
55:08
Google because they were so worried about safety.
55:10
He said, so they've created, Anthropic's created this
55:13
safe AI. He
55:15
was very dismissive of it. But I have
55:17
to say it's been very useful for us. But
55:19
here's my question. Yeah. I'm
55:21
sure we pay for it. Anthony, do we pay
55:23
for that? Yeah, we must. What do you use
55:26
it for? We
55:29
use it to do show notes. We
55:32
also, I believe we use it to chop up
55:35
the show into little TikTok-like
55:38
vertical videos. That's a different
55:40
tool. What tool is that? That's Podium. I'm
55:44
sorry, Podium is what we use for the transcript and stuff like
55:44
that. These
55:47
are all tools you pay for. All AI tools we
55:49
pay for. But
55:52
here's my question. We use a lot of them. We make blog posts
55:54
out of Claude as well. Claude makes blog
55:56
posts. Yeah, but you write them. Some point.
55:59
Well, we edit them as humans. But
56:01
yeah, it'll take a show, it'll listen
56:03
to the whole show, it'll read
56:06
the transcript of the show and then like give us
56:08
like a summary. Let Claude do it. I don't want
56:10
Claude to do it. No, I'll tell you, this is
56:12
historically a huge problem for podcasts because
56:15
Google doesn't search the audio. They always said
56:18
they were gonna and they never really did.
56:20
So there's no discoverability in a podcast. So
56:22
we've always said what we really should do
56:24
is a blog post for every episode. But
56:26
I'm not going to do that.
56:29
None of our hosts wanted to do that. That's
56:31
just, you know, that's adding insult to injury. So
56:34
the AIs now take this show, they
56:36
transcribe it and they generate an article
56:38
from it and then the
56:40
producers look at it and clean it up.
56:43
Where is that? Is that in the blog on twit.tv?
56:45
Yep.
56:48
So there's actually a lot of stuff in here
56:50
that we got to make this more discoverable. These
56:53
are transcripts. There's the newsletter there.
56:56
Micah Sargent's Holiday Gift Ideas. Micah probably
56:58
wrote that. The
57:01
transcripts are created by Podium. Oh,
57:04
here you go. Look at this. Remember
57:07
when we had Steven Johnson on the show?
57:10
So we fed that show to an
57:12
AI. To
57:14
his AI? Different AI. No,
57:16
different AI to who do we use for
57:18
this? Claude? To Claude? Yeah,
57:21
Claude. Claude wrote this and
57:24
then Anthony Nielsen, our AI guru, but
57:26
it's sometimes the producer, sometimes Anthony, went
57:29
through it, made sure it was legit. How much do
57:31
we pay for this feature? What
57:34
am I looking at? This
57:36
is Claude reading the blog post. We
57:40
don't pay for that. Claude just does it
57:42
for free? That's my question. That's free.
57:44
Pretty soon, everything you've mentioned is going to be available
57:46
for free because they're all going to be competing and
57:48
you're not going to be paying for it. The internet was
57:50
like that too. Look, it's even got a quote. It's
57:53
free for like three to five years. Yeah, and then we'll all have to pay for it. $25
57:55
a month or probably like $100 a month. Yeah,
57:59
and we'll be back. pissed. This is
58:01
actually pretty good. This is
58:03
a summary of what Steven
58:05
said. It's got a quote from Steven. But
58:08
Anthony, how much cleanup did you have to do
58:10
on this, Anthony? It's
58:17
almost there. He just touches it up a little bit
58:20
and makes sure there's no lies. It works out
58:22
by names and by quotes. I want to
58:24
take everything from Jeff. That's his instruction. And then
58:28
we were going to use it for show notes, right, with
58:30
the emoji bullet points. Have we started doing that yet or
58:32
no? Not emoji bullet
58:34
points. I like the emoji bullet points. The metric
58:36
said it would break things. Oh, it would break
58:38
things. Oh, well, that's not good. So
58:41
we don't use it, but it does make the show notes
58:43
here. It depends on the producer.
58:46
Okay. And then what
58:48
do we use? What's the tool we... Do we pay for
58:50
that feature? Okay. Oh,
58:53
no, we just run that through Claude. Claude
58:56
is free and free, free, free?
58:58
No, no. No. Well, you have
59:00
limited usage and the like... Okay.
59:02
So if we use it more than a certain amount,
59:04
we would have to pay for it.
59:07
Oh, so you're paying for it. Yeah, I'm
59:09
paying $20 for ChatGPT. So, so
59:11
20 bucks. And then, Podium
59:13
is much more expensive, I'm sure. That's
59:15
our... does transcriptions. And then,
59:18
and then what is the... What
59:23
is the thing we use to make the
59:26
TikToks? TikToks. Opus.pro.
59:29
And that we pay for. Yeah. And
59:31
that's like it's, you know,
59:33
for an episode, it's like, you know, five bucks and you
59:35
get... It'll like generate, you
59:38
know, 30 clips that you could kind of
59:40
pull from and edit and... It does a
59:42
pretty... It also adds the text,
59:45
right? And it'll automatically like,
59:47
you know, deal with the... There's one of
59:49
our shows. There's you, Paris. Hey,
59:52
there's me. That's
59:54
cool. Never noticed that. So,
59:57
yeah, that is cool. There's Scott
59:59
Galloway, I think. Yeah. Ugh. Ugh.
1:00:03
There's Young and Profiting. This
1:00:08
is great because honestly we don't have the manpower.
1:00:11
You know my son Henry, he looks
1:00:13
at our TikTok and says, Dad, would you just
1:00:16
fire whoever's doing that? I said,
1:00:18
we can't, it's an AI. He said, Dad, that's terrible. I said,
1:00:20
it's not terrible. He said, look, let me do it. I said,
1:00:22
no, because I can't pay you. What you would need to get
1:00:24
paid to do this. But his contention, and of
1:00:26
course he's been very successful on TikTok,
1:00:28
with millions
1:00:31
of followers, but his
1:00:33
contention is it's not really creating
1:00:35
truly viral stuff.
1:00:38
I will say, I mean. It's better than not
1:00:40
being on the platform at all, but it's probably not going to
1:00:42
get people to do it. I mean, you know, it's not going
1:00:44
to be the same. But I think it's going to be a
1:00:46
very, very good way
1:00:52
to get people to check
1:00:54
out Twit from TikTok. But nothing
1:00:56
we would do would do that, right?
1:00:58
That's the problem. I think it would.
1:01:01
I think it's a really popular
1:01:03
format on TikTok to kind of
1:01:05
have cuts from podcasts
1:01:09
by kind of going back and forth. Yeah. And
1:01:12
it does drive subscribers. I mean,
1:01:15
that's how Dropout,
1:01:17
like a streaming service I've talked about
1:01:19
before, has gotten the vast majority
1:01:21
of their subscribers over the last couple of
1:01:23
years is from TikTok. But that would also
1:01:25
require hiring someone. All Henry had to
1:01:28
do is say, well, how many views are on that? How
1:01:32
many followers do you have? How
1:01:35
many likes do you have total? Yeah.
1:01:40
So, I mean, we like, we
1:01:42
recently just started. You don't have to defend it.
1:01:45
You don't have to defend it. He's
1:01:47
a snob because, but I'll tell you what the
1:01:49
real thing is, and he doesn't admit this, even
1:01:52
though in his heart of hearts, the
1:01:54
real audience is not humans. The
1:01:57
real audience is the TikTok algorithm. Because the
1:01:59
way you actually get views on TikTok
1:02:01
is by TikTok surfacing it in the For
1:02:04
You tab. So until you
1:02:06
get the TikTok algorithm, not a
1:02:08
human, but the algorithm to notice
1:02:10
you, it doesn't matter. Yeah, but
1:02:12
that's the term human, like... We
1:02:14
don't know how the TikTok algorithm works. I
1:02:16
don't think it's just from human attention. I
1:02:20
think TikTok's looking for certain kinds of
1:02:22
content. Okay, I'm sending
1:02:24
you guys an example on Discord of a
1:02:26
podcast that I can't speak to the quality
1:02:28
of any of these videos. I just know
1:02:31
that I've seen this in my feed
1:02:33
beforehand, and it is kind of talking
1:02:35
heads, people recording a podcast, but they
1:02:37
do frame the shots in a way, and their
1:02:40
videos out? 3.4 million followers. The
1:02:44
basement yard. I
1:02:47
have no idea what it's about, but I... So,
1:02:51
oh, this looks kind of like what we do
1:02:53
where it's a vertical, you've got, you
1:02:55
know, the captions. The problem
1:02:57
is, we have to
1:03:00
say something interesting. I
1:03:02
think we say a lot of interesting
1:03:04
things. I think you just got to think of
1:03:06
what's TikTok-able. You have to
1:03:08
really play to the algorithm. Maybe that
1:03:10
is ultimately in the people, and there's a lot of
1:03:12
competition. Which does take us to line 93. Wow.
1:03:18
Okay, now this is a point, just
1:03:20
a little behind the scenes here for everybody, where
1:03:23
I have to decide before looking if it
1:03:27
is worth completely stopping, even just
1:03:29
talking about, and go to
1:03:32
line 93, which could be
1:03:35
Jeff Jarvis and a
1:03:37
TikTok with a monkey, or we
1:03:39
just don't... No
1:03:41
one knows. Oh wow, this is... It
1:03:43
is a TikTok. Yeah, it is a TikTok,
1:03:45
and it is relevant. You see? It
1:03:47
is relevant. It's not seen from first.
1:03:49
But was it worth... He's scolding me, people. He's scolding
1:03:52
me, people. I don't care. Changing the
1:03:54
entire thrust. I don't care. Here
1:03:56
we go. We're doing it. When guys start the podcast for
1:03:58
no reason, it is really... We're the podcast
1:04:00
boys. Put the mics down! Put the mics
1:04:02
down! Don't even think about it! This episode is over
1:04:04
fellas. We're the podcast boys. Anything
1:04:07
you say can and will be used against you. Subscribe to the Patreon. Alright,
1:04:09
I'm out! It's become a real problem here in Los Angeles. It's
1:04:13
pretty good. Put the mics down! Put
1:04:15
the mics down! God damn it, they fled! I'm
1:04:18
sorry. I'm sorry. I'm sorry.
1:04:21
I'm sorry.
1:04:23
I can see that it has 634,000 likes.
1:04:47
Yeah. It's ridiculous. So this
1:04:49
is why I don't, honestly
1:04:51
don't feel like we should try to compete
1:04:53
on TikTok. Now I
1:04:56
thought we did a good job with
1:04:58
your little gale.com thing on TikTok. Did
1:05:01
you see that? What we did with that?
1:05:03
I didn't, but I do follow you on
1:05:06
TikTok. Alright, here is, this is from an
1:05:08
episode of Twig. Oh my God, I'm having
1:05:10
to watch. My pick of the week this
1:05:12
week is one of my colleagues accidentally
1:05:15
mistyped Gmail the other day.
1:05:18
Okay, so Henry would say this was. It took too
1:05:20
long. I've flipped. I've flipped up. I'm
1:05:22
gone. I've flipped and also
1:05:25
we got to get rid of the little,
1:05:27
on the bottom third of the TikTok is
1:05:29
a subscribe to Twit sort of banner thing.
1:05:31
You can't click. You got to get
1:05:33
rid of that. And you can't click it. You can't
1:05:35
click. There's no, it's not clickable. It's just a thing.
1:05:37
This is great for AI. And I think we're doing a really good job.
1:05:39
By the way, as an example, I bet
1:05:42
you anything that the TikTok algorithm sees
1:05:44
that and goes, eh, we're not going to put that on the for
1:05:46
you page. Probably. It's
1:05:48
stuff like that. It has nothing to do with how many
1:05:50
people saw it or whether they liked it or clicked it
1:05:53
or said anything. Henry says it doesn't matter how many people
1:05:55
like you. That is not part
1:05:57
of the algorithm. And I think the algorithm does things
1:05:59
like, yeah. I don't like the way that's laid out.
1:06:07
It should be illegal to make me listen to myself
1:06:09
on this show. I know. It's
1:06:12
not painful. I hate it. I know
1:06:14
what you mean. It does have a different... I
1:06:17
think this was a really good segment and a great thing
1:06:19
for a TikTok. But of course
1:06:21
it's not going to succeed on TikTok. It's
1:06:24
not going to grab you.
1:06:26
You're going to swipe. Yeah,
1:06:29
I think it would probably require some of your
1:06:32
son's fast editing to make it work. He
1:06:34
needs knife work. We need those. We need
1:06:36
the crackling round of bacon. We need knife
1:06:38
work and frying bacon. Oh yeah. Look
1:06:42
at this poor guy. I've got to tell you. He's
1:06:45
doing very well, but he works morning
1:06:47
till night making these stupid
1:06:49
sandwiches nobody eats. I said,
1:06:52
Henry, how come you not be
1:06:54
fat? He says, I don't eat
1:06:56
that stuff. I
1:06:58
said, well, do you... So he's doing it in his mom's
1:07:00
kitchen. Actually, he's moved back down to LA, but he was
1:07:02
doing it in his mom's kitchen. Do you give it to
1:07:04
mom? He said, no. I said, I
1:07:06
don't want to. Jennifer said, yeah, usually I
1:07:08
have to do over eat. I said, I can't get
1:07:10
into the kitchen. And then I said, well, who do
1:07:13
you give it to? You never give it to me.
1:07:15
He said, yeah, I have friends. I
1:07:18
don't understand. But anyway, that's neither
1:07:20
here nor there. It's hard.
1:07:22
Let's put it this way. It's hard to go
1:07:24
viral. There's no magic thing. But
1:07:27
one of the tricks really, at least Henry's
1:07:29
trick, is you've got to... Your audience is
1:07:31
an audience of one that... And it's a
1:07:33
machine. It's the TikTok algorithm. Don't
1:07:37
you think that's true? Speaking of... Yeah,
1:07:40
I mean, absolutely. Aren't you glad
1:07:42
we did that podcast police segment,
1:07:44
Jeff, that really forwarded the conversation?
1:07:48
I thought it did. Line 93. I
1:07:50
think it did. Line 93, guys. Speaking
1:07:52
of which, I think we could talk about Line
1:07:54
98, which we talked about a bit before the show.
1:07:56
Now you've got parents doing it. Listen.
1:08:00
And I'm going to lean in. He hates this. You
1:08:02
remember weeks ago you may have heard of
1:08:05
the TikTok tunnel girl, a
1:08:08
spiritual cousin of the TikTok ill
1:08:10
guy. She was a woman
1:08:12
who was digging a series of tunnels
1:08:14
underneath her Virginia home. She's
1:08:16
been ordered to stop doing that because
1:08:18
it's against the law. How
1:08:22
long did she go and how far
1:08:24
did she get digging these tunnels? Probably
1:08:26
longer than a year because the video I played
1:08:28
for you guys a couple of weeks ago was
1:08:30
a one year recap of her tunnels and
1:08:33
they're cavernous. She had to get a
1:08:35
mine car set up in there. Apparently
1:08:37
pools of water. Is
1:08:39
she an engineer? She
1:08:42
is an engineer. She's a software
1:08:44
engineer. Oh, that doesn't count. They
1:08:47
don't teach you in the computer science
1:08:49
program how to keep the roof of
1:08:51
a tunnel from collapsing in on you.
1:08:54
Wait a minute. Wait a minute. Is that
1:08:57
true? That's not actually Barack Obama
1:08:59
commenting on that. He didn't really say
1:09:01
you are my Beyonce? Hero? Yeah,
1:09:04
no. No. Okay. All
1:09:06
right. That's somebody that doesn't even look like
1:09:08
Barack Obama. No,
1:09:11
it's someone on the internet lying about who they
1:09:13
are. But
1:09:15
for some reason the New York Post decided to
1:09:17
include it. Yeah.
1:09:20
So it was an
1:09:23
unpermitted tunnel digging project in a suburban
1:09:25
Washington, D.C. home. And it's now been
1:09:27
slapped with a bunch of potential violations.
1:09:30
What I'm curious is, did
1:09:32
she go under neighbor's property? She
1:09:35
had to. Otherwise it seems bigger than that. I
1:09:37
think she was building, if I recall correctly,
1:09:40
it could be a storm shelter. She
1:09:42
says it was a storm shelter. Yeah. All
1:09:44
right. Now we've moved on to something else.
1:09:47
Yeah, okay. At least you have to scold
1:09:49
parents like you scolded me. No, no.
1:09:52
Yeah. I wanted us. Yeah,
1:09:55
exactly. We're going to get in trouble. Yeah,
1:09:58
exactly. Both kids have to. Right. Oh,
1:10:01
she did post a video of her getting
1:10:03
shut down by the police, which is great
1:10:06
content, I'm sure. Sorry,
1:10:08
that is an exercise for
1:10:10
the listener. Go
1:10:13
to TikTok. What is your TikTok handle?
1:10:16
Oh, I have no idea. Okay. Search
1:10:18
tunnel girl. She'll come up, I'm sure.
1:20:20
Tunnel girl. Yeah. The
1:10:24
OnePlus Buds 3? No,
1:10:27
I don't want to do a story about that. No, you put this
1:10:29
on there. For folks playing
1:10:31
at home. This was the first thing on
1:10:33
the rundown. He's
1:10:38
going after us for going down Three
1:10:46
reportedly get price cut That'll
1:10:52
be a real conversation starter. No particular
1:10:56
order; in fact, they are in
1:10:58
chronological order so as you can see
1:11:01
that story was bookmarked yesterday. So it's
1:11:03
old too. Yeah,
1:11:08
January 2nd. You
1:11:11
bust us, we bust you. Well, I'm just
1:11:13
saying I haven't put this in
1:11:15
any order; you're assuming. Yeah, yeah, yeah,
1:11:17
the first line is the most important. No,
1:11:19
that's not true. In fact, I didn't
1:11:21
lead with this, did I? I led with... line
1:11:26
In our part of the thing. Yeah, yeah,
1:11:28
that should have been in there too. No, it
1:11:30
is. It's actually line 51 as well in my
1:11:33
area, but it was in the AI area Anyway,
1:11:36
oh, it's right. It was a little too much
1:11:38
inside baseball. So what do you have to say
1:11:40
about those OnePlus Buds,
1:11:42
Leo? Their price actually
1:11:44
ended up getting cut. Deleting
1:11:48
now a
1:11:52
First time we've shamed Leo for
1:11:54
moving a row of the
1:11:56
Google sheet. It is now a mutual
1:11:58
shaming society. We're
1:12:00
all seeing the Howard Stern Show. Apps
1:12:04
will be reporting your earnings to tax authorities
1:12:06
starting this week. How about that one? If
1:12:09
you make money with an Airbnb
1:12:11
or Etsy or eBay. I'm surprised
1:12:13
they haven't been. I am too.
1:12:16
Don't you have to give a, what do you call it, a 1099
1:12:18
or something? Yeah. Well, no. That's
1:12:21
what they give you. What do you have to give them? A
1:12:24
W-9. A W-9. Right. Yeah.
1:12:28
It doesn't mean you have to pay income tax
1:12:30
on it. In fact, 9to5Mac says whether or
1:12:32
not you have to pay tax on this income
1:12:35
depends on an often complex set of rules which
1:12:37
vary by country. But
1:12:39
this is a global agreement. So it's not just
1:12:41
in the U.S. The 38
1:12:43
members of the Organization for Economic Cooperation and
1:12:46
Development have all agreed, including
1:12:48
the U.S., U.K., many European countries, have
1:12:50
all agreed that these platforms have really
1:12:52
got to tell us if you're making
1:12:54
money. Here's the hard
1:12:56
part of that. So I wrote
1:12:58
a piece for some Spanish magazine. They're going
1:13:00
to pay 300 euros. Fine. I'll
1:13:02
take the 300 euros. But oh my God, I
1:13:05
have to get IRS documents to prove that
1:13:07
I'm an American, do not get
1:13:09
the Spanish tax, take it out. Oish.
1:13:12
That's a lot. A lot. Did
1:13:15
you see Anil Dash? By the way, I've
1:13:17
been attempting to get a hold of Anil, but
1:13:20
he's been on the show many times before in
1:13:22
the past. His Rolling Stone article entitled,
1:13:24
The Internet is About to Get Weird
1:13:26
Again. I think that's
1:13:29
weird as in Austin weird, good
1:13:31
weird. You mean good weird. Yeah.
1:13:33
Yeah. Anil is a
1:13:35
serial entrepreneur. He was Gina
1:13:37
Trapani's boss. I think that's how I know him. And we used
1:13:40
to have him on the show frequently. I've known
1:13:42
Anil for years, probably a decade or more. Oh,
1:13:44
I've known him for probably longer than you have
1:13:46
because we go way back in blogging land. Yeah,
1:13:49
he's a proto blogger. Yeah.
1:13:52
His point, which I like, is that
1:13:54
2024 is going to be a watershed
1:14:00
for the internet that the big tech
1:14:02
companies, the tech giants, thanks
1:14:05
to the EU primarily but somewhat to the
1:14:07
FTC, are being forced to
1:14:09
hold their noses and embrace mandated
1:14:11
changes like I'm reading his
1:14:13
prose here, like opening up their devices to
1:14:16
allow alternate app stores to provide apps to
1:14:18
consumers. Back in the
1:14:20
US, a shocking judgment in Epic Games lawsuit
1:14:23
against Google leaves us with the promise that
1:14:25
Android phones might be opening
1:14:27
up in a similar way. Twitter's
1:14:31
slide into irrelevance and extremism
1:14:34
has hastened the explosive growth of a whole new
1:14:36
host of newer social networks,
1:14:38
including he's a Mastodon
1:14:40
user. And he
1:14:42
mentions Mastodon and Blue Sky
1:14:44
and threads. He
1:14:49
says, and I think this is an interesting point
1:14:51
of view, that he sees it going back somewhat
1:14:53
to its roots. I hope he's right.
1:14:58
We're going to still try to get him on it because I'd love to
1:15:00
hear what he thinks about this. He
1:15:03
says, I'm not a Pollyanna about the fact that
1:15:05
there's still going to be lots of horrible things on
1:15:07
the internet and that too
1:15:09
many of the tycoons who rule the tech
1:15:11
industry are trying to make bad things worse.
1:15:15
There's not going to be some new killer
1:15:17
app that displaces Google or Facebook or Twitter
1:15:19
with a love powered alternative, but that's because
1:15:21
there shouldn't be, Anil Dash
1:15:25
writes, there should be lots of different human
1:15:27
scale alternative experiences
1:15:29
on the internet that offer up home
1:15:32
cooked, locally grown, ethically sourced code to
1:15:34
table alternatives to the factory farm junk
1:15:36
food of the internet. And
1:15:39
they should be weird. This is what Kevin Marks has
1:15:41
been saying. The indie web. What
1:15:44
you've been saying. That's what I've been saying. You've been
1:15:46
saying it. Honestly, this also applies to our previous conversation.
1:15:48
That's why I don't want big tech to run up
1:15:50
AI. I want AI to be
1:15:52
open and available and weird. AI should be
1:15:55
weird. That's where innovation happens
1:15:57
is at the interface. So
1:15:59
I like this article in Rolling Stone. It's kind
1:16:01
of an opinion piece, I guess, but
1:16:03
Anil Dash, yes,
1:16:06
labeled commentary. Yeah, so
1:16:09
we'll try to get Anil on. Here's a...
1:16:11
but to find it in Rolling Stone, I was
1:16:13
glad. That's interesting. Well, Rolling Stone has become really
1:16:15
a kind of a political opinion journal
1:16:18
more than anything. Cheaper than reporting,
1:16:20
yes. Noah Shachtman took over.
1:16:23
it's yeah become more Daily
1:16:25
Beast-y. Yeah. And what is his
1:16:27
background again? I'm trying to remember. He
1:16:30
was... he was editor in
1:16:32
chief of the Daily Beast, right? Okay.
1:16:36
and before that he was at wired
1:16:38
and somewhere else yeah
1:16:40
Rolling Stone which I think
1:16:42
Jann Wenner sold it. Yeah,
1:16:44
but it's not just he didn't own it he
1:16:47
was just installed as the head
1:16:49
of it, right? It's bought up just all
1:16:51
kinds of entertainment media trade
1:16:55
and retail
1:16:57
which is bizarre because Penske to
1:17:00
me is a motor racing company
1:17:02
well, that's... it's, like,
1:17:05
the son now. Penske is a
1:17:07
media company now. He is, yeah. Recently
1:17:09
invested a hundred million into Vox
1:17:11
at a time when oh that
1:17:13
was already having issues so
1:17:16
a big change so do you think
1:17:18
the Rolling Stone is more like the
1:17:20
Daily Beast now that Shackman is in
1:17:22
charge? I mean, editorial-wise, absolutely. Yeah,
1:17:24
I mean Noah
1:17:26
is very good at what he
1:17:28
does and he smartly decided
1:17:31
that the Daily Beast model
1:17:33
was working and what Rolling
1:17:35
Stone was doing before was not and
1:17:37
brought over a lot of the smartest
1:17:39
political writers and reporters from the Daily
1:17:42
Beast to Rolling Stone in addition to
1:17:44
kind of revamping their cultural coverage. Yeah,
1:17:48
he was a... You go to PMC
1:17:50
dot com. I don't want to really...
1:17:52
You're ready to leave Penske? If you go
1:17:54
to pmc.com, there will be a
1:17:56
TikTok on the website. Really?
1:20:01
Remember that? Yeah. Wow. The
1:20:04
Firefly mobile phone for children?
1:20:06
So this
1:20:08
is really interesting. So Roger Penske, who is an
1:20:11
IndyCar driver, became,
1:20:15
took his renown as
1:20:17
a racecar driver to
1:20:19
found the Penske Corporation. Wow.
1:20:23
Very interesting. That
1:20:26
wasn't a bad detour, that was an oil
1:20:28
change. Oh, it's a weird detour. But I
1:20:31
brought it up because I said, what's happened
1:20:33
to the Rolling Stone? And this is all
1:20:35
from Anil Dash's interesting piece. All
1:20:37
right. Let's take a little break because
1:20:39
I'm breathless. You're
1:20:43
listening to This Week in Google with the
1:20:47
chortling Anderson
1:20:49
Cooper style. All
1:20:54
we need is a cat cafe and we've got a
1:20:56
maid and there's more
1:20:58
to The Information. Yeah.
1:21:01
Where's your cat? I was about to
1:21:03
say, my cat is clearly sleeping on
1:21:05
her duties. She's got to be on the screen for this.
1:21:08
I think we're talking to a new
1:21:10
sponsor that makes one
1:21:13
of those kitty litter boxes that rotates.
1:21:15
Ooh. Do you want that? An
1:21:17
automatic cat cleaner? Yeah. Those
1:21:20
are nice. Yeah. Okay. They're
1:21:22
very expensive. They're okay. They're high quality things. All
1:21:24
right. They're a long
1:21:28
time. I
1:21:31
love the hearing off mic. I
1:21:33
want one too. Everybody wants one.
1:21:35
Automatic cat cleaners forever long. Huge.
1:21:37
Huge. Have
1:21:40
you ever heard
1:21:42
of Zulily? No.
1:21:44
What is that? I feel like I know that name.
1:21:46
I think they were an e-commerce business. They
1:21:49
are going out of business. Thirteen years. They
1:21:52
started in Seattle. They're shutting down their
1:21:54
operations because they say they couldn't compete
1:21:56
with Amazon. If
1:24:00
you settle, not sure what it costs you. Well
1:24:03
Google and the plaintiffs have agreed to terms that will
1:24:05
result in litigation being dismissed. The agreement will
1:24:07
be presented to the court by end of January, the court
1:24:09
giving final approval. This Ars Technica article
1:24:11
does not have a dollar amount, but I think I
1:24:14
did see some with like, was
1:24:16
it $20 billion? I
1:24:18
thought it was hundreds of millions. Hundreds of millions? Let
1:24:20
me see if I can find it. I thought so.
1:24:22
I don't know. I can be wrong. I
1:24:24
can be wrong. I can be hallucinating as they say. Five
1:24:26
billion dollars. Five billion, geez. According
1:24:29
to NPR. In my evil voice: Five
1:24:31
billion dollars. Well, okay, wait a
1:24:33
minute. They agreed to settle a
1:24:35
$5 billion lawsuit. Oh,
1:24:37
yeah. But for how much? Terms
1:24:40
weren't disclosed, the suit originally sought $5
1:24:42
billion on behalf of users. Yeah, yeah,
1:24:44
yeah. Okay, all right. Yeah. I've
1:24:47
been in that seat before. We
1:24:49
don't know, I guess. I was going
1:24:51
to say, I mean it's rare for settlement amounts to
1:24:53
be made public in these sort of cases. That's
1:24:56
right. That's true. And usually, in fact,
1:24:58
there's a clause that says you can't. Public thing,
1:25:00
isn't it? Yeah. Except
1:25:02
for this. I mean, was it a class action suit? Yes.
1:25:06
In that case, the class... Wait a minute. Was it? Actually,
1:25:09
wait a minute. I'm sorry. I don't
1:25:11
know if it was. It is a class action suit. It is. Yeah,
1:25:14
actually. So, yeah, they'll have... They'll say what
1:25:17
you... We all get the 50 cents
1:25:19
each or whatever. But that's another loss for Google
1:25:21
in the courts. It's not been a good few months
1:25:23
for Google. So, when you used
1:25:25
incognito, or if you use it still, what
1:25:29
does it protect? Do we know? It's
1:25:31
basically the spouse mode. It
1:25:35
protects against somebody with access. It doesn't add
1:25:37
the sites you visit to your history.
1:25:40
It prevents somebody in your home from going to
1:25:43
your browser and looking at what you looked at.
1:25:46
It does not, in fact, in any other
1:25:48
way hide what you do. Ah.
1:25:52
So, I think there was some merit in
1:25:54
this. I mean, by the way, that's true
1:25:56
of all private browsing modes. I
1:25:59
think. Let's
1:26:01
see. Here's
1:26:04
what Firefox says. This is in their, they
1:26:06
don't call it incognito, they call it private
1:26:09
browsing. I'm in a private browsing
1:26:11
window. It says Firefox clears your
1:26:13
search and browsing history when you close
1:26:15
all private windows, but this doesn't make
1:26:17
you anonymous. It
1:26:19
only basically clears your history is what it
1:26:21
really does. So Google says
1:26:24
it won't save your browsing
1:26:26
history, cookies and site data, information entered
1:26:28
in forms. Your activity might
1:26:30
still be visible to websites you visit,
1:26:32
your employer or school, your internet service
1:26:35
provider. Certainly. That's always been
1:26:37
there though, I think. I know Google's always had
1:26:39
this disclaimer just as Firefox had, but the name, it's
1:26:42
like Tesla's
1:26:44
autopilot. The name doesn't
1:26:47
fly. Incognito
1:26:50
and then they have a little spy with sunglasses
1:26:52
on and a hat. It implies
1:26:54
that you're traveling around invisibly. We've
1:26:56
always, I mean I've told people
1:26:58
this for years, but
1:27:01
I think it's a very common misconception just
1:27:03
from the name. And
1:27:07
Google, this will be one of those where they
1:27:09
admit no wrongdoing, but here have some money. Did
1:27:13
you? And that money will ultimately end up being
1:27:15
$7 per person. Oh,
1:27:17
if that is... oh, I think the lawyers are the
1:27:19
only people who get rich on this. I
1:27:24
found this out from our local
1:27:26
NBC affiliate in
1:27:28
California. That's when
1:27:30
you were watching the local news. I was watching
1:27:32
the local news during the Rockin'
1:27:35
New Year's Eve. Yeah, let me just get my hair here
1:27:37
because I want to make sure I look good. Leo
1:27:40
is combing his hair while staring intently in
1:27:42
the camera. It's for
1:27:45
the TikTok. California
1:27:50
Law enforcement officials, authorities are told
1:27:52
that if they pull over an
1:27:54
autonomous vehicle without a driver, they
1:27:56
may not write a ticket.
1:28:02
Who would the ticket go to?
1:28:05
That's part of the problem, right? When
1:28:07
driverless cars break the rules of the
1:28:09
road, there's not much law
1:28:11
enforcement can do about it. Tickets can only
1:28:13
be written in California if there's an
1:28:15
actual driver in the car. That's not
1:28:17
true in all states. By the way,
1:28:20
In some states, you can ticket the
1:28:22
manufacturer, the operator of the driverless vehicle.
1:28:25
You skipped over stuff there.
1:28:27
How does a cop pull over a driverless car?
1:28:30
Yeah, we sort of. the car. Noted.
1:28:33
Pullover. Well.
1:28:35
They... I'm sure it
1:28:39
looks for the flashing
1:28:41
lights and sirens and pulls over to
1:28:44
get out of the way. And
1:28:46
Waymos, that kind, you know,
1:28:48
have been pulled over. Just
1:28:52
like you would. It
1:28:54
sees the flashing lights and there's really no
1:28:56
reason why it then would not
1:28:59
behave as if it's gotta pull
1:29:01
over.
1:29:03
Right. So in Texas,
1:29:06
per the Texas Transportation Code, the owner of
1:29:08
a driverless car is considered the operator whether
1:29:10
they're in the car or not. Right.
1:29:13
In Arizona, same thing: the owner of
1:29:16
the vehicle may be issued a traffic
1:29:18
citation or other penalty if the
1:29:20
vehicle fails to comply with traffic laws.
1:29:22
But the motor vehicle code in
1:29:24
California... it just seems
1:29:27
like a loophole. Perhaps it does.
1:29:30
I never really even thought of it. I just
1:29:32
assumed that the car would pull over if
1:29:34
it turned on the flashers and...
1:29:37
Everyone should
1:29:39
pull over with the flashers behind
1:29:42
you, whether it's a Waymo bot
1:29:44
or... yes. Well.
1:29:46
But that's...
1:29:49
Has anybody, show of hands,
1:29:51
been in a Waymo and
1:29:54
been pulled over? I
1:29:56
mean, the thing is, it doesn't happen that often; these guys
1:29:58
are pretty good. I mean, those ones, all a
1:30:00
Waymo can do is stop; it's
1:30:02
very cautious, you know, like my grandma. A
1:30:05
Waymo, you know... Cruise is
1:30:07
practically out of business because
1:30:10
of that trouble with the San
1:30:12
Francisco DMV. And so forth.
1:30:14
So it's really
1:30:16
Waymo at this point. But who should
1:30:18
be liable?
1:30:22
Like, you know... is it
1:30:24
the software, or, you know,
1:30:26
the... I don't know? Well, okay, so
1:30:28
there's somebody driving. Or something,
1:30:31
whatever, is driving... their service. And
1:30:33
who gets the ticket?
1:30:36
Ah, the company that owns the
1:30:39
car... or whoever owns the guts.
1:30:41
I mean, sometimes these cars are driven by somebody
1:30:43
back at the home office, more often apparently
1:30:45
than we know. Ah,
1:30:48
But... so that's a driver, I guess, not
1:30:50
someone in the driver's seat. And yes,
1:30:52
Waymo, if there's nobody else. Waymo,
1:30:55
because it's their software that violated the law.
1:30:58
Are you gonna be able to punish them? I'm
1:31:02
sure that's the question the California legislature has.
1:31:04
Yeah, this is probably why there's no real
1:31:06
laws about it, and a lot we
1:31:08
just don't know. I
1:31:11
think it's funny, though.
1:31:13
It's gotta be the company's
1:31:15
own car. It's gotta be. But they're
1:31:17
also the ones who put the money
1:31:19
into the politicians. Ah, you think it's...
1:31:21
you think it's a profit motive?
1:31:23
Isn't everything in America,
1:31:25
for them? As a person who's...
1:31:27
yes... God, the young people are just
1:31:29
so cynical. I... yeah, yeah.
1:31:31
I mean my sewing libraries seats is
1:31:33
this is. Under
1:31:36
a young person. Take
1:31:38
it to lot to say youngsters imitate
1:31:40
the sale Zero better for all young
1:31:42
person I used to be avoided or
1:31:45
kids to do Believe it I know
1:31:47
I call John as soon Everybody's most
1:31:49
eligible man and San Francisco I well
1:31:51
as one of the hot and on
1:31:53
the card. says. it right here
1:31:56
was formerly one of San Francisco's one
1:31:58
hundred most eligible bachelors. Says it right
1:32:00
there in print. To find a
1:32:02
read formerly lived in tomato fields
1:32:06
Oh half a detected tomato field
1:32:08
I don't get no no no we're not
1:32:11
gonna go there this
1:32:13
goes back to a story I was
1:32:15
from years ago talking about a can of
1:32:17
tomato soup right okay
1:32:20
I lived in Cinnaminson, New Jersey.
1:32:22
she asked Leo fair enough you can
1:32:25
we know that it was built on
1:32:28
the old Campbell's tomato
1:32:30
fields that's all there
1:32:32
is the story do you grow up
1:32:34
smelling the sickly sweet smell of tomatoes
1:32:36
well it was all it
1:32:38
was a great it was actually a great story about I didn't
1:32:41
realize this but Campbell's made so much tomato
1:32:43
soup they actually they actually had to like
1:32:46
buy up New Jersey and grow tomatoes there's
1:32:48
well the only place they could really do
1:32:50
it and so
1:32:52
they owned a huge farms in New Jersey
1:32:56
Cinnaminson and Moorestown.
1:32:58
yeah so
1:33:01
Cinnaminson, it's spelled C-I, double N, A-M-I-N,
1:33:03
S-O-N. What kind of
1:33:05
article are you reading, you know, Mental Floss or
1:33:09
Actually, it was in... there was a
1:33:11
piece, it was in the Smithsonian magazine:
1:33:13
how Campbell's soup turned New Jersey into
1:33:15
a tomato growing state and
1:33:18
they actually made a special kind
1:33:20
of tomato just from modern farmer
1:33:22
from Modern Farmer, a classic magazine, Modern
1:33:24
Farmer. Oh, seriously?
1:33:26
it's wonderful it's reprinted by the Smithsonian I
1:33:28
guess and there's a picture that's what they
1:33:30
give it away that's Harry Hall that's Jeff
1:33:33
check yeah that's Jeff and in
1:33:35
his younger days looking at tomatoes. Uncle
1:33:37
Harry this is
1:33:39
in Cinnaminson, I
1:33:41
think it is. Yeah, there it
1:33:43
is: Cinnaminson. Cinnaminson. And what
1:33:46
was the name of the tomato that
1:33:50
they created there was a special tomato
1:33:52
that they made that was what
1:33:55
they made for Campbell's tomato soup it's really a
1:33:57
great actually a great story
1:33:59
I think it's That's a fascinating story. The
1:34:01
Rutgers Tomato. Probably
1:34:04
developed with the extension service of Rutgers University. The
1:34:08
JTD Tomato named after John Thompson
1:34:11
Dorrance, later
1:34:13
president of the company released in
1:34:15
1918, medium-sized red tomato
1:34:18
averaging in the 8 to
1:34:20
12 ounce range, uniform
1:34:22
in shape, tasty, and most importantly,
1:34:24
does not crack. The
1:34:28
JTD. Tomatoes
1:34:30
crack? Oh yeah, you've seen that,
1:34:33
haven't you? Where they kind of grow
1:34:35
the skin. Oh,
1:34:37
like an heirloom tomato. Yeah, yeah, they crack. Where
1:34:40
they look like they're weirdly sewn together. Yeah.
1:34:43
Anyway, there you go. The history of tomatoes in
1:34:45
New Jersey. I know, don't you? Now,
1:34:51
what's the accompaniment, the natural
1:34:53
accompaniment for a bowl of Campbell's tomato soup?
1:34:56
Grilled cheese. Grilled cheese, yeah. Okay.
1:35:00
Gotta be. We have that most every Friday night. I
1:35:04
have an $1800 appliance that
1:35:06
makes excellent tomato soup. Say
1:35:10
more. You
1:35:12
need your tomatoes to be uncracked? No, no, I
1:35:14
actually can make it with canned
1:35:17
tomatoes, but I prefer to use
1:35:20
the nice Italian San Marzano tomatoes.
1:35:23
It's the Thermomix. Do you know about the Thermomix? Stacy
1:35:26
introduced me to this, I think, and
1:35:29
forced me to buy one. I
1:35:33
will not be introducing you to any $1200
1:35:35
tomato-specific appliance. I
1:35:38
also have several June ovens, thanks, Dar.
1:35:41
Stacy's, by the way... it used to be a
1:35:43
game on the other show, Paris, to get
1:35:45
Leo to buy live, but now that things aren't going so
1:35:47
well in podcasts we stopped doing it. But
1:35:49
he's selling off all of these things that
1:35:51
he bought. Let's
1:35:54
see. As you were starting to say, Stacy, you
1:35:56
were interrupting. Stacy's going to be on
1:35:58
TWiT. When's that coming up, Benito? She'll be
1:36:00
on TWiT on the 28th, the 28th of this
1:36:03
month. Cool. And let's
1:36:05
not forget Stacy's book club which continues
1:36:08
if you are a club twit member You
1:36:10
know that we've been doing this for some months
1:36:14
And even though Stacy has departed
1:36:16
this show. She says I still want to
1:36:18
do the book club So Stacy's book club
1:36:20
is coming up When
1:36:23
is that February 8th? I'm reading the book right
1:36:25
now. It's
1:36:27
called The Water Knife by Paolo Bacigalupi,
1:36:29
Bacigalupi, and it's really good so far.
1:36:32
I'm really enjoying it. So, February. Huh.
1:36:38
What is it about I will be hosting that that's why
1:36:40
I'm reading the book it's about I'm
1:36:43
sorry cross talk it's about
1:36:45
well so far it seems to
1:36:47
be about Las Vegas in the future
1:36:49
Which has become one of those
1:36:51
you know, like, desert city
1:36:54
things You know kind
1:36:56
of monoliths and then an arcology, that's
1:36:58
the word, and so
1:37:00
it's in Las Vegas because you can't
1:37:03
live in Las Vegas because it's too hot so that's
1:37:05
become an arcology and it's about the
1:37:07
woman who really kind of is
1:37:09
the one who runs it. Anyway,
1:37:12
it's very interesting. It's nice. Yeah, and
1:37:15
then I should mention we are gonna do an
1:37:17
Inside TWiT for club members tomorrow. One
1:37:20
o'clock. Lisa said I will do a state
1:37:22
of the nation or a state of the
1:37:24
podcast I guess One
1:37:27
p.m. Pacific tomorrow. For
1:37:30
those of you who are in the club if you're not
1:37:32
in the club And you want to know more about this
1:37:34
kind of stuff and the shows we do in the club
1:37:36
and we're gonna do Samsung Galaxy Unpacked on January 17th, stuff
1:37:38
like that Join the
1:37:41
club Seven bucks a month you
1:37:43
get ad-free versions of all the shows You
1:37:45
get access to the club twit discord which is
1:37:47
a wonderful community by the way ants in there
1:37:49
right now, which is great. Hi, Ant. We
1:37:53
also... he
1:37:55
says he wants an immersion blender
1:37:57
mayo tutorial, okay, ant deal.
1:38:01
I'll show you how I'm never buying
1:38:03
mayo again. It's amazing. You
1:38:05
also get special programming. We don't
1:38:07
put out anywhere else like Stacy's Book Club.
1:38:10
And most importantly, that $7 a month really
1:38:12
supports what we do here. We
1:38:16
just cannot survive on ad money alone, I'm
1:38:18
sorry to say. So we
1:38:20
need your help. I'm very pleased to say
1:38:22
people have responded to this. And
1:38:25
it's great. We've got 10,000 members now. We need 35,000. So
1:38:27
which is only 5%.
1:38:31
I think I at
1:38:33
least would like to see that soon. Sooner
1:38:36
the better. I think what is the goal by
1:38:38
the end of the year 35,000? I think so.
1:38:41
twit.tv slash club. Please
1:38:44
give us and
1:38:47
get gifts. I belong to the
1:38:49
club. I bought a subscription for Son
1:38:51
Jake. Thank you. Your whole family there.
1:38:53
Yeah. Lisa did a blog post on
1:38:55
the 12 ways to support. She's
1:38:58
she's one. In fact, you know
1:39:00
what, it was an example of how to do
1:39:02
tiktok. Because she split it up into a
1:39:04
whole bunch of little bits and then shrug it
1:39:06
out. Yeah. And she did the same thing on the Twitter,
1:39:09
which was great. And
1:39:11
it's on the twit blog at twit.tv. And some
1:39:13
of them don't cost any money. Anything you can
1:39:16
do to help will be much appreciated. Joining
1:39:19
the club is fantastic. I really love the club.
1:39:21
I during the Christmas break, I spent a lot
1:39:23
of time in the in
1:39:25
the discord, getting help from
1:39:28
the advent of code experts in there. That was a
1:39:30
lot of fun.
1:39:32
Moving right along. Should we do our AI? We've
1:39:34
done much of it, but we can do the
1:39:36
rest of it. Our AI segment. Let's do this.
1:39:40
Oh, we don't have a there's no AI. Hey,
1:39:44
hey, can't you
1:39:46
see it's AI?
1:39:48
That's the Brooklyn version.
1:39:50
Hey, you should do
1:39:52
that for
1:39:57
the Brooklyn in New Jersey. Hey, yeah. From
1:40:01
Politico, a new kind of AI
1:40:03
copy can fully replicate famous people.
1:40:06
The law is powerless. You
1:40:11
can take photographs of people and steal their souls.
1:40:13
There should not be a law. Actually, the story
1:40:15
is, when you read it, it's
1:40:17
like, well, no, actually Martin Seligman loves this
1:40:19
idea. So Martin Seligman, who
1:40:21
is 81, is a psychologist, very
1:40:24
well-known apparently. He was
1:40:27
pondering, according to Mohar Chatterjee,
1:40:29
who wrote this story, he
1:40:31
was pondering his legacy at a dinner party
1:40:33
in San Francisco one late
1:40:35
February evening. He
1:40:39
got an email from a graduate
1:40:42
student in China, Yukun Zhao. He
1:40:46
had created, Zhao had created without
1:40:48
Seligman's knowledge, a virtual Seligman.
1:40:50
Over two months by feeding every word
1:40:52
Seligman had ever written into
1:40:55
cutting-edge AI software, Zhao
1:40:59
had built an eerily accurate version of
1:41:01
Seligman himself. A talking chatbot is exactly
1:41:04
what we were just talking about, right?
1:41:07
Whose answers drew deeply from Seligman's
1:41:09
ideas, whose prose sounded like a
1:41:11
folksier version of Seligman's own speech,
1:41:13
and whose wisdom anyone could access.
1:41:16
Impressed, Seligman
1:41:18
went, wow! Circulated the chatbot
1:41:20
to his closest friends and family to say,
1:41:23
you know, does this sound like
1:41:25
me? And his wife said,
1:41:27
she was blown away by it. The
1:41:29
bot is cheerfully named Ask Martin.
1:41:33
Cheerful? That's cute. It
1:41:36
was built by researchers in Beijing
1:41:39
and Wuhan without Seligman's permission or
1:41:41
even awareness. Seligman
1:41:44
doesn't mind, he's 81. In fact, this answers the
1:41:46
question, what's going to happen after I go? I
1:41:50
think this is fine. There
1:41:57
are others, Belgian celebrity psychotherapist
1:41:59
Esther Perel,
1:42:01
who was not so
1:42:03
happy, in Southern California, a tech
1:42:06
entrepreneur created a chatbot of her
1:42:08
scraping her podcasts off the internet.
1:42:10
No one would ever scrape podcasts
1:42:12
off the internet. He
1:42:15
actually... The Belgian Dr. Phil. I
1:42:18
guess so. What's interesting is the
1:42:20
guy who did this, Alex Furmansky, did it
1:42:22
because he had a recent
1:42:24
heartbreak and he built it to counsel
1:42:26
himself. Well, that's
1:42:28
kind of sad. I want to cry
1:42:30
on it. I know. Here's his Medium
1:42:33
article. Instead of simply
1:42:35
speaking with a therapist, I created an
1:42:37
AI one. Actually, it's sub-set.
1:42:39
It's cheaper. Yeah, probably.
1:42:45
So, we've come a long way from Eliza. I
1:42:48
don't know if Perel
1:42:50
was unhappy about this. Like
1:42:54
Seligman, the article goes on to say she
1:42:56
was more astonished than angry. She called it artificial
1:42:59
intimacy. I
1:43:03
think this is a good thing. Now
1:43:05
Congress may not. In fact, Congress has good
1:43:09
old Amy Klobuchar has
1:43:12
a bill, she's one of the co-sponsors,
1:43:14
called No Fakes, the No
1:43:16
Fakes Act, which is- I'm
1:43:20
sure it stands for something. It's of course an acronym. Let me
1:43:22
look it up. I want
1:43:25
to be at the bar when the staffers come up with
1:43:27
these names. Yeah, I would like
1:43:29
that job. I just like the job of making the
1:43:31
little acronyms. They're
1:43:34
really backronyms because they
1:43:36
start with what they want it to say,
1:43:39
right? Of course. Yeah. Yeah.
1:43:42
It would penalize
1:43:44
AI for generating someone's likeness without
1:43:46
their consent. Chris
1:43:51
Coons, Marsha Blackburn, Amy
1:43:53
Klobuchar, and Thom Tillis. Amy,
1:43:56
come on. Knock it off. The
1:43:58
No Fakes Act. Let's
1:44:01
download the text of the No Fakes Act
1:44:04
and see what that stands for. Now you've got me. I really
1:44:07
want to know. Title,
1:44:12
to protect the image, voice, and
1:44:14
visual likeness of individuals and
1:44:16
for other purposes. Oh, get ready. Here
1:44:18
we go. What does No Fakes
1:44:20
stand for? Nurture Originals,
1:44:23
Foster Art, and Keep
1:44:26
Entertainment Safe Act. Oh,
1:44:30
man. Well, that's the No... No
1:44:34
Fakes. That is No Fakes. Yeah,
1:44:37
it spells it. I
1:44:39
mean, yeah, I guess you should
1:44:41
have to give permission. Honestly, if somebody wanted to
1:44:43
create a Leo Laporte, I don't think there's a
1:44:46
problem with that. No, it depends
1:44:48
on how they use it. What if the Leo bot
1:44:50
became more popular than you, Leo? Would you have a
1:44:52
problem with that? Fine. But I'm at the
1:44:54
end of my career. And someone was making money on it?
1:44:56
It might be different for you, right? I'm at the end
1:44:58
of my career, and I would be fine with that. But
1:45:00
you might not. The issue isn't making it. The issue is
1:45:02
how you use it. If you use your image in an
1:45:04
ad and you don't give permission, that's already illegal. That's already
1:45:06
illegal, yeah. If you use it for a commercial purpose rather
1:45:08
than an editorial purpose. By the way, it's not illegal if
1:45:10
I'm dead, is it? And
1:45:13
it's not illegal if it's an editorial purpose. I
1:45:18
don't know. I have a mixed feeling. It's
1:45:20
illegal if you're dead for
1:45:23
a certain period of time. I mean, isn't that how the
1:45:25
estates of famous people work? What, until it's cold? I'd
1:45:28
say you're colder in your grave. The
1:45:31
No-Fakes Act has received support from multiple organizations
1:45:33
across the arts and entertainment industries. The
1:45:36
RIAA, of course, describes
1:45:39
the use of unauthorized AI performances
1:45:41
as theft. The Actors
1:45:43
and Writers Union, SAG-AFTRA, and
1:45:45
WGA. Of
1:45:48
course, one of the reasons they struck
1:45:50
was they were concerned about AI. I don't know. I
1:45:56
think I have the same position, which
1:45:58
is, it's probably already illegal to
1:46:00
do certain things. And we got to be
1:46:02
careful about too much regulation on this stuff
1:46:04
because we want the innovation to
1:46:06
happen. I think there's a benefit. I don't
1:46:09
know but I mean what innovation is happening
1:46:11
by making an AI chatbot off
1:46:14
based on one person? Well
1:46:16
you're probably right. Publicly scraping
1:46:18
all of their appearances. But
1:46:21
maybe the real benefit is that we learn
1:46:23
something about it, not necessarily that that thing
1:46:25
that they created is of that much value
1:46:27
but... I mean yeah, if you want to
1:46:29
do something in private to
1:46:32
learn something I don't think anyone's stopping you
1:46:35
from doing that. I think that probably
1:46:37
what... obviously I don't know the text of
1:46:39
this bill and whatnot but I assume it
1:46:41
has something to do with commercial use of
1:46:44
these. I think it just prohibits it
1:46:46
outright whether you make money on
1:46:48
it or not. Good article. This
1:46:50
is the one I would point people to.
1:46:53
Yann LeCun, who is one of the founding
1:46:55
fathers of LLMs and is very well
1:46:57
known. He's at Facebook. He's
1:47:00
Meta's chief AI scientist.
1:47:03
A good interview with him in Wired.
1:47:08
He scoffs, Wired
1:47:10
writes, at his peers' dystopian
1:47:12
scenarios of supercharged misinformation
1:47:15
and even eventually human extinction.
1:47:17
He's not a doomer.
1:47:21
When his former collaborators... He's a voice of
1:47:23
sanity in all of these. I think he
1:47:25
is. When his former collaborators Geoffrey Hinton and
1:47:27
Yoshua Bengio put their names at the top
1:47:29
of a statement calling AI a societal scale
1:47:31
risk, LeCun refused to sign
1:47:34
it. He said instead he signed an open
1:47:36
letter to the US President Joe
1:47:38
Biden urging an embrace of
1:47:40
open source AI declaring
1:47:43
it should not be under the control of
1:47:45
a select few corporate entities. Exactly what I've
1:47:47
been saying but I... Here's
1:47:49
a question. Yeah. Here's a question. When
1:47:53
Llama kind of leaked, the
1:47:56
presumption of many was that Facebook
1:47:59
didn't want to do that, and
1:48:01
this open source stuff is accidental. I don't think
1:48:03
so. I think LeCun is so religious
1:48:06
on the point. Did
1:48:08
Lambda leak or was it intentional? I
1:48:10
think it initially leaked and then Facebook
1:48:14
released to the open source community a
1:48:17
public version of it called
1:48:19
Llama 2. So I think
1:48:21
yeah, LaMDA is Google's.
1:48:23
So they're all plays
1:48:25
on the animal. That's the one that's alive. Right. Llama
1:48:29
2 is widely used in
1:48:31
open source. I have some open source tools that
1:48:34
use it as a model. Who
1:48:40
wrote this? I only get his name. This
1:48:42
is Stephen Levy, my good friend Stephen. Hi
1:48:44
Stephen. He says,
1:48:46
when I sat down with LeCun in a conference
1:48:48
room in Meta's Midtown office this fall, we talked
1:48:50
about open source, why he
1:48:52
thinks AI danger is overhyped and whether
1:48:54
a computer could move the human heart
1:48:57
the way a Charlie Parker sax
1:48:59
solo can. LeCun
1:49:01
is a jazz fan.
1:49:05
What a Stephen thing to ask.
1:49:07
Yeah, it is, isn't it? I
1:49:10
love it. Why
1:49:14
are Stephen asks, why are so many prominent people in
1:49:16
tech sounding the alarm on AI? To
1:49:19
which LeCun says, some people are
1:49:21
seeking attention. Other people are naive
1:49:23
about what's really going on today.
1:49:25
They don't realize that AI actually mitigates
1:49:27
dangers. This is interesting like hate
1:49:29
speech, misinformation, propagandist
1:49:31
attempts to corrupt
1:49:34
the electoral system. At
1:49:36
Meta we've had enormous progress using AI for
1:49:38
things like that. Five years ago of all
1:49:40
the hate speech that Facebook removed from the
1:49:42
platform, 20 to 25 percent was taken down
1:49:45
preemptively by AI systems
1:49:47
before anybody saw it. Last
1:49:49
year 95 percent. Which I think
1:49:54
is interesting. How do
1:49:56
you view chatbots? Stephen asked, are they
1:49:58
powerful enough to displace human jobs.
1:50:01
LeCun said they're amazing, big progress,
1:50:03
they're gonna democratize creativity to some
1:50:06
extent. They can produce very fluent
1:50:08
text with very good style but
1:50:10
they're boring and what they come up
1:50:12
with can be completely false. Mm-hmm.
1:50:15
He is a voice of reason. Yeah. He
1:50:20
says Mark Zuckerberg is very involved in
1:50:22
the AI push at Meta. Why
1:50:25
did Meta decide that llama code could be
1:50:27
shared or would be shared with others open
1:50:30
source style? LeCun says when you have an
1:50:32
open platform that a lot of people can
1:50:34
contribute to progress becomes faster. I've been saying
1:50:36
this all along. The systems you end up
1:50:38
with are more secure, they
1:50:40
perform better, imagine a
1:50:42
future in which all of our interactions
1:50:44
with the digital world are mediated by
1:50:46
an AI system. You don't want that
1:50:48
system controlled by a small number of
1:50:50
companies on the west coast of the
1:50:52
US. He
1:50:56
says Americans may not care but I
1:50:58
can tell you this, in
1:51:01
Europe they won't like it.
1:51:03
They say okay well this speaks English correctly
1:51:06
but what about French, what about German, what
1:51:08
about Hungarian? Yeah he is a proponent of
1:51:13
open and I think open is the right way to
1:51:15
go. What's sad is that was OpenAI's initial thesis
1:51:19
and they caved because of the cost. They wouldn't be
1:51:21
getting 1.8 billion dollars now. Well
1:51:23
I started listening to a podcast called
1:51:26
Mystery AI Hype Theater 3000 with Emily
1:51:28
Bender who's a co-author of the Stochastic
1:51:32
Parrots paper and Alex Hanna, and
1:51:34
the interesting thing is here is that I
1:51:37
look at LeCun as a voice of reason. They
1:51:39
puncture all the hype
1:51:43
of the doomer boys, but
1:51:45
they also try to puncture LeCun for
1:51:47
being too enthusiastic. It's
1:51:50
really hard to get the scorecard
1:51:52
here of who stands where on AI these days.
1:51:54
Yeah he says I don't like
1:51:56
the term AGI artificial general intelligence because there's
1:51:58
no such thing as general intelligence. This
1:52:01
is what we've grappled with. Intelligence is
1:52:03
not a linear thing you can measure.
1:52:05
Different types of intelligent entities have different
1:52:08
sets of skills. How
1:52:10
do you define it? What is general
1:52:12
intelligence? What is intelligence? He
1:52:15
says eventually there's no question machines will be smarter
1:52:17
than humans. We don't know how long it's going
1:52:20
to take. It could take years. It could be
1:52:22
centuries. And define smarter.
1:52:25
At that point, Levy says do we
1:52:27
have to batten down the hatches? No, no.
1:52:29
Well, I love this vision, by the way,
1:52:31
the future. We'll all have AI assistants. It
1:52:33
would be like working with a staff of
1:52:35
super smart people. They just won't be people.
1:52:37
Humans feel threatened by this. I think we
1:52:39
should feel excited. The thing that excites me
1:52:41
the most is working with people who are
1:52:43
smarter than me because it amplifies your own
1:52:45
abilities. It's true. I like working with you
1:52:47
too. You're smarter
1:52:49
than me. That's
1:52:52
what makes the world go around. Why would
1:52:54
I not want an AI assistant who's
1:52:56
also smarter than me, right?
1:52:58
And you know what? It would follow orders and
1:53:01
it wouldn't suggest going to different lines in the
1:53:03
rundown when you don't want to. Exactly my point.
1:53:07
There's no reason- I would never watch the talk. There's
1:53:10
no reason LeCun says to believe that just
1:53:12
because AI systems are intelligent, they will want
1:53:14
to dominate us. People are mistaken when they
1:53:16
imagine AI systems will have the same motivations
1:53:19
as humans. They just won't. We'll design them
1:53:21
not to. That's what Ray
1:53:23
Kurzweil always said too. Anyway,
1:53:27
I think it's very
1:53:29
interesting. Have you tried to get
1:53:31
him on note? No, but that's good.
1:53:33
Benito making note of that. Yann would
1:53:35
be fantastic on this show. I'll try.
1:53:37
Love to get him. Yeah. Here's
1:53:40
a great piece in Techdirt. Mike
1:53:44
Masnick nailing it again. The FTC
1:53:47
continues to wade into copyright
1:53:49
issues in AI without understanding
1:53:51
anything. I love
1:53:54
that the kind of subhead is,
1:53:57
the "from the 'why
1:53:59
is the FTC even looking at
1:54:01
this' dept." on the article. He
1:54:03
says, seriously, what
1:54:06
the F is the FTC doing endorsing
1:54:09
any of these bonkers points without
1:54:12
pushing back on why they themselves
1:54:15
are anti-competitive and problematic.
1:54:17
Instead, the FTC endorses
1:54:19
the untested idea that all training
1:54:21
data must be licensed. It
1:54:24
also argues that style mimicry is a
1:54:26
concern when that's kind of the basis
1:54:28
of almost all creators learning and building
1:54:31
their own styles. The problem is
1:54:33
that the FTC brought in a bunch of creators. He
1:54:39
said it was a very one-sided roundtable
1:54:43
of people from the creative industries who more or
1:54:45
less all agreed everyone should be forced to give
1:54:47
them money at every opportunity. Oh,
1:54:49
Mike is the greatest. He
1:54:52
says, hey, this is not part of the
1:54:54
FTC's portfolio. It's not part of their mandate.
1:54:57
They shouldn't be weighing in. And
1:54:59
they're wrong. They're misguided. So
1:55:04
I agree with him. This
1:55:07
is not the FTC's concern. I'm
1:55:09
not sure why they think it is. And
1:55:12
by the way, you want to encourage competition? That's
1:55:15
how you do it. You don't shut down
1:55:18
AI. You encourage it. All
1:55:22
right. I think that was our AI segment
1:55:25
unless you guys have a line number to
1:55:27
fire at me. Actually
1:55:29
this week, I think it was all crap. There
1:55:34
is, as I mentioned, we're going to do the,
1:55:38
I don't know if it's worth doing, but we're going to
1:55:40
do the Samsung Unpacked event, which is
1:55:42
coming up. They announced that. CES is
1:55:45
next week, right? So
1:55:47
for some reason, Samsung, instead of announcing
1:55:49
this at CES or concurrently with CES,
1:55:51
is waiting and we'll
1:55:53
announce the Galaxy S24 with AI. By
1:55:58
the way, that's a big part of this. and
1:56:01
they're going to do that at 10 a.m. Pacific. They're
1:56:03
coming to California to do it, which is interesting. Last
1:56:05
one they did was in Korea, and
1:56:07
at an ungodly hour, the
1:56:10
Samsung event will be at 10 a.m. on
1:56:12
the Galaxy Unpacked event on the 17th. I'm
1:56:16
so hoping for a new Galaxy Chromebook.
1:56:19
Can you just get a Chromebook? No,
1:56:23
well, no. I have – no,
1:56:25
I returned things. Oh,
1:56:28
you returned the Chromebook? You returned it. Why?
1:56:31
Yeah, I have an Asus that I hate. Okay. But
1:56:35
you got the Asus. But you got the Asus. I had
1:56:37
to return it, and they went back and went up. You
1:56:39
got the Asus that Kevin Tofel
1:56:41
recommended, right? Yeah, it – he
1:56:44
said he was very unhappy for me, but
1:56:47
yeah, it was a lemon, I think. I think it was
1:56:49
just a bad one. Oh, you just got a bad one. But you're not
1:56:51
going to get another one? Well,
1:56:54
Samsung is, of course, using the
1:56:57
latest Qualcomm chip, which has an ML
1:57:00
processing unit built into it, as do
1:57:02
all iPhones nowadays. And Samsung
1:57:04
is going to tout the fact that AI will be
1:57:07
available with the push of a button. So
1:57:10
I don't know if it'll be Bixby or something
1:57:12
smarter, but we will cover that. We'll do that live
1:57:14
right before when it's – I hope it has a
1:57:17
better name than Bixby. I
1:57:19
like the name Bixby. It's
1:57:21
kind of nerdy, and it's like somebody would wear really
1:57:23
dark glasses and – So it's like a 60s sitcom
1:57:26
character. Yeah, it does feel
1:57:28
like that. It's still Bixby, yeah. Amazon's
1:57:31
planning to make its own hydrogen-powered
1:57:34
vehicles. Oh, no. He's going
1:57:36
to – and Amazon's going to make hydrogen
1:57:38
to power vehicles. Okay. Amazon
1:57:43
– because I guess they
1:57:45
use a lot of hydrogen-powered stuff. And they're
1:57:48
in warehouses. Yeah. Forklifts
1:57:50
and things, yeah. So they partnered
1:57:52
with hydrogen company Plug Power to install
1:57:54
the first electrolyzer, which
1:57:58
splits water molecules to produce hydrogen, in
1:58:00
the fulfillment center in Aurora, Colorado. It
1:58:03
will make fuel for 225 forklifts. That's
1:58:07
pretty fun. Yeah, that's what it was.
1:58:10
Clean hydrogen is a little problematic.
1:58:12
There's not like any green hydrogen
1:58:14
really yet, but
1:58:16
maybe this will help move that a lot. Why?
1:58:20
Because it consumes energy to
1:58:22
make it or? To make truly green hydrogen, says
1:58:24
the Verge, Amazon would have to make
1:58:27
sure its new electrolyzer runs on renewables.
1:58:30
The company is looking into pairing it with
1:58:32
renewable energy generated on site, but doesn't have
1:58:34
a concrete timeline for when that might happen.
1:58:36
Most hydrogen is made with fossil fuels, which
1:58:40
releases obviously the same stuff that your
1:58:42
car releases. There's also
1:58:45
a methane leak issue. So
1:58:47
yeah, hydrogen is not super clean at this point.
1:58:51
Someday. Actually solar splitting of water
1:58:53
would be awesome, wouldn't it?
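Since the show only gestures at what an electrolyzer does, here is a minimal back-of-envelope sketch (ours, not the hosts'): the reaction is 2 H2O -> 2 H2 + O2, so plain stoichiometry tells you how much hydrogen a given mass of water can yield. The molar masses are standard values; the function name is our own.

```python
# Back-of-envelope: hydrogen yield from water electrolysis.
# Reaction: 2 H2O -> 2 H2 + O2, i.e. one mole of H2 per mole of H2O.
M_H2O = 18.015  # g/mol, standard molar mass of water
M_H2 = 2.016    # g/mol, standard molar mass of hydrogen gas

def hydrogen_yield_kg(water_kg: float) -> float:
    """Mass of H2 (kg) from fully electrolyzing `water_kg` of water."""
    moles_water = water_kg * 1000.0 / M_H2O  # mol of H2O
    moles_h2 = moles_water                   # 2:2 mole ratio, so 1:1
    return moles_h2 * M_H2 / 1000.0          # convert g back to kg

print(f"{hydrogen_yield_kg(1.0):.3f} kg of H2 per kg of water")  # prints 0.112
```

So roughly 11 percent of the water's mass comes back as hydrogen and the rest leaves as oxygen; the electricity needed to drive that reaction is exactly why the renewable-power question in this segment matters.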
1:58:57
Poor Elon Musk. According to Fidelity, which
1:58:59
put money into Twitter, the
1:59:02
value of Twitter has fallen 71% since
1:59:04
he bought it. They're
1:59:09
writing down the value of their shares. Who
1:59:11
all does he blame but himself? I love
1:59:13
that clip of him saying
1:59:15
it's the advertisers fault. He says the whole
1:59:18
world will know. The
1:59:20
Earth will know. The Earth will know. The
1:59:22
Earth will know. And then advertisers. You're on
1:59:24
Mars, Elon. Earth ended Twitter. Yeah.
1:59:29
Okay. Okay. Wired
1:59:31
is 30. Did you go to the party,
1:59:33
Paris? No. I
1:59:36
did not get invited to the Wired 30s party. It's
1:59:40
in San Francisco though, so I don't feel bad about
1:59:42
that. 30 years, it's hard to
1:59:44
believe. It's changed a lot since 30 years ago when
1:59:46
it first came out in the 90s. You
1:59:49
could barely read it. Yeah.
1:59:52
It had this
1:59:54
weird high contrast or low contrast
1:59:56
page design and all that stuff.
2:00:00
But then there's Line 92, the podcast police.
2:00:02
Oh, wait a minute. We already did that.
2:00:04
It would have fit in
2:00:06
so much better now. Wow. It would
2:00:08
have just been perfect right now. Oh,
2:00:12
shucks. The transition. We can
2:00:14
edit it around. I
2:00:19
think we have come to the end of the line.
2:00:24
Could that be? Wait. You're
2:00:26
not going to go to the super
2:00:28
important one of don't forget to wish
2:00:30
every horse in the Northern Hemisphere happy
2:00:32
birthday because all Northern Hemisphere horses have
2:00:35
the recorded birthday of January 1st.
2:00:38
That's super important and relevant. So
2:00:41
wait a minute. When they're a two-year-old, it means they're
2:00:44
not two years old from their birthday,
2:00:46
but two years old from January 1st
2:00:48
following their birthday? Yeah. What?
2:00:51
If that's how every horse that races
2:00:53
the Kentucky Derby is the same birthday,
2:00:56
January 1st. So
2:00:58
wish your local horses happy
2:01:00
belated birthday. Just
2:01:02
one more reason. The whole thing is bizarre.
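The January 1 rule they're joking about is simple enough to state precisely: a Northern Hemisphere racehorse's official age is just the calendar-year difference since it was foaled, regardless of its actual foaling date. A small sketch (function name ours):

```python
from datetime import date

def racing_age(foaled: date, today: date) -> int:
    """Official racing age under the shared January 1 birthday rule:
    every horse ages one year at New Year's, not on its real birthday."""
    return today.year - foaled.year

# A foal born in June 2021 is officially a 2-year-old for all of 2023,
# even before its real second birthday arrives in June.
print(racing_age(date(2021, 6, 15), date(2023, 2, 1)))  # prints 2
```

Which is why every horse in a given Kentucky Derby really does share the same official birthday.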
2:01:05
I think we should do the Google change log. You ready?
2:01:07
Here we go. The
2:01:10
Google change log. Nothing in the change log. What
2:01:16
is in the change log,
2:01:18
Leo? It's
2:01:24
absolutely nothing
2:01:26
in the changelog. We already did the one. That's
2:01:29
it. Thank you very much. Good
2:01:31
night, everybody. When
2:01:33
we come back, your picks of
2:01:35
the week. Prepare them, if you will. You're
2:01:38
watching This Week in Google.
2:01:41
A little plea, by the way, just
2:01:43
before we get back to
2:01:45
the action to take our survey.
2:01:47
We do this every year. Twit.tv
2:01:49
slash survey 24. It's
2:01:52
pretty quick, 10 minutes, but it
2:01:54
helps us understand you, what you like, what you don't
2:01:56
like. It's very important. We want to make sure
2:01:58
we're giving you the material you want. but also
2:02:00
helps us sell ads because we don't
2:02:02
give advertisers tracking information so
2:02:05
that means we have to give them aggregate information
2:02:07
like our audience is 58 percent male, is
2:02:09
28 percent college
2:02:12
educated whatever you know those kinds of
2:02:14
statistics so help us out, would you,
2:02:16
go to twit.tv slash survey 24 and
2:02:19
take the survey. You have until the end of the month but
2:02:21
don't put it off and I thank
2:02:24
you in advance we do this every year
2:02:26
is very very helpful Paris
2:02:30
I have two this week
2:02:32
one is the
2:02:34
Failure Museum. Oh, I
2:02:36
believe that's failure.museum, which
2:02:38
is an online collection of failed
2:02:41
companies and products that's really fun
2:02:43
one of my friends I think
2:02:45
I tweeted about it because they
2:02:47
recently added a mug from
2:02:49
Convoy, which was a recently
2:02:54
imploded freight and
2:02:56
shipping startup, but
2:02:59
it's just a very fun little
2:03:01
website cataloging all the failures from
2:03:03
Juicero and FTX to Blockbuster. Oh
2:03:05
we had these when I was a kid
2:03:08
JARTS lawn darts
2:03:10
with metal spikes banned
2:03:13
by the Consumer Product Safety Commission, and
2:03:16
rightly so these things look at
2:03:18
them they're big they're heavy and
2:03:20
they have sharp points yeah yeah
2:03:23
Not good. We had these when I was a kid. Jarts.
2:03:27
we didn't know. What, do you think the Second
2:03:29
Amendment requires that we... we should...
2:03:31
you think there's something in the
2:03:33
Constitution that says we have to have Jarts? We
2:03:36
have a well-regulated Jarts team
2:03:39
I love the FTX uh...
2:03:41
schwag that's kinda
2:03:44
nice I wouldn't mind having some
2:03:46
of that yeah FTX pool party
2:03:48
limited edition bobblehead of Sam Bankman-
2:03:51
Fried, Forbes magazine where
2:03:53
he's on the cover, fantastic
2:03:56
so by the way there was a story I put in that just
2:03:58
while we're on this, I'm not gonna go to a line number, but
2:04:01
his Anthropic investment is
2:04:04
actually gonna turn out so well all of his people
2:04:06
could all end up getting paid back. Oh my god. So
2:04:09
he wasn't such a scammer. Great
2:04:11
for shareholder value. Well I
2:04:14
think he invested himself but anyway like three
2:04:16
to four billion dollars and it
2:04:18
may the law firm has already gotten a
2:04:20
lot of money together so they may be
2:04:22
whole. Here's my favorite and I don't know
2:04:24
why Mattel killed this. It's called Growing Up
2:04:27
Skipper. You
2:04:29
twist her arm and her breasts grow. Oh
2:04:32
boy. See
2:04:36
her grow slim and tall
2:04:38
and curvy. It says
2:04:40
for little girl skipper turn her arm
2:04:43
all the way around clockwise then she's
2:04:45
cute and young again. Oh
2:04:48
wow. Not
2:04:50
old and sassy. Tall curvy
2:04:53
teenager you choose. The
2:04:56
Jeffrey Epstein vision. Growing up skipper
2:04:58
at the failure museum
2:05:01
this is so fun. I love
2:05:03
this. My second pick
2:05:05
is equally as fun. It's a letter
2:05:07
of recommendation for visiting your
2:05:10
local Medieval Times. Wait a
2:05:12
minute. On Friday me and
2:05:14
my recreational skee-ball team
2:05:16
went to the Lyndhurst, New
2:05:18
Jersey castle. We
2:05:20
all dressed up and it was honestly
2:05:22
a fantastic time. We were
2:05:25
mostly going there for the bit. We
2:05:27
were like oh medieval times it'll be
2:05:29
very funny but honestly they put on
2:05:31
a fantastic show. The horses danced. They
2:05:33
jousted. Are the knights cute? The
2:05:35
knights were cute also. There
2:05:38
were female knights as well who were
2:05:40
also cute. Did you enjoy
2:05:42
your turkey leg? I
2:05:45
did enjoy my turkey. Honestly
2:05:47
the food was fantastic. They had
2:05:49
a tomato bisque soup to begin.
2:05:53
No utensils right? No
2:05:55
utensils. All with your hands.
2:05:57
All slurp. Yeah. I love
2:05:59
them. I've
2:06:01
been to the Medieval Times in Southern California. You
2:06:04
wore the little hat, the crown? Oh,
2:06:06
we did wear the hat. I'll find
2:06:09
a photo for you guys. We
2:06:12
dressed up also. I like
2:06:14
you and your friends. I
2:06:17
really think that's awesome. You have a
2:06:19
good group of friends. That's fantastic. Yes,
2:06:21
we here. How do you know them?
2:06:24
Are they college buddies? Are they just people
2:06:26
you met on the street? Are there friends
2:06:28
just I know through? Journalists
2:06:30
maybe? Technically, all of these people are
2:06:33
on my recreational skeeball team.
2:06:35
You mentioned that. Bourgeois-ski.
2:06:37
Bourgeois-ski. We
2:06:40
play in the fall. I
2:06:43
know them originally, a couple of them
2:06:45
through journalism, but the rest don't work
2:06:47
in journalism. Here,
2:06:49
I'll post. I've seen your skeeball
2:06:52
pictures on your Instagram and
2:06:54
I realized this was a serious thing for you.
2:06:57
Oh, I mean, serious
2:07:00
is an interesting word. We
2:07:04
are a famously losing skeeball team.
2:07:06
This year we won zero games.
2:07:08
Yes, there's a league. There's a
2:07:10
league that we play in every
2:07:13
fall during the fall season. It's
2:07:17
called Volo. That's the
2:07:19
actual league. Yes,
2:07:21
thank you. That's when
2:07:25
we compete. When I
2:07:28
first joined this skeeball team about three years
2:07:30
ago, the Bourgeois-ski had never won a game.
2:07:32
I will say the first two years I was
2:07:34
there, we did win multiple games. Whether or not
2:07:36
that was related to me, I guess
2:07:38
is a good question because this
2:07:40
year we won zero, but a lot of
2:07:42
people are traveling for weddings. So
2:07:45
this skeeball game, for those of
2:07:48
you who don't know, you roll a
2:07:50
ball up an incline and
2:07:52
you try to get it into the target
2:07:54
areas. It's kind of like rolling
2:07:56
darts and you get points. Yeah. It's
2:08:00
something you'll often see in arcades or
2:08:02
you know in a corner of a bar.
2:08:05
Is it hard? It
2:08:08
is. Well it's
2:08:10
hard to be good at it because
2:08:13
the thing is it's
2:08:15
a very specific skill.
2:08:17
You both have to roll the ball up
2:08:19
with precision but then give it the right
2:08:22
amount of lift and it changes depending on
2:08:25
the machine. So it's kind of hard
2:08:27
to practice. Do you have uniforms for your
2:08:29
skeeball team? No, but
2:08:32
for next year we want to make
2:08:34
custom letterman jackets. Letterman jackets. I do
2:08:36
see some of these
2:08:40
teams have outfits. Hannah
2:08:43
they give us... Volo Sports is
2:08:45
like an intramural recreational sports league.
2:08:47
They give us little t-shirts. That
2:08:52
makes me want to go to the chat. I love it.
2:08:54
There's also Paris as
2:08:56
the medieval dining room wench.
2:08:59
Yes I did post photos of us
2:09:01
going wench mode. I did dress
2:09:03
up. Perfect.
2:09:08
And my outfit was pretty good. It's
2:09:10
very good. It's one of those things I had
2:09:12
to buy. In the next one
2:09:14
you can see my full outfit if you want.
2:09:16
There you go. Right next to the knight in shining armor.
2:09:20
I'm wearing a long skirt and
2:09:22
a little sash and I've got
2:09:24
a leather top. The leather top
2:09:26
is the one thing I bought. I
2:09:29
was gonna say if you owned that I would be a
2:09:31
little worried. No
2:09:33
everything else I didn't, you know. Everything else
2:09:35
was normal. Yeah this is cute. This
2:09:37
is so cute. We were
2:09:40
the only people dressed up at Medieval Times. Well,
2:09:42
the only adults. The only adults. Do they
2:09:44
have beer? Oh yes
2:09:47
they have alcohol. Oh good.
2:09:49
Oh good. What was the proportion of
2:09:52
ironic, irony-seeking adults
2:09:55
versus children and families? Um
2:09:58
I would say there were a
2:10:01
lot more children and people celebrating birthdays
2:10:03
than I would have thought. They
2:10:05
had a segment where they went through and
2:10:08
announced all the birthdays and it lasted five
2:10:10
to maybe ten minutes. Although
2:10:12
I think it could have been because if you're
2:10:14
buying a group package, you can put in
2:10:16
like a little message. And obviously if we
2:10:19
had had the foresight to do that, I'm
2:10:21
sure someone in our group would have celebrated
2:10:23
a fake birthday for one of us as
2:10:26
like humiliation. But honestly it was a
2:10:28
great show too. There was a
2:10:31
falconer; a falcon soared through the stadium
2:10:33
and did little tricks. And you
2:10:35
came to New Jersey. I
2:10:37
did come to New Jersey. We took
2:10:39
an Uber from Manhattan to the castle. It's
2:10:44
better in Jersey I think. The one in
2:10:46
Southern California is full of failed actors so
2:10:49
they really ham it up. Oh
2:10:52
yeah, the New Jersey one is
2:10:54
unionized. Oh, SAG-AFTRA, sure.
2:10:58
Or is it the Steelworkers Union? What
2:11:00
is the union? I
2:11:02
do think it might be SAG-AFTRA, I think
2:11:05
the performers union. Yeah, performers, interesting. We're
2:11:07
all Equity here. Jeff, your pick
2:11:09
of the week? Well,
2:11:13
you know it's the first part of the year.
2:11:15
There's just not much. So I hate demographics, and
2:12:17
I think that's the ruin of society, putting
2:12:19
people in buckets, but Axios
2:11:22
is insisting that there's a
2:11:24
new generation of... Oh, oh
2:11:26
no. The oldest are 13, the
2:11:29
youngest will be born in the coming year. The
2:11:31
first generation born entirely in this
2:11:34
century. Okay. So they're
2:11:37
going to be miserable young people who are
2:11:39
going to be mad at us for pandemics
2:11:42
and climate and all of
2:11:44
that. They're going to be unhappy. I
2:11:47
think there's probably a competition
2:11:50
among blogs to be the first to
2:11:52
name the generation and clearly Axios has
2:11:56
scooped everybody else on this one. They're
2:11:58
not the first generation to be born in this
2:12:00
century. This century is 23 years old. They
2:12:04
know. So that doesn't work at all. Yeah,
2:12:06
it's got to be. The first
2:12:08
entirely online cohort? No. Maybe they were born
2:12:10
of parents that were born in this century.
2:12:12
Yeah, I think that's more like it. The
2:12:14
children of Millennials is what it is. Born
2:12:18
between 2010 and 2024, it
2:12:21
is expected to be the largest in history at
2:12:23
more than 2 billion but of course that's
2:12:25
just because the world is getting bigger. Meanwhile everybody's
2:12:27
predicting stories this week too about how the population
2:12:30
is going to peak and go down. Everybody's
2:12:32
saying. Mostly they are the children
2:12:34
of Millennials. They
2:12:36
are the successor to Gen Z. That's why we have to
2:12:38
go to Greek letters. We're out of Roman
2:12:41
letters. So from Gen Z to
2:12:43
Gen Alpha. We started as X
2:12:45
though. We didn't start at the end of the alphabet.
2:12:47
Yeah, we could just go back. We
2:12:49
could start with A. Why? We could
2:12:51
start with X. Because
2:12:53
we're the coolest generation. Well, start
2:12:55
with baby boomers. No, no, baby boomers. Yeah, boomers.
2:12:57
I know, but as far as letters. No.
2:13:00
Was that the next after baby boomers? It was Gen X?
2:13:03
Yep. And then Gen Y
2:13:05
and then Gen Z. Gen Y
2:13:07
is the Millennials. Yeah. So
2:13:11
these Alphas are mostly
2:13:13
the children of Millennials. Which is
2:13:15
about right, yeah. What are you
2:13:17
guys? What do you mean
2:13:19
you guys? Me? No, not you. I
2:13:21
know you're the old part. I'm a baby boomer.
2:13:24
I'm a Gen X. He's a Gen X.
2:13:27
And you're Gen Y probably, Paris, right?
2:13:30
I'm on the cusp, depending
2:13:32
on who you ask.
2:13:34
I'm
2:13:37
either Millennial or Gen Z.
2:13:39
Right in the border there.
2:13:42
Well, there you have it. That was exciting.
2:13:44
Don't forget to get the audio edition of
2:13:47
Jeff's book, which is equally
2:13:49
exciting. You'll find it on
2:13:51
Audible and elsewhere. It's Jeff reading his own book.
2:13:54
And it will freak you out
2:13:57
that I speak like a human.
2:14:01
But you can speed it up so it's normal, and
2:14:03
you will get a headache. GutenbergParenthesis.com, that's also
2:14:05
where you can get his newest
2:14:07
book, Magazine, from the Object Lessons series,
2:14:11
and The Gutenberg Parenthesis. And
2:14:14
he's working on a new one, kids, he never
2:14:16
stops. Just like The Web
2:14:19
We... what's it
2:14:21
called? The Web We Weep? The Web We Weave.
2:14:23
That makes more sense. The
2:14:26
Web We Weep, that's the book I'll
2:14:28
write. Jeff Jarvis, ladies and gentlemen,
2:14:31
the director of the Tow-Knight Center
2:14:33
for Entrepreneurial Journalism at the Craig
2:14:37
Newmark Graduate School of Journalism at
2:14:39
the City University of New York, for now. Emeritus
2:14:43
soon. Soon. You've
2:14:46
got through August. Don't rush
2:14:48
it, don't rush it. Paris Martineau
2:14:50
is working hard on a
2:14:53
super secret project at
2:14:55
theinformation.com. If you're not yet
2:14:57
a subscriber, it's not too late
2:14:59
to subscribe. Is it scheduled? Do
2:15:01
we know when? I
2:15:03
cannot confirm or deny even
2:15:06
that. No deadline? I do
2:15:08
have to take some calls immediately after
2:15:10
this podcast. Do what you will with that information. Will
2:15:12
they give you a deadline?
2:15:17
It depends. Once,
2:15:19
like, the reporting and stuff is there, they certainly will schedule
2:15:22
something for it. I mean, I think a large part
2:15:24
of what my editor has to do is be like,
2:15:26
alright, Paris, time to come out of the rabbit hole
2:15:28
now, you've reported enough.
2:15:32
But that's an editor's job. So
2:15:34
cool. Well, I can't wait to see it,
2:15:37
and of course theinformation.com is
2:15:40
well worth subscribing to. I do, and so
2:15:43
does Jeff. We really love it.
2:15:46
It is one of the great sources. We converted. It's
2:15:48
actually, honestly, much more useful for
2:15:50
people covering technology than the Times
2:15:53
or even Bloomberg. It's really
2:15:55
good. Thank you
2:15:57
for that ladies and gentlemen thank
2:15:59
you for joining us and for
2:16:01
you Club Twit members, thank you so much
2:16:03
for making this show possible to our advertisers as
2:16:05
well. We do this week in Google on
2:16:07
Wednesdays, 2 p.m. Pacific, 5 p.m.
2:16:09
Eastern. That's 2100 UTC, sorry, 2200 UTC.
2:16:12
The website is twit.tv slash twig. We do
2:16:19
stream on YouTube during the live show, so if you
2:16:22
want to watch us do the
2:16:24
show, you can go to youtube.com/twit. Whenever
2:16:27
a show is in production, you'll see a
2:16:29
live stream there. After the fact,
2:16:31
download an episode from the website or
2:16:34
go to YouTube. There's a This
2:16:36
Week in Google channel on YouTube if you
2:16:38
like the video. You can also
2:16:40
subscribe to audio or video on your favorite podcast
2:16:42
client. We
2:16:44
like Pocket Cast, but, you know, pick the one
2:16:46
you like and subscribe. That way you'll get it
2:16:49
automatically. It's available. Don't forget the survey. And if you
2:16:51
haven't yet, join Club Twit. Thank
2:16:54
you for being here and we'll see you next
2:16:56
time on This Week in Google. Bye-bye. Hey,
2:16:59
I'm Rod Pyle, editor-in-chief of Ad Astra magazine, and
2:17:01
each week I join with my co-host to bring
2:17:04
you This Week in Space, the latest and greatest
2:17:06
news from the final frontier. We
2:17:08
talk to NASA chiefs, space scientists, engineers, educators,
2:17:10
and artists, and sometimes we just shoot the
2:17:13
breeze over what's hot and what's
2:17:15
not in space books and TV, and we
2:17:17
do it all for you, our fellow true
2:17:19
believers. So whether you're an armchair adventurer or
2:17:21
waiting for your turn to grab a slot in Elon's
2:17:23
Mars rocket, join us on This Week in Space and
2:17:26
be part of the greatest adventure of all time.