Episode Transcript
0:05
They had that like continuous Seinfeld
0:07
generator for a while. Oh
0:10
really? Yeah, I
0:12
forgot what happens. I
0:15
think it just went off and
0:17
started doing racist things though. It
0:19
started talking like Jerry Seinfeld. Yeah,
0:22
exactly. It started talking like Seinfeld
0:24
and Michael Richards. The
0:28
IDF is doing great things! Oh
0:31
lord. What's
0:33
the deal with these protesters?
0:35
Well, that's the deal. They
0:37
all got the same tent! Who's giving them
0:39
the tents? That
0:42
is his material. Sound
0:50
is personal, intimate, and emotive. Just
0:52
like this podcast. We
0:55
are AudioStack.ai. We
0:57
combine AI writing. The best synthetic
0:59
voices like ours. With
1:02
production like music and mastering. And
1:04
deliver them to be heard. Be it ads,
1:07
podcasts, or VOs for video. Just like this
1:09
ad you're listening to right now. However,
1:12
we have millions of spots just like this
1:14
on podcasts. And rather than hearing
1:16
from us, we want to hear from you. How
1:19
would you like to win an AI
1:21
audio campaign for free? Do
1:24
you work with businesses, products, events, or causes
1:27
that could benefit from free promotion on podcasts
1:29
in the coming month? Tell
1:31
us how you might use synthetic voices. Or
1:35
dynamically change ads for a news
1:37
podcast like this versus true crime,
1:39
history, or even comedy. Go
1:42
to AudioStack.ai/contest and
1:45
your company could be heard by millions. See
1:47
webpage for T's and C's. I'm
1:50
Katja Adler, host of The Global Story.
1:53
Over the last 25 years I've covered conflicts in the
1:55
Middle East, political and
1:57
economic crises in Europe, drug cartels
1:59
in Mexico. Now
2:01
I'm covering the stories behind the news
2:03
all over the world in conversation with
2:05
those who break it. Join me
2:08
Monday to Friday to find out what's
2:10
happening, why, and what it all means.
2:12
Follow the global story from the BBC
2:15
wherever you listen to podcasts. Here's
2:21
something you might not know about wireless.
2:23
Sometimes what you see isn't what you
2:26
get, but with visible what you
2:28
see is what you get. Switch to
2:30
visible the wireless company that makes wireless
2:32
visible. Get a one-line plan with unlimited
2:34
5G data powered by Verizon, just $25
2:37
per month, taxes and fees
2:40
included. Switch now at visible.com.
2:42
Monthly rate on the Visible
2:44
plan. For data management practices
2:46
and additional terms, visit visible.com.
2:51
Hello the internet and welcome to
2:53
season 344, episode 2 of The Daily Zeitgeist, a production
2:55
of iHeartRadio, and
2:59
this is a podcast where we
3:02
take a deep dive into America's
3:04
shared consciousness, America's deep brain if
3:07
you will. A little tip of the cap, because we're
3:09
big AI fans, a little, little spoiler, we're
3:11
big AI fans now, folks. I have
3:14
seen the light, come around. I
3:17
think when it's all you see on
3:19
social media, you're like, this is art.
3:21
Oh yeah, maybe this is cool. Yeah,
3:24
it's Tuesday, June 25th, 2024. Oh
3:28
yeah, big day. It's National
3:30
Catfish Day, which is odd
3:32
because it's also my partner's
3:35
birthday, I brag. Are you,
3:37
have you been catfishing me this whole time?
3:40
It's National Strawberry... Haven't met her in person still,
3:42
right? But one of these days. No, well,
3:44
and every time I want a video chat,
3:46
she says her phone's broken. Yeah, we just
3:48
kind of stick to the phone call stuff.
3:51
But it's also very normal at this stage
3:53
in a marriage. Very normal. Also, this is
3:55
so weird, and this is just like some
3:57
weird religious stuff. It's National League... did you
3:59
know this? Does
8:00
it? Yeah, it comes through, washes the kibble, has
8:02
a few bites, and then takes off into the
8:04
night. It's like a pit stop for one of
8:07
the raccoons in the night. Yeah. I'm leading a
8:09
raccoon-based Dungeons
8:12
& Dragons campaign starting next week. Oh,
8:14
for real? That's going to be like,
8:17
yes, it's going to be like a heist that
8:19
takes place in the warehouse where
8:21
I play roller derby. I'm
8:24
very excited. I've got it really architected.
8:28
I don't want to give any secrets in
8:30
case any of my player characters listen to
8:32
this podcast. Right. Okay.
8:34
That sounds amazing. And what a coincidence. Raccoons,
8:36
having a bit of a moment. Yeah, truly.
8:38
At least on this podcast. Yeah, they are.
8:41
So in addition to being
8:43
a host of the wonderful
8:45
Mystery AI Hype Theater 3000 podcast, which
8:49
podcast host being the highest honor one can attain
8:51
in American life, but you
8:53
both have some pretty impressive credits. Emily,
8:56
you are a linguist and professor at the
8:58
University of Washington, where you
9:00
are director of the Computational Linguistics
9:03
Laboratory. Yep, that's right. Do we
9:05
have that right? Okay. Alex, you
9:07
are director of research at the
9:10
Distributed AI Research Institute, both widely
9:12
published, both received a number
9:14
of academic awards, both have PhDs. We
9:16
had you on the podcast a few
9:19
months back, told everyone
9:21
the truth about AI that
9:23
a lot of the stuff that we're scared of
9:26
and a lot of the stuff we think it
9:28
can do is not true. It's bullshit. And
9:31
I sat back and was like, well, we'll
9:34
see what AI does after this one. And
9:38
it's just kept happening, you guys. What the
9:40
heck? What can we do? If
9:43
anything, it's gotten worse since we told
9:45
everybody the truth. What's happening? Truly.
9:48
You know, everybody seems to want
9:50
to believe. And it's absurd. Yeah. It
9:52
is so wild. Yeah. And
9:54
part of what we do with the podcast, actually, is
9:56
try to be a focal point for a community of
9:58
the people who are like... like, no,
10:01
that's not right. Why does everybody around me
10:03
seem to believe that it's like, you know,
10:05
actually doing all these things? Yeah. So
10:08
it's, yeah, it's, you know, it's
10:10
what we say in our podcast. Like every time
10:12
we think we've reached peak A.I. hype, the summit
10:14
of bullshit mountain, we discover there's worse to come.
10:16
Like it's not stopping. Yeah, right. Just keeps growing.
10:18
Oh yeah. This is just a base camp until
10:20
you get to the real peak. Like, right. Well,
10:22
it's just that you just keep on thinking that
10:24
it keeps on becoming... and
10:27
there's more and more things where
10:29
the CEOs just, you know,
10:31
really just say
10:33
incredible nonsense. I don't know if you saw
10:35
this. It was the
10:38
last week, the chief technology officer
10:41
of OpenAI, Mira
10:43
Murati. Murati. Yeah. Who
10:45
famously was, I
10:48
think it was an interview with 60 Minutes. And
10:50
when they were talking about one of their tools,
10:52
Sora, you know, they had asked if it would
10:55
ever replace filmmakers. Yeah. Exactly. Yeah.
10:58
Exactly. Sora. Thank you.
11:00
Yeah, exactly. This Sora really edging
11:02
out, you know, David Lynch these
11:04
days. And so, you know,
11:06
and they asked her, do you train the
11:08
stuff on YouTube, and she kind of
11:11
grimaced. Yeah. Painful. Yeah, we covered it. Yeah.
11:13
And I remember a great Twitter comment.
11:15
It was like, well, if you're going to
11:17
just lie about stuff, you at least have
11:19
to have a good poker face about it.
11:21
Yes. And so last week she
11:24
was, well, she was doing
11:26
another interview and she was like, well, some
11:29
creative jobs are going to go away.
11:31
Like some artists should be, you know,
11:33
please. She said some creative jobs maybe
11:35
shouldn't have existed in the first place.
11:38
Right. Like these jobs were an
11:40
affront to God or something. Some of
11:42
them just shouldn't have even been there.
11:44
But she does have a French
11:46
accent. So it's really hard to be like,
11:48
this is ridiculous. Well, she's Italian. And
11:51
that's what's amazing about her having a French accent. I
11:55
don't I'm not I'm not a cultured person.
11:57
I don't know the difference between. They're all
11:59
French to me, I'm American. Right.
12:02
Hey. She has a Canadian accent,
12:04
I think? I'm not sure. I know. It's
12:08
only plagiarism if it comes
12:10
from the French region of
12:12
Italy. That's right. That's
12:15
right. Yeah, we're going to get
12:17
into that story and just,
12:20
yeah. All of the madness
12:22
that has continued to happen, the bullshit
12:24
has continued to rain down
12:26
even harder, it seems
12:29
like. Yeah. Which, yes,
12:31
does make the mountain go higher, unfortunately, the
12:33
bullshit mountain. But before we
12:35
get to that, Emily, Alex, we do
12:37
like to ask our guests, what is
12:39
something from your search histories that's revealing
12:42
about who you are? Alex, you want
12:44
to kick us off? Oh,
12:46
gosh. Okay. I don't... The
12:49
thing is, I don't think... So
12:51
I use DuckDuckGo, and so it doesn't actually
12:54
keep the search history. And
12:56
if I actually look at my Google history, it's
12:58
actually going to be really shameful. It's going to
13:00
be me searching my own name
13:02
to see if people are
13:04
shit-talking me online. Now, this is just how
13:06
we tell if someone's honest, is if they
13:08
actually give that answer. We're like, okay, so
13:10
you are an honest person. You actually search
13:13
yourself, yeah. But I think
13:15
the last thing I actually searched was queer
13:17
barbers in the Bay Area, because I haven't
13:20
had a haircut in like a year, and
13:22
I think I need to trim up or
13:25
get air out the sides
13:28
of my head for Pride Month.
13:30
So that's the last thing I
13:32
searched. What
13:34
are you going? Are you going full shaved on
13:36
the sides? I think maybe trim it a little
13:38
bit, and trim it up the back, and bring
13:41
out the curls a little bit. Okay. Love
13:44
it. On board. I wish I could
13:46
bring out my curls. You've got a few more days in Pride Month
13:48
to get that done. I'm late. In
13:51
July, you're like, you do discounts? You
13:54
do discounts? I'm late. It's like
13:56
after Valentine's Day. Do
13:58
I get an undercut at 50% off now?
14:00
Right, exactly. Emily,
14:03
how about you? What's something from your search history? So
14:05
forgive the poor pronunciation of this and
14:08
the rest of the story because Spanish
14:10
is not one of my languages, but
14:12
champurrado is something I searched
14:14
for. Yeah. So I was in Mexico
14:16
City for a conference last week and at one
14:18
of the coffee breaks, they had coffee and decaf
14:20
coffee, and then they had champurrado
14:22
con chocolate oaxaqueño. And each of the
14:25
labels- You're kind of telling on yourself
14:27
on the Spanish pronunciation, by the way. Don't mean
14:30
to- Yeah, tell us about that. What do
14:32
you say when you see that word? Champurrado: Mexican
14:34
hot chocolate. All right.
14:36
So, yeah. You're literally reading the Google
14:38
results. Yeah. So
14:42
the labels all had like translations into English,
14:44
and so it was champurrado with Oaxacan chocolate.
14:46
I'm like, yeah, I got that. What's champurrado?
14:49
And so I look it up because I want to know
14:51
what I'm consuming before I consume it. And it's basically a
14:54
corn flour based thick drink. So
14:57
like chocolate corn soup, it was
14:59
amazing. Chocolate corn soup. You
15:01
had me until chocolate corn soup.
15:06
The corn is just a thickening
15:08
agent there. Thick chocolate
15:10
drink. Yeah. Thick chocolate drink with
15:12
a slight corn flavor. Like think corn
15:14
tortilla, not corn on the cob. Yeah,
15:17
yeah. Ooh, yeah, yeah, yeah. That sounds amazing. Yeah.
15:20
It was really good. I love some corn flakes
15:22
in a chocolate bar. Uh-huh. Yeah. So corn and
15:24
chocolate. There you go. You got to arrive in
15:26
your own way as to why that appeals to
15:28
you. That's right. So I'm back on board with
15:31
the thick corn chocolate drink. It
15:34
was really good. And just awesome that it was
15:36
there. The coffee breaks had the Mexican
15:38
sweet breads and stuff like that, but otherwise it was pretty
15:40
standard coffee break stuff. And then all of a sudden
15:42
there's this wonderful mystery drink. Yeah. One of the big
15:44
urns. It was lovely. That sounds great. What
15:46
is something you think is underrated, Emily? Um,
15:50
I think Seattle's weather is underrated.
15:53
Okay. Yeah. Everyone makes fun of
15:55
our weather and like, you know, fine. Believe that we don't
15:57
need lots of people coming here and it's true. It gets
15:59
dark in the winter, but like almost
16:01
any day you can be outside and you are
16:04
not in physical danger because you are outside. I
16:08
guess that's, I mean, if
16:10
you're going for, yeah, that's interesting. But I
16:12
mean, it's that, I mean, the winters are
16:14
just so punishing, though, it's so gray. It's
16:17
dark, but the weather is not going
16:20
to kill you. It
16:23
looks like shit, but experientially not
16:25
bad for you. I mean, yeah,
16:27
I know. It's, when does like,
16:30
it doesn't get all gloomy, I imagine in
16:33
the summer, right? You have wonderful blue skies
16:35
and you can enjoy the... The summers are
16:37
gorgeous. Yeah, summer's nice. Fire season aside. Right.
16:40
But yeah, from sort of mid-October to
16:42
early January, it can be pretty like,
16:44
it's gray. And so like when the
16:47
sun is technically above the horizon, it's a little hard to tell.
16:49
Yeah, right, right. So, but you know,
16:52
compared to like Chicago, where you have
16:54
maybe four livable weeks a year between
16:56
the too hot and the too cold.
16:59
Wow. Wow. Don't do that because my thing was going to
17:01
be Chicago because I was just there. And I
17:04
was going to say my answer was going to
17:06
be that Chicago is the
17:09
best American city. I
17:11
stand on this like 100%. For
17:15
two weeks out of the year, that's very true.
17:17
No, absolutely not true. No. Come
17:19
on. I'll even deal with the winter.
17:21
I'll deal with the winter. I mean,
17:23
if I... Okay, I'll
17:26
be honest. If I didn't, you know,
17:28
if the weather in Chicago, if I
17:30
could bring Bay Area weather to Chicago,
17:32
I would live in Chicago. I mean,
17:35
there's other reasons. But I mean, it's...
17:37
Look, the vibes immaculate.
17:40
Street festivals, the neighborhoods,
17:44
it's the one place that's probably... The food
17:47
still comparatively affordable compared
17:49
to the coasts. Radical
17:52
history. Just some of the
17:55
best politics. I would say... Didn't
17:59
they shoot The Fugitive there? Yeah, they shot,
18:01
what, what did they shoot there?
18:03
The Fugitive? Oh, they did.
18:05
That's a deep cut. Yeah. I
18:07
mean, I think they've shot a lot of Batman
18:10
movies there because the iconic lower
18:12
Wacker Drive and
18:14
they call it Gotham. Yeah.
18:17
That's pretty cool. Yeah. Great city.
18:19
Crappy weather. If you're
18:21
going to dump on weather, I'm like, everyone makes fun
18:23
of Seattle's weather. Honestly,
18:26
Emily, this is a
18:28
hot take. I'd rather take Chicago's
18:31
weather than Seattle's weather because I
18:33
can't do gray. I can
18:35
do crossfire. I can do
18:37
frigid. I cannot do gray.
18:39
It's too depressing for me.
18:42
Well, this is why I say don't move to Seattle if
18:44
you can't handle our weather. The people who move here and
18:47
then complain about the weather are the worst. Yeah. It's
18:49
like, what'd you expect? Yeah. All of
18:51
this, what they say is true about it being gray and
18:53
they're like, oh, I didn't expect it to be that gray.
18:55
Right. I think people talk
18:57
about it like that. All
18:59
right, Alex, let's stay with you. What
19:01
is you guys's overrated and please do
19:04
it in a point counterpoint style also
19:06
that contradicts
19:08
one another. Well, I got to
19:10
think about what's overrated these days.
19:12
I just don't
19:14
know what's in the, I know the name of the
19:16
show is the Daily Zeitgeist, but I don't really know
19:19
what's in the Zeitgeist. I mean, I guess
19:21
Taylor Swift, I mean, I don't really
19:24
have, maybe that's controversial. I'm saying something
19:26
that's hot take, but I guess that's
19:28
maybe not controversial to people
19:30
of our
19:32
generation. No. So, yeah.
19:35
Joining Dave Grohl
19:38
on the attack this weekend. Yeah. Wait,
19:41
what happened with Dave Grohl? Dave
19:43
Grohl was implying that she's like,
19:45
he's like, well, we play our
19:47
music live, like raw live rock
19:49
and roll, you know, unlike the
19:51
errors tour, you know, the errors
19:53
tour. And then everyone's like, fuck
19:55
you, Dave. Or other people being
19:57
like, exactly, exactly. Yeah. It's
20:00
just like, yeah. Yeah. I mean,
20:02
Dave Grohl is also overrated, I
20:04
guess. But I mean, I enjoyed,
20:07
look, I enjoy Everlong, like the
20:09
next, like, yeah, middle-aged
20:12
sort of like dad figure. But
20:16
I, you know, I'm sure like
20:19
I'm glad that you played every part in
20:21
that song. It sounds good, but, you know,
20:23
yeah, it doesn't make you an
20:26
authority on Taylor Swift. So, yeah. So I
20:28
think I'm undercutting my own point. No, let's
20:30
go, Dave. You did Dave Grohl.
20:32
You got a point in your own overrated. In
20:34
my own, yeah. Which is excellent, because I don't
20:36
even have an opinion about Taylor Swift. Never saw
20:38
Tucker Carlson do that. Yeah. Was
20:41
that what that show was called, Crossfire? Or was that? Crossfire
20:45
was with, what's his face? Tucker
20:47
Carlson and Paul Begala. That was
20:49
the one that Jon Stewart came
20:51
on and was like. Destroyed? Yeah.
20:54
It was like, this show is bad.
20:57
And then like they canceled it a couple of weeks
20:59
later. But then there was
21:01
that one show, Hannity &amp; Colmes,
21:03
where Sean Hannity was supposed to
21:06
be the conservative voice. And then, you
21:08
know, Colmes, like, I don't even know
21:10
the guy's first name. They kind of
21:13
just had him as
21:15
a token, like, liberal on. And then they
21:17
just, it was on Fox News, and he
21:19
attacked him relentlessly. He wasn't allowed to read
21:21
the news. He's like,
21:23
you argue the liberal points, but you're actually
21:26
not permitted to leave this room. We're going
21:28
to keep you in here, Oldboy style
21:30
for the... Oh, that was the end of 60 Minutes
21:32
that Andy Rooney would do. There
21:35
was a part of 60 Minutes that was Point/Counterpoint
21:37
and it would be Andy Rooney. If
21:40
that's what you're thinking, Jack. I don't know. There's
21:42
many. No, no, there was a show. Yeah. It
21:45
was right when I got out of
21:47
college and worked for ABC News. And
21:49
then at that time, there was a
21:51
big show on CNN called Crossfire. Yeah,
21:53
it was Tucker. But you are talking
21:55
about Crossfire. Yeah, Tucker Carlson was the
21:57
conservative. Paul Begala was the liberal. And
22:00
they just like got on and yelled
22:02
at each other. I'm looking at it now. This
22:05
is good. Apparently there was a,
22:07
they had
22:09
a revival. And then
22:11
in 2013 and '14, on the
22:13
left was Stephanie Cutter and Van
22:16
Jones, and then
22:18
Newt Gingrich and S.E. Cupp on
22:20
the right. And then
22:22
whatever. And then whenever they needed breaking
22:24
news, they'd bring in Wolf Blitzer for
22:26
some reason. Because
22:29
Wolf Blitzer. Extracting him out of the Situation
22:31
Room. Yeah. Yeah. Yeah. They released him from
22:33
the cryogenic. He was helicopter
22:35
lifted from the Situation Room, three
22:37
rooms over to the Crossfire set,
22:40
just deadpan. We need you. We
22:42
need you. No hint of emotion
22:44
on his face ever. You guys ever
22:46
seen the Wolf Blitzer episode
22:49
of Celebrity Jeopardy? No. Oh,
22:51
dude, he's so bad at it. Do yourself a
22:53
favor. Is it? It
22:55
hasn't been scrubbed yet? Is
22:58
it as good as like the
23:00
SNL parodies, the Celebrity Jeopardy with
23:02
like the Sean Connery? It's just
23:04
so bad. Yeah. Just no,
23:06
no. And also
23:08
incorrect. Yeah. One
23:10
after another. Because he had, like, negative, went
23:14
into the red. He's in the red.
23:16
Very quickly. In Final Jeopardy, well, Wolf, we're going
23:18
to spot you $3,000 because we
23:20
can't have somebody be in negative
23:22
numbers going into Final
23:25
Jeopardy. And I think Andy
23:28
Richter was on with him
23:30
and just destroyed him. It was so
23:33
good. That's so funny. Andy
23:35
Richter, like the
23:37
kind of crossover I didn't know I needed. Yeah,
23:40
it's still up there mostly from what I could
23:42
tell. It's on YouTube. Yeah. I
23:44
am an old person. All
23:47
right. We still have Emily's
23:49
overrated. What do you think
23:51
is overrated? Big cars are overrated. Oh, totally.
23:54
We're sort of halfheartedly looking for our next car
23:56
and can't find anything that is like reasonably
23:58
small. And the other day I was in
24:01
the parking lot for a grocery store near
24:03
here. Mostly I can walk for groceries, but
24:05
occasionally I have to drive to this other store. And
24:08
half the spots were labeled compact. And
24:11
all of those spots were taken up two at
24:13
a time by what we
24:15
now have as regular cars. Because somebody's
24:17
decided that people in this country don't
24:19
deserve normal size cars. Yeah.
24:22
There's so... I mean, it's to the point where
24:24
even the people who design parking lots are like,
24:26
we have to tell the automobile manufacturers. The
24:29
standard we've set as people who create
24:31
parking lots, they're pushing the boundaries of
24:33
what we can actually do or how
24:35
we measure things. Because the
24:37
cars are so fucking big. And
24:40
our streets around here in Seattle, we have a
24:42
lot of neighborhood streets where there's like parking on
24:44
both sides and then sort of just barely enough
24:46
space for two normal cars to go through. Right.
24:49
Or sometimes you have to pull over to let the other car through, and the bigger the car
24:51
is, the harder that gets. I love that thing.
24:53
I remember one of the times I went to
24:55
Seattle seeing how everybody just parks on whatever side
24:58
of the street in whatever direction they want. I
25:00
was like, all right. I'm like, all
25:02
right, Seattle. I was
25:04
not familiar. That's fun. Yeah. A
25:07
little bit of chaos. It totally offends my spouse
25:09
who's like, that's not how parking works. But
25:11
that's it. Yeah. I
25:14
love it. Yeah. Automakers
25:16
just seem to be making them bigger and heavier.
25:18
They won't stop until they make a car
25:20
that is legally required to have a fog
25:22
horn on it. Right. So
25:26
the Cybertruck? I was going to
25:28
ask, have you considered the Cybertruck?
25:30
I've seen one in person. They
25:32
are hilarious. Like you can't not laugh.
25:34
Exactly. It is
25:36
an experience seeing one in the wild. It's
25:38
like, wow. I just want to say
25:40
that what we really need is functional public transit. Yeah.
25:43
But short of that, we also need to not be doing bigger and
25:45
bigger cars. Yeah. Yeah. No,
25:48
I just, I mean, I have a truck. I have a 2020
25:51
truck and I wish, I really
25:53
wish it was much smaller because it's hard
25:56
to park. It's way too big.
25:58
I mean, I think the peak of
26:00
truck design was like a 1987 Toyota
26:02
Tacoma long cab. Yes.
26:08
Just where like, yeah, you got to bunch up
26:10
your knees in the back if you want to
26:12
fit four people on it, but you
26:15
actually had a long, you actually had a
26:17
truck bed that actually had
26:20
some carrying capacity. And
26:24
it was a car you could absolutely run
26:26
into the ground with no problems at all.
26:28
Yeah. Oh yeah. My new
26:30
Ford Lightning needs a software update. Oh, God. Jesus
26:32
Christ. Well, that's the thing is like, yeah, I
26:34
mean, like, I mean, that's a big deal. I
26:37
know in like Oregon, which was like leading the way,
26:39
they had a right to repair bill. And
26:42
I mean, in some ways, the people that were
26:45
kind of into it, weirdly, were like, Google
26:47
actually came out kind of for it. There
26:49
was a good 404 Media
26:52
podcast where they talked about this, and
26:54
Apple, because they have such a
26:56
closed ecosystem, was so against right to
26:58
repair. Even
27:00
if you have right to repair, they'd actually add
27:03
on all these things where you'd still have to
27:05
send it to an authorized dealer because of firmware issues
27:07
or whatever. Right. Right. And
27:10
then John Deere, like John Deere is this kind of thing where they
27:12
have so much of their tractors
27:15
are computerized. And so there's like
27:17
a lot of like these John
27:19
Deere hacking kinds of things. Lots of people
27:22
who are outside of the US, you
27:24
know, programming these kinds of hacks for
27:26
people running these tractors who can't get
27:28
into their firmware. Yeah.
27:32
Yeah. The farmers have all the good
27:34
GPS, though, or something. But did you
27:36
hear about how the GPS was out for
27:38
a while with this? Oh, yeah. I
27:41
heard about that. This is again 404
27:43
media podcast, but the way those tractors
27:45
work for like planting is so precise.
27:47
Yeah. And the GPS. To
27:50
this end, I think. The GPS off, they basically
27:52
couldn't plant because then the seeds wouldn't be in
27:54
the right spot for the next process. Yeah. And
27:57
so they had to wait and there's a really
27:59
narrow window apparently with our, you know, currently genetically
28:01
modified, very, very specific corn that Monsanto
28:03
partly owns. And so it was actually looking pretty bad
28:06
for a while. I didn't hear any follow-up, so
28:08
maybe the solar flare was short enough and the
28:10
GPS came back online, but apparently that was a
28:12
big thing. Yeah. It
28:14
has to be a brief window because it goes
28:16
from corn seed planted
28:19
in the ground to popcorn in the
28:21
movie theater in two and a half
28:23
weeks. Yeah. It's a hot, hyper-engineered corn.
28:27
Most of it doesn't even go to popcorn in
28:29
the movie theater, most of it goes to animal
28:31
feed or ethanol, I think. Yeah. Right.
28:33
Right. Yeah. All right. Well, let's
28:36
take a quick break and we're going
28:38
to come back and dive into why
28:40
Miles and I are excited
28:42
about the future of AI. We'll
28:45
be right back. Crossfire!
28:54
Zeitgang, customers are rushing to
28:56
your store. Do you have a
28:58
point of sale system you can
29:00
trust or is it a real
29:02
POS? You need Shopify for retail.
29:04
Shopify POS is your command center
29:06
for your retail store. With Shopify,
29:08
you get a powerhouse selling partner
29:10
that effortlessly unites your in-person and
29:12
online sales into one source. Connect
29:15
with customers in line and online.
29:17
Get hardware that fits your business.
29:19
Take payments by smartphone. Transform your
29:21
tablet into a point of sale
29:23
system or use Shopify's POS Go mobile
29:25
device for a battle-tested solution. Plus, Shopify's award-winning
29:27
help is here to support your success every
29:29
step of the way. Do retail right with
29:31
Shopify. I was looking at their website just
29:34
trying to see, look, how do I find
29:36
the things that I need as somebody who
29:38
would potentially have a retail business? And surprisingly,
29:40
very easy to navigate. They have all the
29:42
things that you need answers to right there
29:45
on the website. So definitely check it out.
29:47
So sign up for a $1 per
29:50
month trial period at
29:52
shopify.com/TDZ, all lowercase. Go
29:54
to shopify.com slash TDZ to
29:56
take your retail business to the next
29:58
level today. shopify.com slash... Sound
30:02
is personal, intimate, and emotive. Just
30:05
like this podcast. We
30:07
are AudioStack.ai. We
30:10
combine AI writing. The best synthetic
30:12
voices, like ours. With production, like
30:15
music and mastering. And
30:17
deliver them to be heard, be it
30:19
ads, podcasts, or VOs for video. Just
30:22
like this ad you're listening to right now.
30:24
However, we have millions of spots just like
30:26
this on podcasts. And rather than hearing from
30:28
us, we want to hear from you. How
30:31
would you like to win an AI
30:33
audio campaign for free? Do
30:37
you work with businesses, products, events, or
30:39
causes that could benefit from free promotion
30:41
on podcasts in the coming month? Tell
30:44
us how you might use synthetic voices. Or
30:47
dynamically change ads for a news
30:49
podcast like this versus true crime,
30:51
history, or even comedy. Go
30:54
to AudioStack.ai/contest and your company
30:56
could be heard by millions.
30:59
See webpage for T's and C's. I'm
31:03
Katja Adler, host of The Global Story.
31:05
Over the last 25 years, I've covered
31:07
conflicts in the Middle East, political
31:10
and economic crises in Europe, drug
31:12
cartels in Mexico. Now
31:14
I'm covering the stories behind the news all
31:16
over the world in conversation with those who
31:18
break it. Join me Monday
31:20
to Friday to find out what's happening,
31:22
why, and what it all means. Follow
31:25
The Global Story from the BBC wherever
31:27
you listen to podcasts. And
31:38
we're back. We're back. So just
31:41
to, for people who haven't
31:44
listened to your previous appearance
31:46
in a while, I feel
31:48
like a broad overgeneralization, but it
31:51
feels like the stuff that AI is
31:53
actually being used for and capable of
31:56
is not what we're being told
31:58
about through the mainstream media. It
32:01
is not an autonomous intelligence that is
32:03
going to be the bad guy in
32:05
a Mission Impossible movie. I mean, it
32:08
is a bad guy in a Mission Impossible movie,
32:10
but it's not going to be a bad guy
32:12
in reality. In our relationship. Yeah. The
32:15
way that our actual president believes in this.
32:19
That was an amazing reveal that
32:21
Joe Biden basically watched Mission Impossible
32:23
and was like, we got to
32:25
worry about this AI stuff, Jack.
32:29
It's going to know my next move. But
32:31
it is like the
32:33
large language models are basically
32:35
more sophisticated autocomplete. That is
32:38
telling you what its data
32:40
set indicates you want
32:42
to hear or what its data set indicates will
32:45
make you think it is thinking, talking like
32:47
a person. In many
32:49
cases, that means what they call hallucinating,
32:51
which is actually just making shit up.
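[A rough sketch of that "sophisticated autocomplete" idea, added for illustration. Real large language models use neural networks trained on far more data, but the training objective is the same: predict a plausible next token from the data. The toy corpus and names here are our assumptions, not from the episode.]

from collections import Counter, defaultdict

# Toy autocomplete: count which word follows which in some training text.
corpus = "the cat sat on the mat the cat ate the snack".split()

next_word = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_word[prev][nxt] += 1

def complete(word: str) -> str:
    # Return whatever most often followed `word` in the training data,
    # true or not -- the model has no notion of facts, which is why
    # confident fabrication ("hallucination") falls out naturally.
    counts = next_word[word]
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(complete("the"))  # -> "cat", the most frequent continuation in the data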
32:54
What other jobs could you say that you're
32:56
like, sorry, that was hallucinating. Oh,
32:59
okay. Oh, good. You
33:03
wouldn't last long as a precog.
33:05
Yeah. I
33:08
would be the worst precog. On the
33:10
IRS side, hallucinating on that last tax return.
33:12
Can I get a do-over? Well,
33:15
can people talk about using this to do
33:17
your tax returns? Yeah, right. Yeah, there's actually,
33:20
I mean, in California, there's whatever
33:22
the Department of Tax and Revenue,
33:25
there was some great reporting on CalMatters
33:28
by Khari Johnson. He
33:30
was talking about how they were using
33:32
this thing, some language
33:35
model, to effectively
33:37
advise the agents
33:40
who respond to people who call into the California
33:43
Franchise Tax Board, and they're like,
33:45
well, the agents are still going
33:47
to have the last word. But
33:53
they're overworked. They're
33:55
going to read this stuff verbatim. Right,
33:59
exactly. Oh, you're going to use this as an
34:01
extra thing, just an extra
34:03
expense to do the product, do
34:05
your job even better? That doesn't
34:07
sound like a company necessarily. Yeah.
34:11
Yeah. So an interesting thing
34:13
that we're seeing happen, we pay
34:15
attention when there's an AI story
34:17
that captures the zeitgeist. We
34:20
covered the B minus version of a
34:22
George Carlin routine that came
34:24
out, they were like, AI just brought George
34:26
Carlin back from the dead. We
34:29
covered Amazon Fresh, having that
34:31
store where the cameras know
34:33
what you've taken. And so even if you
34:35
try and shoplift, like the cameras know, they're
34:37
going to catch it. And then you don't
34:39
even have to check out, you just walk
34:42
out and they like, it charges your account
34:44
because of AI. And then
34:46
what we're seeing is that
34:48
when the truth emerges, it
34:50
does not enter the zeitgeist. You
34:53
guys cover it on your show, which is why we're so
34:56
thrilled to have you back. But we
34:58
have updates on those two stories. Carlin, that
35:00
was just written by a person. The
35:04
Amazon Fresh, those videos were
35:06
being fed to people
35:08
working in India to try
35:11
to track where everything
35:13
was going, which was why there was like
35:15
a weird pause, like as
35:17
people were there like, oh, I
35:19
think we got, okay, yeah, we're just gonna
35:21
do a best guess. But
35:23
it's straight up like Mechanical Turk. Which
35:27
again, Amazon named one of their
35:29
companies the Mechanical Turk.
35:31
So they know what's going
35:33
on. They knew what they were planning to do
35:36
here all along maybe. But is
35:38
that kind of the model you're seeing is
35:41
big flashy announcement. This is
35:43
what AI integration can do. And
35:45
then when it falls short, people just
35:47
kind of ignore it. Or
35:50
how does it seem from where you're sitting? Yeah,
35:53
we haven't seen really good
35:55
implosions yet. And it's surprising because
35:58
like the stuff that goes wrong goes like really,
36:00
really wrong. And people are like, yeah,
36:02
well, it's just in its infancy, which
36:04
is a really, really annoying metaphor. Because
36:06
it first of all suggests that this
36:08
is something that is, like
36:10
a human, like an animal at least
36:13
that's a baby and can grow. It's
36:15
something that is learning over time. And
36:17
also sort of pulls on this
36:19
idea that we should be kind to these systems,
36:21
because they're just little babies, right? And so if
36:23
something goes wrong, it's like, well, no, that's just,
36:26
it's still learning. And we get all of these
36:28
appeals to the future, like how good it's going
36:30
to be in the future. And there is, at
36:32
this point, I think so much money sunk into
36:35
this, that people aren't ready
36:37
to like, let go and own up
36:39
to the fact that yeah, so, and
36:41
it is, I guess, too easy to
36:44
hire exploited workers for poor pay, usually
36:46
overseas to like, backstop the stuff. There's
36:48
also so you gave us the Amazon
36:50
Go stores actually being monitored by people
36:52
in India. There was one of the
36:55
self driving car companies admitted that their
36:57
cars were being supervised by workers in
37:00
Mexico. And remember the stats
37:02
on the, yeah, yeah, it was, it
37:04
was, so Kyle Vogt, the
37:06
CEO of Cruise. And
37:08
he had said, and then there was this reporting on
37:10
the New York Times, where
37:12
they said, you know, they use they use
37:14
humans. And then he was
37:16
like, well, wait, wait, wait, wait, you're, you're really blowing
37:19
this out of proportion. We only use it something
37:22
of three to five percent of the time.
37:24
Like, that's a huge, huge
37:27
amount. And
37:29
he posted this himself on Hacker
37:31
News, which is this, you know, kind
37:33
of like, I don't know, 4chan for
37:35
tech bros, I guess, well, I guess
37:37
for 4chan is 4chan for tech
37:39
bros. But I mean, it's, you know, but
37:42
like, with a little less overt racism, I
37:44
guess. Just a little. Yeah, just a slightly.
37:46
Yeah, it was still yeah, but we're seeing
37:48
this in a lot of different industries. At
37:50
the end of the day, it's just, this
37:53
is outsourcing humans. Janet Vertesi is
37:56
a sociologist at
37:58
Princeton. She has a piece
38:00
in Tech Policy Press, which the
38:02
title is something like AI is
38:04
just forecasting, or is just outsourcing
38:07
2.0 effectively. Yeah,
38:09
we're seeing a lot of the same patterns that
38:11
we saw in the early
38:14
90s when these business process
38:16
outsourcing or BPO organizations were really
38:18
becoming all the rage in the
38:20
US. Right. The other thing
38:22
that I see a lot too is I felt
38:24
early on, especially when we were talking about it,
38:27
the thing that intrigued us was when everyone was
38:29
like, dude, this thing's going to fucking end the
38:31
world. That's how powerful AI is. I
38:34
have a whole plan to take
38:36
myself off this mortal plane if I have
38:38
to, the moment in which AI becomes sentient
38:41
and takes over. I
38:43
think it felt like maybe the markets were like, hey
38:45
man, you're scaring the kids, man. Do we have another
38:47
way to talk about this? I feel like recently I
38:49
see more of like, together when
38:52
we harness human intelligence with AI,
38:54
we can achieve a new level
38:56
of existence and ideation that has
38:58
not been seen ever in the
39:00
course of human history. I
39:03
saw that in the Netflix J-Lo
39:06
movie where the entire crux of
39:08
the film was this AI skeptic,
39:11
had to embrace the AI in order
39:13
to overcome the main problem, conflict in
39:15
the film, or just even now, like
39:18
with the CTO of OpenAI also doing
39:20
a similar thing when talking about how
39:22
AI, some creative jobs
39:24
are just going to vanish, but that's
39:26
because when the human mind harnesses the
39:28
power of the AI, we're going to
39:30
come up with such new things. That
39:33
feels like the new thing, which is
39:35
more like we got to embrace it
39:37
so we can evolve into this next
39:39
level of thinking, etc, computation or whatever.
39:41
You guys on, sorry, I was just going
39:43
to say Mystery AI Hype Theater 3000 reads
39:46
the research papers so that we don't have
39:48
to, and Miles watches the J-Lo movies so
39:50
that you don't have to. I'm glad you're
39:52
watching the J-Lo because there's so
39:56
many different cultural touchstones of this. Yeah. I
39:58
had to look it up, because I thought
40:01
the movie you were talking about was the
40:03
autobiography, This
40:06
Is Me Now, a Love Story. I'm
40:08
like, there's a film? I was
40:10
like, there's an AI subplot in
40:12
that? Yeah. I didn't
40:15
know that JLo's life was a
40:17
complete cautionary tale
40:20
about AI and
40:22
the inevitability of it. Sorry,
40:26
Emily was about to say something. I just want to
40:28
be snarky. Our
40:31
colleagues, Timnit Gebru and Émile
40:33
Torres, coined this acronym TESCREAL, which
40:36
stands for a bundle of ideologies that are
40:39
all very closely related to each other. What's
40:41
interesting about the transition, you notice that they've
40:43
basically moved from one part
40:45
of the TESCREAL acronym to another. It's
40:48
all this stuff that's based on these
40:51
ideas of eugenics and really
40:53
disinterest in any actual
40:55
current humans in the
40:57
service of these imagined people
41:00
living as uploaded simulations in the far
41:02
long future. It's utilitarianism
41:04
made even more ridiculous by being taken
41:06
to an extreme endpoint. This thing like
41:09
it's going to kill us all comes
41:12
partially from the long-termism part of this, which
41:14
people are fixated on this idea of, we have
41:16
to, and it's ridiculous. They have
41:18
a specific number, which is 10 to the 58,
41:21
who are the future humans who are going to live as uploaded
41:24
simulations in computer systems
41:26
installed all over the galaxy. These
41:29
are people who clearly have never worked in IT support.
41:32
Because somehow the computers just keep running. Yeah, it'll
41:34
be fun. Yeah. The
41:36
idea is that if we don't make
41:39
sure that future comes about, then
41:42
we collectively are missing out on the happiness of those
41:44
10 to the 58 humans. That's such
41:46
a big number that it doesn't matter what happens now. I
41:49
always say when I relate the story that I wish I
41:51
were making this up, but there are
41:53
actually people who believe this. That's where the like, oh
41:55
no, it's going to
41:57
end us all stuff lives. before
54:38
at BetMGM. Signing up and playing is so
54:40
easy. Simply sign up using Code Buckeye and
54:43
receive up to $1,500 back in bonus bets
54:45
if you don't win your first bet. When
54:47
you register with BetMGM you can get instant
54:49
access to a variety of parlay selection features,
54:51
live betting options, and the best daily promotions
54:53
in the business. And with BetMGM at your
54:55
fingertips, every play and every game matter more
54:57
than ever. Place your money line, prop, and
54:59
parlay bets with the King of Sportsbooks today.
55:01
Sign up using Code Buckeye and receive up
55:03
to $1,500 back in bonus
55:06
bets if you don't win your first bet. That's
55:08
right! specific
1:02:00
technology is, but more someone who's like, learns
1:02:02
how to harness technology for this
1:02:05
other specific aim. Yeah. Yeah.
1:02:08
So surveillance is not synonymous with safety.
1:02:10
Like the one kind of use
1:02:12
case for the word surveillance that I
1:02:14
think actually was pro public safety is
1:02:16
there is a long-term study
1:02:18
in Seattle called the Seattle Flu Study.
1:02:21
And they are doing what they call surveillance testing
1:02:23
for flu viruses. So they get volunteers to come
1:02:25
in and get swabbed and they are keeping track
1:02:27
of what viruses are circulating in our community. Right.
1:02:30
I'm all for surveilling the viruses. Yeah. Especially
1:02:32
if you keep the people out of it. Yeah. I
1:02:35
would add a wrinkle to that just because I
1:02:37
think that, I mean, there's a lot of surveillance,
1:02:39
I mean, that's the kind of technology, that's the
1:02:41
kind of terminology they use with health surveillance to
1:02:43
detect kind of virus rates and
1:02:45
whatnot. I would also add the
1:02:47
wrinkle that like a lot of those, you
1:02:49
know, organizations are really distrusted by
1:02:51
marginalized people. Like what are you going to
1:02:53
do? What's it mean? Especially
1:02:56
thinking like, you know,
1:02:58
like lots of trans folks and
1:03:00
like, especially like under
1:03:02
housed or unhoused trans folks and just like, you're going
1:03:04
to do what? You want this data from me for
1:03:06
who? You know? Right. Yeah.
1:03:09
Yeah. Understandably. Especially because surveillance
1:03:11
in general, like, is not a safety
1:03:14
thing, right? It is maybe a like
1:03:17
safety for people within the walls of the
1:03:19
walled garden thing, but that's not safety, right?
1:03:22
That's, yeah. The other thing
1:03:24
about this is that what we call
1:03:26
AI these days is predicated on enormous
1:03:28
data collection. Right. And so
1:03:30
to one extent, it's just sort of an excuse to
1:03:32
go about claiming access to all that data. And
1:03:35
once you have access to all that data, you can do things
1:03:37
with it that have nothing to do with the large language models.
1:03:40
And so there is, you know, this is I
1:03:42
think less, typically less immediately like threatening to
1:03:44
life and limb than the applications that Alex
1:03:47
was starting with. But there's a lot of
1:03:49
stuff where it's like, actually,
1:03:51
we would be better off without all that
1:03:53
information about us being out there. And
1:03:56
there's an example that came up recently. So did you
1:03:58
see this thing about the system called Recall that
1:04:01
came out with Windows 11. This
1:04:03
is such a mess. So
1:04:06
initially it was going to be by default turned on.
1:04:08
Oh yes. This is the
1:04:11
Adobe story too. Yeah. Yeah. Every five seconds it
1:04:13
takes a picture of your screen. Then
1:04:15
you can use that to like using AI, search
1:04:18
for stuff that you've sort of, and their example is something stupid. It's
1:04:20
like, yeah, I saw a recipe, but I don't remember where I saw
1:04:22
it. So you want to be able to search back through your activity.
1:04:25
Like zero thought to what this
1:04:27
means for people who are victims
1:04:29
of intimate partner violence. Right.
1:04:32
That they have this surveillance
1:04:34
going on in their computer that eventually
1:04:36
ended up being shipped as off by
1:04:38
default because the cybersecurity folks pushed back
1:04:41
really hard. By folks, I don't mean
1:04:43
the people at Microsoft, I mean the people out in the
1:04:45
world who saw this coming. Yeah. But that's another example of
1:04:47
like surveillance in the name of
1:04:49
AI that's supposed to be the sort of
1:04:52
helpful little thing for you, but like no thought to
1:04:54
what that means for people. It's like, yeah, we're just
1:04:57
going to turn this on by default because everybody wants
1:04:59
this obviously. Right. It's like, no, I
1:05:01
know how to look through my history actually.
1:05:03
I've developed that skill. I
1:05:06
don't need you to take snapshots of
1:05:08
my desktop every three seconds. But your
1:05:10
shows covered so many upsetting ways that,
1:05:14
it doesn't seem like it's people implementing
1:05:16
AI, it's companies implementing AI in a
1:05:18
lot of cases to do jobs
1:05:21
that it's not capable of doing. There's
1:05:24
been
1:06:02
incorrect obituaries. Grok,
1:06:04
the Elon Musk one, the Twitter
1:06:06
one made up fake headlines about
1:06:09
Iran attacking Israel and put
1:06:11
them out as a major trending
1:06:13
story. You have this great anecdote
1:06:16
about a Facebook chatbot
1:06:18
AI responding to
1:06:20
someone had this very specific
1:06:22
question. They have a gifted
1:06:24
disabled child. They're like, does
1:06:26
anybody have experience with a
1:06:28
gifted disabled, like, 2e
1:06:31
child with like this specific
1:06:33
New York public school program
1:06:35
and the chatbot responds. Yes, I
1:06:38
have experience with that and just like
1:06:40
made up because they knew that's what that's
1:06:42
what they wanted to hear. And fortunately,
1:06:44
it was like clearly labeled as an AI
1:06:46
chatbot. So the person was like, what, what
1:06:49
the black mirror? Yeah, but
1:06:52
World Health Organization, you know,
1:06:54
eating disorder institutions replacing therapists
1:06:56
with AI, like you
1:06:58
just have all these examples
1:07:03
of this being used
1:07:05
where it shouldn't be and things
1:07:08
going badly. And like
1:07:11
there's a detail that I think
1:07:13
we talked about last time about Duolingo, where
1:07:18
they moved to a model where they let
1:07:20
AI take over some of the stuff
1:07:22
that like human teachers and translators were
1:07:24
doing before. And you made
1:07:27
the point that people who are learning the
1:07:29
language who are beginners are not in a
1:07:32
position to notice that the quality has dropped.
1:07:34
Yeah. And I
1:07:36
feel like that's what we're seeing
1:07:38
basically everywhere now is just the
1:07:40
internet is so big, they're just
1:07:42
using it so many different places
1:07:45
that it's hard to catch them
1:07:47
all. And then there's not an
1:07:49
appetite to report on
1:07:51
all the ways it's fucking up.
1:07:54
And so it just everything is
1:07:56
kind of getting slightly
1:07:59
to drastically shittier at once.
1:08:03
And I don't know what to do with that. I
1:08:06
would say, yeah, well, go ahead, Emily.
1:08:09
What you do with that is you make fun of
1:08:11
it. That's one of our things, is ridicule as praxis, to
1:08:14
try to keep the mood up, but also just show it
1:08:17
for how ridiculous it is. And
1:08:19
then the other thing is to really seek out
1:08:21
the good journalism on this topic, because so much
1:08:23
of it is either fake journalism output
1:08:25
by a large language model these days, or
1:08:28
journalists who are basically practicing access journalism, who are
1:08:31
doing the churnalism thing, who are reproducing press releases.
1:08:33
And so finding the people who are doing really
1:08:35
good critical work and supporting them, I think is
1:08:37
super important. Yeah. But Alex, you were going to
1:08:39
say. Well, I was, well, you just teed me
1:08:41
up really well, because I was actually going to
1:08:43
say, you know, some of the people who
1:08:45
are doing some of the best work on it are like
1:08:48
404 Media. And
1:08:50
I want to give a shout out to them
1:08:52
because they're, you know, these folks are basically, you
1:08:55
know, they were at Motherboard, and
1:08:58
Motherboard, you know, or
1:09:01
the whole Vice empire
1:09:03
was basically, you know,
1:09:05
sunsetted. So they laid
1:09:07
off a bunch of people. So
1:09:10
they started this kind of journalist
1:09:12
owned and operated place. And, you
1:09:14
know, that focuses specifically on tech
1:09:16
and AI. And these folks
1:09:18
have been kind of in the
1:09:20
game for so long. They know
1:09:23
how to talk about this stuff without really
1:09:25
having this kind of being bowled
1:09:27
over, you know, there's
1:09:29
people who play that access journalism,
1:09:32
like Kara Swisher, who like kind
1:09:34
of poses herself as this person
1:09:36
who is very antagonistic. But
1:09:39
like, you know, right off the bat just like
1:09:41
fawning over like AI people. Yeah, like
1:09:43
all the time. Yeah, I trusted Elon
1:09:45
Musk, and it's like, well, why
1:09:47
did you trust this man in the
1:09:50
first place? Did you know
1:09:52
I was reading the Peter Thiel
1:09:55
biography, The Contrarian, and,
1:09:57
you know, and like it's
1:09:59
a very. It's a very harrowing read. I
1:10:02
mean, it's fascinating, but it was very harrowing.
1:10:04
It wasn't a hagiography. It was pretty
1:10:06
like critical. But like, you
1:10:09
know, they discuss the PayPal
1:10:11
days, you know, 24 years ago, when,
1:10:14
you know, Elon Musk was like, well,
1:10:16
I want to rename PayPal to X.
1:10:19
And then everybody was like,
1:10:21
why the fuck would you do that?
1:10:24
People are already using, people are using
1:10:26
PayPal as a verb. You
1:10:28
know, this is effectively the same thing you did
1:10:30
with Twitter. Like, yeah, people are talking about
1:10:32
tweet as a verb. Why would you? He's,
1:10:34
you know, just been like an
1:10:36
absolutely vapid human being with
1:10:39
no business sense anyways
1:10:42
That was a very long way of saying
1:10:44
Kara Swisher sucks, and then
1:10:48
also saying that there's lots of
1:10:50
folks. There's a number of
1:10:52
folks doing great stuff. So I mean, folks
1:10:54
at 404, Karen Hao, who's independent
1:10:57
but had been at The Atlantic and MIT
1:11:00
Tech Review and the Wall Street Journal, Khari
1:11:02
Johnson, who was at WIRED, is
1:11:04
now at CalMatters. There's a lot of people
1:11:06
that really report on AI
1:11:09
from the perspective of like the people
1:11:11
who it's harming, rather than
1:11:13
starting from, well, this tool can do X,
1:11:15
Y, and Z, right? You know, as if we really
1:11:17
should take these groups at their claims. But
1:11:20
yeah, I mean the larger part of it
1:11:22
is, I mean, there's just so much stuff
1:11:24
out there, you know, and it's so
1:11:26
hard, and it is like whack-a-mole. And I
1:11:28
mean, we're not journalists by
1:11:31
training. I mean, we're sort of doing a
1:11:34
journalistic thing right now, commentary,
1:11:36
where I think,
1:11:38
I think I would not say we are
1:11:40
journalists. I always say we are doing a journalistic
1:11:43
thing. We
1:11:48
are not doing original reporting, sure, sure, but
1:11:50
it is... Well, and you
1:11:52
know, I would, you know, I'm not, I
1:11:54
don't know, I'm not the, I don't
1:11:56
know who decides this in the court of journalism,
1:11:58
but, you know, it is reporting insofar as
1:12:01
looking at original papers and effectively
1:12:03
being like, okay, this is
1:12:05
marketing. This is why it's
1:12:07
marketing. Yeah, there's no there
1:12:09
there. Yeah. Rather than a
1:12:11
whizbang CNET article or something
1:12:14
that comes out
1:12:16
of a content mill and says,
1:12:18
Google just published this tool
1:12:20
that says you can find 18
1:12:24
million materials that are completely new.
1:12:26
Well, it's like, okay, well, let's look at those claims
1:12:29
and upon what grounds do those
1:12:31
claims stand, and how often
1:12:35
that's on pretty poor footing.
1:12:35
I like to think of what we're doing is, first
1:12:38
of all, sharing our expertise in our specific
1:12:40
fields, but also modeling for people how to
1:12:42
be critical consumers of journalism. So
1:12:46
journalism adjacent, but yeah, definitely without training
1:12:48
in journalism. Yeah, yeah. But I think
1:12:51
we want to do the M&M article math, I
1:12:54
mean. Oh my gosh.
1:12:56
There's this article that has like
1:12:58
broken our brains because it just
1:13:00
has this series of sentences. That
1:13:03
I don't know that like, is everything is degrading
1:13:05
like journalism. There's that story about like the Daily
1:13:07
Mail was like Natalie Portman was hooked on cocaine
1:13:09
when she was at Harvard. You're like, no, that
1:13:12
was from that rap she did on SNL. And
1:13:14
that was like a bit, but because it gets
1:13:16
ingested, this thing just regurgitates it. And then the Daily
1:13:18
Mail had to be like, at the end they
1:13:20
corrected it. They're like, she was not. That was
1:13:22
obviously satirical and that was due to human
1:13:25
error. Like they really leaned into that. You're like,
1:13:27
no, yeah, of course. Did I tell you about the
1:13:29
time that a fabricated quote of mine came out
1:13:31
of one of these things and was printed as
1:13:33
news? No. No. So I
1:13:36
also like Alex have searched my own name because I
1:13:38
talk to journalists, and I like to see
1:13:40
what's happening. And I had, there was something in an
1:13:42
outfit called Bihar Prabha that attributed this quote to me,
1:13:44
which was not something I'd ever said. And
1:13:47
not anybody I ever remembered talking to. So I
1:13:49
emailed the editor and I said, please take
1:13:51
down this fabricated quote and print a retraction because
1:13:53
I never said that. And they
1:13:55
did. So the article got updated, removed the thing
1:13:57
attributed to me. And then there was a
1:14:00
thing at the bottom saying we've retracted this, but
1:14:02
what they didn't put publicly, but he told me
1:14:04
over e-mail, that the whole thing came out of
1:14:06
Gemini. They
1:14:08
posted it as a news article. Of course. The
1:14:11
only reason I discovered it was it
1:14:13
was my own name, and I never
1:14:16
said that thing. Well, I need your
1:14:18
expertise here to decipher this Food and
1:14:20
Wine article that was talking about how
1:14:22
M&Ms was coming out with a pumpkin
1:14:24
pie flavored M&M, but very
1:14:27
early, normally pumpkin pie flavored things don't enter
1:14:29
the market till around August, like around when
1:14:31
fall comes. But M&Ms- This is why we
1:14:33
were covering it, because we are journalists. Yes,
1:14:35
we are journalists. We cover the important stories.
1:14:38
In May, pumpkin spice already?
1:14:40
No. But again, they
1:14:42
were saying this is because apparently
1:14:44
Gen Z and millennial consumers are celebrating
1:14:47
Halloween earlier, but this is this
1:14:49
one section that completely- Wait, wait, can
1:14:51
we back up? What? Yeah.
1:14:54
I don't know. That's what they're saying according
1:14:56
to their analysis. So
1:14:58
let me read this for you. Quote,
1:15:02
the pre-seasonal launch of the milk
1:15:04
chocolate pumpkin pie M&Ms is a
1:15:06
strategic move that taps into Mars
1:15:08
market research. This research indicates that
1:15:10
Gen Z and millennials plan to
1:15:12
celebrate Halloween by dressing up and
1:15:14
planning for the holiday about 6.8
1:15:16
weeks beforehand. Well,
1:15:18
6.8 weeks from Memorial
1:15:20
Day is the 4th of July. So you
1:15:22
still have plenty of time to latch onto
1:15:24
a pop culture trend and turn it into
1:15:26
a creative costume. I
1:15:29
don't- That's chaos. It's all chaos.
1:15:31
That's chaos, right? It doesn't make
1:15:33
any sense. I know. Look, wait. Wait,
1:15:36
I'm fixing this. I'm
1:15:38
fixating on 6.8. Exactly.
1:15:41
I'm fixating on it too. What does that even mean?
1:15:43
What the fuck does that mean? And where did
1:15:45
Memorial Day come from in that? And what is
1:15:48
6.8 weeks from Memorial Day? Because
1:15:50
it's not any of the days that they said
1:15:52
it was. They said July 4th. And
1:15:55
also 6.8 weeks isn't
1:15:57
a real amount of time. That's 47.6
1:16:00
days. What is even a 6.8 week?
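For what it's worth, the hosts' math checks out. Here's a quick sanity check, with one assumption of ours: that "Memorial Day" means Memorial Day 2024, which fell on May 27 (the article doesn't say which year it means).

```python
from datetime import date, timedelta

# 6.8 weeks in days: 6.8 * 7 = 47.6 days -- not a round number of days,
# let alone weeks
days = 6.8 * 7

# Memorial Day 2024 fell on May 27 (our assumption, not the article's)
memorial_day = date(2024, 5, 27)

# date + timedelta drops the fractional day, so this lands on July 13
print(memorial_day + timedelta(days=days))  # 2024-07-13 -- not July 4

# The actual gap from Memorial Day to July 4 is 38 days, about 5.4 weeks
print((date(2024, 7, 4) - memorial_day).days / 7)  # 5.428571...
```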
1:16:05
So if this were real, it's
1:16:07
possible that they surveyed a bunch of people
1:16:09
and they said, when do you start planning
1:16:11
your Halloween costume? Those people gave dates and
1:16:13
then they averaged that. That's how you could
1:16:15
get to it.
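As a minimal sketch of that averaging point, with invented numbers (not Mars's actual survey data), a tidy-sounding fraction like 6.8 weeks falls right out of a mean:

```python
from statistics import mean

# Hypothetical survey responses: days before Halloween each person starts
# planning. These numbers are made up for illustration only.
lead_times_days = [30, 35, 42, 45, 50, 56, 60, 63]

avg_weeks = mean(lead_times_days) / 7  # 47.625 days, about 6.8 weeks
print(round(avg_weeks, 1))  # 6.8
```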
1:16:17
I get that. That's fair. But also, it totally
1:16:19
sounds like someone put into a large
1:16:21
language model, write an
1:16:23
article about why millennials and Gen
1:16:25
Z are planning their
1:16:28
Halloween costumes earlier. It sounds
1:16:30
like that. But also just so odd
1:16:32
to say, well, 6.8 weeks from Memorial
1:16:34
Day is the 4th of July. This
1:16:36
article didn't even come out. It came
1:16:38
out after Memorial Day. It's
1:16:41
just nothing made sense. I was like,
1:16:43
I don't fucking understand what they're doing
1:16:45
to me right now. But again, this
1:16:48
is the insidious part for me about it. So
1:16:50
this appeared in Food and Wine? This is in
1:16:52
Food and Wine magazine with a human in
1:16:55
the byline. I actually DM'd this person on
1:16:57
Instagram, and I said, do you mind just
1:17:00
clarifying this part? I'm a little bit confused
1:17:02
and I've gotten no response.
1:17:05
I'm wondering if it's because I know
1:17:07
that there was some good coverage in
1:17:09
Futurism, and they were
1:17:12
talking about this company called Advon
1:17:14
Commerce, and the way that
1:17:16
this company has been basically
1:17:19
making AI generate
1:17:22
articles for a lot of
1:17:24
different publications, usually on
1:17:26
product placement. So
1:17:30
it makes me think it's like, because
1:17:32
Food and Wine may have
1:17:34
been one of their, I forgot the article,
1:17:37
but they had Better Homes and
1:17:40
Gardens and these legacy titles like
1:17:42
that. So I don't know if
1:17:44
it's something like that or
1:17:46
this journalist said, write me this
1:17:48
thing and I'm just going to drop it and then
1:17:50
go with God. Yeah.
1:17:55
My other favorite example of AI is this
1:17:57
headline I saw somewhere. It's no big secret.
1:18:00
why Van Vaught isn't around anymore. And with
1:18:02
a picture of Vince Vaughn, but they just
1:18:04
got his name completely
1:18:06
wrong. Yeah. Well, I can't find it.
1:18:10
It's no big secret. Why
1:18:13
Van Vaught isn't around anymore. You
1:18:19
know, if I was just scrolling and I'd
1:18:22
say, yeah, I liked Van Vaught
1:18:24
in The Internship. And
1:18:29
then I would have looked at it and then I
1:18:31
would have double-taked. I'm like, wait, wait, wait. Did
1:18:33
he co-star with Owen McWilson
1:18:35
or something? Yeah, yeah, yeah, yeah. Russell
1:18:38
Wilson was in that. I
1:18:40
think it was the Adweek report that you're
1:18:42
thinking about. So Futurism did a bunch of
1:18:44
it, but then Adweek had the whole thing
1:18:47
about Advon and I can't quite get through
1:18:49
it. No, it was Futurism. It was Futurism,
1:18:51
yeah. Because Adweek had the thing on this
1:18:53
program that Google was offering
1:18:55
and it didn't have a name. Oh, right. Yeah.
1:18:57
So Advon was Futurism. Yeah, but it totally sounds
1:18:59
like one of those. But it is
1:19:01
happening, yeah. Yeah. See, I thought you were going
1:19:03
to talk about the surveillance by M&M thing. We
1:19:06
said M&Ms. So this was somewhere
1:19:08
in Canada. There was an M&M vending machine
1:19:10
that was taking pictures of the students while
1:19:12
they were making their purchases. And I forget
1:19:14
what the ostensible purpose was, but
1:19:17
the students found out and got
1:19:19
it removed. Probably freaked out
1:19:21
and made a big deal about it. Students,
1:19:24
are we right? Well,
1:19:27
I feel like we could talk to you guys
1:19:29
once again for three hours. There's
1:19:32
so much interesting stuff to talk about.
1:19:34
Your show is so great. Thank you
1:19:36
both for joining. Where can
1:19:38
people find you, follow you,
1:19:40
all that good stuff. Emily, we'll start
1:19:42
with you. Well, first, there's
1:19:44
the podcast, Mystery AI Hype Theater 3000,
1:19:46
wherever you find any podcast, you can
1:19:49
find ours. And we've also started a
1:19:51
newsletter. If you just search Mystery
1:19:53
AI Hype Theater 3000 newsletter, I think it'll turn
1:19:55
up. And that's an irregular newsletter,
1:19:57
where we basically took the things that used
1:20:00
to... be sort of little tweet storms. And
1:20:02
since the social media stuff has gotten
1:20:05
fragmented, we're now creating newsletter posts with
1:20:07
them. So it's off-the-cuff discussions
1:20:09
of things. On Twitter,
1:20:12
X and Mastodon and Blue
1:20:15
Sky, I'm Emily M. Bender. And
1:20:17
I'm also reluctantly using LinkedIn
1:20:19
as social media these days. So the news
1:20:22
that I need is... It's gonna
1:20:24
be the last one. It's gonna be the one
1:20:26
that survives them all because we... I know. ...some
1:20:28
people kinda need it. Really the cockroaches of social
1:20:30
media on the website. Yeah.
1:20:33
Yeah. Yeah. Yeah. Yeah. I'm
1:20:35
at Alex. Oh, you're Alex? Yeah.
1:20:37
Alex Hanna, H-A-N-N-A on
1:20:39
Twitter, Blue Sky. I barely
1:20:43
use Blue Sky or Mastodon, but Twitter is
1:20:45
the best place to find me. Also
1:20:48
check out DAIR,
1:20:51
D-A-I-R, hyphen, institute.org. We're
1:20:54
also dair_institute on Twitter,
1:20:57
Mastodon. And we're not on
1:20:59
Blue Sky yet, but we're
1:21:02
on LinkedIn. But that's where
1:21:04
you learn a lot about what our institute's doing, lots
1:21:08
of good stuff, amazing colleagues, and
1:21:11
whatnot. Yeah. Amazing. And
1:21:13
is there a work of media
1:21:15
that you've been enjoying? Yes.
1:21:18
I've got one for you. This, I think, started off
1:21:20
as a tweet, but I saw it as a screencap
1:21:22
on Mastodon. So it's by Lama in a text. And
1:21:24
the text is, don't you understand that the human
1:21:26
race is an endless number of monkeys? And every
1:21:29
day we produce an endless number of words. And
1:21:31
one of us already wrote Hamlet. That's
1:21:35
really good. That's such
1:21:38
a hyper-specific piece of media.
1:21:42
I think last time I was on this,
1:21:45
I was plugging Worlds Beyond Number, which is
1:21:47
a podcast, which I'm just absolutely in love
1:21:49
with, which is a Dungeons
1:21:51
and Dragons actual play podcast. But it's
1:21:53
got amazing sound production. I
1:21:56
would just plug everything on
1:21:58
dropout.tv. I mean, it's a streaming
1:22:00
service, honestly. Sam Reich, who is
1:22:03
the son of Robert Reich,
1:22:05
liberal darling and
1:22:10
former Secretary of Labor in
1:22:13
the Clinton administration, has
1:22:15
turned CollegeHumor into
1:22:17
an array of really
1:22:19
great comedians. So
1:22:22
they're putting out a lot of great stuff.
1:22:24
So I'd say Make
1:22:26
Some Noise. It's coming out with a new season today,
1:22:28
which is a really great improv
1:22:30
comedy thing. And yeah,
1:22:33
let's just go with that. So just
1:22:35
plug in. Those Very Important People interviews
1:22:38
are hilarious. Those Very Important People interviews, Vic
1:22:40
Michaelis. I named one of
1:22:42
my chickens Vehicular Manslaughter, after
1:22:45
an inside joke there, and another one,
1:22:48
Thomas Shrigley. So yeah,
1:22:50
just incredible, incredible stuff. Yeah.
1:22:53
Shout out to Sam. He's one of the
1:22:55
best. Miles. Yes. Where
1:22:57
can people find you? Is there a
1:22:59
work of media you enjoy? Wherever they have
1:23:01
at symbols, look for at Miles of
1:23:03
Gray. I'm probably there. You
1:23:05
can find Jack and I on our
1:23:08
basketball podcast, Miles and Jack Got Mad
1:23:10
Boosties, where we've wrapped up the
1:23:12
NBA season and I have tears streaming
1:23:15
down my face with pain and anger
1:23:17
as the Celtics win again. And also
1:23:19
if you want to hear me talk
1:23:21
about very serious stuff, I'm talking about
1:23:23
90 Day Fiancé on my other show,
1:23:25
420 Day Fiancé, which you can check
1:23:28
out wherever they have podcasts. A tweet
1:23:30
I like. First
1:23:32
one is from a past
1:23:34
guest, Josh Gondelman, who tweeted, I
1:23:37
bet the best part of being in a
1:23:39
throuple is you have someone to do all
1:23:42
three Beastie Boys parts at karaoke. I
1:23:45
guess that's one way to look at that. And
1:23:47
then another one from another past guest, Demi
1:23:49
Adejuyigbe, @electrolemon, uh, got his account
1:23:51
hacked and he tweeted, hi, hello, it's Demi.
1:23:53
I got my account back. Uh, I feel
1:23:55
the need to clarify that under no circumstances
1:23:57
should you ever believe that I or
1:24:00
anybody on this website is selling cheap
1:24:02
MacBooks for charity or otherwise. And
1:24:04
what benefit would my signature do to
1:24:07
a laptop? So yeah, thank you for
1:24:09
clarifying it. I actually remember because I
1:24:11
followed Demi and I remember when his
1:24:13
account got hacked and I thought, man,
1:24:16
that's really, and I, at first I
1:24:18
thought it was a bit because Demi
1:24:20
is hilarious. But then I'm just like,
1:24:22
what the hell? It's funny, his follow-up
1:24:24
tweet was, for anyone who
1:24:26
thought I was doing a bit, what's
1:24:28
the punchline? My
1:24:31
jokes are never so obtuse. I love it
1:24:33
when you pay off. I want you to
1:24:35
know it wasn't all that funny and I
1:24:37
want you to know quick. Yeah, no, I
1:24:39
was also trying to find out what the
1:24:41
punchline was. Right, right. Yeah. Wait for it.
1:24:43
Wait for it. He's so funny
1:24:45
that part of you wants to be like, well, hold
1:24:47
on. What are you doing here? Yeah, what's the deal
1:24:49
here? You don't want
1:24:52
to immediately just dismiss Demi because he's
1:24:54
such a great comedic mind. Yeah. But yeah,
1:24:56
if you do want good Demi content,
1:24:58
the Who's Welcome at the Cookout, you
1:25:00
can find that, some Dropout content that you can
1:25:02
get for free on YouTube. There you
1:25:04
go. We have been enjoying
1:25:07
Sleepy, @sleepy_nice, who tweeted:
1:25:09
It's absurd that Diddy Kong wears
1:25:11
a hat that says Nintendo, patently
1:25:13
ridiculous. There's no way he understands
1:25:16
the significance. It would be like
1:25:18
me unknowingly wearing a hat that
1:25:20
coincidentally depicts the true form of
1:25:22
the universe. That's
1:25:27
incredible. Oh my
1:25:29
God. It's
1:25:32
so fucking good because yeah, the second he
1:25:34
showed up, you're like, I don't know. Yeah,
1:25:36
I know. Brandon, he likes
1:25:39
Nintendo. You can find
1:25:41
me on Twitter at Jack underscore O'Brien.
1:25:43
You can find us on Twitter at
1:25:45
daily zeitgeist. We're at the daily zeitgeist
1:25:47
on Instagram. We have a Facebook fan
1:25:49
page and a website, dailyzeitgeist.com,
1:25:51
where we post our episodes and our
1:25:53
footnotes, where we link off to the
1:25:55
information that we talked about in today's
1:25:57
episode, as well as a song that we
1:26:00
think you might enjoy. Miles,
1:26:02
what song do you think people might enjoy?
1:26:30
When did it actually come out? Is it recent? Hey
1:27:01
there girls, where are you going?
1:27:03
And they're like, down to the
1:27:05
beach is where we're going. But
1:27:10
there's this like charm to it and the
1:27:13
instrumentation is cool. So anyway, this is The
1:27:15
Beach Nuts with "Out in the Sun
1:27:17
(Fail)." All right. Well,
1:27:19
we will link off to that in the footnotes. The
1:27:21
Daily Zeitgeist is a production of iHeartRadio.
1:27:23
For more podcasts from iHeartRadio,
1:27:26
visit the iHeartRadio app, Apple Podcasts, or
1:27:28
wherever fine podcasts are given away for free. That's
1:27:30
going to do it for us this morning.
1:27:32
We're back this afternoon to tell you what
1:27:34
is trending and we will talk to y'all
1:27:37
then. Bye. Bye. Bye.