Episode Transcript
0:00
Politics has never been stranger or more
0:03
online, which is why the Politics team
0:05
at Wired is making a new show
0:07
Wired Politics Lab. It's all about how
0:09
to navigate the endless stream of news
0:12
and information and what to look out
0:14
for. Each week on the show we'll
0:16
dig into far-right platforms, AI
0:19
chatbots, influencer campaigns, and so
0:21
much more. Wired Politics Lab launches Thursday,
0:23
April 11. Follow the show wherever
0:25
you get your podcasts. Hey
0:32
y'all. I'm Errin Haines, the host
0:34
of The Amendment, a brand-new weekly
0:36
podcast on gender, politics, and power brought
0:38
to you by The 19th News and
0:41
Wonder Media Network. You've probably heard the
0:43
news that this election year, our democracy
0:45
is at stake. On The Amendment, I'm breaking
0:47
down what that actually means, specifically for
0:50
the marginalized folks who depend on
0:52
democracy the most. This
0:54
is a show that digs past the headlines
0:56
and gets clear on the unfinished work of
0:58
our democracy. Listen to The Amendment
1:01
now wherever you get your
1:03
podcasts. What
1:06
basically happens on these questions now
1:08
is that they're highly geared toward being
1:10
anti-incumbent. In this month's Harvard
1:12
Harris Poll, sixty-two percent
1:15
of Americans say that the
1:17
country is on the wrong track.
1:19
But if I had shown you that
1:22
data four years back, it would
1:24
be nearly identical, but it would be flipped.
1:34
What Could Go Right? I'm
1:36
Zachary Karabell, the founder of The
1:38
Progress Network, joined as always
1:40
by Emma Varvaloucas, the executive director
1:42
of, yes, you've got it, The Progress
1:44
Network. And this is our weekly podcast
1:46
where we look at what could go right as
1:48
opposed to what could go wrong, where we try to
1:50
look at what's going well in the world, or
1:52
at least to look at people that are
1:55
taking a more nuanced,
1:57
sophisticated, calm, or saner,
2:00
less outraged, less fearful, less
2:02
hysterical, less partisan, less
2:04
political, less dystopic, less dystopian
2:06
view of what is going on in
2:09
the world. And one of the
2:11
ways that we all assess what's going on
2:13
collectively is through this
2:15
very modern, modern as in the
2:17
past century, institution known as
2:19
polls, where a group of people go around
2:22
and ask another group of people what they
2:24
think of X. And polling has become one
2:27
of the tools that we
2:29
use to try to figure out
2:31
whatever we are as opposed to whatever
2:34
I am. So we're going to talk
2:36
today to someone who I think probably does a better
2:38
job given that much of that polling
2:40
or at least a portion of that polling
2:42
in a political election year is
2:44
driven entirely by horse race mentality. Who's going to
2:47
win? Who's not? Who's going to
2:49
get the most votes? Who isn't? So Emma, who are we going
2:51
to talk to today? Emma Varvaloucas: So today we're talking
2:53
to John Gerzema. He's the CEO of
2:55
the Harris Poll and he's a pioneer
2:57
in the use of data to identify
2:59
a particular social change. And Harris Poll
3:01
does a lot of different kinds of
3:03
polling, in particular helping companies anticipate new
3:06
trends and demands, but also lots of
3:09
general polling about Americans, their mood, what
3:11
they care about, what they're looking at.
3:13
He has written for numerous different publications
3:15
and is also the author of a
3:17
couple books, among them the Athena Doctrine.
3:20
So let's go see what John has to say
3:22
about polls. Let's do it. Welcome,
3:25
John, to our podcast. Welcome to
3:28
What Could Go Right. I thought
3:30
it would be interesting. This
3:32
is a spontaneous thought and therefore
3:36
maybe much less interesting in retrospect than
3:38
it feels pungently in the moment. But
3:41
tell us about the history of the
3:43
Harris Poll. Like, how did it come
3:45
about? Who created it? Who was Harris?
3:47
What were his goals? Was it different
3:49
than Gallup or Pew or anything else?
3:51
John Gerzema: Oh, that's an excellent question.
3:53
You'd have to go back 60 years,
3:55
but Lou Harris started the company up
3:57
in Rochester, New York. And
3:59
if you imagine that time, these
4:01
were the giants of industry. You
4:04
had the Harris Poll, you had
4:06
Kodak, you had just a
4:08
few other companies like that; they
4:11
were doing amazing things upstate. But
4:13
he had these sort of simple beliefs:
4:16
that the public's voice mattered
4:18
in leadership, and that leaders need
4:20
to be held to account. That
4:22
sort of mission was to try to bring
4:24
the public's voice into the debate
4:26
on politics, and later into
4:29
business. So he was John
4:31
Kennedy's pollster. That's sort of how
4:33
he got started. And he built a
4:36
really fascinating polling company.
4:39
But by the time of his passing
4:41
in 2016,
4:44
Harris sadly sat inside Nielsen and it
4:46
wasn't even the Harris Poll anymore. They'd taken
4:49
the name down and, in
4:51
their wisdom, they called it Nielsen
4:53
Consumer Insights. We were able to
4:55
acquire the company as part of our holding
4:57
company, which is called the Stagwell Group. And
5:00
the Harris Poll came roaring back. So proud
5:02
of the work we do. So.
5:04
I feel like polls, and
5:06
what they tell us, are
5:08
brought into the public conversation a lot,
5:11
but I guess until two thousand and
5:13
sixteen, I don't remember there being a
5:15
public conversation about polls themselves.
5:17
Right around then there was sort of a downfall-
5:20
of-the-polls moment, to the
5:22
point where people began to ask, and I
5:24
do mean literally saying, what's the
5:27
point? So I wanted to give you
5:29
an opportunity to give a full-
5:31
throated defense of polls. Got
5:34
it. So that's good. That's my job. Yeah, so.
5:37
The 2016 election was
5:39
probably the Janet
5:42
Jackson, Justin Timberlake, that's a
5:44
pretty dated reference, you know,
5:46
where everything that could go wrong
5:48
went wrong. There were a
5:51
myriad of reasons in retrospect that
5:53
have been widely reported, but I
5:55
think the really important thing to
5:57
think about with when
5:59
polling goes bad is there are a lot
6:01
of really bad polling firms, and you can
6:03
see that even if you cast back to
6:06
that instance in 2016. You
6:08
had, obviously, Trump voters wildly
6:10
under-measured, right? They didn't
6:12
weight their samples, right? They
6:15
made simple mistakes, like not ever having seen
6:17
that many Trump voters; people in
6:20
rural parts of the country weren't
6:22
online and weren't available for online
6:24
polling. So, simple things like that. Then
6:26
there was obviously response bias. And
6:28
there's a lot of different biases that go into
6:30
polling, but response bias
6:33
could be due to all kinds
6:35
of sub-biases, and one of them is
6:37
likability. And so that would create shy
6:40
Trump voters. Looking out
6:42
into the future for this
6:44
election, it's not shy Trump voters
6:46
necessarily; it may be shy Biden
6:48
voters. So, people that
6:50
won't really say what their opinions are but
6:52
might show up. Polling is incredibly difficult,
6:55
but the people that get it right, we
6:57
like to think we're part of that. We
6:59
were very accurate on the election last
7:01
time, as were many other pollsters.
7:03
You have to have the right resources. You
7:06
can't wing it. To be honest, it's
7:08
not a good sport to be in. You
7:10
mentioned that Lou Harris initially founded
7:12
the company and was Kennedy's pollster, and after
7:14
that I get a little confused about
7:17
the party-affiliation part
7:19
of polling organizations. You see,
7:21
like, Stanley Greenberg was, you
7:23
know, Clinton's Democratic pollster, and
7:25
there are Republican pollsters on the other side.
7:28
And I would think that, from
7:30
a political calculus, there's zero for parties to
7:32
gain from having
7:35
a partisan read of what people might
7:37
do in the future. Meaning,
7:39
isn't it in everybody's interest to actually figure out
7:41
where people's ideas are and then either
7:44
try to sway that or game
7:46
that accordingly? So the whole Republican-versus-
7:48
Democrat poll thing I do find confusing.
7:50
I don't find it confusing if
7:52
you want to have a data point
7:54
for an opinion columnist, seeding
7:56
your poll to come to a foregone
7:58
conclusion and you know x percentage of Americans
8:01
are pro-life, so therefore we should be pro-life
8:03
or pro-abortion. But I don't understand that part
8:05
of it, right? Because you kind of want
8:07
to know what people are thinking regardless of
8:09
which side you're on and act accordingly, don't
8:11
you? Yeah. So the cynical side of that
8:13
is there are many left-leaning or right-leaning political
8:16
firms that will make a lot of
8:18
money putting out their own polls. They're
8:20
going to whip up their constituents, create
8:23
fundraising, create political action
8:25
committees, and there's business in there
8:27
and advantage. But we don't look at it
8:29
that way with my folks. I've spent most
8:31
of my life out of New York. I was born
8:34
in Montana. I've lived in the Midwest for half my
8:36
life. We try to get
8:38
a very well-represented group of researchers
8:41
that sort of represent the country. In fact, we're
8:43
highly virtual, so we've got people all over the
8:46
country and we just try to encourage them
8:48
to be umpires and see both sides
8:50
because that's what's exciting, right? You've got
8:52
to be able to look at any type
8:54
of an issue and be able to argue it from
8:56
both sides. How confident are
8:58
you that the polls are
9:00
reflecting the reality of what's
9:03
going to happen a few months from now, even? Well,
9:07
polls are first of all a snapshot in time
9:09
as of today. They're not
9:11
actually the best predictive tools. It
9:14
takes an analysis of what's going to happen, what
9:16
you think's going to happen. But
9:19
the dual nature of what voters
9:21
think is really what makes
9:23
American democracy so effective
9:25
and what makes elections so
9:27
unpredictable. So, John, I'm
9:30
wondering if you could maybe
9:32
give the SparkNotes version of this, because I imagine there's
9:34
a lot of ways to get into it, but
9:37
tell us what makes for a good poll. Because
9:39
a lot of the time, we'll see polls online
9:41
and people will say, oh, it wasn't a good
9:43
poll. It had a sample size of X. I'm
9:45
like, okay, but what is a good sample size?
9:47
I didn't go to university for this.
9:49
Someone's got to at least give
9:52
us the basics. A good sample
9:54
size is everyone. Everyone is a good
9:56
sample size. I should poll all of
9:58
us all the time with a 100 percent response
10:00
rate. You got to know it. Look,
10:02
there are lots of things you kind of
10:05
have to avoid. There are things, phenomena, that
10:07
are called river sampling, which
10:09
is the way that you collect your sample is
10:11
related to different sorts of online promotions
10:14
and motivations to get people to become
10:16
respondents. So we work with a lot
10:18
of different panels. We have really sophisticated
10:21
fraud protection. So you're just
10:23
looking for things like reaching people
10:25
where they are. So in
10:27
some instances, they don't have broadband access. You've got
10:29
to be able to weight your sample so that
10:31
you're getting the right people
10:33
to represent the total population of the
10:36
country. We do typically like
10:38
2000 is sort of a
10:40
base size for us to be statistically
10:42
accurate. And that's sort of in line
10:44
with most notable pollsters' research standards.
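For the numerically inclined, here is a minimal Python sketch of the two ideas just mentioned: weighting a sample toward known population shares, and the margin of error you get at a base size of 2,000. The demographic cells and population shares below are illustrative assumptions, not Harris Poll figures, and the weighting shown is simple post-stratification rather than the fuller raking real pollsters typically use.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample proportion.
    The worst case is p = 0.5; n = 2000 gives roughly +/- 2.2 points."""
    return z * math.sqrt(p * (1 - p) / n)

def cell_weights(sample_counts, population_shares):
    """Simple post-stratification: weight each demographic cell so the
    weighted sample matches known population shares (e.g., census targets)."""
    n = sum(sample_counts.values())
    return {
        cell: (population_shares[cell] * n) / sample_counts[cell]
        for cell in sample_counts
    }

if __name__ == "__main__":
    print(f"MoE at n=2000: +/- {margin_of_error(2000):.1%}")

    # Hypothetical example: rural respondents are under-represented in the
    # online sample, so they get weights above 1; the others get weights below 1.
    sample = {"urban": 700, "suburban": 1100, "rural": 200}        # who answered
    population = {"urban": 0.31, "suburban": 0.52, "rural": 0.17}  # assumed targets
    for cell, w in cell_weights(sample, population).items():
        print(f"{cell}: weight {w:.2f}")
```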
10:47
I think, you know, the more interesting question in
10:50
all this is, Zachary,
10:52
kind of your point, how do you
10:54
write an interesting question? Right? Because
10:57
you can write questions, we call them
10:59
carrots or ice cream questions, you know,
11:01
so I can say, you know,
11:04
Zachary, do you want like carrots or ice cream and
11:06
most people in the country are going to take
11:08
ice cream, right? So whenever I see
11:10
anything that gets 90% agreement in
11:13
this country, you know that it's not
11:15
a good question. It doesn't have tension.
11:17
It doesn't have trade offs. And it
11:19
doesn't go deeper to sort of find insights. The
11:22
kind of stuff we're proud of, it's
11:24
just things that shed light on culture.
11:26
And one of the ones I was
11:28
really proud of recently, we actually uncovered
11:30
that grandmothers, that grannies were kind of
11:33
the hidden engine of the economic recovery.
11:36
And the way we figured that out was
11:39
we found that two thirds of
11:41
working parents that were
11:43
holding their jobs said that they
11:45
relied on a grandparent for caregiving,
11:49
right? And 20%
11:51
of those working parents said they would have lost
11:53
their job without that support. So
11:55
suddenly, you just saw this really interesting thing that
11:57
I don't think I'd really thought of is
11:59
like... how important the support structures are
12:02
for working parents, especially working women
12:05
in America and the role the grannies play.
12:07
So, that's what we're trying to do. You just
12:09
try to go seek out the kind
12:11
of untold stories and think like a journalist
12:13
when you're kind of going after these types
12:15
of polling questions. Did you
12:17
call the poll the Granny Economy? Dang
12:19
it, we should have. I can't remember.
12:22
We gave it to Fortune and they wrote something
12:24
clever about it. That's where my
12:26
talent stopped as a researcher. So,
12:29
on this thing, so let's say you do a poll, you define
12:31
your 2000 set geographically,
12:34
demographically, race, gender, urban,
12:36
rural, etc. And
12:38
you get a perfectly worded question. But
12:41
doesn't then the problem become in
12:43
today's particular world, I'm sure
12:45
I am not alone in that if I
12:47
got a call these days from a number,
12:49
an exchange I don't recognize, I'm highly
12:52
unlikely to pick up the phone. I doubt
12:55
I would call you back if you left me
12:57
a message. Those people who do online
12:59
samples, which I guess for a while was
13:01
Morning Consult or others, it's selective as in,
13:04
are you interested proactively on your
13:06
own, unprompted, are you interested in
13:08
pushing your information out there?
13:10
And that's a whole other question. Who will
13:13
do that voluntarily? I don't
13:16
have a landline, or I do have a landline, but I
13:18
think it goes automatically to voicemail. So,
13:20
basically, how do you reach people and how do you
13:22
get them to answer the phone in
13:24
order to even do a survey today? Yeah,
13:27
so we're not doing telephone polling because you're
13:29
right, nobody picks that up. We haven't
13:31
done that for ages. I mean, there's some
13:33
very rare instances where we'll still use the
13:36
phone. We're online, but we're working with panel
13:38
providers. And so what we're
13:40
doing is we're actually getting access to fresh
13:43
respondents that have been brought in
13:45
to do a poll around
13:47
a specific topic. They would have to
13:49
have knowledge of the topic, they'd have
13:51
to have knowledge of the brand. I
13:54
should say also, by the way, we're not a
13:56
heavy duty political firm. We have a Harvard Harris
13:58
poll that we do every month to sort of
14:00
track the mood of the country. But most
14:03
of this work is done for brands, for
14:05
marketing. So I'm trying to get people's opinions
14:07
on various topics. So no, it's 100%
14:10
online. Do you do the Harvard Harris
14:12
with the same methodology? Harvard Harris
14:14
is done, with the Harvard Harris
14:14
Poll, through our political firm in
14:16
DC. Yeah, called HarrisX. But
14:18
you don't do phone? It's all
14:20
online? It's still online. Yeah. And
14:25
the difference in it is it's American voters.
14:28
It's not Americans at large. When you get
14:30
to American voters, you find that they're much
14:32
more engaged in politics and they've got stronger
14:34
opinions than the rest of us here in
14:36
the country. And just one more on
14:38
that. Do you worry about if brands
14:41
are providing panels for pay that
14:43
... Well, I guess the pay doesn't really matter, right? I mean, it's
14:45
a nominal amount for people's time. Legitimate
14:47
exchange of time for a ... I
14:50
mean, these are modest stipends, right? It's usually 100
14:52
bucks, 150, 200 bucks. It's interesting that we don't do
14:54
that for politics, right? It's just something that we don't
14:56
do, right? We don't think it should be done. We
14:59
don't think you should pay for political opinion. Do you
15:01
know why that is or how that came to be?
15:04
Oh, I think it's just the desire to want to keep
15:06
the political process and the process
15:08
of trying to understand voter opinions without
15:10
any sort of commercial incentive. Again, as
15:12
I say, a majority of my work
15:14
is all done in corporate business side
15:16
and people are quite happy to offer
15:18
their opinions. What you've got to do
15:21
is be very careful that you're not
15:24
getting professional respondents, people that a panel
15:26
provider is bringing back in over time.
15:29
You've got to watch out for all kinds of
15:31
sophisticated fraud. So there's a time delay in
15:33
answering an online question. You can see if
15:35
somebody's just going through and ticking a box
15:38
versus someone that's really thinking through
15:40
and you statistically measure those averages.
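To make the response-time screening concrete, here is a small, hypothetical Python sketch. The field names, the speed threshold, and the straight-lining check are illustrative choices, not the Harris Poll's actual fraud rules, which layer many more signals on top of this.

```python
from statistics import median

def flag_suspect_respondents(responses, speed_factor=0.33):
    """Flag likely low-quality respondents in an online survey.

    `responses` is a list of dicts like:
      {"id": "r1", "seconds": 412, "grid_answers": [3, 4, 2, 5, 3]}

    Two simple signals:
      * speeders: finished far faster than the median completion time
      * straight-liners: gave the identical answer to every item in a grid
    """
    med = median(r["seconds"] for r in responses)
    flagged = []
    for r in responses:
        too_fast = r["seconds"] < speed_factor * med
        straight_lined = len(set(r["grid_answers"])) == 1
        if too_fast or straight_lined:
            flagged.append((r["id"], "speeder" if too_fast else "straight-liner"))
    return flagged

if __name__ == "__main__":
    demo = [
        {"id": "r1", "seconds": 410, "grid_answers": [3, 4, 2, 5, 3]},
        {"id": "r2", "seconds": 95,  "grid_answers": [4, 4, 3, 5, 2]},  # speeder
        {"id": "r3", "seconds": 380, "grid_answers": [5, 5, 5, 5, 5]},  # straight-liner
    ]
    print(flag_suspect_respondents(demo))
```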
15:43
Try to make sure that the actual question's been
15:46
thought through and answered based on the style of
15:48
question. A lot of different things sort
15:50
of go into it and then you end up with
15:52
a margin of error of 2.5%. The important
15:54
thing is you ask those questions lots of
15:56
different ways and you go back and you
15:59
maybe keep trending a question. One of
16:01
the things, guys, that we've done since the
16:03
beginning of COVID was just
16:06
measure basic questions every week
16:08
in America about America. And so
16:10
we kind of call it our America
16:12
This Week report that we put
16:14
out every week on LinkedIn. But
16:16
it's just looking at questions like, I
16:19
feel secure in my job, or I
16:21
feel my health isn't at risk, or I
16:23
feel the economy is trending in the
16:25
right direction. And so that's the
16:27
easiest way to try to understand if you're
16:29
sort of in the ballpark on your questions
16:32
or if you've got anomalies. So given
16:34
that you have these weekly polls,
16:37
what is the mood of America right
16:39
now? Because I keep hearing
16:41
the same stuff over and over again, that people
16:43
are pissed off about inflation and everyone feels
16:45
that it's a fractured society. But maybe that's what
16:47
you're hearing too, but I would love to hear some other stuff. Well,
16:50
yeah, I mean, I got two great stats for
16:52
you. And so this is from last
16:55
week, 69% of Americans
16:57
say the current chaos
16:59
happening today makes me optimistic something
17:01
might change. And 73% of
17:04
Americans say they are concerned that nothing will change.
17:07
So there's your answer.
17:09
No, seriously, it's a
17:12
real interesting time. And I could give you this.
17:14
This is just my chart. It looks like an
17:16
EKG, right? This is our weekly
17:18
tracking sort of different things in what
17:20
we call the poly crisis. And I
17:23
think what's fascinating about today's times is
17:25
this has been a period
17:27
in American history that we're living through that's sort
17:30
of not unlike the Depression,
17:33
right, where you had four sustained years, or
17:35
World War Two. There
17:37
has just been four straight years of just
17:39
new crises sort of coming one after the
17:42
other. And obviously, from COVID to the economy
17:44
to the banking crisis to
17:46
Ukraine to the solvency of US
17:48
banks, it just seems like there's something that's
17:51
coming sort of every week. And
17:53
I think that's important to understand right now.
17:55
The big optimistic side I kind of see
17:57
on this is that I'm with the first
17:59
stat, that something's going to change, because
18:02
things have been so bad
18:04
and so sort of down for
18:06
so long. This is a time
18:08
of reappraisal and we see it in our data.
18:11
Seven in 10 Americans believe society
18:13
needs a complete overhaul to
18:16
make significant change. I've got all kinds
18:18
of data about Gen Z. We
18:20
call them Genzilla but we think Gen Z is
18:22
going to move into an activist sort
18:25
of position in the coming years
18:27
based on the financial situations they find
18:29
themselves in and the war that they have
18:31
with old people. I mean, 71% told
18:34
us last week that they think
18:36
older generations are short-term thinkers
18:38
exploiting next generation's future. Obviously, only
18:40
31% of boomers agreed with that.
18:42
That's a good line. Gee. I
18:45
mean, yeah, the 69% who, as you say,
18:47
think that things are so bad that therefore
18:49
they believe that will create change for the
18:52
better. It's a little like Herb Stein's old
18:54
adage that if something cannot go on forever, it
18:56
will stop, right? So that it's
18:58
the, if it can't continue, it won't, which almost
19:00
seems like a syllogism and yet has a certain
19:02
truth to it. I do wonder about this. And
19:05
I don't know if you've asked any of
19:08
these questions at a granular level, just trying
19:10
to get people's attitudes about if confronted with
19:12
data or statistics that
19:15
suggest that their views or beliefs are
19:17
at least disconnected from the data doesn't
19:19
mean that therefore they're wrong. Do
19:21
people ever seemingly reconsider
19:23
their views or beliefs or no?
19:27
Rarely do they reconsider their
19:29
beliefs, but they're happy to
19:31
sort of live in their own opinions.
19:34
We're seeing that right now with immigration.
19:36
It's the largest, single biggest issue in
19:38
the country as of this past month.
19:40
And a lot of that, if you
19:42
actually ask people, have you witnessed
19:44
the immigration problem where you live, you're
19:46
going to get far fewer people that
19:48
can actually really say that with certainty,
19:50
with observation. But it's like all
19:53
other types of perceptions that exist in
19:55
American culture, perceptions of crime, perceptions
19:58
of definitely immigration. those things
20:00
just become sort of self-fulfilling prophecies.
20:03
We did a poll with The Guardian
20:05
last summer when Biden was
20:08
sort of at his lowest, not getting
20:10
credit for anything. And everybody was
20:12
so completely dour. I guess
20:14
it was in the fall. We asked a bunch
20:16
of questions, basically a quiz that every American
20:18
flunked. We asked questions like,
20:20
is the economy shrinking or growing? And
20:23
the majority of Americans thought it was shrinking when
20:25
it wasn't. We asked if the
20:27
GDP was up or down, which again, they got
20:29
it wrong. We asked if the S&P was up
20:31
or down, and there was a filter so they had to know
20:34
the S&P. And then we asked
20:36
about inflation. We asked that one really pointedly.
20:38
We said, is unemployment at record highs or
20:40
record lows? And 62% said it's
20:42
at record highs. So
20:44
it just shows you facts are facts. And that's
20:47
probably why my favorite polling fact is that 52%
20:50
of voters say they rarely believe
20:52
political poll results. So
20:54
you're dealing with a difficult business, but what
20:57
becomes more important
21:00
than sort of the anomalies is
21:02
the underlying questions of why, like
21:04
why people are thinking
21:06
the way they are. And I think that's
21:09
the start of where things get
21:11
really interesting. Like I said about this
21:13
Genzilla, we started to weave
21:15
and see a theme of
21:17
activism in a lot of the things
21:20
that young people were telling us. Feeling
21:22
that older people won't get out of
21:24
the way, feeling that capitalism
21:26
isn't working for them, feeling that
21:28
they're starting behind financially. Each
21:31
and every poll, all these different projects, you
21:33
start to see these patterns and
21:35
that starts to become the thing that I think is more
21:37
meaningful. So I'm curious to hear
21:40
if you want to elucidate some
21:42
of the whys for us on that that
21:44
you see as far as Gen Z. I
21:46
do have kind of a halfway question
21:48
before that, which is going
21:51
back to our conversation about how
21:53
poll questions are phrased, something like
21:56
whatever percentage of it was of Gen Z that think
21:59
the older generations are short-term thinkers. Like I'm
22:01
thinking about how I would answer that question, right? Like
22:03
I don't walk around thinking that the older generations are
22:05
short term thinkers, but if someone put that question in
22:07
front of me in a poll and was like, do
22:09
you think that they are? I'd be like, I don't
22:11
know. Yeah, probably maybe climate change,
22:13
I guess. You know, that would be the thinking. That
22:16
doesn't necessarily mean that I have like
22:18
a really strong belief about that. So
22:20
yeah, thoughts on that. Yeah. So
22:23
again, it's just looking at the
22:25
patterns. First of all, just not
22:27
asking leading questions, right? And
22:29
again, trying to ask questions that involve
22:32
trade offs, personal trade offs,
22:35
you know, so it's like, would you want
22:38
more of something and would you give up
22:40
something in return? And so
22:43
I guess that becomes a really important thing.
22:45
But we also have to try
22:47
to go find when people are sort of not really
22:50
asking the questions the right way. And a great
22:52
example of that is we just asked this real
22:54
simple question, because there's
22:57
all this political toxicity right now. So
22:59
we just asked Americans in your opinion,
23:01
which of the following words are more likely
23:03
to bring people together or divide
23:06
them? And right
23:08
off the bat, DEI was seen as alienating
23:10
by 38% of Americans,
23:13
including 47% of Republicans. Now
23:18
only 27% of Democrats, there was
23:20
like a 20-point split. In
23:22
short, it's just become sort of a
23:24
politically toxic word. Well, we went back
23:26
and we started talking and we worked
23:28
with the Black Economic Alliance.
23:32
And we thought that the way that
23:34
question was asked was highly simplistic, right? And
23:36
what we figured out is that DEI,
23:38
like ESG, these have just
23:40
become dog whistles, right? When you hear those,
23:42
you actually don't think like a human being,
23:44
you think like a voter, and
23:47
you kind of go into your own
23:49
political ideology around these words because you've
23:51
sort of been trained that these are bad words
23:53
or good words. We just went and asked the questions
23:56
in an entirely different way. And we asked
23:58
Americans about corporate
24:00
diversity and whether that was a good thing or
24:03
a bad thing, is it related to them? Did
24:06
they think that corporations would be
24:08
better at customer service if they were diverse? Do they
24:11
think they'd be more profitable for shareholders?
24:13
Do they think that reflecting the diversity
24:15
of American population was going to make
24:17
for better products and better services? And
24:20
we have high support for corporate
24:22
DEI. They actually think it makes stronger
24:24
companies. But the questions aren't asked
24:27
that way, right? When the news cycle hits,
24:29
it's like DEI is a bad
24:31
term. So everything has got nuance
24:33
in it. Hey,
24:43
everybody. I'm Scott Schaeffer. And I'm Marisa
24:45
Lagos. We're the host of Political Breakdown,
24:47
a show that pulls back the curtain
24:49
on the people and forces driving politics
24:51
in the Golden State from KQED in
24:53
San Francisco. And
24:55
now ahead of the 2024 election, we
24:58
are bringing you even more, more
25:00
conversations with the top movers and
25:02
shakers at the state capital and
25:04
in national politics. But the dyslexia
25:06
was the greatest gift that ever happened to
25:09
me. Nothing was rote, nothing was linear. I
25:11
had to work around things, work differently, see
25:13
the world differently. And I say that to
25:15
young people and say, know how important your
25:18
participation is. And I think it's the time
25:20
for this generation to put forward new voices.
25:23
More reporting with analysis. It's
25:25
been a very good session for organized labor, but
25:27
hot labor summer. Hot labor summer, it's turning out
25:29
to be a nice fall as well. More
25:32
politics with personality. That's what election
25:34
day meant our life. We
25:36
hear that. Political Breakdown, daily.
25:39
Every weekday, we'll break down what's happening
25:41
and why it matters with news
25:43
that informs, surprises and maybe even
25:46
inspires you. Political Breakdown goes daily
25:48
starting January 8th. They
26:00
say you should learn something new every day.
26:02
It's good advice, but with so much to do in
26:05
your daily life, how are you gonna make the time
26:07
to learn and stay curious about our world? Well,
26:10
with everything everywhere daily, you can
26:12
easily make that goal an actual reality.
26:15
Everything Everywhere Daily is one of the world's
26:17
most popular daily education podcasts and a top
26:19
three history podcast. In about
26:21
10 minutes, you can learn something new every
26:23
day. The show covers history,
26:25
science, geography, mathematics, and technology, as well
26:28
as biographies from some of the world's
26:30
most interesting people. Fans
26:32
of the show are so passionate that they
26:34
work to join the Completionist Club, the
26:36
group of dedicated listeners who have listened to
26:39
every single one of the show's more than
26:41
a thousand and counting episodes. All
26:43
the episodes are informative, interesting, and best
26:45
of all, always under 15 minutes. So
26:48
go ahead, learn something new every single
26:51
day with everything everywhere daily. Find
26:53
it on Apple Podcasts, Spotify, or wherever you
26:55
get your podcasts. The
27:01
lack of zero sum questions is a really good one,
27:03
is a way of getting about it. I mean, one
27:05
of my bugaboos is
27:08
knee-jerk anti-GMO reactions, where people don't entirely
27:10
know what it is that they are
27:12
opposing when they're opposing GMO, as in
27:14
like what that really means in a
27:17
fundamental sense, even though huge majorities, and
27:19
certainly Western Europe and urban
27:21
America would be against GMO, but if they were
27:23
asked the question, would
27:25
you support the growing and the isolation of
27:27
certain plant strains that are far
27:29
less water intensive and don't require insecticides or
27:31
pesticides? You'd have a very different response, right?
27:34
It's kind of the same question. One
27:36
of my favorites, the zero-sum ones, right, which is
27:38
the famous 2015 poll, where something like 35%,
27:42
or 30-something percent of presumptive Trump voters
27:45
were in favor of bombing Agrabah, which
27:47
is the name of the kingdom in
27:49
Aladdin, a fictional kingdom in Aladdin. This
27:53
is not apocryphal, you remember this; this actually was a
27:55
polling question, one meant to get at an attitude. But of
27:57
course, when asked, you know, do you want to bomb a
27:59
country that you think may be a vague threat, sounds
28:02
like a threat, without any trade-offs, right? Would
28:04
you send your children to die to prevent
28:06
Agrabah from doing whatever? It's a very
28:08
different question than... Exactly. Do
28:11
you support the bombing of Agrabah? That's a classic.
28:13
That's correct. Public policy polling got 30%
28:15
of Republicans and 19% of Democrats to say they support bombing, well, Agrabah. Yes,
28:18
fellow millennials, that Agrabah, the home of Princess Jasmine and
28:20
the street rat Aladdin. After
28:22
the news broke, headlines largely
28:27
focused on
28:29
the Republican respondents
28:32
in the polls, which kind of misses the
28:34
sizable percentage of the other side of the
28:37
aisle. In corporate land, right?
28:39
Separate from, do you
28:42
like strawberry vanilla Yoo-hoo versus
28:45
more caffeinated Mountain Dew? Do you find that
28:47
there's sort of company data
28:49
about consumer behavior that would be
28:51
more useful if it were far
28:53
more in the public realm? Because
28:56
really, a lot of companies want their
28:58
product data and their corporate data to be their
29:00
own. Yeah. Go a bit deeper on
29:02
that. What are you thinking? CompanyX is
29:04
finding something vital about their consumer
29:06
base. Could that information have constructive
29:08
public good applications rather than it
29:11
simply being kept by a company
29:13
for its own ability
29:15
to sell its particular product? A
29:17
lot of what is changing
29:20
today is the kind
29:22
of new style corporate advertising, which
29:25
is finding your own
29:27
platforms and releasing your own research.
29:29
And one that comes to mind, it's
29:31
a client we worked with that's Google, but we
29:34
found in this study a really
29:36
fascinating stat that
29:38
72% of global executives
29:41
would admit to greenwashing and they admit
29:43
that other people do it. And it's
29:46
because they don't have metrics,
29:48
like all the metrics are squishy, all
29:51
the standards, and it actually
29:53
exposed a problem. And they thought, this is a
29:55
really great thing we should put out. And
29:58
so we put it out and it spurred on a
30:00
significant amount of debate,
30:02
but that's a topic that is important to
30:04
Google and they said, yeah, let's let it
30:06
fly. Was there ever a
30:08
time where what you expected to find
30:10
as an answer was just like completely
30:12
the opposite? Totally. One of
30:15
my favorite ones this past year was
30:17
we do a corporate reputation survey. It's
30:19
consumer-based. It's Main Street. So instead of
30:21
asking elites, we just ask ordinary Americans
30:23
who buy products and services. What are
30:25
their favorite companies, which companies you admire
30:27
the most and there's sort of different
30:29
dimensions that make the ranking, but it's
30:32
really revolves around character, trust and ethics
30:35
are sort of the big drivers. And
30:38
number one was Patagonia
30:40
and number five out
30:42
of the top 100 was Chick-fil-A. No,
30:45
really? And this
30:47
is all Americans, right? Really?
30:49
And so I'm trying
30:52
first of all to imagine these
30:54
two companies like at the
30:56
same corporate picnic and I'm just
30:58
fascinated why all Americans
31:01
would like rate those
31:03
two companies because their ideologies are
31:05
so completely different. And what comes
31:08
out in the data that's actually
31:10
really, I think, encouraging is
31:13
that people didn't agree with
31:15
their platforms, but they respected
31:17
them for upholding their values,
31:19
for sort of walking their talk. And I
31:21
kind of looked at it as an opportunity
31:23
for a conversation between these two
31:26
bipolar sides because we
31:29
have another Harris data right now, all
31:31
the toxic politics that have kind of
31:33
tinged AB InBev or Disney or Target,
31:35
other companies that got caught up in
31:37
a lot of this political toxicity. These
31:40
two companies actually are liked by
31:42
both Republicans and Democrats because they're
31:45
just true to their values and
31:47
they don't flip flop and they don't change positions.
31:49
They don't virtue signal. They just are who they
31:51
are. It's an interesting kind of
31:54
point because there seems to be this
31:56
time right now where trust
31:58
is so low, and expectations
32:00
are so low among the American public
32:02
for a candidate or even a company
32:05
that when you are just old-fashioned,
32:08
you can be relied on even if we don't agree
32:10
with you. Thought that was really kind of hopeful. So
32:13
it definitely wasn't that so many Republicans were
32:15
pro-Chick-fil-A. Like it was that sort of
32:17
equal amounts of both Democrats and Republicans.
32:20
Because when I was in university, it was social suicide
32:22
to eat at Chick-fil-A. There was one Chick-fil-A
32:24
at the food court. You did not
32:26
wait in line for Chick-fil-A, if you
32:28
wanted to have friends. Yeah, I'm looking
32:30
at the data right now. So Chick-fil-A
32:32
was number five among the GOP. The GOP
32:34
also had Patagonia in their top ten.
32:37
Which is really interesting, and it cuts
32:39
across rural and urban. Democrats are less
32:42
forgiving of Chick-fil-A. They have
32:44
them much further down the rankings.
32:47
But it's just fascinating that, I'm just gonna hold
32:49
this up for you guys. I know this
32:51
is bad, bad form. We'll try to
32:53
explain this for audio. So this is
32:56
Chick-fil-A and Patagonia, and
32:58
how they just effortlessly surf across
33:01
sort of different political parties, different
33:03
ideologies. So they're kind of highlighted.
33:06
And you can just see they're top
33:08
10 with everybody. Part of that is
33:11
the Democrat issue with Chick-fil-A is the
33:13
corporate support of conservative causes, religious causes. In
33:16
terms of employee satisfaction, Chick-fil-A is off the
33:18
charts. It's only off the charts compared to
33:21
mass employers of minimum
33:23
wage or entry level workers, compared to Walmart or
33:25
the rest. So Walmart's improved mightily, I
33:27
think over the past 10 years. Walmart's interesting to
33:29
come up. I don't know where they show up
33:31
in your top 100, but they would've
33:33
been way, way, way down 20 years ago, and
33:36
I think they're probably upper quadrant now. Is
33:38
that right? Do you know offhand where Walmart
33:40
is? That's right, yeah. They've improved significantly over
33:42
the last 10 years. Simply by paying attention
33:44
to their workers. I mean, Walmart cynically realized
33:46
at one point, right, that if you don't pay your
33:48
workers enough money to shop
33:51
at the stores that they're working at, particularly for Walmart, since
33:53
it has like a million and a half employees,
33:55
you're actually losing your customer base. You might as well pay them
33:58
enough to at least buy your own stuff. Chick-fil-A
34:00
is very, very high. So it
34:02
was Patagonia, on obviously different metrics
34:04
than the college lunchroom,
34:07
right, that you talked about, Emma. Because no
34:09
one's thinking about whether they're treating
34:11
their employees well, but it's hundreds of
34:13
thousands. I don't know what the
34:15
number is. Emma talked about and asked about how
34:17
people are pissed off and feeling negative about all
34:19
sorts of things, direction of the country, economic
34:22
security, immigration, kind of down the list.
34:25
There's nothing other than the 69% who feel that
34:28
things are sufficiently bad that therefore they are
34:30
likely to get better, which is a
34:32
form of optimism, but it is an optimism informed
34:35
by pessimism. Is there anything in the polling over the
34:38
past 20 years that would give you any sense as
34:40
to the why of this progression?
34:42
I mean, I don't know. Maybe
34:44
in 2019, do you recall people probably were
34:46
feeling okay about the economy. It's been a
34:48
long time since people in any
34:50
significant numbers felt good about the direction
34:52
of the country, right? I mean, that's...
34:55
It's been way better than it is now, but it
34:57
hasn't been good for the past 15, 20 years. No,
35:00
that's right. And those, Zachary, have
35:03
moved lockstep with our growing
35:05
political divide. What basically happens
35:07
on these questions now is
35:10
that they're highly geared toward
35:12
being anti-incumbent. So, in
35:14
this month's Harvard Harris poll, 62% of
35:18
Americans say that the
35:20
country is on the wrong
35:22
track and that includes 32% then saying
35:24
it's on the right track. But
35:27
if I hadn't shown you that data four
35:29
years back, it would be
35:31
nearly identical, but it would be flipped
35:34
where Democrats would say it's on
35:36
the wrong track. So it's high Republican numbers
35:38
that are creating that. You
35:41
only have 18% of Republicans saying the country's
35:43
on the right track and 55%
35:45
of Democrats. So
35:47
that's the biggest issue. I think it's
35:49
less accurate, rather, to ask the big
35:52
macro questions because you're going to get
35:54
this politically tinged response. So
35:57
to give an example: right now, 60% of
36:00
Americans say the economy is on the
36:02
wrong track versus 34%
36:04
who say it's on the right track.
36:07
But when you ask people how do
36:09
they feel about their own personal financial
36:11
situation, right? So we're taking
36:13
that out of that voter mentality. It's
36:16
at 28% improving and 27% just as well off.
36:20
So you put those two together, now suddenly you've got more
36:22
than 50% of the country, 55%, who
36:26
now say the economy is looking better. Exactly to
36:28
your point earlier, you know, about how you frame
36:30
a question and how you ask it, the
36:32
more personal you can be in
36:34
these questions and the less kind
36:36
of vague and macro and political, the better
36:39
answers you're going to get. We do see
36:41
a lot of hope on the economy and
36:43
who knows, it's way too early right now to
36:46
be projectable in terms of the election.
36:49
That's another really important thing. I just
36:51
made your caveat, that polling is only as
36:53
good as right now, right? Polling
36:55
doesn't tell you the future and doesn't often
36:57
do a good job of even telling you
36:59
the present sometimes. But in this instance, what
37:01
we're seeing is some really
37:03
nice upticks in people's
37:06
confidence on economy. We
37:08
just released a survey with Axios, we
37:10
call Axios Harris Poll Vibes, and
37:13
two-thirds of Americans think 2024 is going to
37:15
be better than 2023. Yeah,
37:18
there's always a bit of a
37:20
lag, right? Between the actual economic numbers and how
37:23
people feel about them. That reminded me to ask
37:25
you previously why you
37:28
said there might be signs that there is a secret
37:30
Biden voter out there in the way that there
37:32
used to be a secret Trump voter. Well, there's
37:34
a couple things that are a little concerning
37:37
in the polling as we
37:39
kind of think about moving forward
37:41
to the election. A
37:44
couple things that we're seeing happening, one
37:46
is demographic shifts and voting patterns among
37:48
black, Latino and Asian communities towards
37:51
Republican candidates. That means pollsters
37:53
need to make sure they sort of go that
37:55
way. But when you get into
37:57
those biases, this sort of shy Biden
38:00
voter is just what
38:02
happens when there's all this speculation
38:05
and all this criticism of Biden
38:07
being too old and too
38:09
out of touch. There's so
38:11
many other issues that still might bring
38:13
people to the polling booths, right? It's
38:15
obviously abortion and Roe v. Wade. It's,
38:18
you know, immigration. It's the
38:20
economy. So one of
38:22
the places we're really interested in is
38:24
like, are these left leaning younger demographics
38:27
that are being highly critical of Biden
38:29
right now on Gaza and Israel? Are
38:32
they going to sort of, you know, show up when it's Trump, then,
38:34
which is, like, galvanize these voters and get
38:36
them there, and the pollsters aren't going to pick it up?
38:47
This episode is brought to you
38:51
by Shopify. Forget the frustration of
38:53
picking commerce platforms when you switch
38:55
your business to Shopify, the global
38:58
commerce platform that supercharges your
39:00
selling wherever you sell. With
39:02
Shopify, you'll harness the same intuitive
39:05
features, trusted apps and powerful analytics
39:07
used by the world's leading brands.
39:10
Sign up today for your one
39:12
dollar per month trial period at
39:14
shopify.com/tech, all lowercase. That's shopify.com/tech.
39:19
So I want to go back to something
39:21
you said before about your, your bot problem or
39:23
making sure that the person who's answering your online
39:25
question is a person. I've tried
39:27
to avoid asking questions about AI just
39:30
because everybody seems to be
39:32
intentionally asking questions about AI in areas where
39:34
it really doesn't apply and it's
39:36
completely speculative. But
39:39
certainly in the polling world
39:41
of trying to gain things
39:43
from essentially online presences,
39:46
that will be an issue or certainly
39:48
a priority is an issue for you.
39:50
What do you see going forward about
39:52
your ability to authenticate a human
39:55
in your responses? Yeah, no, that's going
39:57
to be a significant issue and we're
39:59
working on it right now. It takes
40:01
our fraud protection up
40:03
another level, matching AI with AI to
40:05
make sure that we've got a personal identification,
40:08
whether it's a web address or other form,
40:10
to make sure that we're not getting
40:12
botted. And this is going to be a
40:14
massive, massive issue with
40:16
identification. I think the
40:18
thing that we're doing on the positive side,
40:21
we just launched a product we call
40:23
Quest DIY, which is sort of do-it-yourself
40:25
sort of polling, where you can kind
40:27
of create your own surveys. It's like
40:30
a SurveyMonkey, but it's AI-driven.
40:33
And the cool thing about it is it allows you
40:35
to sort of get your polling
40:37
questions going and you
40:39
get improvements. So, you get suggestions from AI
40:42
to sort of help make them stronger. And
40:44
look, the way I think about all this stuff, it's
40:46
kindling, right? It's just a great
40:48
way in. It's not the answer.
40:50
It's not the fact. It's not the creative
40:53
insight. It's not the big idea, I don't
40:55
think, not yet anyway. But it's
40:57
sure a great start. And so,
40:59
if you've got thoughts and ideas, if you're trying
41:01
to write a poll, this is an incredibly great
41:03
way for us to use it. And
41:05
we're seeing a lot of progress of
41:07
it in our business. Yeah, maybe elaborate
41:09
on that for a moment, because I know there
41:12
was an attempt a bunch of years ago, by
41:14
a bunch of Argentine economists, after
41:16
the collection of government data about inflation and
41:18
other things became so politicized; it was
41:20
a bunch of MIT people. They found
41:22
a way to just sort of scrape
41:24
data from prices that were posted online
41:27
as a proxy: if you
41:29
can't collect official inflation statistics, there's enough
41:32
price data out there that if you could find
41:34
a program that would scrape the data, you'd essentially
41:36
come up with a very accurate read. Is there
41:38
a polling equivalent, meaning if you can't, or
41:41
maybe juxtaposed to it, where you
41:44
have a question and there's so much
41:46
between Reddit and message
41:48
boards and all the stuff out
41:51
there that you could essentially use
41:53
algorithms and AI tools to collate
41:55
existing online data to get at
41:57
some of the same answers?
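As a rough illustration of the scraped-prices idea Zachary describes, here is a hedged Python sketch of the downstream step: assuming you already have daily online price observations for a basket of products (the scraping itself is not shown), you can chain the average daily price changes into a simple index, which is broadly the spirit of the MIT work he is referring to. The product names and prices are made up.

```python
def chained_price_index(daily_prices, base=100.0):
    """Build a simple chained price index from daily price observations.

    `daily_prices` maps date -> {product_id: price}. For each consecutive
    pair of days, average the price relatives of products present on both
    days, then chain those daily ratios onto the index.
    """
    dates = sorted(daily_prices)
    index = {dates[0]: base}
    for prev, curr in zip(dates, dates[1:]):
        common = daily_prices[prev].keys() & daily_prices[curr].keys()
        if not common:
            index[curr] = index[prev]  # no overlap: carry the index forward
            continue
        relatives = [daily_prices[curr][p] / daily_prices[prev][p] for p in common]
        index[curr] = index[prev] * (sum(relatives) / len(relatives))
    return index

if __name__ == "__main__":
    # Hypothetical scraped observations for three products over three days.
    observed = {
        "2024-03-01": {"milk": 3.50, "bread": 2.00, "eggs": 4.00},
        "2024-03-02": {"milk": 3.57, "bread": 2.02, "eggs": 4.04},
        "2024-03-03": {"milk": 3.60, "bread": 2.02},  # eggs not observed that day
    }
    for day, value in chained_price_index(observed).items():
        print(day, round(value, 2))
```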
42:00
Absolutely, Zachary. And we are right
42:03
now working on a project to take
42:05
our 60 years of Harris
42:07
Poll archives of our surveys and
42:10
try to do exactly that. Because
42:13
we've got just wonderful treasure
42:15
trove of data on
42:18
major social issues in America, right?
42:21
Gun control, racism and
42:23
equality, freedom of choice
42:25
versus pro-life, pro-choice.
42:28
And we're just interested to try to go back
42:30
and use that to try to understand how
42:32
the social attitudes have changed. And right
42:35
now, it's all sitting on like PDFs
42:37
of old surveys that were done
42:39
on paper in the 60s. I'm really
42:41
excited by that because I think it could really provide
42:44
incredible... And we're not the only ones doing
42:46
it obviously, but just incredible insights
42:48
that might help us make stronger, smarter polling
42:50
questions in the future. Could you
42:52
scrape, like Zachary is asking, from external data?
42:54
So not data that you've already pulled for
42:57
internally, but like he's saying from Reddit or
42:59
TikTok comments or something like that, or is
43:01
it just completely not hygienic in a way
43:03
as far as data goes? We're not
43:05
doing that per se, we have companies
43:07
like in our larger group that are
43:09
looking at those projects. I mean, I
43:12
think the big important thing is
43:14
how you train your language learning
43:16
models on the right data, so you obviously
43:18
have garbage in, garbage out and you don't
43:20
have bias and you don't have
43:23
something that comes in that really as you
43:25
say sort of taints your data stream.
43:27
And we're going to have to adjust for that
43:29
even going through this Harris archive, because of the way
43:31
that they asked polling questions in the 60s. I mean,
43:33
some of them guys are like cringe-worthy, right?
43:37
So that's always going to be a problem.
43:39
But like you
43:41
said with your example with Argentinian
43:43
inflation, that's just
43:45
a really interesting way to have sort of
43:48
behavioral data versus just
43:50
stated data. Right. Have
43:53
you been able to discern... One thing
43:55
I've noticed is particularly pronounced
43:57
really over the past decade, probably it's
44:00
been true for longer is
44:02
the disjuncture, you mentioned this about immigration,
44:04
that a lot of people don't have direct
44:07
experience of whatever the immigration problem is, but
44:09
because it's constantly out there as a problem,
44:11
they've internalized it as an act of concern.
44:14
I'm not saying it isn't, I'm just saying it's
44:17
a difference between personal experience and public
44:19
perception or public awareness. There's always been
44:21
a disjuncture between, let's say, how people
44:23
feel about Congress and how they feel
44:25
about their congressperson, or how they feel
44:27
about schools and how they feel about
44:29
the particular school their children are at,
44:31
or how they feel about the economy
44:33
versus how they feel about their particular
44:35
business. Has that split gotten
44:37
wider? Meaning, there's just a lot of
44:40
people will say that their families are
44:42
fine, their kids' school is okay, their
44:44
job is decent, their employment
44:46
level, they feel okay about X,
44:49
Y, and Z, but they dislike
44:51
or distrust the general version of
44:53
the same specific. Yes, I
44:55
mean, everything generally is
44:58
trusted more at the local level. And
45:01
as you expand out, it's mistrusted when it
45:03
goes macro. That's for
45:05
the news media, that's for politicians,
45:07
that's for government agencies.
45:10
I mean, it's really fascinating that
45:12
local, like if we're going to get anything done in
45:14
this country, like, you're going to have to do it grassroots-
45:17
up, because that's where the
45:19
trust is. I think that that's really
45:22
interesting and particularly like when we get to
45:25
the macro news, we just did a poll
45:28
on misinformation and we found that 62% of
45:30
Americans, they're cutting
45:33
back on their news consumption in
45:35
order to protect their mental health. And
45:38
so it's just that the national
45:40
news media has just become so tainted
45:44
with perceptions of bias that it's not
45:46
only an echo chamber, it's now a public health
45:49
crisis. And that's the same
45:51
with government. So I don't
45:53
know if that answers your question, but definitely
45:55
it's what we're seeing in our patterns. Emma
45:57
did a piece early on in the Progress Network about
45:59
how... how to read the news without losing your
46:01
mind, kind of apropos the, you know, how do
46:03
you actually digest the news without
46:06
going insane or driving yourself crazy as the
46:08
case may be. So kind of very
46:10
much in this line of most people's
46:12
response to that is to simply tune
46:14
out, right? Because there's no, they don't,
46:17
because they haven't read Emma's guides to the,
46:21
guides to the dispossessed of how to
46:23
digest it and stay okay with it.
46:25
It's sort of an all or nothing
46:27
approach. It's either overindulgence or abstinence. I
46:30
got to read your post, Emma. Sounds great. I'll
46:32
send it along. I'm working on a book on it too. So. Oh,
46:35
nice. Awesome. I think as
46:37
the perceptions of bias is a huge thing. And
46:39
I think it's also just a feeling of helplessness
46:42
because we're just inundated with so much news that
46:44
we have absolutely no control over from places that
46:46
50 years ago, we wouldn't
46:48
even have seen in our lives, you know? So
46:51
I think that does a decent amount
46:53
of making people feel, as
46:56
you're saying, maybe great about my personal life, but about
46:58
the world, I feel constantly terrible.
47:01
I'm really interested in the rise and fall of
47:03
company reputations. 15 years ago, Google,
47:06
Facebook, I don't know
47:08
about Amazon, but so, you know, Google, Facebook,
47:10
Apple were seen as good companies
47:12
by people, right? There was a high
47:14
degree of trust since, I
47:16
remember this at the time of the financial crisis, right? In 2008,
47:18
2009, if you'd asked people which
47:21
companies they were most negative about,
47:23
they would have listed Goldman Sachs, Bank
47:25
of America, a series of financial institutions.
47:28
And now today, I don't
47:30
know if you're finding this, but I think the
47:33
things that most people are agitated by are big
47:35
tech companies of one form or another. We're
47:38
recording this, by the way, during
47:40
the week when the House of Representatives passed its
47:44
first effective step toward banning TikTok,
47:46
which may be an anti-Chinese move,
47:48
but it also has a degree
47:50
of just generalized social media. These
47:52
companies are bad companies doing bad things. Do you have
47:55
a sense of why that is? Or is that too
47:57
just another one where the press, the
47:59
PR, got worse and then people's attitudes followed,
48:01
or do people's attitudes get worse and then
48:03
the PR followed? I think it's
48:06
a little bit more the latter but we have
48:08
seen over the last five to seven
48:10
years in our reputation surveys,
48:12
the Axios, Harris, Paul 100, that
48:15
tech is sort of bifurcated into two
48:17
camps, sort of good tech and bad
48:19
tech and so I'm generally
48:21
speaking but the good tech are tech
48:24
companies that make things. So
48:26
you know Samsung, some electronics,
48:29
Microsoft, but if
48:31
you're a social media company, you're at the
48:33
bottom of our survey. If you're TikTok, if
48:35
you're Meta, you know,
48:37
Instagram just rolls up
48:39
under meta but those companies are sort
48:41
of seen as harmful to society and
48:44
not being productive as
48:46
it were. The other really interesting thing
48:48
that's happened, again, in my low-tech way
48:50
I'll show you but we tracked 15 years
48:52
of Harris data looking
48:55
at companies that had had a
48:57
huge crisis. So this
48:59
has got BP, Wells
49:01
Fargo, Volkswagen, Chipotle with
49:04
E. coli and
49:06
generally what ended up happening is they all
49:08
basically had a U-shaped recovery, right? They had
49:10
a crisis and their reputations came
49:12
back. So they're all kind of back to the same
49:15
place they were but the
49:17
new thing we're seeing is a bell-shaped
49:19
crisis which here we
49:21
have Disney, AB InBev and notably
49:24
meta. And meta has
49:26
the longest timeline but what
49:28
you might be able to see in my wonky
49:31
chart here is that meta
49:33
hasn't recovered and the
49:35
reason these companies are all facing these
49:37
L-shaped crises where their reputations haven't come
49:39
back is they fragmented between
49:42
Democrats and Republicans. So they've
49:44
entered a new territory, a political
49:46
territory where for one
49:48
reason or another. You know, with Disney
49:50
it was the "Don't Say Gay" bill; with Meta
49:53
it was being perceived as sort
49:55
of anti-Republican, and Cambridge Analytica. ABI
49:58
was obviously last year with Bud Light. You just
50:00
end up in potentially
50:03
a longer-term crisis because you just
50:05
alienated half your audience by
50:07
sort of challenging them on a political level, which
50:09
nobody wants to be challenged on; they just want
50:11
to have a beer. Interesting.
50:15
Well, John, I think we are at our time, but
50:18
I really want to thank you for the
50:20
conversation, kind of wide-ranging from tech to
50:22
consumer to politics. I urge everyone
50:24
who's listening to check out the weekly... what's the exact name
50:27
of the weekly survey, so people can look it
50:29
up? America This Week. It's
50:31
under my name, John Gerzema, from the Harris
50:33
Poll, on LinkedIn. I would definitely
50:36
check that out; it's a really, really good
50:38
snapshot of just attitudes writ large and obviously
50:40
looking at it not just
50:42
week to week but over time is particularly
50:44
helpful just to see some sense of trends
50:46
but really appreciate the
50:48
time, definitely appreciate the work and the
50:50
data, and we'll see how this year
50:53
plays out in public attitudes, in both
50:55
tech regulation and political
50:58
outcomes, and whether
51:00
or not sentiment matches reality. Really
51:02
enjoyed it, guys, thank you. Thanks
51:04
John. Well
51:07
that was a fun one for us data
51:09
nerds here at the Progress Network. There's
51:11
an interesting little tidbit at the end
51:13
there that polarization has come for company
51:16
crises, right? Like, I had even forgotten
51:18
that Wells Fargo had a crisis, completely
51:20
forgotten about BP until he mentioned it.
51:23
But I guess nowadays once you
51:25
go down you stay down.
51:28
Although, the fact that you'd forgotten about BP... whoever
51:31
is, you know, listening, if BP's
51:33
crisis PR team stumbles
51:35
upon this episode, they'll be thrilled.
51:38
Maybe they're an advertiser, then; that's probably why,
51:40
like, my most recent reference point doesn't
51:42
have to do with the oil-slicked pelicans
51:44
and so on and so forth. Good
51:47
for BP not good for us. This
51:49
constant disjunction between personal attitudes, local attitudes,
51:51
like your sense of how
51:53
your own life is doing versus the
51:56
sense of the collective, and the degree to
51:58
which the skew is personal positive,
52:00
collective negative, is the generalizable statement
52:03
that tends to be the
52:05
case. And that's an even
52:07
more striking
52:10
indication that most people's experience of a
52:13
collective reality is filtered through whatever we call
52:15
the news and or social media, right? Because
52:17
we don't have a direct experience of it.
52:20
We only have the experience of what we
52:22
hear online or see online or read. And
52:24
so our sense of reality
52:26
beyond our immediate surroundings is shaped
52:28
intimately by a few
52:31
types of media, but not our own senses
52:33
and not our own life experience. And
52:36
if that's largely being told in
52:39
the negative, it's likely that we're going to have
52:41
a negative perception, right? Yeah, I was
52:43
thinking about that as we were discussing that,
52:45
that the ideal situation would be that we
52:47
feel positive about our personal situations and we
52:49
feel positive about the collective, right? Right. So
52:51
we're not in the ideal, but I'd rather
52:54
have the situation that we have now than
52:56
a situation in which you feel really negative
52:58
about your personal situation, because that likely means
53:00
that there are real reasons for that, and
53:02
feel positive about the collective, which would just
53:04
be like a delusion that would
53:06
be just as harmful, right, as the one that
53:09
we have now. But actually, as I'm saying it,
53:11
I think it would be more harmful than the
53:13
one that we have now. I'm kind of pulling
53:15
for the "at least in our personal lives, we're
53:18
doing kind of swell," and that might affect the
53:20
intensity of how negatively we think of the
53:22
collective. And the one exception, which we didn't talk
53:24
about with him is, of course, people do, at
53:26
least in the United States, feel economically insecure by
53:28
large measures. And that's a very personal one, meaning
53:31
do you personally feel economically insecure? Do you have
53:33
enough money to meet a crisis of one form or
53:35
another? And the answer is largely no. So
53:38
those are personal ones about one's experience of
53:40
what's going on economically. And I do think
53:42
that those have a high degree
53:44
of legitimacy. What's interesting
53:46
is, it's not entirely clear, or
53:48
rather it's not clear at all, that
53:50
one's ability to do that 30 or 40 or 50 years ago
53:54
was any better. What clearly has changed is
53:57
that your expectation that it not be as
53:59
bad has risen, meaning a
54:01
tenable level of insecurity in 1970 is
54:04
not a tenable level of insecurity in
54:06
2024, if you are your average citizen.
54:08
Yeah, and what's really interesting to me about
54:10
this is that people now are starting to
54:13
bring up the economic insecurity of Gen Z,
54:15
but it's actually a repeat of the
54:17
pattern that we saw with millennials, I
54:19
think. I don't think things have materially
54:22
deteriorated between Gen Z and millennials that
54:24
much. And there was a very long
54:26
time that millennials were saying that they were
54:28
economically insecure, and there were reasons for
54:30
that, and that wasn't illegitimate. But now
54:32
the data shows that millennials have
54:36
caught up wealth-wise with where other
54:39
generations were at this age,
54:41
and in fact surpassed them. So there
54:41
actually is some economic data that
54:43
says that, in fact, we are in
54:46
a better position than we were previously.
54:48
And I think that some of the
54:50
insecurity, particularly around the younger generations, just has to do
54:52
with the fact that you're young, you're not making that
54:54
much money, a lot of people have college loans, but
54:56
that doesn't mean it's going to stay that way forever.
54:59
Yeah, I mean, it's hard for me to believe this now, but
55:01
the onset of the 2008-2009 financial crisis
55:05
is 16 years ago, or
55:07
15 and a half years ago. And if
55:10
you were a millennial whose kind of first brush
55:13
with the external world was that
55:15
crisis, that's certainly going to inform
55:17
your attitudes for a whole bunch of years, because it was a
55:19
bad couple of years and it took a long time for things
55:22
to quote unquote normalize. The other thing to
55:24
think about with John and the Harris Poll is, at least
55:27
in the private realm, they've kind
55:29
of gone around the whole issue of how do
55:31
you reach people, like simply paying for panels, which
55:35
remains a problem with political polling, right?
55:38
Because one of the hardest things now
55:40
for a lot of the pure political
55:42
pollsters is actually finding people to
55:44
answer the questions and not just online.
55:47
Like I guess YouGov does tens
55:49
of thousands of online surveys, but you never know how
55:52
representative those tens of thousands who are
55:55
answering those are. I'm still a little
55:57
bit unclear. Like I know that if you offer people...
56:00
I've read that opt-in online
56:02
surveys are notoriously unreliable because
56:04
they attract trolls, because who are
56:06
the people that are hanging around online?
56:09
It's the trolls, right? And that's why
56:11
you get really crazy numbers on some
56:13
polls. Like 30% of Americans are Holocaust
56:15
deniers, but like, in fact, they're not. So
56:18
I guess I still don't understand like
56:20
what the correct
56:22
way to attract someone online
56:24
is. So you like, you buy
56:26
a list off someone and you email them and you offer them 100
56:28
bucks, is that how they do it? I'm not sure.
56:31
And look, this is going to be the year of obsessive
56:33
following of polls politically. But
56:36
frankly, all that's going to matter is the polls in
56:38
like six states for the US presidential election. National
56:41
polling is going to make very little difference. It doesn't really
56:43
matter if Trump is 4%
56:45
ahead of Biden or Biden's 6% ahead
56:47
of Trump. It matters how either of them
56:49
are doing head to head in Georgia, Arizona,
56:52
Wisconsin, Michigan, Pennsylvania, and Nevada. Anyway, going
56:54
to be a fun year. Everyone should
56:56
check out the Harris Poll information. They're
56:58
one of the more, at least
57:01
I find one of the more insightful,
57:03
nuanced polling organizations because they're not driven
57:05
primarily by a news electoral cycle where
57:07
they're supposed to give some sort of
57:09
binary horse race. They're much more
57:11
oriented toward polls about attitudes than they
57:14
are horse race polls. And I think they're
57:16
more interesting to read. So we turn to
57:18
our rapid fire things you should have
57:20
been paying attention to. Rapid fire positivity.
57:22
Yeah, let's do it. All right. So
57:25
let's take a look at some news that
57:27
probably went under most people's radar, starting with
57:30
the Caribbean island nation of Dominica. When's the
57:32
last time you heard news of or thought of
58:34
the Caribbean island nation of Dominica?
57:36
They have overturned a ban on
57:38
consensual same-sex activity, good for them,
57:40
that has been on the books since British rule
57:43
in the 1800s. And
57:45
what's neat about this is that it's part
57:47
of a bigger trend with Caribbean nations in
57:49
the past few years. So Barbados, St. Kitts
57:51
and Nevis, Trinidad and Tobago, and a few
57:53
other countries have also overturned this ban. It's
57:56
not completely legal all over the Caribbean, but
57:58
it's kind of creeping and trending that
58:01
way. So that's nice to see. Yeah.
58:04
There's a lot actually truly going
58:06
on between Caribbean nations around sustainability
58:08
and environmental change. Island nations tend
58:10
to have a strong
58:13
self-interest in doing something about climate change
58:15
because of obviously rising sea levels and
58:18
dependency on tourism and the degree to
58:20
which those two things could
58:22
not easily coexist. So
58:25
these are countries that have not usually
58:27
received much particular attention in global affairs
58:29
but have become – we
58:31
once talked about American states as being incubators
58:35
and experiments of democracy. A lot of
58:37
these island nations have become oddly,
58:39
oddly in the sense of their history and
58:42
not oddly per se, creative
58:44
in the way in which
58:46
they're dealing with certain problems and progressive, small
58:49
p progressive, in how they come
58:51
up with solutions. Speaking
58:54
of that, on the sustainability line,
58:57
Ember just released their
58:59
global electricity review and
59:02
the world hit a pretty
59:04
big renewable energy marker last
59:06
year. 30% of the world's electricity
59:09
was produced by renewables last year, which is
59:11
honestly more than I would have expected. A
59:13
third. Not too shabby. That is
59:15
a lot. They think that fossil fuel generation
59:17
will fall in 2024 and then really
59:19
start to decline post
59:21
2024. But I've been waiting around since
59:24
about 2022 to hit peak fossil fuels.
59:29
It's been kind of plateauing and people are wondering
59:31
when is that peak going to be and I'm
59:33
hoping it's going to be right now. Yeah,
59:36
I don't think we're getting to peak oil
59:38
consumption anytime soon. What's declining radically globally is
59:40
coal, not oil and natural gas. I think we're
59:43
not going to plateau quite yet.
59:45
I think a lot of that is that coal is
59:47
the one that's shrinking
59:50
precipitously. Coal
59:52
is particularly dirty and natural gas
59:54
is far less so. So
59:57
moving on, last but not least, a
1:00:00
fun one. So we definitely talked about
1:00:02
this on the podcast, the Herculaneum
1:00:04
scrolls that were decoded in this challenge
1:00:06
called the Vesuvius Challenge, and the winning
1:00:08
team used AI to figure out
1:00:11
what was written on these scrolls that they
1:00:13
haven't been able to unroll for, I think,
1:00:16
centuries now. I think they were discovered in the 1800s
1:00:18
because they're essentially like
1:00:20
calcified by lava. Now
1:00:22
that they figured out a technology to
1:00:25
read what's on the inside of the
1:00:27
scrolls without physically unrolling them, they're discovering
1:00:29
all sorts of fun things. One
1:00:31
of them is the exact location of Plato's burial
1:00:33
place. They didn't know exactly where he was buried.
1:00:36
And they also found this fun fact that apparently
1:00:39
there was a slave woman that was playing
1:00:41
the flute for him on his deathbed. And
1:00:44
he said while running a high fever
1:00:46
and actively dying that she had a
1:00:48
scant sense of rhythm. So that was
1:00:50
just Plato's last words. Wow.
1:00:53
Yeah. Kind of like bitchy
1:00:55
at the end, you know? Like here he is. But
1:00:58
the math. All he can do.
1:01:00
Yeah, that's just amazing. But whatever. It's
1:01:02
very cool. I mean, it's definitely very cool what
1:01:04
we can discover. And I wonder if this will
1:01:06
lead to a whole slew
1:01:09
of both new data and a new way of reading.
1:01:11
Right now it's, like, ruins and rocks and some
1:01:13
writing, but it's a
1:01:15
lot of piecing together little fragments.
1:01:18
And you know, the idea that
1:01:20
this can unlock more texts
1:01:23
and that that can enrich our
1:01:25
sense of what happened thousands of years ago
1:01:27
is a really cool idea. Yeah,
1:01:29
I think there's going to be a lot
1:01:31
of like, we thought this, but really this
1:01:33
because it was one of the biggest
1:01:35
libraries at that time. So there's an enormous
1:01:38
amount of information to be discovered. Stay
1:01:40
tuned. Yeah.
1:01:43
So that's it for today. Hopefully one of
1:01:45
those things got you into a better
1:01:47
mood from all the doom scrolling that we normally do.
1:01:50
So please send us your comments.
1:01:53
Send us your tired, your hungry. No, we are
1:01:56
not. We are not Emma Lazarus and/or the Statue of
1:01:58
Liberty, but we are What Could Go Right?,
1:02:01
and much like Emma Lazarus and the
1:02:03
Statue of Liberty, we want to welcome
1:02:05
hopes and dreams and possibilities. So if
1:02:07
you have those and you want those
1:02:09
to be examined more, send us your
1:02:12
ideas. If you think that we are
1:02:14
glossing over things that we should focus
1:02:16
on more intently, send us those as
1:02:18
well. And thank you for listening. We
1:02:20
will be back with you next week.
1:02:22
Thank you, Emma. Thanks,
1:02:24
everyone. Thanks, everyone. What
1:02:27
Could Go Right is produced by
1:02:29
The Podglomerate, executive produced by Jeff
1:02:31
Umbro, marketing by The Podglomerate. To
1:02:34
find out more about What Could
1:02:36
Go Right, The Progress Network, or
1:02:38
to subscribe to the What Could
1:02:40
Go Right newsletter, visit theprogressnetwork.org.
1:02:44
Thanks for listening.