The Skeptics' Guide to True Crime with Dr. Steven Novella

Released Tuesday, 2nd July 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

Thank you so much to our

0:02

sponsor, Viia Hemp. These folks make

0:04

delectable gummies for you to enjoy

0:06

in THC and THC-free

0:08

CBD and CBN varieties. We love

0:10

the CBD and CBN options. There's

0:13

something for everyone, whether you

0:15

want to relax, get high, become

0:17

more productive, or boost your sleep

0:20

routine. Whatever mood you're trying to

0:22

create, Viia's got you covered. These

0:25

gummies are delicious and legal to ship

0:27

to all 50 states. If you're

0:29

like us, you might need some help settling

0:31

down at night to go to sleep. We

0:33

can be downright jittery after a day spent

0:36

running around looking into disturbing stories about crime.

0:38

I found Zen to be a nice THC

0:40

free option. CBN and CBD

0:42

helped me unwind as I tried to catch

0:44

some Z's. Viia's new

0:46

dreams formulations also can help with

0:49

sleep. They allow for a

0:51

fully customizable sleep journey, featuring two, five,

0:53

and 10 milligram options, depending on the

0:56

strength you're seeking. Head to viiahemp.com and

0:58

use the code MSheet to receive 15%

1:00

off, plus one

1:03

free sample of their award-winning gummies. That's

1:07

V-I-I-A, hemp.com, and

1:09

use MSheet at checkout.

1:12

Please support our show and tell them we sent you. Get

1:15

the rest you deserve with Dreams from Viia.

1:19

Worried about letting someone else pick

1:21

out the perfect avocado for your

1:23

perfect impress-them-on-the-third-date

1:25

guacamole? Well, good thing Instacart

1:27

shoppers are as picky as you

1:29

are. They find ripe avocados like

1:31

it's their guac on the line.

1:33

They are milk expiration date detectives.

1:35

They bag eggs like the 12

1:37

precious pieces of cargo they are.

1:39

So let Instacart shoppers overthink your

1:41

groceries so that you can overthink

1:44

what you'll wear on that third date. Download

1:47

the Instacart app today to get free delivery

1:49

on your first three orders. While supplies

1:51

last, minimum $10 per order. Additional

1:54

terms apply. This is

1:56

an interesting moment in true crime.

1:58

More and more people seem to

2:00

be trying to spread elaborate conspiracy

2:03

theories in an attempt to explain

2:05

the world or try to diminish

2:07

a favored defendant's guilt. We've begun

2:09

to see this again and again

2:11

in case after case. And

2:14

let's be clear, this wild talk is not

2:16

just about true crime, and it is not

2:18

just people in the audience who believe it.

2:21

You may recall something we recently reported

2:23

concerning Cara Wieneke, an attorney closely linked

2:25

to the defense in the Richard Allen

2:27

case. Ms. Wieneke has stated

2:30

that she doesn't believe we landed on the moon.

2:32

We laughed about that a bit on the program,

2:34

but let's be honest. It is

2:37

not funny when seemingly educated people

2:39

believe and spread ridiculous nonsense. Let

2:42

me make just one more point about

2:44

that. When someone says that we didn't

2:46

land on the moon, it is easy

2:48

for most of us to immediately recognize

2:50

that that person doesn't know what they

2:52

are talking about. But

2:55

the problem is that most nonsense

2:57

and lies are not as

2:59

easy to spot as that. So

3:01

how can we, as consumers of true

3:03

crime, identify what is true

3:06

and what is not? For that

3:08

to happen, we all need to brush

3:10

up on our critical thinking skills. And

3:13

we definitely include ourselves in that because as

3:15

you'll hear in this episode, there have certainly

3:17

been times when the two of us have

3:19

failed as critical thinkers. To get some

3:21

help with all of this, we decided to turn

3:23

to someone whose work has taught me a great

3:25

deal over the years. Dr.

3:28

Steven Novella is an academic clinical

3:30

neurologist at the Yale University School

3:32

of Medicine, but I've come

3:34

to know him best through his work

3:36

in movements associated with skepticism and critical

3:38

thinking. He hosts and produces the

3:41

podcast, The Skeptic's Guide to the Universe,

3:43

and he's also written a book of that same name. We

3:46

highly recommend both. He

3:48

also is, you will soon hear, an

3:50

articulate and witty speaker. Critical thinking

3:53

is a lifelong process, and frankly it

3:55

isn't always easy. But it is

3:57

better than the alternative. When we don't do

3:59

it, we make it easy for people to

4:01

fool us and take advantage of us. Looking

4:04

at it that way, you can almost

4:06

imagine critical thinking as a shield which

4:08

can guard us against nonsense and lies.

4:11

It is time we started brandishing that

4:13

shield as we consume content around true

4:15

crime. My name

4:17

is Áine Cain. I'm a journalist. And

4:20

I'm Kevin Greenlee. I'm an attorney. And

4:22

this is The Murder Sheet. We're a

4:24

true crime podcast focused on original reporting,

4:27

interviews, and deep dives into murder

4:29

cases. We're The Murder Sheet. And

4:32

this is The Skeptic's Guide to

4:35

True Crime with Dr. Steven Novella.

5:22

Well, let's start by asking an obvious

5:24

first question. Can you tell us a little bit about

5:26

yourself? Yeah, so

5:28

I'm Steven Novella. I am the host

5:31

and producer of a podcast called The

5:33

Skeptic's Guide to the Universe. We're

5:36

in our 20th year actually. So we've been doing that

5:38

for quite a bit of time. And we

5:40

talk about everything to do

5:43

with science, with critical thinking,

5:45

with pseudoscience, skepticism, weird fringe

5:47

beliefs, conspiracy theories, all that

5:49

stuff. You mentioned skeptics.

5:51

What is a skeptic? So we

5:55

use the term skeptic to refer to scientific

5:58

skeptics, which is a term

6:00

that was popularized by Carl

6:03

Sagan and others, you know,

6:05

of his ilk. So the idea is

6:07

that, you know, we want

6:09

to believe what's true, what's actually

6:12

verifiably true. We follow

6:14

the epistemology of science, you know, we

6:16

think that science is the best way

6:18

to figure things out. And

6:20

we promote not only scientific literacy and

6:23

a scientific worldview, but also critical thinking.

6:25

You have to understand something about how

6:27

our brains work, and

6:31

how we deceive ourselves, how

6:33

others can deceive us. So understanding

6:35

critical thinking is critical as well.

6:37

And also media savvy, we live

6:39

in a, you know, multimedia universe

6:41

now. So you have to understand

6:44

how to access information, how to evaluate

6:46

and assess information in order

6:48

to get to some

6:50

belief that's more likely to be true than not

6:52

to be true. You know, that's the ultimate goal

6:54

of things is to believe things as probably true

6:56

only when the evidence warrants it. And

6:59

I think that's what we actually are. Yeah, we

7:01

obviously are a true crime podcast. And

7:03

right now in the true crime world,

7:06

there's a lot of talk about

7:08

elaborate conspiracies. In one of the

7:10

cases we're covering some defense

7:12

attorneys are saying that the murder was

7:14

actually committed not by their client, but

7:17

by a religious cult of Odinists. And

7:20

it's really interesting to me that there are some

7:22

things going on in the conspiracy space, and other

7:24

sorts of conspiracies raised in the Karen Read case.

7:26

What is it about people that draws us to

7:29

these conspiracy theories? And

7:40

I think that there's a little conspiracy theorist

7:42

in all of us, right? There are some basic,

7:45

just human psychological elements

7:47

that do attract us

7:49

to conspiracy theories. And

7:52

a big one is something that

7:54

psychologists call apophenia, which is basically

7:56

a technical term for we see

7:59

patterns, right? You

8:01

know, our brains function

8:03

basically by noticing and picking

8:05

out patterns. Our brains

8:07

also have another function, though. I mean,

8:10

literally there are parts of your brain

8:12

that do this. So we are constantly

8:14

sifting through all of the data out

8:16

there in the universe, and this could

8:18

be visual, it could be auditory, it could

8:21

just be ideas, abstract, whatever; connecting

8:23

the dots, you know. Our brains are really

8:25

good at that, but we

8:27

also have a reality-testing filter. So

8:30

it seems, you know, as best

8:32

as we can tell, neuroscientists think

8:35

that our brains are

8:37

designed, are evolved, to, like, really

8:40

over-see patterns; we see patterns

8:42

everywhere. So we have a

8:44

very sensitive pattern recognition. It's

8:47

meant to see patterns that

8:50

aren't even there and

8:52

then we filter out the ones that are probably

8:54

not true with the reality testing

8:56

circuitry in our brain. So

9:00

with conspiracy thinking, part of it is, again,

9:03

and this is actually a big research

9:06

question: do people who tend to believe

9:08

in conspiracy theories, are they over-seeing patterns

9:10

or are they under-filtering them? Right,

9:13

at which end is the problem occurring?

9:17

I don't know that there's a definitive answer to that

9:19

It kind of depends on your research paradigm and how

9:21

you look at it. But it does seem,

9:24

there's some preliminary evidence that suggests, it's probably

9:26

an under-filtering problem. They don't,

9:29

we all see lots of patterns, but they

9:31

don't filter out the not-real ones.

9:33

So like, you might notice

9:35

some bizarro coincidence: I've seen

9:37

the number seven so many times today.

9:40

Most people will recognize that as

9:42

just a coincidence. But

9:45

some people will be like the universe is speaking

9:47

to me, you know. There's some magical thing going

9:49

on, because it can't be a coincidence: I see

9:52

this pattern and the pattern speaks

9:54

to me at a primal level, you know

9:56

And so I think that's kind of

9:58

where conspiracy theories come from. The

10:01

truth will set you free. We live by

10:03

that on The Murder Sheet. We're always looking to

10:05

get at the truth when we cover criminal cases,

10:07

when we're parsing through legal documents

10:10

and stories from survivors and detectives and

10:12

attorneys just trying to get the full

10:14

picture. So you can imagine why

10:16

we love to listen to Brittany Ard's quest

10:19

for the truth on the new podcast You

10:21

Probably Think This Story's About You. Brit

10:23

is all about getting an answer

10:25

to a deeply personal question. What

10:28

if the person you thought was

10:30

your soulmate never really existed? After

10:33

a chance meeting on the Hinge

10:35

Dating App, a man named Kanan

10:37

stole Brit's heart. She fell hard

10:39

for him, but he ended up dragging

10:41

her into a web of lies. The

10:44

Kanan she came to love was an invention,

10:46

a ghost. Brit's journey to piece

10:48

together this disturbing mystery isn't just

10:50

compelling. It's a raw look at

10:52

self-discovery and the power of coming

10:55

together to form a community through

10:57

shared grief and trauma. Listen

10:59

and follow You Probably Think This

11:01

Story's About You wherever you listen

11:03

to podcasts. But there's

11:05

a number of other psychological sort of self-reinforcing aspects

11:07

to it as well. You

11:10

know, conspiracy theories perpetuate

11:12

themselves because by

11:15

design they are insulated from

11:17

disproof, right? So once

11:19

you're inside a conspiracy, any

11:22

bit of evidence that contradicts the

11:24

conspiracy, well that was planted,

11:26

right? That's a false flag

11:28

operation. And any

11:30

bit of evidence that's missing, well

11:33

that's a cover-up. So there's

11:36

literally no way you can get out

11:38

of the conspiracy theory once you're fully

11:40

in it. It's an insulated

11:44

belief system. You know, that's why they

11:46

tend to perpetuate, like why do people still

11:48

believe we never landed on the moon when

11:50

the evidence is overwhelming? I mean it's overwhelming

11:52

that we landed on the moon. And

11:54

then, you know, you ask,

11:57

well, you know, try to just come up with a

11:59

coherent explanation for how it all happened. And

12:01

they really can't. It's just they do a couple of

12:03

things. They do what we call anomaly hunting, right? So

12:05

they look for things that are weird. And

12:08

then they just weave some

12:11

sinister story around

12:13

these apparent anomalies. Why

12:16

isn't the flag waving? Or, you know,

12:18

whatever they come up with. And

12:21

then they, again, that's where the apophenia then

12:23

comes in where they connect those dots. Then

12:25

you have your conspiracy theory. And then once

12:27

they're inside the conspiracy theory, they're done, right?

12:29

Then there's no way out. Yeah,

12:31

we've actually seen instances of cases where people

12:34

say the evidence here is too good. It

12:36

has to be a frame job. It

12:39

definitely seems like a self-fulfilling prophecy for some

12:41

of these folks. But I'm curious, you know,

12:43

broadly speaking, you've been part of the sort

12:45

of skeptical world for years. In

12:48

your kind of general view, do you

12:50

feel that people are becoming more skeptical

12:52

or less skeptical? Yes. I

12:56

do think it's both at the same time. There's

12:59

objective evidence to back that up. I mean, if you

13:01

look at, you know, at any

13:03

component of skepticism, scientific

13:05

literacy, people are actually getting

13:07

more scientifically literate. It's

13:09

still bad; I'm not saying it's good, but it's

13:12

like, oh, we've gone from 18% to 24% of people in

13:15

this specific test of scientific

13:18

literacy that was done over 20 years. So

13:20

it's incrementally getting a little

13:22

bit better. People's IQs

13:24

have actually been increasing by about

13:27

three points per decade for the

13:29

last 60 years, as long as we've been measuring

13:31

them. Not even sure why that is,

13:33

but, you know, but

13:35

that's happening as well. I think people

13:38

have a lot more information, right?

13:40

We process a lot more information. And

13:42

so people, you know, I think are

13:44

generally more savvy, but at

13:46

the same time, the forces

13:48

that are trying to deceive us are also

13:51

getting better, getting really

13:53

good at creating

13:55

a narrative and

13:57

then fine-tuning that narrative so it

14:00

pushes all of our emotional buttons and

14:03

then creating an information

14:05

ecosystem that

14:07

locks people into those

14:10

narratives. You

14:12

get to the point where you don't even realize

14:14

that the information that

14:16

you consume is being curated

14:19

for you by

14:22

massive organizations. I know it sounds like

14:25

a conspiracy theory, but this is actually

14:27

happening, because the market

14:29

forces and media forces are real.

14:32

I don't think it's one guy in

14:34

a room doing all this. Some of

14:36

it is organic. Some of it is just this is

14:38

what market forces do. But sometimes,

14:41

you do have the head of Fox

14:44

News who has a very specific editorial

14:46

policy that has a very specific purpose

14:48

behind it, and is literally

14:50

curating the news for

14:53

an audience and for a narrative.

14:56

If you're not aware of that, you think

14:58

it's just the news. You

15:00

don't realize that your reality is being

15:03

created for you. This

15:05

happens in totalitarian countries, obviously, but

15:07

even in an open country like

15:09

the United States or most Western

15:11

countries, we still have people feeding

15:14

us information unless you

15:16

really make an effort to

15:18

get a wide variety of

15:20

sources of information and double, triple, check

15:22

everything. Unless you're really

15:24

actively skeptical of everything you come

15:27

across, chances are you

15:29

are being herded into a certain narrative

15:31

by people who want to sell you

15:34

something, people who want you to vote

15:36

a certain way, people who want you

15:38

to feel or believe a certain way. Again,

15:41

I think a lot of it is

15:43

organic. It just emerges from the culture,

15:45

from society and the institutions and people

15:47

doing their own little thing, but it

15:50

still can be a very powerful force.

15:53

That's where you get to the point, like in the

15:55

United States today, we think things are really polarized. This

15:58

has been, again, well established; this is not just

16:00

my opinion. This has been researched. It's well

16:03

established that people at a certain

16:05

political end of the spectrum literally

16:08

consume different information with almost

16:10

no overlap, like no overlap

16:13

compared to people of the opposite

16:16

political persuasion. We're literally consuming two

16:18

completely different subsets of information. So

16:20

of course we're polarized. Neither

16:24

side could understand the other. Both

16:26

sides think the other side's crazy, because how

16:29

could they possibly believe that when there's

16:31

all this information that is

16:34

going in the other

16:36

direction? And they both think that the

16:38

other side's being lied to. I think

16:41

this existed hundreds

16:44

of years before social media, but social

16:46

media ramped it up by an order

16:48

of magnitude. And

16:50

we'll see what happens. I hope things will

16:52

sort themselves out and reach some sort of

16:54

equilibrium point. But nobody knows at this point

16:56

in time. We're just sort of taking it

16:59

as it comes. What

17:01

you say about people subscribing to

17:04

and getting their information from completely

17:06

different sources with different versions of

17:09

reality, we see a lot of that

17:11

in the true crime world as well. And

17:13

I guess I'm curious, with all these

17:15

sources of information people have, how can

17:17

they use critical thinking or the

17:19

principles of skepticism to figure out

17:21

where the truth is? Yeah.

17:24

So I wrote an entire book about

17:26

that. The Skeptics' Guide to the Universe

17:29

book is a primer on how to

17:31

do that. How do

17:33

we know what's real in the world today with

17:35

so many sources of information? One

17:37

of my favorite examples is Steve Jobs. One

17:40

of the richest men in the world,

17:42

at the top of an information company, with every

17:45

resource in the world, diagnosed

17:48

with a very treatable, very survivable

17:50

form of cancer, and

17:53

yet made personal health decisions

17:55

based on misinformation and ultimately

17:57

died. Now, we don't know for sure what would

18:00

have happened; maybe it wouldn't have made a difference.

18:02

I can't say that that affected the outcome,

18:04

but it absolutely could have. And

18:07

at the very least, we know that he delayed proper

18:10

treatment, science-based treatment, and

18:12

that may have been critical in his

18:15

outcome. How did that happen? How did

18:17

Steve Jobs not have access to the

18:19

right information? And it's because, again,

18:21

we're living in this world of different

18:23

narratives. And once you follow one

18:26

narrative, it

18:28

seems like the truth to you. So there's

18:31

a few good rules of thumb, like if you

18:33

want to be more skeptical. Again, I can't tell

18:35

you in two minutes how to do that. It's

18:37

a lifelong project. But there's a few

18:39

good tips on how to be skeptical, I

18:42

guess. So one is: question everything, obviously.

18:44

Don't take anything at face value. If it's

18:46

important, the more important it is, the more

18:48

you should question it. And how

18:50

do you question it? You should

18:53

especially question things that fit

18:55

your narrative. That's the

18:57

thing that people don't do. We

18:59

instinctively believe and accept things

19:01

that fit our narrative, our

19:04

beliefs, and question things that

19:06

don't fit our beliefs. That's confirmation bias.

19:09

Confirmation bias is powerful, massively

19:11

powerful. It makes us think that there's so

19:13

much evidence to support my view. It's like,

19:15

yeah, that's because you're only looking for and

19:17

accepting the evidence that does, and you're rejecting

19:21

or overlooking or dismissing the evidence

19:23

that doesn't. So do the opposite.

19:25

Be especially skeptical of your own

19:28

beliefs. And when information confirms what

19:30

you believe, that's

19:32

the information you need to be most

19:34

skeptical of. And then in

19:36

terms of sources, again, the big

19:39

rule of thumb is always go

19:41

back to the primary original source.

19:43

Don't accept somebody else's summary of

19:46

what a piece of information is or

19:48

whatever. Go back to the primary source always as

19:50

much as you can. It should be

19:52

a reflex. I hear something in the news. I'm like,

19:55

oh, I wonder where that information came from. You trace

19:57

it back to whatever the original source of the information

19:59

is. You often... find it's something very

20:01

different than the narrative that

20:03

was created for you or

20:05

the way it's being used. It's

20:08

never the same, right? It's always

20:10

different. It's always more complicated. It's

20:12

never as clean or as simple as it's being

20:15

presented. But you'll find that, okay, now I have

20:17

a much clearer idea of what's actually going

20:19

on. I'm seeing where the consensus

20:21

of opinion is, you know, try to get

20:23

to objective sources as much as possible. Find

20:25

out what the other side is saying and

20:28

why and see who has the better evidence,

20:30

the better argument. It's a process. There's no

20:32

one trick to it. It's a process. Again,

20:35

the more important it is to you, the more effort

20:37

you should put into it. If you're making a life

20:39

or death health decision, I would put a lot of

20:41

effort into trying to figure out how

20:44

reliable the information is

20:46

that you're seeing. So if you at least

20:48

have a process, right, don't just go with

20:51

the flow of this is my

20:53

tribe, this is what we believe, this is

20:55

the information that supports what we believe. Everyone

20:57

else is crazy and that's good enough for

20:59

me. You know, chances are you're probably not,

21:02

you know, getting to reliable

21:04

conclusions if that's your process, just

21:06

believing whatever the tribe says,

21:08

whatever the narrative is. You probably didn't

21:10

just happen to fall into the one true

21:13

narrative in the world, right? Yeah,

21:15

that would be pretty, pretty lucky. And

21:19

one thing you mentioned, I want to go back to

21:21

that I thought was really interesting, with Steve Jobs. Obviously,

21:23

you know, we kind

21:26

of all regard him as a very smart

21:28

man, a very visionary man with technology and business.

21:30

So is being a good critical

21:33

thinker the same thing as being intelligent and

21:35

smart? Or can you be an intelligent and

21:37

smart person and terrible at critical thinking? Yeah,

21:39

that's a great question. So intelligence is

21:41

many different things. And then

21:43

neuroscientists, psychologists, whatever, don't even like

21:45

to use the term very much. Because

21:48

it's, it's a loaded term,

21:50

like IQ testing, what are you

21:52

actually measuring? What is it? And

21:54

we recognize that it's multifaceted.

21:57

And you absolutely could be intelligent in

22:00

one way and not in another,

22:02

because there's different skill sets that we

22:05

would all think of as, quote

22:07

unquote, intelligence. There is, you know, some

22:09

psychologists will use terms like emotional intelligence,

22:11

right, to represent your ability to like

22:13

pick up on social cues and think

22:15

about interpersonal relationships and things like that.

22:18

And certainly there are people who may

22:20

have very high engineering, pragmatic,

22:23

mathematical, whatever intelligence who had

22:25

very low emotional intelligence. And

22:28

absolutely, critical thinking is

22:30

its own skill set. It

22:32

is a very involved skill

22:34

set. It's not easy. It's very hard to

22:37

do. It takes vigilance. It takes the ability

22:39

to look at your own beliefs and go,

22:41

I could be wrong. I am wrong. I

22:43

know I'm wrong. The question is how, how

22:45

wrong am I? And in what way am

22:47

I wrong? And how can I become a

22:49

little bit less wrong, you know, by trying

22:51

to, you know, be open to

22:54

new perspectives, new information, whatever. Did that

22:56

answer your question? Yeah. Okay.

22:58

It sounds like, and correct me if I'm

23:00

wrong, but it almost sounds like critical thinking

23:02

is a bit like a muscle. If you're

23:05

not applying it again and again, it

23:07

can atrophy and not be super helpful. It's

23:09

something you should be

23:12

applying across your life, which may be hard

23:14

for some people. Yeah, it's a

23:16

pattern of behavior too. It's what we call

23:18

metacognition, right? So you're thinking about your own

23:20

thought process. You're thinking about

23:22

it critically. Like you're always trying to second

23:24

guess yourself and prove yourself wrong in a

23:27

way. And those ideas

23:29

which survive your attempts to

23:31

prove them wrong, all right, those

23:33

have some value. You know, I can't immediately disprove

23:36

it. If you really want to go on this

23:38

skeptical journey, it's not, it's not just saying, all

23:40

right, I'm going to be skeptical. I'm going to

23:42

question things. It's like, it's a

23:44

massive skill set. Like psychologists

23:47

have been studying this for 100 years. Neuroscientists

23:49

now, we know a lot more about how

23:51

the brain works. There's a lot

23:53

of literature. You can go deep into

23:55

the weeds just on conspiracy thinking, for

23:58

example, or just on How

24:00

do you tell real science from

24:03

pseudoscience, or what's the actual difference

24:05

about the philosophy of science, neuropsychological

24:07

humility, how your brain

24:09

deceives itself and instructs reality, and

24:11

how your memory is flawed. There's

24:13

so many different sub-areas of information,

24:16

and you could fall victim to

24:18

any one of them. One of

24:20

the things, as skeptics, we remind

24:22

each other and ourselves, the ultimate

24:25

bias, the ultimate way that you

24:27

deceive yourself, is thinking that

24:29

you're a critical thinker, and therefore you

24:31

can't be deceived. Once

24:34

you think, oh, I've arrived, I'm a critical thinker,

24:36

I'm a skeptic, no, you can't fool me, that's

24:39

the easiest person to fool, is the one who

24:41

thinks they can't be fooled. Ask

24:43

any magician that, they will tell you that. The

24:46

person who thinks they're too smart to be fooled, that's

24:48

the easiest person to fool. They

24:50

love scientists as an audience.

24:54

And we see this all the time, where you

24:57

have a scientist who's brilliant, brilliant

24:59

scientist who falls for

25:02

complete chicanery, because

25:04

it's not their skill set. It's a different

25:06

skill set. They're not used to nature

25:09

actively trying to deceive them, so they don't

25:12

know how to deal with that. It's

25:16

a hundred skill sets, and any one

25:18

of them can trip you up if you're not aware

25:20

of it. Again,

25:23

it's a journey you will never get

25:25

to the destination of. You have to

25:27

constantly be questioning and

25:29

trying to improve how you

25:32

process information, and questioning your own

25:34

biases. Even for somebody who's

25:36

been doing this my entire adult life, there

25:38

are just biases in the way that

25:40

we look at reality, and the value

25:43

judgments that we make, and it's really

25:45

hard to weed that completely out of

25:47

your thought process. We're biased

25:49

in ways we're not even aware of. So

25:52

you always have to be, always,

25:55

first and foremost, open to

25:57

the possibility that you're wrong, and just always try

25:59

to get to a more sophisticated

26:01

or nuanced version

26:03

of what you think about things. Obviously

26:07

from what you're saying it's clear that it's

26:09

not easy to be a critical thinker. It's

26:11

a process. It takes a lot of

26:14

hard work. So I'm sure some people

26:16

in our audience might be wondering what

26:18

makes it worth it. What's the

26:20

harm if I want to believe we never went to the

26:22

moon? Yeah, so believing in magic

26:24

is a really dangerous thing, right? So

26:27

one aspect of organized skepticism,

26:30

a massive aspect of it,

26:32

is consumer protection. And

26:34

there's many layers to that as

26:36

well. So ultimately, again, I mentioned

26:38

Steve Jobs. Steve Jobs, you

26:40

know, arguably, died prematurely because

26:43

he fell for this "natural is

26:45

better" narrative that sounds good superficially.

26:48

But it's total BS when you

26:50

really dig deep into it from

26:52

a scientific and critical thinking perspective.

26:54

But it sounds good. And it's

26:56

very persuasive, and pervasive, especially in California.

27:00

So it kind of fit with the culture that he

27:02

was embedded in. So, you know,

27:04

we make decisions individually, as

27:06

a family, as a group,

27:08

as a society, all

27:11

the time, informed by science and

27:13

critical thinking. You know, should

27:15

we be investing in green energy? Or

27:17

is global warming a hoax

27:19

and a conspiracy? Right? You know,

27:22

can I trust institutions? Like, can I

27:25

trust the FDA to evaluate

27:27

medicine? You know, should I take medications or

27:29

not? Or are they all poisons that are

27:31

going to kill me? Because the pharmaceutical industry

27:34

is trying to keep me sick. Which

27:36

one of these things do you believe? So we make, you know,

27:39

these life or death decisions all the time,

27:41

or decisions that have massive effects on our

27:43

life, on our prosperity, on our society,

27:46

where critical

27:48

thinking and scientific literacy are

27:51

absolutely critical. So,

27:54

also, when people

27:56

ask that question, why, what's the harm? Well,

27:58

what's the alternative to being a

28:00

critical thinker? The opposite

28:02

of being a skeptic, of being a critical thinker,

28:05

is being gullible. But

28:07

nobody will say, yeah, I'm gullible. I'm okay

28:09

with being gullible. But that's really what you're

28:11

saying, right? If you're not a critical thinker,

28:13

you are gullible. That's what the word means.

28:16

And that means you're vulnerable to any con

28:18

artist out there. And there's

28:20

lots of con artists out there. There's 8

28:22

billion people in this world. You

28:25

know, 1% of them are psychopaths or sociopaths.

28:28

That's a lot of people. There's a lot

28:30

of people out there that have

28:32

no compunction about stealing all of

28:34

your money. We're swimming in scams

28:36

right now, right? I mean, I

28:39

get scam emails and texts every

28:41

day, everybody does, and phone

28:43

calls. I'm constantly being assailed

28:45

by people attempting to con me

28:47

in one way or another. You

28:50

have to be skeptical in today's world. You have

28:52

to be. Otherwise, you're vulnerable as

28:54

hell, right? For my

28:56

next question, I'm gonna out us on

28:58

our embarrassing critical thinking failures. So

29:00

I know when I

29:02

was... We all have them. Yeah, well, I'm gonna

29:04

ask you yours, but I want to do ours first,

29:06

so you don't feel like we're putting you on the spot. So mine

29:09

was: when I was just

29:11

a consumer of true crime, I would agree

29:14

with whatever podcast or documentary I was watching,

29:16

unless it was really egregiously bad. And

29:18

over time, I'm like, I'm just kind

29:21

of taking in this information uncritically. I'm a

29:23

journalist, I'm a history major, I would never

29:25

assess anything like this at work. But it's

29:27

like, I like this podcaster, so they must

29:29

be right. So that's my embarrassment. You want

29:31

to tell me yours? Years ago, I

29:33

fell prey to Kennedy assassination conspiracies. Yeah,

29:37

a waste of time, but analyzing

29:40

them did teach me a lot about critical thinking.

29:42

So I want to ask, is there a time in your

29:44

life where you've kind of failed in the

29:46

critical thinking space? Oh, yeah, when I was younger,

29:48

like, from when I was

29:52

a teenager into my 20s,

29:54

I believed in everything. I believed

29:56

in UFOs, Bigfoot, you know, the

29:58

whole thing. Basically, I was raised

30:00

on In Search Of... with Leonard Nimoy.

30:02

I was like, Spock is telling me

30:04

that this is correct. And

30:07

the thing is, like, you know,

30:09

these pseudo-documentaries; I was a

30:11

total science documentary junkie. I'd watch

30:13

any science show on TV. And

30:15

the pseudo scientific ones were just

30:17

as slick, just as compelling. It

30:19

was adults with some authority

30:21

figure saying these ancient astronauts made

30:23

these lines in the desert, and

30:25

I believed all of it until

30:27

I started to learn, you

30:30

know, more science and more critical

30:32

thinking. And the first

30:34

one to fall was the whole UFO

30:36

thing, like aliens were visiting the earth.

30:38

And then once the first domino goes,

30:41

and you realize, all right, this is

30:43

bullshit. And yet, there's this

30:45

whole infrastructure, a

30:47

whole ecosystem of belief in it.

30:51

You know, so if that could be a

30:53

lie, right, that could all be just self

30:55

deception and pseudoscience. How do I know that

30:57

any of these other things that I've been

30:59

believing are correct? Then you start to examine

31:01

things one by one, and they just, all

31:04

of the pseudoscience, you know, just collapses one after

31:06

the other. But then

31:08

you get to more and more nuanced things, right.

31:11

So we start off as what we

31:13

in the community call Bigfoot skeptics, which

31:15

is meant to be a

31:17

derogatory term, but it's really perfectly legitimate.

31:19

It's just that's kind of where you

31:21

start off, like the low hanging fruit

31:23

is, yeah, there's no breeding

31:25

population of giant primates in North America

31:27

that we have not

31:29

been able to find for the last 60 years.

31:32

Like, maybe you had a point when

31:35

it was brought up in the 1960s.

31:38

But I mean, come on, it's like,

31:40

hey, whatever, however many years later,

31:43

there's this seven-foot

31:45

primate living in Oregon, and

31:47

no one's been

31:49

able to find a single piece of

31:51

convincing evidence. It boggles the

31:53

mind, right? There are still

31:56

shows about finding Bigfoot, though. It's amazing. So

31:58

anyway, you start to think about things

32:00

like that and you realize

32:03

how deception gets built into

32:05

the culture in so many ways. But those

32:07

were mine. It was a good experience. I

32:10

think I'm a better skeptic because I was on

32:12

the other side of it at some point. I kind

32:14

of know how people think and how

32:17

they get into that and how

32:19

you sort of cocoon off your

32:21

beliefs and dismiss skepticism

32:23

until you can't anymore. But

32:27

also, there's just the ability; it's very

32:29

liberating. It's very freeing. Once

32:31

you realize, I don't have to believe anything. I

32:34

can decide for myself what to believe

32:36

based upon logic and evidence.

32:39

And in a way, it is extremely

32:41

freeing. You

32:44

use the word pseudoscience in case some

32:46

members of our audience aren't familiar with

32:48

it. What is pseudoscience

32:50

and how is it different from

32:53

science? Yeah, it's another good

32:55

question. First of all, there's what we

32:57

call, what philosophers call, the demarcation problem,

32:59

which is a fancy way of saying

33:01

there's no sharp line between science and

33:04

pseudoscience. It's a continuum. It's

33:07

more like what are the features

33:09

of good science versus pseudoscience. The

33:11

more features of pseudoscience you have,

33:13

the more you are towards that

33:15

end of the spectrum. But sometimes

33:17

even legitimate science will use some

33:19

kind of squirrely techniques. It's

33:23

not these two clean, sharp categories.

33:25

But the features of pseudoscience are basically, they're

33:27

going through the motions, so like pretending to

33:30

do science, but they're not doing the

33:33

real spirit of science. So

33:35

the big thing is that pseudosciences

33:37

generally start with the conclusion and

33:39

they work back from there. Unfortunately,

33:42

like in the legal system, like the

33:44

true crime area, that's a lawyer's job.

33:46

The lawyer's job is to start with the

33:48

conclusion and work backwards. So

33:50

in a way, like it doesn't surprise me that

33:52

a lawyer is defending a conspiracy theory. They don't

33:54

care. They don't even have to believe it. That's

33:57

not their job. Their job is to make whatever defense

34:00

they can for their client, right? But

34:02

if you're doing science, you can't do that;

34:05

you can't start with the conclusion and work

34:07

backwards. They do things like: they

34:09

only look for evidence that supports their

34:11

hypothesis rather than trying to disprove their

34:13

hypothesis, which is what a legitimate scientist

34:15

would do. They do things like dismiss

34:18

evidence that doesn't support their hypothesis, or

34:20

they find reasons to ignore it or dismiss

34:23

it. And their methods are terrible, right? They

34:25

just don't use good double-blind,

34:27

controlled methods. You know, they

34:29

will use terms differently in different

34:31

contexts. They won't be measuring things

34:33

properly. Whatever it is, they

34:35

make so many mistakes that basically

34:37

their data is meaningless or uninterpretable,

34:39

or, you know, they

34:42

can twist it to

34:45

say whatever they wanted to say, right?

34:47

So they're not really asking questions. They're

34:49

just twisting the

34:52

whole process and the data to fit

34:55

the conclusion that they want

34:57

to have. I was

34:59

curious. This is something that, sort of,

35:02

I think of when I look at history:

35:04

sometimes you have, you know, instances where

35:06

experts, people who are trusted, either in

35:08

politics or science or medicine, you know,

35:11

any field really, you know, do

35:13

betray trust, or betray the public's trust, or just

35:16

make a bad call. You know, like I think

35:18

of pushing opioids in the early 2000s, right? You

35:21

know, they were kind of a miracle drug and then all of

35:23

a sudden, you know, we have an epidemic.

35:26

So people, I think, right

35:28

now especially, are very skeptical, or maybe

35:30

skeptical is the wrong word, but very

35:32

much dismissive of experts. How,

35:35

when we're being skeptical, when we're applying

35:37

critical thinking, can we hear

35:40

out experts without necessarily dismissing them,

35:42

but also leaving room for, you

35:44

know, I guess, not believing

35:46

everything wholeheartedly? I mean, what's the balance? It's kind

35:48

of a big question, I suppose, but

35:50

what are your thoughts on that? Yeah, that's

35:52

the trick, isn't it? It's knowing

35:54

how to respect expertise but

35:57

not idolize an individual expert, right? So

36:00

one way is to not

36:02

invest authority in one person,

36:05

right, or one group or one

36:07

institution. You want to,

36:09

as much as possible, rely

36:11

upon a consensus of opinion

36:14

among many different individuals, as

36:17

much as possible, right? So any person

36:19

could be wrong, even any scientist could

36:22

make a bad call, could have a bias,

36:24

you know, could have a conflict of

36:26

interest, could get desperate and cheat because he

36:28

thinks he knows what the answer is, but he's having

36:30

a hard time getting the data

36:33

to do what he wants it

36:35

to do, whatever, it happens. So

36:37

there's never any authority in one

36:39

individual. But if you have

36:41

a broad consensus of many different

36:43

individuals bringing to bear independent lines

36:45

of evidence, right, so like the at the other

36:47

end of the spectrum, right, you have, you

36:50

know, hundreds of experts

36:52

with hundreds of studies, thousands

36:54

of studies, multiple independent lines of evidence,

36:57

all pointing towards the same conclusion, that's pretty

36:59

rock solid, right? That's pretty reliable. And then

37:02

of course, there's everything in between, it's a

37:04

spectrum. And you have to make a decision

37:06

like, what are what are

37:08

people saying about this? How controversial is

37:10

the claim? How politically charged is it,

37:12

right? If it's a pretty mundane claim,

37:15

that's not a political football

37:17

at the moment, and you have

37:19

somebody who is clearly a recognized

37:21

expert who's making pretty ordinary claims

37:23

about something, they're probably independent lines

37:25

of evidence, all pointing towards

37:27

the same conclusion. That's pretty rock solid, right?

37:30

That's pretty reliable. And then of course, there's

37:32

everything in between, it's a spectrum. And you

37:34

have to make a decision like, what

37:37

are what are people saying about this?

37:39

How controversial is the claim? How politically

37:41

charged is it, right? If it's a

37:43

pretty mundane claim, that's not a political

37:46

football at the moment. And you

37:48

have somebody who is clearly a

37:50

recognized expert, who's making pretty ordinary

37:52

claims about something, they're probably accurately

37:55

reflecting the evidence, although even then, you're

37:57

getting their view of the science, right?

38:00

You'd always want to know, do other experts agree

38:02

with you? You know what I

38:04

mean? That could be on

38:06

anything. There's lots of just pure scientific

38:09

controversies that don't deal with pseudoscience

38:11

or anything else. Just did an

38:13

asteroid wipe out the dinosaurs or was it something

38:15

else? There's experts who have the

38:17

minority opinion that, no, it was

38:20

the volcanic eruptions in the Deccan

38:22

Traps, and the asteroid was

38:25

incidental or whatever. 95%

38:29

of scientists are saying that it was the asteroid.

38:31

So maybe that's probably true. But it's always good

38:33

to know what the minority opinion is and to

38:35

recognize that there is one and there's debate about

38:37

it. So I say it's tricky.

38:39

But again, big rule of thumb is trust

38:42

in consensus more than any individual

38:44

because individual opinions could be quirky,

38:46

they could be biased, they could be

38:48

flawed. Anybody

38:51

could be wrong on any given day and you

38:53

just got to make a judgment call based upon

38:56

that, but also be open to change as the

38:58

data changes, as expert opinion changes. But

39:01

also avoid the temptation to reject an expert

39:03

because you don't like what they have to

39:05

say because it conflicts with your narrative, your

39:08

tribe, your worldview. Don't cherry

39:11

pick your experts. It's very easy because

39:13

there's so many people out there with

39:15

varying degrees and levels of expertise. You

39:17

can find a quote unquote expert to

39:19

say anything. You

39:23

shouldn't just pick the ones that agree with you. Don't

39:26

start with the conclusion and then pick your

39:28

expert to support that conclusion. That's

39:31

not going to get you to the right answer. You want to just

39:33

say, start with the experts.

39:35

What are they saying? Is

39:37

there a consensus? How solid is it?

39:39

Who disagrees? Why do they disagree? Maybe

39:43

we just don't know the answer at this point in time.

39:45

But again, there's a process. You

39:47

follow a scientific objective process and

39:50

you're more likely to get to

39:52

an answer than this is what I

39:54

want to believe. This guy over here agrees with me. Well,

39:56

there you go. I have an expert who

39:58

agrees with me, so I'm right. Yeah,

40:00

it's scary because that can lead to

40:03

wrongful convictions. We actually recently covered a

40:05

case of a man who was accused

40:07

of a crime. Prosecutors thought he did

40:09

it. They find an expert to say,

40:12

blood spatter. Turns out it's just total

40:14

pseudoscience. He was eventually acquitted after a

40:16

series of appeals, but it's

40:18

actively dangerous when it gets applied to the

40:20

legal system. I also

40:22

wanted to ask you about something

40:24

else, which is logical fallacies. I

40:26

find it useful in evaluating people's

40:28

argument to look and see if

40:30

they're using any logical fallacies. What

40:33

are some of the logical fallacies that people

40:35

use? Yeah, there's a chapter in our book,

40:37

just on logical fallacies. There's a lot of

40:40

them. We have an article on

40:42

our site, The Top 20 Logical Fallacies, but there's a

40:44

lot of them. And

40:47

there's more or fewer, depending on

40:49

whether you're a lumper or a

40:51

splitter, but there are some basic

40:53

ones. The mother of all logical

40:55

fallacies is the non sequitur, which

40:57

just means that the logic doesn't

40:59

follow. So anytime you make a

41:01

conclusion that does not follow from the premises,

41:04

it's a non sequitur. To

41:06

clarify, these are informal

41:08

logical fallacies. So informal

41:10

logical fallacies mean they don't say anything absolutely

41:12

about the conclusion. It's just a good rule

41:14

of thumb, as opposed

41:16

to the formal logical fallacies where

41:19

they're 100% always incorrect.

41:22

If you say one equals two,

41:24

and two equals three,

41:26

therefore one equals three. That's a formal

41:28

logical fallacy. It's math. It's always wrong.

41:30

But an informal logical fallacy is more

41:33

like a, it's just a clean

41:35

way to think versus a sloppy way to think.

41:39

So if you say, for example, this is

41:41

true because this one expert over here says

41:43

it's true, we call that an argument from

41:45

authority. That one expert could

41:47

be wrong. You can't say it has to

41:49

be true because this one expert says

41:52

it's true. Or you could say, well, this

41:55

may be wrong, but that person's also doing it wrong.

41:57

So it's okay. You know, that's the tu quoque

42:00

logical fallacy. So you can't say that; you could

42:02

both be wrong, right? The fact that some other guy's

42:04

doing it wrong doesn't mean that it's magically right

42:06

for you to do it. You can

42:08

make what we call an argument from final consequences.

42:10

Like this is wrong because if it were true,

42:13

that would be bad. It's like, well, it could

42:15

be bad, you know, it doesn't mean it's not

42:17

true. So anyway, there's a

42:19

huge list of them. And the

42:21

one thing I always like to remind people of

42:25

about that is the use of

42:27

logical fallacies as a tool, not a

42:29

weapon, right? So when people learn

42:32

the logical fallacies, the first thing you do

42:34

is use them as a weapon against other

42:36

people. But that's not really

42:38

what they're for. They're for you to police your

42:40

own thinking to make sure that you're thinking in

42:42

a clear and logical manner, that you're

42:45

not falling for these mental

42:47

shortcuts that may superficially make

42:50

sense, but are not logically

42:52

valid. Right? And

42:54

then there's actually a fallacy for

42:56

that called the fallacy fallacy. It's like, oh,

42:59

look, I can frame what you said

43:01

as if it was a fallacy, therefore

43:03

you're wrong. It's like, no,

43:05

you could twist anything; these are informal

43:07

logical fallacies. So, like, I could say,

43:10

well, 97, 98% of scientists think that the

43:15

planet's warming because of the manmade release

43:17

of CO2. And someone else will say

43:19

that's an argument from authority. Well,

43:22

not really. The

43:24

scientists are basing their opinion on evidence and

43:26

analysis; I'm just telling you what

43:28

all the experts think. But their

43:31

thinking is based upon, you know, evidence.

43:33

So anyway, that's a

43:36

legitimate reference to authority as opposed

43:38

to an argument from authority fallacy,

43:40

which is this one guy that

43:42

I found agrees with this opinion,

43:44

or an ad hominem, right is another

43:47

logical fallacy where you're

43:49

wrong because your breath smells, or whatever.

43:51

But sometimes, like, saying

43:53

this guy's a convicted con artist is

43:55

a legitimate thing to point out like,

43:57

I wouldn't put any trust in this

43:59

guy, in what he's saying;

44:01

he was convicted of lying to con people

44:03

out of things, but you could

44:05

say that's an ad hominem attack. So

44:08

again, you can frame anything as a

44:12

fallacy if you try hard enough, and

44:14

that's why these are informal logical fallacies.

44:16

It is okay to put his history into

44:18

context as long as

44:21

you're not saying he's wrong because

44:23

he's a con artist. You should say

44:25

he's wrong and he's a

44:27

con artist, which is probably why he's saying

44:29

this, but you want to then say, here's

44:31

the reason, the factual reasons why I think

44:33

he's wrong. So logical

44:36

fallacies are tricky to use. They're easy

44:38

to, again, deceive yourself into thinking, I'm

44:40

a critical thinker because I could name

44:42

logical fallacies. The

44:45

best way to approach them is to use them as

44:47

tools to help you think more clearly. And

44:50

don't just use them as a weapon, because it's

44:52

so easy to abuse if

44:55

your only point is "I gotcha" in a debate.

45:00

One more point on this is that when

45:03

you are having a discussion with somebody about

45:05

something, if you take the debate approach, I'm

45:07

going to prove you wrong, that's

45:09

not really going to get you very far

45:12

because again, if that's your goal, if you're

45:14

going to lawyer the topic, right, if that's

45:16

your goal, you can

45:19

frame everything as a fallacy in

45:22

a sinister way, whatever. But

45:24

if your goal is let's both figure

45:27

out what our common ground is, try

45:29

to build what we can, what we

45:31

actually know together, examine both of our

45:33

positions to see where the facts align

45:36

and where maybe we disagree and then

45:39

and then figure out what the right

45:41

answer is not who's right, but what's

45:43

right. It's a much more

45:45

useful approach. I'm

45:48

curious, you mentioned some of the areas where,

45:51

you know, trying to adopt more critical thinking

45:53

could even be a bit of a pitfall

45:55

for some people because they're almost adopting the

45:57

trappings rather than the real core ethos. And

46:00

I'm curious, do you have any tips

46:02

for somebody who wants to get started trying

46:04

to apply more critical thinking in their lives,

46:06

where to begin without falling into those traps? Yeah,

46:09

first you buy my book, it's

46:11

The Skeptics' Guide to the Universe, and read it twice. No,

46:14

I mean, it's a primer; that's

46:16

why we wrote the book: because people ask us

46:18

that question, how do I start thinking critically? It's

46:21

like, well, here's your primer. This will lay it

46:23

all out for you. And there's other ones, Demon

46:25

Haunted World was a good one to

46:27

start with. We kind of wrote our

46:29

book as an updated version of the Demon Haunted

46:31

World. Why People Believe Weird Things by

46:33

Michael Shermer is still a great sort of primer

46:35

book that was more from the, I think the 90s.

46:39

So, every now and then somebody writes a book about

46:41

this, Nonsense on Stilts

46:43

by Massimo Pigliucci. There's a lot of

46:45

great books out there that go over

46:48

science versus pseudoscience, critical thinking skills, basic

46:50

skills, packaged in slightly different ways. They're

46:53

all good, there's many good books out there. So

46:56

that's always a good place to start. There's like a

46:58

book level, here's everything. There's

47:00

a lot of activists, science communicators and

47:03

skeptics out there who are breaking

47:05

down the news

47:07

on science, critical thinking, and pseudoscience, from

47:11

many different perspectives. So that's

47:13

good. I find it very useful just

47:15

to get into discussions with people, you

47:17

know, and again, but

47:19

with the approach of let's figure

47:22

out what's right. You know, how do we know

47:24

what's right? And let's go through a process and

47:26

try to figure out if we can figure

47:28

that out together, right? You know, especially, I

47:30

especially love talking to people with whom I

47:32

disagree, right? It's kind of boring

47:34

to talk with people that agree with everything

47:36

that I think. So, but

47:38

if I talk with somebody who has a

47:40

completely different viewpoint for me, I want to

47:43

know, why do you think that? What thought

47:45

process led you to that? Why do you

47:47

think I'm wrong? Can

47:49

we make any progress

47:51

sort of figuring, you

47:53

know, resolving our differences? You learn a

47:55

lot. You know, sometimes even if you

47:57

know something is correct, it

48:00

doesn't mean you could defend it against a

48:02

dedicated attack or a dedicated attempt at

48:06

proving it incorrect. Another

48:08

way to put it: knowing the science of something

48:10

doesn't mean you know the pseudoscience

48:12

automatically. So historically,

48:15

a great example of this is

48:18

that there were

48:20

several creationists, multiple creationists, who

48:22

tried to make their careers debating evolutionary

48:25

biologists about creationism and evolution.

48:28

Duane Gish is the most infamous of

48:30

them. As a debater,

48:32

he tended to win. The

48:35

scientists lost because they would go into

48:37

it thinking, well, I know way

48:40

more about evolutionary biology than this guy

48:42

does, so I could handle

48:44

anything. But what they didn't know was

48:47

the pseudoscience of creationism. They didn't know

48:49

what arguments were going to be leveled

48:51

against them and how facts were going

48:53

to be twisted and how logic was

48:55

going to be subverted. And so

48:57

they weren't prepared for that and they just got overwhelmed

49:00

by that. But

49:03

if you actually get into a conversation

49:06

with them about it, it's almost like

49:08

an investigation unto

49:10

itself, like a forensic examination. Where

49:13

is their thought process going wrong? Or

49:15

where is my thought process going wrong? And

49:18

if that's your approach, you learn a lot

49:20

of critical thinking from doing that. Yeah,

49:24

I love that kind of collaboration. I'll tell

49:26

you, I mean, I've kind

49:28

of distrusted debate ever since I was in

49:30

a college course where we had a

49:32

debate, basically

49:34

in the context of the Byzantine Empire, on whether or

49:37

not the art should be destroyed. And we were

49:39

on the anti-art side and it was like, how

49:41

are we going to win this? This is a

49:43

class where we all love Byzantine art. I just

49:45

got in their faces and just started accusing them

49:48

of writing biblical fan fiction and saying we were all going

49:50

to be punished by God. And we won somehow

49:52

just by yelling louder. And

49:54

it just kind of underscored that, you know,

49:57

whoever's louder, savvier,

50:00

or slicker is not necessarily the person

50:02

who's right. Yeah, debate is

50:04

its own skill set. And you could be a really

50:07

good debater, even if

50:09

you don't have facts or logic on your side.

50:11

It's a performance, you know, more than anything

50:13

else. And the

50:15

courtroom is very much a performance as

50:18

well. I've had many interactions

50:20

with the legal system myself as an expert

50:22

witness. I was sued at one point for

50:24

an article that I wrote. We won. We

50:27

got a judgment in our favor, but I

50:29

had to actually pay most of my legal fees. But

50:33

you learn a lot about the legal

50:35

system through those various interactions. And

50:37

the way the legal system is set up, as

50:39

you guys know, right, it's not like, let's all

50:41

figure out together what the truth is. It's an

50:43

adversarial system. You have to do

50:45

everything you can to prove this guy guilty. You

50:47

do everything you can to prove him innocent. There's

50:50

strengths and weaknesses to that system, right? And the

50:52

weakness, I think, is that it sort of encourages

50:55

that approach, this adversarial

50:57

approach. And I've sat

51:00

across from lawyers in

51:02

depositions or whatever where they made arguments like,

51:05

I know you know that that's bullshit, right?

51:07

They don't believe that for a second, but it's the

51:10

argument to make for their side,

51:12

right? So they do it. And

51:15

the way they rationalize it is like, well, it's the

51:17

jury or the judge, they'll sort it

51:19

out. We're just doing our part. And

51:22

it's true. The system is set up that

51:24

way. And I'm not blaming them. That's the system. We

51:27

need to decide if that's the system we want.

51:29

I don't know that that's the optimal system, but

51:32

it's the system we have. So within that system,

51:34

they're playing their role. The good thing about the

51:36

legal system, though, the thing that's really a strong

51:38

point and the reason why it works is

51:41

because there are rules of evidence, right?

51:44

So all the logical fallacies

51:46

that I'm talking about and all the

51:48

evidentiary stuff, there are very strict rules

51:50

of evidence in a courtroom. Again, they

51:53

may not be perfect. They may not

51:55

be complete. I think they have

51:58

a lot of issues with how science is introduced to

52:00

the courtroom. But at

52:02

least there's rules of evidence. You can't

52:04

just bullshit your way through a case,

52:06

right? You have to have sources

52:09

for your claims, whatever. You

52:11

can't introduce ideas that have

52:13

not already been established, whatever.

52:15

There's a lot of shenanigans

52:18

that you cannot do that a competent judge

52:20

would not allow you to get away with

52:22

or a competent attorney

52:24

on the other side will know when to object. Like,

52:26

oh, if they're breaking the rules of evidence, you can't

52:28

do that. So that, I think,

52:30

is the strength of the system. The

52:33

adversarial part is kind of a

52:35

plus-minus. And the relationship

52:38

with science is, I think, weak.

52:40

It needs to be strengthened. I

52:42

want to underscore what you said

52:44

there about attorneys sometimes knowingly making

52:46

arguments they know are false, just

52:50

because that's their job. And I think people

52:52

need to remember that and keep that in

52:54

mind when they hear arguments

52:56

from attorneys. My

52:59

attorney told me that. It's like, I don't

53:01

believe this, but this is the

53:03

position that I need to take. I don't

53:05

have to believe that. From a legal-ethical

53:08

point of view, I don't have to

53:10

believe it personally in order

53:12

to say it in court. It just has to

53:14

be reasonable. Somebody might

53:17

believe this or this. It's a reasonable

53:19

approach to take. I wouldn't personally

53:21

endorse it. I don't have to. That's not

53:23

my job. That's

53:25

why they could say, even though I think you're guilty,

53:27

it doesn't matter. I'm presenting a case, and it's for

53:29

other people to decide if you're guilty or

53:32

not. Another

53:35

area where people often fall prey

53:37

to things that aren't true,

53:39

probably because of wishful thinking, is in

53:41

areas of health, because we

53:43

all like to believe in miracle

53:45

cures or what have you. And

53:48

it's a leading question, but what

53:50

can people do if they want

53:53

to look and find accurate information

53:55

about science-based medicine? Yeah,

53:57

I run a website called Science-Based Medicine. Yeah,

54:00

very leading question. So it's

54:03

tough, there's a very complicated relationship between science

54:05

and the practice of medicine. And that's exactly

54:07

what we could explore, how to optimize that

54:09

relationship, how to make decisions based upon the

54:12

best science and evidence available. It's complicated, you

54:14

know, is the short answer. But

54:16

as a consumer, again, there's sort of

54:18

a process you can go through. And

54:21

unfortunately, you know, you

54:23

have to make health decisions, and unless you are

54:25

a physician, in

54:27

fact, unless you are an expert in

54:30

whatever the specific field is that's relevant

54:32

to your condition, you have to

54:34

rely on other people who know more than you. That's

54:37

just how it is with these things, right? No

54:40

one is an expert on everything. You drive over

54:43

bridges, did you investigate the

54:45

engineering of that bridge to make sure that

54:47

the ratio of the width and the width

54:49

of all... Of course not, you

54:51

trust that some civil engineer knew what they

54:53

were doing, that the regulatory agency made sure

54:55

that they knew what they were doing before

54:57

they licensed them. And that

55:00

whoever commissioned the bridge made sure that they

55:02

found experts, whatever. You trust,

55:04

you have faith in the process, in

55:07

the transparency and the whatever, in the

55:09

expertise of the people involved. The same

55:11

is true in medicine. There's a process,

55:14

you go to medical school, you get

55:16

licensed, you get board certified, you get

55:18

privileges at a hospital. These are all

55:20

multiple different layers of trying

55:24

to say that, yeah, this person is

55:26

competent, knowledgeable, and ethical, right? Those are

55:28

like the three big things. And

55:30

if you violate that, you can get sued, you can

55:33

get your license taken away by the state. There's

55:35

remedies for people who fall below the standard. So

55:38

as a consumer, you have to have a certain amount of

55:40

faith in that system, right? If you don't have any faith

55:42

in that system, you're living in a very dark world, and

55:44

I don't know how you get through your day, right? Again,

55:47

this doesn't mean it's perfect. There

55:49

are clunkers out there, absolutely. But

55:52

at least there's a process. So

55:55

again, how important is it? Do

55:57

you have a cold, or do you

55:59

have terminal cancer, right? How serious is

56:02

the illness? But for really big decisions,

56:04

get a second and a third opinion,

56:06

you know. People should

56:08

know how to evaluate

56:10

at least the background

56:12

of a physician. Are you

56:14

board certified in this specialty? Right?

56:17

That's like a first layer, you know,

56:19

do you have sufficient expertise and

56:21

then, if you

56:23

don't feel comfortable with

56:26

the decision, or, you know, whatever, you want

56:28

to make sure that

56:30

you're making the right decision, if someone's

56:32

recommending surgery or whatever, get a second opinion,

56:34

or even a third opinion. I

56:37

also tell people, if the doctor starts doing crazy

56:39

stuff, you know, sometimes,

56:42

like if they're selling homeopathy out

56:44

of their office, leave. That's

56:46

not somebody that I would trust. Go

56:48

through that same kind of process of evaluating

56:50

experts, right? And you can get to

56:53

the point where you're like, yeah, this is pretty much it, and everyone's

56:55

telling me the same thing,

56:57

even very credentialed experts, so it's

56:59

probably correct. Sometimes patients fall into

57:01

this trap of doctor

57:04

shopping, you know, where it's like, again,

57:06

pick your expert: keep

57:08

going until you find somebody who gives you the

57:10

answer you want. If that's your process, that answer

57:12

is probably not reliable. It's probably

57:14

just what you want to hear. And then

57:17

you end up like Steve Jobs, right? Then you end

57:19

up doing the thing that they're telling you because it

57:21

sounds good, but it may not give you

57:23

the best outcome. And we know this

57:27

scientifically, because you can study it. You can say,

57:29

wait, when people do this process, what outcomes do

57:31

they have? And the more

57:33

you sort of go outside the lines, the

57:36

worse your outcome. You know, it actually

57:38

does affect the medical outcome. I want

57:41

to ask you something just because this is

57:44

a term that gets thrown around a lot

57:46

in true crime especially, and that's the concept

57:48

of Occam's razor: the simplest solution is often

57:50

the best. I guess, as a

57:52

skeptic, as someone who practices critical thinking, what

57:55

do you think about Occam's razor? What are the

57:57

flaws, or is it a pretty good paradigm? So

58:00

it's a good paradigm, but you misstated

58:02

it because everybody

58:04

misstates it. It's not that the

58:07

simplest answer is the most likely to be true,

58:09

because sometimes the real answer is very

58:12

complicated and you could invent a simple

58:14

answer that's complete horseshit, right?

58:16

So it's a lost-in-translation kind of thing,

58:18

right? He wasn't writing in English, it was

58:21

in Latin. The real translation is, I'm

58:23

just gonna paraphrase it, but the

58:25

answer that introduces the fewest new

58:27

assumptions is more likely to be

58:29

true. And that's a

58:31

critical difference, right? Because

58:34

you could say, well, aliens did everything, that's

58:36

my one simple answer for everything, and you're

58:39

coming up with this complicated explanation for

58:41

every different thing. It's like,

58:43

yeah, but you're introducing this massive new assumption

58:46

that there are aliens on Earth, and I'm

58:48

not introducing any new assumptions, I'm just going

58:50

by things that we know exist, you know.

58:52

So that's

58:55

the real way to approach it: are you introducing

58:57

a new assumption, assuming the

58:59

existence of a new element?

59:02

That's what Occam's razor tells you

59:04

to avoid, or to minimize. If

59:06

you can explain something

59:09

using stuff we already know,

59:12

it's more likely to be true than

59:14

if you were saying, well, maybe there

59:16

was this unknown thing that is happening,

59:18

you know. And it's okay to hypothesize

59:20

that, but then you've got to test

59:22

it, right? It's okay, you

59:24

know, so maybe there is an unknown

59:27

element, and that's now

59:29

a hypothesis. But that doesn't become your

59:31

conclusion. You can't skip over the whole

59:33

testing part of it. Yeah. But just

59:35

because you can weave a narrative that's

59:37

complicated or that introduces random elements doesn't mean it's true. Ad

59:39

hoc is another good concept: ad hoc

59:41

means you're introducing an

59:43

element as needed, right? Or

59:47

special pleading is another term that we

59:49

use: you're making up an explanation

59:51

ad hoc as needed to explain anything

59:53

that you need to explain. We're really

59:55

good at that. People are really creative

59:57

We're very good at that. Again,

1:00:00

if that's your process, you could defend anything. But

1:00:02

Occam's razor is part of a process saying,

1:00:04

nope, we're going to stick with the evidence

1:00:07

that's been established, facts that are

1:00:09

established, see if we can

1:00:11

explain it without introducing anything too complicated

1:00:13

or anything new, any new

1:00:16

assumptions. And those explanations are more

1:00:18

likely to be true because you're not introducing a

1:00:20

bunch of new stuff. In

1:00:23

medicine, it's the same thing. Can I

1:00:25

explain your symptoms with your known diseases, or do

1:00:27

I have to introduce a new disease? Now

1:00:30

maybe they do have a new disease, but what you don't

1:00:32

want to do is

1:00:34

introduce three new rare diseases. You

1:00:36

have three rare diseases? What are

1:00:38

the odds of that? Versus,

1:00:41

well, there's one disease

1:00:43

that could explain everything. It's not

1:00:45

about simplicity, it's about introducing new

1:00:48

elements. But sometimes patients

1:00:50

do have three diseases, but we know

1:00:52

that they have them or they have

1:00:55

one disease that leads to all the

1:00:57

other ones. You have diabetes, which causes

1:00:59

heart disease and neuropathy. So

1:01:01

I'm not really giving you three things. I'm giving you

1:01:04

one thing, which I know you have, and all of

1:01:06

the complications of that disease. That's

1:01:08

fine. Occam's razor is okay with that, even though I'm

1:01:10

giving you multiple explanations for your

1:01:12

symptoms, it all flows from what we

1:01:16

know is happening already without willy-nilly

1:01:19

just throwing in some completely new

1:01:21

random disease that we have no

1:01:23

evidence for. That's what

1:01:25

that means. That makes a lot of

1:01:27

sense. And yeah, thank you for correcting me because I

1:01:30

always heard it as the simplest, but I think that

1:01:32

makes even more sense in a true crime setting.

1:01:35

Although, as you said, the evidence has to be

1:01:38

ultimately the end-all. I mean, I can think

1:01:40

of one case we did where we interviewed a

1:01:42

couple. They had a crazy story. The

1:01:45

girlfriend was abducted and the man was

1:01:47

told, you're being monitored by this camera.

1:01:49

And it just sounded like something that

1:01:52

was completely made up. It was true,

1:01:54

though, and when police actually investigated it,

1:01:56

they found, no, this is exactly what

1:01:58

happened. It's important to remember that,

1:02:01

you know, obviously in our legal system, the evidence

1:02:03

has to carry things. Yeah, sometimes people

1:02:05

do have rare diseases, not often, by

1:02:08

definition, they're rare, but not

1:02:10

never, right? Sometimes really weird shit

1:02:12

happens. You have to be able

1:02:14

to pick up those cases as

1:02:16

well, because as long as you

1:02:19

have a process, as long as it's like flowing

1:02:21

from the evidence, and it's not just ad

1:02:23

hoc, right? Absolutely. I wanted

1:02:26

to ask you one thing. You

1:02:28

know, I think

1:02:30

I know the answer, but I'd be curious

1:02:32

what your take is. Is being a critical

1:02:34

thinker the same thing as being a cynic,

1:02:36

and are there pitfalls that you could fall

1:02:39

into if you take the cynical approach to

1:02:41

everything? Yeah, so being a

1:02:43

cynic is actually being anti-critical thinking, right?

1:02:45

Because you're basically rejecting things just

1:02:48

to reject them. That's your

1:02:50

process, right? I don't believe in that because

1:02:52

I don't believe in anything, or whatever. Sometimes

1:02:55

we use the term contrarian. It's like, well,

1:02:58

everybody thinks this, so I think it must not be

1:03:00

true, right? The mainstream media

1:03:02

thinks this, so it's got to

1:03:04

be wrong. It's

1:03:06

like, well, that's sort of the opposite

1:03:09

of the argument from authority, or the

1:03:11

ad hominem. It's just

1:03:13

that I reject anything mainstream, or I

1:03:15

reject whatever, anything that's institutional, or if

1:03:18

the government says it, the government lies, therefore everything

1:03:20

they say is a lie. Those

1:03:22

are also logical fallacies, and again,

1:03:24

that's not a skeptical critical thinking process.

1:03:27

It's just a negative process, right?

1:03:29

So skeptics are not cynics.

1:03:32

We are open to anything, whatever

1:03:34

the evidence and logic leads, you know,

1:03:36

wherever it leads, that's where we will follow. Sometimes

1:03:39

the mainstream media is correct. Sometimes the

1:03:42

government's not lying to you, you know,

1:03:44

but by definition, a cynic,

1:03:47

that's a bias, right? That's a

1:03:49

filter, and it's not following

1:03:51

the evidence. It's assuming something

1:03:53

bad about people, or it's saying

1:03:55

that global warming is not

1:03:57

real; they're both pseudoscience,

1:03:59

just in different directions. Sounds

1:04:02

like cynicism and denialism are just gullibility

1:04:04

dressed up in a black leather jacket

1:04:06

and smoking a cigarette. So it looks

1:04:08

cooler, but it's basically the same thing.

1:04:27

The contrarian version is just assuming

1:04:29

that whatever is mainstream is wrong.

1:04:32

Same thing with what we would call

1:04:34

denialism, right? So denialism is

1:04:36

pseudo-skepticism; just as pseudoscience

1:04:38

is to science, denialism is to skepticism.

1:04:40

You're taking

1:04:42

something that you don't like and denying it. Sometimes there's

1:04:45

gullibility, people following the narrative of their tribe, but sometimes, you know,

1:04:47

you're the fossil fuel industry, you have a

1:04:49

pretty strong motivation. They're not gullible. I don't

1:04:51

think they're gullible. I think they know exactly

1:04:53

what they're doing. And if you're selling something,

1:04:56

it's not necessarily gullibility, right? You have powerful

1:05:00

motivation. And a lot of people, most people,

1:05:02

I think, are

1:05:05

victims and perpetrators at the same time. Like,

1:05:08

I think anti-vaxxers are sincere, but

1:05:12

they were convinced by a pretty

1:05:14

package and they are passing

1:05:16

it forward, right? So they're now deceiving the

1:05:19

next person down the line in the same

1:05:21

way that they were deceived. But

1:05:23

I don't think there's any cynical,

1:05:26

you know, reason for

1:05:28

it. I think they're sincere. They're just suffering

1:05:30

from misinformation and a lack

1:05:33

of, again,

1:05:35

critical thinking. Most con artists

1:05:37

are themselves deceived. And

1:05:40

then in the mix are the real sharks who

1:05:42

are taking advantage of the whole thing to prey

1:05:44

upon people. But most of us

1:05:46

are just paying it forward. You know,

1:05:48

just whatever deceptions we've

1:05:50

been victimized by, we pass on to

1:05:52

other people. This

1:05:55

has been a great conversation. I really want to thank

1:05:57

you for taking the time. Before we go,

1:06:00

I want to emphasize how great Science-Based Medicine is

1:06:02

on a personal level. There was a time

1:06:04

in my life when I had a relative

1:06:06

with some pretty serious health problems, and

1:06:09

that was a place to go

1:06:11

to get clear explanations of

1:06:14

different treatments and stuff. I

1:06:16

believe that there were writers there like David

1:06:18

Gorski and Harriet Hall, and it was

1:06:20

really very helpful, and so I would

1:06:23

encourage people to check that out. Where

1:06:25

else can people find you and your

1:06:27

work? Yeah, to make it easy, if you just go

1:06:30

to theskepticsguide.org,

1:06:32

that's like the portal into everything that

1:06:34

we do. And the last question we always

1:06:36

ask is, is there something we didn't ask

1:06:38

that we should have asked that you wanted

1:06:40

to mention? You guys asked a lot of

1:06:42

great questions. I think we really covered a

1:06:44

lot of territory. Awesome. Thank you so much

1:06:46

Dr. Novella, it was really great talking to you. Yeah,

1:06:48

it's been a lot of fun, guys. Thanks for having me. We

1:06:51

would like to close by once again

1:06:53

thanking Dr. Steven Novella for taking the

1:06:55

time to speak with us again. Again,

1:06:57

we highly recommend his podcast and his

1:06:59

book and we will link to both

1:07:01

in our show notes. Thanks

1:07:03

so much for listening to The Murder Sheet. If

1:07:06

you have a tip concerning one of the cases

1:07:08

we cover, please email

1:07:11

us at murdersheet@gmail.com.

1:07:15

If you have actionable information about

1:07:17

an unsolved crime, please

1:07:19

report it to the appropriate authorities.

1:07:23

If you're interested in joining our

1:07:26

Patreon, that's available at

1:07:30

www.patreon.com/murdersheet.

1:07:33

If you want to tip us a bit of money

1:07:35

for records requests, you can do so

1:07:38

at

1:07:41

www.buymeacoffee.com/murdersheet.

1:07:44

We very much appreciate any support. Special

1:07:47

thanks to Kevin Tyler Greenlee, who composed the music for the

1:07:49

show.
