"Biology's Image Detective" - Fraud In Science

Released Thursday, 8th July 2021

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

I just feel there has been so much damage

0:02

done to the credibility of

0:04

scientists. You know, particularly

0:06

in the past four years under a president

0:09

who did not seem to support science

0:11

or, you know, even suppressed good information.

0:15

And also the, the power of social media,

0:17

where there seems to be very little incentive

0:20

by the social media platforms to bring

0:23

out only the truth and to suppress false

0:26

information. And I can sort of see

0:28

that we want in the United States in particular,

0:30

we want freedom of press and freedom of

0:32

opinion, but I also feel

0:34

that there should be some way of reporting people

0:37

who make false claims,

0:39

like the vaccines have killed more people

0:41

than COVID-19 itself. So

0:51

Well, thank you for coming on the podcast. Just to get

0:53

started, I wanted to kind of

0:55

ask you: The New Yorker just published an article about

0:57

you, and the headline described you as biology's

1:00

image detective, can you explain what that means? Yeah,

1:03

so, well, I'm a biologist by training.

1:05

I'm a microbiologist and

1:07

I also apparently have some talents to

1:09

find, or to see duplicated

1:11

parts of images or duplicated images.

1:13

So I can't detect every Photoshop,

1:16

but I can detect if a photo

1:18

has elements that are repetitive. So

1:20

for example, when we talk about scientific images,

1:22

that could be an image showing

1:25

multiple cells, but all the cells look

1:27

identical. So it appears that those

1:29

have been cloned by Photoshop.

1:31

And so it's not about biological

1:33

cloning, but somebody taking

1:35

a cell and stamping it a couple of

1:37

times in a photo. And so that

1:39

is my specialty. I look at photos

1:42

in scientific papers and I will detect

1:44

duplicated elements or just panels

1:46

that have been pasted.
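
[Editor's note: for readers curious how this kind of within-image duplication could be found programmatically, here is a minimal sketch using OpenCV template matching. It is the editor's illustration, not the guest's method (she works by eye); the filename, patch location, and threshold are all assumptions.]

    # Minimal sketch: look for near-exact copies of a small patch
    # elsewhere in the same image, a crude stand-in for spotting
    # "cloned" cells. Filename and coordinates are hypothetical.
    import cv2
    import numpy as np

    img = cv2.imread("figure1.png", cv2.IMREAD_GRAYSCALE)
    patch = img[100:140, 200:240]  # a 40x40 region to hunt for

    # Normalized cross-correlation: 1.0 means a pixel-perfect match.
    scores = cv2.matchTemplate(img, patch, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(scores > 0.99)  # strictness is an assumption

    # Report matches that are not the patch's own location.
    for y, x in zip(ys, xs):
        if abs(y - 100) > 5 or abs(x - 200) > 5:
            print(f"possible duplicated region at ({x}, {y})")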

1:48

How big of a problem is this? Well, that is a big problem

1:50

because that means that somebody has manipulated

1:53

the results and photos in scientific

1:55

papers are the data. When

1:57

you read a scientific paper, it

1:59

will say, we found such and

2:01

so, you know, this, this

2:03

is the experiment and this was the outcome.

2:06

See, figure one. So figure one,

2:08

any figure in a scientific paper usually

2:11

is data. It's a photo

2:13

of cells. It's a photo of tissues. Of

2:16

course, it could also be a line graph or a table

2:18

or things like that, but yeah, figures

2:20

are the data. And so if somebody changed

2:23

the results in a photo, or

2:25

if somebody used the same photo to represent

2:27

two different experiments, that might

2:30

be science misconduct. That's data fabrication

2:33

or falsification. And that's, that's a really

2:35

big no-no. In science, you should

2:37

not be doing that. Is this a

2:39

widespread issue? No, it's

2:41

not. When you look at my work,

2:43

you might think it's a widespread issue because

2:46

this is what I do, and this is what I find. But

2:48

I actually did research to look

2:50

at how many times I would find

2:52

these duplicated images. And

2:54

so I scanned 20,000 papers that

2:57

had at least one photographic image

3:00

and 4% of those papers

3:02

had a duplicated image or

3:05

duplicated parts within an image. So

3:07

it's about 4% and

3:09

the real percentage of fraud might be

3:11

higher because I'm only looking at photos. And

3:14

so I'm not looking at tables or

3:16

sequencing data or any other type of data.

3:18

So the real percentage of fraud

3:20

might be higher. I would just estimate

3:22

between five and 10%, but

3:25

I'm only probably detecting the tip of the iceberg

3:28

if you are a really good Photoshopper, I wouldn't find

3:30

that. But yeah, so it's 4% of detectable

3:33

duplications in biomedical papers.

3:36

What are the scientists' motivations here? They're

3:39

risking their careers, right. And their professional reputations

3:42

by conducting

3:44

scientific misconduct. And more importantly, they're

3:47

putting out false, false science and putting out

3:49

bad science. So why are they doing that? Because

3:51

the repercussions are surprisingly

3:54

small, like a lot of scientists

3:57

who do this and who are being caught,

3:59

don't get any punishment for that.

4:01

Like, at best, some

4:03

scientists, and this is really only

4:06

a very small fraction of scientists who have been

4:08

caught. They might maybe be

4:10

punished by not receiving

4:12

grants for a year or something like that.

4:14

But that's for most scientists

4:17

who already have a lot of grants going on, probably

4:19

not a big problem. So a

4:21

lot of scientists who are being caught doing this

4:23

are still, yeah,

4:25

still having a job at, at a university

4:28

and are not being fired. That's

4:30

frustrating because it's cheating and

4:32

we would, you would think that the person would be punished

4:34

for that, but there's very, very little punishment.

4:37

And in most cases, people are, yes,

4:39

They'll have a glorious career and get

4:41

more and more grants. Right. When

4:43

did, when did this first come across your radar as

4:45

something you were interested in and an issue that you saw

4:47

was significant? I

4:50

started this work working on plagiarism,

4:52

actually. So I heard on a podcast or

4:54

I was reading about science misconduct,

4:56

but specifically about plagiarism. And

4:59

I thought let's just check a random

5:01

sentence that I had written in a science paper

5:04

in, in Google Scholar. So I put it, put

5:06

that sentence between quotes in Google Scholar,

5:08

expecting only to find my own paper. And

5:11

I found another paper, published in a predatory

5:15

publishing book or like some, yeah,

5:17

some, some strange online book that was

5:19

free for download, but it was actually my texts.

5:21

So they had used my sentence and

5:24

passed it off as their own in this paper.

5:26

And it turned out that this paper, not

5:28

only had used my sentence, but the sentences

5:30

of many other scientists. So it was sort of this,

5:32

this patchwork of many different scientists

5:35

at different, many different papers that they put

5:37

together and sort of passed off as a

5:39

new paper, but it was all plagiarized text.
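
[Editor's note: the quoted-sentence search described above generalizes to a standard text-similarity check. A minimal sketch, the editor's own rather than the guest's workflow: compare two documents by the overlap of their word 5-grams ("shingles"); heavily copied text scores near 1. Filenames and the threshold are hypothetical.]

    # Minimal sketch: flag likely copied text via word 5-gram overlap.
    def shingles(text, n=5):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 0.0

    mine = open("my_paper.txt").read()          # hypothetical files
    suspect = open("suspect_paper.txt").read()

    score = jaccard(shingles(mine), shingles(suspect))
    if score > 0.1:  # even 10% shared 5-grams is very suspicious
        print(f"possible plagiarism, shingle overlap = {score:.2f}")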

5:42

So I, I worked

5:44

on plagiarism for about a

5:46

year. And then by another

5:48

coincidence I spotted in a PhD

5:50

thesis, a Western

5:52

blot. It's a protein blot. It's a photo.

5:54

And I, it had a very particular smear

5:57

that I recognized. And then I saw

5:59

a couple of pages in another chapter

6:01

or so in that same PhD

6:04

thesis. I spotted the same photo, but

6:06

it was upside down and it had been used

6:08

to represent the different experiments. And,

6:11

but yeah, I recognize it. It had this weird

6:13

little spot or smear, so

6:15

that was not good. And this paper had been published

6:18

as a scientific paper as well. And

6:21

I recognized I had some talent to, to do that.

6:23

So it was all by coincidence mainly, but

6:25

it's one of those moments

6:27

that sort of make your career or make

6:29

your career change? Yeah, if I hadn't

6:31

seen that, then I probably would never

6:34

have worked on this while you were doing

6:36

this. What was your day job? I

6:38

worked at Stanford, so I was a microbiologist.

6:41

I worked on the microbiome

6:44

of marine mammals and humans. So the microbiome is

6:47

the bacteria that live inside our bodies

6:50

and, or on our bodies on our skin. And

6:52

I was working on, on the microbiome of

6:54

humans, but also dolphins and

6:56

sea lions. And that was my day job. I

6:58

was, I guess, a regular scientist

7:01

working at Stanford and writing

7:03

papers, doing research. And I was

7:05

doing this image duplication

7:07

searches in the evenings or in the weekend. So it

7:09

was sort of my, my hobby. I

7:12

yeah, I saw, well, I think you're, you're being a little

7:14

humble here, 'cause I saw on Google Scholar that you have

7:16

a paper that has like 20K citations,

7:18

something crazy like that. And I, I

7:21

worked in a PhD lab in college and so

7:23

I knew a bunch of PhD students and PhDs.

7:25

You're by far the most cited person I know. There's,

7:30

there's plenty of other people. I'm

7:32

just a very modest scientist by their standards.

7:34

So there's always people who have published more,

7:37

I think, you know, for, for the point of

7:40

my career that I'm at, I

7:42

am a probably, yeah,

7:44

sort of a middle of the pack type of scientist,

7:47

but yes, there's one paper that we published

7:49

in Science and I'm the second author of

7:51

that. And that paper has a lot of

7:53

a lot of citations. It was one of the first

7:56

publications analyzing the microbiome

7:59

of humans using DNA sequencing.

8:01

So there had been other papers. We were not the first,

8:03

but it was one of the first large-scale papers.

8:06

And that has been cited a crazy amount

8:08

of times. I guess it was published in Science.

8:10

And so I was incredibly

8:13

lucky to have worked on, on that project.

8:15

And yeah, I think the, the

8:17

paper still stands as we, we, we made

8:19

sure it was high quality and no

8:23

image duplication. Oh, wow.

8:25

There's actually no photo in it. So like

8:27

most scientific papers, there's actually no photo in

8:29

this. Just, just line graphs. And

8:32

but yeah, I can vouch for

8:34

it that there's no, there's no science misconduct.

8:36

I'm sure there's errors in it. Like in any paper

8:39

that, in which you analyze, you know, thousands

8:41

and thousands of, of DNA sequences, there's,

8:44

It's very hard to not make any errors. We all make

8:46

errors. But it's it's done with the best of

8:48

intentions and it has stood up

8:50

to the, to the test of time. It's still being

8:52

cited. So Elizabeth

8:55

in doing some research for this pod, on this

8:57

particular subject, I realized that

8:59

there are a number of ways to publish

9:01

fraudulent data in science. The one

9:04

that you specialize in, which is image doctoring,

9:06

I guess it's somewhat prolific, like what you

9:08

just described. But then there are other ways like people

9:10

will run experiments. Nine out of 10 times

9:12

the result is no, but then like one out of 10 times,

9:15

it ends up being exactly what they want and then they get

9:17

that published. So what's your sort of take

9:19

on the full wide variety

9:21

of flawed in science and whether the incentive

9:24

structures that allow image doctoring to happen are

9:26

the same sets of structures that allow this to happen.

9:28

How would you break down sort of the incentives

9:30

that, that lead to both? Well,

9:33

so the incentives in science are,

9:36

are to publish. So we, as scientists

9:38

are encouraged, but also

9:40

almost forced to publish because it's needed

9:42

for our careers, like as a postdoc

9:45

or a professor you need to, or you're expected

9:48

to publish an X amount of papers per year.

9:50

And unfortunately, scientific publishing focuses

9:52

on positive results. So

9:55

if you have done a long study

9:57

showing that a particular drug does not help

9:59

against the particular cancer, that's

10:01

not a very publishable paper,

10:04

because it's sort of a negative result. It

10:06

shouldn't be, but unfortunately, a lot of journals

10:08

will say, well, that result is not very

10:10

novel or, you know, earth shattering.

10:13

We want to have a positive results. So

10:15

the incentive to publish positive

10:17

results is one of the important, yeah,

10:19

Incentives to cheat because people want

10:22

to have a positive results. And like you said, if

10:24

you have you know, 10%

10:26

only 10% of your experiments gives

10:28

the results you would like, then that's the

10:31

experiment that you'll pick for your paper. So

10:33

that's called cherry picking. It's basically

10:35

picking the results that you

10:38

like to see, that fit your own hypothesis, but

10:40

ignoring the results that do not fit your

10:42

hypothesis. And that would be

10:45

also called publication bias or like

10:47

we are, we're all biased. We all want our

10:49

experiments to work out a certain way. And if

10:51

it doesn't, do we accept those results

10:53

or do we keep on trying until we have a positive

10:55

results? And, and that's still a big step

10:58

towards science misconduct. I do feel

11:00

that it's, it's cheating in a way, but

11:02

it's not, I feel, as bad

11:04

as really faking or forging

11:07

results. Like when you have have

11:09

if you have measured a couple of things

11:12

and you change the results, you've changed the

11:14

values so that they cross a particular

11:16

threshold and suddenly your negative sample becomes

11:18

a positive. That is where

11:21

we're really talking about science misconduct. So

11:23

there's, there's a whole range of steps

11:26

in between from publication

11:28

bias, towards p-hacking, which

11:30

is sort of the cherry picking where you keep on

11:32

doing statistical tests. And

11:34

there's one statistical test in which your results

11:37

are significant, and you pick that. Then really

11:40

changing or fabricating results

11:43

which, so fabrication

11:46

and falsification, those are considered science

11:48

misconduct, together with plagiarism, by

11:50

the definition of the Office of Research

11:52

Integrity. P-hacking and

11:54

publication bias are not necessarily

11:57

included in the pure

11:59

definition of science misconduct, but there's a

12:01

lot of gray in between. There's, there's a,

12:03

it's hard to draw the line. What is misconduct

12:06

and what is bias.
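
[Editor's note: the p-hacking described above is easy to demonstrate in simulation. A minimal sketch, the editor's own: when there is no real effect, one honest test at p < 0.05 comes up "significant" about 5% of the time, but reporting only the best of ten tests comes up "significant" roughly 40% of the time.]

    # Minimal sketch: keeping the best p-value of many tries inflates
    # false positives even though no true effect exists.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    trials = 2_000
    hits_single = hits_best_of_10 = 0

    for _ in range(trials):
        ps = [stats.ttest_ind(rng.normal(size=20), rng.normal(size=20)).pvalue
              for _ in range(10)]           # ten tries at a null effect
        hits_single += ps[0] < 0.05         # honest: report the first test
        hits_best_of_10 += min(ps) < 0.05   # p-hacked: report the best one

    print(f"single test: {hits_single / trials:.1%} false positives")     # ~5%
    print(f"best of 10:  {hits_best_of_10 / trials:.1%} false positives") # ~40%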

12:08

Right. I think in reading the New Yorker article specifically,

12:11

I was kind of surprised to find that you had

12:13

found issues and some of the most

12:15

important and prestigious journals

12:17

and articles. And when I'm always listening to

12:19

news or other podcasts

12:21

about science, they're always referring to peer reviewed

12:23

articles, or these editors at these

12:25

journals have the highest standard. But

12:28

obviously not, because you've found, you've

12:30

you've discovered some sort of fraud. What are these editors

12:32

missing? Like what is, what is the issue within

12:35

this organization that allows stuff

12:37

like this to get published? Oh,

12:39

there could be all kinds of issues that editors

12:41

are either not paying attention to, or just

12:44

not trained to find problems with papers.

12:46

So you would hope that an editor would

12:48

find them. Yeah. Obvious Photoshops

12:51

in, in, or most obvious errors

12:53

in papers, but an editor

12:55

is usually a person who's unpaid

12:57

who does editing sort of as a side

13:00

job, but might be a very busy professor

13:02

who is being asked to do editing, to be an editor

13:04

of a journal. So basically

13:06

what they do is they, they got the manuscripts

13:09

that are usually pre-screened and

13:11

then they need to find peer reviewers, who are

13:13

also busy and unpaid and, and

13:15

have no really no time to really look

13:17

carefully at a paper. And

13:19

and then when they received the peer reviews, they

13:21

sort of compile that and

13:23

make a final decision. But very often

13:26

I've been an editor myself for a very short

13:28

amount of time. I found it very hard. It's

13:30

You don't really have time to read the paper yourself.

13:32

You sort of rely on your peer reviewers to do

13:34

a good job. And sometimes they also don't

13:36

do a good job. So it's, it's really tough because

13:39

none of these jobs are paid. And

13:41

unfortunately we have to pay the publisher a lot

13:43

of money to get our work either depending

13:46

on the model of publishing to, to

13:48

get the paper published. So where that money

13:50

goes into is, is not very clear.

13:52

Everything is published online. So, yeah.

13:56

And, and also editors are not trained to

13:58

find these problems. I'm looking at it with

14:00

a lot of experience. I

14:02

I've seen a lot of different types of Photoshops

14:05

or photo duplications. And so

14:07

I'm very trained for these situations

14:09

because I've seen so many of them, but

14:11

a lot of people don't really see these problems

14:13

until you point it out to them.

14:16

I do a lot of puzzles on

14:18

Twitter. I will post them under the hashtag image forensics.

14:21

And then of course, when I post one of

14:23

those images, people know there's something wrong

14:25

and they'll usually find it. But

14:27

people forget that I've seen hundreds

14:29

of images maybe before this one

14:31

with the duplication. And so once,

14:33

you know there's an error, you'll find it. But if you

14:36

just quickly look at figures, you

14:38

might not spot it. And you need to be

14:40

told that this might be a thing before you start

14:42

to see that yourself. So if you were

14:44

a benevolent dictator, what would you change about

14:46

the system to avoid these issues? It's

14:48

really hard because fraudsters are going to fraud.

14:51

And the only thing you can

14:53

do is, is to like

14:55

ask people to send

14:57

in raw data. That would be sort of

14:59

an extra hurdle, but if a real fraudster

15:02

wants to fraud, they're going to fraud. So I,

15:04

I have no illusions that we can make

15:07

science a hundred percent foolproof and fraud

15:09

proof. There's always going to be people

15:11

who want to cheat the system because

15:13

we, we put so much emphasis

15:15

on outputs on science papers as,

15:17

as the output of scientists. And

15:20

it's only when we have replicated

15:23

a paper that we sort of know it was

15:25

probably true, but you can never a hundred

15:27

percent be certain that

15:29

everything in the paper was honest and

15:31

that's very unfortunate. Science

15:34

in a way is, is about finding the truth.

15:37

I've always felt like when you are in

15:39

science, you want to discover

15:41

a particular pathway or a

15:43

particular bacteria, or

15:46

you want to discover what is true.

15:48

What is, what is the truth about

15:50

a particular biology process?

15:52

And so I've always felt that science should be

15:54

about reporting the truth. So

15:57

for scientists to fraud, I feel that's

15:59

a huge violation of, of our profession

16:02

as well. Yeah, I definitely

16:04

agree with that. So I'm, I'm someone who reads

16:06

a lot of papers. Anytime. I'm curious about something I'll

16:08

hop on Google scholar and I'll, I'll see what I can find.

16:10

But I I'm, I'm not a PhD.

16:13

I don't have that much training in this field.

16:15

And I have a lot of friends who do the same thing as me and don't

16:17

have any experience with this. how do we read

16:19

papers and say, this is something which we should, we should have high

16:21

degree of credibility. And this is a really good paper versus

16:23

this is something that, you know, maybe we shouldn't put

16:26

too much confidence in. Yeah,

16:29

that's a good question. I don't have a standard

16:31

answer for you because even papers

16:33

that have been published in high impact journals

16:36

by, you know, authors who

16:39

work at the institutions that seem to

16:41

have some credibility, even

16:43

those papers have been caught with

16:45

fraudulent data. So it's not a hundred

16:47

percent guarantee, but having said that

16:50

papers that have been published in science

16:52

or the Lancet, usually with

16:54

some very big exceptions, usually

16:56

are more credible than papers that are,

16:58

for example, published on a preprint server

17:01

that have not been peer reviewed. Those are the two

17:03

extremes, but yeah. There,

17:06

there has been a big paper published

17:08

in the Lancet that had been

17:10

retracted last year, because it was

17:12

probably based on fraudulent data. And

17:15

so that's one of those big exceptions that

17:17

makes headlines and that make a lot of people

17:20

who are not scientists, think that all

17:22

science is flawed. Well, that was really

17:24

the exception. It's it's like saying,

17:26

yeah, thoroughness, you know, it was a company that

17:29

did not really do well, so

17:31

we can't trust any biotech company.

17:33

Anyway, you cannot make those, those

17:36

extrapolations

17:38

based on one bad apple. It's

17:40

it's usually the, those cases

17:42

make headlines and for good reasons.

17:45

And that was a fraudulent paper, at

17:47

least from all evidence I've seen, but

17:49

it was hard to recognize it as a fraudulent paper.

17:51

I did not recognize it myself, either.

17:54

I actually tweeted about this paper, my

17:56

haters, my trolls who are my

17:59

loyal enemies on Twitter are still

18:02

saying, oh, Bik tweeted about

18:04

this paper. So she cannot detect any

18:06

fraud. Right. And it was hard just

18:08

to look at that paper and realize it wasn't fraud.

18:10

You really had to dive deep into the paper,

18:12

knew a lot about particular numbers

18:14

that were misreported to find out

18:17

that that was fraud. So it happens anywhere. Those

18:19

cases make big headlines, but in the end, usually

18:23

you can trust those, those journals. But yeah,

18:26

it's a, there are exceptions, of course. So,

18:28

so let me, let me play a sort

18:30

of devil's advocate really quickly, or at least

18:32

from what I've read and what I understand, I may be totally

18:35

off. I came across these pre-publication

18:37

sites, right? Like, I have it

18:39

written down here: arXiv and

18:41

bioRxiv...

18:45

Archive, bio-archive. Archive.

18:49

That's how you pronounce it. Okay.

18:51

Sorry. This is, so

18:56

not everybody knows where I land on this argument.

18:59

Right. And the argument that I basically read is, well, sometimes

19:02

it's worthwhile to publish some sort

19:04

of science output. Just get the output

19:06

out there, even if it's just an idea, even if it isn't

19:08

peer reviewed, even if it isn't a hundred percent

19:10

accurate and has veracity. Just to get that idea out

19:12

there, you know, into the, into the minds of people that

19:15

might do more research and build on it, even

19:17

though it's like incredibly low barrier to entry and

19:19

anybody can get it out there. Is that generally

19:21

a good thing for science? Do you think? I

19:23

believe so. And especially in the case where

19:26

we were last year at the beginning

19:28

of an epidemic where, quite

19:31

frankly, we were all in a state of panic, where

19:33

there was, you know, a lot of mortality,

19:36

a lot of people dying, a lot of people getting

19:38

sick, a new virus, nobody really knew

19:40

about, you know, th the new enemy

19:42

was. In a situation like that, we need

19:44

science to be fast, and we need

19:47

to have a very quick model

19:49

of scientific publishing. So if a person

19:51

has found a result

19:54

that is worth sharing, that might save lives.

19:57

There's a big argument to make, to

19:59

publish this quickly, even though it might mean

20:01

publishing before peer review, but

20:03

just getting it out there so that a lot of people can

20:06

read it and, and benefit from

20:08

these results. But there's a delicate

20:11

balance between wanting to

20:13

publish fast and doing good science. So

20:15

those things are. Yeah, they're, they're, they're

20:17

two ends of the spectrum. It's, it's two parts

20:19

that are usually not in agreement

20:22

with each other because science, if it's

20:24

done well, it's very slow. It's

20:26

painfully slow. It's like looking for

20:28

tiny details, having long

20:30

arguments with other scientists about how

20:33

to interpret the particular results. Yeah,

20:35

that just is not, cannot be done

20:37

in a very fast way. And so it,

20:40

it's, it's finding this balance between

20:42

publishing results really fast, but

20:44

knowing it's on a preprint server,

20:47

it's not being peer reviewed. It's

20:49

just a view of one particular lab. And

20:51

that could be very biased, because no

20:53

other people have had a chance yet to

20:55

carefully digest it and

20:57

give feedback and go through these normal

21:00

and slow processes. So it's. It's

21:03

I'm all for preprint servers,

21:05

but it comes with a lot of caveats. You

21:07

need to interpret it as just

21:10

a view of one lab, not being peer reviewed,

21:12

and take it with a grain of salt. So

21:14

one high profile instance that

21:17

I think most people are familiar with of

21:19

of a paper being just rushed out before

21:22

it was ready. Was the, was

21:24

Trump's favorite, hydro, hydro-

21:26

chlorine. Sorry. How do you say that? Hydroxy

21:30

chloroquine study, that just like got

21:33

shuttled out and I believe you were one of

21:35

the early scientists to say, this

21:37

is, this is bad research. Right? So

21:39

can you, can you kind of talk about that situation? Sure.

21:42

So this was a paper by the

21:44

group of professor Raoult in Marseille

21:47

in France. And he claimed that hydroxy

21:49

chloroquine was a

21:51

really good medication to get

21:53

rid of the virus. So he looked at patients

21:56

who had the virus who were positive for the PCR,

21:58

and he looked at clearance of the virus, repeatedly

22:01

testing these patients and seeing when

22:03

they would become negative. And he showed

22:06

in his paper, which was only, I believe 40

22:08

patients. So it's a very small study and

22:10

he had three different treatment groups. So some

22:12

people were not treated. Some people only got hydroxy

22:15

chloroquine, and the third group got hydroxychloroquine

22:18

plus azithromycin, which

22:20

is an antibiotic. And he showed that

22:22

the the both groups that had the hydroxy

22:25

chloroquine treatment, that those people

22:27

cleared the virus. So got PCR negative

22:30

faster than the people who did not

22:32

receive any of those drugs,

22:35

but the, the groups were really small. So if you

22:37

have 40 patients and you divide them over three groups,

22:39

you can already see that the numbers get pretty small,

22:42

but there are a lot of flaws

22:44

with this study. So one of the things was that there

22:47

were six patients who were in

22:49

the hydroxychloroquine groups in either one

22:51

of these groups, who didn't... Most of these

22:53

patients did not really do very

22:55

well on the hydroxychloroquine and they were

22:57

left out of the study. So they started

23:00

with a particular patient group

23:02

but six patients were left out. So one

23:04

of them died. Three, I think two

23:06

of them got really bad, got really

23:09

sick. So they were transferred to the intensive

23:11

care. One patient got little side

23:13

effects, or two patients, and one

23:15

patient just walked out of the study. So it wasn't

23:18

really clear. So it looked like the

23:20

researchers might have decided

23:23

to do this cherry picking that we talked about

23:25

previously, where, you know, the results

23:27

were not really quite what they had hoped. You know,

23:30

if one patient dies on your drug, You

23:32

should not leave them out of the study, right? In science,

23:37

it's so basic, like just

23:39

say, I did not want this result. I'm just going

23:41

to leave it out. And that was actually

23:43

noted, they had written it in the paper.
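
[Editor's note: a toy calculation of why excluding the worst outcomes matters; the numbers here are the editor's hypothetical round figures, not the study's.]

    # Toy numbers (hypothetical): 26 treated patients, 20 cleared the
    # virus, 6 fared badly and were dropped from the analysis.
    treated, cleared, excluded = 26, 20, 6

    honest_rate = cleared / treated                 # count everyone enrolled
    reported_rate = cleared / (treated - excluded)  # after dropping the six

    print(f"intention-to-treat rate: {honest_rate:.0%}")    # 77%
    print(f"rate after exclusions:   {reported_rate:.0%}")  # 100%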

23:46

So who else might have been

23:48

left out? That was just one,

23:50

there were many other problems with this paper. So I

23:52

wrote a long review about it. There

23:54

were problems with the ethical,

23:57

the dates of approval versus the start of the

23:59

study, there were some problems

24:01

there. He included some children,

24:04

even though he wrote that he didn't include children,

24:06

but he did. And then. There were differences

24:08

between the treatment group. So those people

24:10

were all treated at a different hospital than

24:13

most people who were not treated. And so there were

24:15

all these differences between the different treatment

24:18

groups that you would not expect. In a

24:21

rigorously set

24:23

up scientific study, you should

24:25

randomize your patients. And he

24:27

didn't do that. He appeared to have handpicked patients

24:30

and maybe those patients who were

24:32

on hydroxychloroquine were already

24:34

less sick to start with, or

24:36

maybe farther in their in their disease

24:38

status. So they would have cleared the virus even faster.

24:41

So all kinds of problems. And I,

24:43

I wrote a, a critical blog post about

24:45

that, and that got me into trouble. Can

24:48

you elaborate on that? Yeah. So

24:51

obviously professor Raoult

24:53

did not enjoy my critique

24:56

and and I can understand that I

24:58

understand that he was not happy with my critique

25:00

and I, so I raised the concerns

25:03

not only by writing about this on a

25:05

blog post, but I also posted on a

25:07

website called pub peer, which

25:09

is a website where you can leave comments on scientific

25:11

papers. And he did

25:14

not answer my comments. Instead, he

25:16

started calling me all these names.

25:18

So he's called me a witch hunter,

25:20

cinglée, which means like a crazy woman.

25:23

He called me a girl who hunted him down

25:25

and, and no, all kinds of, not

25:28

very nice words, but yeah.

25:30

So, and he also some

25:32

of his people who work for him, who

25:34

are working for him in the same institution,

25:37

started to harass me on Twitter. And so

25:39

there was one one of these professors

25:42

Chabrière, and started to ask me all

25:44

these questions on Twitter. Which

25:46

may boil down to, who's, who's

25:48

paying you, are you being paid by

25:50

big pharma to bring down

25:52

my professor? And

25:55

in the meantime, I started to look at more

25:57

papers by Raoult, and found more problems.

25:59

So there were some, actually some, some

26:01

image problems. So some image duplication

26:03

problems, but also other problems with

26:06

ethical approval of some of

26:08

his studies. So it appeared that there's not

26:10

just this hydroxychloroquine paper that had

26:12

some problems, but also a bunch

26:14

of other papers. So I posted all of these on

26:16

PubPeer, and other people started chiming

26:19

in finding more and more problems. So by

26:21

now I think this professor has

26:23

270 papers

26:25

from his group that are, have been flagged

26:27

on PubPeer. And he's becoming a bit

26:30

annoyed with all of us and yeah, he has

26:32

now threatened me with a lawsuit. So he

26:34

has filed a complaint against me with

26:36

the prosecutor in Marseille

26:38

claiming that I harassed him and that

26:41

I extorted him and blackmailed him.

26:43

And that's all based on two answers.

26:45

I gave them on Twitter where they asked me,

26:48

who's paying you? And I said, oh yeah, you

26:50

can donate money, I have a Patreon account.

26:53

And another one I said, well, I can, I'm a consultant.

26:55

And so I could check papers if you want,

26:57

if you want me to check some papers, happy

27:00

to do so, as long as you pay me. So

27:02

he claims that's blackmailing. I

27:04

can't imagine that's blackmailing, but yeah,

27:07

he filed a police report with

27:09

the prosecutor in Marseille, and

27:12

this case is under investigation. And

27:15

I, I hope this will not lead to a

27:17

lawsuit because I don't

27:19

think I did anything wrong, but I'm

27:21

not quite sure how the legal system

27:23

and friends work. So, so for

27:25

now it seems to be threats to try

27:27

to silence me, but I've already said on Twitter

27:29

a couple of times. I'm not going to be silenced. I'll

27:31

keep on, I stand behind all my

27:33

questions. He can answer them on PubPeer.

27:36

And I don't think that a scientist

27:38

should be resorting to legal

27:40

steps to silence your critics.

27:43

But yeah, I guess that's, no,

27:45

a couple of other authors in

27:48

some cases have, but not Raoult himself,

27:50

and he should have. Yeah, he has not answered any of the questions.

27:54

I saw a petition. I think it

27:56

was like a thousand scientists who came out to support you.

27:59

Why? I mean, how could you not, this is, I

28:01

don't know, this, this is just so outrageous, it's unbelievable

28:04

that like, not only is this

28:06

guy putting out garbage science that

28:09

has affected us that has affected the United States

28:11

because the president used that as,

28:13

as policy. But he's also going after whistleblowers

28:16

who are trying to keep science clean. This is so outrageous.

28:19

It is. Yeah. And unfortunately, that's,

28:22

I'm not the only person who has

28:24

been harassed or threatened on

28:26

Twitter or even in real life.

28:28

I have not been to that country in real life, but there are

28:30

a couple of scientists who are

28:33

just trying to bring out good news

28:35

or, well, let's say honest news

28:37

and try to go against

28:39

people who spread misinformation. There's

28:42

so much misinformation right now on social

28:44

media, where, for example, there's,

28:46

there's these tweets where people claimed that more

28:48

people have died of the COVID vaccine

28:50

than of COVID itself. And as a

28:52

scientist, you can't be silent. You kind of

28:54

look at these numbers and

28:57

are just blown away, because it's completely not

28:59

true. And so a lot of scientists will say,

29:01

that's not true, look here, here are the numbers. But

29:04

then there's all these people, not scientists,

29:06

usually who claim they know better.

29:08

And they have found another website that disagrees

29:11

with all these hundreds of scientists. And so you get

29:13

all these very polarized

29:15

situations where scientists try

29:17

to bring out honest

29:19

information and based on facts and

29:21

other people say that the

29:23

facts are wrong. Yeah, all these, these

29:26

things, these wars going on on Twitter,

29:28

and sometimes scientists have been threatened.

29:31

And there's actually one scientist in Belgium

29:34

who is now living under police protection

29:36

in a secret location because

29:39

he's being threatened. There was a soldier

29:42

who has escaped from

29:44

the army or something with a lot of weapons

29:46

and who was trying to kill the scientist.

29:48

And it's just very strange

29:50

situations we see there as well. Yeah,

29:53

I know there's just this like rising tide

29:55

of scientific misinformation. And I can think of one

29:57

big reason why. But of course, social media

29:59

has also, I think, contributed to this

30:01

now. Just anyone can publish without any

30:04

sort of. I dunno. I dunno, peer

30:06

review is not the right term for when people who are not

30:08

scientists say stuff, it's like the idea of sharing an

30:10

article without reading it first. Something like that.

30:12

Elizabeth, I think on this topic. And this is something

30:15

I've been thinking about a lot, is that it

30:17

just feels like, since the pandemic, I guess

30:19

it's not exactly over, but since we're coming to an end

30:21

of it, it seems like the public really

30:24

doesn't trust the scientific community or it's

30:26

at an all time low. What do you think can

30:28

be done to improve

30:30

that trust whether it's in the United States or

30:32

around the world? Oh,

30:34

that's a great question. I. I

30:37

don't really know the answer, because I,

30:40

I just feel there has been so much damage

30:42

done to the credibility of

30:44

scientists. You know, particularly

30:47

in the past four years under a president

30:49

who did not seem to support science

30:52

or, you know, even suppressed good information.

30:55

And also the, the power of social media,

30:57

where there seems to be very little incentive

31:01

by the social media platforms to bring

31:04

out only the truth and to suppress false

31:06

information. And I can sort of see

31:08

that we want in the United States in particular,

31:11

we want freedom of press and freedom of

31:13

opinion, but I also feel

31:15

that there should be some way of reporting people

31:17

who make false claims,

31:19

like the vaccines have killed more people

31:22

than, than COVID-19 itself.

31:24

And, but yeah.

31:26

Then other people will say, well, if you suppress

31:28

that opinion, then that's oppressing

31:31

freedom of speech. And I, yeah, I,

31:33

I think that's a very hard

31:35

to solve issue. And I,

31:37

I sort of want this country to be

31:39

about freedom of speech, but when

31:42

that turns into misinformation, that

31:44

could actually cost lives. I do feel there needs

31:46

to be a line drawn somewhere. And

31:48

as scientists, we are all very frustrated

31:51

that there's no way to report on

31:53

Twitter, false information. There is

31:55

actually no button. There's no way to report

31:57

people who send me emails saying

31:59

you belong in jail, or you are

32:02

a fraud. Like I cannot report

32:04

that I've reported several of these tweets and

32:06

I always get to hear from Twitter.

32:08

We don't feel that violates our rules.

32:10

Like you can actually say a lot of

32:13

things to each other before, before

32:16

tweets are being taken down. And I,

32:18

yeah, it's, it's this delicate balance

32:20

between freedom of speech and, yeah, still

32:23

trying to be polite to each other, and I'm

32:26

not sure how to solve this, this this

32:28

is a very important question with with

32:30

a lot of aspects to it and yeah, just

32:32

don't have the answer. Do

32:34

you do you think all scientists, whether

32:36

it's physics, math, biology,

32:38

geology, whatever have a problem with

32:40

false data or is it just a bigger issue

32:43

within certain subsets? Yeah, so

32:45

I feel fraud in science is

32:48

probably anywhere in any particular

32:50

field of science. I focus

32:52

on images, which are a part

32:54

of molecular biology type of papers

32:56

because they have a lot of protein blots or DNA

32:59

blots. And so those are generally

33:01

photos, but there's also

33:03

A lot of other types of data,

33:05

like optical spectra, where I found

33:07

fraud in. I haven't really looked into

33:10

a lot of other fields, but I do feel there's

33:12

probably fraud everywhere, but I don't know

33:14

enough for example, to, to look

33:16

at a math paper or a geology

33:19

paper to find a potential problem in it,

33:21

because that's not really my

33:23

background, I, I look at these papers

33:25

and I just see numbers or graphs,

33:27

and I just don't understand what they mean. So I

33:29

can't detect problems in them, but I'm

33:32

pretty sure that fraud is everywhere, but

33:34

I also think it is important to

33:36

be it's, it's easy to listen to my

33:38

story and the story of

33:40

misinformation and scientists and, and I want

33:43

to make sure that we don't confuse these two things,

33:45

because there's fraud in science and that's what I

33:47

work on, but I also want to make

33:49

sure that there's, that most science is

33:51

to be to be trusted. And I feel

33:53

it's very easy to hear my story, sorry,

33:56

And interpret like, oh, all science is,

33:58

is flawed, and we can't trust that. And

34:00

at the same time I'm telling no, we should trust

34:02

science. And I, I feel that's a very

34:04

important thing, to distinguish

34:06

between these two things. So I, I will

34:09

say that there is fraud in science. It's probably

34:11

everywhere. There's fraud everywhere. There's fraud in banking,

34:13

in construction. You know, what

34:16

is there, there's probably no field that

34:18

you can think of that has no fraud. So science is

34:20

not immune to that either, but as

34:22

a whole, science is about finding that truth.

34:25

And and it's the only solution we have. I

34:27

feel to solve the big problems that we're

34:29

currently facing in the world: epidemics

34:32

and climate change and,

34:34

and things like that. And I think by now, most

34:36

people will be convinced that for example,

34:38

the earth is not flat. And I feel that

34:41

a lot of these misinformations in science

34:43

are based on, you know, 'the earth is

34:45

flat' data. Like there's, there's no real data

34:47

to believe that that's the case, but people, if

34:49

they want to believe that they'll believe in that.

34:51

Right? The reason we, the reason we live

34:53

longer than 40 years and we have cell phones

34:56

and the internet is because of science. Exactly.

35:00

I, I yeah, I I'm, I'm like, at this point

35:02

I'm like, we need more Elizabeths in the science community, but

35:05

people like me, I'm not, I'm really

35:07

not the only person, but most of these people

35:09

work anonymously for good reasons,

35:11

as you can see, because I'm being hunted down

35:13

by the French you know,

35:15

disinformation trolls. So most people

35:17

will choose to, to do this work

35:20

anonymously, but I'm definitely not the only person

35:22

working on this topic. Do you

35:24

think that in the near future, we'll

35:26

be able to train an algorithm to spot

35:28

a doctored image if given like 10,000

35:30

images? Basically give your eye, your

35:33

particular unique talent to an algorithm.

35:35

Is that a possibility? Yes. And I actually, I'm

35:37

going to take a sip of

35:39

water because I turned this into a drinking game

35:41

because I get this question so many times

35:43

on Twitter. So I'm just going to take a sip

35:45

of water. Delicious.

35:49

So a lot of people will

35:51

say, oh, I can, I can write

35:53

a tool on a Friday afternoon that can do what

35:56

you can do. It's much harder than

35:58

you than you might think, because a lot

36:00

of these duplications

36:02

are not pixel-to-pixel identical. So

36:04

science images have usually

36:06

been compressed a lot. They're inserted

36:08

in a PDF. There's all kinds of image

36:11

compression and data processing

36:14

that made one image

36:17

look like another, but not, not

36:19

pixel to pixel. So you can't just do

36:21

your standard pixel to pixel comparison

36:23

and find these things. A lot of them, a lot of

36:25

the times the images are also rotated or zoomed

36:28

in or stretched out, or like mirrored.

36:30

So it's a little bit more complex than that.
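
[Editor's note: a minimal sketch of why detection software uses keypoint features rather than raw pixels. ORB descriptors tolerate rotation and recompression, and mirroring can be handled by also matching a flipped copy. Filenames and thresholds are the editor's assumptions; this is an illustration, not a production detector.]

    # Minimal sketch: match two panels despite rotation or mirroring,
    # where a naive pixel-to-pixel comparison would fail.
    import cv2

    img1 = cv2.imread("panel_a.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("panel_b.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create()
    k1, d1 = orb.detectAndCompute(img1, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best = 0
    for candidate in (img2, cv2.flip(img2, 1)):   # also try the mirror
        k2, d2 = orb.detectAndCompute(candidate, None)
        if d1 is not None and d2 is not None:
            good = [m for m in matcher.match(d1, d2) if m.distance < 30]
            best = max(best, len(good))

    if best > 50:  # threshold is an assumption
        print(f"{best} strong feature matches: possible duplicate panels")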

36:32

And I've actually participated in a DARPA

36:34

challenge where I came with my data set

36:36

of flawed images and good images. And

36:39

nobody could crack the code. There were several groups

36:41

that, that all claim that

36:43

they could, right, on a Friday afternoon, they could write

36:46

this program to detect it. And we're now

36:48

three years later and now they're starting to

36:50

develop tools that can actually do

36:52

this. So it's, it's pretty hard. But

36:55

on the other hand, yes, this information

36:57

technology will be there and it's,

36:59

it's actually getting ready. There's a couple of tools

37:01

I'm already starting to use that

37:04

are starting to find applications,

37:06

but, and in some cases they're better

37:09

than what I can find. They're definitely faster,

37:11

but there's also duplications that I just see

37:14

with my, just my eyes,

37:16

and the software just cannot see it.

37:18

I'm like, come on. This is there. It's so clear.

37:20

Yeah, so it's, we're still, we still have a long

37:22

way to go and it always needs human

37:25

interpretations. Software

37:27

can easily, maybe in the end pick up

37:29

a duplicated image, but in some cases

37:31

there are duplications that are expected

37:34

and actually quite normal. Where,

37:36

for example, you do a control experiments and

37:39

you compare to two particular drug

37:41

treatment, but then later you have the same photo

37:43

of the control experiment, and you compare to another

37:45

experiment. So in those cases you might see the

37:47

same photo. But it's a total,

37:50

a totally normal and acceptable

37:53

way of, of reusing the photo. So

37:55

the software might still detect it as a duplicate,

37:58

but then you need a human to interpret it: yeah,

38:00

Well, this is actually the same experiment, so that's fine.

38:02

The advantage of software will be

38:04

in the end that any image in a

38:06

manuscript that is sent into

38:09

a journal could be scanned against

38:11

the database of all images that have ever

38:13

been published, which is, you know,

38:15

computationally still challenging, something

38:18

that I expect to be solvable so that people

38:20

who want to reuse an image from another

38:22

group from an older paper will

38:24

be caught. And that is something I could

38:27

never do. I can only compare a couple

38:29

of images to each other, but I

38:31

cannot remember enough of them to remember

38:34

an image I've seen three years ago. I would not

38:36

remember that image.
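
[Editor's note: the archive-scale screening she describes is usually approached with perceptual hashing: each published figure is reduced to a short fingerprint that barely changes under recompression, so a new submission can be checked against millions of stored hashes cheaply. A minimal sketch using the third-party imagehash library; filenames and the distance cutoff are the editor's assumptions.]

    # Minimal sketch: compare a submitted figure against an archive of
    # perceptual hashes instead of rescanning raw pixels every time.
    from PIL import Image
    import imagehash  # pip install imagehash

    archive = {
        "paper_123_fig2.png": imagehash.phash(Image.open("paper_123_fig2.png")),
        # ... one stored hash per previously published image ...
    }

    new_hash = imagehash.phash(Image.open("submission_fig1.png"))
    for name, h in archive.items():
        if new_hash - h <= 6:  # small Hamming distance = near-duplicate
            print(f"submitted figure resembles {name}")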

38:38

Sure. So yeah, it sounds like the software will play a role, but it will never

38:41

be able to totally replace that human factor.

38:43

I think it's fair to say that you've had like an

38:45

undeniable impact on

38:47

science. But as you look out into the future,

38:49

what is sort of your longterm aspiration

38:52

impact that you want to have? You know, what's the north star

38:55

of what you're trying to work towards. I

38:57

hope there will be more, well, punishments

39:00

is always a big word, but like, like some way

39:02

that people who are caught doing

39:05

fraud, that there will be

39:07

repercussions for them because

39:09

I feel there's, there's too many cases. I've

39:11

reported to journals and institutions

39:13

where there was just simply no reply.

39:15

So about 60% of the papers I've

39:18

reported in the past years have

39:20

not been acted upon. These are

39:22

papers with very flawed images. Some

39:25

of them just simple errors that could be addressed

39:27

with a correction. Some

39:29

of them like really outrageous

39:32

Photoshop jobs that are so

39:34

clear to me in five seconds, that

39:36

there is something that is very fishy going

39:38

on there. But five years down

39:40

the road, these papers are still out there. And

39:43

so. I'm looking forward

39:45

to work together more with journals,

39:47

with institutions, with publishers to

39:50

very quickly address these, these

39:52

problems and not have them look

39:54

the other way. There are so many conflicts of interests

39:57

where journals do not want to respond

39:59

because they might lose their citations. They

40:01

might lose their image as

40:03

a, as a, you know, a good journal that would never

40:06

publish any fraud and institutions

40:08

also do not seem to want to address

40:10

these cases because maybe they have a very

40:12

famous professor who is

40:14

being accused of something bad, but

40:16

he or she brings in lots of money. And so

40:19

let's just pretend this didn't happen.

40:21

And so I'm looking forward to a

40:24

time where these cases are swiftly addressed

40:26

and where there's much more room to

40:29

give money to honest scientists and not the scientists

40:31

who cheat. And that's still a long time away. Cool.

40:35

So yeah, we, we have a bunch

40:37

of admiration for your work. I'm sure all of our listeners

40:39

will too. I just want to wrap things up now

40:41

by asking you, what is your favorite paper? Oh,

40:45

my gosh. One of my favorite papers is

40:47

that of Lawrence David in

40:49

which he sampled

40:52

himself sampled his microbiome. So

40:54

he took his own stool samples and

40:56

followed himself and another scientist

40:59

for a, about a year. He

41:01

looked at how his, the composition

41:03

of his bacteria in his stool changed

41:05

over time. When, when for example,

41:08

he went camping, or he got sick,

41:10

or went to another country and you can

41:12

see it. You can see the stability

41:14

of the human microbiome. And you can see

41:17

also the periods

41:19

where the microbiome just changes

41:21

because he got sick or, you know, the little

41:23

things we go through over time. And I felt that

41:26

paper was

41:28

so important to show the enormous

41:30

stability of our microbiome, which

41:32

is amazing because we eat different things every day.

41:34

And so we feed our microbes different, different

41:37

foods every day, but it's pretty

41:39

resilient to the changes that

41:41

we, we we bring along to it. But

41:44

when we have a big change, when

41:46

we got really sick or we go to a different country,

41:48

this is where the microbiome of the human changes.

41:50

And I thought it was so elegantly done. That

41:52

was one of my favorites. That's

41:54

I would love to read that. Could you send us a link and we

41:56

can put it there? Of course. Yes. It's, it's an old

41:59

paper. I haven't really kept up to

42:01

date with microbiome papers, but so it's

42:03

probably by now around eight years old

42:05

or so, but yeah, it's a, it's, I love the paper.

42:07

It just has some really cool graphs. Oh,

42:09

nice. Yeah. And it sounds like such an interesting story

42:11

that he's telling, you know. It's not just one piece

42:13

of research, but this, this guy's life

42:16

that is being explored through science. I think it's a

42:18

very poetic. Yes. All

42:21

right. Well, thank you so much for coming on the podcast. Elizabeth

42:23

has been such a cool episode. One of my favorites so

42:25

far, and I've learned a ton. Well,

42:28

thank you. Thank you. It was my,

42:30

my pleasure being here. Thank you. Thank you

42:33

so much, Elizabeth. I really appreciate it.
