Data Science Innovation: Driving Alzheimer's Disease Insights

Released Tuesday, 19th September 2023

Episode Transcript


0:13

Hello and welcome to the Data

0:13

Science Innovation Driving

0:18

Alzheimer's Disease Insights.

0:18

I am Jeana Konstantakopolous,

0:22

the Senior Director of Partner

0:22

Engagement here at Matter, and Matter is a

0:26

healthcare technology

0:26

incubator and innovation hub.

0:29

Built on the belief that collaboration

0:29

between entrepreneurs and industry

0:33

leaders is the best way to

0:33

develop healthcare solutions.

0:36

Our mission is to accelerate the

0:36

pace of change of healthcare,

0:39

and we do three things in service of this

0:39

mission. First, we incubate startups.

0:44

Since we launched eight years ago, we've worked with more than 800 companies

0:46

that range from very early growth

0:50

stage startups to larger companies,

0:53

and we offer them a suite of services

0:53

to help at every stage of their

0:56

development. Our member companies have raised more

0:57

than $5 billion to fuel their growth.

1:02

Second, we work with larger

1:02

organizations such as health systems,

1:05

life science companies, payers, and foundations to strengthen

1:07

their innovation capacity,

1:11

and we help them find value in emerging

1:11

technology solutions by unlocking the full

1:15

potential of both their internal

1:15

innovators and then creating more

1:19

human-centered healthcare experiences

1:19

through system level collaboration. And

1:24

third, we're a nexus for people who are

1:24

passionate about healthcare innovation.

1:28

We bring people together to be inspired

1:28

to learn and to connect with each other.

1:33

We produce a lot of programs

1:33

like this one today,

1:35

including large scale events for the

1:35

broader community and small forums

1:39

exclusively for our members and partners. Today's event is being conducted

1:42

alongside our Brain Health Innovation

1:47

Challenge, which you'll hear a little bit more about

1:47

at the end of this program that we're

1:51

producing with support from the

1:51

Lundbeck US Charitable Fund.

1:56

It is an independently managed

1:56

nonprofit 501(c)(3) that is

2:01

committed to the responsible

2:01

and appropriate support of programs

2:06

about restoring brain health. The Lundbeck US Charitable Fund

2:08

is wholly owned by Lundbeck,

2:12

a global pharmaceutical company

2:12

specializing in brain disease.

2:16

For more than 70 years, Lundbeck has been at the forefront

2:18

of neuroscience research tirelessly

2:22

dedicated to restoring brain health so

2:22

that every person can be their best.

2:27

So, on to today. For years,

2:29

data analytics has been used

2:29

in healthcare to fuel faster,

2:33

accurate diagnoses, to

2:33

inform decision-making,

2:37

personalize treatment, improve

2:37

patient care and outcomes,

2:41

lower costs and more. But with the recent advances that we're

2:43

seeing with big data and generative

2:48

artificial intelligence, more organizations are exploring

2:50

how they can use these new,

2:55

modern data science tools to address

2:55

persistent healthcare challenges.

3:00

So talking about challenges, one of the key challenges right now

3:02

is advancing care for our growing

3:07

population of older people living

3:07

with Alzheimer's disease and the related

3:12

dementias. And we're seeing that there

3:13

are a wide range of disparate

3:18

sources of raw data, including

3:18

electronic health records,

3:21

personal health records, patient

3:21

portals, health-related smartphone apps,

3:25

wearables, and lots of

3:25

unstructured data out there.

3:29

And the question is how can we

3:29

gain meaningful insights? Well,

3:33

hopefully our panel today

3:33

will help us to dig into that.

3:37

Today we're joined by Mary Furlong,

3:37

a leader in the longevity market.

3:42

Elizabeth Powers, the Vice President and

3:42

general manager of US Regulatory Science

3:46

and Study Innovation at

3:46

IQVIA, and Ryan Urbanowicz,

3:51

research scientist, computational

3:51

biomedicine at

3:54

Cedars-Sinai Medical Center, and the co-lead of the Tech ID and Training

3:56

Core at Penn AITech in the a2

4:01

Collective. And our conversation today

4:01

is going to just dig into this topic.

4:05

So with that, hello everyone.

4:11

Hey, how's it going? Hi, Ryan. Hi, Elizabeth.

4:16

I'm good. Great to have you guys join us today.

4:20

Before we really kind of

4:20

dig into the topic at hand,

4:23

I'm going to go around the table and

4:23

have you tell me a little bit more about

4:27

yourselves and how you've

4:27

come to look both at

4:32

the growing older population in the

4:32

US, brain health as it relates to

4:37

Alzheimer's disease and dementia. And

4:37

then this question of data. All right.

4:41

So I'm going to start with you, Mary, if you can tell me a little

4:42

bit about you and your role.

4:46

Well, I've been at this a long

4:46

time. I'm a serial entrepreneur,

4:50

have started three companies

4:50

and I think have raised

4:55

about $250 million in corporate

4:55

sponsorships and venture

4:59

financing for startups that are

4:59

building companies in the longevity

5:04

market. I produced the

5:04

Longevity Venture Summit,

5:09

which I've done for about 20 years,

5:09

and the Washington Innovation Summit.

5:14

And I have a podcast called

5:14

Longevity Deal Talk in terms of brain

5:18

science. I'm an advisor to the

5:18

Canadian brain health group CABHI.

5:24

I've been part of Posit

5:24

Science since the beginning,

5:27

and I'm recently judging

5:27

the UK competition for

5:32

business plans related

5:32

to research and dementia.

5:37

Thank you. I'm sure your wide array

5:38

of experience here will be of

5:43

great use today in our

5:43

conversation. Elizabeth,

5:45

can you tell me a little bit about yourself? Yes. So as you said,

5:49

I'm vice president and general manager

5:49

of a group within IQVIA called

5:53

Regulatory Science and Study Innovation.

5:56

Our primary mission is to find ways

6:01

of unlocking access to

6:01

clinically rich data,

6:06

including new data sources, some traditional data sources

6:07

like electronic medical records,

6:11

but also new data sources

6:11

like wearables and so forth,

6:15

big data sources, small data sources,

6:19

and really figuring out how to use

6:19

those data sources in a way that has

6:24

scientific credibility and rigor.

6:28

And so we are in the early stages of

6:33

building out some new research

6:33

networks that focus on CNS conditions,

6:38

including Alzheimer's, and obviously there are a

6:40

lot of exciting treatments

6:45

coming to market for

6:45

Alzheimer's and dementia.

6:48

And with that comes a bolus of research sponsored

6:53

by pharma companies,

6:56

and we are heavily involved in

6:56

various efforts around that.

7:01

So really happy to be here with you today. Thank you.

7:04

We're excited to have you and certainly

7:04

have someone who's riding the wave,

7:08

so to speak, of what's happening

7:08

out there in the ecosystem. Last,

7:12

but certainly not least, Ryan, we have you and I think that

7:14

you are our resident data guru,

7:18

so if you would please share a little

7:18

bit about yourself, that would be great.

7:22

Sure. I'm Ryan Urbanowicz. I'm currently an assistant professor at

7:24

Cedars-Sinai Medical Center as well as

7:28

an adjunct at UPenn. I run the URBS Lab.

7:31

We do research in the development

7:31

of machine learning and artificial

7:34

intelligence methods as well as our

7:34

application to a variety of biomedical

7:39

target data points or objectives.

7:42

And our lab specializes in development

7:42

of automated machine learning tools as

7:47

well as interpretable

7:47

rule-based machine learning.

7:49

So I'm very much coming at this

7:49

from the computer science side,

7:52

data analytics side, and I've gotten involved in Alzheimer's

7:54

research in particular over the last

7:57

couple of years through the Penn AITech

7:57

and the a2 Collective that's funding

8:03

research grants for technologies and AI,

8:06

especially applied to Alzheimer's

8:06

and dementia and other aging

8:10

issues. And I approach data kind of agnostically

8:12

because I'm involved in a lot of

8:16

domains and also because many of the

8:16

challenges and questions in data

8:21

science are kind of

8:21

universal and generalizable.

8:26

But there's a lot of things that I

8:26

think about in terms of Alzheimer's

8:31

that are unique in terms of data

8:31

quality, what features to collect,

8:36

things like that. So anyway. Yep.

8:39

Great. I think we have the perfect set

8:39

of perspectives to get our conversation

8:44

underway here. So let's kind

8:44

of pivot here, which is,

8:47

I'm going to start with you, Elizabeth,

8:50

which is Alzheimer's disease and related

8:50

dementias are on the rise. As we know,

8:55

our older population is growing and

8:55

because Alzheimer's disease and dementia

8:59

tend to have a later

8:59

onset for most people,

9:02

it kind of goes hand in hand with the

9:02

prevalence of these cognitive disorders

9:07

increasing. It means that we're looking at more

9:08

people encountering these diseases.

9:12

And what are you hearing about some of

9:12

the challenges in the marketplace as it

9:16

relates to this? I think

9:21

I'll kind of organize this along

9:21

the patient journey, if you will.

9:27

I think first of all, there is patient and caregiver fear

9:33

and uncertainty about what is

9:38

happening: what is happening to

9:38

me, what is happening to my parents,

9:44

my aunt, my grandparent,

9:44

my brother, my sister,

9:49

and deep fear about knowing

9:49

an answer because of the

9:54

implications of that from

9:54

a caregiver perspective.

9:57

Then once someone is

9:57

in to see a clinician

10:02

about this, we're not.

10:06

So I was just sitting here thinking, I've been working in Alzheimer's

10:08

for almost 20 years now,

10:12

and I think we're still

10:12

seeing very inconsistent

10:17

practices. It's not like frankly,

10:20

oncology where there are

10:20

relatively clear lines of

10:25

care. There's not even clear diagnoses.

10:30

And that is still the case. Now,

10:33

my own personal hope is that over the

10:33

next five to 10 years with new therapies

10:37

coming, it will overcome

10:42

physician, patient and caregiver

10:44

resistance to a diagnosis

10:49

because there are treatments. Oftentimes,

10:49

in therapeutic areas where there

10:53

aren't really meaningful treatments,

10:56

it can be very difficult

10:56

to get to a diagnosis.

11:02

And then there's just record keeping.

11:04

Actually what gets put into

11:04

the EMRs is very different

11:09

from physician to physician. And then the last thing

11:13

I'll say is then there's the

11:18

burden on the caregiver,

11:20

whether that is someone in a nursing

11:25

facility, step down,

11:25

step up care facility,

11:30

assisted living facility, or just in a home with family,

11:37

there has to be a limit on the

11:37

burden that's put on the caregiver,

11:41

and that's something that really has to

11:41

be taken into strong consideration both

11:46

in terms of treatment and care and in

11:46

terms of data collection for research.

11:51

So I'll just hit pause there. I think that's a rich ground for

11:54

us, I think to play in. But Mary,

11:58

I'm going to pivot to you next. I know that longevity is something

11:59

that you think a lot about and kind of

12:04

this growing population of

12:04

concern around brain health,

12:09

not just by the way, do I

12:09

have Alzheimer's disease or dementia,

12:13

but preventatively, what can I

12:13

be thinking about to do that?

12:16

And kind of having a

12:16

community of peers focused as

12:21

well. What's kind of your take on

12:21

what's happening in the marketplace?

12:25

I thought I might size the market. So the longevity market is

12:27

an $8.3 trillion market,

12:32

and then there's a lot

12:32

of riches in the niches.

12:34

So if you look at the boomers,

12:39

at the top end they are 77,

12:42

so in three years they're going to be 80,

12:46

and then you've got 20 years

12:50

of older adults coming behind

12:50

them. So it's a huge market. Now,

12:56

the opportunity for innovation

12:56

in the home, in the care setting,

13:01

in the adult day setting, in senior housing communities and

13:03

in places with dementia wings,

13:09

that's really important to look at.

13:09

But we're just at the very beginning.

13:13

I mean, more people watch Wheel of

13:13

Fortune; that's their

13:18

cognitive fitness. And so if

13:18

you take an issue like driving,

13:22

which I'm very concerned about

13:22

right now because you look at

13:27

the number of people who are not going

13:27

to be able to drive in the next 10 years,

13:32

and we're not prepared for that in

13:32

terms of accessing resources in the

13:36

home. So some of the analysts think Uber

13:37

Health is one of the most important new

13:42

brands out there

13:42

that could play a role,

13:46

but lighting can play a

13:46

role, pharma can play a role,

13:51

and there's a huge staff shortage.

13:54

So there's really got to be

13:54

brand new models for how we

13:59

find, train, and retain caregivers.

14:07

Certainly, no, there's not.

14:10

I think that maybe that's part of this,

14:10

right, which is that there's so much,

14:13

not just from a standpoint of talking

14:13

about the number of people that are

14:18

potentially impacted by this, but the myriad of kind

14:19

of concerns that they're

14:24

starting to have to consider not just

14:29

healthcare and data, but these larger access issues

14:30

that you pointed to that all

14:35

have a role to play in things

14:35

like diagnosis or care.

14:39

So really interesting. I think I want to take some of

14:41

these things and maybe Ryan,

14:44

I'll come to you next, which is this notion of data

14:46

we heard from Elizabeth about

14:50

data and relating to this kind of

14:50

group being in lots of different

14:55

places. As someone who spends their days

14:57

looking at these troves of data,

15:02

what do you think is the current

15:02

state of certainly Alzheimer's

15:07

data, but maybe more broadly,

15:09

this older adult and population level

15:09

data that could start to play

15:14

into things, what does that look like to you? I think like a lot of biomedical domains,

15:20

the data is distributed,

15:20

it's siloed, it's messy.

15:26

We're still figuring out in many

15:26

cases, what are the right variables?

15:29

What is the right information

15:29

to collect on patients?

15:32

What do we need to be

15:32

collecting in order to target

15:36

care or to monitor for care?

15:40

So there's a lot of unanswered questions

15:40

in terms of just knowing what to gather

15:45

correctly, let alone how to do it well,

15:45

and thinking ahead is really important.

15:50

I think that's already been brought up. We need to think about what the future

15:52

needs are going to be and make our data

15:56

collection systems adaptable so that

15:56

as our understanding of these issues

16:01

changes, so can the way that we collect our

16:01

data and the way we utilize our data

16:06

to translate it back into patient

16:06

care or help or whatever it is that we

16:11

want to focus on. And maybe as a way to

16:14

take another kind of step

16:19

back here, when you're looking

16:19

day-to-day at these datasets,

16:23

are you only thinking about things like

16:23

the older adult and Alzheimer's disease,

16:27

or are you looking at other kind of

16:27

patterns and approaches that you're

16:32

taking elsewhere? Yeah, no, I wish I could say I'm

16:34

entirely focused on Alzheimer's disease,

16:37

but no, I think very broadly about a lot

16:37

of biomedical outcomes and work

16:42

with a number of data

16:42

types. One small example,

16:47

I was involved in a clinical trial for,

16:54

I forget, I'm getting my biomedical

16:54

outcomes mixed up. But anyway,

16:58

in this project they had clinical

16:58

trial data from multiple sites and just

17:02

harmonizing the data from

17:02

these pretty well structured

17:07

programs was an absolute

17:07

nightmare. It took two,

17:11

three years just to bring this data

17:11

together before we can even really analyze

17:15

it, and this is just a reflection of

17:15

what the current state tends to

17:20

be in the medical field in terms of

17:20

being on the same page from the get

17:25

go on how we're going to

17:25

collect information, or, for data

17:29

that's already out there, how do we bring it together and leverage

17:30

it in a way that is reliable and

17:34

trustworthy. Ryan, if I can just

17:36

dovetail on that. I mean,

17:40

you just said that you were involved

17:40

in a clinical trial and there was an

17:43

enormous amount of effort

17:43

to harmonize the data across

17:48

sites. That's under the best of circumstances

17:49

where sites are entering things into a

17:54

pre-designed eCRF case

17:54

report form to feed into

17:59

an electronic data capture system. When you're working with

18:01

real world data where

18:07

even if the EMR is Epic, every

18:07

system is configured differently,

18:12

much less adding in data

18:18

that is occurring from outside

18:18

the actual site of care,

18:24

but getting a sense of what a person's activity level is,

18:32

what's happening with certain

18:36

biometric data: heart rate, sweat, anxiety,

18:41

these are all things that are

18:41

relevant to patients, people,

18:46

people with dementia, and

18:50

that data gets very hard to

18:50

integrate and is incredibly messy.
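A minimal sketch of what that kind of site-to-site harmonization can look like, assuming two hypothetical site exports whose column names and units disagree (all names and values here are made up):

```python
# Illustrative only: harmonizing two hypothetical site exports before pooling.
import pandas as pd

# Hypothetical exports; column names and units differ by site.
site_a = pd.DataFrame({"patient_id": [1, 2], "age_years": [74, 81], "mmse": [24, 19]})
site_b = pd.DataFrame({"PatientID": [3, 4], "age_months": [948, 1020], "MMSE_score": [27, 22]})

# Map each site's schema onto one shared schema.
site_a = site_a.rename(columns={"mmse": "mmse_score"})
site_b = site_b.rename(columns={"PatientID": "patient_id", "MMSE_score": "mmse_score"})
site_b["age_years"] = site_b.pop("age_months") / 12  # convert units

combined = pd.concat([site_a, site_b], ignore_index=True)
print(combined)
```

Even in this toy version, every renaming and unit conversion is a decision somebody has to make and document, which is where the years of harmonization effort tend to go.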

18:55

Absolutely. And one other

18:55

quick sort of side note,

19:00

in addition to when we're thinking

19:00

about collecting new data,

19:04

another big point that might be worth

19:04

discussing more is thinking about patient

19:08

privacy concerns. We want to be collecting this data

19:10

so that we can make best use of it,

19:14

but also how do we do that

19:14

without patients feeling they're

19:19

or giving away their personal freedoms? There's actually a question

19:24

in the chat about passive data

19:29

collection. So in IQVIA's business, we're really just

19:35

beginning to see large scale studies,

19:40

real world studies come through

19:40

where sponsors are hoping

19:45

to have passive data collection

19:45

through a wearable. I mean,

19:50

that's kind of what it really

19:50

has to be in a certain way,

19:55

but actually having a

19:55

validated tool for that.

20:02

In my experience, it's still very much early days and

20:12

in our view, it is critical for research and

20:13

dementia and Alzheimer's to have this

20:18

for a whole range of reasons

20:18

tied to things I've already said.

20:24

But I think we're still some years

20:29

at least two to three, if not five to 10,

20:31

from really having the sophistication of

20:36

tools and sensors to be able to do this

20:36

easily and without burdening the patient

20:41

or the caregiver. It's interesting because I think we

20:43

do this in healthcare a lot, Elizabeth,

20:47

which is one of the first

20:47

questions I asked you.

20:50

You started with patient experience

20:50

and caregiver experience,

20:53

which is that individual

20:53

level of healthcare,

20:56

which is so intimate and so personal,

20:59

but at the same time, to make

20:59

that a really meaningful kind of

21:04

evidence-based experience for them.

21:04

We depend on population level data,

21:09

we depend on the rollup of all

21:09

of that to enable that personal

21:13

experience. So maybe

21:13

Ryan, a question for you.

21:18

As we talk about these

21:18

disparate realms of data

21:22

development and production,

21:22

living in little spots,

21:27

how do you think about population

21:27

level data and unlocking some of those

21:31

insights, whether for this older

21:31

population, or where you think we

21:36

need to look. And what's your

21:36

experience as someone who

21:41

is looking at things

21:41

like AI to uncover some

21:46

of those things? So there's already an incredible

21:48

wealth of tools out there for

21:53

machine learning and AI.

21:53

There's some great advancements.

21:56

Obviously that's still an

21:56

evolving field as well.

21:59

There's a billion unanswered questions, but in terms of thinking about

22:01

analyzing this kind of data,

22:04

the first thing that I worry about that

22:04

sometimes can get overlooked I think is

22:10

the data quality and collection. And that is absolutely essential. You

22:13

might've heard the phrase garbage in,

22:17

garbage out, right? It is tempting to have this magical

22:19

thinking about machine learning and AI.

22:23

It's like, I'll just pass it to the

22:23

tools and then I'll get something good.

22:27

And the real meat of all of

22:27

this is always going to be

22:32

what variables do I collect? What is the quality of the data I'm

22:34

collecting so that I can leverage these

22:38

awesome new tools to really make

22:38

the most out of the data,

22:42

but at the same time not

22:42

fall into pitfalls like

22:47

bias, right?

22:47

Bias is a huge one.

22:49

Making sure predictive

22:49

algorithms are fair, that

22:55

they allow fairness in

22:55

their decision-making and in the

23:00

information we glean from

23:00

these tools and models. One,

23:04

in terms of methodologies and making

23:04

use of the data. I mentioned earlier,

23:08

one of the areas of research I'm

23:08

in is automated machine learning,

23:11

and I'm actually just writing a paper

23:11

on it right now and surveyed a large

23:16

number of AutoML tools, which are making it a lot easier for

23:17

people to use machine learning and

23:22

hopefully do a better job.

23:27

The problem with machine learning analysis

23:27

pipelines is that there's a billion

23:31

ways to make one, right? Everyone

23:31

has an opinion, everyone has a belief,

23:36

and there's a lot of right ways to do it,

23:38

but there's also a lot of wrong ways to do it. So paying attention to the

23:40

evolving machine learning and AI, in

23:45

terms of how data analysis

23:45

is conducted, I think, is important.

23:49

Us all taking a role in

23:49

being critical of how we do

23:54

that, I think is going to be

23:54

really valuable in the future.
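A minimal sketch of the kind of per-group check this points toward, assuming hypothetical columns, groups, and synthetic data: train a simple classifier and then compare its accuracy across subgroups to surface potential gaps.

```python
# Illustrative sketch: does the model perform equally well across subgroups?
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(60, 95, n),
    "mmse": rng.integers(10, 30, n),
    "group": rng.choice(["A", "B"], n),  # hypothetical demographic group
})
df["dx"] = (df["mmse"] + rng.normal(0, 3, n) < 22).astype(int)  # synthetic label

X = df[["age", "mmse"]]
train_X, test_X, train_y, test_y, _, test_g = train_test_split(
    X, df["dx"], df["group"], test_size=0.3, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(train_X, train_y)
pred = model.predict(test_X)

# Report accuracy separately for each group; large gaps flag potential bias.
for g in ["A", "B"]:
    mask = (test_g == g).to_numpy()
    print(g, round(accuracy_score(test_y[mask], pred[mask]), 3))
```

A large gap between the per-group numbers would be one signal that the data collection or the model is not serving those groups equally well.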

23:59

Well, maybe kind of

23:59

taking a page from that,

24:03

this notion of garbage in garbage out, there are a lot of efforts currently

24:05

underway such as the National

24:10

Institute on Aging is creating

24:10

some data repositories.

24:14

There are a number of other organizations

24:14

that are looking to kind of create

24:18

these data pools of information that has

24:22

been well collected so that, as organizations

24:27

train their solutions and train their AI, they

24:29

have data that is relevant and meaningful.

24:35

I'd love to hear maybe Elizabeth and Mary,

24:37

if you know about some of these

24:37

kinds of efforts underway,

24:40

what your thoughts are as far as these

24:40

data collection and repository efforts

24:44

that are happening. Well, I know it from the NIA funding,

24:51

so there's $160 million I

24:51

think every year that they are

24:56

funding. And I think about 60%

24:56

of that is going into innovations

25:01

related to brain health.

25:03

And so we can look to the work of

25:03

some of those entrepreneurs and see

25:09

what they see. But maybe I should turn it

25:10

to Elizabeth to say more.

25:16

So there are some data sets that

25:24

are large enough, but with the evolution of treatment,

25:32

I think what we're seeing

25:32

is that those datasets,

25:36

what needs to be in those

25:36

data sets is evolving.

25:41

And so they don't always fit the bill

25:41

depending on what you're wanting to

25:46

research. And I do want to address a couple of

25:54

points. Ryan made a point earlier,

25:57

and there are a couple of things

25:57

going on in the Q&A chat here about,

26:04

I'll call it broadly social

26:04

determinants of health.

26:07

Part of what we're seeing

26:07

with the coming surge

26:12

of the existing and

26:12

tidal wave of Alzheimer's

26:17

research is the need for more diverse data.

26:23

And what that means is

26:23

that you can't just go.

26:27

So typically large pools of data have been

26:32

driven through academic research centers.

26:36

Those research centers skew wealthier,

26:40

whiter, I'm sorry to make

26:40

mass generalizations,

26:44

but this is what we see again and

26:44

again regardless of therapeutic area.

26:49

And so we are seeing a push

26:49

on the part of pharmaceutical

26:54

companies and specialty

26:54

organizations to really try and get

26:58

attached to

27:03

community points of care. And what that means is

27:06

that you're tapping into

27:10

physicians who are not

27:10

accustomed to research.

27:15

They want to be part of research,

27:18

but they don't have the practices, they

27:18

don't have the staff to support it.

27:24

And even then,

27:30

there's also in the chat

27:30

a little stream of, yes,

27:33

so much valuable healthcare information

27:33

exists outside of a point of care

27:39

and getting access to that

27:39

also means that you're

27:43

skewing richer, whiter.

27:49

And so it's really, I think there's one challenge in

27:51

getting to the community physicians.

27:56

There's another challenge in getting

27:56

at the data that is happening

28:01

outside of a care setting and

28:01

just part of activities of

28:06

daily living that is really important. And

28:12

again, I think we're probably five

28:12

to 10 years from really having

28:17

good validated ways of

28:17

collecting that data.

28:21

I hope it's faster, but I think that the reality

28:23

of that term validated

28:29

means that it's a medium term

28:29

thing, not a near term thing.

28:34

I think this is an important

28:34

thread to talk about,

28:37

which is the notion of social

28:37

determinants of health. I think, Mary,

28:41

you had that really interesting

28:41

comment earlier about transportation,

28:46

and I know in some of our

28:46

earlier conversations that

28:50

you talked to me a little bit about

28:50

banking and financial records,

28:54

and you've talked to me a little bit

28:54

also about the role of the secretary of

28:58

state and the driver's license and what

28:58

this means for older adults. These are kind

29:02

of, I'll say, outside-the-healthcare

29:03

points of data that still have

29:08

incredible relevancy. And I'd love for you to just maybe

29:09

talk about a few of those things. Yeah.

29:12

I have a really great example. So

29:17

I am renewing my license,

29:17

and so I'm going to be 75.

29:21

A lot of my friends are doing the same, and so they all have to take and

29:23

prepare for the driver's test.

29:27

And so I took the AARP

29:27

driver's class and I had

29:32

found, my husband found for me a really

29:32

good program with AI built into it.

29:37

So it doesn't just teach

29:37

you the rules of the road,

29:39

it helps you understand rules of

29:39

the road. So in this class I was in,

29:44

which was live, I said, oh,

29:47

this program will really help you

29:47

understand the signs and everything. Well,

29:52

people said, I don't have a computer. So when you realize that digital

29:54

access to your point about data

29:59

sets is not there for everyone.

30:02

So the notion of just what Covid

30:02

taught us is we first have to help

30:07

people get digitally literate, and then that's another

30:09

way to gather information.

30:14

Otherwise, you could just have someone

30:14

check the box and say unfit to drive.

30:19

And so I think that's a big place that you

30:24

see it. Banking is another,

30:26

so older people are

30:26

looking for things to do,

30:30

and so they want to go and talk to the

30:30

banker, the savings and loan person.

30:34

They don't

30:34

necessarily want the automated

30:40

ATM because it's part

30:40

of their socialization.

30:43

And so it's the book

30:43

clubs, it's the pharmacies,

30:47

it's these local places where older adults

30:47

appear that that's where you begin to

30:52

see that they might just be losing it. So when their boyfriend tells them to

30:54

withdraw some money, someone they met,

31:01

then the banker now sees that

31:01

maybe something's not okay

31:06

with mom or dad. So it's looking at maybe some of

31:08

these unconventional sources of

31:13

data, and not just data,

31:16

but potentially thinking about this

31:16

a little bit more humanistically

31:21

contact points that we

31:21

have in our community.

31:24

But for a lot of those contacts,

31:24

there is kind of a record.

31:28

There's something that exists out there. I think the surgeon general's report is

31:31

really important about loneliness and

31:36

even the fewer hours that

31:36

families are connecting,

31:39

fewer places that people can

31:39

go and gather. And as they age,

31:43

they often have fewer friends

31:43

because

31:48

they lose some of their friends. So we have to think locally and we have

31:49

to think very broadly about how do we

31:54

reach these caregivers and how

31:54

do we train the caregivers,

31:58

many of whom do not understand the

31:58

nuances of the research reports

32:03

on things like dementia, but they're

32:03

in the front lines day to day.

32:09

I think it's a really interesting

32:09

point. Maybe Ryan, I'll ask you,

32:14

we've talked about some of

32:14

these kind of unstructured data

32:18

things that we see, for instance,

32:21

like notes in a health record or

32:21

maybe a banking flag or things

32:25

like that. How do you, as someone who's looking at data

32:27

sets start to consider some of this

32:31

unstructured data that's out

32:31

there to help weave a more

32:36

rich and thorough story about

32:36

what we're trying to analyze?

32:42

Sure. So first off, I guess I should acknowledge that

32:43

I'm not really an expert on analyzing

32:46

unstructured data. I certainly

32:46

collaborate with those that do.

32:50

And this is an exciting time

32:50

to be in that area of research.

32:56

There have been over

32:56

the last 5, 6, 10 years,

32:59

pretty incredible advances in natural

32:59

language processing, use of large

33:04

language models, deep learning methods in general to

33:06

work directly with unstructured data to

33:10

analyze that type of data. And I think it's important to keep an

33:12

eye on these emerging methodologies.
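Not one of the specific methods named here, but a minimal sketch of turning unstructured, free-text notes into something a model can use, assuming a handful of invented note snippets and labels (classical TF-IDF features plus logistic regression):

```python
# Illustrative sketch: bag-of-words features from free-text notes, then a simple classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical snippets standing in for unstructured clinical notes.
notes = [
    "patient reports increasing forgetfulness and word-finding difficulty",
    "caregiver notes repeated questions and getting lost while driving",
    "annual wellness visit, no cognitive complaints reported",
    "follow-up for hypertension, memory intact per family",
]
labels = [1, 1, 0, 0]  # 1 = cognitive concern documented, 0 = none (made up)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(notes, labels)
print(clf.predict(["family worried about memory lapses and confusion"]))
```

Large language models and deep learning approaches replace the hand-built feature step, but the same questions about labels, validation, and trust still apply.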

33:17

It's hard to keep track of because

33:17

there's so many people working in this

33:22

domain, it's hard to decide what is the

33:22

most valuable research to focus

33:27

on. And part of that is

33:27

that from a machine learning

33:32

there aren't really established

33:32

rigorous benchmarking

33:36

approaches. Every problem is different.

33:36

It is challenging to say, well,

33:41

this method works better than this method

33:41

for these reasons. It actually turns

33:45

out to be a whole can of worms. And so there is currently a

33:47

little bit of faith in saying,

33:50

I'm going to use this method

33:50

and run with it. But in general,

33:54

trying to understand the advantages and

33:54

disadvantages to any given approach,

33:58

typically with these new methodologies

34:01

that rely on deep learning,

34:04

the biggest trade-off for me is that

34:04

you're most of the time giving up

34:09

interpretability, which in medicine is often a huge

34:10

selling point for methodology.

34:14

We want to be able to trust our models

34:14

and understand the predictions that are

34:19

being made by 'em. And that's always been a big disadvantage

34:20

of deep learning despite the hype

34:25

and the attention that

34:25

deep learning has received.

34:30

Not to say anything bad; deep

34:30

learning is also incredible.

34:32

It does some pretty amazing stuff,

34:32

but it's just one tool in the toolkit,

34:37

right? We've got to remember to use

34:37

the right tool for the right job.

34:40

And often that's not going to be deep

34:40

learning for some of those reasons that I

34:44

just mentioned. With that, maybe Elizabeth,

34:49

you're sort of in the throes of where

34:49

the rubber meets the road between these

34:53

things, which is maybe traditional

34:53

and conventional data thoughts,

34:56

and then the understanding that they're

34:56

not capturing necessarily the full

35:00

picture. How do you bridge that chasm and what

35:02

are some of those things that you think

35:05

about day to day? Well,

35:13

one is let's start with counts.

35:18

Patient counts like how

35:18

many to answer a certain

35:23

question with

35:27

the required degree of

35:27

rigor for the purpose.

35:33

How many patients do you need

35:33

in your analysis and how are you

35:38

going to get there? On some

35:38

level you would say, oh,

35:43

it should be easy. There are

35:43

lots of Alzheimer's patients,

35:48

but actually finding the patients.

35:52

And then someone mentioned federated data

35:57

in the chat. I would add

36:02

this idea of federated data to Ryan's

36:07

list of wishful thinking. It's just not that easy

36:10

because everybody's structured

36:15

differently. Similar terms, even among

36:24

the leading experts or maybe even

36:24

especially amongst the leading

36:29

experts are used somewhat

36:29

differently and mean somewhat

36:33

different things depending

36:33

on who entered the data.

36:38

How you bring all that together

36:38

in the end means that actually

36:42

getting sufficient patient

36:42

numbers can be tough.

36:49

I think that, I dunno, I feel like we've covered

36:53

a fair amount of this ground, but

36:58

we do think a lot about

37:02

are there big data

37:02

records that we can go to?

37:06

So we have another study,

37:09

in another therapeutic area

37:09

where we actually are using driving

37:14

records as an indicator of safety events.

37:22

I've heard of research

37:22

around gun ownership and

37:26

Alzheimer's. As someone said, it now

37:32

doesn't take much to get a gun. A lot of people are going

37:33

to have Alzheimer's.

37:37

When someone mentioned this

37:42

research to me, I was like, oh my

37:42

God, I'd never thought about that.

37:45

That's really, really scary. So we think about things like that.

37:53

Someone has mentioned prisons, we've been exploring work with prisons.

37:59

It is very hard for a

37:59

whole range of reasons,

38:05

but we do try and push

38:05

ourselves to think about

38:09

how big can you get with

38:09

some degree of reliability

38:14

depending on the research

38:14

purpose. Maybe I'll pause there.

38:20

Well, it's interesting. I want to maybe pivot the conversation

38:20

just a little bit as we are getting

38:25

towards the tail of our conversation here.

38:28

The first thing I guess I want

38:28

to ask a little bit about is

38:34

you mentioned how many people do

38:34

I need to have a successful study?

38:38

What is that number? And I would ask because of the varied

38:40

nature of Alzheimer's disease and

38:44

dementia, which is that it

38:44

involves a person who is not

38:48

cognitively at normal function

38:48

and sometimes is depending on

38:53

other people to help guide

38:53

them to appointments or create

38:58

continuity in their day-to-day life

38:58

and structure and medical records.

39:02

Is that part of the reason that you all

39:02

think that we have difficulty getting

39:07

access to this, because of the

39:07

nature of the disease itself and the

39:12

necessity for that, I'll say, caregiver role

39:13

in its acquisition and

39:19

regular study? So I would say that a

39:21

lot of the challenges in

39:26

accessing data for

39:26

Alzheimer's are in many,

39:31

many therapeutic areas. It's just,

39:35

it's the reason that even real-world

39:35

research, which is supposed to be

39:40

more efficient and cheaper than

39:40

clinical trials, isn't always.

39:47

But I think for all the other

39:47

societal reasons that we've been

39:51

talking about, Alzheimer's

39:51

presents a special challenge.

39:56

Alzheimer's, psychiatry,

40:02

dare we even say the word pain.

40:06

So we begin to ask ourselves,

40:10

is it that

40:10

the science

40:15

is nascent because there's a lack of data

40:19

or is there a lack of data because

40:19

the science is really nascent?

40:23

Because when you, I mean, look, I know there's so much research

40:25

going on in Alzheimer's, I get that.

40:29

But when you look at

40:33

the way research has evolved relative

40:33

to how research has evolved in

40:37

oncology, inflammation, rheumatology,

40:40

those sorts of things,

40:46

the science isn't as mature

40:46

as those disease areas.

40:52

We simply don't know as much

40:52

about the evolution of the

40:56

disease, the methods of research,

40:59

the methods of clinical practice

40:59

as we do about these other areas.

41:04

And I do think that the lack of

41:04

data, it becomes an iterative thing.

41:10

Well, I think that's actually really

41:10

interesting and where we want to end our

41:13

conversation, which is

41:13

talking about insights,

41:17

because the hope is

41:17

that through using data

41:22

more constructively, that we are able to glean some

41:23

of these insights that help us

41:28

create better care pathways that empower

41:28

caregivers to have more meaningful

41:32

interactions. I'll just say

41:32

all the good things that we

41:37

hope for here in healthcare. So let's talk a little bit about

41:39

that interpretation part of it.

41:44

We're talking about all these

41:44

things, these massive data sets,

41:48

these social determinants of health, lots of points for data

41:50

to enter into the picture.

41:55

How do we make that meaningful? And

41:55

going back to this patient experience,

42:00

what does that maybe look

42:00

like for this disease state?

42:05

And I know this is asking us

42:05

to be a little bit of that half

42:09

full kind of perspective, but I think

42:09

that's a good place to sometimes go.

42:15

Maybe I'll start with you, Mary.

42:15

Thinking about a lot of people,

42:18

where do you think people who are going

42:18

through some of this would want to see

42:24

this data that they're

42:24

participating in their data?

42:27

How would they make it meaningful or what

42:27

is maybe some of the hope around that?

42:32

I don't know if this is the

42:32

exact answer to that question,

42:35

but what I see entrepreneurs

42:35

doing is focusing on

42:40

the care managers. And the problem with being a caregiver,

42:46

it's on average a 49-year-old

42:46

woman who's already got a full-time

42:51

job and a family. She's

42:51

very club-sandwiched.

42:54

And so to add anything else

42:54

into that role is hard.

42:58

And a lot of the caregiving

42:58

is done by the family,

43:01

but recently I've been seeing

43:01

a kind of new category.

43:05

And companies like the Key, I think do a really good

43:07

job of finding great

43:12

caregivers and educating

43:12

them about dementia.

43:15

So I think you have to look

43:15

at the corporate role of

43:20

who is playing a role in the

43:20

caregiving economy. In fact,

43:26

we're having a conference

43:26

on that in December.

43:30

And I also think AI can play a role. So

43:35

I think AI can play a

43:35

role in the training.

43:38

So I think that we have

43:38

to look at the people who

43:42

are touching the patient,

43:42

not the patient themselves,

43:47

to build data sets around

43:47

the caregiver economy

43:51

because that's the one that's going

43:51

to be on the front lines pointing to

43:56

this person has a problem or this

43:56

person doesn't have a problem.

44:00

There still will be a need to

44:00

have research with the end user.

44:04

And groups like CABHI

44:04

and the group in the UK,

44:08

they're beginning to create panels

44:08

where you can aggregate and find some of

44:13

those data samples. Thanks, Mary. I think that's

44:17

food for thought. Ryan,

44:21

when you think about the

44:21

synthesis, so what of it all,

44:25

what are things that you're thinking about? Sorry.

44:33

Yeah, I mean, we have all this data.

44:33

All data is wonderful,

44:35

but if we can't distill it to have

44:35

a meaningful insight that can change

44:40

either a care pathway or

44:40

inform how people should

44:45

be approaching their disease management,

44:49

it's just data, just numbers. So how do we do that?

44:52

How do we look for those insights?

44:55

Gotcha. I mean, so first off I mentioned transparent

44:56

or interpretable machine learning.

45:01

That's one element that I always sort

45:01

of fall back to as being important.

45:05

And like I said, there's a lot of hype around

45:07

AI technologies that are

45:12

opaque. And so I'd like to see more research

45:14

and researchers developing and using

45:19

methods that are directly interpretable.

45:19

There's a small subset of us out there,

45:22

but we're largely overwhelmed. So

45:22

that's one thing to pay attention to.
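A minimal sketch of a directly interpretable model in this spirit, using a shallow decision tree fit to hypothetical features, whose rules and feature importances can be printed and read:

```python
# Illustrative sketch: a shallow decision tree is interpretable by inspection.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 500
age = rng.integers(60, 95, n)
mmse = rng.integers(10, 30, n)
X = np.column_stack([age, mmse])
y = (mmse < 22).astype(int)  # synthetic "impairment" label, for demonstration only

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

# The fitted rules can be read directly, unlike an opaque deep model.
print(export_text(tree, feature_names=["age", "mmse"]))
print(dict(zip(["age", "mmse"], tree.feature_importances_)))
```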

45:27

And focusing on understanding what

45:32

are the variables that are important in

45:32

our data sets and what else we should be

45:37

collecting. So when we're analyzing data,

45:40

I feel like one question we should always

45:40

have in the back of our heads is if I

45:43

was to do this study again in the future,

45:43

what would I want to collect instead?

45:48

What would be better variables or

45:48

a better way to collect the data?

45:53

Another thing that's important

45:53

on the topic of data size and

45:58

fairness, most people when

45:58

they think of data collection,

46:02

they think more is better. And

46:02

that is in most cases, true.

46:07

We need more data to have more

46:07

power, to have more confidence.

46:10

But with larger data sets, typically they're going

46:12

to end up being messier.

46:16

They're going to be more heterogeneous,

46:16

which is both good and bad.

46:20

It's good because we're representing

46:20

a greater diversity of people,

46:23

hopefully if we're doing a good

46:23

job gathering a broader dataset.

46:29

But the downside is that a lot of

46:29

methodologies are not really set up

46:34

when analyzing data to

46:34

consider this heterogeneity.

46:37

And what I mean by heterogeneity, I

46:37

don't just mean different backgrounds,

46:40

but I mean heterogeneous

46:40

associations where if we're trying to

46:45

predict an outcome, the factors that contribute to the

46:46

occurrence of that outcome can be very

46:50

different for different groups of people.

46:50

So this group of people over here,

46:54

they get the disease due to these genes.

46:56

And over here it's this combination

46:56

of environment and maybe some other gene,

47:01

and beyond. And a lot of

47:01

methodologies that we have,

47:05

they're just trying to put together the

47:05

one best holistic model that's going to

47:09

make a decision for everyone. And that's a problem in itself

47:11

from a methodological perspective.

47:15

This is an area that we're

47:15

interested in, we study, and again,

47:18

I'd like to see more people just take

47:18

this into consideration and think about

47:23

what methodologies could we develop or

47:23

could we improve to tackle those kinds of

47:28

problems. Might be a little

47:28

bit in the weeds, but.
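A minimal sketch of one way to probe heterogeneous associations like these, using synthetic data in which the outcome is gene-driven in one group and environment-driven in the other; fitting the same model separately per subgroup makes the different drivers visible:

```python
# Illustrative sketch: the same outcome can be driven by different factors in different subgroups.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "gene_risk": rng.normal(size=n),
    "env_exposure": rng.normal(size=n),
    "group": rng.choice(["A", "B"], n),
})
# Synthetic ground truth: group A's outcome is gene-driven, group B's is environment-driven.
signal = np.where(df["group"] == "A", df["gene_risk"], df["env_exposure"])
df["outcome"] = (signal + rng.normal(0, 0.5, n) > 0).astype(int)

# A per-subgroup fit exposes the different drivers that a single pooled model can blur.
for g, sub in df.groupby("group"):
    coefs = LogisticRegression().fit(sub[["gene_risk", "env_exposure"]], sub["outcome"]).coef_[0]
    print(g, dict(zip(["gene_risk", "env_exposure"], coefs.round(2))))
```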

47:32

Oh, you know what? This is all about creating

47:34

these data insights.

47:37

So I don't think it's in the weeds at all. And hopefully we have people sitting on

47:39

the line that are thinking very much in

47:43

the same way that you are about

47:43

this. And maybe Elizabeth,

47:47

I'll give you kind of the

47:47

last call on this one,

47:50

which is what are those insights? What's

47:50

the things that we're hoping to glean?

47:55

What are some of the things that you'd

47:55

like to glean as you're thinking about

47:59

this? So when I look at what

48:03

the pharma

48:03

pipeline for treatment,

48:08

it seems to me that there is a shift

48:13

to earlier and earlier

48:13

treatment. This is harder.

48:18

This is hard to do because people don't,

48:21

diagnosis doesn't always happen early.

48:24

And someone, clearly,

48:29

I've been monitoring

48:29

the chat all along here,

48:32

but someone mentioned something

48:32

about could we do something

48:36

like what was done during

48:36

Covid, where many people,

48:41

unfortunately I think not enough, but many people shared

48:43

their data in some way.

48:47

And I think that being

48:47

able to get access to

48:52

data in a way that allows us to understand

48:57

what early diagnosis really looks

48:57

like and begin to develop some more

49:02

definitive early diagnosis,

49:05

there's a lot to be overcome in that.

49:08

We've talked about a lot of

49:08

that here just in the last hour.

49:11

But I would really like

49:11

to begin to see more rigor

49:16

around early diagnosis, more concrete understanding of

49:17

what early disease looks like

49:25

and what both patients

49:25

and providers and payers,

49:30

systemically, what does

49:30

that need to look like?

49:33

Because that is not where our

49:33

system is. Early diagnosis is

49:38

not what our system is geared toward, I think.

49:43

It's going to be essential

49:43

for good treatment.

49:46

Mary and Ryan, nodding

49:46

aggressively. Go ahead.

49:50

I just want to say I really

49:50

agree with you about the

49:55

identification of a

49:55

patient population that can

49:59

participate in that early diagnosis,

50:01

and I think that's where you're going

50:01

to get the motivation of the end

50:06

user to participate. Absolutely.

50:12

Yeah. I think this is a good example for an

50:12

opportunity for maybe citizen science.

50:17

How do you get people engaged? How do you give them incentives

50:19

to be engaged in either

50:23

providing data or helping us

50:23

to understand early detection

50:28

and at the same time link that

50:28

to a direct benefit to them?

50:35

Well. Wild idea, which is

50:39

why not make it fun? So Ageless Innovation has partnered with

50:52

AARP to stimulate new people to play games,

50:52

and they launched new games like

50:58

Monopoly and Trivial Pursuit

50:58

with many generations.

51:01

But one of the things older people

51:01

talk about is using gaming to

51:07

keep their brains active,

51:07

like the crossword puzzle.

51:10

So if you kind of begin

51:10

with what they're doing,

51:12

150 million people watch Wheel of

51:12

Fortune. Talk about an audience and a

51:17

population. And I was at a memorial

51:18

for a woman who was 102,

51:22

and the secret was Wheel of

51:22

Fortune, keeping that brain active.

51:27

So I kind of think we need

51:27

a big initiative to get

51:32

people to get motivated and to

51:32

make brain health as important

51:37

as physical health and cardiac health. Well, I love this. I think we've heard

51:41

everything from citizen scientists to

51:47

extended multi-generational

51:47

gameplay from Mary, to Elizabeth's

51:52

concerns about keeping the caregiver

51:52

involved and staying patient-centric.

51:57

I think ultimately what we've heard

51:57

is there's lots of opportunity here.

52:02

So I'm actually going

52:02

to say at this point,

52:05

thank you to all of my

52:05

guests today for your

52:10

really poignant insights and for your

52:14

experience in your respective fields. I

52:14

think there's a lot to be done here.

52:19

So with that, thank you for joining us

52:19

today. We're thrilled to have had you,

52:24

and we hope that you'll visit us for

52:24

more information. Have a great day.
