Episode Transcript
0:03
But when you
0:04
start looking at a lot of the
0:06
technologies and trends and how they're being utilized
0:08
today, though I suppose there is that
0:10
existential risk out
0:13
there in the longer horizon from
0:15
superintelligence or AI, the near-term
0:18
problem is in fact us, right? How
0:22
do we manage forces at scale when warfare
0:24
is being conducted at machine speed? Can
0:27
you do air-tasking orders? Can you coordinate
0:31
frequencies and communications at the
0:34
scale and speed when you have hypersonic
0:37
weapons, energy weapons, cyber moving
0:39
at machine speed? Hey,
0:43
welcome back to the Modern War Institute podcast.
0:46
I'm Jon Amble, editorial director at MWI,
0:48
and I'm joined on this episode by August Cole.
0:51
He is the co-author, along with Peter Singer, of a new
0:53
book that has just been released called Burn In. It's
0:56
a novel about technology, and really about
0:58
humans' relationship with technology. But
1:01
although it's a work of fiction, it is really well
1:03
researched, and the pretty remarkable technologies
1:05
it includes are actually things that, as you can
1:07
see in the hundreds of footnotes included in the book, already
1:10
exist in some form or are starting to emerge,
1:13
which gives it a sort of hybrid fiction-nonfiction
1:16
feel. In the conversation we talk
1:18
about the book and its plot, but August also
1:20
discusses the way he conceptualizes fiction
1:23
as a tool with which to think about and even
1:25
better understand the world and the future,
1:28
including the future of war. I think
1:30
it's a great discussion and I hope you enjoy it. Before
1:32
we get to it though, a couple quick notes. First,
1:35
if you aren't yet following MWI on social media,
1:37
find us on Twitter, Facebook, and LinkedIn. It
1:40
is a great way for us to stay in contact with
1:42
the incredible community of listeners and readers
1:44
who share our interests in topics related
1:46
to modern war. And lastly, as always,
1:49
what you hear in this episode are the views of the participants
1:51
and don't represent those of West Point, the Army,
1:53
or any other agency of the US government. Alright,
1:56
here's my conversation with August Cole.
2:05
August, thank you so much for joining us. Jon,
2:08
it's great to be with you. So
2:11
you have a new book out, a new
2:13
novel co-authored with Peter Singer.
2:17
And that's really kind of what I wanted to have you on to talk about
2:20
because I had an opportunity to
2:22
read it and found it fascinating and
2:26
sort of illuminating and really a clever
2:28
way to think through some things that a lot of people in the
2:30
defense community, the defense space and the military
2:32
are talking about, but
2:35
sometimes in kind of an abstract sense. And
2:37
the book as a novel sort of makes it real
2:39
in certain ways. So I really
2:41
kind of want to talk to you a little bit about the process
2:44
and your objectives when you sat down to
2:46
write it, if you don't mind. So
2:48
the book is called Burn In. I wonder if
2:50
you can give kind of, you know, without giving away any spoilers,
2:52
what's the sort of elevator pitch that the
2:55
idea really kind of developed around
2:58
when you first, you and Peter first started talking about this?
3:01
With Burn In, what we were trying
3:03
to do was blend fiction
3:05
and nonfiction for this new kind
3:08
of book
3:09
that leads you into this world
3:11
in which an FBI agent is hunting
3:13
a terrorist through Washington
3:15
DC, you know, a decade or two
3:18
from now.
3:19
But unlike your conventional counterterrorism
3:21
story,
3:22
this FBI agent is paired with a robot partner
3:24
which has been forced upon her as
3:27
a kind of policy
3:29
move. And it comes in the midst
3:31
of this moment in which America
3:34
is really struggling to
3:36
manage a society that has been
3:38
fundamentally transformed at every level from the household
3:41
to commercially
3:44
by automation and AI. So
3:48
when you say that, you know, kind of blend
3:50
fiction and nonfiction, I mean,
3:53
it is very much a novel, right? I
3:55
mean, that's the subtitle is a novel
3:57
of the real robotic revolution. Can you explain a little
3:59
bit more what
3:59
you mean by blending fiction and nonfiction?
4:03
When Pete Singer and I wrote Ghost Fleet,
4:05
we felt that if we were going to
4:08
really stretch the elastic band of credulity
4:11
and posit a Chinese sneak attack on
4:13
Hawaii, setting off a third world
4:15
war, we had to anchor
4:17
the book in the world,
4:20
not just as it is, but as it will be, not as we
4:22
want it to be. So all the technology, all
4:25
the trends in Ghost Fleet were real. And I think this
4:27
helped underscore the credibility of the story
4:30
that otherwise could have seemed fantastical. So
4:32
with Burn-In, we took a similar approach where
4:35
we looked at hundreds of different technologies
4:38
and trends and fused
4:40
them together into a techno thriller. And make
4:42
no mistake, Burn-In is fundamentally a thriller.
4:45
But we're allowing people to see what
4:48
is ahead and what is going to happen in
4:50
our politics, our economy, with security, even
4:53
how they're going to be parenting in this
4:55
world in which some of the basic assumptions
4:58
that we have today about what role will
5:00
work play in our lives in the future, what
5:02
about privacy in a total data
5:04
society?
5:05
How will we think about the kinds
5:08
of questions that have to do
5:10
today with political leadership
5:12
and partisan politics in 10 to 20
5:14
years? Is that system going to be equipped
5:17
to deal with really existential questions about capitalism?
5:20
And so what we're doing is we're building
5:22
this world that we hope is compelling, using
5:24
the traditional approaches of plots and characters.
5:27
But we're also very mindful of
5:30
the opportunity to kind of help close this
5:32
gap, as you mentioned, in understanding about
5:34
technologies like machine learning and AI.
5:36
So the hope is you can read a book that'll
5:38
keep you up all night because it's so engrossing,
5:41
it's either scaring you or it's exciting. And
5:43
then the next day you can go to work and have a
5:46
better understanding about how AI
5:48
and robotics are going to transform our world.
5:51
So you said it's a techno thriller and technology
5:54
is clearly a centerpiece
5:57
of the story, especially
5:59
sort of mankind's relationship with technology.
6:02
But there's also this, it's set against
6:04
this backdrop of a political
6:06
crisis, a social crisis really in this country.
6:09
What was the reasoning for doing that? Because presumably
6:11
you could explore some of these without that
6:14
backdrop,
6:16
and yet you chose deliberately to have that in there. Why
6:19
would you do that? When we think
6:21
of the human
6:23
relationship with robots, especially
6:26
in how, for
6:27
example, it's going to shape our future, we often
6:29
get locked into a Terminator type narrative
6:32
in which the
6:36
robot becomes self-aware and AI
6:38
conspires to wipe out humanity. But
6:40
when you start looking at a lot of the technologies
6:43
and trends and how they're being utilized today,
6:46
though I suppose there is that existential
6:49
risk out there in the longer horizon
6:52
from superintelligence or AI,
6:55
the near-term problem is in fact us. We're
6:58
on the cusp of something that is more like
7:01
an industrial revolution, but has these
7:03
really transformational aspects throughout
7:06
society and is happening
7:08
to us in a sense
7:10
at a time when we seem incredibly ill-equipped
7:13
to handle really the most fundamental aspects
7:15
of what we would normally expect from
7:17
a functional and prosperous democracy.
7:20
The COVID crisis and the tragedy
7:23
that it's becoming
7:24
I think is revealing that we still have
7:26
some massive holes in our
7:29
social safety net that are
7:31
going to create questions around
7:34
how resilient can we be in the face of not
7:36
just a pandemic, but other kind of systemic
7:38
shocks.
7:40
So in really trying to encompass
7:43
the whole of society picture,
7:46
thinking about
7:47
the way that technology is going to influence
7:49
politics, for example,
7:51
seems to be a really important element that's missing
7:53
for a lot of the conversation today. And
7:55
even if we can identify discrete ways that say
7:57
algorithms and social media are shaping, you know,
8:00
actions in the real world or perceptions
8:02
and emotions online,
8:04
there's yet another
8:06
layer to this that when you start to peel
8:08
back
8:09
or really dig into, it's
8:12
in fact more troubling. In building the world
8:14
of Burn-In, our hope was to
8:17
really put people in the middle of that so they could
8:19
be thinking about when I
8:21
wake up in that world, what would I be eating?
8:24
What would I be smelling? Really engaging that sensory
8:26
aspect that I think really good fiction can do.
8:29
But at the same time, prompting folks to
8:32
wrestle with a lot of these fundamental questions that
8:34
they may not have had the time or the attention
8:36
to otherwise do. When you're
8:38
giving someone a novel, I think they're even better equipped
8:41
to do that because they've actually experienced a lot
8:43
of this from a character's point of view rather
8:45
than just reading it analytically like a white paper
8:48
or something like that.
8:50
It strikes me that the
8:54
story is about technology, I mentioned
8:56
this before, but it's really about our
8:58
relationship with technology. You
9:01
also seem to have made the deliberate choice of
9:03
making the protagonist, Agent Keegan,
9:05
this FBI agent, somebody who seems quite
9:08
skeptical of technology at various
9:10
times.
9:12
Why make that decision?
9:15
One of the aspects of Agent Keegan's relationship
9:19
to technology that
9:20
I think is really important in understanding
9:23
her is the role
9:25
that she had in the Marine Corps when she was deployed
9:28
as a robot wrangler.
9:29
She has a very utilitarian approach to technology.
9:32
The
9:33
ways that we ascribe
9:35
emotional value
9:37
to inanimate objects, anyone who's ever
9:39
owned a classic car, you've seen this, is
9:42
I think going to become even more pronounced when we start
9:44
integrating the personality
9:47
traits that can go with
9:49
everything from just simply software
9:51
like Alexa or Siri, to
9:54
objects that are mobile or humanoid,
10:00
using kind of similar operating systems where, you know,
10:02
it's typically like a voice interaction. What
10:05
we've seen already is even in the ways
10:07
that we've used robots in wartime in
10:09
the last 20 years that
10:11
there have been pack bots that
10:13
have been given, you know, essentially battlefield burials
10:16
that the emotional attachment that can
10:18
become real
10:19
to a machine is, of course, something that
10:21
is a very real and human experience. But we
10:23
wanted to have a character in
10:25
that protagonist role who
10:28
would be somewhat skeptical and would not
10:30
want to
10:31
have the sort of ultimately
10:34
aspirational kind of narrative
10:36
around technology that many of the people who, I
10:39
think, shape today what we expect
10:41
from the
10:42
different sorts of systems out there, whether they're robotic
10:44
or AI. So, you know, this
10:46
kind of utopian vision of the future is certainly not the
10:48
one that we are landing in in Burn-In,
10:51
and finding someone who could stand within
10:53
that world and look around and kind of look
10:55
at it analytically and skeptically.
10:58
And having that skepticism be born
11:00
out of experience and not just cynicism,
11:03
I think is a really important, really important aspect
11:05
for
11:06
having a guide
11:07
to understand when you're standing on
11:09
the edge of this, you know, transformational revolution
11:11
and trying to figure out just what is going on.
11:14
So, it's interesting that
11:16
you mentioned sort of pack bots, because
11:19
there's a line in the book where
11:23
Agent Keegan is talking about her time in the
11:25
Marines and how the
11:27
Marines interacted with robots
11:29
that were quadrupeds and saying that they grew
11:31
more attached to them. And maybe it's something
11:33
about, you know, they're more similar
11:35
to our pets. There's a sense of dependency
11:38
that quadrupeds, whether, you know, it's horses
11:40
or dogs or cats or what have
11:42
you, the sort of relationship that
11:44
we have with them creates
11:47
this sort of sense of, you said
11:49
the Marines grew more
11:51
attached to them. There's kind of an intimate relationship
11:53
that builds up on an individual level.
11:57
Is that, you know,
11:59
in your opinion,
11:59
then is that sort of relationship with technology
12:02
sort of a good thing, a positive in terms of the
12:05
way that we can leverage its
12:07
advantages or is that sort of
12:09
more utilitarian mindset that Agent
12:11
Keegan at least begins with maybe
12:14
more useful?
12:15
I think it is okay to love a machine.
12:18
I mean, maybe not in the truly like romantic sense,
12:20
but we can have a strong
12:23
sort of connection with technology,
12:25
especially when it's manifest like the bots
12:28
that we talk about.
12:29
But what I feel like the most important
12:32
aspect is really understanding the role that those machines
12:34
or whether it's software, you know, play in
12:36
our human relationships. And so that
12:39
I think can become really difficult to
12:42
understand and ascertain, especially in today's
12:44
world when if you think about how we're relating
12:47
to, for example, our mobile phones,
12:50
to the social media applications and platforms,
12:53
and we're being shaped in our relationships
12:55
algorithmically, right? You know, whether it's curating
12:58
dopamine
12:59
hits, or whether it is, you know,
13:01
spooling up outrage and joy,
13:03
you know, in different intervals, whether
13:05
it's A/B testing and figuring out what works at
13:08
scale with different concepts. So
13:10
that's all in the virtual and kind of software domain.
13:12
Imagine when we start applying or seeing people apply
13:14
those kinds of capabilities and technologies to
13:16
things in the physical world.
13:18
That is a fundamentally different level of human
13:20
experience that I think we haven't really come to grips
13:22
with. And in the context of the future
13:24
of conflict,
13:26
you know, one of the really interesting aspects about
13:28
the understanding of what
13:31
the role of the robot will be in the future of
13:33
war
13:34
is, of course, you know, still open
13:36
for debate and for contest. In
13:39
Burn-In, you know, we really, I think, tried to
13:41
shed a little bit of light on that and talk about some
13:43
of the trend lines that we see in that, which is that,
13:45
you know, battle bots will be small.
13:48
We won't see these kind of hulking, titanic,
13:50
large mechs in
13:52
the near term here that
13:55
will be most useful, but rather insect-like,
13:58
you know, as you said quite a bit,
13:59
quadrupeds, dog-like sized machines
14:02
are probably more practical, less
14:04
vulnerable, and
14:05
more scalable, which I think is another aspect
14:08
of robotics. And a very different
14:10
kind of paradigm, if you will, than the way we acquire
14:12
and procure
14:14
complex technical systems today.
14:16
So the way that we unpack
14:19
that too, and the way that Keegan reflects on
14:21
her experience as an FBI counter-terrorism agent,
14:23
how that's informed by her military service, and
14:26
the relationship she has with TAMS, this robot,
14:28
is absolutely part of
14:30
her service as a Marine, as
14:33
being
14:33
the framework through which she understands
14:36
what's happening around her and what's going to be happening
14:39
ahead.
14:41
So there's a good example.
14:45
There
14:47
are many of these micro robots that
14:49
are doing small tasks that are enabling humans to
14:56
do things that humans have always done,
14:58
but do them a little bit better, a little bit faster.
15:02
We've published at MWI, we've had a number
15:04
of articles about the role of AI
15:07
on the future battlefield. And the consensus
15:09
is that it's not going to be
15:11
our robots fighting their robots
15:14
and taking
15:15
over the
15:17
core functions of war fighting, rather
15:20
enabling better human
15:23
war fighting. Is that the sense that you get
15:25
in from, I
15:26
guess, the research? There are 28 pages of
15:29
footnotes in this, so there's clearly a lot of research
15:31
done. Is that your sense?
15:33
Oh, that's a great question. The
15:35
way that Keegan and this investigation
15:37
that she's undertaking,
15:39
and
15:40
the way that she learns
15:43
to
15:44
change the relationship that she has
15:46
with this robot TAMS from the beginning,
15:48
and how the software itself and
15:50
the robot evolves throughout, you're
15:53
really, I think, creating
15:55
that
15:56
storyline in the book, trying to figure out, are
15:59
you just writing
15:59
like a human to human dialogue,
16:02
a human to human kind of emotional
16:05
relationship, or are you really actually understanding
16:07
a human-machine relationship. And
16:10
I think fundamental, just like with the way we relate to one another,
16:12
is that arc of change. Our relationships
16:14
do have beginnings, middles,
16:16
and ends. And I think to some
16:19
extent, the question about the role
16:21
of machines in the future of war
16:23
is going to reflect that both individually, but also collectively.
16:26
The way that machines are
16:29
adopted and implemented in civilian
16:32
applications, whether it's kids
16:34
with ever more sophisticated gadgets and toys,
16:37
whether
16:37
it's life-saving octocopters
16:39
that are being used by EMS services, that's
16:42
going to fundamentally change our expectations too of what happens
16:44
during conflict with bots.
16:46
And so I think that almost
16:49
holistic view, I think is really, really important
16:51
in making sure that
16:53
we are
16:54
using the same kind of analytical
16:56
mindset that we would
16:58
with other aspects of the way civilian
17:01
societies produce fighting forces, that
17:03
we do that with robotics too. And
17:06
so as we kind of researched this and looked at
17:08
how people relate emotionally, but also the boundaries
17:10
in terms of what
17:11
is possible technologically speaking,
17:13
because everything in Burn-In is real or in development,
17:16
just like with Ghost Fleet. And the hope
17:18
is that by anchoring in that reality,
17:21
when we portray these often
17:24
fantastical things happening,
17:26
people will know in fact that that is very
17:28
much in our near future. And from
17:30
that, then be able to kind of think through some of this for themselves
17:32
and really unpack what
17:34
they think about that really important question
17:36
you identified.
17:38
I think that decision to root it in reality,
17:41
but put it far enough into the future that it is substantially
17:43
different or recognizably different than kind
17:46
of our current experiences is
17:48
a really unique one. One
17:50
of the advantages to
17:52
say science fiction especially is that sort of
17:55
breaks us free of the constraints that
17:57
are placed on us by the real world and our perceptions
17:59
of the
17:59
world and, you know, Starship Troopers,
18:02
Ender's Game, these things that kind of let
18:04
us like burst through those boundaries
18:08
and explore some things on kind of a different
18:10
level. On the flip side, you've got,
18:13
you know, say especially fiction that's written
18:15
in the here and the now
18:19
that is really kind of
18:21
forced to stay within those boundaries. You
18:24
kind of are able to push through a little bit but still
18:26
make it something that we can connect with. And to demonstrate
18:28
that, again, footnotes showing, hey,
18:30
these programs already exist or they're being explored.
18:33
Was that a
18:35
deliberate choice in your experience?
18:37
Is that something that it is as unique
18:40
in the world of fiction as it seems?
18:43
Putting endnotes in a techno-thriller
18:46
still seems to be a pretty novel concept. And
18:49
when we considered the
18:51
role that they played in making Ghost Fleet a believable
18:54
story,
18:55
we were from the beginning
18:57
certain that we wanted to do the same thing with
18:59
Burn-In. You know, there's almost
19:02
a creative rationale, right? And then there's also a, you know,
19:04
kind of an analytical, you
19:07
know, rationale behind them that I
19:09
like to kind of identify. You know, on the
19:11
analytical, I'm
19:12
able to find information
19:15
and
19:15
be fully transparent with the reader about
19:18
what role it plays in the
19:21
genesis of the story. You
19:22
know, a good example would be, you know, the job
19:25
that Keegan's husband has is
19:28
directly tied to some
19:30
of the studies that have been done about the role
19:33
of white-collar professions
19:35
in the AI era and how many of the
19:37
jobs that
19:38
people who have gone to traditionally excellent
19:40
schools and worked hard at, you
19:43
know, may be, you know, algorithmed out of existence.
19:45
So
19:46
trying to understand, if you're
19:48
reading that, why we made a choice to
19:51
make somebody, you know, an ex-lawyer or not,
19:53
I think is really important because those are the sorts
19:55
of ways we can connect with that factual
19:58
information.
19:59
On the creative side, the use of endnotes,
20:02
again, even though it is unusual, is
20:04
a great tool because when you're really
20:07
pushing the boundaries of expectations
20:10
of what is possible or not, when someone
20:13
encounters something that they feel
20:15
like shouldn't even pass the giggle test, but
20:18
then they realize it's got an end note
20:20
right there pointing to it, it fundamentally
20:22
allows you to connect with
20:23
that experience
20:26
intellectually where you're just like, wow,
20:28
I can't believe that's real. And you can continue
20:30
to read the story
20:32
and look that up, whether you're looking
20:34
at an ebook, for example, you can do it right away, or you can flip to
20:36
the back. The thing you don't
20:38
want to do, obviously, when you're weighing
20:41
what to use an end note for or when
20:43
not to, is
20:44
you don't want to interrupt the flow of a really
20:46
great
20:47
passage. But we were very
20:49
aggressive in using end notes because so many of the technologies,
20:52
so many of the concepts too, around AI, around
20:54
machine learning or other technological things
20:57
about cybersecurity or infrastructure or security.
20:59
We felt like we didn't want to lose that moment, that
21:01
teachable moment in the book. And so the end notes
21:04
really help us connect with
21:06
the reader in that way.
21:09
I'm glad you brought up the example of Agent
21:12
Keegan's husband because I found that really, really fascinating
21:15
on a couple of levels. Number one, what
21:17
he ends up doing when
21:20
he can no longer do
21:22
his white collar job as a lawyer is filling
21:25
a gap
21:26
that
21:28
at this point still, AI cannot
21:30
do. There's a human emotional component
21:33
to relationships that AI cannot provide. And
21:36
I think it's a really interesting window into that. But more
21:38
broadly, you sort
21:40
of paint a picture of the legal field from
21:44
what we experience now, what we expect
21:46
now, which is all about billable hours, maybe
21:49
in 15 minute increments or eight minute increments, depending
21:51
on the firm, down to
21:53
lawyers being increasingly displaced
21:56
by machines and really fighting
21:58
over, I think you were even saying, their billable seconds,
22:02
which is really, really fascinating because there's
22:04
a corollary in the military. When
22:07
we talk about AI and robots on the battlefield,
22:09
we're talking about operational battlefield
22:12
machines that can do things kind
22:14
of on the front lines at the tip
22:16
of the spear. And yet we haven't
22:18
really explored some of those sort of
22:22
rear echelon
22:24
operational command and leadership
22:26
and strategic decision making components
22:29
that could equally be displaced. I
22:31
think that's a really fascinating window into
22:33
that. Was that sort of deliberate?
22:36
Choosing a character to understand
22:39
a macro theme is like a great
22:41
way to embody a concept in a person.
22:44
And it was very deliberate in our
22:47
creation of Keegan as a very complete
22:49
character herself, who's a
22:51
Marine turned FBI agent. She's
22:54
a parent. She's kind of watching her marriage
22:56
fall apart and watching her husband's place
22:58
in society slip every
23:01
month farther and farther from where
23:03
they both thought it was going to be.
23:07
The way that
23:08
we look out into the landscape
23:11
of work in the future
23:13
and think that automation and
23:15
job replacement happens to other
23:17
people, I think is one of the biggest blind
23:19
spots that we often have
23:21
in considering what lies ahead.
23:24
And I think that's also true in the conversation
23:26
about defense and security as well
23:28
in terms of where can technologies
23:30
that are scalable
23:33
and fundamentally game changing in terms of how
23:35
we reallocate intellectual,
23:37
political and physical effort, the
23:39
tip of the spear kind
23:41
of concept
23:42
as being the part of
23:44
the AI conversation or robotics conversation
23:47
that gets most of the airtime
23:48
probably doesn't reflect the reality of implementation
23:50
in the next five or 10 years, that the
23:53
easier on-ramps for using
23:56
these kinds of systems in logistics, in
23:59
personnel management... in intelligence
24:01
collection and analysis seems to be
24:03
a far richer, less ethically fraught,
24:06
potentially, although you could argue
24:08
there's just different issues that are being raised, that the same
24:10
pitfalls are there. And so what
24:12
we're trying to do when we, again, we have a character
24:15
that represents this trend line, is
24:18
get people to connect with that concept and idea and start
24:20
thinking it through more broadly
24:22
just beyond that one person's
24:24
future existence. And
24:27
I think the same way you look
24:29
at Keegan's husband, you could have the same
24:31
fictional explorations and that kind of FicInt model
24:34
or useful fiction model of exploring the logistics
24:38
operations or, especially in a great
24:40
power conflict context, how do we
24:43
manage forces at scale when warfare
24:45
is being conducted at machine speed?
24:47
Can you do air tasking
24:49
orders? Can you coordinate frequencies
24:52
and communications at the scale
24:55
and speed when you have
24:57
hypersonic weapons, energy weapons, cyber
24:59
moving at machine speed? That
25:02
doesn't seem to be a very realistic
25:05
possibility. And so the more time
25:07
we invest in these kinds of questions, I think,
25:09
especially from that human perspective, the
25:12
better chance we have of getting ahead of these problems.
25:15
So I want to shift gears a little bit and ask
25:17
you a question, maybe
25:19
a little bit about inspiration and process.
25:23
We published a review a couple of weeks ago by
25:25
one of our senior fellows, Steve Leonard,
25:27
who was struck by
25:30
the sort of parallels between
25:33
Burn-In and a short
25:35
story called A Boy and His Dog. They made a movie about
25:37
it in the 70s, but the story was by an author
25:40
named Harlan Ellison. Was that something,
25:43
were those parallels something that you were aware of when
25:45
you wrote it? Was it deliberate? And maybe
25:47
were there other sort of inspirations to kind
25:49
of help shape the story in the way that you kind of
25:52
convey some of the lessons you're trying to convey?
25:53
You know, it's funny,
25:55
I've actually seen that movie that starred a very young
25:58
Don Johnson and his
26:00
dog as part of that Cold War sci-fi
26:02
canon. And I did read the book a long
26:04
time ago. No, I mean, it wasn't
26:07
really at the fore, at least
26:10
consciously in thinking about it as a
26:12
parallel. But ultimately,
26:15
the relationship between our
26:19
human and robot
26:21
in Burn-In is something that we
26:23
wanted to stand
26:24
apart and be
26:27
unique. Because one of the interesting facets of writing
26:29
about something like
26:30
a, not a sentient machine per
26:32
se, but a machine that you can have a relationship
26:35
with, is that we're doing so in an era
26:37
where we can wake up and talk to Alexa
26:39
about what our day holds in store for us. Or
26:42
my daughter will be having
26:44
Siri conversations, and it has
26:46
been in fact for years. So the uncanny
26:50
valley you're living in when you're writing about
26:52
a lot of this from a sci-fi perspective is
26:54
really interesting because you can identify all
26:56
these little threads that start pulling all around you.
26:59
And whether it's micro robotics, whether
27:01
it's this kind of, again, the software, human
27:04
relationship. I don't know if you remember
27:07
the film Her by Spike Jonze, it came out
27:09
I think five or six years ago, which I think is one of my
27:11
favorite films that really unpacks
27:13
like that human operating system relationship.
27:16
And if you can, as
27:19
a creator, get people to
27:21
connect with that sort of a storyline,
27:23
because that fundamentally, for example, is a love story. But
27:26
it really redefines and tests us in
27:28
terms of understanding what is love in the algorithmic
27:30
era. That's the kind of aspiration
27:33
that I have when I'm writing, not just Burn-In, but other
27:35
short stories too, is really getting people to
27:38
place themselves into these positions
27:40
where these bigger truths
27:42
are out there and you're trying to understand
27:44
them through a narrative that has a very
27:47
real world, often gritty aspect
27:50
to it.
27:50
So you mentioned a term,
27:53
FicInt. I wonder if we can kind
27:55
of unpack that within the context of Burn-In a little bit. The idea
27:57
is that fiction is a very, very important
28:00
tool with which we can better explore and understand
28:03
the world. And correct me if there's a better way of
28:05
describing it, but that's something
28:07
that I've heard you talk about. Did
28:09
you write this book with that specifically in mind?
28:12
That is exactly how I articulate
28:15
fiction. I think it's something
28:17
that is woven throughout
28:19
the mission that I'm trying to do
28:22
right now, which is using
28:24
fiction in various ways. The
28:27
quick tag is, can you use narrative to
28:29
avoid strategic surprise? The
28:32
differentiation, I think, between a classic
28:35
science fiction
28:36
work and something that has this useful
28:38
fiction or FicInt aspect to it is
28:41
how closely does it tether itself to reality?
28:44
And so,
28:45
Burn-In was very much a product
28:48
of this FicInt mindset.
28:50
The endnotes, as we talked about, the
28:53
technological
28:54
and trendline cornerstones
28:56
that are there in the story to
28:58
anchor not just world building, but
29:01
the actual direction of the plot and what people
29:03
do or don't do. So, it certainly
29:05
is this moment where it's something
29:07
that you can use to write
29:09
short fiction, whether it's crowdsourcing,
29:11
like the Army Mad Scientist program has done,
29:13
like MWI has showcased on its
29:15
own website, to something as big as
29:18
a novel. Because if the objective is to have this
29:20
educational aspect, when I'm
29:23
envisioning a story,
29:25
even one as big as this, I'm often thinking about what
29:27
is the ask of the reader who's going to be consuming
29:31
this? Is there something that I want them to see
29:33
differently in their world or understand better about themselves
29:35
or questions that I think they can start posing?
29:38
And so, my hope is that that
29:40
approach is
29:42
creatively credible so that people are
29:44
actually reading what you write because there's no point writing
29:47
something that no one will finish if it's not any good. So,
29:49
the hope is that you can hold those things in tension
29:52
and create something that can balance both
29:54
the useful, but also the entertaining.
29:57
And I don't think it's a bad objective to have either,
30:00
to produce something that is ultimately entertaining,
30:03
yet just packed with insights too.
30:06
Well, I don't want to give away any
30:09
sort of secrets of the plot because it's
30:11
an
30:12
enjoyable book in and of itself,
30:14
even if people aren't setting out to learn from
30:16
it, but I think they'll find that they do. But
30:19
without giving any of that away, what is it that you're hoping
30:22
readers, specifically the types of readers that say, listen
30:24
to the MWI podcast, members of the military, the defense
30:26
community, people with a keen interest
30:28
in some of these issues, what is it you're hoping
30:31
they will take away from the book?
30:34
One of the takeaways that I think is most important
30:37
for people who read Burn-In is to understand
30:39
that
30:40
we are in the midst of literally
30:42
an historic revolution happening
30:44
all around us that could be as
30:48
profound as the industrial revolution
30:50
itself, if not more. And
30:53
yet, our ability to understand
30:56
what is driving that change, particularly technologies
30:58
like AI, which have this black box aspect
31:00
to them, meaning they're quite mysterious.
31:03
Even their inventors often have difficulty understanding
31:05
why software does certain things. And
31:07
so the objective
31:09
is that if you can
31:11
read a story that is fictional,
31:14
of course, but rooted in this nonfiction
31:17
framework, that you're going to have a much better sense
31:19
of understanding what is in
31:22
store. And that means not just
31:25
listening to the utopian
31:28
perspective from many people in the technology community,
31:30
but really understanding these sorts of tech trends
31:33
as they might play out and considering what
31:35
the consequences are, or the
31:37
risks are, or the threats are.
31:40
And to be able to wrap that into a thriller novel, it's
31:42
like ready made, because those are the ingredients of
31:44
a good story. And the
31:47
way that we're trying to process what
31:49
is happening all around us right now,
31:51
I think it very much speaks to that need
31:54
to be able to consider
31:56
these massive forces that are
32:00
at work right now that
32:02
when we wrote Burn-In,
32:03
you know, over the last three years, we anticipated
32:06
things like
32:08
the rapid virtualization of
32:10
medicine
32:11
or remote work in
32:14
fields as diverse as
32:16
education and sales,
32:19
for example,
32:21
that they would take place over
32:24
five, 10, 15 years. And instead, much
32:26
of this has happened in weeks. So a lot
32:28
of the conversation now when we're at a point in America
32:31
where unemployment is at Great Depression levels
32:33
and a lot of the assumption by many Americans
32:36
is that things will go back to normal, that their old jobs
32:38
will be there. I don't think that's a bet
32:40
we can make collectively
32:42
let alone individually. Moreover,
32:45
the use of data as we come
32:47
up with our societal response to this pandemic
32:50
is fundamentally rewriting
32:52
the rules about what information government
32:54
collects and how it uses it, what information
32:57
business collects and how it uses it.
32:59
And from my red-team, kind
33:01
of dystopian perspective,
33:05
well, I should say I'm an optimist who stares into the abyss,
33:09
the perspective of somebody who's thinking about bad
33:11
stuff happening a lot, thinking about
33:13
ways people can exploit not only
33:15
the gaps that society is starting to see form
33:18
in terms of the political
33:20
fissures or cultural ones right now, but also
33:23
what all that data means for
33:25
the stability and security of society, especially if it's
33:27
not well managed.
33:29
So it's a lot that is being wrapped
33:31
up of course in a simple book,
33:33
but our hope is that people will connect with the story
33:35
enough that they start pulling these threads themselves.
33:40
I've long felt that it's impossible
33:42
to read a book and completely
33:45
discount the context of
33:47
your own life when you're reading it. We're
33:50
in this kind of unprecedented set of circumstances
33:52
and there are a number of lessons that you sort of touched on and
33:55
highlighted that are especially
33:57
sort of resonant today in
34:00
this context that maybe wouldn't have been if,
34:02
you know, this book had gone on and been published
34:04
and the current, you know, the sort of pandemic
34:07
circumstances hadn't
34:09
sort of emerged. So I
34:11
think readers are also going to, especially those that have
34:13
pre-ordered or are going to be reading in
34:15
the coming weeks or hopefully not too many
34:17
months, depending on how long this goes on, are
34:20
going to kind of appreciate it on a
34:22
different level
34:24
as I did. So thank you so much for joining
34:27
us for this episode of the MWI podcast. It's
34:29
a great book. I think it's out right
34:32
around the day that this podcast, this episode
34:34
will go live. So I
34:37
hope it does well and best of luck.
34:39
Thanks again. And it's always great to be in contact
34:42
with MWI.
34:48
Hey, thanks again for listening to the MWI podcast.
34:51
One last thing before you go. If you aren't subscribed
34:53
to the podcast, you can find us on Apple Podcasts,
34:55
Stitcher, Spotify, and TuneIn. We'll
34:58
be right back. Thanks again. All right.