Episode Transcript
0:00
Brought to you by Toyota. Let's
0:02
go places. Welcome
0:07
to Forward Thinking. Hey
0:13
there, and welcome to Forward Thinking,
0:15
the podcast that looks at the future and
0:17
says it's gonna be the future soon.
0:20
I'm Jonathan Strickland, I'm Lauren, and
0:22
I'm Joe McCormick. And today we
0:24
thought we would participate in a quintessential
0:27
Forward Thinking exercise, which
0:29
is analysis of the prediction
0:32
of the future. Yeah, we've done a couple
0:34
of episodes where we've talked about projections
0:38
that current futurists have about the future.
0:40
We've even talked about really lousy
0:43
predictions that happened in the past. Lauren
0:45
and I did an episode when you were out once,
0:47
Joe, and I remember when you came back you were sad to find
0:49
out that you didn't get to participate, where we talked
0:52
about really off
0:54
the mark predictions from the past where
0:56
people were just thinking all sorts of
0:58
crazy things were gonna happen by now. So
1:01
we wanted to talk about some of our favorites, whether
1:03
they were completely off
1:05
track way back when, or if it's
1:07
a current prediction about what the future will
1:09
be and what our thoughts are. And so we kind
1:11
of all jotted down some notes about
1:13
the sort of stuff that we want to talk about. My first
1:15
one is actually about a series of drawings
1:18
postcards really, from France,
1:20
from around
1:23
nineteen ten. You know, I'd seen
1:25
these things before, and these are great,
1:28
beautiful. They're like full color beauties
1:32
that are, I don't know, they're not actually
1:34
as cartoonish as one might imagine.
1:37
Some of them have some interesting detail in
1:40
them. Yeah, there's a lot of different,
1:42
um well,
1:45
different aspects of what they thought the future would
1:47
be like. Uh, and it's all sort of
1:49
fantastical. Some of them actually are
1:52
are pretty prescient. Uh. They were
1:54
essentially thinking about what is the world going to
1:56
be like in the year two thousand. Yeah,
1:58
So one of them, I know, it's got like a
2:01
barber shop where people are getting
2:03
some gentlemen are leaning
2:05
back in very cushy looking chairs getting
2:08
their necks shaved by robots
2:10
with razors going all over the place, crazy
2:13
robot arms off of big... they
2:16
look like kind of like pneumatic poles that come
2:18
up at intervals out of the floor. Yeah,
2:21
essentially like kind of like a column but with some
2:23
sort of pneumatic element to it. So that the
2:25
arms that are bolted onto the pneumatic
2:27
part could be raised or lowered. When
2:30
I watched those DARPA Robotics
2:32
Challenge fails, the main thing I think
2:35
is I want one of these things shaving me with
2:37
a straight razor. Well, and and
2:39
let's let's also point out that in this particular
2:42
illustration, there is a gentleman uh
2:44
well coiffed, which would not normally
2:46
happen if you happen to be the barber. You always
2:49
know the barber in the town, he's the one with the worst
2:51
hairdo. Oh
2:53
I guess, yeah, but he
2:55
this guy has a great hairdo because it's all done
2:58
with these robot arms, although he is currently
3:00
operating a
3:02
system, well at least a giant lever, so
3:05
perhaps there is at least some human power
3:08
in this in this system,
3:10
even if it's just to turn it on or off.
3:12
I think what I like less than the idea of
3:15
robot arms shaving me um
3:17
is uh robot arms
3:19
controlled by a person who's not looking
3:21
at me. That's true. His back is
3:24
to the customers. I think these are
3:26
not autonomous shaving robots.
3:28
These these are in fact just like multiple
3:31
extensions. You know you've got those machines
3:33
where you press one button
3:35
and it does like five things at once. I
3:37
think that's what's going on here. The barber is
3:39
operating a machine and the machine translates
3:42
all his actions into
3:44
actions on five different customers
3:46
at the same time. Yeah, it's more. It's a
3:49
more sophisticated version of that old This
3:51
might actually be too old for you guys. But for
3:53
for chalkboards, there there were
3:55
occasionally these things. It was like just a little board
3:57
with little wire holders that held... you're talking about the thing with
4:00
multiple pieces of chalk in it. Yeah,
4:02
it was the way of cheating. If the teacher
4:04
told you to stay after school, right,
4:07
I will not spit in my neighbor's milk, you
4:09
know, a bunch of times, right. So so it's
4:11
it's like that where it's really the
4:13
idea is to to make it easier for one
4:16
person to do the work of like
4:18
five all at the same
4:20
time. Another image has
4:22
an aero cab station. I love
4:24
this this picture. This is actually something that's
4:27
kind of interesting because, uh, right
4:29
now, as we record this, there's
4:31
talk of Uber looking
4:33
into creating VTOL
4:36
vehicles for taxi
4:38
cabs. We talked about this. I think
4:40
that's a publicity stunt.
4:42
No, I don't think
4:44
it's a publicity stunt. I think it's I think it's
4:46
a company that is used to making
4:49
grandiose proclamations
4:52
and then not quite
4:54
realizing what the consequences are ahead
4:56
of time. I don't think that. I don't think it's so much
4:58
a publicity stunt as it is them rushing
5:01
into an area that's not quite
5:03
mature. Well, I mean, like when Uber says,
5:05
oh, we want to have driverless cars,
5:08
I can see that being a feasible
5:10
thing not too far in the future. When
5:12
they say we want VTOL, uh,
5:15
flying taxi cabs? You're flying Ubers?
5:18
Yeah? Yeah, whatever? Yeah. Well
5:20
the picture, the picture from these postcards shows
5:23
uh, tiny cabs that have wings
5:25
that apparently can extend or or can
5:28
tilt upward so they can dock
5:30
with a landing station,
5:33
and then they have propellers on the front that allow them
5:35
to fly. Obviously
5:37
that would not be the case with the the Uber
5:39
approach. They're looking at vto L and we talked
5:41
about this when we said what would it take to have
5:44
flying cars, and I think we all agreed
5:46
VTOL would be absolutely necessary just
5:49
from a space issue, and that autonomous
5:52
operation would be necessary. Because we don't think any
5:54
human being should ever operate a flying
5:56
car unless they are like a
5:59
very accomplished pilot, especially
6:01
if you're talking about within a dense urban
6:04
setting, right. Yeah, if it's out in
6:06
the open and there's there's plenty of space,
6:08
maybe you could have like some exhibitions
6:11
or whatever. But if you're talking about
6:14
mundane day to day operations, we
6:16
want autonomy in that. Yeah.
6:19
I I never want the day when, uh, when
6:21
a car crash results in like in
6:24
like fiery parts raining
6:26
from the sky. That's a that's not
6:28
good news. Uh.
6:30
I just want those people operating a multi
6:33
ton lethal weapon they can drive around on the
6:35
ground at seventy miles an hour. I've
6:37
been saying this whole time that I can't
6:39
wait for driverless cars. I mean, I'm just
6:41
I want to make that clear. Uh.
6:43
No, I want to point something out because I've looked at a
6:45
bunch of these postcards, these French postcards
6:48
you're referring to, and a whole
6:50
lot of them involve flight. Many
6:53
of them do, I would say from my memory,
6:55
just more than half of them involved flying
6:58
humans of one type or another. Well, and
7:00
that's the thing, it's always humans, Like the concept
7:02
of flying robotics I don't think had occurred
7:04
to to anyone doing this postcard
7:07
series, particularly like like one of the
7:09
other images that you that you added to our show notes
7:11
here has a fireman with
7:14
these like kind of bat looking wings
7:17
that are putting out a fire. It's essentially
7:19
like they're all wearing jet packs, except instead
7:21
of jets, they are flapping wings. Yeah.
7:24
Yeah uh. And and there are several
7:26
in the series that are similar to that. I only
7:28
included the fireman one because it was such
7:30
a beautiful picture of these
7:33
people who are
7:35
flying through the air using water
7:37
hoses to fight fires and to
7:40
their you know, flying up to rescue a baby.
7:42
And can I point out that in this picture
7:44
if I'm understanding what the artists meant
7:46
correctly, so the firefighters have these bat
7:49
type wings, and I believe the wings
7:51
are secured to their ankles
7:53
via some kind of connecting strap.
7:56
So it looks like what the artist had in mind
7:58
is that the firefighter is
8:01
held aloft in the air by flapping
8:03
wings that are powered by
8:05
the firefighter's own leg movements.
8:08
They could also be pneumatic. It's
8:11
hard to say. That particular image,
8:13
it's very difficult to say. But you see,
8:17
I hadn't noticed that before. But yeah, that's that's very...
8:20
Well, if you if you look at some of the other ones from the
8:22
series, there's another one where there's a police
8:24
officer who's pulling over an aviator
8:27
who has clearly done something
8:29
naughty, and so the police officer is wearing
8:32
wings that are that extend
8:34
out from his back, but has a tail
8:36
as well, presumably to provide stability.
8:39
I mean, we're not crazy and
8:41
uh that, but the tail is attached
8:43
to his ankles, So he's got he's
8:45
got these tethers that essentially go from the tail
8:47
to his ankles, and he's in
8:50
an upright position with
8:53
respect to the airplane that he's trying to pull
8:55
over. Not entirely certain how
8:57
he's maintaining altitude. Well,
9:00
in a vertical position with wings,
9:03
you would think that would pretty much cause you to plummet.
9:06
But in the future, physics will
9:08
no longer apply. Uh, well, and I
9:10
did want to mention that, you know, we are,
9:13
this year, we've been fighting fires
9:15
with drones. Yeah, no, yeah,
9:17
we're starting to see and we've also seen issues
9:20
with drones getting in the way of firefighters. But it's
9:22
nice to see them being used
9:24
in scoping out,
9:26
um, like how big a
9:28
fire is, like how far does it stretch? And
9:31
also to get a good look at places
9:33
that would obviously be dangerous to send
9:35
a person into without first checking
9:37
out what's going on. But we don't attach
9:40
a drone to a person and send a person in because
9:43
that's crazy, because that's obviously terrible. Yeah,
9:45
that would be not so good. Also,
9:48
there's another element in some of these
9:50
and you know, I didn't include all of them. I
9:53
included you know a few, but if
9:55
you go to, there's a website, the Public Domain
9:57
Review dot org, which has the entire collection,
9:59
ostensibly, of all the ones that have survived,
10:02
and they're gorgeous. There's a collection
10:04
of them. Also where another big
10:07
theme was underwater activities.
10:10
Yeah, there were people riding
10:13
on seahorses, giant seahorses.
10:16
In the future there will be giant seahorses. There's
10:18
giant seahorses. There's one where they're,
10:20
uh, doing the
10:22
equivalent of a horse race, except they're all riding
10:25
very long eel like fish.
10:28
Uh. There's another one where they are
10:30
fishing for seagulls. So
10:32
they're throwing hooked lines up
10:35
through the surface of the water. Seagulls
10:37
grabbed them, and then they dragged the seagulls
10:39
underwater, presumably to
10:41
their deaths. Uh. I don't know
10:43
why you would want to do that, drowning birds
10:45
for fun. I guess that's no more horrifying
10:48
than pulling fish, which breathe underwater
10:51
out into our atmosphere. Maybe
10:53
I think it's a little more. And
10:55
I like fish better than I like birds, so
10:59
I'm sort of okay with it. All right, Well,
11:01
Lauren the seagull murderer. Sorry, I'll
11:03
own that. Speaking of disturbing
11:06
uses of birds, there's another one of
11:08
these images that is just titled
11:11
intensive breeding, and
11:14
it is a picture of a woman who
11:16
I think is working on an egg farm
11:19
with a basket full of eggs and she's attending
11:21
this big refrigerator sized machine
11:24
that has lots of little yellow chicks
11:26
coming out of it on a slide. Yeah.
11:29
Yeah, you put the eggs in the
11:31
top and chicks come out. They get
11:33
to ride a slide. What's what's
11:35
so sinister about that? I don't
11:37
know what is this machine supposed to be doing.
11:39
I think it just hatches eggs faster than a
11:42
than a chicken could, apparently. I don't
11:44
know, by magic and technology. There's
11:47
one that I think is actually kind of kind
11:49
of interesting. There's the electric scrubbing
11:52
one, where it's a showing
11:54
a machine that is used to brush
11:57
and scrub the floor. This again
11:59
is being operated by somebody a maid
12:01
at this point off to the side, so it's
12:04
not fully autonomous. Who in the year two thousand
12:06
is still wearing something that looks strikingly like
12:09
like Edwardian maid attire. That's
12:11
something that we wanted to mention too, is that,
12:14
Joe, You brought this up earlier today when I
12:16
was telling you this was going to be something I was going to talk
12:18
about. So while the
12:20
technology uh seems
12:22
to be kind of an advanced version of
12:24
what they had available to themselves in these
12:26
visions, when it comes to things
12:28
like costume and,
12:31
uh, hairstyles,
12:34
everything remains exactly
12:36
of that era. They can imagine the technology
12:39
changing, but they can't imagine the culture changing
12:41
exactly. Yeah, you don't see
12:43
any representation of other ethnicities
12:46
either, Yeah, well,
12:48
and in these cases, I mean, it's certainly
12:51
not as though other work from about
12:53
the same time period didn't portray, like,
12:55
like, futuristic, in big honking
12:57
quote marks. Um, quote marks, yeah,
13:00
but I like them. Um,
13:03
uh, futuristic
13:05
fashion, um, in other
13:08
science fiction and science-presumptuous
13:12
materials. Um, but
13:14
yeah, in this particular case, it's
13:16
part of its charm. But yeah, so this cleaning
13:18
device is essentially a very large,
13:20
manually operated Roomba. Yeah,
13:23
yeah exactly, and and it's kind of
13:25
this also kind of plays into
13:27
something else I want to talk about, just in general
13:29
when it comes to predicting the future.
13:32
Two things. One was that that idea that
13:35
we find it very difficult to imagine
13:38
how people change along
13:41
with technology. Right We we sit there and we look
13:43
at the technology, say well, here's what the technology is
13:45
going to be like in another fifty years, but
13:47
it's hard for us to imagine what people
13:49
will be like at that time. Can I offer
13:52
a possible reason for that? Sure? I
13:54
would think that that might have something to do with the
13:56
fact that technology
13:58
changes along
14:01
a predictable path, in that you
14:03
don't know exactly what solutions
14:05
technology will will represent,
14:08
what it will come up with, but you sort of know
14:10
what problems it will be trying to solve,
14:13
unless you're imagining a very far future
14:15
where there are new problems that haven't even emerged
14:17
yet. But all these problems are things
14:19
that people understood at the
14:22
time and were things that people actually
14:24
wanted done. They wanted robots to make
14:26
work easier, they wanted
14:28
the power of flight, they wanted
14:31
faster transportation, they wanted
14:33
automation. All these things are
14:35
real obvious concerns when
14:38
you're trying to predict the future of culture,
14:41
like how clothes
14:43
will look different, you don't have that same
14:46
prompt. You can't say, like, I
14:48
know what problems future fashion
14:50
you know fashion designers will be trying to solve.
14:53
It's just not something you can predict. It's
14:55
not a linear change pattern. You could
14:57
probably predict that whatever the fashion
15:00
is of that particular era, it would
15:02
be some sort of take on the fashion
15:04
from twenty years previous. Yes,
15:07
that's usually a pretty good rule. And furthermore,
15:09
it's really difficult to predict how
15:12
technologies like that could change us,
15:15
like like what like how those technologies
15:17
would be changing the culture that would lead to
15:20
those kind of differences in appearance and
15:22
uh, and day to day life
15:24
all of that sort of thing, like like you can sit there and
15:26
imagine that they like, yes, there's going to be
15:28
this this cleaning robot,
15:30
but you're still hiring a maid to
15:33
operate it, right, Yeah, And I'm
15:35
not I don't really want to call out
15:37
the artists of that time
15:39
period for being particularly silly
15:41
or whatever. This is true for every era, right
15:43
yeah, yeah, these are just beautiful
15:46
examples. Yeah, and I would I would also
15:48
say these things are probably made with some sense of
15:51
humor about them. Sure, I'm just
15:53
saying I can't imagine, say, being in the
15:55
nineteen eighties and being able to
15:57
anticipate, like, the man bun. It
16:00
just wouldn't have occurred to me. Never would
16:02
have never would have thought that that would become
16:04
a thing. This isn't even me judging.
16:06
I'm just saying that in the eighties I never would have guessed
16:09
that that would have been a big deal. Like it
16:11
would have been one of those deals
16:13
where someone had shown me a picture and I'm like, well, what?
16:16
Why is... what's the... where did you get
16:18
this? What if in the nineteen eighties somebody
16:20
told you people will be making multimillion
16:22
dollar live action Transformers
16:25
movies. I mean I would have at
16:27
the time thought that that would have been brilliant,
16:30
because it would have been before Michael Bay
16:32
got a chance to do that. Wait,
16:34
how would you even imagine that the nineteen eighties
16:36
live action Transformers? That didn't
16:38
even make sense. I'd be able to imagine
16:41
it in the nineteen eighties. I was a kid. I could imagine
16:43
that I was a giraffe. Okay,
16:46
come on, I mean as an adult it's a lot
16:48
harder, but as a kid limitless.
16:51
So also I wanted to mention
16:53
that that the other issue, because I said
16:55
there were two. The other issue with predicting
16:57
the future, especially when it comes to technology, is that
17:00
we typically look at what is
17:03
available today, we look at the things
17:05
that we are currently trying to develop right
17:08
now, and then we just sort of project based
17:10
upon that. Right we sit there and say, okay, let's extend
17:13
outward from where we are now and
17:15
where we look like we're headed. But
17:17
that means we we can't and by definition,
17:19
we cannot anticipate any
17:22
innovations that come out of left field that
17:24
changed the game entirely, which
17:27
means that by the time we get to that future
17:29
and maybe that certain things we thought were promising
17:31
turned out to be dead ends, other things we
17:33
hadn't even considered might be the norm.
17:35
A great example of this would be back in the nineteen
17:38
fifties when everyone was building electronics
17:41
with vacuum tubes. No one at
17:43
that point in the early fifties was really anticipating
17:46
that transistors were going to, one,
17:48
become small enough to really become
17:50
an important electronic component and, two, to replace
17:53
vacuum tubes. And that's why you get these predictions
17:55
about giant computers in the future, right
17:58
because back then, if you were using vacuum
18:00
tubes, they take up a lot more space than transistors
18:02
do, and it would mean that if you want a really powerful
18:04
computer, you'd have to have a lot more vacuum tubes.
18:07
So it's the reason that a very very
18:09
powerful computer would be enormous, the size
18:11
of a building or bigger. So because
18:14
of that, we find
18:16
it amusing to see some of these
18:18
predictions. But we have to keep in mind that we're doing
18:20
the same thing, right, We're making predictions
18:23
based upon where we are right now. Uh
18:25
And by again, by definition, we cannot
18:27
anticipate something that comes out of
18:29
seemingly nowhere. Uh. So
18:32
I don't want to heap too much uh
18:36
jovial uh disdain
18:39
or anything like that towards people who have made
18:41
these predictions. I prefer we heap saturnine
18:44
disdain. Other than all
18:46
the time? Yes, of course. Well, the
18:48
other one I wanted to talk about, which is
18:51
a little less less whimsical,
18:54
I would say, is the driverless
18:56
car predictions, because
18:58
as I mentioned a few minutes ago,
19:01
I love the idea of
19:03
a society that has driverless
19:06
cars, especially if driverless cars are the primary
19:08
vehicles on the road. Yeah. I try
19:11
not to be overly sanguine about
19:14
naive, optimistic future
19:16
predictions, but I feel like this is
19:18
one I've always been pretty
19:20
optimistic about and remain
19:22
pretty optimistic about, even though even
19:24
though we've now seen some tragic accidents
19:26
with driverless cars. I think, I
19:29
don't know, I I'm I'm pretty bullish on
19:31
this one. Yeah, I think I think it's completely possible,
19:33
just there are so many kinks to work out before
19:36
it's practical. I think
19:38
the kinks are mainly on the political side
19:40
and social side as well, and not so much on the
19:42
technological side there. So yeah, yeah, exactly.
19:44
No, I mean because because mostly like like we we technically
19:47
have the technology to to have all of these
19:49
devices talking to each other. We just need a
19:52
the rules and regulations in place that will
19:54
make companies make their products talk
19:56
to each other, and,
20:00
uh, build it.
20:03
Yeah. Well, well we also need
20:05
to get people on board
20:08
literally and figuratively with the
20:10
idea of driverless cars. Yeah, I mean you've
20:12
got you've got an expectations problem in
20:15
that people. Essentially,
20:17
it's like, you're not gonna hold driverless
20:20
cars to the same standard that you would hold
20:22
cars with human drivers who can take
20:24
personal responsibility to drive.
20:27
Human drivers are already so bad at
20:30
driving. Driverless cars, I don't
20:32
think they're gonna have a problem of being worse
20:34
than human drivers. But the problem is
20:36
it's not okay for them to just be better
20:38
than human drivers. They essentially need
20:40
to be perfect well, and when
20:43
it comes to driverless cars. So I would say
20:45
that that the truly,
20:49
the truly autonomous cars
20:51
that are out there, not
20:53
not like Tesla's autopilot, but the
20:55
truly autonomous cars out there have demonstrated
20:58
that, based upon the number of miles they've
21:00
driven without any, uh,
21:02
major accidents, that they are
21:06
by far better than human drivers
21:08
if you look at them, you know, for a
21:10
million miles and how many accidents are
21:12
are represented. Um, it's pretty
21:16
cut and dry that they're superior.
21:19
Uh. When you get to autopilot,
21:21
then that's not supposed to be
21:23
an autonomous system that's not intended
21:25
to be. People treat it as if it is,
21:28
and that's a that's a problem. We're talking
21:30
about public opinion. It's all about perceptions,
21:33
right, and and also there's this public opinion.
21:35
So so the Verge ran a piece recently in
21:37
which they referenced a poll where
21:40
people were asked, US drivers were asked,
21:42
about their opinions on self driving
21:44
cars. One of the questions they were asked was that, um,
21:48
they were given choices between different levels of autonomy.
21:51
So a level five autonomous car would be
21:53
one that's fully autonomous and has no
21:56
control system for a human. Level
21:58
four would have control systems
22:01
that humans could use. And everyone's like,
22:03
I don't want level five. Well, not everyone, but like,
22:05
I don't want level five, I want level four. Like
22:07
I absolutely want to be able
22:10
to wrest control from the system
22:12
if I need to. I think they're they're
22:14
anticipating a scenario where they really
22:16
want to be able to run somebody over with their
22:18
car. Uh, maybe, maybe. But then
22:21
immediately I realized that that's
22:23
a terrible problem for people
22:25
to say, I want to still be able to take over a control
22:27
of this car. I get it that if you
22:30
find driving enjoyable and
22:32
you still want to have that experience occasionally,
22:34
I get it. But if you're talking about taking
22:36
over control of the car after it's under autonomous
22:39
control, what you're really talking about when
22:41
you boil it down is two drivers struggling
22:44
for control of the same vehicle at the same time.
22:46
Even if you have an autonomous vehicle that can very
22:48
quickly figure out that a human
22:50
is trying to take over and switch
22:53
over to manual control, it's it's
22:55
still kind of like if I were to reach over while Lauren's
22:57
driving and just grab the steering wheel and give it a nice
22:59
sharp tug to the right, that's not
23:02
good. That's bad.
23:05
Unless for some reason I had stopped paying attention and was
23:08
driving us off a cliff. That would be good.
23:10
And I think that's the kind of scenario
23:12
that people people are thinking
23:14
about. Yeah, I guess they're imagining
23:17
like that. They're thinking this car is
23:19
going to have problems or at least early generations
23:22
are, and I'm going to need to... I'm going
23:24
to need to have manual override to get around
23:26
those problems, Like there's going to be a life or death situation
23:28
that it's going to come up that that I am going to
23:30
need to be able to stop the car in case. Yeah,
23:33
that's that's what they're imagining, and I think it's
23:35
um my opinion is
23:37
that they are are largely wrong, or
23:39
maybe they're mistaken, and they think that the total
23:42
control means that the car is also going to tell
23:44
them what they have to listen to on the radio. Yeah,
23:47
well, I mean they
23:50
would have issues with that too, especially that radio
23:53
or like or like tell them which like coffee
23:55
shop to go to. And, right, it's like, I
23:57
won't take
23:59
you there. I should also point out that, apparently,
24:01
this poll also had
24:04
respondents saying that they don't want the car that
24:06
they own to be autonomous. And I
24:08
think that presupposes the idea that they would own
24:10
a car. Um As
24:13
we've talked about a lot of the models
24:15
that would have autonomous cars on the road suggest
24:17
that these would not be personal vehicles. You would not
24:19
own an autonomous car. It
24:21
wouldn't make sense. It would make more sense
24:24
to have a company operating fleets
24:26
of autonomous cars and it's all
24:29
on demand. And the reason why it makes sense is that
24:31
if your car is sitting idle more than ninety
24:34
percent of the time you own it,
24:36
doesn't it make more sense for that car to go out
24:38
and do work rather than just sit there.
24:40
And if it were doing that, then you could
24:43
end up freeing up space that would otherwise
24:45
be used for things like parking spots
24:47
or garages. Uh, anyone
24:50
who has a garage would have essentially an extra
24:52
room now for storage or whatever you
24:54
wanted. Um, you would be able to
24:56
free up space on streets where you wouldn't have
24:58
people parking all up and down streets. My
25:00
street, you would actually be able to drive down because
25:03
people park on both sides of the street to the
25:05
point where if your car is wide enough,
25:07
you're gonna reconsider going down that way.
25:10
Um. So, I mean, I
25:13
think the most realistic vision
25:15
of driverless cars remains this
25:17
idea of a fleet that it's
25:19
like an Uber or a Lyft where you call
25:21
a ride when you need to. And now, obviously that
25:24
means that the best use case
25:26
for This would be in dense urban environments.
25:28
If you're out in a rural area, it
25:30
makes more sense to have a personally operated
25:33
vehicle because you're not going to
25:35
have enough density of vehicles
25:38
in those rural spots to have a
25:40
reasonable, um, response time.
25:42
If you need to have a ride, well, I
25:44
mean, you know you could. You could say that it would be
25:46
something like a like a Netflix subscription and
25:48
and having different subscriptions
25:50
for different needs. Like if you're in a city, Uh,
25:53
then you call a car up whenever you need
25:55
one. If you're in the country, maybe you
25:57
have a you have a car that you essentially
26:00
like lease out. Maybe it is autonomous that
26:02
you hang out with for a certain
26:04
period of time. I could see that possibly
26:06
being the case. Uh, you know,
26:09
there are, obviously, a lot of opportunities
26:11
for different, um, business strategies.
26:13
So just as I was saying earlier where we were,
26:16
you know, taking today and then projecting outward,
26:18
I was just doing that right now
26:20
when we were talking about driverless cars, and you
26:22
brought up in a situation
26:25
Lauren that I had not really thought
26:27
about, but it totally makes sense
26:29
that could completely be someone's business
26:31
plan moving forward. So, um,
26:34
I also think autonomous cars, they're gonna
26:36
be a thing, whether they become the thing
26:39
and replace uh personally
26:41
like manually operated vehicles. I
26:44
don't know what timeline we'd be looking
26:46
at for something like that. I'm guessing a
26:48
couple of decades at the earliest to
26:51
to really get to a point where you've
26:53
got enough of the population saying yeah, I
26:56
don't care about driving a car. I just
26:58
want to have a way of getting to wherever I
27:00
need to go, and then not worry about
27:02
it again. Um, it may
27:04
take like a generation or two to get through. Well,
27:07
we keep hearing stories about how more
27:09
younger people have less
27:11
of a desire to own a car for
27:13
for a variety of reasons, largely economic,
27:16
but not only economic. And
27:18
I imagine if those trends continue, then
27:21
we'll see people much more receptive to this
27:23
idea. But that
27:25
that again presupposes that a trend
27:27
continues and doesn't change. That's
27:29
still a possibility. Anyway, those were
27:31
the two I really wanted to talk about. One that was more
27:33
whimsical and one that was more grounded
27:36
in reality. Um,
27:38
but it's the kind of stuff that, you know, it's
27:40
hard for me to pick just a couple of predictions
27:43
about the future that I really love, but uh,
27:45
I decided to to
27:48
to just commit to those. So I
27:50
want to hear what you guys picked.
27:53
Well. I wanted to think about
27:56
predictions about the social impact
27:58
of telecommunications. Who
28:02
uses their phone now?
28:05
Well, no, I would say the Internet counts as telecommunications,
28:09
and so does telegraphy. That's
28:13
also emoji. Emoji,
28:15
Yeah, if you're sending them across
28:17
an Internet or text message. You know, we
28:19
have never done a full episode about emoji
28:21
before. Go
28:25
ahead, Joe. Cry
28:29
laughing. It would be amazing
28:31
if we came up with never mind. Okay,
28:34
So, yeah, I wanted to talk about something
28:36
that I've actually mentioned. I know I mentioned
28:39
once on a Tech Stuff
28:41
episode that I guessed it on with you,
28:43
Jonathan. Which is
28:46
an article from nineteen twelve
28:49
from Technical World magazine.
28:51
I didn't check to see if this magazine still exists.
28:54
I wonder if it does. Probably not, but
28:56
the article is by one Ivan Narodny,
28:59
and it's a profile of Guglielmo
29:03
Marconi. Marconi is billed in
29:05
this article as the inventor of
29:07
the wireless telegraph, but a
29:09
well known inventor at the time. Sometimes
29:11
also credited with the radio right right
29:14
Tesla fans hate that. Yes,
29:16
he often is. So the article
29:18
is called Marconi's Plans for the World,
29:21
and it's got a bunch of long quotes from
29:23
from Marconi, and this is one of the
29:25
things he says. Quote. I am
29:27
not personally a socialist. I have small
29:30
faith in any political propaganda.
29:32
But I do believe that the progress of invention
29:35
will create a state which will realize
29:37
most of the present dreams of the socialists.
29:40
The coming of the wireless era will
29:42
make war impossible, because
29:45
it will make war ridiculous.
29:48
The inventor is the greatest revolutionist
29:50
in the world. If
29:52
only. It's such a nice thought. Now, I
29:54
think there is a certain grain of truth
29:57
to this, because I've read arguments and I think
29:59
they're kind of convincing that a lot of
30:01
social changes you see throughout history
30:04
that you would see attributed
30:06
to changes in ideology or
30:08
political movements and stuff like that,
30:11
is actually better understood as a direct
30:14
result of technological change
30:16
rather than ideological or political
30:18
change. I think in a lot of cases that is
30:20
sort of true. So his comment about the
30:23
inventor being the greatest revolutionist,
30:26
I think there might be something to that. But
30:29
on the other hand, he says, the wireless
30:32
era is going to make war impossible,
30:34
because it will make war ridiculous. This
30:36
was right before the
30:39
First Great War, and
30:42
then of course we got a few
30:44
more after that. It was the War to end all wars,
30:46
and then the one after that one. Yeah, another
30:49
funny thing. Later in the same article, the author
30:51
summarizes more of Marconi's comments
30:53
by saying, quote, a step further
30:55
in the progress of wireless stands
30:57
wireless lighting, heating, and transmission
31:00
of motor power. Each of these
31:02
systems is based on the same principle as
31:04
the wireless telegraph,
31:06
only the transmitting and receiving
31:08
instruments are different, and the vibrations
31:11
of the etheric waves have a
31:13
different nature, intensity, and length.
31:15
This is also very Tesla-esque. Etheric
31:17
waves. That's great, um, and we
31:20
we have done episodes on on like
31:22
wireless lighting. Yeah, we
31:25
talked about things like using
31:27
inductive coupling and that sort of stuff. Yeah.
31:29
But he's thinking at the grand scale,
31:31
so he envisions like he
31:33
talks about a Niagara Falls power
31:36
plant that would generate hydro power
31:38
and then wirelessly transmit a hundred
31:41
and fifty million horsepower
31:44
across New York State and then sell
31:46
it to other states. It's very, very similar
31:48
to the Tesla, uh,
31:51
approach, or the Tesla beliefs of the time.
31:53
Uh, wildly impractical,
31:56
as it turns out, right. This thing about
31:58
power transmission is sort of to the side. But
32:00
I did want to mention that because two
32:02
main principles come up in this article, the wireless
32:05
telecommunications, the wireless
32:07
telegraph, and then wireless
32:09
power transmission. Of course, now one
32:11
is a reality and the other is
32:13
not, at least not at the large scale. And
32:16
I guess Marconi could have been referring
32:18
to either or both when he predicted
32:20
that these conditions would bring about the end of
32:23
war. But I tend to think more
32:25
likely that he's referring to
32:27
the prospect of universal instantaneous
32:30
wireless communication as
32:32
that set of conditions that he thinks is going to make
32:34
war ridiculous and bring it to an end.
32:37
And I think this because it fits in
32:39
with a perennial strain of utopian
32:41
thinking about the implications of
32:44
new telecommunications technology.
32:46
Yeah, well, I mean I can see the
32:49
merit in that kind of thought
32:51
process. It's it's tempting to believe
32:54
that if people are able to communicate
32:56
quickly with each other um, then
32:58
then they'll be able to reach a better understanding
33:01
of each other's motivations, and therefore
33:03
not have as much stuff to fight about. Yeah,
33:05
it's essentially a grander scale
33:08
version of telling children
33:11
just talk it out, it'll be
33:13
fine. Very much. And there are people
33:15
who very very much bought into
33:17
this ideology and have throughout
33:19
history, or at least throughout the past couple
33:22
hundred years. So I want to read you a quote
33:24
from a couple of authors named Charles F. Briggs
33:27
and Augustus Maverick from
33:29
eighteen fifty eight, and this is what
33:31
they wrote. It has been the result
33:34
of the great discoveries of the past century
33:36
to effect a revolution in political
33:38
and social life by establishing
33:40
a more intimate connection between
33:43
nations with race and race.
33:45
It has been found that the old system
33:47
of exclusion and insulation are
34:49
stagnation and death. National
33:52
health can only be maintained by the
33:55
free and unobstructed interchange
33:57
of each with all. How potent
33:59
a power, then, is the telegraph
34:01
destined to become in the civilization
34:03
of the world. This binds
34:05
together by a vital cord all
34:08
the nations of the Earth. It is impossible
34:11
that old prejudices and hostilities
34:13
should longer exist while such
34:15
an instrument has been created for an
34:17
exchange of thought between all
34:19
the nations of the Earth. So
34:22
this seems like sort of along the lines of the
34:24
Marconi strain of thinking. Right, you get
34:26
people connected to each other through instantaneous
34:29
telecommunication and they
34:31
just sort of like become one very harmonious
34:34
mind. And there
34:36
was an American communication theorist I was reading
34:38
about named James W. Carey, who
34:41
wrote a book called Communication
34:44
as Culture. This was in the
34:46
nineteen eighties, I think, in
34:48
which there is a chapter about the
34:50
impact of the original telegraph. Now this
34:52
was the wired telegraph, not the wireless
34:54
one, but I think the same principle applies
34:56
to how people were thinking about
34:58
them. And Carey points out that this
35:00
was you know, it's the first major invention
35:03
to separate the concepts of communication
35:06
and transportation. Really, it's the
35:08
first example of telecommunication: communication
35:10
without the transport of mass
35:12
basically. And I want to read a passage
35:15
where Carey summarizes
35:17
this historical attitude. He says,
35:19
quote, there were dissenters, of
35:21
course, but the general uniformity
35:23
of reaction to the telegraph demonstrated
35:26
how it was able to fuse the opposite
35:28
poles of the electrical sublime,
35:30
the desire for peace, harmony,
35:32
and self sufficiency with the wish
35:35
for power, profit and productivity.
35:38
The presumed annihilation of
35:40
time and space heralded by the telegraph
35:43
promised to bind the country together, just
35:45
as the portents of the Civil War were
35:47
threatening to tear it apart. And
35:49
he goes on to quote a horrible
35:52
poem from eighteen seventy
35:54
five by somebody named Martin F.
35:57
Tupper. Never read about that guy in my poetry
35:59
education. But
36:02
I've got to read this selection from this poem
36:04
too, And then I'll try to stop with all the quotes.
36:06
But this is just too good. Yes,
36:09
this electric chain from east to west,
36:12
more than mere metal, more than mammon can,
36:15
binds us together, kinsmen, in the best,
36:17
as most affectionate
36:20
and frankest bond; brethren
36:22
as one; and looking far beyond
36:25
the world in an electric union
36:27
blest. I don't know what
36:29
you're saying, man, that's awesome. That's
36:32
some darn fine poetry. Do you hear the alliteration
36:35
there. That guy could have written in Old English.
36:37
Okay, okay, so maybe he's technically
36:40
competent. But this is a poem about the
36:42
telegraph, the electric Union. They
36:45
all have to be about clouds and feelings.
36:47
Joe. But Carey has
36:49
a name for this mode of talk, which he claims
36:52
he coined in conjunction with somebody named
36:54
John Quirk, which is a great name. And
36:57
the name for this whole style of talking
36:59
is the rhetoric of the electrical
37:01
sublime. I loved that.
37:04
Yeah. Actually, come to think of it, I
37:06
feel like a great deal of poetry
37:08
that I saw in my poetry
37:10
classes in the early two thousands in college
37:13
would have fallen under that kind of category.
37:16
It's like people celebrating how technology
37:18
is going to save us all. Or not, not
37:20
like how it will save us all, but like, oh man,
37:22
this is magic. Check out this magic.
37:26
It relates to feelings. Okay,
37:28
okay, I'll cede that, all you
37:31
poetic elitists. Anyway,
37:35
to bring us up to today.
37:37
I would say that the Internet is
37:39
essentially the ultimate extension of the telecommunications
37:42
principle. It's all of the telecommunications
37:44
principles with the switches just flipped onto
37:47
full And I
37:49
think the rhetoric of the electrical sublime
37:51
absolutely extended to those
37:54
beautiful, innocent, naive
37:56
early days of the Internet, not
37:58
even the earliest days into the
38:00
nineteen nineties. Well, to be fair though, the
38:02
nineteen nineties, that's where you get the
38:05
public understanding
38:07
of the Internet, because before that it was essentially
38:10
the domain of researchers and students
38:12
and mega nerds. Well, but yeah,
38:14
there's those first few people who were on
38:16
Prodigy or AOL were
38:19
all like, oh, man, like I can talk
38:21
to my grandmother. Or I could, if
38:23
she had a computer and understood
38:25
how it worked. I mean, you know, though,
38:27
guys, there were some savvy grandmas
38:30
out there on AOL. Uh,
38:32
and we hadn't seen, like, the
38:35
other side of that connectivity yet, which
38:37
is, you know, just like more opportunities
38:40
to argue about terrible stuff. Well,
38:42
at the time when the the Internet was young,
38:44
in the early nineteen nineties, when the Web in particular
38:46
was young, because that didn't really become a thing till
38:49
ninety-two, right. And that
38:51
was the easiest way to access the Internet.
38:53
Otherwise you were accessing elements of the Internet
38:56
like email, which you would just realize
38:58
as like, okay, well, this is just a super fast
39:00
version of mail. I mean, someone's going to get it
39:02
as soon as I hit send. But other than that, it's
39:04
not it's not as transformative
39:06
as some of the other implementations
39:09
of the Internet. Once you get the Web there, everyone
39:12
saw that this was a thing of
39:14
seemingly limitless possibility,
39:16
and because it was just such a big
39:19
opportunity, it was really
39:21
hard to get an idea of how would it actually
39:23
be used, right, because if you
39:25
have every option open to you,
39:27
you don't know what pathway you're going to walk
39:29
down. It may be that you've been walking for a while
39:32
before you realize which path you took. Yeah,
39:34
And so not to say that there wasn't plenty
39:36
of paranoia and dystopian thinking about
39:38
the Net back then, too, given
39:41
things like the movie The Net starring
39:43
Sandra Bullock that was phenomenal and it
39:45
predicted being able to order pizza online
39:48
within the first five minutes of the movie. Oh,
39:50
man, that movie. Rachel and I
39:52
watched that not too long ago. It was
39:54
hilarious. It's so good, right? Stands
39:56
up as being completely terrible. I covered
39:59
it on an episode of Tech Stuff about
40:01
hacking in Hollywood. I love
40:03
it. It's great. But anyway,
40:06
it's those pictures from the nineties where
40:08
people didn't realize what computers could
40:10
and couldn't control yet, so they'd
40:12
have you know, like you get into your office,
40:15
uh, and you sit down at your computer and you can
40:17
control like the fire alarms in the
40:19
building you're in. Yeah,
40:21
okay, but yeah, anyway,
40:24
so I'm sure you'll remember this
40:26
strain of techno utopianism
40:28
from the nineties about the about the internet.
40:31
The Internet is going to be a global village, right,
40:33
this Marshall McLuhan concept, uh,
40:36
the information super Highway. There's
40:38
the idea that there's just sort of like all
40:40
learning, all sharing, all the world
40:43
and a kind of mutually informative benign
40:46
communion where people around
40:48
the world connect, do you remember that word connect
40:50
all the time, and they collaborate
40:52
and learn to understand one another. And
40:56
I think that's funny now, not because I
40:58
would say the Internet has turned out to be
41:00
a bad thing, which I certainly
41:02
don't think, because if you did, then you'd
41:04
be, like, filled with self-loathing. No,
41:07
I don't think that at all. But I do think it's turned
41:09
out to be a thing so ubiquitous
41:11
and so invisible as a substrate
41:13
for day to day behavior that
41:16
it goes beyond categorization as good
41:18
or bad? Like saying whether the Internet
41:20
is good or bad is kind of like saying society
41:23
is good or bad. Sure, I mean, you know,
41:25
on the one hand, you could say, look
41:28
at Vine, or forums where people
41:30
are sharing innovative
41:32
ideas and collaborating in a real
41:35
sense and trying very hard to work
41:38
through challenges. It's it's very inspiring.
41:40
Or look at YouTube comments. Yeah,
41:43
and so I wonder is there anything
41:46
we could say. So, obviously Marconi
41:48
was wrong, Uh, in the
41:50
specific example of wireless communication
41:53
and in the broader idea of
41:55
just telecommunications more
41:57
people connecting instantaneously
41:59
across distances, that that would
42:02
solve social ills and end wars
42:04
and stuff. Is there is there any grain of
42:06
truth to that? I mean, is there any way of saying,
42:08
well, maybe maybe in some way the Internet or
42:11
or other forms of telecommunications
42:13
around the world have in some way caused
42:15
social change for the better. Well,
42:18
they've clearly caused social change, And
42:21
I would argue for the better in many cases.
42:23
But I would say that it wasn't uh
42:26
in the way that Marconi was necessarily
42:29
anticipating. So, for example, the Arab
42:31
Spring being able to use
42:33
the Internet in order to organize
42:36
protests and to inform people as
42:38
to what was going on beyond
42:40
the boundaries of a country
42:43
became incredibly powerful. And even
42:45
within the boundaries of a country wherein these people
42:47
were not allowed to organize
42:50
in other ways. Right and then if you want
42:52
to look at right now, the the
42:54
Black Lives Matter movement, I would argue,
42:57
without the various
42:59
tools that we have connected to the Internet, including
43:01
things like live streaming video, what
43:04
has been a problem for a very
43:06
long time in the United States is
43:09
just now getting the attention that it
43:11
deserves because the tools
43:14
to distribute that information are
43:16
now in the hands of the people that have been
43:18
UH suffering from this problem
43:21
for decades. Really,
43:23
it's not like this is a new problem. It seems
43:25
new to people who were not part of that community
43:28
because it wasn't something
43:30
that affected them. Yeah, similar in
43:32
a way to how, when television
43:34
became a thing that people had in their homes in the
43:36
nineteen sixties, or
43:39
in the late nineteen sixties, early nineteen seventies, um,
43:42
the Vietnam War
43:44
suddenly was thrown into very harsh relief
43:46
because when you started getting images from that thing,
43:49
it wasn't just like, oh, yeah, let's go. Oh this
43:51
is terrible. Like, being able
43:53
to actually see the results,
43:55
as opposed to you get uh an
43:58
article in the paper that gives you very
44:00
relevant information, but it distances
44:02
you from the actual results. Yeah.
44:04
I think I think it's you can't argue
44:07
that it hasn't caused some social
44:10
good, or at least, uh, it has facilitated
44:13
some social good. It hasn't been smooth,
44:16
It's never going to be, because we're human beings
44:19
ultimately, and human beings are messy,
44:21
right, But it has opened
44:24
up opportunities that previously
44:27
weren't there. Do you think though,
44:30
that expanding the power
44:32
of telecommunications always just
44:34
sort of like it causes
44:37
a change in human behavior too for
44:39
the people who have access to this technology,
44:41
and then sort of settles into an
44:43
equilibrium that was similar to
44:45
how things were beforehand, except now you
44:48
just have some new tools. Or does
44:50
it lead to lasting changes?
44:52
I think again, I don't know that it
44:54
leads anywhere. I think what it does is facilitate.
44:58
Just like I said earlier, I think that the
45:01
actual leading to change is dependent upon
45:03
whatever force is trying to
45:05
enact that change, and they're using the telecommunications
45:08
tools as one of
45:10
the methods to enact
45:13
that change. Well, I guess to be more specific
45:15
that the change I'm thinking about is,
45:18
uh, general increases in harmony,
45:21
right, what Marconi had in mind. And I think
45:23
I don't think that it is magically making
45:26
people more harmonious. I think that that was
45:28
a naive kind of prediction. Yeah,
45:30
I think it lets more people
45:33
hear stories, and
45:35
that can ultimately lead
45:38
to change. But it's not. It's not as
45:40
simple, just as it wasn't when you say, hey,
45:42
you kids, sit down and have a talk and everything
45:45
will be fine. It's rarely
45:47
that simple. I think that you can change
45:49
minds with with telecommunication
45:52
missives. I mean, I would hope
45:54
that you can, because otherwise we've been spending
45:57
a whole lot of time sitting in the studio over the past few
45:59
years for Forward Thinking if we're just talking to
46:01
people who would already agree with us. Uh.
46:03
And I have seen in comment threads
46:06
on on YouTube and Facebook people
46:08
say, oh, I didn't know that before, thank
46:10
you, thank you for telling me about it, like, you
46:12
know, like, I would have been ignorant and I would have
46:15
kept on going doing what turns
46:17
out to be this harmful behavior if I had not known about
46:19
this thing. Um. And also
46:22
uh, on,
46:25
again, a personal, person-to-person kind of
46:27
basis. Um, the Internet has allowed
46:30
uh, people who would not have a support group
46:32
in their local area to have a support
46:35
group for um, you know,
46:37
if they're gay and they're in
46:39
a very anti homosexual, homophobic
46:41
kind of town, then they
46:43
have that support and maybe they get
46:45
to continue living their lives and live happier and
46:48
get out of there. Yeah, on a
46:50
on a much um
46:53
less uh impactful
46:55
scale. I mean, the thing that I think of is
46:58
what it was like. See, I grew
47:00
up in the eighties, and uh, the
47:02
stuff I was interested in, None of my friends
47:05
were really particularly interested in the same
47:07
sort of things I was interested in, and so I didn't
47:09
really have people to chat
47:11
with about the things that I was really passionate
47:14
about. Uh. And then I would end up
47:16
going to conventions with my dad,
47:18
you know, my science fiction author dad
47:20
and run into peer groups eventually,
47:23
But I'm talking about the early eighties, so you
47:26
gotta you gotta walk before you can run.
47:29
Um, running it
47:32
was, compared to what was happening before. So
47:36
but but the point being that that the
47:38
conventions gave me a chance to chat with people who
47:40
were like minded, who enjoyed the same things I
47:42
enjoyed. And then eventually the Internet
47:44
allowed that on a much grander scale
47:47
where I could see like, oh, they're all these people who share
47:49
the same interests I have, but I never
47:51
had an opportunity to chat with them because they don't
47:54
happen to be near me and same thing
47:56
with them, they have the same experience. Uh.
47:59
And obviously, again that's that's
48:01
tiny on the scale of something like someone dealing
48:03
with uh, you
48:05
know, intense homophobia or racism
48:08
or whatever it may be, like some sort
48:10
of prejudice against them for whatever
48:12
reason. It's very different, but
48:15
um, it makes it makes
48:17
a difference in a person's self
48:21
image. They stop asking
48:23
themselves like am I am I weird?
48:25
Am I alone? Am I terrible? And that
48:29
I think is huge. I
48:31
mean it's huge on an individual to individual basis,
48:33
but collectively you have to say, like, that's
48:36
fantastic to take away
48:39
that burden that some people feel
48:41
because they don't fit whatever their community
48:43
has identified as the norm.
48:46
On the flip side, it
48:48
can. I mean, there are still people who
48:50
are jerks, and they can also congregate on the
48:52
internet and kind of enter into
48:54
a positive feedback loop where
48:56
they are told
48:59
that racism or whatever
49:01
it is, is okay and is accepted by their
49:04
peer group. Yeah, that is not so good.
49:06
That's very not good. But yeah, I mean
49:09
I'm thinking about so
49:11
in the broader sense of creating global
49:14
harmony. I do think there is
49:16
some of what, I don't know,
49:18
you know, the people who spoke
49:20
the rhetoric of the electrical sublime, what
49:22
these people had in mind, connecting brother
49:25
with brother across the world. Uh,
49:27
people forming bonds
49:29
they would not have formed in physical space.
49:32
I think that's certainly true. But I also think that there
49:35
is global antagonism
49:38
that would not have existed otherwise.
49:40
And so I wonder if essentially,
49:42
I guess what I'm trying to say is is
49:45
there any way to figure out if there has been a
49:47
net change or have we
49:49
just sort of like settled into a
49:51
new wider equilibrium that's
49:54
about the same as it was before. Well,
49:56
I think the thing is is that, I mean, technology
49:58
certainly changes us, but it
50:00
doesn't change very basic
50:03
parts. Or, no technology that we've had yet
50:05
has been big enough to change extremely
50:08
broad parts of the human experience
50:11
or human nature, like the fundamental elements
50:13
of being humans. Yeah, and and unfortunately,
50:15
like being a jerk is kind of one of those, it's
50:18
on the list. I would also argue that your
50:20
question is impossible to answer, and the reason why
50:23
it's impossible to answer is because we don't have
50:25
a separate pathway that we could judge
50:27
against. Right if we had,
50:29
If we could peer into a parallel
50:32
universe where telecommunications were never developed,
50:34
but the human race continued on to
50:36
their twenty sixteen, and we could
50:38
compare their twenty sixteen to our twenty sixteen,
50:41
maybe then we could draw at least some conclusions,
50:44
knowing that there's still thousands,
50:46
millions of other variables at play.
50:48
But without that, it's impossible, right
50:51
because we live in the world that
50:53
we forged, and so we can't
50:55
really say what it would be like if we had gone a different
50:57
route. Uh, that being said, it
50:59
is a fascinating thing to think about. I mean, I
51:03
also, like you guys, believe in
51:05
the power of telecommunications. If I didn't, I
51:07
would not work here. I would be doing
51:09
something else. But I very much believe
51:12
in the power to do good with it. I
51:14
know, and without denying the fact that you can also
51:16
do evil things with it, absolutely
51:19
you can. But it's such
51:21
a powerful tool that if enough
51:23
people choose to do good with it, I think
51:25
you can't help but enact a positive
51:27
change in the world. And that's what I strive
51:29
for. And uh, I
51:32
guess to sum it up, as Shakespeare would
51:34
say, nothing is good or bad, but
51:36
thinking makes it so. Okay,
51:39
Yeah, I like your, I like your Southern
51:42
Shakespeare. Hey, to be fair, in Shakespeare's
51:44
time, the accent was closer to Appalachian
51:46
English than any other accent. Wow.
51:49
Yeah, Well, you know that we've
51:52
been talking a lot already. We
51:54
still have a ton more that we want to cover in
51:56
this, uh, in this this whole topic,
51:59
but we're super chatty, so
52:01
we're going to end it here. So
52:04
we've got another one from Joe, and then we have
52:06
Lauren's favorite kind
52:08
of futuristic predictions to talk about, and
52:10
we're gonna save that for our next episode. Guys,
52:13
if you have any suggestions you would like to
52:15
give us for future episodes, you
52:18
can write us our email addresses FW
52:20
thinking at how Stuff Works dot com, or drop us
52:22
a line on Twitter or Facebook. We are FW
52:25
thinking on Twitter. You can search f W thinking
52:27
on Facebook and our page will pop right up. You can leave us a message
52:29
and we will talk to you again about our favorite
52:32
predictions of the future, like
52:35
in a couple of days, I predict. For
52:41
more on this topic in the future of technology,
52:44
visit forward thinking dot com
52:57
brought to you by Toyota. Let's
52:59
Go Places,