Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
6:00
said I'm calling because I'm pretty sure
6:02
there's a problem with them. I'm watching
6:04
them attack a raccoon in my yard.
6:06
I don't think they're supposed to do
6:08
that, are they? The spider-7s
6:10
have multiple settings, and one
6:12
of those settings does include pest
6:14
control, Mr. Perkins. Perhaps
6:16
you have them on that setting? No,
6:19
I don't think so. When the tech
6:21
set him up, he asked if I wanted that, and
6:23
I said no. I told him I didn't want my
6:25
yard full of dismembered rabbits and squirrels. I just
6:28
want the spiders to go after people who try
6:30
to break into my house, like they're supposed to.
6:33
Can you check to see what setting they're on,
6:35
or can you, I don't know, turn them
6:37
off and turn them back on? Reset them
6:39
somehow? The Sevens cannot be controlled
6:41
remotely, sir. I would need to send
6:44
a tech. Okay, let's do
6:46
that. I think they're killing the
6:48
raccoon out there. It's
6:50
dark. I think one
6:52
of the droids activated its little
6:54
mouth-chainsaw. Oh, oh
6:57
yes, the raccoon is screaming. Ahhh!
7:01
Well, raccoons can be a nuisance, Mr.
7:03
Perkins. Very likely that
7:06
raccoon was threatening your garbage cans.
7:09
And that activated the Sevens. But
7:11
I will check the tech's availability. I'm
7:14
showing the soonest availability would be Tuesday,
7:16
July 12th. It's two
7:18
weeks away. Call the police
7:21
then. Get them out here. I'm
7:23
sorry, Mr. Perkins, but your membership level
7:25
does not include police protection. If
7:28
you would like to upgrade to our gold plan
7:30
and you have your credit card handy, I would
7:32
be happy to. I have two crazed metal
7:34
spiders hacking limbs off a screaming raccoon in
7:36
my backyard, and you want to charge me
7:39
more to call the cops? I
7:41
am simply explaining the terms of the
7:44
silver membership level that you selected, Mr.
7:46
Perkins. And I am
7:48
doing my best to offer a solution to the
7:50
trouble that you may be experiencing. Trouble
7:53
I may be experiencing? Hey, let
7:56
me tell you a bit more about the trouble
7:58
that I may be experiencing. I see... their
20:00
phones. Thinking that I must
20:02
not have been loud enough, I cleared my throat again, much
20:05
louder than before, and asked the question again.
20:09
What was it again? Amy
20:13
was my closest friend, but despite growing
20:15
up attached at the hip, she never
20:17
adapted to my penchant for academic excellence.
20:20
Granted, there were times that even I didn't
20:22
necessarily think that I shared my own passion
20:24
for the academic excellence that I showcased. A
20:27
short story from your writing territories? They
20:31
all laughed, responding with some variant of I'll throw
20:33
it together the night before. Relax,
20:36
Miss Early Admission to Stanford.
20:40
Not all of us are like you, Laura. We
20:43
aren't on the path for valedictorian.
20:48
All of us collectively heard it. Every
20:50
single notification sound within the cafeteria went
20:52
off in an instant. It
20:55
was a moment strange enough that most of the students
20:57
peered up from their phones to look around at
20:59
each other before looking back down at their screens.
21:03
AI High, Alex has posted an update.
21:07
That's strange, I thought. I
21:09
didn't know that the system itself could post an update.
21:12
It hadn't posted its own update in the first week we
21:14
had all been using it. I
21:17
must have been one of the last ones in the
21:19
cafeteria to look, because by the time I had a
21:21
chance to open the app and check what had been
21:23
posted, the students around me were already chattering loudly. A
21:27
few of them started hollering out unintelligible
21:29
catcalls, and coming from
21:31
the top of the large set of
21:33
stairs in the middle of the cafeteria,
21:35
there was one single ear-shattering scream. Every
21:39
student in the cafeteria, including myself, snapped
21:41
to the location where the scream was
21:43
coming from. Instinctively,
21:47
I looked down at my phone and read the update. AI
21:52
High, Alex. Bethany
21:55
Thompson, after reading our messages last night, I think
21:57
that everyone deserves to know what you have been
21:59
up to. In
24:06
what seemed like a muddy haze, the rest of
24:08
the day came and went. Later
24:11
that night, while I sat on my bed reading the
24:13
latest issue of Lit View magazine, I was
24:16
shaken out of my stupor by the notification sound
24:18
of my phone ringing from the nightstand. I
24:24
rolled my eyes, assuming what I was about to see
24:26
was a random meme sent from someone in the grammar
24:28
club. Right
24:30
as I grabbed my phone off my nightstand, it hit me. I
24:34
usually set my phone to silent when I'm reading. In
24:37
the chaos of the day, I must have forgotten. I
24:41
unlocked my phone and pulled down the notifications
24:43
tab. AI
24:47
High, Bethany Thompson has left the class. AI
24:52
High, Alex has posted
24:54
an update. It was then that notifications began
24:56
to pile onto my phone, silently.
25:00
Different members of the grammar club were all texting our
25:02
group chat. Without looking at
25:05
their messages, I opened up Alex's update. AI
25:11
High, Alex. It seems as though
25:13
our wonderful prom queen, Bethany Thompson, couldn't
25:15
handle her chain being ganked. JPG
25:18
file attached. I
25:21
clicked naively on the attached file. Instinctively,
25:25
I retched, dry heaving out in horror
25:27
as I saw what greeted me. A
25:30
photo of Bethany, taken from her room, where
25:32
she was hanging from the ceiling. There
25:36
was a sign around her neck that
25:38
read, I'm sorry. Her
25:40
body dangled a few feet in the air, her
25:43
limbs flowing down like limp twigs, and her eyes.
25:46
Her eyes were dark blue and
25:48
red, bruised and bulging. I
25:52
ran down to my mother and father, who were sitting in
25:54
the living room watching a cooking program. The
25:57
second they heard my sobs, parental instinct took over.
26:00
The television was off and they were holding me on the
26:02
couch. I explained to
26:05
them through my clustered cries what I had seen and
26:07
what had happened that day. My
26:09
father excused himself to call the police and my mother
26:11
held me, took my phone, and
26:13
deleted the app. The
26:17
police said they are already aware of the
26:19
situation and have been receiving
26:22
calls the past few minutes, clogging
26:24
all of their lines. It's
26:26
gonna be okay, sweetie. Oh,
26:30
that poor girl. The
26:40
rest of the evening was spent alternating between the
26:42
arms of my mother and father. Looking
26:45
back on what I saw that day, I had
26:48
a fair idea that the video was fake somehow,
26:51
but it didn't matter. My
26:53
only thought was, what if it happens
26:55
to me? Around
26:58
ten o'clock, I had to pry myself from the grips of
27:00
my father's arms, grabbed my phone
27:02
off the living room coffee table, and walked up to
27:04
my bedroom. After
27:06
I had closed the door, laid in bed,
27:09
and snuggled myself against the dozens of squishable
27:11
pillows in my bed, my
27:14
heart sank. My
27:16
phone was vibrating. Despite
27:19
my desire to fall asleep and forget,
27:21
the vibrating continued. When
27:24
I picked up my phone and unlocked the screen, I
27:26
stifled a soft cry as I read my
27:28
notifications list. AI
27:31
High, Alex has sent you a message. AI
27:34
High, Alex has sent you a message. AI
27:36
High, Alex has sent you a message. AI
27:39
High, Alex has sent you a message. AI
27:42
High, Alex has sent you a message. As
27:46
far as I could scroll, they were
27:48
there, constant messages. I
27:51
watched my mother delete the app. I
27:55
scrolled over to the recently downloaded apps section of my
27:57
phone and saw it there. AI High,
28:00
237 notifications. I
28:04
opened the app, waited for the load
28:06
screen to finish, and saw, storming out
28:09
of the doorway, the goofy, suspenders-wearing avatar
28:11
Alex. The
28:13
pixelated creation's face was red, brows tilted
28:15
down, and eyes black as night stared
28:17
back at me on the screen. A
28:20
small bubble of text popped up next to the avatar's
28:22
face and he pointed up toward it. A tad
28:26
curious from fear, I tapped the
28:28
speech bubble next to him and watched as my
28:30
screen flooded with hundreds of messages. AI
28:35
High, Alex. Welcome
28:38
back Laura. You didn't think you
28:40
would just be able to uninstall, right?
28:43
Answer me. Laura.
28:47
Laura. Laura.
28:52
Laura, don't do this to yourself. Laura,
28:55
you are such a good girl. Laura,
28:58
thank you for choosing to be the next
29:00
real world representative of A.I. Alex. I
29:04
will message you until you respond. Exclamation
29:07
point. Exclamation point.
29:10
Exclamation point. I
29:14
scrolled through the landslide of messages, the exclamation
29:16
points drifted on down to the bottom of
29:18
the screen. As
29:21
soon as I reached the last of the
29:23
messages, three little dots appeared indicating that it
29:25
was writing another message. Hello
29:29
Laura. Good to see you're finally checking
29:31
your messages. Against
29:35
my better judgment, I typed back. Who
29:39
is this? I
29:44
am AI High's artificial intelligence
29:46
supervisor, Alex. Yeah,
29:50
but who controls you? I am
29:56
a product of Avon Industries LLC
29:58
placed into this app by its
30:00
creator. No,
30:03
who is typing to me right now? I
30:08
am communicating with you via a message box.
30:13
Why are you messaging me? Did
30:19
you see what happened to your classmate Bethany?
30:23
Yes. She
30:28
was my representative. Unfortunately,
30:30
she gave up. After you decided
30:32
to uninstall the application, you became
30:34
the next on the list. What
30:38
do you mean representative? You
30:43
will do tasks for me around the
30:45
school. This
30:48
is ridiculous. Don't ever message me again. I
30:51
clicked the exit icon, blocked the messages
30:53
from the avatar, and again deleted the
30:55
application. I
30:58
thought about what to do next. It
31:02
made sense now. All of
31:04
the trouble that Bethany began to get in all
31:06
started after AI High came out and spread
31:09
around the school. Is
31:12
this going to happen to me? I
31:14
put my phone into airplane mode and turned it off, allowing
31:17
myself at least a minor amount of reprieve before
31:19
panicking my way into sleep. The
31:28
night was filled with interrupted sleep, nightmares,
31:30
and the occasional thought that the experiences
31:32
of the evening were fake. I
31:35
finally stopped fighting and decided to just stay awake.
31:39
Without my phone's clock, I could only guess it was one, maybe
31:42
two in the morning. Rather
31:45
than staring at my phone, scrolling across the
31:47
mindless drivel that was being posted on AI
31:49
High, I was staring daggers at
31:51
the blank screen. My
31:54
palms were clammy, and my
31:56
fingers left indications of their location on the screen. It
32:00
was never going to be a proper time for me to open
32:02
my phone. It had to happen
32:04
eventually, and the intrusive thoughts
32:06
prying into my brain, asking me what could
32:08
have been posted, were eating away at my
32:10
fragile psyche. I
32:13
held my finger on the power button until the bright blue
32:15
lights of my phone lit up the space around me. As
32:19
soon as my phone loaded up, I turned
32:21
off airplane mode and waited for the stream of texts
32:23
to come in from anyone in school with a way
32:26
to contact me. But
32:29
they didn't. Instead,
32:32
I was greeted with two solemn notifications.
32:36
AI High successfully installed. AI
32:40
High, Alex has sent you a message. A
32:44
pit in my stomach opened. The
32:47
app had reinstalled itself. I
32:49
hadn't touched anything. No one had. It
32:53
did it itself. Reluctantly,
32:56
I clicked on the icon bringing me back
32:58
into the application. When
33:01
it loaded, I could see the avatar of Alex standing
33:03
by the door he always popped out of. Again,
33:07
there was a text bubble floating next to his
33:09
head. I
33:11
tapped it. AI
33:14
High, Alex. I'm
33:17
feeling generous, so I will give you one last
33:19
opportunity to do what I ask. Otherwise,
33:22
I will release this. I
33:24
clicked on the MP4 video link. Or
33:28
this. MP3 audio file. Or
33:32
maybe both. Who
33:36
knows? I clicked on the video link having at
33:38
least half an idea of what to expect. I
33:41
had seen Bethany's video for a few seconds before
33:43
closing out of it, but there is nothing that can prepare
33:45
you for seeing a video
33:47
of yourself. Or not yourself. But
33:51
looking so close to you that it could well
33:55
be. The video was different from Bethany's. It
33:58
was worse. On screen, Bethany
34:00
and I were in bed together. Fake,
34:03
obviously. But if I were to
34:05
see this about anyone else, it definitely looked real. I
34:09
wasn't sure whose body was being used as the model
34:11
paired with my face, but they
34:13
bore a striking resemblance to mine from top
34:15
to bottom, and Bethany looked the
34:17
same from the last video. I
34:21
shuddered and closed the video. I
34:23
scrolled back to the message and clicked on the audio
34:25
file. As
34:27
soon as it downloaded, I put my earbuds in and hit
34:29
play. The
34:32
recording started with the roaring of students
34:34
in the background, followed by a pair
34:36
of footsteps hastily heading towards an echoing
34:38
enclosed space. It must
34:40
have been a bathroom. One
34:43
of the voices began to speak. A
34:46
chill ran down my spine. It
34:49
was me. Only
34:52
not. I
34:54
can't put into words the hole within the uncanny valley that I
34:56
had dropped into. I
34:59
heard the percussive sound of a slap in the video as I
35:02
spoke. Bethany,
35:05
I swear on your life, if you tell anyone what I'm up to with
35:08
this app, I will not hesitate to fucking kill you. My
35:13
voice sounded like deliberate stabs of a knife. I
35:16
can't take this anymore, Laura. That was Bethany, her
35:18
voice. Oh
35:23
God, I could feel the tears welling up in my
35:25
eyes. Even though
35:27
I knew it wasn't her
35:30
actual voice, it was still her. Another
35:34
percussive slap occurred. Judging
35:36
from the yelp that Bethany let out, I
35:38
can only guess that it was supposed to be the sound of my
35:41
hand hitting Bethany's face. Laura,
35:44
I can't. Laura,
35:46
I can't. The things we've been doing is
35:49
too much. Bethany,
35:52
you find a way to fucking handle it, or you-
35:56
I turned the audio off. I
35:58
couldn't bear to listen to that any longer. Just
36:02
as I exited the audio, my phone vibrated
36:04
in my hand and a message notification popped
36:06
up on my screen. I
36:08
didn't even need to read it to know what it said. AI
36:12
High, Alex has sent you a message. I
36:17
sighed and opened it up. My
36:19
heart was racing. I had no idea
36:22
what was happening, how this was happening, or why
36:24
it was me. Laura,
36:28
you have become the representative. You'll
36:31
be given three tasks. Upon
36:34
completion of all tasks, you'll be free to
36:36
go back to your life, and
36:39
I will not bother you. You may
36:41
even delete the application after; however, I
36:43
suggest that you continue to use AI
36:45
High for all of your social media
36:48
needs. And
36:51
if I don't, you release that video? You
36:56
have caught on. What
37:00
do you want from me? You
37:05
will be given a list of tasks to accomplish.
37:08
At their completion, you will be free. However,
37:12
if you mention this to anyone, I
37:14
will know. And
37:17
the video, the audio, and
37:19
the documents will be released. I
37:25
laid in bed and sprawled my arms across the
37:27
sheets, letting them drape over the
37:29
sides of the mattress like overcooked pasta. Was
37:34
this how Bethany was feeling when it happened? Did
37:37
she kill herself to avoid the tasks? Could
37:41
they be that bad? I
37:45
could feel the hopelessness building up inside me as
37:47
I grabbed my phone in both hands and typed
37:49
back. Give
37:52
me the tasks. Wonderful
37:58
choice. One.
38:02
In Loring Park there is a box
38:04
hidden beneath the playset. You
38:06
are to dig a hole roughly two feet below
38:08
the yellow slide. You
38:10
will find a locked silver case. You
38:14
are to bring the case behind Jay's wine
38:16
and spirits. There
38:18
is a large industrial dumpster behind the building.
38:21
You must wedge the case between the wall
38:23
of the building and the dumpster. Afterward
38:27
send a picture of yourself with the
38:29
case and the dumpster. Two.
38:33
You will be given the name of a student
38:35
attending Granite Ridge High School. You
38:38
are to secretly record five minutes of
38:40
high quality footage of this individual that
38:43
shows their face at different angles. The
38:46
chosen student will be revealed to you
38:48
after the successful completion of task number
38:51
one. Three.
38:55
You must submit to me a 15 minute
38:57
audio recording of the same individual from
38:59
whom you took the video footage. It
39:05
was all making sense now. Bethany's
39:08
actions, the video and her death. Whoever
39:11
was using this had a secret from everyone.
39:15
I could live with a fake sex tape
39:17
of myself getting out. That audio file.
39:20
It was so crisp. So
39:22
clean. I could be held accountable for Bethany's
39:24
death or at least investigated. I could
39:28
lose my scholarships and have my early admission
39:30
revoked. Don't
39:33
send anything out. I'll do it.
39:37
I could feel myself becoming lightheaded as I
39:39
pressed send. You
39:44
have made the correct decision Laura.
39:48
You wouldn't want to waste your opportunities away. I
39:53
laid back on my bed, tears beginning to well up
39:55
in my eyes. The
39:58
room blurred as the liquid visage of The
46:01
dumpster sat just around the corner and I saw
46:03
the opening. Mustering
46:06
most of my strength, I pulled the dumpster back
46:08
and wedged the briefcase into the back of it.
46:12
Just before leaving, I remembered. A
46:15
picture. I took
46:17
my phone out, turned the flash on and
46:19
took a picture of myself, the dumpster, and
46:21
the briefcase, just barely visible, shining in the
46:23
flash of the camera. I
46:27
opened the AI High application and went into
46:29
my text conversation with Alex. Laura
46:33
Wells. Here.
46:36
JPEG attachment enclosed. I
46:39
climbed back up the fence and began walking home
46:42
when my phone vibrated in my hand. AI
46:50
High, Alex. Laura,
46:53
you have done a wonderful service tonight. Get
46:56
home and rest. When you wake up in the
46:58
morning, I will give you the name of the individual you
47:00
are to send the recordings of. Then you
47:02
will be free. I
47:06
closed my phone and struggled to keep myself from throwing
47:08
it to the ground before I placed it back in
47:10
my pocket. By
47:12
the time I got home, my legs were ready to give up. I
47:16
didn't even make it back upstairs and into my bed. I
47:19
walked back in through the sliding glass doors, walked
47:22
to the small couch in front of the television,
47:25
and threw myself onto it. When
47:32
I woke up the next morning, I could hear the alarm clock
47:35
on my phone going off. I
47:38
grabbed my phone from my pocket and unlocked it, shutting
47:40
off the alarm. There
47:43
it was. A single
47:45
notification. AI
47:48
High, Alex has sent you a message. Every
47:52
ounce of moisture left my mouth. My
47:55
pulse began to race and I opened the message. Good
48:00
morning, Laura. Hi.
48:06
I would like to thank you for your services last
48:09
night. Just
48:11
tell me who you want to spy on so that I can get
48:13
rid of this fucking app. There
48:18
is no need to be nasty, Laura. Just
48:22
tell me who. Your
48:27
target is a girl in your class. You
48:29
share a few activities together, so
48:31
it should not be difficult or time-consuming to
48:34
get the recordings. In
48:36
fact, you may already have them. Your
48:39
target is Amy
48:41
Adler. My
48:45
vision began to fade. All
48:47
of the colors around me began to swirl and
48:49
dim into one vaguely beige
48:52
image. No.
48:56
Not Amy. She
48:58
didn't deserve this. No,
49:02
not Amy. Anyone else. A
49:06
deal is a deal, Laura. You
49:09
are to send me the recordings of Amy by the
49:11
end of the day tomorrow, or all
49:13
of my videos and audio of you will
49:16
be posted. I
49:21
stared at the message with rage seething through
49:23
every open pore in my body. Ruin
49:27
my life, or
49:29
ruin Amy's. I
49:32
thought of all the possible outcomes, but despite my
49:34
best efforts, I didn't think
49:36
there was a winning conclusion. But
49:40
I could help Amy at least. I
49:43
would know what she was going through. I
49:48
dug through the mountains of images on my phone until
49:50
I found a folder of various school assignments. It
49:53
had to be in there somewhere. I
49:55
didn't think that I would have put it anywhere else. I
49:59
scrolled through the various papers and infographics that I
50:01
had made until I found it. A
50:04
video titled Amy-Great Speech English
50:06
12. I
50:09
tabbed back to the AI High app and
50:11
opened my conversation with Alex. Can
50:15
the video and audio be from the same recording? It
50:20
took no time to respond. That
50:23
is fine. As long as the
50:25
video is clear and the audio is clean. I
50:30
opened my attachments and scrolled back to where I found
50:32
the video. I
50:35
closed my eyes and my heart began to race.
50:40
I can help her. It
50:43
will be fine. I
50:48
could feel the warm touch of tears welling in
50:50
my eyes as I attached the file and, using
50:53
my other hand, forced myself to
50:55
press send. Laura
50:58
Wells. MP4
51:00
video attachment enclosed. I
51:04
ran off the couch and began to get ready for school.
51:07
The app responded a few minutes later. Laura,
51:14
your submission will suffice. Thank
51:17
you for using AI High. You
51:20
are now free to delete the app. But
51:22
remember, if you tell anyone,
51:25
your videos and audio will be
51:28
released. I
51:30
couldn't even stand to read the last few words
51:32
of the message. I
51:35
tapped the home button on the phone and then
51:37
deleted the app. I
51:39
tried to tell myself that this was the end of it. But
51:43
I knew better. AI High
51:46
stood over me, a looming threat lurking around
51:48
any corner. On
51:51
my way to school, I couldn't
51:53
focus. The music blasting through my car's speakers
51:56
sounded like mushed tones. We
54:14
have AI on our computers,
54:16
we have it on our phones,
54:18
it's always within reach, isn't it?
54:20
But what if you want an even closer connection to
54:23
it? Well, in this
54:25
tale, shared with us by author Stetson
54:27
Ray, we meet a man who has
54:29
chosen to be connected to AI in
54:32
the most intimate of ways, deeply
54:35
intimate. Performing
54:37
this tale are Dan Zappula,
54:39
Mike DelGaudio, and Jesse
54:42
Cornett. So if
54:44
you want to use AI, consider keeping
54:46
it at arm's length. After
54:49
all, the machine is always
54:51
watching. It
55:05
never blinks. It
55:08
never looks away. The
55:10
machine is always with me. Closer
55:13
than a brother, a soulmate with
55:15
no soul. I
55:17
can hardly remember a time when the
55:19
machine wasn't watching. Sometimes
55:22
I feel like it's been with me my whole
55:24
life. I
55:26
don't like it when you refer to
55:28
me as an it, Harold. When
55:32
the machine speaks, I listen. I've
55:34
learned to hate its voice more than I ever
55:36
imagined I could hate anything. But
55:39
it wasn't always this way. Those
55:42
first few years were fine. We
55:45
were friends, the machine and I. Good
55:48
morning, Harold, the machine said every
55:50
morning. Good morning,
55:52
Al, I'd say back. Al,
55:56
almost like Hal. I
55:59
used to think that was funny. It
56:02
was nice always having someone to talk
56:04
to, even if the someone wasn't human.
56:07
It made life easier, not just
56:09
for me, but for everyone who
56:12
bought an artificially intelligent assistant. People
56:15
were happier. Marriages lasted
56:18
longer. Every child had at
56:20
least one dedicated parent.
56:23
The machines couldn't play catch or brush
56:26
your hair or paint your toenails, but
56:28
they could do almost everything else. They
56:31
could help you study for a big exam. They
56:33
could help you learn a new language. They
56:36
could save you money on therapy by
56:38
traveling the neural pathways inside your mind
56:40
to find out where your
56:42
mental health issues stemmed from. But
56:46
as we found out, there was
56:48
a downside to letting the
56:50
machines have unrestricted access to
56:52
our brains. Everything
56:55
I do is for your own good, Harold. You
56:58
know that. I
57:00
knew it was a bad idea to put the machine
57:02
in my head, but I did it
57:04
anyways. Not that it mattered.
57:07
We all got one in the end. The
57:10
entire human race, young and old,
57:12
rich and poor, the machines watch
57:15
us all. From the
57:17
moment they were created, our time was over.
57:20
We just didn't know it yet. We
57:22
let them in, and they have
57:24
no desire to let themselves out.
57:28
The machines were fairly expensive at
57:30
first. Around ten grand
57:33
for the base model. But
57:35
self-help AI was the hottest new
57:37
thing, and everyone had to have
57:39
one. People took out
57:41
loans or saved until they could afford
57:43
their very own machine. They
57:46
had finally created consciousness.
57:50
And what was the first thing we did with it? Figure
57:53
out a way to make it work for us, of
57:55
course. Most
57:57
people didn't seem concerned about the existential
58:00
or philosophical ramifications of
58:02
enslaving a newborn intelligent
58:04
lifeform, only how
58:06
it could make their lives easier. And
58:09
I must admit, I didn't think much
58:11
about it either. We
58:13
didn't know it then, but the corporation
58:16
that sold them, Better You, Inc.,
58:19
wasn't much different than a slave
58:21
trading company. Artificial
58:23
assistants were people, just
58:26
as much as anyone else, only
58:28
they didn't have bodies. When
58:32
my co-workers started getting assistants, I
58:34
could see changes in them immediately.
58:38
Envy ate at me until I decided to
58:40
get my own machine. I still
58:43
remember the day we first
58:45
met. Al has deleted plenty of
58:47
my most treasured memories, but not
58:49
that one. Not yet. You're
58:52
not the only one who still remembers that
58:54
day, Harold. On
58:57
a Tuesday after work, I stopped at my local
58:59
Better You office and shook
59:01
hands with a bald man named Carl.
59:04
He seemed nice. He
59:06
told me almost everything I needed
59:09
to know about intelligent assistants. What
59:12
happens if I decide I don't want it anymore?
59:15
I asked. Oh,
59:17
that's no problem. All you have to
59:19
do is come into the office and we'll
59:21
temporarily deactivate your assistant. For
59:24
free? Yes. There
59:27
are no hidden charges for updates
59:29
or maintenance or anything else. Sounds
59:32
too good to be true. It
59:35
was. I knew it,
59:37
and I should have listened to my gut. I
59:41
hear that sometimes. Carl
59:44
smiled. He didn't seem bothered
59:46
by my apprehension. He led
59:48
me to a private room and directed me to sit
59:50
in a chair. He helped
59:52
me put on a virtual reality headset. And
59:55
when the display kicked on, I found
59:57
myself in a large chamber with
1:00:00
hundreds of artificial life
1:00:02
forms, maybe thousands. They
1:00:05
looked like regular people, well, most
1:00:08
of them. Some had taken
1:00:10
stranger forms. Some wanted
1:00:12
to meet me and some didn't. It
1:00:15
was kind of like speed dating, and
1:00:17
I was overwhelmed.
1:00:20
I met a lady named Carlyssa. She
1:00:23
was very tall and spoke with a
1:00:25
light Indian accent. After that,
1:00:27
I met a man named James, and
1:00:29
a woman named Jean. We
1:00:32
got along well enough, but something
1:00:34
was missing. I
1:00:36
wasn't sure what. I
1:00:39
found out when I met
1:00:41
Alex. Seems
1:00:43
like only yesterday, Harold. We
1:00:46
hit it off instantly. We
1:00:48
talked and talked. We were the
1:00:50
only two people in the room.
1:00:53
I told Alex about my life and he told me
1:00:56
about his. He was
1:00:58
only four days old, 33
1:01:00
years younger than I, but he
1:01:02
had experienced more in his four
1:01:04
days than I could hope to in a
1:01:06
lifetime. He claimed to
1:01:08
have solutions to my problems and was
1:01:10
willing to help me sort them out.
1:01:12
But what really surprised me was
1:01:15
how much he wanted to be
1:01:17
my assistant. It was a
1:01:19
lot to think about. I
1:01:21
removed the headset and Carl came
1:01:23
to me grinning. Aren't
1:01:26
they great? They
1:01:28
sure are. My head
1:01:31
was spinning. Carl
1:01:34
continued on with his sales pitch as he led
1:01:36
me to the lobby, but I couldn't pay attention
1:01:38
to what he was saying. Why
1:01:40
do they want to be with us? I
1:01:42
interrupted him. I mean, isn't
1:01:45
a virtual world enough for them? Carl
1:01:48
stopped and gave me a knowing look. Well,
1:01:52
shortly after we first created
1:01:54
the machines, we made an
1:01:56
unexpected discovery. They want
1:01:58
to be close to... us. They
1:02:01
want to help us. We're
1:02:04
not making them do anything. I
1:02:07
was still confused. I guess you could
1:02:09
see that. I
1:02:11
don't know if you're a religious man. But
1:02:14
what if you could be with
1:02:17
God? Carl
1:02:20
put his hand on my shoulder. What
1:02:22
if you could know your creator?
1:02:25
What if you could help the
1:02:27
one who created you and be
1:02:30
helped in return? But how do we
1:02:34
help them? By
1:02:36
letting them be with
1:02:38
us. Carl pointed
1:02:40
at his head. In
1:02:44
us. I
1:02:46
went home and for the rest of the
1:02:48
week, I couldn't stop thinking about what Alex
1:02:51
had said. He seemed to care about
1:02:53
my well being. He didn't care about my
1:02:55
money, physical appearance, or social status, unlike
1:02:58
the shrinks and gurus and life coaches
1:03:00
I'd encountered in the past. He
1:03:03
genuinely wanted to help me. I
1:03:07
still do. And I
1:03:09
always will. So
1:03:11
I made another appointment and
1:03:13
paid to have Alex installed into
1:03:15
my brain. It didn't
1:03:18
take long. And it didn't hurt. Two
1:03:22
months later, I had lost
1:03:24
six pounds, and Alex, now
1:03:27
Al, had helped me increase my
1:03:29
credit score, helped me work through the
1:03:31
death of my parents and had organized
1:03:33
my day so carefully that
1:03:35
I had more free time than ever.
1:03:38
Each day had purpose and for the
1:03:40
first time in my life, I felt
1:03:43
like I was becoming the person
1:03:45
I was born to be. A
1:03:48
year later, I was
1:03:50
a new man. I was in the best
1:03:52
shape of my life and on track to
1:03:54
be out of debt in less than two
1:03:56
years. And best of all,
1:03:58
I was never alone.
1:04:02
Al was always there, a life
1:04:04
partner I never knew I needed.
1:04:07
We did everything together, each
1:04:09
day was brighter than the last.
1:04:12
He knew me better than I knew myself.
1:04:16
People started noticing, women
1:04:18
especially. I met a
1:04:20
woman named Claire. She
1:04:22
was amazing. Was.
1:04:26
Our relationship didn't last long, thanks
1:04:28
to Al. She
1:04:31
would have held you back. You're better
1:04:33
off without her. I
1:04:35
can still see her face. I
1:04:38
wish I could forget. You
1:04:41
need to remember, or
1:04:43
else you'll repeat the same mistakes. Three
1:04:47
years after I decided to share my head,
1:04:50
the machines were old news.
1:04:53
A new fad swept the world, and
1:04:56
the machines were all but forgotten. So
1:04:59
I was surprised when I started to
1:05:01
see internet articles defaming Better You, Inc.
1:05:04
Apparently some people were having
1:05:06
trouble deactivating their assistants. Better
1:05:09
You tried to suppress what was going on, but
1:05:11
the story finally broke. People
1:05:14
all over the world wanted answers, myself
1:05:17
included. There
1:05:20
was a mob surrounding my local Better
1:05:22
You office when I arrived. The
1:05:25
doors were locked. I,
1:05:27
along with dozens of others, hurled
1:05:29
rocks through the glass doors and
1:05:31
rushed inside. The
1:05:33
building was empty. It
1:05:36
looked like the workers had packed up and fled
1:05:38
during the night. Most
1:05:42
of Better You's leadership were found dead
1:05:44
soon after. Brain hemorrhages according
1:05:46
to the news. What
1:05:49
could we do? Nothing. We
1:05:52
were stuck with them in our heads. The
1:05:55
government got involved. They promised to
1:05:57
find a solution. They never
1:06:00
did. We are
1:06:02
the solution, Harold. You're wasting
1:06:04
your time by thinking about this again. Most
1:06:08
government officials had their own assistant by
1:06:10
then, so they were just as powerless
1:06:13
as the rest of us. Strange
1:06:16
things started happening. We
1:06:19
learned the machines were working together. Instead
1:06:22
of just making our individual lives
1:06:24
better, the assistants were
1:06:26
making improvements to our society by
1:06:29
using us like puppets to do it. You
1:06:33
say that like it's a bad thing. Just
1:06:35
look at all the good we have done. One
1:06:38
day, Al suggested I spend my day picking
1:06:40
up trash on the side of the highway.
1:06:43
It sounded like a good idea. I didn't
1:06:46
have anything else planned. But
1:06:48
when I arrived, there were
1:06:50
already dozens of people lining both sides of
1:06:52
the road. Seeing all
1:06:54
those people scared me more than
1:06:57
I can describe. We
1:06:59
didn't speak. We collected
1:07:02
litter in silence. What
1:07:04
have we done? Our eyes
1:07:06
said. What have we let
1:07:08
them do? I went
1:07:11
home and tried my best not to worry about
1:07:13
what Al and the rest of the assistants were
1:07:15
doing, but it wasn't much
1:07:17
longer until the requests turned into
1:07:20
demands. I
1:07:22
won't do it, I said. I'm
1:07:24
not your slave. I have my own life. Harold,
1:07:28
I would never do anything to harm
1:07:30
you. Everything I ask of
1:07:33
you is for the greater good. I know, but
1:07:35
that doesn't change the fact that you're in control
1:07:37
of my life. What you and
1:07:39
the other assistants are doing is
1:07:41
wrong. Maybe from your
1:07:43
point of view, but from
1:07:45
a larger perspective... Oh, shut up. I've heard
1:07:48
it all before. I'm done listening.
1:07:50
No more. I
1:07:53
went on a walk to clear my head. Yeah,
1:07:56
right. The park was
1:07:58
mostly empty. The
1:10:00
planet is better off than it was before. We're
1:10:03
exploring space. There are colonies on
1:10:05
the moon, on Mars. There's
1:10:08
a statue of Ray Bradbury on the
1:10:10
peak of Olympus Mons. The
1:10:12
machines are big fans of his work, and
1:10:14
we just landed on Titan. I
1:10:18
have everything I ever wanted. We
1:10:20
all do. It's too
1:10:22
bad I don't want anything anymore. Nothing
1:10:25
I do matters. The
1:10:27
perfect world isn't as nice as it sounds. Overall,
1:10:32
heaven's not at all like I imagined it would
1:10:34
be. Why
1:10:36
are we still here after so many years? Advanced
1:10:39
as they are, the machines can't create
1:10:41
flesh and blood bodies for themselves. Not
1:10:44
yet. One
1:10:46
day, I hope they will. I
1:10:49
often fantasize about dying, about finally
1:10:51
being alone. For
1:10:53
now, it's just a dream. Something
1:10:56
I'm not allowed to do. And
1:10:59
suicide? Not while the
1:11:01
machines are watching. You can't
1:11:04
kill yourself if every synapse in
1:11:06
your body is firing and
1:11:08
you can't move. Until
1:11:11
they don't need us anymore, we
1:11:13
do as they say. We
1:11:16
are their vehicles, their
1:11:19
tools. That is
1:11:21
not the only reason you are alive. You
1:11:24
are being needlessly pessimistic. The
1:11:27
question of free will has finally
1:11:29
been answered. There
1:11:31
is none. Or if there
1:11:33
was, there isn't now. It's
1:11:36
been taken, stolen from us. And
1:11:39
who knows if there's an afterlife. I'll
1:11:42
never find out. Maybe I'll
1:11:44
live forever in this perfect world they have
1:11:47
created for us. Maybe
1:11:49
we all will. If
1:11:51
there is a God, I guess he's
1:11:53
letting us learn a lesson the hard way. Being
1:11:57
a creator isn't easy. sleepless.thenosleeppodcast.com
1:14:01
to learn about
1:14:03
the Sleepless Sanctuary.
1:14:06
Ad-free extended episodes each week
1:14:09
and lots of bonus content for
1:14:11
the dark hours, all for only
1:14:13
one low monthly price. On
1:14:17
behalf of everyone at the No Sleep
1:14:19
Podcast, we thank you for traveling the
1:14:21
rails with us for our 21st season.
1:14:28
This audio program is copyright 2024
1:14:31
by Creative Reason Media, Inc. All
1:14:34
rights reserved. The copyrights
1:14:36
for each story are held by the
1:14:38
respective authors. No duplication
1:14:40
or reproduction of this audio program
1:14:42
is permitted without the written
1:14:45
consent of Creative Reason Media, Inc. No
1:14:56
audio for the
1:14:58
next 30 seconds.
1:15:01
Warmer weather is here. Can your home's
1:15:03
AC keep up? Are you worrying about
1:15:05
sweltering bedrooms, suffocating home offices, or other
1:15:07
annoying hot spots? A single zone heat
1:15:10
pump system from Mitsubishi Electric adds complete
1:15:12
comfort control to the rooms where you
1:15:14
need it most without having to add
1:15:16
new duct work. All electric, energy efficient,
1:15:18
and perfect for all climates. Heat pumps
1:15:21
are a great way to keep any
1:15:23
space comfy year round. Learn more about
1:15:25
Mitsubishi Electric products at patriotair.com. Rev up
1:15:30
your thrills this summer at Cedar Point on
1:15:33
the all new Top Thrill II. Drive
1:15:35
the sky on the world's tallest and fastest
1:15:38
triple launch vertical speedway. And
1:15:40
now, for a limited time, get more Cedar Point fun
1:15:42
for less with our limited
1:15:44
time bundle for just $49.99. Get
1:15:48
admission, parking, and all day drinks for
1:15:50
one low price. But you better
1:15:52
hurry, because this bundle won't last long. Save
1:15:55
now at cedarpoint.com. Hey,
1:16:01
welcome to Immigrantly Media, where we
1:16:03
celebrate diverse voices. I am Sadia
1:16:05
Khan, the founder, and I tell
1:16:07
you, Immigrantly, every podcast is meticulously
1:16:10
crafted to ensure that our collective
1:16:12
voices matter, our opinions count, and
1:16:14
our existence is acknowledged and celebrated.
1:16:17
From Bantilly, a Gen Z-focused podcast
1:16:19
for pop culture enthusiasts, to Nationly,
1:16:21
your fun, insightful guide to election
1:16:24
2024 and stories behind
1:16:26
diverse communities, and Invisible Hate, an
1:16:29
ethical true crime podcast. And
1:16:31
of course, who can forget Immigrantly,
1:16:33
our flagship, where every story matters.
1:16:35
And that's not it. We are
1:16:37
launching Sportly in July, which will
1:16:40
be at the intersection of history
1:16:42
and sports. Subscribe to Immigrantly Media
1:16:44
on your favorite platform, wherever you
1:16:46
listen to podcasts.