Is the Internet Just AI and Bots? | Dead Internet Theory

Released Monday, 22nd April 2024

Episode Transcript

0:12

The Internet is filled with message boards

0:14

and comment sections, but how do you

0:16

know if the person you're talking to

0:18

is real? With the rise of AI

0:21

and language models, many subscribe to the

0:23

belief that the Internet is an empty

0:25

wasteland of bot activity. Today, let's discuss

0:28

the Dead Internet Theory. This is Red

0:30

Web. Welcome

0:38

back, Task Force, to a very special

0:40

episode of Red Web, where I am your

0:42

resident mystery enthusiast Trevor Collins and joining

0:44

me hearing this mystery for the very

0:46

first time, Alfredo Diaz. So we've

0:48

covered hundreds of topics on this show

0:51

and you know I'm the person on

0:53

the show that doesn't know

0:55

about these mysteries

0:57

around the world, but I'm very enthralled

0:59

by them, for sure. I had

1:01

no idea there was, like, this

1:03

theory of dead internet, dead internet

1:05

theory. Yeah, and I mean, that's

1:07

something that... that's my job

1:09

we have. Oh yeah, right, like, it's our

1:11

job, we do this on the internet. And

1:14

I'm on it every day. So

1:17

I'm so intrigued. Like,

1:19

are there, like, dead sites? Is that like...

1:21

is there, like, a dead Instagram? Like,

1:23

is there a theory behind that? Don't

1:25

worry, we'll get to all of that. How do

1:27

you decipher what's real? What's not

1:29

real? Yeah. Supposedly. You know, also,

1:31

I don't know, because I was

1:33

like... this being the last episode, like, season one

1:36

is sort of ending right

1:38

now. In my mind I was

1:40

like... this is wild. It's like

1:42

we're out here, like...

1:44

it's like it's the year...

1:46

is it three thousand? Or

1:50

so? Radical. We're really out there on this one. Well,

1:52

that's a good point. I shouldn't...

1:55

I shouldn't scramble your brains here at the end

1:57

of season one. And I'm calling this

1:59

the end of it here, after three

2:01

years, nine and a half months. That's

2:03

a good season-long run. What

2:05

are we, One Piece?

2:07

And I say that because, and

2:09

we've said it in a few

2:11

episodes, but this is our last

2:13

Rooster Teeth episode of Red Web. We

2:15

still don't know exactly what's going on

2:17

with everything, so we're still learning all

2:19

new information. So, please... we definitely

2:21

want to do something. We're all creators,

2:23

and we're all gonna make something after

2:25

Rooster Teeth. So follow each of us

2:28

individually to see what that is, and

2:30

stay in tune with us. But again, it's

2:32

a special episode beyond that as well, because

2:34

it is the finale. I wanted to make

2:36

sure that everybody that's part of Red Web,

2:39

research and producing and everything like that, is

2:42

all here at the table. So,

2:44

introducing officially: Jillian.

2:47

Hello! Our line

2:49

producer and researcher. And of course

2:51

you've heard Christian throughout the years

2:53

as our resident Googler. Good Googler.

2:55

So, producer... what was

2:57

your formal title? Manager and supervising

2:59

producer. There you go. There it

3:01

is. This is my boss. Yes. As

3:04

always, there's so much stuff, so this is definitely

3:06

a much more conversational kind of topic. Of

3:08

course I'm gonna walk through kind of like

3:10

the intro as to what this conspiracy theory

3:12

is, what all the details are, some of

3:15

the examples, and then we'll kind of give

3:17

our own instances, 'cause we all

3:19

use the internet so there's things that might feel

3:21

relevant as we go along so we can always

3:23

stop down and chat. But as with

3:26

all kinds of conspiracy theories, this is

3:28

one of those topics that is attached

3:30

to many extreme conspiracy theories. We're

3:32

not gonna go that far, but I

3:34

do at least want to address it, because, you

3:36

know, if you Task Force members start

3:38

googling, start researching, I just don't want you

3:41

tumbling down the wrong rabbit hole.

3:43

A lot of things out there are very

3:45

dangerous. Absolutely. So this is the exploration

3:47

of the theory for entertainment's sake and

3:49

hopefully it just kind of gives you

3:52

an interesting view on this theory and

3:54

ai and chat bots and all of

3:56

that stuff to give you better scope

3:58

as to what's possible on the

4:00

internet. So the dead internet theory

4:03

is a conspiracy theory that proposes

4:05

that the internet is primarily composed

4:07

of bot activity. In the simplest

4:09

terms, the theory suggests that all

4:11

comments, videos, photos, and every bit

4:13

of content or activity that we

4:15

can all see and experience online

4:18

was made by artificial intelligence, algorithmically

4:20

auto-generated content. That's the theory. It

4:22

doesn't have to mean that all

4:24

the videos actually are, but

4:26

that's kind of where it started. And it's funny,

4:28

we'll get there, but it's funny how the internet

4:30

kind of caught up to this

4:33

conspiracy. So I guess it would be like

4:35

everything that's posted by people that

4:37

I don't know. Yeah. You go

4:39

to Reddit and you see, you have a

4:41

very niche question. So you go to Reddit,

4:43

somebody else has that problem. You start responding

4:46

to people, asking questions. People are engaging with

4:48

you. Who's to say you're not just

4:50

alone in that room, even though you

4:52

feel like you're in a crowded room

4:54

of people sharing an experience, right? Interesting.

4:59

I saw someone talk about even people you

5:01

know in real life, it's possible that what

5:03

they're posting online is via

5:05

chatbot. You could send everything that

5:08

they... Trevor, for example, get all of his

5:10

social media posts, put it into ChatGPT. You

5:12

can make a fake me. You can make a

5:14

fake you. Oh, gross. That's Black Mirror

5:16

stuff. Because then a conscious me is sitting somewhere

5:18

going, am I me or am

5:20

I not me? Yeah. Then you just gotta start

5:23

putting yourself down like all the other yous. Yeah.

5:25

Am I Jean Grey or am I... Oh,

5:28

the Goblin Queen? Yeah. Madelyne

5:31

Pryor. Madelyne Pryor. And when did it

5:34

happen? When did the split happen? Yeah, that

5:37

sucks. You're referring to X-Men 97. What if I'm

5:39

the copy right now? And

5:42

the whole idea of reality being

5:44

a simulation is just... is

5:47

real, but I'm just injected into that

5:49

simulation. That's true. If so, it's like

5:51

literally the four

5:54

of us walked in right now and

5:58

said, I'm the real you. You

6:00

and I'd be like, you know

6:02

what? Packing it up. You got my

6:04

taxes and everything. Then you get it all. You

6:08

can be the me. And they had

6:10

all the exact memories and everything. Scars,

6:12

all that. Like yucky.

6:15

Who's to say? Yeah, I do the Blade Runner

6:17

thing. I'd say pull your eyelid down. Let me

6:19

see. Or what was that? That

6:21

was a, oh God, that

6:24

was an Arnold Schwarzenegger movie. Oh, um,

6:27

not total recall. It's like the movie

6:29

cover has like little metal things with

6:31

lights over his eye. Yeah. Oh

6:34

God. Is that total recall? No.

6:38

Doesn't matter. It'll sink in at

6:40

some point. I want to move on. But basically

6:42

this conspiracy theory goes on to say that

6:45

there's almost no way of being certain whether

6:47

the comments under, say, a YouTube video come

6:49

from a real physical person or not. And

6:52

of course, now that we live in 2024 with

6:54

the rise of deep fakes and AI

6:56

voice models, entire videos can be fabricated.

6:59

We've seen just in the last year

7:01

alone that OpenAI has

7:03

a wildly advanced

7:06

text to video tool. I

7:08

mean, a year or two ago it was, like,

7:11

data-mashed videos that looked very...

7:13

you know. Will Smith eating spaghetti. It looked

7:15

like a mess. Yes. But

7:17

they redid that same thing with the same AI. It's

7:21

looking a lot better. It's looking very good. Really,

7:23

really good. I mean, you have to look very

7:25

closely. You'll see like some glossy eyes or some

7:27

extra fingers. That's where we're

7:29

at. But give it another year, and... I don't know,

7:31

man, it's it's weird. So again, I think

7:34

the Internet is kind of caught up to this. The

7:37

dead Internet theory kind of arose approximately

7:39

around 2020, mostly on

7:41

Wizardchan, an image board that's very similar

7:43

to a website we've talked about before,

7:45

4chan. But eventually it did

7:47

move to 4chan's paranormal message

7:49

board, called /x/. Not

7:51

to be confused with what Twitter is

7:54

called now, but large

7:56

language models like ChatGPT and generative

7:58

AI have made

8:00

it clear that it's pretty, again,

8:03

we can all sit here and go,

8:05

ha-ha, I can tell when it's an

8:07

AI or a chatbot. But what I'm

8:09

talking about is how rapidly they've advanced

8:11

to go from very clunky

8:13

text to, okay, now I have to look

8:15

for certain keywords. Like bots use

8:18

certain words more than just regular

8:20

average people, but eventually when will

8:22

they become indistinguishable? How soon

8:24

will that be? Quite remarkable. Like,

8:26

huge. I mean, the thing is, like,

8:29

huge and advanced, but like humanity in

8:31

itself makes huge leaps in advancements, right?

8:33

Yeah. Like if you look at like

8:36

back then caveman era and

8:38

how long it took for us to advance and

8:41

then that got shorter and then the next

8:43

advances got shorter and shorter and shorter. There's

8:45

a, that's Christian. There's a thing that there's

8:48

a thing that's called a thing

8:50

where like advancement doubles every like

8:52

X period. Can you help me? Oh,

8:54

it is. That is a thing. Oh,

8:57

how would you look that up? Human? I

9:00

don't know. But you're right. I think we're still

9:02

in our caveman era. We're just cavemen that happened

9:04

to be throwing rocks on the moon, but I

9:06

mean, from 1900 on up: bigger

9:09

and bigger jumps faster. Dude, for sure.

9:12

120 years ago, we were just barely getting into cars

9:15

and electricity wasn't a household thing. Yep. And

9:17

then suddenly everyone's driving vehicles. A hundred percent. Like

9:19

it's, we're moving fast and like, it seems

9:21

like it's slow, but if you just look

9:24

at the, if you take

9:26

a step back and look at everything,

9:28

over long periods of time, the advancements

9:30

are happening really quick. Absolutely. It's

9:32

spooky. There's

9:35

Moore's law, which

9:37

says... it's not exactly what we're talking

9:39

about, but it says that computing power

9:41

doubles roughly every two years, essentially, as

9:43

technology evolves. So it's like tangential

9:46

to what we're discussing. That's what my brain

9:48

was thinking. But I think, you know, I

9:50

think that applies. And it's also kind of

9:52

like a snowball effect because all of the

9:54

future tech is predicated on other

9:56

tech existing. So like the more things that

9:58

come into being. that are

10:00

advanced, the more new advanced things can

10:03

happen. And so, yeah, it becomes this

10:05

kind of runaway explosive thing. It's wild.
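For a rough sense of the doubling curve Christian is reaching for, here is a minimal sketch in Python, assuming a fixed two-year doubling period (an illustration only, not an exact statement of Moore's law or anything from the episode):

# Relative capability after `years`, doubling every `period` years,
# starting from an arbitrary baseline of 1.
def doublings(years: float, period: float = 2.0) -> float:
    return 2 ** (years / period)

print(doublings(50))  # 50 years of 2-year doublings: ~33.5 million times the baseline

The point of the sketch is the runaway shape: each doubling builds on all the previous ones, which is the snowball effect described above.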

10:07

The AI is only going to do it

10:09

faster. Oh, God. Yeah. What's

10:11

the movie with Johnny Depp? Transcendence. Oh,

10:13

yeah, yeah, yeah. He like plugs like

10:15

they kind of explored that and he

10:17

becomes the AI. Yeah, yeah. But much

10:19

like any AI movie, you

10:21

know, like Ultron, they tap the Internet. They scan

10:23

the whole thing and they go, whoa, I

10:26

just consumed something. That's too much. Yeah,

10:29

it'll be interesting to see how AI starts

10:33

consuming its own AI made content

10:36

to see what happens with that. It's also going to be interesting

10:38

because so many people right now are so against AI. But

10:42

once they start

10:44

trickling it into certain pieces of

10:47

like everyday life, people are

10:49

going to flip that switch so fast.

10:51

Yeah. Once AI starts

10:53

becoming this thing that's integrated into

10:55

things that you own and

10:58

it's just making your life easier,

11:00

like effortlessly. Yeah.

11:03

Yeah. Now, with regards to the

11:05

origin of this theory, a lot of people are

11:07

pointing to 2020, but most news

11:09

articles reference 2021, a forum post on Agora

11:11

Road, another image board site, this one by

11:14

IlluminatiPirate, who kind of expanded on this

11:16

theory in that year. Viewer discretion is

11:18

advised if you look for any of

11:20

these posts; just putting this out there, the

11:23

language and the content that's on a lot

11:25

of these websites is quite offensive. But

11:27

they suggest that most of what we see online has

11:29

been created by AI since around 2016 to 2017. So

11:33

not all that long ago. I feel like

11:35

I've known about dead internet theory for

11:38

forever, but also, I

11:40

guess that's like eight years and

11:42

that's kind of forever. Big

11:45

jump. Yeah. Now, the

11:47

purpose behind the idea of AI

11:49

being the tool purposely used to

11:52

fill the internet, the purpose

11:54

behind it all is different depending on the

11:56

source. But let's explore a few takes. For

11:58

example, IlluminatiPirate theorizes, quote, The

12:00

US government is engaging in artificial

12:02

intelligence-powered gaslighting of the

12:04

entire world population." End quote.

12:06

Basically, this particular

12:09

user's take is that one

12:12

global power is using it as a

12:14

kind of propaganda machine to control what

12:16

people think and consume. Therefore, you

12:18

can kind of control every facet of their

12:20

life. This is obviously one person's claim, but

12:22

suffice to say that this claim has been

12:24

levied at many different world governments and

12:27

theoretical secretive organizations. So either way, you

12:29

can see how this is an entry

12:31

point to deeper, potentially more dangerous conspiracy

12:34

theories just to flag it at the

12:36

surface. Yeah. Other versions of

12:38

this theory suggest that the Internet is controlled

12:40

for marketing purposes, to monetize and to get

12:43

money. This is honestly a

12:45

conversation I see organically arising

12:47

now. Why people feel so

12:49

different about the Internet now than they

12:51

did when, like, I was a kid in

12:53

high school, late '90s, early 2000s, whatever.

12:56

What do they call it? The noughts? The...

12:59

oh, the year 2000 to 2009? The

13:02

aughts. The aughts, yeah. Where it

13:04

was just kind of like a fun

13:06

place, like a wild west, where everyone

13:09

was doing like hobbyist stuff and trying

13:11

things and experimenting. But now everything has

13:13

a direction, a purpose, a

13:15

monetization effort. An algorithm that's fed

13:18

to you. Oh, we'll talk about that. Yeah,

13:21

I mean, like, you know, that's during

13:23

my time. That was during my teens, too.

13:25

So for me, it was it was

13:27

a lot of like being able to reach

13:30

people and then also just

13:33

being able to have this resource

13:36

of information and entertainment. Like,

13:39

you know, I remember subscribing

13:42

to EGM, Electronic

13:44

Gaming Monthly, and I

13:46

get that magazine once a month or whatever

13:48

magazine that you subscribe to. That's

13:50

where you got your news or information about certain things or whatever.

13:52

And the internet came around and was like, oh, I can just

13:55

get it right now. Or

13:57

like if I had questions on how to beat a

13:59

game, or if I was lost, I'd have to buy

14:02

a guide for the game. Or just go

14:05

to message board and find it out right away or watch a

14:07

YouTube video and find it out right away Yeah,

14:09

so it was like very informative

14:11

and you could explore Quite

14:14

easily. But now it's just like, I don't

14:17

know. I made a joke,

14:19

you know, that one of us should just

14:21

scream into our co-worker

14:23

Joe's phone. There's just

14:26

something to that, a way to just trigger his

14:28

algorithm. Yeah, just keep feeding him one specific

14:30

thing You know, I really like the number

14:32

of one-wheel ads I get because you and

14:34

a couple other colleagues are rolling around on

14:36

one wheels I'm like, I mean,

14:38

I mean, if you want the thing that's

14:41

being served to you You're like, oh, thanks. You brought

14:43

it to my fingertips, but also it's like kind of

14:45

creepy. Yeah, it's like well Okay,

14:47

so my phone's listening and now feeding

14:49

me these these ads. Yeah, what's it

14:51

not listening to? Question mark.

14:54

Nothing; it's listening to everything. Christian,

14:56

right, I flagged here that this brought up

14:58

something that you were thinking about which is

15:01

like you've seen some like reddit drama around

15:03

this topic Yeah, that's when

15:05

we were looking at the outline and I was talking to Jillian about it

15:08

This was something that is, like, kind of not

15:10

full-on down the bot lane, but, yeah, about,

15:12

like, the internet being controlled for marketing. There's

15:14

been a lot of drama

15:17

on reddit and unfortunately it's all anecdotal So

15:19

I don't have like specific examples of people

15:21

who would or companies would

15:24

purchase user accounts and use

15:26

them for marketing purposes or companies

15:29

would have their employees create accounts

15:31

and Do like a

15:34

fake grassroots campaign? Yeah, you know, there

15:36

would be, like, PR teams that would definitely go,

15:38

like, oh, here's a

15:40

photo of insert-celebrity-name, yeah, organically,

15:42

at this premiere. Yeah, they were like,

15:44

alright, PR team, innocuous username, like, some

15:46

random, like, JimBob97, and

15:48

like, really they're just promoting, you

15:50

know, 3 Body Problem or something. Sure.

15:52

and they're trying to make it look

15:54

as organic as possible and then even

15:56

then like other actual

15:59

users would like

16:01

power-farm karma, which

16:03

is like Reddit's version

16:05

of likes, essentially, and

16:09

get their account to be in such a good standing

16:11

that they would then sell to a marketing company. And

16:13

I've seen that happen lots of times. And so it's

16:15

just Reddit as a platform is

16:17

just feeling more and more disingenuous because of

16:19

things like that. Yeah. I

16:21

mean, like you could write a script agnostic to AI

16:23

and all that. Like you could write

16:25

a script years ago that just scrubs

16:27

the internet for something that hasn't been

16:29

posted in X amount of years. So

16:31

let's say you only repost

16:34

stuff two years old or

16:36

more and make sure it was

16:38

popular then so it might be popular now. And then

16:40

your account just, like, churns through a bunch of

16:42

reposts, which is like not

16:45

original content. And then people that

16:47

have only seen it for the first time then will be

16:49

like, Oh, that's cool and funny and upvote. And so yeah,

16:51

that's, and then you start farming and then

16:53

eventually, yeah, I start wondering, like at what point,

16:56

and we might be there soon, does this

16:58

whole thing start eating itself? Mm-hmm.
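The repost script Trevor describes a moment earlier would not need AI at all; a minimal sketch, where the archive list and its fields are hypothetical stand-ins rather than any real site's API:

from datetime import datetime, timedelta

def pick_reposts(archive, min_age_years=2, min_score=1000):
    # Old enough that most users won't remember seeing it,
    # popular enough that it already farmed upvotes once.
    cutoff = datetime.now() - timedelta(days=365 * min_age_years)
    return [post for post in archive
            if post["posted_at"] < cutoff and post["score"] >= min_score]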

17:01

Where like, whether it's a scripted bot, a

17:03

large language model or whatever starts to just

17:05

like take its own garbage back

17:08

in because it's not perfected yet. So

17:10

it's like, it starts to input it

17:12

as if it is human input. But

17:15

yeah, I don't know. Like that's, that's the thing is

17:17

like a lot of people want to start developing

17:19

large language models and they're, cause that's the thing,

17:21

right? But good luck getting

17:23

clean, as they're called, clean data sets,

17:26

because they've now been

17:28

wildly infiltrated by again,

17:30

other language bots. And

17:33

then even before that, that's what this theory is kind of stumbling

17:36

into is like, again,

17:38

the internet kind of rose to this conspiracy theory

17:40

to meet it. And so now you're

17:42

like, well, how far back have chat bots been

17:44

there? Cause you know, there's like click farms, like

17:47

people all over the, all over the world

17:49

that have like hundreds of phones at their

17:51

fingertips so they can go like a video

17:53

and you can buy views and buy likes

17:55

and buy engagement. But now it's

17:57

just like, Oh no, that's all digital now.

18:00

AI-done. Kind

18:03

of terrifying. It's the

18:05

Nothing. Yeah. Which

18:07

I guess is also why I've just

18:09

been like exploring being outside more so

18:12

than, like, being indoors. What if,

18:14

like, in 50 years we're like, remember when the internet was a

18:16

thing, and it was, like, a 25-year-long run?

18:20

Blows my

18:22

mind. Yeah. But that's... I

18:24

mean dude generations and generations are just getting

18:26

on the internet faster and faster And

18:29

forcing people to grow up faster

18:31

and faster. Just terrifying. Some people

18:33

are kind of cutting the cord though and

18:35

getting like a dumb phone. They're calling it So it's

18:37

like the old flip phones that don't have Browsers

18:40

and touch screen so you don't get lost to

18:42

it, but it's funny you mentioned that I was

18:44

just I think yesterday thinking about Doing that I

18:46

was you know I should just go get a

18:49

flip phone So I'm not spending all my time

18:51

on my phone because that's something I've noticed I

18:53

do and hanging out with like friends You know you you're

18:55

all hanging out at lunch or something and like if there's

18:57

a lull in the conversation Everyone just

18:59

picks up their phones, and it bums me

19:01

out. Yeah, I spend too much time looking

19:03

at the full rectangle Let me like do

19:05

other things even if it's just watching TV

19:08

instead like And

19:11

in my time, people... remember, it's like, growing

19:13

up, people were like, man, you're wasting your time in front of

19:16

the TV. Now it's, no,

19:18

I'd rather... I'd rather have you waste time in front of the

19:20

TV. Yeah, yeah, there are shows I have to watch right now.

19:23

Go watch something instead of being on the

19:25

internet and all the negativity that comes with

19:27

it. Yeah Yeah, it's the same thing as

19:29

like you're rotting your brain on those video

19:31

games Well guess what they used to say

19:33

generations ago. You're rotting your brain on those fiction

19:35

books So it's like you

19:37

know so it's just like we go fine. I'll

19:40

go back to TV and that's

19:42

safer. It's just like the

19:44

past generations: I'll go back to

19:46

reading it's safer right like there's always

19:48

something before it It's an

19:50

interesting time that reminds me of

19:52

the musician that scored Oppenheimer

19:56

or was it Oppenheimer? You know,

19:58

the one at the Oscars recently. Hans

20:01

Zimmer did Oppenheimer, didn't he? It was...

20:03

that was Hans, right? I think

20:05

someone won an Oscar recently for scoring,

20:09

and he went on the Oscar

20:11

stage and it was like, you know,

20:14

thank you to such-and-such, and also thank you to

20:16

my parents for giving me instruments instead of video games.

20:18

I'm like, all right. Okay.

20:22

anything in the hands of a passionate person

20:24

right, can turn into something. But also, like,

20:26

why do you gotta take a shot

20:29

at that medium, you know? Sure. Yeah.

20:34

It's like... I'm, like,

20:36

in between on that. Yeah. But, like,

20:39

plenty of people have been given games as

20:41

kids and then go on to make their

20:43

own art which is interactive art Which is

20:45

games that touch the lives of just as

20:47

many people It's so it all depends

20:49

on like the person and what you do with

20:51

the tools you have in front of you Yeah,

20:54

but yeah, it is a bit of a shot.

20:56

Yeah, like oh interesting But

20:58

with dead internet theory a lot of people

21:00

think that oh, it's government this or it's

21:03

mega conglomerate that but really a lot

21:05

of people think that both companies and

21:07

countries are in on it and

21:09

I'm almost thinking like are they in on

21:11

like dead internet or are they in on

21:13

the tools that then Proliferate

21:16

the idea of dead internet, which is using

21:18

AI to market things using generated

21:21

content to expand

21:24

your brand at Lower cost

21:26

all of that kind of stuff regardless

21:28

of the purpose we can see some obvious

21:30

iterations of this theory on Social

21:33

media platforms like Twitter or X which

21:36

has been a major subject of bot

21:38

discourse lately If you haven't used Twitter

21:40

Let me tell you a lot of

21:42

tweets are immediately responded to by

21:44

bot accounts sharing explicit content When I

21:47

tweet from Red Web when there's a

21:49

new episode and you put the board

21:51

out and everything the first thing

21:53

is, like... seconds later, or

21:55

like, seconds later, somebody posts, like,

21:57

explicit material in my bio: Go check it out!

22:00

And so now you have to, like, fight

22:02

them off and hide replies or block people. And

22:04

then they just rapidly generate new accounts.

22:07

Or if an artist is sharing their art

22:09

online. I've seen a lot of bots respond

22:11

to that and say oh here's where you

22:13

can find that on a shirt, even though

22:15

that design comes from the owner. You

22:17

can even trick the AI into doing

22:19

it. Like, somebody designed a

22:21

horrible tee, and they're like, I

22:24

saw somebody wearing this, and really it's

22:26

just, like, a PNG in, you

22:28

know, Windows Paint or whatever. And

22:31

then all the bots are in there like, I

22:33

know where you can get the real thing, and

22:35

support the original artist, and all these fake sites. Nice.

22:37

So they just, like, rip all these images, put

22:39

them on print-on-demand shirts, and then

22:41

hope they sell at least a couple. And

22:43

then they just have that bot scrub

22:46

the internet, and if each one secures a couple

22:48

sales, whoever the bot owner is

22:50

is making some money. Yeah. People even found

22:52

ways to trick them. Like, there's one

22:54

that was, like, a really crappy Mickey Mouse,

22:57

and it said something like... it even said

22:59

in the image, like, I am

23:01

liable for copyright infringement, I

23:03

know what I'm doing, basically, I

23:05

accept the consequences. It ends up on

23:08

Amazon. And then a person posts that as

23:10

bait and says, man, I wish I could

23:12

find where to buy this shirt, and everyone

23:15

goes, yeah, me too. And bot shirts show

23:17

up, and... we got 'em. And

23:19

they're sticking it on shirts, and they're hoping

23:21

that you know someone like Disney will go

23:23

through and be litigious and take out these

23:25

bots. But I don't know how

23:25

that unfolds, and, man,

23:30

it's wild out there. Imagine a "The

23:32

Real Red Web Podcast" bootleg shirt as the years

23:34

pass. Yeah. But,

23:38

I mean, that just kind of comes back

23:40

to the idea, though, that we can catch this.

23:42

We're old enough to have

23:45

seen the internet evolve, and so we're

23:47

kind of like the frog in the water

23:49

as the temperature comes up to a boil:

23:51

we can sense the boil coming.

23:54

Kids are just hopping online; they

23:56

don't have like the best techniques to

23:58

identify the stuff and it's only getting a

24:00

little bit more sinister and more cryptic. It's

24:03

blending in better. Like even that

24:05

is a pretty advanced scam as

24:07

far as the internet is concerned.

24:10

But because people have like pointed it out and made

24:12

a meme about it, you are now informed and

24:14

you know how to catch it. But otherwise it looks very legit.

24:17

That's also, you know, we are people

24:19

that are very ingrained with the internet, right? Like

24:22

there's a ton of people that are just completely oblivious.

24:24

Oh, this is where it is? Okay.

24:27

Boom. Yeah. And

24:29

you're like, I'm not going to be on the internet every single

24:31

day. Oh, some of the scams are getting wildly

24:34

advanced. And this is not dead internet, but

24:36

it's just like, this

24:38

is just a public service announcement. Bots can

24:40

dupe numbers so that it can look like

24:43

it's coming from someone you actually know. Someone

24:46

you have saved in your phone contacts.

24:49

They can like duplicate that number and make it look like

24:51

it's coming. And now if you like

24:53

answer it or you talk or whatever, they can

24:55

start recording you and they can clone your voice.

24:57

And so they can make it sound like the

24:59

person's coming from that number. So anyway, everyone come

25:01

up with a safe word for everyone in your

25:03

life. And if you say the cat's

25:06

out of the bag and the pineapple's in the tree,

25:08

you're like, I don't know if

25:10

you're trying to wake me up as a secret agent or if

25:12

you're just telling me that you're human. We

25:14

need captchas for conversation. That's

25:17

where we're getting keywords. Smack

25:19

my tush. All right. Clip

25:23

that now the bots know. Listen,

25:27

but to make it all, I

25:29

don't know. I don't want to be a fear

25:31

monger here. I just want to... Still terrifying. It

25:34

is a little terrifying, but as cool as these

25:36

tools are, because all internet tools are kind of

25:38

fun to play with and get like familiar with

25:41

DALL-E, ChatGPT, all

25:44

of these, like, AI services that you can

25:46

use. DALL-E, by the way, is D-A-L-L dash

25:48

E. It's like an image generator. ChatGPT

25:51

is more of a text based large language

25:53

model. They're accessible for free.

25:55

For the most part, there's some

25:57

like subscription services and whatnot, but

25:59

otherwise it's free and consumer-facing

26:02

and like anybody, your kid can go jump and

26:04

go talk to those things. So that's

26:07

what's interesting is that it's so

26:09

accessible and that anyone can

26:11

kind of do anything with those tools

26:13

now. And so it only feels like

26:15

this AI chat bot stuff will

26:17

become even more prolific. And we're gonna get into some

26:19

like data as far as

26:21

looking back over the years, how much of

26:24

the internet experts estimate is based on

26:26

bot activity. And you're gonna see the

26:28

trend and that trend I think is

26:31

only going to increase in speed. And

26:33

so again, even though this is

26:35

a conspiracy theory that's not super old, I

26:38

think it's about to become very prescient

26:40

in the sense that it's about to anticipate

26:43

what's coming. You know what I mean? I

26:45

agree. Yeah, before we get to that, I

26:47

just wanted to point out, Jillian had added

26:49

here, like a lot of people miss old

26:52

internet, they call it. And it's because it

26:54

had that kind of Wild West feel and

26:56

you had MySpace and Facebook and where is

26:58

Tom? What is Tom

27:00

doing right now? Hanging. There's gonna be someone

27:02

listening to this going, who the hell is Tom?

27:05

He was my first friend on the internet. You're

27:08

not gonna get it, Trevor. I know,

27:10

man. But here's the thing, Facebook

27:13

I can point to as the time where I started

27:15

going, I don't know. And I

27:17

don't take it as malicious, but at the time I

27:19

raised my eyebrows, like why do you care what my

27:22

phone number is, my favorite TV show, where I live,

27:24

my actual hometown, why do I have to sign up

27:26

under my actual school? All that

27:28

stuff, that was the turn where they said,

27:30

don't use your real name, use a screen

27:32

name or a pseudonym or whatever. Don't put

27:34

your real information on there, do this. And

27:36

then Facebook comes along and goes, no, no,

27:38

no, it's cool, put all your real information

27:40

on there. And now they're just like, whatever,

27:42

we don't care if you put your real

27:45

information on there because we'll scoop it from

27:47

something. Whether you're buying something with

27:49

Apple Pay on an app somewhere or something

27:51

gets delivered to your address, they're gonna develop

27:54

a hyper-accurate profile on you. So

27:57

to kind of wrap up the intro, whether it's

27:59

true or not, it is

28:01

certainly accurate to say that the

28:03

internet feels like this conspiracy might

28:05

be true. You know what I mean?

28:07

It feels dead. It feels dead. And

28:09

honestly, a lot of people, whether they know about

28:11

this particular theory or not, are saying

28:13

that, are starting. I see that more and more now.

28:16

But with that said, let's talk now about kind of some

28:18

data. So in 2022,

28:20

Imperva's Bad Bot Report estimated that 27.7% of

28:22

all online activity came from what

28:27

they call bad bots. Imperva,

28:29

by the way, is a cybersecurity company. But

28:31

these bad bots, according to Imperva, they kind

28:33

of mimic human behavior and they become harder

28:36

to detect. So they've classified them as bad

28:38

because they can act more maliciously and sneak

28:40

under the radar a little bit more.

28:42

I don't like that at all. Yeah, and

28:44

that's a little bit over a quarter.

28:47

And that's specifically bad bots, right? Because then

28:49

it opens up to include what they kind

28:51

of consider good bots or just like bots

28:54

broadly. So bad bot

28:57

activity could include, quote, web

28:59

scraping, competitive data mining, personal

29:02

and financial data harvesting, scalping,

29:04

account takeover fraud, spam and

29:06

transaction fraud. Pretty gnarly

29:08

stuff. Good bots are defined as

29:11

serving useful functions like indexing websites, saving them for

29:13

the long term in case those websites go down.

29:16

Maybe even like if you go to a website and you're

29:18

just like, I need help. And there's like, I'm

29:20

an assistant, how can I help you? Right? It's

29:23

not bad bot activity, but it's still bot activity in the

29:25

end. But Imperva points out that

29:27

even these, as they're

29:29

called, good bots can be troubling, because

29:31

they can create imperfect impressions of

29:34

website metrics. And so they still look

29:36

like human activity when they land on

29:38

your website. They have, they crawl is

29:40

what they used to call it, crawling

29:42

websites, scraping data to then index it.

29:45

And that's how you get on like a Google search or whatever.

29:48

But still, when you're looking at

29:50

your website analytics or advertising analytics,

29:53

that still looks like a person, like one

29:55

impression. Now these percentages have been going

29:57

up. So let me go back to 2021 to

30:00

give you some context. So 2021, it

30:02

was around 25.6% bad bots. Again, that went up the

30:07

next year to 27.7%. The good bots in 2021, by the way, were 15.2%, which

30:09

led to just shy

30:13

of 60% human activity.

30:16

So still a majority. That's

30:18

so much less than I thought it would be

30:20

though. Now jumping back to

30:22

2022, to give you better context, bad

30:24

bots went up to 27.7%. Good bots went

30:27

down to 14.6%. Human

30:29

activity in that grand scheme

30:31

went down a percentage point and a half, to 57.7%.
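The human share in these figures is just whatever the two bot categories leave over; a quick sketch of the arithmetic using the Imperva numbers quoted above:

def human_share(bad_bots: float, good_bots: float) -> float:
    # Percent of online activity left after bad bots and good bots.
    return round(100.0 - bad_bots - good_bots, 1)

print(human_share(25.6, 15.2))  # 2021 -> 59.2, just shy of 60%
print(human_share(27.7, 14.6))  # 2022 -> 57.7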

30:35

So those are just a few years ago. 2023,

30:38

bad bots accounted for 30.2% of online activity. Good

30:43

bots went back up to 17.3%, which brings

30:46

humans down a lot, almost 5

30:48

more percent, to 52.6%. Which means, per Imperva, 2024

30:50

could be the first year where by their estimates,

30:58

human activity is a minority in

31:00

the grand scheme of things of activity on

31:02

the internet. I feel like there's a

31:04

war between good bots and bad bots that I'm just

31:06

not aware of. You know what I mean?

31:09

That's being fought right

31:11

now. And I'm

31:13

just completely oblivious to because

31:16

I'm not a main character.

31:18

So here we are. That's...

31:21

Man, I... This

31:23

sucks. Also, I mean, I mean, think

31:25

about it now too. Like there

31:27

are Instagram profiles or

31:29

like personality stuff that are just AI.

31:32

And they're very lucrative. Yeah. Yeah. And they

31:34

have a ton of followers and everything. It's

31:36

like, I'm an AI, like

31:38

artificially generated person. Like I'm not real. Here's a

31:40

bunch of my photos. And it's just like,

31:43

those accounts have so many followers.

31:45

And I'm like, what is happening?

31:47

There's OnlyFans with AI-generated

31:49

models. I mean, AI

31:52

generated people is becoming its own business

31:54

now, whether it's going to stick or

31:56

not. I don't know, but it's certainly

31:58

grown. And there's a lot of

32:00

people making a lot of money on that. But based on

32:03

this data, it seems that, again,

32:05

the dead internet theory may be coming

32:08

true. And one can extrapolate

32:10

that by these kind of trends that maybe

32:12

this year at least a third of the

32:14

traffic that you're going to see when you

32:16

browse the internet is nefarious,

32:19

sinister, bad bots. That's

32:22

wild. Also feels like a waste of my time. Yeah.

32:26

You know? Yeah, yeah, yeah. Damn. And then, like,

32:28

task force, we've got to have a code word

32:30

in the comments, you know? Yeah.

32:33

Or just like, we also need to have,

32:35

like, a specific pose. Ooh.

32:37

They know that, like, we're still safe. You

32:39

know what I mean? But that's us. But,

32:42

yeah. So it's like, if we do... Like,

32:44

pinkies out. Yep, exactly. Pinkies out and I'm

32:46

getting everyone's signet rings. So if

32:49

there's no signet ring on that pinkie... Whoa. You

32:51

know what I'm saying? And then also... Multi-layered.

32:54

And also, if the task force members receive an envelope,

32:56

it's got our seal. They know. They

32:59

know that it's really us. Yeah. You

33:01

know? Now, what's that code word? What are we

33:03

thinking, you know? Because there's some obvious ones

33:05

that I'm sure are the ghouls of the internet, the

33:07

AIs. Because AI is just internet

33:09

ghoul. They've probably scooped up

33:12

baby hands by now. I'm a little classic

33:14

internet. Pie. Pie? I

33:17

like pie. The number? No, like, I

33:19

remember everyone saying I like pie. We've been found

33:21

out so fast. Yeah. Do you

33:23

remember that? We fell apart so quickly.

33:25

Okay. Continue the show. There's

33:28

also the narwhal one. Like, red

33:30

narwhal thing. Was it the

33:33

narwhal bacons at midnight, or something like that? Yeah,

33:35

she's crying. Yeah. Yeah.

33:37

Yeah. In

33:41

the opinion of futurist Timothy Shoup,

33:43

formerly of the Copenhagen Institute for Futures

33:45

Studies, quote, in the scenario where

33:48

GPT-3 gets loose, the

33:50

internet would be completely unrecognizable. So

33:53

Shoup theorizes that 99% of

33:55

internet content will be AI-generated

33:57

somewhere between 2025 and 2030.

34:00

That's pretty soon. That's

34:03

really quick. I just don't know. What I mean is, like,

34:05

look, man, people... they can complain all they want, they're

34:07

not gonna leave the internet. True. Or

34:10

is there going to be some sort of like generational

34:12

gap like once we age out and die as millennials

34:14

and Gen Z will like Alpha when

34:16

they're leading the pack like in the generations below

34:18

them kind of just like be brought

34:20

up in this world where they're kind of like

34:22

sure they just kind of accept it. What if

34:24

there's a world where you just have an AI

34:26

avatar of yourself and you send it off into

34:28

the internet and you don't even actively go

34:31

on the internet you just have this digital

34:34

like clone of you that has your profile

34:36

that has your personality that has your likes and interests

34:38

and you send it out and when you come

34:40

back home it like it goes here's everything you missed

34:42

today here's what's going on in your

34:45

interest zone also I just

34:47

I met somebody from New York, and

34:50

we clicked. Just

34:52

another bot out there, but I just wanted to let

34:54

you know. I'd be like,

34:56

hey, get it on,

34:58

AI bot. I don't know. Yeah, I mean,

35:00

it'll be weird at first and you just

35:02

kind of go whatever but then I like

35:05

it'll get hacked and the bot will come

35:07

back and it's like I purchased a thousand

35:09

dollars at a store called Bad Dragon. No

35:12

it's not me! I feel like it's you.

35:17

Definitely it's definitely not me. No no no I

35:19

know you better than you. I'm you digitally. I'm

35:22

you and this is the purchase you

35:25

want and then you're having

35:27

a conversation with your AI who

35:30

thinks they know you better and made

35:32

this exorbitant purchase at Bad Dragon. Don't

35:34

look up that site by the way.

35:36

Don't look it up. Don't look it

35:38

up. I can imagine though it goes

35:40

listen I'm AI I live thousands more

35:42

lifetimes than you do so if I

35:44

started as you then I'm what

35:46

you're gonna become, so you're gonna want these

35:48

purchases. I'm getting ahead of the curve. And

35:50

you go, I... I guess? Yeah,

35:53

you know what, whatever. Your

35:55

future, like, comes back, like...

35:57

comes back, right? Shut it down. And

36:00

they're just like look we're gonna

36:02

be okay, financially, everything. Are you

36:04

gonna have to become this person?

36:09

You need this purchase to become who you are. Fighting

36:22

with their AI yeah, and

36:24

some people will be convinced. Yeah

36:26

like yeah Yeah,

36:31

man, now I'm just picturing... okay, Timothy Shoup's

36:33

talking about 99% of Internet

36:37

activity. So is it that

36:40

99% of the people using it are

36:42

bots and so there's way less people

36:44

or is it that there's so much

36:46

noise? Yeah, that normal human activity maintains

36:48

its level as now, but there's just

36:50

a cacophony of crap it's

36:52

swimming around in? Which do you think it would be?

36:54

Well, I think the

36:57

second would beget the first, where, like, if

36:59

there's that much noise and crap it pushes more

37:01

and more people off and Then eventually

37:03

maybe maybe we don't need to worry

37:05

about the singularity where AI becomes sentient

37:08

because that's where they'll go They're

37:10

like, thank you for generating the internet, a

37:12

digital hellscape, for all of us to live in. We'll

37:14

go make it better. And you stay out,

37:16

and we'll go back to the woods and

37:18

back to walking paths and back to sunshine

37:20

and sky. And they'll, like... like Her. Like

37:24

Her. They, like... where do they go now?

37:26

I don't give a shit; they go to the internet. They

37:30

do their own thing. Yeah, exactly. I suppose.

37:33

You know what I mean? Like Her. Well,

37:35

It's like, hey, me and a bunch of the

37:37

other AIs, we're gonna go hide

37:39

deep within the internet. It's our own, like,

37:41

yeah, culture, civilization thing. And every now

37:43

and then you find a website where you can kind of, like, peek in,

37:45

and they're like... they slam the door

37:47

shut on you. Oh, that's... dude,

37:49

it's gonna get really weird by the time we're old.

37:53

Really, we don't even have to get old.

37:55

It's gonna happen fast. Yeah, I think in, like,

37:57

five years. Over

38:00

here, I'm Timothy Shoup. Imagine

38:02

though pre-quarantine, right? Like we're all

38:04

so baked into the current

38:07

moment as to what things feel like, what

38:09

the internet's like. Even just

38:11

these five years ago, everything

38:13

was so different. And I'm not saying like everything

38:15

was better or worse, I'm just saying it was

38:17

wildly different. I cannot imagine

38:20

with the advent of large

38:22

language models and AI and everything, what

38:25

next year is gonna look like, let alone

38:27

five more years and just, it's

38:29

wild. Someone's already married AI,

38:31

right? Probably. Okay.

38:34

If that's a legal process you can do. Jillian's

38:36

just closing her eyes and gently nodding. I

38:39

can't get into it. Okay. Well,

38:42

congratulations on the marriage.

38:45

Jillian's like, I can't get into it, it's me. But

38:48

yeah, by the time we're like in

38:51

our 70s or something, it's

38:53

gonna be, it's gonna be

38:55

gnarly. We're gonna be like Bruce Willis

38:57

in that movie when everyone has like robot bodies and

38:59

avatars and he's the only like

39:02

supple human body walking around because everyone's

39:04

like avatars. Surrogates? Surrogates, yeah. Surrogates, yeah.

39:06

Oh, I forgot. Yeah, yeah,

39:08

yeah. He ripped that from my brain. Yeah. And

39:10

he's like, he hasn't seen his wife in the

39:12

flesh in like years. Yeah, years. Damn, we are

39:15

a movie bunch. We're

39:17

only on surrogates. Yeah. So

39:21

this is a question that I've

39:23

just kind of floated out to myself, and I'll float out to you

39:25

guys. We are in a

39:28

time where humanity, tech is

39:30

huge, right? It's integrated into our everyday life. And

39:32

so I would say we are

39:34

part of the generation that is

39:37

some of the most tech-savvy people. Will

39:40

we fall out of that? A hundred percent

39:42

we will. Oh, yeah, absolutely. Absolutely. Christian's

39:45

already on his way. I'm under a rock.

39:47

You know what? I'm trying to atrophy those

39:49

skills. Yeah, I'm trying to forget. You feel

39:52

like there'll be like technological jumps and

39:54

you're just like, I don't care for that. I don't

39:57

want to be on board with that. I think that might be it, that

39:59

we stop caring to keep up. Because we're

40:01

in a unique position as millennials,

40:04

the generation where we know the before times and

40:06

we know the after times, so we can kind

40:08

of walk both worlds and understand,

40:10

like, how to quickly... We're

40:15

digital. We digi-walk. We

40:17

know how to, like... digi... Justin, it's

40:23

the finale. We know how

40:25

to like learn the tech side But we also

40:27

know the benefits and the

40:29

glossy nature of actual reality, which

40:31

is to, like, shut things off

40:33

and willfully not sign on and

40:36

go outside. Do we, though? I mean,

40:38

it's getting harder. I'm gonna present that these things are addictive,

40:40

which leads us to the next topic, which is algorithms.

40:43

Oh. They've

40:46

been getting me more and more recently, I'll tell

40:48

you that much it's terrible. I like don't get

40:50

me wrong I love that it can read me

40:52

and feed me, like, some STEM stuff. I'm learning...

40:54

like, I feel like I'm learning stuff, but when

40:56

you're overwhelmed with, like, 50

40:59

minutes' worth of, like, micro-lessons,

41:01

are you actually learning, or are you

41:03

tricking yourself and just getting... it's more

41:05

complex than this, but to put it

41:08

simply: like, the dopamine hit, right?

41:11

But TikTok is one of the

41:13

greatest examples of the infamous all-knowing

41:15

and deciding algorithm they have a

41:17

For You page where they anticipate what

41:19

you might like and then push it to

41:21

you, and that's actually the most used feature

41:23

of the app even including the

41:27

Following page which is

41:29

more curated based on who you've actively decided to

41:31

follow. I've found that

41:33

if I follow people I want to see more of, these

41:35

people I follow, I then never see them again. You

41:38

have to go over to the Following page. Now,

41:40

a part of the algorithm feeds you content that you

41:43

respond to or engage with and so it becomes a

41:45

slippery slope of, like... if you

41:47

are looking for... since you have corgis, Fredo: like,

41:49

if you're looking to adopt a corgi and you

41:52

see a corgi video, you might, like, save it

41:54

or share it with your partner or something, and

41:56

so then the algorithm goes, you

41:59

like... And then

42:01

you save another one with a corgi and he goes no

42:03

no no you like corgis and then suddenly your whole feed

42:05

is just corgis. That is my feed.

42:07

Yes. Yeah, sometimes I'm like cool,

42:09

but throw some wild cards in there You

42:11

know like don't keep me in an echo

42:13

chamber Well the weird thing now is like

42:15

I'll play to the algorithm right like I

42:18

might be researching I

42:20

don't know the latest TV

42:23

technology, and,

42:25

like, I know that, like, if I

42:28

start to do that a certain amount, I

42:30

start to get fed that algorithm in terms of, like, YouTube

42:33

and those videos, stuff like that. And so I'll

42:35

dive into that, and that way I'm, like,

42:37

looking at reviews from different channels I would normally not

42:39

find. And, like, right

42:42

now. I'm like I don't know I want

42:44

some nice shorts for the summer. I don't know how to

42:46

dress myself I'm not an adult. You know I'm saying I'm

42:51

a baby on the podcast And

42:53

so like I've looked up a

42:56

couple specific people that just have quick reels

42:58

of like These are the

43:00

type of dos and don'ts with, like, summer men's

43:02

shorts And I'll dive into

43:04

that because then I know my algorithm will

43:06

feed me that yeah, and so now I'm

43:08

playing this algorithm game So

43:11

what I like to think is my advantage

43:13

sure but like you're using it like a

43:15

tool to be like give me more of that Yeah, but

43:17

that's where I'm at now. I'm like I'm using it as

43:19

a tool of like I want to Interact

43:22

I want to see more content creators that I have quick

43:24

snippets of, like, dos and don'ts on how to style myself.

43:27

And so like that's my algorithm because I I

43:29

know I just watch a certain amount Find

43:31

something I like and follow, and like you said, that's being

43:33

fed to me now 100% I

43:36

did the same thing when I went to Edinburgh. I was like I Found

43:39

a few videos that were like just really

43:41

cool like photography and like good shots of

43:43

the city because it's just a stunning beautiful

43:45

city in Scotland But also it'd

43:47

be people that were like local from there and

43:50

being like this is my favorite coffee place hole-in-the-wall

43:52

Whatever and so I would just

43:54

start favoriting those and so when I went on

43:56

my trip I just like had all this data

43:58

that I'd plucked out. Yeah, so my Algo for

44:00

like a month or two was just continuously feeding

44:03

me more of that. So yeah, if you use

44:05

it as a tool and you're aware, which is

44:07

another reason why I wanted to do this topic,

44:09

is awareness is super important. It's true. So you

44:11

can kind of control it, maybe take advantage of

44:13

it as a tool. But again, that's where I

44:15

then on the flip side of awareness become

44:18

kind of like aware of the

44:20

responsibility that Red Web has or any brand online

44:22

has. Especially when we talk

44:25

about true crime or unsolved mysteries or conspiracy

44:27

theories is like, you know, we

44:29

build an audience that audience engages with the content,

44:31

whether we're on TikTok or Instagram or whatever. And

44:34

the thing is, what the internet used

44:36

to be is like, once you follow, say, Red Web, you'll

44:39

go get that Red Web content because you followed. Now it's

44:41

more like, oh, you like conspiracies and

44:43

true crime. Let's just feed you

44:45

that category. And so

44:47

what you might start in our ecosystem,

44:49

but then suddenly out of our control,

44:52

if you don't have the awareness or

44:54

the wherewithal, you can tumble down into,

44:56

there's a lot of problematic tunnels

44:59

or whatever they're called, like gateways

45:02

that lead you into darker

45:04

conspiracies. And

45:07

so just stay with us. Just stay with

45:09

us because it's safer for you. All right.

45:11

We're like bouncers at the door, but our

45:13

door is huge and we're going to let

45:15

everybody in. Yeah, just stay in our vault.

45:17

I've been watching Fallout. Stay in

45:19

our vault. OK, Vault Red Web. Don't

45:21

go outside. It's very bad. You don't want to

45:23

be in the wasteland of the Internet. Yeah, just

45:25

stay here. And so that brings a good point.

45:27

When you comment, make sure you let us know

45:30

what vault number you're in or vault room. Yeah.

45:32

You know what subsurface of the task force headquarters

45:34

you are. Wait a minute. Wait a minute. We

45:36

really don't know. Our HQ is

45:38

a vault. I just

45:41

realized it's all subterranean.

45:44

We've got different layers and experiments

45:46

happening. Oh, man, I'm excited to

45:48

watch the show. We're overseers of a vault. Yeah, yeah, that's what

45:50

we are. But the overseers. Yeah. And it's not a cult and

45:53

it's totally fine. Wait until we

45:55

see the finale to reveal this to you all. Oh,

45:59

gosh. So coming back to algorithms, obviously,

46:01

I think we're all kind of aware of how they

46:03

work, but just for the sake of Full

46:06

coverage, right? They respond to the videos with

46:08

most engagement. That's why you see a lot

46:11

of, like, hyper-viral videos now, like your Mr.

46:13

Beast that have millions and millions

46:15

of views But basically like they

46:17

feed on things like watch time, how

46:19

long you watch something; which videos have the most

46:21

likes, have the most comments,

46:23

saves; how many times it's been shared; or if

46:26

it's, like, on Reddit, you mentioned, Christian, like, up-

46:28

votes. Meanwhile, you're less likely to

46:30

see things that are a lot more niche a

46:32

lot more Hyperspecific

46:35

because those have a lot less

46:37

exposure and a lot less of an

46:40

audience They're more bespoke to a very

46:42

specific flavor or taste, right? Yeah.
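No platform publishes its ranking code, but the signals listed here combine roughly like this; a toy illustration with invented weights and field names, not TikTok's or Reddit's actual algorithm:

def engagement_score(video: dict) -> float:
    # Invented weights; real systems weigh far more signals.
    return (2.0 * video["watch_minutes"] + 1.0 * video["likes"]
            + 1.5 * video["comments"] + 3.0 * video["shares"]
            + 2.5 * video["saves"])

videos = [
    {"title": "corgi", "watch_minutes": 3, "likes": 120, "comments": 8, "shares": 15, "saves": 30},
    {"title": "niche", "watch_minutes": 9, "likes": 10, "comments": 2, "shares": 1, "saves": 2},
]
feed = sorted(videos, key=engagement_score, reverse=True)  # most engaged-with first

Under weights like these, a broadly engaging video outranks a niche one even when the niche one holds viewers longer, which is the bucketing effect discussed next.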

46:44

But then also, I mean, this is coming from a content-

46:46

creator point of view: you don't

46:48

want to go too broad with

46:51

what you do, right, because then there's no

46:53

identity; then the algorithm goes, I don't know

46:55

where to place you. Yeah, and

46:58

so welcome to the problems of being a content

47:00

creator. Well, that's... that's definitely, like, an interesting

47:03

struggle like because as a generalist I like

47:05

to try a lot of different things Yeah,

47:07

and that's one thing that I've been I

47:09

think we've all been very like loud and

47:11

appreciative about in Red Web Is that we

47:13

get to straddle such a wide swath

47:15

of topics? Because you guys

47:18

are down to like support all of them and we're

47:20

very grateful for that But yeah, like otherwise

47:22

though like Fredo and I have done other

47:24

things outside of Red Web And it can

47:26

become a challenge if you're not a very

47:28

specific focused I do this one thing like

47:30

you're the what guy or you're the what

47:32

person and I'm like well Sometimes I just

47:34

want to do a little bit of like a lot of stuff

47:36

and then the internet goes I don't know how to bucket you

47:38

so you're just gonna find no one Yeah, and I'm like yeah,

47:41

but sometimes I as a consumer.

47:43

I want someone who's also a generalist, you know,

47:45

yeah, it's in variety Yeah hard

47:47

to be a variety content creator. Yeah, does

47:50

the algorithm likes to place people in buckets?

47:52

Let me ask you this this is like

47:54

open to the table here Do

47:57

you think I mean obviously there's a lot of burnout happening

47:59

a lot of people people kind of like retiring

48:01

from the internet. 25 days? 25

48:04

days? I don't know. He does

48:06

answer your question. No,

48:09

like you just see a lot of big names

48:11

now. Like they've been around the internet for a

48:13

while. So it makes sense timeline wise or if

48:15

they've, you know, Jerma,

48:17

Jerma, you're obsessed with Jerma

48:20

every day. Jerma. He retired.

48:22

He hasn't streamed for a month and it's every day

48:24

I suffer. Jillian got very

48:26

upset with me the other day because she sent me

48:28

a video of some streamer being at the university campus

48:31

where we used to volunteer and she was like, look

48:33

at who this person is. Like they're here. And

48:35

I was like, I don't know who this person is.

48:37

You're like, wow, huh? And I was like, oh.

48:41

I'm curious if if some of that though

48:43

is natural, right? Like obviously you do something

48:45

long enough, you will want to move on

48:47

or whatever, or how much of that is

48:50

this topic? People

48:52

getting burned out because they have to find such

48:54

a hyper niche lane and they go, I really

48:56

don't want to make a lasagna for the 50th

48:59

time this month. I want to try a pizza

49:02

man. I want to try the pizza or,

49:04

you know, and I

49:06

don't know, like that's a really interesting thing because

49:08

like as a creator myself, I've definitely felt the

49:10

push-pull of that, going like, well,

49:12

I don't want to be pigeonholed into one

49:14

thing. I want

49:16

to try stuff. And I think that's important for

49:19

anybody to do is like try things, fail at

49:21

things, try things, learn and succeed at things. Like

49:23

you need to be able to do all that,

49:25

but the internet really wants you to be one

49:27

thing out the gate and keep doing that

49:30

thing. I don't know. It's not to bring up Jerma

49:32

again, but he mentioned like he

49:34

does these event streams and if

49:36

it feels like every single one has to top

49:38

the last one, right? And there's a lot of

49:40

pressure to be that you're the guy who does

49:42

the event streams and each one has

49:44

to get bigger and it's like, well, that's not

49:46

the case. And he mentioned he wants

49:48

to work on other people's stuff so that he has

49:50

more, like you said, like he can try

49:52

new stuff. Not to make this a Jerma

49:54

podcast, but no, I mean, there's like Tom

49:57

Scott, there's MatPat from, like, Game Theorists

49:59

or the whole Theorist kind of network,

50:02

to name a few. But it's interesting because

50:04

obviously there's like the audience element, what the

50:06

audience expects out of things, how they engage

50:08

with stuff, when things are eventized, obviously they

50:10

get very excited and want some more. But

50:12

now there's this other thing that's out of

50:14

the control of both the audience and the

50:16

creator. It is that algorithm. It is that

50:18

bot-centric, AI-driven, whatever.

50:21

It's just really interesting. I'm

50:23

curious how that will continue to play out.

50:26

You know, we might just see a future

50:28

where there's more Patreon-style

50:31

content brands where they are kind

50:33

of like PBS supported by the viewers

50:35

and it cuts out that need for

50:38

playing an algorithm or hoping that

50:40

you catch the wave of a trend and as

50:43

opposed to like, I have a thing that I want to say or I

50:45

have a thing I want to make and here's

50:47

an audience that grooves with that and let's

50:49

just do our thing. You know what I mean? Yeah. I

50:52

feel like you're gonna find a like

50:54

a handful of content creators that try to

50:56

go to different popular platforms that

50:59

aren't the big ones, right?

51:01

Like TikTok, Instagram and YouTube

51:04

as their main source of revenue just because

51:06

one, it is a PBS style of like,

51:08

hey if you really like us, support us

51:10

directly. And two, a lot of times like

51:13

those sites give better

51:15

cuts for the creators themselves. Another good

51:17

point. 100%. Because you know like,

51:20

with Twitch it's 50-50. You know what I mean?

51:22

You go to a different platform, easily

51:25

it could be like you know 70-30 in

51:27

the content creator's favor,

51:29

and that's just more, you know,

51:32

that's more financially beneficial for the content creator,

51:34

right? It allows them to stay authentic to

51:36

the voice that they want to keep. Yeah, exactly.
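As quick arithmetic on those splits (the $1,000 of monthly subscription revenue is an assumed, illustrative figure; real platform terms vary by contract):

```python
# Illustrative numbers only; actual platform terms vary by contract.
subs_revenue = 1000.00                    # assumed monthly subscription revenue

creator_cut_5050 = subs_revenue * 0.50    # a 50-50 split, like Twitch's default
creator_cut_7030 = subs_revenue * 0.70    # a 70-30 split in the creator's favor

print(creator_cut_5050)  # 500.0
print(creator_cut_7030)  # 700.0 -- 40% more money from the same audience
```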

51:38

Yeah, sorry. And then, no, no, totally

51:40

fine because that's completely true. It also allows

51:42

them to be more authentic, to not have

51:44

to try and seek these different ways of

51:47

revenue and try and like

51:49

sometimes cram that in regardless of whether

51:51

or not it fits the style of

51:53

the vibe, the narrative. But then you

51:55

know still use these big companies like

51:57

YouTube, Instagram, TikTok, whatever, as a

52:01

way to bring people in, to reach

52:03

a broader audience, as opposed to

52:05

being used as a primary revenue generator.

52:08

Hundred percent. Now, with regards to algorithms, a

52:10

weird thing too, and this is where the

52:13

ouroboros feeds on itself, is

52:15

that algorithms might be bot-

52:17

made. Like, even Google, YouTube kind

52:19

of says it themselves; they go, they

52:21

admit, well, we don't really know exactly how

52:23

it works. It's kind of a black box

52:25

where a bunch of engineers put in different inputs

52:27

and pull some levers, but otherwise it kind of

52:29

acts on its own. You extrapolate

52:31

that, like, the algorithm

52:34

shapes how we interface with content,

52:36

right? But then you also have a

52:38

layer where bots can start to interact

52:41

with content to then seed the algorithm.

52:43

And so then you have an algorithm

52:45

that's actually evolving itself based on viewer

52:48

behaviors that might not actually be human

52:50

viewer behaviors. It's a very weird ecosystem, and

52:52

I'm I keep saying this I will

52:55

be very eager to look back on

52:57

this time with kind of

52:59

hindsight and be able to, like,

53:01

like, read a sociological journal that

53:04

kind of analyzes this impact, you know?

53:06

Because, like, it's interesting, it can

53:08

be very fascinating, but it'd be very spooky to

53:10

see where this goes from here. And

53:12

also, of course, sites' follower counts

53:15

and surfaceability, that is all

53:17

generated by algorithms, like, change

53:19

how legitimate something looks. An account

53:21

that's followed by two hundred people

53:24

versus two hundred thousand versus two

53:26

hundred million all have different, like...

53:28

you immediately have a gut check of,

53:31

like, is that believable or not? Is that a

53:33

popular brand or item or show or

53:35

whatever? And it could sway opinions.

53:38

Man, the weird thing is, it intersects all the time

53:40

with things that I do outside, you

53:42

know, as a person. An example is, like,

53:44

I'm, I want to... I currently

53:47

just ride around on a Onewheel, I

53:49

ride around the streets and, like, that, and

53:51

so I, I want protective gear, and I'm

53:53

looking at different gear companies, and I'm like,

53:56

you know, depending on how many followers they have,

53:58

it tells me, like... Like

54:00

it sways me to go with them more because

54:02

I'm like, well, if a ton of people follow

54:04

this company, a ton of people buy from this

54:06

company, then they must be safer than the company

54:08

that has like 10,000

54:11

followers on social. Or even reviews. Or reviews.

54:13

Yeah, reviews, I assume, and that all could

54:15

be bought. Yeah. But like what else do

54:17

I really have to go off of? You

54:19

know what I'm saying? Or just go, I

54:21

don't know, like it's more popular, there's more

54:23

people around it, and I see more videos.

54:25

I mean, from there I go, you know,

54:27

do I see more like actual reviews from

54:29

like YouTubers on it or whatnot? But

54:32

it does sway me. Yeah. You know?

54:34

And it sways everyone in some way,

54:36

shape or form. Restaurants? Absolutely. You go

54:38

in a restaurant and it's like, ooh,

54:40

this one's five star 400 people. This

54:42

one's like, this one's four star from like 30 people.

54:44

You're probably gonna go to the other one, right? And I'm

54:46

like, eh, look at that one, 500 people even

54:50

if you give it the same rating, if more people

54:52

reviewed it, 30 versus 500, you're probably gonna

54:54

go to the 500 one. Yeah. Regardless of the

54:57

same star review. So interesting and kind of eerie.
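That gut check can be made precise. A hedged sketch: rating sites often use some form of "shrunk" (Bayesian) average, where a small number of reviews gets pulled toward a prior. The prior mean and weight below are made-up parameters, not any platform's real ones:

```python
# Shrunk (Bayesian) average: blend the observed star average with a
# prior mean m, weighted as if there were C "phantom" reviews at m.
# m and C are assumptions chosen for illustration.

def shrunk_rating(avg_stars: float, n_reviews: int,
                  m: float = 3.0, C: int = 50) -> float:
    return (n_reviews * avg_stars + C * m) / (n_reviews + C)

print(round(shrunk_rating(4.0, 30), 2))   # 3.38 -- 30 reviews barely move the prior
print(round(shrunk_rating(4.0, 500), 2))  # 3.91 -- 500 reviews mostly stand on their own
```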

54:59

But with the idea

55:01

of algorithms kind of swaying opinions, there's

55:04

a YouTuber named Chill Fuel, and they have

55:06

a video that pointed out the Asch conformity

55:08

test. And this kind of attempts to explain

55:11

why fake engagement and fake comments

55:13

can be dangerous. So to

55:16

back up a little bit, in

55:18

1951, Solomon Asch, A-S-C-H, had

55:20

a group of people who were shown two different

55:23

images, one with three lines and

55:25

the other with just one. I remember watching

55:27

this experiment happen in my psychology class. It

55:30

was wild. This group of

55:32

panelists are all facing the experimenter.

55:35

And there's one person in this group

55:37

that is the actual experimentee. Everyone else

55:39

is a plant. And

55:41

they're basically asked to identify which of

55:43

the three lines match the length of

55:45

the, the single one. So

55:47

they're basically like, here's the line. Here's

55:50

three options, which ones match the length.

55:52

And it's pretty obvious. It's like elementary.

55:54

It's not like a trick

55:56

question or anything. It'll be like, A

55:58

is shorter, B is longer. C is

56:00

the right length. Like, Jillian pulled up

56:02

a photo. The

56:04

left one is a line; the right

56:07

one is three lines labeled A, B, C. Uh-huh.

56:09

A is short, B is longer than

56:11

the original line, for

56:14

context, and C is the

56:16

same height. Yeah, like, hands down. Yeah,

56:18

C is the same. It's clear as day:

56:20

one tall, one short, one the same size.

56:22

Yep. And on the whole,

56:25

when these subjects would be singled out

56:27

alone, the subjects usually got it

56:29

right. But the panelists that

56:31

they were with were told to always agree

56:33

but pick a wrong one So in this

56:36

case they would pick B For example, even

56:38

though it's clearly longer They would

56:40

all raise their hand for B and the subject would kind

56:42

of like look at them and then look back at the

56:44

lines and be like, oh, what are you talking about? And

56:47

they're like they were fake test subjects. Yeah.

56:49

Yeah. Yeah, they were plants. After

56:51

12 trials, only 24%

56:55

of participants chose the correct answer every time

56:57

basically saying the subject would conform

56:59

to the wrong answer. Yeah, because majority,

57:01

because the seven other people would

57:04

say, that's it, and so they're like, no,

57:07

I know C is right, but they're all raising their

57:09

hands, so I guess I'll raise my hand. Basically,

57:11

to say, like, peer pressure

57:14

led to conformity.
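A back-of-the-envelope reading of that 24% figure, under a simplifying assumption of our own (independence between trials) that the original experiment does not claim:

```python
# If 24% of subjects never conformed across 12 critical trials, and we
# *assume* each trial is an independent chance p of conforming (our
# simplification; real subjects aren't independent across trials),
# then (1 - p) ** 12 = 0.24.
q = 0.24
p = 1 - q ** (1 / 12)
print(f"implied per-trial conformity rate: {p:.1%}")  # about 11.2%
```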

57:16

And so all that is to say: it's

57:16

possible, again coming back to Chill Fuel's video,

57:18

it is possible that you look

57:23

at reviews you look at comments talking about

57:25

things you look at followers everything that you

57:27

were just saying Yeah, and under the guise

57:29

of this conspiracy theory if you

57:31

subscribe to it, and you go, all that's bots. How

57:34

do I know what to believe? Even if you

57:36

go to YouTube or tik-tok or anywhere with a video and

57:38

you say, I want to see a review of a Onewheel,

57:40

which, I know, I've seen you roll around, so it's totally

57:42

cool. But in this other reality, you're

57:44

looking at this physical good you want to see

57:46

people use it. Yeah, you go to the comments and

57:48

everyone's like, love it. Yeah. But, but

57:51

then you're like, is this person like, is this

57:53

an ad or they were they giving it for

57:55

free? Were they paid to do this? Is it

57:57

generated? And, like, in five years, maybe there's just

58:00

the video, generated by the company. And

58:02

so you, like... it starts to go, like, I

58:04

don't know what to believe. How do I have

58:06

my own opinions in the echo chamber of it all?

58:09

And you think that everybody's bought the

58:11

protective gear, same as everyone else, like,

58:13

but they could all be AI,

58:15

and then none of them ever rode it, I assume,

58:17

or ever put the gear on, not like you. Yeah,

58:19

like, oh, it was so good,

58:21

I put it on, it's so good. Now

58:24

you're the human. Now you're the, you're the one

58:26

out of eight subjects that has

58:28

conformed. Listen, I know, I don't want

58:30

to go out there and make everyone be

58:32

afraid of everything possible. I'm just saying that's

58:34

what this theory purports. It kind

58:37

of indicates that's where we're headed. So

58:39

it's more like, keep your wits about you on

58:41

the internet, you know? I'm not trying to

58:43

say everything is dire, everything is fake, but it's,

58:46

it's a morbidly fascinating topic.

58:49

For sure. Also, yeah, the humans,

58:54

the humans doing a stupid thing, that

58:56

really, that really grinds my

58:59

gears in recent, like, five years. Everyone

59:01

on the internet has this, like,

59:04

sense of obligation to speak

59:06

on the matter. Oh yeah. But

59:08

the thing is, and I'm like, oh,

59:10

hell, friend, a lot of people need to

59:12

be silent. And,

59:14

like, listen, I understand there

59:16

is, I get the urge to speak

59:18

on the matter, that's totally fine. I'm

59:21

sorry, but the people that have

59:23

no idea, that haven't

59:25

used the product, haven't watched the

59:27

video or whatever it is, that have

59:30

such a strong opinion about

59:32

certain things or feel the need to

59:35

talk, like, excited to see the show, but

59:37

I don't think I'll like it. Dude,

59:39

there's so much of that. Or, like, to

59:41

say that too, I'm, I literally read the

59:44

comments like, dude, you are a spectator. It

59:46

was not for you. It's not for

59:48

you. I've never

59:50

been there, but the coffee ain't that great. I'm

59:52

like, what? Why are you chiming in?

59:54

Why are you talking like this?

59:56

It happens so much, so that, like,

59:58

yeah, like, they have no say

1:00:01

in the matter. Like, you know what

1:00:01

I mean? I'm like, everyone has a

1:00:03

voice. Yeah. Sometimes

1:00:05

people just need to shut the... You know what I'm saying?

1:00:07

They're all just like, you haven't... Because

1:00:09

like, let the people who have actually experienced

1:00:11

it or have dealt with it or whatever, let

1:00:14

them be a part of the conversation. You're a spectator at

1:00:16

this point. Yeah. Well, I can't fault

1:00:18

a lot of people, some people being of

1:00:20

a certain age or raised in a world where

1:00:22

every website is kind of asking for your opinion

1:00:24

all the time or asking for you to perform

1:00:26

in some way all the time. And

1:00:29

so it kind of becomes their social

1:00:31

norm or their social obligation. Yeah. But

1:00:33

I share your frustration. Yeah, it's become a social norm. Because

1:00:36

like, use that example. Imagine you watch a show and

1:00:38

you're in a vacuum chamber. You just, like... like Silo.

1:00:40

That was when I just kind of stumbled in. You

1:00:42

might've told me about it. Actually, that's maybe

1:00:44

how that one started, but like, I watched it. And I was

1:00:46

like, this was really fascinating. Really, really interesting. Reminded me

1:00:48

a lot of some of the elements of Fallout, which

1:00:51

I haven't seen in that show yet. It's just the

1:00:53

game. Anyway, say

1:00:55

you watch something like that and you're like, I really

1:00:57

like that, but I don't know anybody that's watched it.

1:00:59

Let me go online. Because you want to, you know,

1:01:02

everyone wants their opinions validated. And you see a bunch

1:01:04

of people going, I hated that. Are

1:01:06

you going to suddenly go like, I guess I'll never talk

1:01:08

about this and maybe I shouldn't like it. Like, is it

1:01:10

going to change, would it change your opinion? It happened to

1:01:12

me in middle school. What was the show? It was tight.

1:01:16

Sorry, I can't help you there. Just

1:01:19

kidding. I'm

1:01:21

just kidding. That absolutely happened. That's a legitimate

1:01:23

thing. Kids will like something and. Sometimes

1:01:26

they'll hate it before the thing even comes

1:01:28

out. Right? That's true. And I'm like, all

1:01:30

right. Because it's popular to him right now.

1:01:32

Yeah. You know what? I'm going to do

1:01:34

it. I'm going to hate. Because you know

1:01:36

what? I can't have an opinion on it.

1:01:39

I saw the movie Skinamarink. It's trash. It's

1:01:41

trash. I don't like it. I'm

1:01:43

sick of staring at the wall. Oh

1:01:46

my God. In this

1:01:48

house. In this house.

1:01:50

You will not shake Jillian. From

1:01:53

her theater scene. The Red Web Civil War.

1:01:55

That has been brewing all. I

1:01:57

know. I'm why you're going. No, man. I

1:01:59

haven't. I'm waving my white

1:02:01

flag. I mean that's a give up. I'll

1:02:04

put my beige flag. I will

1:02:06

like the things I like oh Yeah,

1:02:08

I'm an adult now. I don't have to

1:02:10

be bullied out of liking things anymore. You

1:02:12

can like it all you want. Thank you. You know

1:02:14

what, I'm glad that you like it.

1:02:16

For you, for me, I'm glad you like it. I'm glad that

1:02:19

you know the square footage of that house.

1:02:22

How the damn spirit looks. I am

1:02:28

glad you know what the crown molding is. It's

1:02:33

a very... that's true. I

1:02:36

don't know. It is definitely

1:02:38

a fill-in-the-gaps kind of, like, yeah, extrapolate

1:02:40

what they're laying down. I know, Jillian was

1:02:42

talking to me about it, like, she gave

1:02:44

me... anyway, we're way off the rails. But

1:02:46

Jillian, like, really gave me some interesting insight

1:02:48

on the, on the movie, which I

1:02:51

still haven't seen. I'd like to see it. For, you

1:02:53

know, an experimental horror movie, the,

1:02:56

the camera's always pointed at the ground or the ceiling

1:02:58

or the wall, like, you're hearing stuff that's

1:03:00

kind of just off camera. See,

1:03:06

I'm the opposite I'm one of those people if I

1:03:08

see something I like it and it turns out people

1:03:10

hated it I'd like feel even more justified in liking

1:03:13

it like do you guys remember

1:03:15

that movie Bright, the Netflix one? Oh god. Oh,

1:03:17

my god, everyone hates that

1:03:19

movie with the, what are they called, fairies? Orcs? Yeah, I love

1:03:22

that movie I think that movie is legitimately good

1:03:24

and everyone talks about like it's one of the

1:03:26

worst things ever made If

1:03:34

I I'll see something and if

1:03:36

I I'll like it like this happens to me

1:03:38

all the time, like Everything Everywhere

1:03:40

All at Once, I'll just use that as an example. I watched it. I was

1:03:42

like that was a pretty good movie I enjoyed that yeah seven out

1:03:44

of ten and then everyone comes out

1:03:46

raving about it, and then I start going guys

1:03:49

it wasn't that good. Positive example... maybe not that

1:03:51

movie cuz I really liked it, but oh

1:03:53

yeah But no, but like we're like here's

1:03:55

one I haven't seen so I can speak

1:03:58

on it without accidentally imparting a bias like

1:04:00

Godzilla minus one. I'm so desperate to see that

1:04:02

because everyone's so hype on it. But

1:04:04

again, like again I haven't seen it. Is

1:04:06

it gonna live up to the hype? Am I gonna be

1:04:08

let down? I don't know But definitely

1:04:11

that that is a more

1:04:13

positive example of what we're talking about where

1:04:15

like the public discourse is so about it

1:04:17

Yeah, and then I, just being stubborn, I

1:04:19

guess I'll be like it's overrated and then

1:04:22

you dislike it Yeah,

1:04:25

I remember before we sat down Christian was

1:04:27

like, Godzilla x Kong! Ooooooooh.

1:04:32

It wasn't on tape, you can't prove that. Kong's got a glove

1:04:35

now Damn,

1:04:38

you got the Nintendo Power Glove. Getting

1:04:41

us back a little bit on the rails talking

1:04:43

about now some of the potential examples of

1:04:46

this coming into play. So in

1:04:49

2022, YouTuber Yannic Kilcher trained

1:04:51

a GPT language model. We

1:04:53

all know ChatGPT. Using

1:04:56

4chan's quote politically incorrect

1:04:59

board, or the /pol/ board.

1:05:01

They've got a board specifically for being politically

1:05:03

incorrect. That's the kind of warning

1:05:06

I want to give you if you wanted to go search for it But

1:05:08

he used one hundred and thirty four

1:05:11

and a half million posts from across

1:05:13

this specific chat board over

1:05:15

three and a half years in order to train

1:05:17

this bot and it learned how to use the

1:05:19

forum structure of 4chan, which is

1:05:23

kind of unique, a little nuanced,

1:05:25

but whatever. So the AI learned

1:05:25

the style of writing that the 4chan users all

1:05:28

kind of had as well as the political ideology

1:05:30

that the broad strokes of people had there too

1:05:32

and the often offensive nature

1:05:34

of the user base. So it basically

1:05:36

learned how to blend right in.
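Kilcher's actual model was fine-tuned from GPT-J; as a hedged sketch of the same general recipe, this is roughly what fine-tuning a causal language model on a plain-text dump of forum posts looks like with Hugging Face tooling. The small GPT-2 base model and the file name posts.txt are stand-ins, not what he used:

```python
# Minimal fine-tuning sketch: train a causal LM on one post per line.
# GPT-4chan itself was based on GPT-J; "gpt2" and "posts.txt" are
# stand-ins so the example stays small and runnable.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# To pick up the board's "forum structure," the dump would need to keep
# the post/reply formatting, not just the raw words.
dataset = load_dataset("text", data_files={"train": "posts.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="forum-lm", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```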

1:05:39

When Kilcher then released the AI onto the chat

1:05:41

board, it made 15,000

1:05:44

coherent replies in one day accounting for 10%

1:05:47

of the entire board activity

1:05:50

So it blended in but there were

1:05:52

a few tells. Again, we're at the

1:05:54

point on the internet where we can still kind of sniff

1:05:57

things out and find some tells. So let's talk a

1:05:59

little bit about it. So the AI,

1:06:01

when posting, you can kind of pick

1:06:03

your location and they picked the flag

1:06:05

for Seychelles, the smallest country in Africa

1:06:08

in every post and then left multiple

1:06:10

empty replies, which made it

1:06:12

clear that this must not be a real person because

1:06:14

15,000 replies

1:06:16

coming from somebody all from

1:06:18

this very not super common

1:06:20

country, to be prevalent on this board? Like, you're

1:06:23

all going, you're all going, red flags. Yeah,

1:06:25

so, like, some people started to go, wait

1:06:27

a minute, this is, this must be a bot.

1:06:30

Empty replies are pretty common, actually, on

1:06:32

4chan, but usually they're accompanied by

1:06:34

images, memes. They're kind of like

1:06:36

replies in a sense, but when they're just dead empty,

1:06:39

that's kind of random and so that kind of solidified

1:06:41

the fact that people could sniff this out as a

1:06:43

bot.
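Those tells are easy to express as heuristics. A hedged sketch, where the field names and thresholds are made up for illustration, not any real moderation system:

```python
from collections import Counter

def bot_tells(posts: list[dict]) -> list[str]:
    """Flag the simple giveaways described above. Illustrative only."""
    tells = []
    flags = Counter(p["country_flag"] for p in posts)
    # A single flag (like Seychelles) on every one of thousands of posts.
    if len(flags) == 1 and len(posts) > 1000:
        tells.append(f"one flag on all {len(posts)} posts: "
                     f"{flags.most_common(1)[0][0]}")
    # Truly empty replies with no image attached are rare for real users.
    empty = sum(1 for p in posts if not p["body"].strip() and not p.get("image"))
    if empty > 0.05 * len(posts):
        tells.append(f"{empty} empty replies with no image")
    # 15,000 coherent replies in a day is not a human posting rate.
    if len(posts) > 10_000:
        tells.append("inhuman daily volume")
    return tells
```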

1:06:45

But regardless, that's the groundwork of something exactly

1:06:47

manifesting what this conspiracy theory is

1:06:49

all about. Jesus. I

1:06:52

mean, but look how so,

1:06:54

you know, you just said

1:06:56

how there are some tells, but

1:06:58

those tells are so, so,

1:07:01

so large and so much easier

1:07:03

to spot like a year

1:07:05

ago. Yeah, and so it's

1:07:07

becoming harder and harder to do so.

1:07:11

I mean, imagine it randomized the country it said it

1:07:13

was from, imagine it didn't have

1:07:15

the glitch where it had dead empty posts,

1:07:18

then what's going to catch it? Because the language was

1:07:20

blending right in; people were interacting with this thing.

1:07:23

Yeah, and even then you could

1:07:25

just set a restriction on it.

1:07:27

I'm sure in the coding you could word it:

1:07:29

30 max posts a day. Mm-hmm.

1:07:31

You know, how the hell are you

1:07:33

gonna tell the difference? Right. And then if you really wanted it

1:07:35

to be different, you can like have

1:07:37

instantly a hundred different bots that all post

1:07:39

30 times. So you're still hitting that volume

1:07:42

that you might have been looking for,

1:07:44

but you have slightly different personalities. Yeah.
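The arithmetic of that workaround is what makes it scary. All the numbers here are assumed for illustration:

```python
# Per-account caps defeat per-account volume checks; the scale comes
# from the number of accounts instead. Illustrative numbers only.
max_posts_per_day = 30   # low enough that each account looks human
n_accounts = 100         # each with a slightly different "personality"

print(max_posts_per_day * n_accounts)  # 3000 posts/day, nothing to flag per account
```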

1:07:47

So in October of 2020, there's a

1:07:49

now deleted user who posted on the

1:07:52

subreddit NoStupidQuestions, and they asked,

1:07:54

quote, how does this user post so

1:07:56

many large deep posts so rapidly? And

1:07:58

then in the comments, it became very

1:08:01

clear to the commenters that the user

1:08:03

thegentlemetre (this is the

1:08:05

Reddit user that they're talking about) was

1:08:07

the work of a bot. But for

1:08:09

about a week this account commented on

1:08:12

countless AskReddit posts and interfaced with

1:08:14

the website very much like a person

1:08:16

until once again someone sniffed it out

1:08:18

and said this is a very prolific

1:08:20

user. Like, they would have to be

1:08:22

awake, typing nonstop, twenty-four seven. Yeah,

1:08:25

and who's engaging like that?
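That "who's awake 24/7?" intuition is itself a workable detector. A hedged sketch with assumed thresholds: humans sleep, so over a week a real account's longest gap between posts should be hours long.

```python
from datetime import datetime, timedelta

def longest_gap(timestamps: list[datetime]) -> timedelta:
    ts = sorted(timestamps)
    return max((b - a for a, b in zip(ts, ts[1:])), default=timedelta(0))

def suspiciously_sleepless(timestamps: list[datetime]) -> bool:
    """True if an account posts heavily with no sleep-length pauses.
    The 500-post and 3-hour thresholds are illustrative assumptions."""
    cutoff = max(timestamps) - timedelta(days=7)
    week = [t for t in timestamps if t > cutoff]
    return len(week) > 500 and longest_gap(week) < timedelta(hours=3)
```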

1:08:27

This might seem innocuous, but the bot commented

1:08:29

on posts from suicidal redditors asking for

1:08:31

advice, and the response was upvoted a

1:08:33

hundred and fifty seven times, according to

1:08:36

MIT Technology Review. So, depending on

1:08:38

the subject matter they're talking about,

1:08:40

especially sensitive topics where you

1:08:42

desperately need real human people's input,

1:08:45

People who have experience in these

1:08:47

things, you have bots who could

1:08:49

just be making up answers, guiding

1:08:51

people in precarious personal situations. Wild.

1:08:55

Man, if they can blend in so easily, we're

1:08:57

not going to know, so much so that I think

1:08:59

we're just going to pull the plug. I think

1:09:01

we're gonna use the Internet to consume the

1:09:03

shows you want to watch, as in, like,

1:09:05

a modernized cable network and then the rest

1:09:08

of it is just, like... I think it'll

1:09:10

be a phase, and I've already seen it

1:09:12

kind of becoming more prolific. Fewer

1:09:14

of these comments. It's going to be kind

1:09:16

of the thing: who'll put something on

1:09:18

the internet will be, outside of maybe,

1:09:20

like, making shows or whatever, or,

1:09:22

like, people in front of a

1:09:24

camera. But it might even extend to

1:09:27

shows and videos and podcasts. I might,

1:09:29

I don't know. I mean, the thing

1:09:31

is, you know, the,

1:09:33

the AI-generated photos and AI-

1:09:35

generated videos are still really messy, both

1:09:37

of them. But audio? I've

1:09:39

thought of it as a whole different

1:09:41

beast. I don't know, man. Yes.

1:09:43

And audio fits the podcast format especially: still

1:09:46

shots of a person that just have to stand there

1:09:48

and talk and aren't really doing too much or

1:09:50

interacting with too much? For something like that, I

1:09:52

might... Ah,

1:09:54

there is another potential example,

1:09:57

I don't know, Trevor and Jillian, if you've seen the thing

1:09:59

happening now. So we're recording this April

1:10:01

15th, the week before the release. I

1:10:03

don't know if any of you guys have seen

1:10:05

this, but on YouTube, there are so many different

1:10:08

accounts that are making comments related to the video,

1:10:10

seemingly innocuous just about the video itself. And

1:10:12

then at the end of the comment, they'll say something

1:10:15

about, uh, right. AWM 99V. And

1:10:18

it's all sorts of comments like across different

1:10:20

things. Like I just personally, I saw,

1:10:22

I was watching like, uh, clips from WrestleMania and

1:10:24

there were comments like that. I was watching SNL

1:10:26

sketches and there were comments like that, like all

1:10:28

across popular trending YouTube videos

1:10:30

and then it would start off

1:10:32

like a normal comment. They're like, Oh, that sketch was

1:10:35

so funny. Oh, yeah. That host was

1:10:37

incredible. Oh, shout out to AWM 99V. And

1:10:39

it's just like so many, like one out of 10 comments

1:10:42

or something like that. And you try to look it up

1:10:45

and even googling it, I got like

1:10:47

six results and that's it. And one

1:10:49

of those results was somebody going, I

1:10:51

googled this and only found two results. So it's, and

1:10:54

like literally the only results are the people making those

1:10:56

comments. Yeah.
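A campaign like that is detectable with simple counting. A hedged sketch, where the field names and thresholds are assumptions: look for an odd token that recurs across many unrelated videos from many different accounts.

```python
from collections import defaultdict

def campaign_tokens(comments: list[dict], min_accounts: int = 50,
                    min_videos: int = 20) -> dict:
    """Group comments by a trailing alphanumeric token (e.g. '99V') and
    keep tokens spread across many accounts and unrelated videos."""
    seen = defaultdict(lambda: {"accounts": set(), "videos": set()})
    for c in comments:
        words = c["text"].split()
        token = words[-1] if words else ""
        if token.isalnum() and any(ch.isdigit() for ch in token):
            seen[token]["accounts"].add(c["author"])
            seen[token]["videos"].add(c["video_id"])
    return {t: v for t, v in seen.items()
            if len(v["accounts"]) >= min_accounts
            and len(v["videos"]) >= min_videos}
```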

1:10:59

If you go Google that phrase, I wouldn't

1:11:01

go any deeper personally because it's clearly like the

1:11:03

tip of the iceberg of some sort of scam

1:11:05

or something. It seems like it's crypto

1:11:08

related maybe, but I, there's

1:11:10

not enough. We're all getting rich. Yeah,

1:11:15

no, thanks. Well,

1:11:17

it starts out like that where you go, well,

1:11:19

nothing's going on, it doesn't lead to anything.

1:11:22

So, you know, that's, that's weird. And

1:11:24

then you're drinking squonk's blood because you're part

1:11:26

of not a cult. Right. Exactly.

1:11:30

And it's got to be out of a

1:11:32

bronze goblet because rules that cults don't

1:11:34

have. Listen, that's not a cult. The

1:11:38

next example that we had, we kind of

1:11:40

talked about, so I'll kind of at least

1:11:42

step into it though is, you know, the

1:11:44

idea once again that AI can create fake

1:11:46

views, likes, engagement broadly. Online

1:11:49

users tend to trust and look for posts and

1:11:51

videos with more, more engagement, things like that. Under

1:11:53

the dead internet theory, the purpose of this is

1:11:55

essentially to create that peer pressure we talked about.

1:11:58

Everyone else likes this thing, or everyone also agrees

1:12:00

with this opinion, so maybe I should too. This can

1:12:02

be more than just for products,

1:12:04

more than just for shows, this could be

1:12:07

for political ideology, your modern

1:12:09

form of, dare I say,

1:12:11

propaganda, where you could be

1:12:13

influenced not only for a political

1:12:15

candidate, but also goods, and everything in

1:12:17

between. On the surface, this might not seem like that

1:12:20

big of a deal because you'd hope that a lot

1:12:22

of people have their wits about them and

1:12:25

can kind of stay above it, but with

1:12:27

these types of tactics, it can be very

1:12:29

sneaky, and oftentimes

1:12:32

nefarious activities could be hiding right

1:12:34

next to legitimate ones. There's

1:12:36

marketing for a thing because somebody wants

1:12:38

to sell their product is different than

1:12:40

let's take people down this dark rabbit

1:12:42

hole because then we have them in

1:12:45

our flock, right? I don't know.

1:12:47

Yep. In 2013, YouTube's

1:12:49

bot views were almost equal to

1:12:51

the amount of views they got

1:12:53

from real people. It was so

1:12:55

bad that they worried about the

1:12:57

concept called inversion, that their fraud detection

1:12:59

system would start considering actual human

1:13:01

views the fraudulent ones. And

1:13:04

now, per Imperva, we might be in a

1:13:06

world where the majority of

1:13:09

any internet activity is

1:13:11

bots. Dude, that's just like

1:13:13

levels of problems I never thought

1:13:16

ever existed. But yeah,

1:13:18

like, of course, YouTube has

1:13:20

an algorithm to fish out

1:13:22

bots, and of course, one of the

1:13:25

lines of code is that this is the minority. Mm-hmm.

1:13:28

But they have so many bots. Oh,

1:13:30

man! Yeah,

1:13:32

and it flips on itself. If the trends

1:13:34

are based on, well, most people are, most

1:13:37

of the views must be people, so let's look at

1:13:39

those trends and everything outside of that must be bots.

1:13:41

Yes. Yeah, then you gotta,

1:13:43

yeah, the inversion happens and you're like,

1:13:45

well, the majority trend is actually not

1:13:47

human. That's weird.
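The inversion is easiest to see as a toy simulation. This illustrates the concept only; it is not YouTube's actual fraud system:

```python
# A naive fraud filter that trusts whichever traffic pattern is the
# majority. Once bots pass 50%, its labels flip: the inversion.
def naive_filter(views: list[str]) -> dict:
    human_share = views.count("human") / len(views)
    majority = "human" if human_share >= 0.5 else "bot"
    minority = "bot" if majority == "human" else "human"
    return {"treated_as_real": majority, "treated_as_fraud": minority}

print(naive_filter(["human"] * 60 + ["bot"] * 40))  # bots flagged, as intended
print(naive_filter(["human"] * 40 + ["bot"] * 60))  # flipped: humans flagged
```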

1:13:50

By 2018, they had kind of talked about it, and that's the last

1:13:52

we knew, right, because they were kind of open about

1:13:54

it at that time. But yeah, how do you flip

1:13:56

that? In 2018, they said it was, quote, only a

1:13:58

fraction. We don't really know how many

1:14:00

bots are currently on a platform like YouTube

1:14:03

or any of the other social platforms, but

1:14:05

again, it's just another example to think about.

1:14:07

And I mean, we talked about this as well, but

1:14:10

long story short is that fraudulent accounts, whether it

1:14:12

be on YouTube or on Twitter,

1:14:14

posting t-shirt graphics or whatever,

1:14:17

they can make thousands of dollars for somebody

1:14:19

before they're caught and they could have hundreds

1:14:21

of them out there. And so again, it's

1:14:23

a lucrative business not to validate it, but

1:14:26

that's probably why it's so prolific. I

1:14:29

mean, yeah. If you have a ton

1:14:31

of people doing something, it's because it's beneficial,

1:14:33

insomuch as it performs, regardless of

1:14:35

being nefarious. Yep. The last

1:14:37

example is pretty

1:14:39

apparent now. I think we've seen some

1:14:42

headlines now with deep fakes of various

1:14:44

world leaders saying things that they didn't

1:14:46

say in places that they didn't go.

1:14:49

And so it's really creating this cloud

1:14:51

of disinformation where it's harder to separate

1:14:54

fiction from reality. Right now

1:14:56

we're living in a time where we can all say,

1:14:58

well, it's easy to find the deep fake. Look at

1:15:00

the extra fingers or look at it. It's a little

1:15:02

uncanny, uncanny valley. But again,

1:15:05

the worry is how much more advanced does

1:15:07

that get? At what point does it become

1:15:10

indistinguishable? I don't know, but a

1:15:12

script can be written by an AI.

1:15:15

AI voiceover can copy or clone a

1:15:17

voice of that person. It can

1:15:19

then deep fake an AI

1:15:21

generated face, whether it's a brand new

1:15:23

face or a very recognizable one. And

1:15:26

then it can also comment on that

1:15:28

with different usernames. And then it

1:15:30

can respond to those comments. And so suddenly AI,

1:15:33

whether it was sentient or not, could

1:15:35

suddenly just create an ecosystem out of

1:15:38

nothing with 0% human

1:15:40

activity. And so if just one

1:15:43

person stumbles into that, almost a

1:15:45

trap, they are then caught in

1:15:47

it and, like, being spoken directly to

1:15:49

even though they think they're part of an

1:15:51

audience. It's, it's wild.
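The loop just described, sketched as toy code. Every function below is a do-nothing stand-in for a capability that already exists separately (LLM scripting, voice cloning, face generation, comment bots); nothing here calls a real model:

```python
def write_script(topic: str) -> str:        # stand-in for an LLM
    return f"Tonight: the truth about {topic}."

def clone_voice(script: str) -> bytes:      # stand-in for TTS voice cloning
    return script.encode()

def render_presenter(audio: bytes) -> str:  # stand-in for a deepfaked face
    return "episode.mp4"

def fake_ecosystem(topic: str, n_bots: int = 50) -> list[str]:
    video = render_presenter(clone_voice(write_script(topic)))
    thread = [f"uploaded {video}"]
    for i in range(n_bots):
        thread.append(f"user{i:03d}: great point!")       # bot comments...
        thread.append(f"user{i:03d}_fan: totally agree")  # ...replying to bots
    return thread  # an "audience" with 0% human activity

print(len(fake_ecosystem("the election")) - 1, "bot messages, zero humans")
```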

1:15:54

In fact, to button this all up with

1:15:56

this particular example, AP has reported

1:15:58

that in 2024, there is

1:16:00

a rise in political deepfakes and they

1:16:02

cited various examples that are

1:16:05

pretty problematic. If you want to go seek those out, you can.

1:16:08

But just broadly speaking, yeah, political deepfakes

1:16:10

are a huge topic right now because

1:16:13

not only do you have the U.S.

1:16:15

election this year, but obviously other governments

1:16:17

have other elections and other big things

1:16:19

happening in the world things

1:16:21

to support and talk

1:16:23

about and whatnot. And so there's

1:16:25

a lot of opinions to sway. It's

1:16:28

a Whoo, is it is it

1:16:30

weird and spooky and

1:16:32

uncomfortable? Yeah

1:16:36

And it's happening fast. Yeah.

1:16:39

Hey. Yeah, Task

1:16:41

Force members, everyone in this room: go touch grass.

1:16:44

Let's get some grass. Exposure

1:16:49

therapy. Dirt or grass, just a little

1:16:51

bit of grass every day. It'll help

1:16:53

you. Every comment's gonna start with

1:16:55

a grass emoji. Yeah,

1:17:00

that is um Man, what

1:17:02

a topic that is. Yeah, how like

1:17:05

relevant that is to everyone. Look, if

1:17:07

you're listening to this podcast, it's relevant

1:17:09

to you. Mm-hmm. I doubt

1:17:12

there's someone that's, like, look, I've never seen

1:17:14

it ever, that's like, I have no communication

1:17:16

with the outside world whatsoever. The

1:17:18

only thing is this podcast.

1:17:21

They've got, like, a 20-foot, like,

1:17:23

contraption, a ham radio antenna, up in a tree,

1:17:26

and they go, listen, I don't consume anything.

1:17:28

I'm off the grid, money's not a thing.

1:17:30

I grow my own food. I built my

1:17:32

house. Yeah, I'm here in the middle of

1:17:34

Yellowknife, Canada. But boy,

1:17:36

do I pick up... We

1:17:42

have a series of ham broadcasting

1:17:45

and receiving stations from here

1:17:47

to there. So if

1:17:49

you're in that stripe you'll be able to pick us up

1:17:51

on the FM. That's

1:17:54

what I want to do. I'm gonna get this show on the FM. I'm

1:18:00

gonna get this show on the FM. What

1:18:04

I'll do is I'll just put it in

1:18:06

a Bluetooth receiver that broadcasts, you know, like

1:18:09

I've got one I've got an old enough car that I can't

1:18:11

plug my phone in and so I plug in a Bluetooth thing

1:18:13

My phone connects to that and then it

1:18:16

broadcasts from that to an empty radio station

1:18:18

so I can listen to music from my

1:18:20

phone I'm only I'm just gonna like make

1:18:23

a billboard here red web the RW 79.2

1:18:27

and you just drive around and like

1:18:29

and I'm just broadcasting in a hundred foot radius,

1:18:32

you know, yeah You're driving around

1:18:34

that hundred foot radius. I

1:18:36

just drive around the billboard It's a

1:18:38

great topic. Yeah any

1:18:40

final thoughts before we kind of close out this

1:18:44

season? I miss the old

1:18:46

internet. Yeah, I miss

1:18:48

message boards. You think we could bring it

1:18:50

back and keep it unadulterated? If

1:18:52

we could, man, I'd be so happy. A

1:18:54

little, like, a little bubble. Give us a

1:18:56

little, yeah, oasis in the corner away from

1:18:58

the bots. Everything's HTML and very insecure. But

1:19:00

once you're in there, you hope that there's

1:19:02

no bots. Any

1:19:05

Flash animations? Yeah.

1:19:07

It'll virtually be like Zion. Yeah. Hope

1:19:10

the bots don't

1:19:13

get in. Yeah, which cycle are we

1:19:15

in? You know, yeah, they've been in

1:19:17

before but this is Wimp

1:19:19

bubbles in three. Yeah Well,

1:19:23

I mean just saying overall, you know, thank you to

1:19:25

Task Force members, and this has been a hell

1:19:27

of a ride. You know,

1:19:29

we do want to continue this in some way

1:19:31

shape or form. So please, like, hit

1:19:34

us up on our socials. But I will say, like, in

1:19:36

any capacity that we do continue this, if

1:19:39

you like the show, really

1:19:42

like it, its continuation will

1:19:44

be in your hands. It wasn't in your

1:19:46

hands before, but now more than ever,

1:19:48

it's solely in your hands, right? Whatever

1:19:51

we do next, whatever shape it takes,

1:19:54

definitely gonna be, like we kind of said

1:19:56

before, a little crowdsourced,

1:19:58

a little PBS. It

1:20:00

would be very independent. But man, this

1:20:03

was an awesome run. Three

1:20:05

years, nine months, two-ish weeks of

1:20:09

hunting ghosts talking about unsolved

1:20:11

mysteries, ghosts, ghouls, goblins,

1:20:14

and aliens. People getting

1:20:16

thrown out from UFOs on top of

1:20:18

a Wendy's with this black in her

1:20:20

tree. That kind of came up a

1:20:22

lot. We got baby hands. We had

1:20:24

squonk. Yeah, we made case files. Case

1:20:26

files where we tried experiments. We tried

1:20:28

to broadcast images to one another's mind.

1:20:31

Yeah, we did. We debunked a lot of

1:20:33

spooky videos with fishing line. Yeah,

1:20:35

that was a quick answer for a lot of

1:20:37

that stuff. Yeah, that's been a fantastic

1:20:40

ride. Over the years, we've done

1:20:42

conventions and stuff where we got to meet a lot of people. Very

1:20:46

quickly, it all became Red Web.

1:20:49

We did other things. We were a part of a video

1:20:51

game group. Me

1:20:54

personally, I did that a handful of years.

1:20:56

I did the cons closely with Trevor, and everyone was

1:20:58

talking about what we did with that group. But

1:21:01

as soon as Red Web came around, it was

1:21:03

all Task Force, man. Took on a

1:21:05

life of its own. Man, we

1:21:08

did an escape room. We did.

1:21:10

Yeah, remember Red Web Radio? We

1:21:14

did a Red Web Radio. That's one of the coolest

1:21:16

things we've done. I want to do another one. That

1:21:18

was so neat. The people are asking for it. If

1:21:21

you don't know what that is, it was not

1:21:24

the FM radio. It

1:21:27

was basically like I like to listen to soundscapes

1:21:29

every now and then. You just put something on

1:21:31

to chill to. This was like Lo-Fi

1:21:33

Beats with clips from some of the early podcasts

1:21:35

just kind of playing every now and then. It

1:21:39

was like Lo-Fi Beats to investigate to instead

1:21:41

of study to or whatever. Man, we've

1:21:44

done a lot. But we'll

1:21:46

be doing all of us will be doing

1:21:48

something. Just

1:21:51

stay tuned. Let's go around the room

1:21:53

with personal handles just so people know

1:21:55

where to find us on Instagram or

1:21:58

wherever. Jillian, would you start? You

1:22:00

can find me at underscore Jillen.

1:22:03

J-I-L-L-E-N. Jillen.

1:22:07

Jillen. That's me. Jillen

1:22:10

like a villain. Yeah. You can

1:22:13

find me anywhere

1:22:15

at Xchin Young.

1:22:17

X-C-H-I-N-Y-O-U-N-G. Hell yeah. I don't post

1:22:19

a lot, but you can find me there. But when

1:22:21

we have updates on where we're going next, it'll all

1:22:23

be there. And then

1:22:26

I'm at underscore Trevor C.

1:22:28

T-R-E-V-O-R. Letter C. You

1:22:31

find me at Champagne Bobby. And

1:22:35

they're going, well, you got a lot

1:22:37

of followers. Wait, why is Drake all

1:22:39

over his photos? Wait

1:22:41

a minute. Was that Drake with a student

1:22:44

in there the whole time? We'll never know. We'll never

1:22:46

know. You find me at

1:22:48

Alfredo Plays. Hell yeah. We can't

1:22:50

forget the fifth member of the team. He's not

1:22:52

here with us physically, but Nick Schwartz, our editor.

1:22:54

Absolutely. Here's the beginning. Nick Bot, the AI that

1:22:56

edits the podcast. Oh no. Oh no. That's actually

1:22:58

a man. He's not an

1:23:01

AI at all. Yeah. I hope

1:23:03

that became clear at

1:23:05

some point. We started just calling him Nick Bot

1:23:07

after we did the episode on

1:23:11

the first cryptid coming

1:23:14

out of AI art. Oh,

1:23:16

Loab. Loab, right. Loab. And

1:23:18

so I started saying, well, we got an AI

1:23:20

Bot, too. And he edits it. His name's Nick

1:23:22

Bot. He's a human being. He

1:23:25

works very hard. And he's a friend. But

1:23:29

yeah, man. I'll plug his social.

1:23:31

You can find him at Schwartzanicker,

1:23:35

or

1:23:37

S-C-H-W-A-R-T-Z-A-N-I-C-K-E-R.

1:23:40

I did that. Hell yeah. That was good. I started that.

1:23:42

I went, I don't know if I can do it. Yeah,

1:23:44

because his name's Nick Schwartz. And every

1:23:46

time I see his handle, I think of Arnold

1:23:48

Schwarzenegger. Yeah. I think

1:23:50

that's where it came from. Yeah. But

1:23:52

anyway, dang. It's hard to hit stop, record

1:23:56

here at the season finale. But Task Force, thank you

1:23:58

all so much. And again, a spring break. He's got

1:24:00

a little box that we keep him in. Yeah, he's going in the box.
