Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
It's time for Security Now. Steve Gibson
0:02
is here. My goodness. What would you
0:04
do if you found out
0:06
your washing machine was uploading 3.6 gigabytes
0:09
of data every single
0:12
day? Why would that
0:14
be? Well, Steve's got a good
0:16
solution. We'll find out which browser
0:18
is now totally dominant in the
0:20
world, and then we'll find out
0:22
what Google's doing to protect your
0:24
privacy and still give advertisers the
0:26
information they need to target you.
0:28
Is that possible? Stay tuned.
0:30
Security Now is next. This
0:33
show is brought to you
0:36
by Cisco Meraki. Without a
0:38
cloud-managed network, businesses inevitably fall
0:40
behind. Experience the ease and
0:42
efficiency of Meraki's single platform
0:45
to elevate the place where
0:47
your employees and customers come
0:49
together. Cisco Meraki maximizes uptime
0:51
and minimizes loss to digitally
0:53
transform your organization. Meraki's intuitive
0:56
interface, increased connectivity, and multi-site
0:58
management keep your organization operating
1:00
seamlessly and securely wherever
1:02
your team is. Let
1:05
Cisco Meraki's 24-7 available
1:07
support help your organization's
1:09
remote, on-site, and hybrid
1:11
teams always do their
1:13
best work. Visit meraki.cisco.com/twit.
1:18
Podcasts you love. From
1:20
people you trust. This
1:23
is TWiT. This is Security Now with Steve Gibson, episode 957,
1:26
recorded Tuesday, January 16th, 2024. The
1:34
Protected Audience API. Security Now is brought
1:36
to you by Kolide. When you go
1:38
through airport
1:41
security, there's one line where the TSA checks
1:44
your ID, and then there's another line where
1:46
the machine scans your bag. Well, the same
1:48
thing happens in enterprise security. Instead
1:52
of passengers and luggage, it's end users and their
1:54
devices. Now here's the problem. The
2:00
problem is this: most companies are pretty good at the first
2:02
part of the equation. They check user identity,
2:04
maybe using Okta, to make sure that you
2:06
are who you say you are. The
2:08
problem is the second part: the
2:10
user's device rolls right through without
2:13
getting checked. In fact,
2:15
sad to say, forty-seven
2:17
percent of companies allow unmanaged,
2:20
untested devices to
2:22
access their data. And
2:24
that's bad news. An employee can log
2:26
in from a laptop with its firewall
2:28
turned off, or maybe a laptop that
2:30
hasn't been updated
2:32
in months, or has a
2:34
copy of Plex that is well out
2:36
of date and fully insecure, or,
2:38
worse, that laptop might belong to a
2:40
bad actor using employee
2:42
credentials. Kolide solves
2:45
the device trust problem. Kolide,
2:47
K-O-L-I-D-E, ensures that
2:49
no device can log in
2:51
to your protected apps unless
2:53
it passes your security checks.
2:55
And this is the best
2:57
part: you can use
2:59
Kolide on devices without MDM.
3:02
That means your Linux fleet, your contractor
3:04
devices, every BYOD
3:06
phone and laptop in your
3:08
company. This is a
3:10
great solution. Visit kolide.com/securitynow
3:13
to watch a demo and
3:15
see how it works.
3:18
kolide.com/securitynow.
3:20
Security Now! Normally Steve is
3:22
over my left shoulder here, but
3:24
he's actually over to my right
3:26
because, well, my left is over there,
3:29
he's over there. So I'm here
3:31
in, in Rhode Island at
3:33
my mom's house, visiting
3:35
mom, and Steve is at his house,
3:37
and we're doing the show from
3:39
here. But the good news
3:41
is the quality continues on. Steve
3:44
Gibson, hello. Oh, Leo, great
3:46
to be with you wherever you are. Yeah,
3:48
it snowed here, and
3:50
it's already getting dark because we're at northern
3:52
latitudes, you know. Yeah, and as noted,
3:55
the rain is turning it to slush, and now
3:57
it's going to freeze, and, oh
3:59
boy. Yeah. Ah, but you're
4:01
in Southern California, where it's
4:03
always perfect. I am, and
4:05
there's no indication that I'll be
4:07
leaving any time soon, so, ah,
4:09
I don't have some
4:11
form of exit staged in any of my
4:13
planning, you know. Okay, today's
4:15
topic, today's title, is
4:18
one of the driest-sounding
4:20
titles in a while, uh, for
4:22
the Security Now podcast,
4:24
957, for today,
4:26
January 16th, 2024,
4:30
titled "The Protected Audience
4:32
API." Well,
4:34
that sounds fascinating. Begs many questions:
4:36
what is the audience being protected from,
4:39
and what do they need an
4:41
API for, right? Okay.
4:43
So we're gonna explain all
4:45
that. But first, we're going to
4:48
examine what an IoT
4:50
device that has been taken over
4:52
would look like and do, what
4:55
would happen to the targets
4:57
of the attacks that it
4:59
might participate in, what
5:01
serious problem was recently discovered in
5:03
a new post-quantum algorithm? Whoops.
5:05
And what does that mean? What
5:08
does the global map of web
5:11
browser usage reveal? And
5:13
after some entertaining thoughts and
5:15
feedback from our listeners and
5:17
describing the final touch, I
5:19
think it's going to be
5:22
final, that I'm putting on SpinRite,
5:25
we're actually gonna rock everyone's
5:28
world, and I'm not kidding, by
5:31
examining, and mostly
5:34
understanding, what Google
5:36
has been up to for the
5:38
past three years. Boy,
5:40
it is going to truly change
5:42
everything we know about the way
5:44
advertisements are served to web browser
5:46
users and what it all means
5:49
for the future. And the
5:51
way we
5:53
kind of got to this podcast
5:55
is odd, because I
5:57
thought I had an idea of
6:00
what I was going to talk about this week
6:02
and I mentioned that I had an idea last
6:04
week. Then when I
6:06
got into it yesterday I thought oh
6:09
no this doesn't really this is not going to
6:11
work. So, you know, I dug
6:13
in further, and, and so
6:16
that dragged me into what
6:18
Google was doing, which was
6:20
completely like, what
6:23
is Google talking about? So then I
6:25
thought okay I can't even talk about
6:27
that after I'd invested rather
6:29
significantly in getting ready to talk about that.
6:32
I thought okay no. So then I
6:34
was upset and I moved it from being
6:36
what we would talk about into just
6:38
an item but then when
6:41
I tried to sort of massage it away
6:43
from being our main topic into just a
6:46
news item then I thought oh I think
6:48
I kind of understand this. So I moved
6:50
it back into our main topic and expanded
6:52
it further and it took
6:54
up pretty much all the air of
6:56
the podcast. So I'm
6:59
already tired. Just from the
7:01
explanation! Exactly. But believe me
7:03
this one and as I was
7:05
writing this I was thinking okay
7:07
as soon as this thing gets
7:09
produced and it's posted
7:11
I need to point Jeff Jarvis at
7:13
it because this is gonna wind him
7:16
up. I mean in a good way
7:18
he's gonna because you know Jeff likes
7:20
to understand things and he keeps telling
7:22
us how non-technical he is. Well everybody's
7:24
gonna understand this and
7:26
this is really important. Is
7:29
this the sequel to what Google's
7:31
doing with killing third-party cookies and
7:33
what was it flock and topics
7:35
and all the different things they
7:37
were trying to do to make
7:39
ads viable without invading privacy. Yes.
7:41
So it sounds like you think
7:44
pretty highly of it. It's
7:46
going to happen and what
7:50
is, I think, the most
7:52
surprising thing. The good
7:54
news is, you know, Tim Berners-Lee
7:56
is not in a grave from
7:59
which he could roll
8:01
over or turn over; no, he's actually
8:03
actively running the World Wide Web Consortium.
8:05
Ah. But...
8:08
What Google has done
8:11
to their browser,
8:13
and they did this last summer;
8:16
this has been active since July
8:18
of last year. What they have
8:20
done is astonishingly
8:24
huge. And
8:27
by the end of this
8:29
podcast, our listeners are gonna
8:31
understand how the world has changed,
8:33
and, and we just haven't woken
8:36
up to it yet. Wow.
8:39
This obviously sounds like something everybody should listen
8:41
to. This is the argument, by the way.
8:43
We have this argument a lot on the
8:45
Windows Weekly about why there shouldn't be just
8:47
a monoculture in browsing,
8:49
because it gives Google an outsized
8:52
importance in all of this.
8:54
Well, the good news is
8:56
that, of course,
8:58
everything they're doing is open
9:01
source, so Firefox will end
9:03
up incorporating that into
9:06
theirs. So what Google has
9:08
essentially done... our browsers
9:10
used to be HTML
9:13
renderers. I would
9:15
say what this turns our
9:17
browser into is an
9:19
ad auctioning server. It is. I
9:21
know, Leo, it is. It is huge.
9:24
It is, but it's
9:26
the only way for Google to deliver
9:28
what they want and what we demand.
9:30
Okay, go ahead. My interest is piqued now.
9:33
This, this is, this is
9:35
a seminal podcast; everyone is going to
9:37
have to listen, and I'm going to
9:39
regret that word "seminal," but you know
9:42
how I mean it, Steve.
9:44
Okay.
9:46
There
9:49
are other topics too; we'll get to all
9:51
of that in just a bit, including,
9:53
as always, a very funny Picture of the
9:55
Week. But first a word from our sponsor
9:58
for this segment of Security Now: Lookout. Oh
10:00
man, I know Lookout. You
10:03
need Lookout. Your data is always on the move,
10:05
whether it's on a device, in
10:07
the cloud, across networks that are
10:09
at the local coffee shop. Now,
10:12
your workforce loves this flexibility, but
10:14
it is a challenge for IT
10:16
security. Lookout helps you control
10:18
your data and free your workforce.
10:20
With Lookout, you gain complete visibility
10:22
into all your data so you
10:24
can minimize risk from external and
10:27
internal threats and ensure
10:29
compliance. That's very important these days. By
10:32
seamlessly securing hybrid work, your
10:34
organization doesn't have to sacrifice
10:36
productivity, employee happiness for
10:39
security. Your IT department
10:41
is already under stress and strain, right? They have
10:43
to work with multiple point solutions. They've got legacy
10:45
tools. They're moving from tab to tab and app
10:47
to app, trying to get the job done. It's
10:50
not easy. It's too complex. And
10:52
as you know, when you move around from
10:54
context to context, information falls through the cracks.
10:57
And that means insecurity. That's
10:59
why you need Lookout. With its single unified
11:01
platform, Lookout reduces IT complexity
11:04
and means you can focus on
11:06
whatever else is coming your way.
11:08
And believe me, there's stuff coming your way. That's why you listen to
11:10
this show, right? Good data protection.
11:13
It's not a cage. It's a
11:15
springboard, letting you and your organization
11:17
bound toward the future of your
11:19
making. Visit lookout.com today. Learn
11:22
how to safeguard data and secure
11:24
hybrid work and reduce IT complexity.
11:27
All with one program. lookout.com.
11:30
We thank you so much for supporting security
11:32
now. All right, Steve, I'm ready
11:34
for the picture of the week.
11:37
Okay, so I've had this one in my
11:39
bag of tricks for a month or two
11:41
and just waiting for the right time. And
11:43
I just love this. So
11:46
for those who aren't seeing the
11:49
picture in live feed, I don't
11:53
know. We can only see where
11:55
this object is, that is the
11:58
focus of the picture, not the
12:00
setting, the larger setting which it's
12:02
meant to be describing. But
12:04
we have this large
12:06
square, probably
12:09
metal embossed sign where
12:12
in big, huge, all
12:14
caps in
12:17
relief it says, please do
12:20
not touch. So
12:22
it's referring to something in
12:25
its environment that we are being
12:27
told, whoa, do not touch. The
12:31
punchline here, however, is that
12:34
that admonition is repeated in
12:36
Braille, below the sign.
12:41
That's something you don't expect in Braille is please do
12:44
not touch. And
12:46
I'm wondering what happens if
12:48
a non-sighted person reaches
12:51
out and scans it
12:53
with their fingers. Do they then jump
12:55
back? They leap back. Oh my God. I'm
12:58
not supposed to touch this. Anyway, I
13:00
gave this picture, the caption,
13:02
please provide a clear visual
13:04
example of irony. I
13:07
love it. Very
13:09
nice. Please do not touch
13:11
in Braille. Yeah.
13:14
Okay. So
13:17
what would an IoT device
13:20
look like that had
13:22
been taken over? That's something we've never talked
13:24
about. You know, we talked
13:26
a lot about the threat that's posed
13:29
by the remote takeover of IoT devices.
13:33
We know without any question
13:35
that there are a great
13:37
many very large bot fleets
13:39
and that they're composed of
13:41
individual, unattended, internet-connected devices. Well,
13:44
one of our listeners, Joe Lyon, sent
13:47
me an image of
13:49
a Twitter posting where the
13:51
poster is rhetorically
13:53
asking why his
13:55
LG washing machine is
13:58
using so much data: 3.6
14:01
gigabytes of data per
14:03
day. Wow. Yeah,
14:06
3.6 gig. And he
14:08
attached an image to his
14:10
Twitter posting that was produced
14:12
by some network monitoring tool
14:14
showing that something on
14:17
his network whose interface is
14:19
labeled LG Smart
14:22
Laundry Open, is
14:25
indeed quite busy on the
14:28
network. A little too smart. Yeah, exactly. A little
14:30
too smart for its own good... Just wash the damn
14:32
clothes. It
14:34
doesn't need to surf the net.
14:37
Wash the damn clothes. You know, whatever's going
14:39
on is happening
14:41
very uniformly for a full
14:44
24 hours because this
14:46
chart that we've got on the show
14:48
notes shows 24 hours of use with only
14:50
one hour of the
14:54
24 showing a reduced
14:56
total bandwidth during
14:58
that hour. So yeah, there's
15:01
certainly something sufficient
15:04
there to raise suspicion. Now
15:06
what also caught my eye was
15:08
that the labels on the traffic
15:10
flow show a download of 75
15:14
and three-quarters
15:16
megabytes for
15:19
the day and a whopping upload
15:21
of 3.57 gigabytes.
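The kind of sanity check a network monitor applies to numbers like these is easy to sketch. This is purely illustrative Python, not the actual tool from the tweet; the device names and thresholds are invented for the example. It flags any device whose daily upload is both large in absolute terms and wildly out of proportion to its download:

```python
# Illustrative sketch only: flag devices whose daily upload is both large
# and far in excess of their download, the asymmetry being described here.

def flag_suspicious(devices, ratio=10.0, min_upload=1_000_000_000):
    """Return names of devices whose daily upload exceeds min_upload bytes
    and is at least `ratio` times their download."""
    return [
        name
        for name, (up_bytes, down_bytes) in devices.items()
        if up_bytes >= min_upload and up_bytes >= ratio * max(down_bytes, 1)
    ]

# Daily totals in bytes, using the figures from the chart:
# about 75.75 MB down and 3.57 GB up for the washer.
daily_totals = {
    "LG Smart Laundry": (3_570_000_000, 75_750_000),
    "laptop": (200_000_000, 5_100_000_000),
}

print(flag_suspicious(daily_totals))  # ['LG Smart Laundry']
```

The washer trips both checks; a laptop doing cloud backups might upload a lot too, which is why the absolute threshold alone isn't enough and the ratio matters.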
15:25
That's not good. Now, anyone
15:28
who's worked with networking gear knows
15:30
that it's very important to know
15:32
which directions up and down
15:34
are referring to. Cisco
15:37
has always used, I was
15:40
very pleased with them about
15:42
this, the unambiguous terms in
15:44
and out, as
15:46
in traffic flowing into or
15:48
out of a network
15:50
interface. So if the
15:53
interface is facing toward the internet,
15:55
then traffic flowing out of it
15:58
would be up up
16:00
toward the internet and
16:02
traffic flowing into it would be
16:04
down from the internet. But
16:07
if the interface is facing inward
16:09
toward, for example, connected to a
16:11
local area network, then the
16:13
meaning of in and out would be reversed. Okay,
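That orientation rule can be made concrete with a tiny helper. This is my own illustrative Python sketch (not anything from a Cisco manual), mapping an interface's in/out counters to the end user's notion of upload/download depending on which way the interface faces:

```python
# Sketch: Cisco-style "in"/"out" is relative to the interface itself,
# so the user-facing meaning depends on the interface's orientation.

def user_direction(counter, facing):
    """Translate an interface counter ('in' or 'out') into 'upload' or
    'download', given whether the interface faces the 'internet' (WAN)
    or the 'lan' (inward)."""
    if counter not in ("in", "out"):
        raise ValueError(f"unknown counter: {counter}")
    if facing == "internet":
        # Traffic leaving a WAN-facing interface heads up toward the internet.
        return "upload" if counter == "out" else "download"
    if facing == "lan":
        # On a LAN-facing interface the meanings are reversed.
        return "download" if counter == "out" else "upload"
    raise ValueError(f"unknown orientation: {facing}")

print(user_direction("out", "internet"))  # upload
print(user_direction("out", "lan"))       # download
```

Which is exactly why a bare "upload" label on a monitoring chart can't be trusted until you know where the counter was taken.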
16:16
so without a bit more information
16:18
about the network's configuration shown in
16:21
this picture, we can't be 100%
16:23
certain. Either
16:26
the washing machine's networking system
16:28
is badly broken, causing
16:31
it to continuously download at a
16:33
rate of three and a half
16:35
gig of something per day, or
16:38
as does seem more likely
16:40
given the evidence, the
16:43
label upload, even though
16:45
we cannot be certain what that means, you
16:47
know, suggests
16:50
that this washing machine has
16:52
probably become a bot in
16:55
someone's army. So
16:57
it's busy doing its part,
17:00
uploading 3.6 gigabytes
17:05
of junk on
17:08
a continuous basis, presumably
17:10
nonsense traffic just causing
17:13
some remote person grief.
17:16
That makes, now see, I saw this story
17:18
and I thought, well, what could it, is
17:20
it keeping track of what clothes you're washing?
17:23
No, no, it's been compromised.
17:25
Yes. Yeah, that would
17:27
be the conclusion. This
17:30
is what an IoT device looks
17:33
like when it's been compromised.
17:35
So this brings me to two final
17:37
observations. Since the
17:40
typical consumer is not monitoring
17:42
their local network's traffic in
17:44
any way, they would
17:46
have no way of knowing that
17:49
this was going on at all ever.
17:52
And given the closed turnkey
17:54
nature of an LG washing
17:56
machine, it's unclear how one
17:58
would go about reflashing its
18:00
firmware to remove the bot, even if
18:03
you knew that one was in there.
18:06
It might be just living in RAM, in
18:08
which case, pulling the plug, counting to
18:10
10, and powering it back up might
18:12
be all that's needed to flush
18:15
it out of the system. But then
18:17
the device might become re-inhabited again
18:19
before long, as we know happens.
18:22
So the only real solution
18:24
would be to take the washing machine
18:26
off the net, which brings
18:28
me to my second point. What
18:30
the heck is a washing machine
18:32
doing, being
18:35
connected to the internet in the
18:37
first place? Exactly. You
18:39
know, is
18:41
this another of those, just because we
18:43
can, does it mean we
18:46
should situations? I've
18:48
owned washing machines my entire adult
18:50
life. After mom stopped
18:52
washing my underwear for me, I
18:54
took that over. The only
18:56
thing any of them have ever
18:59
been connected to is
19:01
AC power. So
19:03
is it really necessary for
19:05
us to initiate a rinse
19:07
cycle while we're out roaming
19:09
around somewhere, or to
19:12
be notified with a message delivered through
19:14
an app when our clothes are dry?
19:16
But that is the purpose of that. I know I've seen them
19:18
sell it that way. You can
19:20
control your washing machine from anywhere.
19:23
Oh, that's great. So I get
19:26
it, that if all of
19:28
that amazing functionality is free
19:30
and included, and these
19:32
days nothing costs anything anymore,
19:35
then why not set it up and get
19:37
it on the internet? But we're
19:39
talking about this because of maybe why
19:42
not to
19:44
do that. Maybe something
19:46
is crawled into that machine,
19:49
and not just because you needed to wash
19:51
your clothes more often, and set
19:54
up housekeeping there. Maybe
19:56
the only thing it's currently doing
19:58
is flooding hapless remote victims
20:01
with unwanted internet traffic and
20:04
maybe also if it wanted to it could
20:07
pivot and start
20:09
poking around inside your
20:11
residential network. Oh yeah.
20:14
And just maybe that could end
20:16
up being a high price to pay
20:18
for the luxury of being notified by
20:21
an app when the lint filter needs
20:23
to be changed. So if
20:26
these sorts of things are going on,
20:28
you know, if these sorts of
20:31
appliances are going
20:33
to be connected to your network,
20:35
give some
20:37
thought to sequestering them on
20:40
a separate guest LAN which
20:42
has no access to
20:44
your high value LAN.
20:47
Most of today's consumer routers
20:50
now offer this feature. That makes
20:52
it easier to implement than it
20:54
was back when we first started
20:56
talking about the idea of LAN
20:58
separation many years back. You know,
21:00
remember my three dumb routers, you
21:03
know, concept for how
21:05
to create isolated
21:07
LANs when
21:09
that feature was not already built into our
21:11
routers. Well, the good thing is... Or better,
21:13
yeah, just don't connect it at all, right?
21:16
Exactly. Ask yourself, do I really need... And
21:19
here's the problem that
21:21
like this was thrown in
21:23
to a washing machine by people who
21:25
are more concerned about whether it actually
21:28
gets your clothes clean than it being
21:30
on the Internet. So the Internet is
21:32
a throwaway for them. They're not going
21:34
to be that concerned about the security
21:37
of their own washing machine that they're
21:39
shipping. This is not Cisco who is
21:41
selling you a washing machine,
21:44
you know. This is
21:47
LG. And probably they have a module they put
21:49
in all their appliances, right? This is just, you
21:52
know, the LG Internet of Things
21:54
module. And it's
21:57
using code from the dawn of the
21:59
Internet, because it worked
22:02
and they don't care. The
22:05
end user needs to care. Speaking
22:08
of DDoS attacks, this
22:11
related bit of news was also pointed to
22:13
by a listener, Sukema
22:16
who's at twit.social. He
22:19
wrote, I use this service
22:22
for all of my personal projects and
22:24
liked it so much I was motivated
22:26
to support them financially. And
22:29
yet they are having a massive
22:31
DDoS attack and thought
22:33
it worth talking about publicly, especially
22:35
as examples of tech doing everything
22:38
right while still
22:40
being vulnerable. And
22:42
his tweet to me, he
22:45
sent the URL outage.sr.ht. So
22:50
I went over and took a look and
22:52
I wanted to share what I found because
22:55
it's just such a
22:57
perfect example and then we'll talk a little
22:59
bit more about mitigation strategies. One
23:01
of the three guys who runs
23:03
the service, actually its founder over
23:06
at Sourcehut, which is the
23:08
name of the service. He
23:11
wrote, my name is Drew. I'm
23:13
the founder of Sourcehut and one of
23:15
three Sourcehut staff members working on the
23:18
outage alongside my colleagues,
23:20
Simon and Conrad. As
23:22
you've noticed, Sourcehut is down.
23:25
I offer my deepest apologies for
23:28
this situation. We've made
23:30
a name for ourselves for reliability and
23:33
this is the most severe and
23:35
prolonged outage we've ever faced. We
23:38
spend a lot of time planning to make
23:40
sure this does not happen and
23:42
we failed. We have
23:44
all hands on deck working the problem
23:46
to restore service as soon as possible.
23:49
In our emergency planning models, we
23:51
have procedures in place for many
23:53
kinds of eventualities. What
23:56
has happened this week is essentially
23:58
our worst case scenario. What
24:01
if the primary data
24:03
center just disappeared tomorrow?
24:07
We ask this question of ourselves seriously
24:09
and make serious plans for what we
24:12
do if this were to pass, and
24:14
we're executing those plans now, though we
24:16
had hoped that we would never need
24:18
to. I humbly
24:20
ask for your patience and support as
24:23
we deal with a very difficult situation,
24:25
and again, I offer my deepest
24:27
apologies that this situation has come to
24:30
pass. So what happened?
24:33
At 6.30 UTC on January 10,
24:37
two days prior to the time of writing,
24:39
a distributed denial of
24:42
service attack, DDOS, began
24:44
targeting SourceHut. We
24:47
still do not know many details.
24:49
We don't know who they are
24:51
or why they're targeting us, but
24:53
we do know that they are
24:55
targeting SourceHut specifically. We
24:58
deal with ordinary DDOS attacks
25:00
in the normal course of
25:02
operations. It's
25:10
like, okay, it's a sad state
25:12
of affairs that you
25:14
refer to ordinary DDOS attacks.
25:17
And he says, and we are generally
25:19
able to mitigate them on our end. However,
25:22
this is not an ordinary
25:24
DDOS attack. The
25:26
attacker possesses considerable resources and is
25:29
operating at a scale beyond which
25:31
we have the means to mitigate
25:33
ourselves. In response,
25:35
before we could do much ourselves to
25:38
understand or mitigate the problem, our upstream
25:41
network provider null-routed
25:45
SourceHut entirely, rendering
25:47
both the Internet at large
25:50
and SourceHut staff unable
25:52
to reach our own servers. The
25:56
primary data center, PHL, was affected
25:58
by this problem. We
26:01
rent co-location space from our
26:03
PHL supplier where we
26:05
have our own servers installed. We
26:08
purchase networking through our provider who
26:11
allocates us a block out of
26:13
their AS, you know we talked
26:15
about AS numbers right, autonomous system
26:17
numbers, and who
26:20
upstreams with Cogent, which
26:23
is the upstream that ultimately
26:25
black-holed us. Unfortunately
26:28
our co-location provider went
26:31
through two acquisitions in the past
26:33
year and we failed to notice
26:35
that our account had been forgotten
26:38
as they migrated between ticketing systems
26:40
through one of these acquisitions. Thus
26:44
we were unable to page them.
26:46
We were initially forced to wait
26:48
until their normal office hours began
26:50
to contact them seven hours after
26:52
the start of the incident. When
26:55
we did finally get them on
26:57
the phone, our access to support
26:59
ticketing was restored, they apologized profusely
27:01
for the mistake, and we were
27:04
able to work with them on
27:06
restoring service and addressing the problems
27:08
we were facing. This led to
27:10
Source Hut's availability being partially restored
27:12
on the evening of January 10th
27:15
until the DDoS escalated in
27:17
the early hours of January 11th,
27:20
after which point our provider
27:22
was forced to null route
27:24
us again. We
27:27
have seen some collateral damage as
27:29
well. You may have noticed
27:31
that Hacker News was
27:33
down on January 10th. We
27:36
believe that was ultimately due
27:38
to Cogent's heavy-handed approach to
27:40
mitigating the DDoS targeting Source
27:43
Hut. Sorry Hacker
27:45
News. He
27:48
said sorry, and then he said, Hacker News,
27:50
glad you got it sorted. Then
27:52
he said last night a non-profit
27:54
free software forge known
27:57
as Codeberg also became
27:59
subject to the DDoS, which is
28:01
still ongoing and may have been
28:03
caused by the same actors. This
28:06
caused our status page to go offline. Codeberg
28:09
has been kind enough to host it for
28:11
us so that it's reachable during the outage.
28:14
We're not sure if Codeberg was targeted
28:16
because they hosted our status page or
28:19
if this is part of a broader
28:21
attack on Free Software Forge platforms. Okay,
28:24
so we
28:27
were just talking about, of course,
28:30
the LG smart washing machine and
28:32
the idea that it
28:34
was apparently sending a continuous stream
28:36
of traffic totaling about 3.5 gigabytes
28:40
per day out onto the internet
28:42
for some purpose. So
28:44
I wanted to put a face on this
28:46
to make it a bit more real for
28:48
everyone. What
28:50
I've just shared is a perfect
28:53
example of where such traffic goes,
28:55
like what this washing machine was
28:58
apparently emitting onto the internet, and
29:01
its very real consequences for
29:03
people. You know, people are
29:05
having their lives seriously
29:08
affected by these sorts of
29:11
attacks. Now
29:14
Drew used the term null routing,
29:16
which is the action taken
29:18
by major carriers such as
29:21
Cogent in this case, when
29:23
some client or
29:25
client's client or
29:28
client's client's client, because
29:31
Cogent is a tier one provider, is
29:34
undergoing a sustained attack.
29:37
They essentially pull
29:39
the plug. They
29:42
have no interest in
29:44
carrying traffic that is
29:46
indirectly and inadvertently attacking
29:48
their network. When
29:51
an attack originates, as most
29:53
do now, from a globally
29:56
dispersed and distributed collection
29:58
of anonymous
30:01
and autonomous bots. That
30:04
traffic, which is all aimed
30:06
at a single common IP
30:08
address somewhere, will enter
30:11
the network of a major
30:13
carrier like Cogent all
30:15
across the globe as well. So
30:18
that means that the attack
30:20
is crossing into Cogent's routers
30:22
from all of its many
30:24
various peering partners who
30:26
are the ones whose networks have
30:29
been infected with some bots. Or
30:32
perhaps the traffic is just transiting
30:34
across their network and originates from
30:36
some other major
30:39
carrier's network. Whatever the
30:41
case, the real
30:43
danger of these attacks is
30:46
its concentration. As
30:48
the traffic hops from one router
30:50
to the next, with each hop
30:52
bringing it closer to its destination,
30:55
that traffic is being aggregated.
30:57
It is growing in strength, and
30:59
it can get to the point
31:02
of debilitating the routers it is
31:04
attempting to pass through. This
31:06
means that the optimal action
31:09
for any major carrier like
31:11
Cogent to take is
31:13
to prevent this traffic aggregation
31:16
by blocking the attacking traffic
31:18
immediately at
31:21
each of the many
31:23
points of ingress
31:26
into their network
31:29
from their peering partners. So
31:31
Cogent sends routing table
31:34
updates out to every
31:36
one of the peering routers on
31:38
their border instructing that
31:40
router to null route,
31:43
meaning immediately discard
31:46
any packets attempting to enter their network
31:49
which are bound for the IP that
31:51
is under attack. This
31:54
neuters the attack without causing
31:56
any harm to their network
31:59
because it is unable to concentrate. And
32:02
since there will almost certainly
32:04
be malicious bots running inside
32:06
the networks of some of
32:08
Cogent's client ISPs, this null
32:10
routing must also be applied
32:12
internally as well as on
32:14
their border. Okay,
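A toy model makes the logic of edge-dropping clear. This is purely illustrative Python, not how a carrier actually implements null routing (that's done with routing table entries, and the victim IP here is from the documentation range): each border router discards packets bound for the blackholed address the moment they arrive, so there is nothing left to aggregate deeper in the network.

```python
# Toy model of carrier-edge null routing: every border router drops
# packets destined for the blackholed IP on ingress, so the attack
# can never concentrate as it hops toward its destination.

NULL_ROUTED = {"203.0.113.7"}  # the victim's IP (documentation range)

def ingress_filter(packets):
    """One border router's ingress: discard anything bound for a
    null-routed destination, pass everything else through."""
    return [p for p in packets if p["dst"] not in NULL_ROUTED]

# Attack traffic mixed with legitimate flows arriving at three
# separate peering points on the carrier's border.
borders = [
    [{"dst": "203.0.113.7"}] * 4 + [{"dst": "198.51.100.20"}],
    [{"dst": "203.0.113.7"}] * 3,
    [{"dst": "192.0.2.9"}, {"dst": "203.0.113.7"}],
]

forwarded = [p for border in borders for p in ingress_filter(border)]
# Only the two flows to other destinations survive; note that any benign
# traffic to 203.0.113.7 is dropped right along with the attack.
print(len(forwarded))  # 2
```

The model also shows the collateral cost discussed next: the filter can't distinguish good packets from bad ones bound for the victim, so the victim goes completely dark.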
32:16
but notice that now with
32:19
the targeted IP null
32:21
routed, it's also
32:23
impossible for any benign
32:27
traffic to reach its
32:29
destination service. As
32:31
Drew wrote, they were unable
32:33
to even reach their own servers,
32:35
you know, even if they had
32:37
some back way into them because
32:39
of this null route. No
32:42
traffic was getting to their servers
32:44
good or bad. A major
32:47
carrier's null routing inherently
32:50
not only blocks the attacking
32:52
traffic, but any and all
32:54
traffic to that service, no
32:58
matter what. In fact, once
33:00
the attack has subsided and full
33:02
service could be restored, that
33:05
site will remain dark until
33:07
someone at the service provider
33:10
notices that the attack has mitigated
33:12
and then lifts
33:14
the network wide block to
33:17
allow regular services to resume.
33:20
DDoS attacks like this one have become
33:22
a fact of life on the internet.
33:25
Anyone who's working for any major service
33:27
provider sees them and deals with them
33:30
now as part of their daily routine.
33:32
But as we've just seen, this
33:35
doesn't make such attacks any less
33:37
significant and potentially devastating to anyone
33:39
who's committed to keeping whatever services
33:42
they offer available. And
33:46
we also know where these
33:48
attacks originate. They
33:50
originate from devices exactly like
33:53
that LG smart
33:55
washing machine, a gadget
33:57
that largely operates autonomously
34:00
and anonymously, where networking
34:02
is not its primary focus.
34:05
It was tacked on, as we said
34:08
earlier, as a feature. So
34:11
it never got the attention that it needed
34:13
to be a truly
34:15
secure networking device. And
34:18
we also know that the phrase, unfortunately,
34:21
truly secure networking device almost
34:24
needs to be said with tongue in
34:26
cheek, because sadly it's become an oxymoron.
34:30
You know, truly secure networking
34:32
device. Well, it's almost
34:34
become the holy grail. Everything
34:37
we've learned is that it is
34:39
truly difficult to create and maintain
34:41
a truly secure
34:44
networking device. And
34:46
the more features are added, the more
34:48
quickly the challenge grows.
34:53
And Leo, I think at this point,
34:56
since I don't want to interrupt our major
34:59
topic, which we'll be
35:01
explaining, I
35:04
think that now would be a good time
35:07
to tell our listeners about
35:09
another sponsor. And then
35:11
we're going to talk about the major
35:13
problem that was discovered in quantum crypto. What?
35:17
In quantum crypto? Oh, not good.
35:19
Not good. You're nothing but bad
35:21
news today, I tell you. Let
35:26
me tell you about Bitwarden in that case,
35:28
because this is good news. By
35:31
the way, we were
35:33
talking on Windows Weekly last week,
35:36
and Paul Thurrott was all upset because he
35:38
didn't like to have to keep entering his
35:41
password into the password manager,
35:43
into Bitwarden. So what
35:45
did Bitwarden do this week? They
35:48
made it possible for you to use
35:50
passkeys to log into Bitwarden. It's
35:53
awesome. This is why
35:55
you want Bitwarden. It's open
35:57
source. People can see it and
36:00
submit pull requests. One of our
36:02
listeners, Quexin, actually made it possible
36:04
for Bitwarden to add a memory
36:07
hard key derivation function to
36:10
replace PBKDF2. They put in
36:13
Argon2. In fact, I'm using it, thanks
36:15
to Quexin. He submitted it, open
36:17
source, they applied it, they checked it, they
36:20
tested it, you can test it and
36:22
go on and on. The other thing that's great about being open
36:24
source as a password manager is that
36:27
it's free. It's free forever. It's
36:29
free for the personal version of Bitwarden. It's
36:32
free for an unlimited number of
36:34
passwords on any device you want,
36:36
Windows, Mac, Linux, iOS, Android, it
36:38
works everywhere. You can even
36:40
use your authentication device. They used to ask
36:43
for a premium donation if you wanted to
36:45
use a YubiKey; not anymore. I
36:47
really love this company. They make it easier and
36:49
easier for people to use a password
36:52
manager. And as you know,
36:54
because you listen to this show, it's becoming
36:56
more and more important all the time that
36:58
you use a password manager. Generating
37:01
and managing complex passwords should
37:03
be easy. It should
37:05
be something people, even non-sophisticated, non-technical
37:08
people can do easily and with
37:10
Bitwarden they can. And they
37:12
can no longer say to you, oh, I don't
37:14
want to pay for a password manager because the
37:16
personal edition of Bitwarden is free forever.
37:19
You can access your passwords, your
37:21
pass keys, even sensitive
37:23
information. And every bit of your information,
37:25
including the sites you visit, the sites
37:27
you have passwords for, is encrypted with
37:29
Bitwarden across multiple devices and platforms for
37:31
free, keeping you secure at work, at
37:35
home, and on the go. And by the
37:37
way, we're moving to Bitwarden Enterprise. If you want
37:39
to have Bitwarden at work, you can too. They've
37:41
got a Teams plan, an Enterprise
37:43
plan, they have family plans. I actually subscribe
37:45
to the premium plan because I want to
37:47
support them. I don't need
37:49
the features. Every feature I want
37:51
is in there in the free plan, but I want to support
37:54
them. Get started with Bitwarden. Their
37:56
free trial of Teams or Enterprise, or you
37:58
can get started for free across all devices
38:00
as an individual user. This is a password
38:02
manager done right. And
38:05
now with Bitwarden you can go completely passwordless.
38:08
You don't even need to remember
38:10
a master password anymore. What
38:13
they're doing, and I think it's really important, and I
38:16
really honor them for doing it, is they know it's
38:18
got to be easy or people aren't going to use
38:20
it. So they make it as easy as possible. Tell
38:22
your friends, tell your
38:24
family, bitwarden.com/twit, tell your
38:27
work, maybe it's time
38:29
we moved to Bitwarden,
38:32
bitwarden.com/twit. This is the password
38:35
manager, the only one I use and recommend,
38:37
the one Steve decided to go to when
38:39
he left LastPass. We're just
38:41
big fans, bitwarden.com slash
38:44
twit. Alright, this
38:47
is really a good episode. You've got so
38:49
much juice in this. But wait a minute,
38:51
quantum crypto problems? That's, wait
38:53
a minute now. Okay,
38:56
so Bleeping Computer recently reported the
38:58
news that many implementations
39:00
of the already in widespread
39:02
use post-quantum key
39:05
encapsulation mechanism known as Kyber,
39:07
which as
39:10
I said is in use for
39:12
example by the Mullvad VPN and
39:12
by Signal to provide post-quantum
39:16
redundancy. We talked about
39:18
that before. They jumped right on it and
39:20
we said yay, great, but whoops. So
39:23
it's been found to be subject
39:25
to timing attacks. The
39:28
set of attacks has been dubbed
39:30
KyberSlash. Okay,
39:32
now the first thing to understand here is
39:35
that nothing has been found wanting
39:39
from the post-quantum Kyber
39:42
algorithm itself. As far
39:44
as everyone knows, Kyber still provides
39:47
very good quantum resistance. The
39:49
problem and vulnerability is
39:52
limited to some of the actual code
39:54
that was used to implement the Kyber
39:56
algorithm. And this is part of
39:59
the typical shaking-out process that
40:01
new algorithms undergo. First
40:03
we need to get the theory right, then
40:05
it's tested to prove that it does what we thought.
40:08
Next, the code is implemented into
40:10
libraries where it can actually be
40:12
used and tested in the real
40:14
world. And it
40:16
was at this point in the process
40:19
that these troubles arose. The
40:21
problem is that
40:23
the vulnerable implementations perform
40:25
an integer division instruction,
40:28
where the numbers being divided are
40:31
dependent upon the algorithm's
40:34
secret key. Huh,
40:37
whoops. Since division is an instruction
40:39
whose timing can
40:42
vary widely based on
40:45
the binary bit pattern of the
40:47
specific numbers being divided, this naturally
40:50
results in a situation where
40:53
the secret that needs to
40:55
be kept has the
40:57
potential to leak out through a
40:59
timing side channel. And
41:01
that's exactly what happened here. Now,
41:04
it's such an obvious and well
41:07
understood problem that it's
41:09
kind of shocking that this could
41:12
happen and really whoever wrote that
41:14
code should be scolded a bit.
41:16
You know, perhaps they were just
41:18
having a bad day or
41:20
perhaps they're solely focused on theory and
41:22
not enough on practice, so who knows.
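For listeners who want to see the shape of the fix, here's a minimal Python sketch (my own illustration, not code from Kyber's reference implementation or any affected library). Since Kyber's modulus q = 3329 is public, division by it can be replaced with a multiply-and-shift by a precomputed reciprocal, an operation whose timing does not depend on the secret operand:

```python
# KyberSlash-style issue: computing x // q directly can use a division
# instruction whose cycle count depends on the value x, and x here is
# derived from secret data. The standard remedy (sketched, simplified):
# divide by the public constant q via a precomputed reciprocal, so the
# work done is the same no matter what x is.

Q = 3329                   # Kyber's public modulus
SHIFT = 36
M = (1 << SHIFT) // Q + 1  # precomputed reciprocal: 20642679

def div_q_consttime(x: int) -> int:
    """floor(x / 3329) without a data-dependent division instruction.

    Exact for 0 <= x < about 2^25, far beyond the small ranges that
    Kyber's compression step ever feeds it."""
    return (x * M) >> SHIFT

# Agrees with true division everywhere it's needed:
assert all(div_q_consttime(x) == x // Q for x in range(1 << 16))
```

The constants here are chosen for illustration; real libraries pick their own shift widths and verify the exactness bound for their input ranges.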
41:25
But in any event, the problem was
41:28
found and it's well understood. And
41:30
many of the libraries that
41:32
implemented the vulnerable reference code
41:34
have now been updated and
41:37
more are being updated. So
41:39
it's a good thing that we're
41:41
doing this now rather than two
41:44
years from now or ten years
41:46
from now or whenever it might
41:48
be that we actually become dependent
41:50
upon the strength of these
41:52
post quantum algorithms to protect
41:55
against quantum based attacks. We're
41:57
okay, you know. And to their credit,
42:00
Signal, remember, added
42:02
this to their existing crypto rather
42:04
than switching over to
42:07
it, recognizing that its unproven nature
42:09
meant that it really couldn't be
42:11
trusted fully yet. So,
42:13
Signal was never in danger, and now they're even less
42:16
so. Okay, Leo,
42:18
we got a cool picture here.
42:22
And this is apropos of
42:24
the podcast's main topic.
42:27
StatCounter produced this somewhat
42:29
bracing screenshot of
42:32
global web browser use 12 years
42:35
ago back in 2012 and two years ago
42:37
in 2022. So
42:42
since today's podcast is all about
42:45
Google and their Chrome browser, yikes.
42:48
So for those... Oh, this is not good.
42:53
So what I want to know
42:55
is what's going on in Iceland.
42:57
I think that's Iceland. It's gray.
42:59
What is gray? Well, that's Safari.
43:01
Oh, wow. But, okay, so for
43:03
our listeners who can't see because
43:05
they're listening (thus, listeners), anyway,
43:08
the first picture from 2012 shows
43:10
what the world looked like
43:13
12 years ago. All
43:19
of the US and Canada
43:22
and Alaska and
43:24
Iceland and sort of
43:26
the northwest
43:29
of Africa looks like all
43:31
of Australia. Anyway, all of that is
43:33
like a blue and that was IE.
43:37
Everybody 12 years ago was using
43:39
Internet Explorer. Interestingly,
43:42
Mozilla's Firefox was scattered around...
43:44
Yeah, look at France. Exactly,
43:47
Europe and
43:49
Italy. So
43:53
there was a lot of Firefox use and of
43:55
course Chrome was there. It
43:57
looks like Africa, the whole continent,
44:00
most of it, except for a
44:02
little bit of IE, was Chrome.
44:04
Russia was all Chrome. And
44:08
there was also some scattered bits
44:10
of Opera. Anyway, so that
44:12
was then. Whoa.
44:14
This should be the title of the show,
44:17
Scattered Bits of Opera. Okay.
44:20
Now, on the key of this
44:24
second updated chart from two
44:27
years ago, there is blue for Microsoft
44:36
Edge. I don't know where it is on the map.
44:38
I don't see any blue. All I see is Chrome.
44:45
Chrome has taken over.
44:47
Is that Iceland or Greenland?
44:49
What is it? Oh,
44:51
good question. Anyway. Oh,
44:54
there's the blue. We found the blue. It's
44:56
Chad. I don't know
44:58
where that is. This could also
45:00
be a map of COVID, unfortunately.
45:02
Unfortunately, it's all green. So
45:05
Chrome is the COVID of
45:07
browsers. It's just
45:10
everywhere. Okay. So with
45:12
that in mind, we'll be talking about
45:15
what Google has done to browsers,
45:18
which is to say all
45:20
browsers, because they're pretty much one and
45:22
the same. I have
45:24
a couple little bits of feedback
45:26
from our listeners I want to talk about, and then
45:28
we're going to plunge in. So of
45:31
course, there were predictably many replies
45:33
from our listeners about my follow-up
45:35
discussion last week of the Apple
45:37
backdoor. I just grabbed one
45:39
to finish out this subject.
45:41
David Wise wrote, Hey, Steve, listening
45:44
to your podcast about the Apple
45:46
vulnerability, could this
45:48
be a supply chain hack
45:50
with the phones being built in
45:52
China? My answer? Sure.
45:56
Absolutely. Unfortunately,
45:59
literally anything is
46:01
possible. I think it's safe
46:03
to say that by the nature
46:05
of this, we'll
46:08
never have a
46:10
satisfactory answer to the
46:12
many questions surrounding exactly
46:14
how or why this
46:16
all happened. All
46:18
we know for sure is
46:21
that a backdoor exists
46:24
in the hardware that Apple sold and
46:26
shipped. Notice
46:28
by David's question that
46:30
plausible deniability exists here.
46:33
All of the several possible sources
46:35
of this can claim absolute
46:38
surprise and total ignorance of
46:40
how this came to pass.
46:44
Is it significant that it
46:46
first appeared in 2018, three
46:49
years after the famous
46:51
high visibility 2015
46:53
San Bernardino terrorist attack? Several
46:57
years being the approximate lead
46:59
time for a new silicon
47:02
design to move all the
47:04
way through verification, fabrication, and
47:06
into physical devices. Again,
47:09
we'll never know. And
47:11
yeah, that's annoying. O'Sink
47:15
Sort tweeted, Hi Steve, I'm a
47:18
fan of the podcast and all the great
47:20
work you've done in your career, especially a
47:22
fan of squirrel. Thank you for
47:24
this great work and I wish it to be
47:26
mainstream in the near future. Well, so do I, but
47:28
don't hold your breath. I recently
47:31
listened to episode 885.
48:01
885? Okay, a while ago. And
47:37
you briefly touch on a subject that
47:40
I've been contemplating, getting
47:42
into infosec. I'm
47:44
currently thinking about getting into
47:46
infosec as a career. I'm
47:49
in my forties and wanted
47:51
to know from your perspective, if
47:54
forties is too old to get
47:56
into the field. My career
47:59
is online, and I've
48:01
been fortunate to have been doing it from the
48:03
early days of Web 1.0 to
48:06
what is now referred to as Web 3.0.
48:08
However, after COVID, I
48:11
have not had the same opportunities
48:13
in the marketing world, so I
48:15
find myself looking for a new
48:17
career and thinking InfoSec may be
48:19
the solution. Any advice
48:21
slash opinion is welcome. Thanks
48:23
in advance. Okay,
48:26
so the good news
48:28
is that there is a huge and unmet
48:31
and even growing
48:34
demand for information
48:36
security professionals today. I
48:40
think that the trouble with any sort of,
48:42
you know, am
48:44
I too old question is that
48:46
so much of the answer to that depends upon
48:48
the individual. A particular person
48:50
in their early 30s might already
48:53
be too old, whereas someone else
48:55
who's in, you know, who's 55
48:57
might not be. But
48:59
speaking only very generally, since that's all
49:01
I can do here, I think
49:04
I'd say that someone in their
49:06
40s probably spans
49:08
the range where the
49:11
question of too old might start to be
49:13
an issue if it's a concern for them.
49:16
Early 40s, not so much. Late
49:19
40s, well, maybe a bit more
49:21
so. But regardless, there's
49:24
definitely something, a lot actually, to
49:27
be said for having some
49:29
gravitas and life experience
49:31
that only comes with age. An
49:34
IT guy who's more world-wise
49:36
will generally be much more
49:38
useful than a fresh newbie
49:40
who is still addressing the
49:42
world with impractical expectations. And
49:45
especially for an IT guy, knowing how
49:47
to talk to others is a skill
49:50
that should not be undervalued. So I
49:52
think that on balance I'd say go for it
49:55
and know that the demand for that
49:57
skill set will only be increasing over
49:59
time. This
50:02
is where I would put in the plug
50:04
for ACI Learning, our sponsor, formerly ITProTV. You
50:07
could certainly learn, I don't think it's ever
50:09
too late to learn the skills. Right. So
50:12
really the only question you're asking is can I get
50:14
hired? And that's just
50:16
going to depend if you've got the skill
50:18
set, if you've got the certs. I think
50:20
so. And Leo, if this guy
50:22
has been listening to this podcast for... Yeah, that's
50:24
a good way to start. From the beginning. Yeah.
50:27
We keep hearing from people. Yeah, I just went
50:29
in, I didn't even study it. Good training. I
50:32
passed the test. It's like, okay, that's great. Yeah, that's a
50:34
good point. So someone
50:37
whose handle is 3N0M41Y, which
50:40
gave me some
50:42
pause until I realized it was
50:48
supposed to be anomaly. He
50:52
said, hi Steve. I
50:55
would like to get your opinion
50:57
on Proton Drive versus Sync as
50:59
a secure cloud storage network.
51:03
Recently, the iOS Sync app
51:05
has broken the ability to
51:07
natively use the built-in iOS
51:10
file app to
51:12
navigate Sync's folder structure
51:14
properly. What happens
51:16
is that after drilling down one
51:18
to two directories, the Sync app
51:20
pushes the structure back to the
51:22
root folder. Now,
51:25
this is not a showstopper. It does break
51:27
the use of other third-party apps on iOS.
51:30
I've reached out to the Sync
51:32
dev team, but they've responded that
51:34
it will take, quote, quite
51:36
a while, unquote, to fix. This
51:41
functionality broke about two months ago. So
51:44
I just want to get your take if
51:46
Proton has matured enough to be
51:48
a replacement for Sync.
51:50
Cheers and happy new year. Okay,
51:53
so first of all, let
51:55
me just note that I'm
51:58
very disappointed when something
52:00
I've deeply researched and
52:03
then strongly endorsed, later
52:05
evolves or devolves in
52:11
such a way that I come
52:13
to regret my previous full-throated endorsement. So
52:16
I'm disappointed to learn that Sync
52:18
is not standing behind and repairing
52:21
their various client apps in a
52:23
timely way. As
52:25
for Proton Drive, I haven't looked
52:28
deeply enough to form any opinion
52:30
one way or the other. However,
52:33
its Proton name should be
52:36
familiar, since these are
52:38
the same Swiss people who
52:40
offer the very popular Proton
52:42
Mail service. So
52:44
my strong inclination would be to
52:46
trust them. What
52:49
I have no idea about
52:51
is how their feature-rich offering
52:53
might match up against Sync. But
52:56
my shooting from the hip thought would be
52:59
that if it does what you want for
53:01
a price that makes sense, I'd
53:03
say that based upon their past performance on
53:05
everything else we know they've done, I'd be
53:08
inclined to give them the benefit of any doubt.
53:11
That's obviously not definitive, but at least
53:14
it's something. Okay,
53:16
so, and last, a note about
53:19
Spinrite. It
53:21
has been so extremely
53:23
useful these past final months
53:25
of work on Spinrite to
53:27
have this podcast's listeners taking
53:30
the current Spinrite release candidate out
53:32
for a spin and
53:35
providing feedback about their experiences.
53:38
That feedback has been the
53:40
primary driving force behind the
53:42
last few improvements to Spinrite
53:44
6.1, which turned out to
53:46
be quite significant. So
53:49
I'm glad that I did not declare it
53:51
finished before that. And
53:53
it's been a slowly growing chorus
53:55
of feedback about something
53:57
else that caused me to decide
53:59
that I needed to change one
54:01
last thing, sort
54:04
of echoing Steve Jobs. If
54:07
you've been following along, you'll
54:09
recall that one of the astonishing
54:11
things we discovered during Spinrite's
54:14
development was that all
54:16
of the original and past
54:18
versions of the PC industry's
54:20
most popular BIOS produced by
54:22
American Megatrends and commonly known
54:24
as the AMI BIOS contained
54:27
a very serious bug
54:30
in its USB handling code.
54:33
Any access to any
54:35
USB connected drive past
54:37
the drive's 137 gigabyte
54:40
point, which is where 28 bits
54:44
of binary sector addressing
54:47
overflow into the 29th
54:49
bit, causes these
54:51
AMI BIOSes, which are in the
54:53
majority, to overwrite
54:56
the system's main memory
54:58
right where applications typically load.
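The 137-gigabyte figure falls straight out of sector arithmetic, sketched here in Python (assuming the traditional 512-byte sector size; the 2.2-terabyte figure mentioned later in the episode is the corresponding 32-bit limit):

```python
# Where the 137 GB boundary comes from: logical block addresses (LBAs)
# count 512-byte sectors, and 2^28 sectors is exactly the point where
# the address first needs a 29th bit. (A sketch; 512-byte sectors
# assumed throughout.)

SECTOR_BYTES = 512

def capacity_bytes(lba_bits: int) -> int:
    """Bytes addressable with the given LBA width."""
    return (1 << lba_bits) * SECTOR_BYTES

def needs_29th_bit(lba: int) -> bool:
    """True when a sector address overflows 28 bits, the condition that
    trips the AMI BIOS bug described above."""
    return lba >= (1 << 28)

print(capacity_bytes(28))  # 137438953472 bytes, i.e. ~137.4 GB
print(capacity_bytes(32))  # 2199023255552 bytes, i.e. ~2.2 TB
print(needs_29th_bit(capacity_bytes(28) // SECTOR_BYTES))  # True
```

So any read or write targeting a sector at or beyond the 137 GB mark sets that 29th address bit, which is precisely where the buggy BIOS code goes wrong.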
55:02
When this came to light, I
55:04
was so appalled by that discovery
55:07
and by the idea that this
55:09
could be very damaging not only
55:11
to spin rights code in RAM
55:13
but potentially to its users data
55:16
that I decided to
55:18
flatly prohibit Spinrite 6.1
55:20
from accessing any USB
55:22
connected drive past its
55:25
137 gig point. The
55:28
next Spinrite won't suffer from
55:30
any of this trouble since it
55:32
won't, you know, it'll have its
55:34
own direct high-performance native USB drivers.
55:36
So my plan was just to get
55:39
going on Spinrite 7 as
55:41
quickly as possible. But
55:43
Spinrite's early users who
55:45
have attached larger than 137 gigabyte drives
55:47
via USB and then had 6.1 tell
55:49
them that they could only safely
55:56
test the first 137 gigabytes of their drive
56:00
have not been happy. And
56:04
also since then, one of the
56:06
guys who hangs out in GRC's
56:08
news groups, Paul Farrer, who I've
56:10
referred to before when this bug
56:12
was happening, he was curious
56:14
to learn more about this. So
56:16
he looked into the problem while I continued
56:19
to work on other aspects of Spinrite working
56:21
toward just getting it done. Paul
56:24
wrote a set of exploratory DOS utilities and
56:26
tested them with a bunch of old motherboards
56:28
owned by some of our Spinrite testers in
56:30
the news groups. What
56:32
he discovered suggested that more
56:34
could be done than just
56:37
turning my back on
56:39
all USB BIOS access
56:41
in disgust. And
56:44
the disappointment I was seeing from
56:46
new people being exposed to Spinrite's
56:48
refusal to work with any USB
56:50
BIOS convinced me that I needed
56:52
to fix this. So
56:54
I started on that last week and I expect
56:56
to have it finished this week because it's not
56:58
a big deal. Since
57:01
only the AMI BIOS is known
57:03
to have this problem, Spinrite
57:05
will start by lifting
57:07
this blanket ban from
57:09
all non-AMI BIOSes. Then
57:12
for AMI BIOSes, since they don't
57:15
all have this trouble, I've
57:17
reverse engineered the code surrounding the
57:19
bug and I now fully understand
57:22
what's going on. So
57:24
Spinrite can now detect when the
57:26
bug is actually present and
57:29
can patch the buggy BIOS
57:31
code in place to
57:34
raise Spinrite's access limit from
57:36
137 gigabytes to
57:39
2.2 terabytes. The
57:42
buggy AMI USB BIOS code
57:44
will still have a bug
57:46
that prevents Spinrite from working
57:48
past 2.2 terabytes, but that's
57:51
way better in today's world
57:53
than clamping all USB BIOS
57:55
access for everyone at 137
57:57
gig. So
58:00
that's the Spinrite that everyone will get and
58:04
maybe next week. So again,
58:06
a nice added
58:09
benefit. And again, I'm glad that I
58:11
waited and am putting this in. Next
58:17
week, really? Next week? Like,
58:19
yeah. Maybe. Close. I
58:23
mean, there's really nothing
58:25
left. I'm
58:27
really happy with it. No one
58:29
has found any problems at all
58:31
for the last couple of months
58:33
now. And meanwhile I've
58:36
been letting it sit and
58:38
stew and cook. So yeah, we're
58:40
right there. Well,
58:43
we'll have a cake ready for you. We'll have
58:45
a party. Just don't throw it on my face.
58:47
Yeah, well, no. We
58:50
might have some confetti that we can throw in
58:52
the air instead. How about that? A
58:54
little party for you, Steve. That would
58:56
be very nice. All right, we're
58:58
going to get to the heart of the subject. I can't
59:00
wait. Oh, boy. What is Google
59:03
doing to protect your
59:05
privacy and to help advertisers? We'll find out.
59:08
This is kind of breaking news, I think. So I'm
59:11
looking forward to hearing this. But before we
59:13
do that, let me tell you about our sponsor, one that
59:15
we have used and had to use.
59:18
And this has to do with security. If
59:20
you've ever searched for your name online, you
59:23
probably noticed there's a lot of PII,
59:26
personally identifiable information about you out there. Your
59:29
income, your house, how much you paid for
59:31
your house, your car, photos, et cetera, et
59:33
cetera. Now, you might say, well, that's a
59:35
privacy violation, but it's also a security issue.
59:38
And it was for us at Twit.
59:42
It really came to a head when
59:44
people started impersonating our CEO,
59:46
Lisa, sending urgent text messages
59:48
to employees saying, I'm in
59:50
a meeting. Can you quickly
59:53
get some Amazon gift cards
59:55
for me and send them to me? I'll
59:58
pay you back, or something. Use
1:00:00
your TWiT credit card. And you
1:00:04
know, of course it was a scam and fortunately our employees
1:00:06
are well trained. But here's the thing: whoever
1:00:09
did it knew that Lisa was the
1:00:11
CEO. They knew who her direct reports
1:00:13
were. They knew the email and phone
1:00:15
numbers of those people. That
1:00:17
stuff's online, and that's why it's a
1:00:19
major security issue for
1:00:21
any company. This
1:00:24
is enabling spear phishing, basically.
1:00:27
So we went to DeleteMe. Lisa actually
1:00:30
signed up for DeleteMe and got her PII
1:00:33
removed. DeleteMe helps reduce risk from
1:00:36
identity theft, from credit card fraud,
1:00:38
from robocalls, cybersecurity
1:00:40
threats, harassment, just
1:00:42
in general, unwanted communications.
1:00:45
So here's how it works. She signed up. You're
1:00:48
gonna have to give them some basic information because they need
1:00:50
to know what they're looking for, right? So
1:00:53
you give them that basic personal information? Don't
1:00:55
worry, they keep it safe. DeleteMe's
1:00:57
experts will find and remove your personally identifiable
1:01:00
information from hundreds of data
1:01:02
brokers. Now, that reduces your online footprint,
1:01:04
helps keep you safe. But
1:01:06
then, and this is really important, they will
1:01:09
scan and remove your personal information regularly. We're
1:01:12
talking addresses, photos, emails, relatives,
1:01:14
phone numbers, social media, property
1:01:17
value, and on and
1:01:19
on and on. The reason it's important is because these
1:01:22
data brokers are so scummy that
1:01:25
even if you do the takedown and you fill
1:01:27
out the form and say take my information down,
1:01:29
they will. But that doesn't
1:01:31
stop them from repopulating it as they get
1:01:34
more information from others. That's why DeleteMe
1:01:36
continually goes back and says, is it still
1:01:38
down? Now you
1:01:40
will want to take advantage of their privacy
1:01:42
advisors, because everybody's threat model is different. Privacy
1:01:45
advisors are there to guide you, to make sure
1:01:48
you get the support you need and understand what's
1:01:50
going on. Really,
1:01:52
this really worked for us, and
1:01:54
I highly recommend it. Protect yourself. Reclaim
1:01:57
your privacy. Join
1:01:59
DeleteMe: joindeleteme.com. The
1:02:01
offer code TWIT will get
1:02:03
you 20% off and
1:02:06
of course let them know you heard it here which is
1:02:08
important to us. So go there, joindeleteme.com. Use the offer code
1:02:12
TWIT and save 20%. This
1:02:14
service not only works, I think
1:02:16
it's more important than you might imagine from
1:02:19
your company's point of view, even
1:02:21
if it's not from your personal point of view. joindeleteme.com.
1:02:23
We thank them so much
1:02:27
for the work they've done for us and the support
1:02:29
they're giving us on Security Now. All
1:02:31
right Steve, I'm ready.
1:02:33
Let's find out what's going on. What's Google up
1:02:35
to now? This is big. I
1:02:38
mentioned last week that I thought
1:02:41
I might be on to an interesting topic
1:02:43
to explore this week. It
1:02:45
turned out that while the guy I stumbled
1:02:47
upon was the real deal, his
1:02:50
several blog postings were sharing
1:02:52
pieces from his Master of
1:02:54
Laws dissertation for the University
1:02:56
of Edinburgh. After
1:02:59
I looked into it more deeply, it didn't
1:03:01
really make for the sort of content I
1:03:03
know our listeners are looking for. This
1:03:06
scholar was carefully examining
1:03:08
the legal and policy
1:03:10
implications of Google's recent
1:03:12
work on the web,
1:03:14
the set of new technologies collectively
1:03:17
known as the privacy sandbox. And
1:03:20
he was looking at it against EU and UK
1:03:23
laws like the GDPR. What
1:03:26
would it mean in that context? And
1:03:28
this guy was not some lawyer. He
1:03:30
is a deep technology guy who
1:03:32
has been actively involved with the
1:03:34
W3C serving on many committees and
1:03:37
having co-authored a number of web
1:03:39
specs. His focus has always been
1:03:41
on privacy. And he's the
1:03:43
guy who years ago realized
1:03:45
that the addition of
1:03:47
a high resolution battery level
1:03:50
meter into the
1:03:52
HTML5 specifications would provide
1:03:54
another signal that could be
1:03:56
used for fingerprinting and tracking people across
1:03:59
the web. But, as
1:04:01
I said, his focus was
1:04:03
on what Google's recent work
1:04:05
would mean legally, and for
1:04:07
what it's worth, this very
1:04:09
well-informed legal and technical academic,
1:04:12
a guy who is also a privacy
1:04:14
nut, is quite bullish
1:04:17
on the future Google has been paving
1:04:19
for us. So
1:04:21
that just means that what we are going
1:04:24
to talk about this week is
1:04:26
all the more relevant and significant. And
1:04:29
what we are going to talk about
1:04:32
this week is something known as the
1:04:34
Protected Audience API. It's
1:04:36
another of the several components which
1:04:38
make up what Google collectively refers
1:04:41
to as their privacy sandbox. Now,
1:04:44
the name Protected Audience API
1:04:46
is every bit as awkward
1:04:48
as many of Google's other
1:04:50
names. You
1:04:52
know, they're a big company. They
1:04:55
could afford to employ someone
1:04:57
with the title of Director
1:04:59
of Naming Things and
1:05:01
give this person a big office and a
1:05:03
staff because it's clear, and it
1:05:06
will soon become much clearer, that the
1:05:08
nerds who invent this technology
1:05:11
should not be the ones to
1:05:13
name it. In this
1:05:15
instance, what's Protected is
1:05:18
user privacy. And
1:05:20
audience refers to the audience
1:05:22
for web-based display advertising. But
1:05:26
as it is, calling this the
1:05:28
Protected Audience API only tells
1:05:30
you what it is after you
1:05:32
already know, which is not the
1:05:34
definition of a great name. In
1:05:36
any event, this collection of
1:05:39
work that Google has called
1:05:41
their privacy sandbox currently
1:05:43
contains a handful, dare
1:05:45
I say a plethora,
1:05:47
of different APIs. There's
1:05:53
the new Topics API, which we've
1:05:56
previously covered at length, and
1:05:58
there's the Protected Audience API, which
1:06:00
is what we'll be looking at today.
1:06:03
But then there's also
1:06:05
something known as the
1:06:08
Private State Tokens API,
1:06:10
the Attribution Reporting API,
1:06:12
the related Website Sets
1:06:15
API, the Shared
1:06:17
Storage API, the CHIPS
1:06:19
API, the Fenced Frames
1:06:22
API, and the Federated
1:06:24
Credential Management API. And
1:06:27
if you didn't already know what
1:06:29
those things are, knowing their
1:06:31
names only helps with, you
1:06:33
know, very broad strokes. But
1:06:37
here's what everyone listening really does need to know.
1:06:39
All of this
1:06:41
brand new, serious, deliberately
1:06:44
user privacy-focused technology which
1:06:47
Google's engineers have
1:06:49
recently created and somewhat
1:06:52
unfortunately named is real.
1:06:56
It collectively represents a
1:06:58
truly major step
1:07:01
forward in web
1:07:03
technology. We all grew
1:07:06
up in and cut our teeth
1:07:08
on extremely simple web
1:07:10
technology that its founders would
1:07:12
still clearly recognize today. You
1:07:14
know, even after many years,
1:07:17
this baby hadn't grown much
1:07:19
and it was still far
1:07:21
from mature. We had cookies
1:07:23
and JavaScript and ambition
1:07:26
and a lot of ideas about what we
1:07:29
wanted to do with the web. But
1:07:31
everyone was heading in their own direction,
1:07:33
doing whatever they needed for themselves just
1:07:35
to get the job done. And
1:07:38
no one was thinking or
1:07:40
worrying about longer-term consequences. The
1:07:43
web lacked the
1:07:45
architectural and technological depth
1:07:48
to get us
1:07:50
where we needed
1:07:52
to go. So we
1:07:55
wound up with the absolute chaos of
1:07:58
tracking and identity brokering, and
1:08:00
personal data warehousing, de-anonymizing, and
1:08:03
all the rest of the
1:08:05
mess that defines today's world
1:08:08
wide web. And
1:08:11
an example of the mess we're in
1:08:13
today is the utterly pointless bureaucracy, you
1:08:15
know, the bureaucratic insanity of
1:08:18
the GDPR forcing all websites
1:08:20
to get cookie usage permission
1:08:22
from each of their visitors.
1:08:26
We know that Google
1:08:28
is fueled by the
1:08:31
revenue generated from advertising.
1:08:34
Advertisers want to know everything they possibly
1:08:36
can about their audience. They
1:08:39
want to optimize their ad buys. And
1:08:42
users are creeped out by the
1:08:44
knowledge that they're being tracked around
1:08:46
the internet and profiled. And
1:08:49
being the super heavy weight that it
1:08:51
is, Google is
1:08:53
increasingly coming under scrutiny,
1:08:56
you know, under the microscope. They
1:09:00
also have the technological savvy,
1:09:03
probably unlike most other players on
1:09:05
Earth at this time in our
1:09:08
history, to actually
1:09:10
solve this very thorny
1:09:12
problem which arises from
1:09:15
the collision of apparently
1:09:17
diametrically opposed interests on
1:09:20
today's web. One
1:09:22
thing is clear, we're in
1:09:24
desperate need of more technology
1:09:26
than cookies. Google
1:09:30
began the work to
1:09:32
truly solve these problems in earnest
1:09:35
three years ago at the start of
1:09:37
2021. And
1:09:39
this wasn't some half-baked attempt to
1:09:42
gloss over the true problems that are
1:09:44
inherent in the use of a system
1:09:46
that was never designed or intended to
1:09:48
be used as it is being
1:09:50
used today. Google's
1:09:53
Privacy Sandbox Initiative was
1:09:55
and today is a
1:09:58
significant step forward
1:10:01
in web browser technology and
1:10:03
standards, which is
1:10:05
designed to allow the
1:10:07
web to finance its
1:10:09
own ongoing existence and
1:10:11
services through advertising without,
1:10:15
in any significant way, compromising
1:10:18
the privacy of its users. Okay,
1:10:22
now, we've all, I get it, we've
1:10:24
all been so badly abused by the
1:10:26
way things have been that it
1:10:29
may be difficult to accept that
1:10:31
there truly is a way for this to
1:10:33
be accomplished, but there is, and
1:10:36
Google has done it. In
1:10:39
the future, the use of the web
1:10:41
will be much more private than it
1:10:43
ever has been since it
1:10:46
was first conceived. What's
1:10:48
required to make this possible is way
1:10:51
more technology than has
1:10:53
ever been deployed before.
1:10:56
What's been done before now,
1:10:59
you know, couldn't even be called a half
1:11:01
measure. All
1:11:03
of the various APIs I mentioned
1:11:06
above, you know, whatever it is
1:11:08
they each do, became available
1:11:10
in the middle of last year at the start
1:11:12
of the third quarter of 2023. They
1:11:16
are all operable today, right
1:11:19
now, have been for the last six months,
1:11:22
and they are in the world's
1:11:24
dominant web browser and other browsers
1:11:26
that share its chromium engine. And
1:11:30
it's not as if there wasn't
1:11:32
something, you know, well, some wrong
1:11:35
turns that were made along the way,
1:11:37
right? You know, but that's also the
1:11:39
nature of pioneering where the path hasn't
1:11:41
already been mapped out. Flock,
1:11:44
remember, Google's federated learning
1:11:47
of cohorts was
1:11:49
an attempt at generating an
1:11:51
opaque token that revealed nothing
1:11:53
about the user's browser other
1:11:55
than a collection of their interests, but
1:11:58
Flock didn't get off the ground. It
1:12:00
failed. It
1:12:04
was later replaced by Topics, which
1:12:06
is a complex but
1:12:08
extremely clever system for doing
1:12:11
essentially the same thing, but
1:12:13
in a far less opaque and
1:12:16
thus far more understandable fashion.
1:12:19
Topics allows the
1:12:21
user's browser to
1:12:24
learn about the user by
1:12:26
observing where they go on the
1:12:28
web, all
1:12:30
of which information is retained by
1:12:32
and never leaves the
1:12:35
browser. Then through
1:12:38
the use of the protected
1:12:40
audience API, which I'll get to,
1:12:43
the user's browser is
1:12:46
able to later intelligently select
1:12:48
the ads that its own
1:12:51
user will see. I know.
1:12:55
If that comes as something
1:12:57
of a surprise, it should, since it's
1:12:59
certainly not the way any of
1:13:03
this has ever worked before. Okay.
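The inversion Steve is describing, where the browser itself holds the interest profile and picks the ad, can be sketched as a toy model. This is not the real Privacy Sandbox API; every name here (the browser object, visit, pickAd, the topic strings) is a hypothetical illustration of the data flow, nothing more:

```javascript
// Toy model of the Privacy Sandbox inversion: the browser keeps the
// interest profile locally and selects the ad itself. All names are
// hypothetical; the real APIs (Topics, Protected Audience) are far
// more involved.
const browser = {
  interests: new Set(),
  // The "browser" learns interests as the user navigates; this data
  // never leaves the object.
  visit(site, topics) {
    topics.forEach(t => this.interests.add(t)); // learned locally
  },
  // Later, given candidate ads, the browser -- not a server -- chooses
  // the most relevant one.
  pickAd(candidateAds) {
    let best = null;
    for (const ad of candidateAds) {
      const score = ad.topics.filter(t => this.interests.has(t)).length;
      if (!best || score > best.score) best = { ad, score };
    }
    return best.ad;
  },
};

browser.visit("bikes-rs.example", ["cycling", "outdoors"]);
browser.visit("trailmaps.example", ["outdoors", "maps"]);

const chosen = browser.pickAd([
  { id: "ad-kitchen", topics: ["cooking"] },
  { id: "ad-bike", topics: ["cycling", "outdoors"] },
]);
// The advertiser only ever learns which ad was shown, never the profile.
console.log(chosen.id); // → ad-bike
```

The point of the sketch is the direction of the arrows: interest data flows in and stays in; only the final ad choice is visible from outside.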
1:13:07
We've got a lot to cover. It's good stuff.
1:13:09
One of the key features to note and
1:13:12
to keep in mind is
1:13:14
that this expands the role
1:13:16
of the web browser significantly.
1:13:19
There is now far more
1:13:21
going on under the covers than ever
1:13:23
before. It was once fun
1:13:26
and easy to explain how a web browser
1:13:28
cookie worked. It wasn't difficult to explain because
1:13:30
there wasn't much to it. But
1:13:33
there is very little that's easy
1:13:35
to explain about how
1:13:37
these various next-generation privacy
1:13:40
sandbox browser APIs function.
1:13:43
This is made even more difficult by the
1:13:45
fact that they're all so
1:13:48
deeply interconnected. When we
1:13:50
originally discussed topics, we
1:13:52
had no sense that its
1:13:54
purpose was to allow the
1:13:57
user's browser to autonomously perform
1:13:59
ad selection. But
1:14:01
that was always Google's intention. We
1:14:04
just needed to see more of the whole picture.
1:14:08
And even when we were only seeing a small
1:14:10
portion of the whole, explaining the
1:14:12
operation of topics required a
1:14:15
very careful description because
1:14:17
it is laced with important subtleties.
1:14:20
And I suppose that's the main point I
1:14:23
want to convey here, because
1:14:25
we're now asking so much from the
1:14:27
operation of the Web, even
1:14:29
wanting things that appear to be
1:14:32
in direct opposition, the
1:14:34
simple solutions of yesterday will
1:14:36
not get us there. So
1:14:40
what is this protected audience API? Believe
1:14:43
it or not, opaque as
1:14:45
even that name is, the good
1:14:48
news is they renamed it to
1:14:51
Protected Audience API, for the purposes of
1:14:54
the project, from
1:14:56
what it was before, which of course begs
1:14:58
the question, renamed it from what?
1:15:01
Okay, recall from earlier that they
1:15:06
abandoned FLOC, F-L-O-C, which stood for Federated
1:15:08
Learning of
1:15:11
Cohorts. In a
1:15:13
similar vein, the protected audience
1:15:15
API was originally named FLEDGE.
1:15:17
And that
1:15:20
was a painful, they
1:15:22
won't give up on these birds, it was
1:15:24
a painful reverse engineered acronym
1:15:27
which stood for First
1:15:30
Locally Executed Decision Over
1:15:33
Groups Experiment. Oh,
1:15:36
that's awful. Oh my God, yeah. That's
1:15:38
really bad. Okay, now not exactly a
1:15:40
catchy name. Where is
1:15:42
the director of naming things when you
1:15:44
need them? Because clearly, nerds
1:15:46
should not name things. Okay,
1:15:50
and what you're really not going to believe is
1:15:52
that FLEDGE grew out of a
1:15:55
project named TURTLEDOVE. I kid
1:15:57
you not. And
1:16:00
yes, TURTLEDOVE was
1:16:02
also an acronym short
1:16:04
for two uncorrelated
1:16:07
requests then locally
1:16:09
executed decision on
1:16:11
victory. Oh,
1:16:14
God, that's terrible. It's
1:16:17
really bad. They're only missing... They're
1:16:19
getting worse. They're only missing a word to
1:16:21
provide the E at the end of dove.
1:16:24
So excellent, everlasting,
1:16:27
or maybe excruciating. Yeah.
1:16:30
Yeah. Anyway, I
1:16:32
was able to explain how topics worked
1:16:34
since while it was a bit tricky
1:16:37
and subtle, it was a relatively
1:16:40
self-contained problem and solution.
1:16:44
I don't have that feeling about
1:16:46
this protected audience API
1:16:49
because as I noted earlier, they
1:16:51
each only really make coherent sense
1:16:53
when they're taken as a whole.
1:16:56
So I'm not going to explain
1:16:59
it at the same level of
1:17:01
transactional detail. Okay, but
1:17:03
I want to at least share some
1:17:05
sound bites so that you can come
1:17:07
away with some sense for what's going
1:17:09
on here. And believe me, that
1:17:12
will be enough. So at
1:17:14
the start of Google's protected
1:17:17
audience API explainer
1:17:19
page, it opens
1:17:21
with one sentence that
1:17:24
needs to be taken absolutely
1:17:26
literally. Okay, they
1:17:29
start with on
1:17:31
device ad auctions
1:17:34
to serve remarketing and custom
1:17:38
audiences without
1:17:40
cross-site third-party
1:17:43
tracking. Okay,
1:17:45
on-device ad auctions.
1:17:48
Wow. Okay, now,
1:17:50
I don't expect anyone to understand
1:17:53
in any detail what follows. I
1:17:55
don't. So just let
1:17:57
it wash over you and you'll
1:17:59
get... It's a very useful feeling
1:18:01
for what's going on. Google
1:18:04
explains, and I have explains in
1:18:06
air quotes, the
1:18:09
protected audience API uses interest
1:18:12
groups to enable sites
1:18:15
to display ads that are
1:18:17
relevant to their users. For
1:18:20
example, when a user
1:18:22
visits a site that wants to
1:18:25
advertise its products, an
1:18:27
interest group owner can
1:18:30
ask the user's browser to
1:18:33
add membership for the
1:18:36
interest group. If
1:18:38
the request is successful, the
1:18:40
browser records the name
1:18:42
of the interest group, for example,
1:18:45
custom bikes, the
1:18:47
owner of the interest group,
1:18:49
which is a URL like
1:18:51
bikesrs.example, and interest
1:18:54
group configuration information to
1:18:56
allow the browser to
1:18:58
access bidding code, and
1:19:01
ad code, and real-time
1:19:04
data if the group's owner is
1:19:06
invited to bid in an
1:19:09
ad auction. Okay,
1:19:11
I know now, just let
1:19:13
your head spin, it'll be okay. So
1:19:16
there's a feeling of
1:19:18
the way topics works here. The
1:19:21
key is that the user's
1:19:24
browser visits a site
1:19:26
like custom bikes, and
1:19:29
because their browser is at that
1:19:31
site, thus the
1:19:33
user is implicitly expressing
1:19:35
their interest in custom
1:19:38
bikes. An advertiser
1:19:40
on that site can
1:19:43
ask the user's browser to
1:19:46
collect and retain some information
1:19:49
that might be used in the
1:19:51
future if an
1:19:54
ad from that advertiser will
1:19:56
be displayed.
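What the browser retains in that exchange can be sketched as a toy simulation. The real entry point in Chrome is navigator.joinAdInterestGroup; the field names below (owner, name, biddingLogicUrl, ads) follow the examples in Google's explainer, but the storage logic itself is an invented stand-in:

```javascript
// Toy sketch of what the browser retains after an advertiser asks it
// to join an interest group. In Chrome the real call is
// navigator.joinAdInterestGroup(group, durationSeconds); here we only
// simulate the browser's storage side. Field names approximate the
// explainer; treat exact shapes as assumptions.
const interestGroupStore = new Map(); // keyed by owner + name

function joinAdInterestGroup(group, durationSeconds) {
  const key = `${group.owner}/${group.name}`;
  interestGroupStore.set(key, {
    ...group,
    // Membership is time-limited; the browser forgets it eventually.
    expiry: Date.now() + durationSeconds * 1000,
  });
}

// An ad-tech script on a custom-bikes site might register:
joinAdInterestGroup({
  owner: "https://bikes-rs.example",                     // who may later bid
  name: "custom-bikes",                                  // the interest group
  biddingLogicUrl: "https://bikes-rs.example/bid.js",    // fetched at auction time
  ads: [{ renderUrl: "https://bikes-rs.example/ad1.html" }],
}, 30 * 24 * 60 * 60); // e.g. 30 days

// Everything flows INTO the browser; the advertiser learns nothing here.
console.log(interestGroupStore.has("https://bikes-rs.example/custom-bikes"));
```

Note that the record carries everything a future auction needs (where the bidding code lives, which creatives exist), so no callback to the advertiser is required at join time.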
1:19:58
Now note, importantly, that the advertiser
1:20:01
learns exactly nothing
1:20:04
about the visitor to the site. All
1:20:07
of the information flow is
1:20:09
into the user's browser and
1:20:12
only because of the website
1:20:15
they're visiting. Okay,
1:20:18
now Google and I continue, and I say that because
1:20:20
I had to fix this language to
1:20:22
even give us a hope of understanding
1:20:24
it. So I've clarified this.
1:20:28
So they said, later when
1:20:30
the user visits the site
1:20:32
with available ad space, the
1:20:35
ad space seller, either
1:20:38
a seller side provider or the
1:20:40
site itself, can
1:20:42
use the protected
1:20:44
audience API to
1:20:47
run a browser
1:20:49
side ad auction,
1:20:52
which will select the most
1:20:54
appropriate ads to display to
1:20:57
the user. The
1:20:59
ad space seller calls
1:21:02
the browser's new
1:21:05
navigator.runAdAuction
1:21:08
function to
1:21:10
provide the browser with a list
1:21:12
of interest group owners who
1:21:14
are invited to bid. Bids
1:21:17
can only be provided by interest groups
1:21:19
that the browser had already become
1:21:22
a member of when
1:21:24
it had previously visited a website
1:21:26
where it was able to collect
1:21:28
that group and when
1:21:30
the owners of those interest groups had been
1:21:32
invited to bid. The bidding
1:21:35
code is retrieved from a
1:21:37
URL provided in the interest
1:21:39
group's configuration that was received
1:21:41
earlier. This code,
1:21:44
which is JavaScript, is provided with data
1:21:46
about the interest group and
1:21:48
information from the ad seller
1:21:50
along with contextual data about
1:21:52
the page and from the
1:21:54
browser. Each
1:21:56
1:22:00
interest group providing a bid is known
1:22:03
as a buyer. When
1:22:06
the visited site's JavaScript
1:22:09
calls the new browser function to run
1:22:11
the ad auction, each buyer's
1:22:13
bidding code generates a
1:22:16
bid with the help of real-time
1:22:18
data provided by their
1:22:20
protected audience key value service,
1:22:23
whatever that is. Then the advertising
1:22:26
space seller receives
1:22:28
these bids as well as
1:22:31
seller-owned real-time data and scores
1:22:33
each bid. The bid
1:22:35
with the highest score wins the
1:22:38
auction. The winning ad
1:22:40
is displayed in a
1:22:42
fenced frame, which is
1:22:44
one of those new APIs which
1:22:46
absolutely prevents it from having any interaction
1:22:48
with anything else anywhere. The
1:22:51
ad creative's URL is specified in
1:22:53
the bid and the origin must match
1:22:56
one in the list provided by the
1:22:58
interest group's configuration, that same information that
1:23:01
was received earlier. Finally,
1:23:03
the advertising space seller
1:23:06
can report the auction outcome with
1:23:08
a function known as reportResult,
1:23:10
and buyers can report
1:23:12
their auction wins with a new
1:23:14
function, reportWin.
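The auction flow just quoted, where each buyer's code generates a bid and the seller's code scores it, reduces to something like this toy simulation. It is not the real navigator.runAdAuction; the signal shapes and helper names here are assumptions made purely for illustration:

```javascript
// Toy in-browser auction: buyers generate bids, the seller scores
// them, highest score wins. A simulation of the SHAPE of
// navigator.runAdAuction, not the real thing.

// Each previously joined interest group supplies bidding logic.
const buyers = [
  {
    owner: "https://bikes-rs.example",
    generateBid: (signals) => ({
      bid: signals.interest === "custom-bikes" ? 2.5 : 0.1,
      renderUrl: "https://bikes-rs.example/ad1.html",
    }),
  },
  {
    owner: "https://shoes.example",
    generateBid: () => ({ bid: 1.0, renderUrl: "https://shoes.example/ad.html" }),
  },
];

// The seller's scoring code; in this toy the score is just the bid value.
const scoreAd = (bid) => bid.bid;

function runAdAuction(participants, signals) {
  let winner = null;
  for (const buyer of participants) {
    const bid = buyer.generateBid(signals); // buyer-side logic
    const score = scoreAd(bid);             // seller-side logic
    if (!winner || score > winner.score) winner = { buyer, bid, score };
  }
  // In the real API the winning renderUrl is shown in a fenced frame,
  // and reportResult / reportWin fire afterwards.
  return winner;
}

const winner = runAdAuction(buyers, { interest: "custom-bikes" });
console.log(winner.bid.renderUrl);
```

Everything above runs on the user's device; the only things that would leave the browser are the winning creative request and the aggregate reports.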
1:23:17
and finally a bit later, Google
1:23:19
offers a bit more detail writing
1:23:23
in the protected audience API an ad
1:23:27
auction is a collection of
1:23:29
small JavaScript programs the
1:23:31
browser runs on the user's
1:23:34
device to choose an ad. To
1:23:37
preserve privacy all ad auction code
1:23:39
from the seller and buyers is
1:23:41
run in isolated JavaScript
1:23:44
worklets that cannot talk
1:23:46
to the outside world. A
1:23:48
seller, a publisher or a
1:23:50
supply side platform initiates
1:23:53
a protected audience ad
1:23:55
auction on a site
1:23:57
that sells ad space such as a news
1:24:00
site. The seller chooses
1:24:02
buyers to participate in the
1:24:05
auction, indicates what space is for sale
1:24:08
and provides additional criteria for
1:24:10
the ad. Each buyer
1:24:12
is the owner of an interest group.
1:24:15
The seller provides the browser
1:24:17
with code to score bids,
1:24:19
which includes each bid's value,
1:24:22
the ad creative URL and
1:24:24
other data returned from each
1:24:26
buyer. During the auction, bidding
1:24:29
code from buyers and bid-scoring
1:24:31
code from the seller can receive
1:24:33
data from their key value services.
1:24:36
Once an ad is chosen and displayed in a
1:24:39
fenced frame to preserve privacy, the
1:24:41
seller and the winning buyer can
1:24:43
report the auction result.
1:24:46
now, if all
1:24:48
of this sounds insanely complex, you're
1:24:50
right. This is not your grandpa's third-party
1:24:54
cookies anymore. Nor are our web browsers simple apps
1:24:56
running on our chosen OS
1:25:01
to display HTML
1:25:03
code. Those are the days that are
1:25:05
long gone and they're not coming back.
1:25:08
It should now be abundantly clear to everyone that what
1:25:10
Google has done with this privacy
1:25:14
sandbox is to radically transform our web
1:25:16
browser. From
1:25:19
passive displays of whatever page
1:25:21
is sent to them into
1:25:23
proactive advertising management engines.
1:25:28
All of this new technology is already
1:25:30
built into Chrome and has been there for the past
1:25:36
six months. Does all this probably
1:25:38
give Sir Timothy John Berners-Lee, the
1:25:41
web's original inventor,
1:25:43
a huge headache? I would not at all be
1:25:45
surprised if it did. Nothing less
1:25:47
than an
1:25:51
incredible mess is required to deliver
1:26:00
interest-driven advertising to users
1:26:03
without revealing anything about
1:26:06
those users to their
1:26:08
advertisers. And by
1:26:10
the way, an incredible mess, as I
1:26:13
said earlier, was the runner-up title for
1:26:15
today's podcast. A
1:26:17
large part of what I want to convey
1:26:19
here is that nothing
1:26:21
short of this level of
1:26:23
complexity is required to protect
1:26:26
our privacy while providing
1:26:28
what the websites we depend
1:26:30
upon, and want unpaid
1:26:33
access to, say they need.
1:26:37
Now, the nature of
1:26:39
inertia means that
1:26:41
we would never, and
1:26:43
I really mean never, move
1:26:46
from the absolute mess we're
1:26:48
in today to this new
1:26:50
promised land were it not
1:26:52
for a behemoth like Google to
1:26:54
first carefully design and
1:26:56
craft this solution, doing
1:26:59
so openly and in
1:27:01
full public view, inviting
1:27:03
collaboration and industry participation at
1:27:05
every step of the way
1:27:07
as they have, and
1:27:09
secondly, to then literally
1:27:12
force it down the
1:27:14
closed choking throats of
1:27:17
the rest of the existing
1:27:19
advertising technology industry by taking
1:27:22
Chrome, their world-dominating
1:27:24
browser and gradually
1:27:27
deprecating and foreclosing upon the
1:27:29
operation of all of the
1:27:31
previous tricks and techniques that
1:27:34
have historically been used
1:27:36
for user tracking and
1:27:38
compromising users' privacy
1:27:40
in the service of advertising tech.
1:27:45
No one else could do this but
1:27:47
Google. This is not
1:27:49
something where consensus could ever have
1:27:51
been reached. It would never happen.
1:27:54
It would be committee deadlock. I've
1:27:57
looked at the various ad tech blogs,
1:28:00
and they're all screaming and pulling their
1:28:02
hair out over this. But
1:28:04
they're all also busily conducting
1:28:07
experiments and getting ready for
1:28:09
what they too understand is
1:28:11
already inevitable. Notice
1:28:14
that one of the things Google has
1:28:16
done with this reconceptualization of web advertising
1:28:19
is to move the
1:28:21
advertising auctioning process away
1:28:24
from the advertiser and
1:28:26
into the browser. Traditionally
1:28:28
an advertiser would purchase
1:28:30
real estate on the web
1:28:32
on website pages. Then
1:28:35
they would run their own real
1:28:37
time auctions to determine which of
1:28:39
their many advertising clients' ads should
1:28:41
be inserted into that space for
1:28:43
any given visitor given
1:28:46
everything that the advertising
1:28:48
seller knows about
1:28:50
the visitor from tracking
1:28:53
them across the internet. This
1:28:55
changes all of that. Now
1:28:59
all of the work is being done on
1:29:02
the client side rather than on
1:29:04
the server end, and doing
1:29:06
this starves advertisers
1:29:08
of all the data
1:29:11
they were previously collecting
1:29:13
while convincingly arguing against
1:29:15
their having any
1:29:18
further need to ever collect anything.
1:29:22
In this new world, advertisers
1:29:25
place static purchase offers
1:29:28
to display content on
1:29:30
website real estate with whatever
1:29:33
ads they have to display organized
1:29:36
by interest group. Using
1:29:39
Google's new APIs, browsers
1:29:41
that had previously visited websites
1:29:43
representing various interest groups are
1:29:46
now able to collect the
1:29:48
advertiser's material that will later
1:29:50
be needed to display
1:29:53
ads for those interests.
1:29:56
Then later, when browsers visit
1:29:58
other websites with sell
1:30:00
offers behind available advertising
1:30:03
real estate, all
1:30:05
of the information about the
1:30:07
offers flows into the browser,
1:30:10
which then itself conducts
1:30:12
the auction and selects the
1:30:14
ad that is most relevant
1:30:16
to its user based
1:30:19
upon the places the browser has
1:30:21
visited during the past few weeks.
1:30:24
The results of the auction are
1:30:26
returned to all interested parties and
1:30:29
the ad tech company pays a piece of
1:30:31
the action or of the auction
1:30:34
to the site that offered up the real
1:30:36
estate. In something
1:30:38
of a follow up, Google explains,
1:30:41
quote, understanding user interests can
1:30:44
enable more relevant ads than
1:30:46
just choosing ads based on
1:30:49
site content, contextual
1:30:51
targeting, or by
1:30:53
using information provided by
1:30:56
a user to the site on which
1:30:58
the ad appears, first party
1:31:00
ad targeting. Traditionally,
1:31:02
ad platforms have learned
1:31:04
about user interests by tracking their
1:31:07
behavior across sites. Browsers
1:31:10
need a way to enable
1:31:12
ad platforms to select relevant
1:31:14
ads so content publishers
1:31:17
can get ad revenue without
1:31:19
cross site tracking. The
1:31:22
protected audience API aims
1:31:25
to move the web platform
1:31:27
closer to a state where
1:31:29
the user's browser on their
1:31:31
device, not the
1:31:33
advertiser or ad tech
1:31:36
platforms, holds the
1:31:38
information about what that
1:31:40
person is interested in. And
1:31:44
that states it perfectly, I think. The
1:31:47
way the entire web advertising world
1:31:49
has worked until now is
1:31:51
that every advertiser had
1:31:53
to collect all of
1:31:55
the information they possibly could about
1:31:58
every individual who was surfing
1:32:00
the internet for the sole purpose
1:32:03
of selecting the best advertisement to
1:32:05
show them. The
1:32:07
result was massively intrusive,
1:32:10
massively redundant,
1:32:12
and ultimately ineffective
1:32:15
utilization of resources.
1:32:18
But in the new world of
1:32:20
Google's Privacy Sandbox, it's
1:32:23
the user's browser that collects
1:32:25
the information about its own
1:32:27
users' interests by watching them
1:32:29
navigate the web. As
1:32:32
the browser moves around the web, future
1:32:36
advertising opportunities are collected by
1:32:38
the browser. And
1:32:41
later, when visiting a site that
1:32:43
is offering some available advertising space,
1:32:45
the browser itself runs an
1:32:47
auction on the fly to
1:32:50
decide which of the opportunities
1:32:52
it previously collected should
1:32:55
be presented to its user
1:32:57
based upon the criteria that
1:32:59
it solely maintains.
1:33:02
This is obviously a big deal, but
1:33:06
what seems just as obvious is
1:33:09
that no lesser of a deal would
1:33:11
get this important job done right.
1:33:15
We can argue, and we'll
1:33:17
always be able to argue, we
1:33:19
certainly know that the EFF will
1:33:21
always argue that all website user
1:33:24
driven advertising customizations should
1:33:27
simply be ended and
1:33:29
that advertisers should settle for
1:33:31
contextual advertising, placing their ads
1:33:34
on sites which are offering
1:33:36
content that's relevant and related
1:33:38
to their ads, just
1:33:40
like in the pre-tracking days. Unfortunately,
1:33:44
multiple studies have shown that
1:33:46
this would reduce website advertising
1:33:49
revenue by about half, and
1:33:52
many websites are barely making ends meet
1:33:54
as it is. So
1:33:56
the EFF's ivory tower stance
1:33:58
is simply not practical. It's
1:34:00
never going to happen. The
1:34:02
only way to permanently end
1:34:05
tracking is for it
1:34:07
to be flatly outlawed, but
1:34:10
tracking will never be outlawed while
1:34:12
the case can be made that
1:34:14
advertising customization is the only thing
1:34:17
that's keeping today's web alive and
1:34:19
financed and that there's
1:34:22
no alternative to tracking and compiling
1:34:24
interest profiling dossiers on
1:34:27
everyone using the internet.
1:34:30
So what Google has done is to
1:34:33
create a practical and functioning
1:34:36
alternative. Tracking is
1:34:38
no longer necessary. User
1:34:40
privacy is preserved and once
1:34:42
this new system has been
1:34:44
established we can anticipate that
1:34:46
we will finally see legislation
1:34:48
from major governments, probably with
1:34:50
Europe taking the lead, which
1:34:53
will flatly and without exception
1:34:55
outlaw any and all
1:34:58
internet user profiling and history
1:35:00
aggregation because it will no
1:35:02
longer be required. Google's
1:35:05
privacy sandbox masterpiece
1:35:07
has been in place as I've said several times
1:35:09
for the past six months and
1:35:12
although they've already been kicking
1:35:14
and screaming all other
1:35:16
serious advertisers have been exploring
1:35:18
it in anticipation of the
1:35:21
future which appears to be
1:35:23
all but certain. As
1:35:25
we move into 2024, fingerprinting
1:35:28
will become increasingly fuzzy
1:35:31
and Chrome's third-party cookie support
1:35:34
will be gradually withdrawn from
1:35:37
its ubiquitous web browser and
1:35:40
finally once the dust settles on
1:35:42
all this we can anticipate the
1:35:45
end of the annoying cookie permission
1:35:47
request pop-ups. We
1:35:50
are heading toward a brand
1:35:52
new web. Do you
1:35:54
think that, like Manifest V3, this
1:35:56
will be adopted by other browsers at
1:36:00
some point? Although, as you pointed
1:36:02
out earlier, Google has complete dominance
1:36:04
in the browser. They have complete
1:36:06
dominance, not only them but all
1:36:08
Chromium. So really it's Safari and
1:36:11
Firefox that are
1:36:13
the remaining wildcards. And
1:36:16
this is what Google is going to
1:36:18
do. I think
1:36:20
they've nailed it. They
1:36:22
have a solution. And
1:36:25
the way they've nailed it is by massively
1:36:27
burdening the browser. Well, I'm going to say
1:36:29
that my system is
1:36:31
now working really hard to deliver
1:36:34
ads. Yep. By
1:36:38
the way, goodness, this would
1:36:40
be very easy to block. Yes,
1:36:43
and in fact you can opt out of this. Can
1:36:46
you? Absolutely. There
1:36:48
is a user-facing API that lets
1:36:50
you just say no. Google
1:36:53
knows most people will not say no.
1:36:56
And I will not say no. If
1:37:01
my use of the web is
1:37:03
now private and my browser
1:37:05
is selecting the best ads for me
1:37:07
to see which is returning the highest
1:37:09
amount of revenue to the websites I'm
1:37:12
visiting, it is a
1:37:14
win-win-win. It's really an interesting
1:37:16
idea. It's a great solution in terms
1:37:19
of protecting your privacy. Yes, it
1:37:22
turned the entire model on its
1:37:24
head. And
1:37:26
the fact is, today's, once
1:37:29
upon a time, a browser was a
1:37:31
little HTML rendering engine. Now
1:37:34
it is literally a behemoth. Well
1:37:36
that's one of the things that bothers me. The
1:37:39
browser is going to be 90% of your CPU pretty
1:37:41
soon. It
1:37:44
will. Although
1:37:46
it is little lightweight scripts, and we know
1:37:48
that Google has a frenzy about
1:37:52
performance and how
1:37:54
quickly this
1:37:56
all displays. Here's where Tim Berners-Lee might
1:37:58
actually like this. He's been
1:38:01
working toward, ah, a
1:38:05
solution where you control your own
1:38:07
data. You know that your
1:38:09
data is yours and you lease
1:38:11
it out, in effect, to people, which this
1:38:13
is basically an implementation of, so it
1:38:15
fits right into what Tim Berners-Lee
1:38:17
has been doing of late. I
1:38:19
think it is possible Google may have
1:38:22
found a way to give
1:38:24
what we would like. Our holy grail would
1:38:26
be for us to control our own
1:38:28
information about ourselves and then have the opportunity,
1:38:31
if we wish, to share it, but at
1:38:33
a price, you know, so that we get something
1:38:35
out of it. Well, as far as I
1:38:38
know, there is
1:38:40
no sharing opportunity. What there is in the
1:38:42
UI, you can even browse the interest
1:38:45
groups your browser has, again standing
1:38:47
things on their head. And if
1:38:49
you object to any, you're able to delete
1:38:51
them and you're able to mark them as
1:38:54
never coming back if you just don't want
1:38:56
them. It's interesting. They've
1:38:58
really, they've nailed this. Yeah,
1:39:00
and this is where they are
1:39:02
going, and we know who they are.
1:39:05
And their browser. It's
1:39:07
funny, you know, we've given
1:39:09
a lot of space to the notion
1:39:12
of fingerprinting because it's kind
1:39:14
of a cool technology, but everybody is still
1:39:16
using cookies. And
1:39:18
so when Google talks about it, right?
1:39:21
Wow. As of
1:39:23
the beginning of the year, one percent
1:39:25
of their users have third-party cookies
1:39:27
turned off. And they're gonna
1:39:29
be... You know, they're doing
1:39:31
that as an initial experiment, and then
1:39:34
they're going to be deprecating the rest
1:39:36
of third-party cookies. There will be no
1:39:38
more third-party cookies
1:39:40
by the middle of this year. Huge. That
1:39:42
said, as of now, that's
1:39:45
what's got the advertisers screaming and thinking,
1:39:47
well, we liked knowing all
1:39:49
this about people, but we're gonna have
1:39:51
to fall in line because this is the future.
1:39:53
Now it is the future. And it
1:39:55
really is a response to widespread ad
1:39:58
blocking, the cookie pop-ups and
1:40:00
other GDPR requirements. Yeah,
1:40:02
I think it's interesting. Let's see
1:40:04
what happens. They've thrown so
1:40:06
many ideas up against the wall, none of them have
1:40:08
stuck. This might be the one. It
1:40:11
does solve the problem. I see nothing wrong with it.
1:40:14
Yeah. Good. Thank
1:40:16
you for filling us in the protected audience
1:40:18
API. Terrible
1:40:21
name, but a very interesting concept.
1:40:23
Yeah. Your browser is the
1:40:25
one that determines what you see. Yeah, and even
1:40:27
privacy sandbox. I mean, that doesn't tell you anything.
1:40:29
No. Like, you know, don't kick sand in my
1:40:31
eyes. Good. We'll talk about it tomorrow with Jeff.
1:40:34
Cool. Yeah. Thank you,
1:40:36
Steve. As always, Security Now is a
1:40:39
must-listen every Tuesday, right? We
1:40:41
record the show Tuesday right after
1:40:43
MacBreak Weekly, about 1:30
1:40:45
Pacific, 4:30 Eastern, 2130 UTC.
1:40:48
When we go live, when we start
1:40:50
recording, we'll go live on YouTube, youtube.com/twit.
1:40:52
So you can watch if you want
1:40:54
while we're doing it. Of course, club
1:40:56
members get special access. They
1:40:58
get 24 seven access to all
1:41:00
of our shows and, Steve, even shows we don't
1:41:02
put out in public. The club is more and more
1:41:05
important as a way of going forward of us, uh, monetizing
1:41:08
because advertisers, you
1:41:11
know, they honestly, especially security now listeners,
1:41:13
but they love them and they hate
1:41:15
them. They love them because you guys
1:41:17
are the ones making all the ad, rather,
1:41:20
the technology buying decisions in
1:41:22
your companies. So they think you're great. I
1:41:24
mean, they're really crazy about you at the
1:41:26
same time. You're also the ones running ad
1:41:28
blockers. You're the ones they can't really reach.
1:41:32
Uh, and uh, and maybe that's why this protected
1:41:34
API is the, is the right way to do
1:41:36
it. Um, but what we would
1:41:38
like to offer you is a
1:41:40
chance to support a twit and this show and
1:41:42
all our shows directly by joining the club, twit.tv
1:41:45
slash clubtwit at seven bucks a month.
1:41:47
You get ad free versions of this show and
1:41:49
all of our other shows. You also get access
1:41:51
to our discord so you can chat along with
1:41:53
the show while you're watching it. You get
1:41:55
shows we don't put out anywhere else. Uh,
1:41:58
but you really also get the warm and fuzzy feeling that
1:42:00
you're helping us do, I think
1:42:02
what is a very important job, filling you in
1:42:04
on what's happening in technology thanks to people like
1:42:07
Steve. If you are interested,
1:42:09
please, twit.tv slash club twit.
1:42:11
I thank you in advance for your support.
1:42:14
While you're over there, take the survey. It's
1:42:16
really important that everybody, every show gets
1:42:19
well represented in the survey so
1:42:21
that we know, you know, what you want
1:42:23
as opposed to what people listening to other
1:42:25
shows want. Twit.tv slash survey 24. It'll take
1:42:27
you a couple of minutes. It's a lot
1:42:29
shorter this year. It's very easy. It's the
1:42:31
only thing we do once a year to
1:42:34
kind of figure out what your interests are.
1:42:36
So if you get a chance, please, twit.tv slash
1:42:38
survey 24. Thank you for
1:42:40
that as well. Steve's website is
1:42:43
grc.com. When you get there, you'll find all
1:42:45
of his great tools, including the world's
1:42:47
best mass storage maintenance and recovery utility,
1:42:50
Spinrite 6. Big announcement
1:42:52
next week. I got my
1:42:54
fingers crossed. It's possible. Believe me, mine
1:42:56
are, everything's crossed. I can't walk.
1:43:00
Here's the good news: if you buy 6.0 now,
1:43:03
you'll get 6.1 for free when
1:43:05
it comes out, the minute it
1:43:07
comes out. So definitely check that out. Steve
1:43:09
also has the show, in fact, he has two
1:43:11
unique versions of the show. Of course, he has
1:43:13
a 64 kilobit audio, the same
1:43:16
audio we have, but he has 16 kilobit
1:43:18
audio, so it's a much smaller file
1:43:21
size. What is that, one-fourth the size,
1:43:23
something like that? So by doing that, he
1:43:27
makes it easy for you to download
1:43:29
if you're in a bandwidth impaired situation.
1:43:31
He's even got a smaller version, which
1:43:33
is the text version of the show.
1:43:35
Yeah, human written, not AI generated, human
1:43:37
written transcript of this show, and every
1:43:39
show he's done. So it's very
1:43:41
useful for searching, reading
1:43:44
along as you listen, or if you don't have
1:43:46
time to listen, just reading the show notes, it's
1:43:48
all at grc.com. Go
1:43:51
to twit.tv slash sn for the 64 kilobit
1:43:54
audio or the video, that's our unique
1:43:56
version, or go to YouTube; there's a
1:43:58
channel dedicated to Security Now, where the
1:44:00
video is. That's great for sharing little clips. And
1:44:02
of course, probably the best way to
1:44:04
get the show, make sure you don't miss a single
1:44:07
episode, subscribe in your favorite podcast player. You'll
1:44:09
get it automatically as soon as we finish it on a
1:44:11
Tuesday evening. And you can have
1:44:13
it for your Wednesday morning commute or listen
1:44:15
at your leisure whenever you want. Just
1:44:18
look for Security Now, wherever finer
1:44:20
podcasts are aggregated and distributed via
1:44:22
RSS. Steve Gibson, grc.com.
1:44:24
Thank you, sir. Have a wonderful
1:44:27
week. I'll be back in
1:44:29
studio next week for another gripping, thrilling
1:44:31
edition of Security Now. See
1:44:34
you then, my friend. Bye. Hey,
1:44:36
I'm Rod Pyle, editor in chief of Ad Astra
1:44:38
magazine. And each week I join with my co-host
1:44:40
to bring you this week in space, the latest
1:44:43
and greatest news from the final frontier. We
1:44:45
talk to NASA chiefs, space scientists, engineers, educators
1:44:47
and artists. And sometimes we just shoot the
1:44:50
breeze over what's hot and what's
1:44:52
not in space books and TV. And we do
1:44:54
it all for you, our fellow true believers. So
1:44:56
whether you're an armchair adventurer or waiting for your
1:44:58
turn to grab a slot in Elon's Mars rocket,
1:45:01
join us on this week in space and be
1:45:03
part of the greatest adventure of all time. So
1:45:16
how do we get AI right? Well,
1:45:19
we need the right volume of
1:45:21
data, the software to train it,
1:45:24
and massive compute power. Another
1:45:26
one bites the dust. I really hate
1:45:28
reading this, you know?
1:45:31
But with HPE
1:45:33
GreenLake we get access to supercomputing to
1:45:35
power AI at the scale we need,
1:45:37
helping generate better insights. Ah,
1:45:41
nice teamwork, guys. Search HPE
1:45:43
GreenLake.