Transcript
1
00:00:00,550 --> 00:00:04,430
Ned: Once again, I spent 30-plus minutes
2
00:00:04,440 --> 00:00:08,240
fighting with my webcam and microphone setup.
3
00:00:08,610 --> 00:00:11,129
Chris: You mean the webcam that you bought because it was going to be easier?
4
00:00:11,670 --> 00:00:12,400
Ned: That’s the one, yeah.
5
00:00:12,570 --> 00:00:12,800
Chris: Yeah.
6
00:00:13,639 --> 00:00:14,349
It’s going good, then?
7
00:00:14,900 --> 00:00:15,590
Ned: Going great.
8
00:00:16,090 --> 00:00:18,259
There was a firmware upgrade that failed.
9
00:00:18,510 --> 00:00:22,330
So, then I had to look it up, and what I discovered is that the
10
00:00:22,330 --> 00:00:26,290
cable they ship you is a USB-C to USB-C, and I happen to have
11
00:00:26,480 --> 00:00:30,320
a USB-C port on my computer, so that’s what I plugged it into.
12
00:00:30,809 --> 00:00:33,919
If you try to do the firmware upgrade across that cable, it will fail.
13
00:00:34,389 --> 00:00:37,940
Because everything is awful, and burn it with fire [laugh].
14
00:00:38,310 --> 00:00:38,730
Chris: Cool.
15
00:00:47,460 --> 00:00:51,669
Ned: Oh hello, alleged human, and welcome to the Chaos Lever podcast.
16
00:00:51,719 --> 00:00:54,509
My name is Ned, and I’m definitely not a robot.
17
00:00:55,059 --> 00:01:00,230
I’m a real human person who has roots, and sentience, and
18
00:01:00,230 --> 00:01:03,840
gets jokes, and sometimes I even go for long strolls
19
00:01:04,059 --> 00:01:10,380
with my canine companion who is also not a robot, for real.
20
00:01:10,940 --> 00:01:12,630
With me is Chris, who’s also here.
21
00:01:13,049 --> 00:01:13,540
Hi, Chris.
22
00:01:14,200 --> 00:01:15,449
Chris: I really thought you were going to say
23
00:01:15,470 --> 00:01:18,350
that the dog is here and is now the new co-host.
24
00:01:18,360 --> 00:01:20,620
Ned: [laugh]. Don’t tempt me [laugh].
25
00:01:20,940 --> 00:01:22,339
Chris: Hard decisions have been made.
26
00:01:23,030 --> 00:01:24,600
Kibble bribery has happened.
27
00:01:25,410 --> 00:01:26,460
Ned: Ah yes.
28
00:01:26,539 --> 00:01:27,779
Bark-a-tron 3000
29
00:01:27,789 --> 00:01:30,409
has opinions, and he wants you to hear about them.
30
00:01:31,099 --> 00:01:33,550
Chris: Especially about the mailman.
31
00:01:33,560 --> 00:01:35,980
Ned: [laugh]. Mostly the UPS guy.
32
00:01:36,440 --> 00:01:38,860
Just loses his goddamn mind over that.
33
00:01:39,559 --> 00:01:41,179
That and a woman pushing a stroller.
34
00:01:41,259 --> 00:01:43,380
That’s just, like, Threat Level Midnight.
35
00:01:43,770 --> 00:01:47,159
Chris: Can you imagine if there was just a number of packages in the stroller?
36
00:01:47,850 --> 00:01:49,089
Ned: His brain would explode.
37
00:01:49,809 --> 00:01:51,910
Central Processing Unit just… pop.
38
00:01:51,920 --> 00:01:55,250
[I had] a really weird experience walking the dog this morning.
39
00:01:55,280 --> 00:01:58,530
Like the actual real dog… and not my fake
40
00:01:58,530 --> 00:02:00,650
robot dog that we used for the sake of a bit.
41
00:02:01,469 --> 00:02:04,789
We’re walking, and we look across the street, and there’s a fox.
42
00:02:05,180 --> 00:02:07,310
Now that, in and of itself, is not odd.
43
00:02:07,920 --> 00:02:09,840
We see foxes all the time.
44
00:02:10,620 --> 00:02:16,720
But the fox stopped and stared at us, and then started mewling.
45
00:02:16,720 --> 00:02:21,350
I learned what the fox says, and it sounds kind of like a dying baby.
46
00:02:21,790 --> 00:02:23,730
Chris: Mmm, yeah, that sounds about right.
47
00:02:24,289 --> 00:02:26,710
Ned: And it just continued to do that, looking at us.
48
00:02:26,740 --> 00:02:30,079
And my dog was very disturbed by this, as was I.
49
00:02:30,300 --> 00:02:32,800
It’s also, like, 5:30 a.m.
50
00:02:32,809 --> 00:02:34,370
so it’s barely light outside.
51
00:02:34,929 --> 00:02:38,220
So, we walk slowly in the other direction, keeping an eye
52
00:02:38,220 --> 00:02:41,210
on that fox, and now I’m a little scared to go outside.
53
00:02:41,719 --> 00:02:45,630
Chris: Well, that’s a useful and fun PSA, that if you’re
54
00:02:45,630 --> 00:02:47,910
out in the woods, and you hear what sounds like a baby
55
00:02:47,910 --> 00:02:52,170
crying, do not try to find it because it’s not a baby.
56
00:02:53,020 --> 00:02:56,460
It’s an animal with teeth, and probably rabies, and maybe a knife.
57
00:02:57,570 --> 00:02:58,919
Ned: [laugh]. They are tricksie.
58
00:02:59,359 --> 00:03:02,950
That’s one thing I’ve learned from Aesop’s Fables: you can’t trust a fox.
59
00:03:03,690 --> 00:03:05,060
Chris: A gift that keeps on giving.
60
00:03:05,810 --> 00:03:06,980
Ned: Every day of my life.
61
00:03:07,870 --> 00:03:11,440
Anyhow, you did a thing, or you wrote a thing about a thing you didn’t do.
62
00:03:11,820 --> 00:03:13,279
Chris: Hey, hey, hey.
63
00:03:13,280 --> 00:03:13,390
Ned: [laugh]
64
00:03:14,030 --> 00:03:14,840
Chris: You settle down.
65
00:03:15,589 --> 00:03:20,860
This is a long-standing tradition on this show, where I don’t go
66
00:03:20,860 --> 00:03:24,690
to various conferences, then tell you about them, so you don’t have to go.
67
00:03:24,800 --> 00:03:25,110
Ned: Yay.
68
00:03:25,290 --> 00:03:26,880
Chris: I’m helping.
69
00:03:27,240 --> 00:03:27,580
Ned: Sure.
70
00:03:28,590 --> 00:03:32,439
Chris: In particular, in this instance, we will be talking about the RSA
71
00:03:32,450 --> 00:03:37,430
Conference, or RSAC as they insist on calling it, which I absolutely will not.
72
00:03:38,200 --> 00:03:40,380
Ned: That makes me really uncomfortable.
73
00:03:41,200 --> 00:03:41,230
Chris: [laugh]
74
00:03:41,240 --> 00:03:43,980
Ned: Like, I shifted around in my seat a little bit when you said that.
75
00:03:45,460 --> 00:03:49,410
Chris: Now, for those who aren’t a hundred percent familiar, the RSA Conference
76
00:03:49,410 --> 00:03:53,400
is one of the premier security conferences that comes along every year.
77
00:03:54,039 --> 00:03:56,950
This year, it was held in San Francisco, which is not
78
00:03:56,950 --> 00:03:59,640
a surprise because it’s always held in San Francisco.
79
00:04:00,210 --> 00:04:00,600
Duh.
80
00:04:00,610 --> 00:04:01,510
Ned: Of course.
81
00:04:02,290 --> 00:04:04,519
Chris: Now, in another long-standing tradition, as
82
00:04:04,520 --> 00:04:07,859
discussed before, I didn’t go, but I heard about it.
83
00:04:08,280 --> 00:04:08,690
Ned: Yay.
84
00:04:09,429 --> 00:04:14,140
Chris: So, as usual, the sessions did all get recorded, and if you
85
00:04:14,190 --> 00:04:19,399
bought the remote attendee On-Demand Pass, you can watch them all now.
86
00:04:19,800 --> 00:04:22,070
And you can, in fact, still buy the On-Demand
87
00:04:22,070 --> 00:04:24,200
Pass if this stuff is all interesting to you.
88
00:04:25,110 --> 00:04:27,689
There are already some sessions available for
89
00:04:27,690 --> 00:04:31,780
free, especially the keynote-y type sessions.
90
00:04:32,550 --> 00:04:35,210
Now, their official statement on the matter is that this
91
00:04:35,210 --> 00:04:38,840
stuff starts to become available approximately 30 days after
92
00:04:38,840 --> 00:04:42,350
the conference, but they’re actually already sharing them.
93
00:04:42,540 --> 00:04:43,790
There’s a good amount out there, and
94
00:04:43,790 --> 00:04:45,469
they’re going to just keep on trickling out.
95
00:04:45,490 --> 00:04:49,060
So, keep an eye on the RSA Conference’s YouTube channel if you’re
96
00:04:49,070 --> 00:04:52,880
interested in what you can see from this conference for free.
97
00:04:54,340 --> 00:05:00,829
Now, I said keynote-y type sessions and I said that for a reason.
98
00:05:01,540 --> 00:05:03,390
There are 36 keynotes.
99
00:05:04,030 --> 00:05:05,650
Ned: Uh— [sigh] —I—
100
00:05:05,959 --> 00:05:09,630
Chris: Which, if you’re doing the math at home, is a lot.
101
00:05:10,180 --> 00:05:12,690
Ned: It ceases to be a keynote at that point.
102
00:05:13,180 --> 00:05:13,550
Right?
103
00:05:13,810 --> 00:05:14,070
Chris: Right.
104
00:05:14,429 --> 00:05:14,759
Ned: Right.
105
00:05:14,980 --> 00:05:15,570
Chris: Correct.
106
00:05:15,639 --> 00:05:16,089
Ned: Okay.
107
00:05:16,510 --> 00:05:19,880
Chris: Because if you look up the definition for keynote,
108
00:05:20,910 --> 00:05:24,970
it’s a quote, “Talk that establishes a main underlying
109
00:05:25,000 --> 00:05:28,900
theme.” One would assume from this that there’s one.
110
00:05:29,950 --> 00:05:33,460
That has not been the case for way too long now.
111
00:05:34,740 --> 00:05:36,830
Now, before we even talk about the conference, I want to
112
00:05:36,840 --> 00:05:40,439
start a change.org petition to call it something else.
113
00:05:40,969 --> 00:05:41,309
Ned: Okay.
114
00:05:41,690 --> 00:05:44,890
Chris: How about we’ve got ‘sessions,’ of which there are an
115
00:05:44,890 --> 00:05:50,940
infinite number, “The keynote,” of which there is one, and the
116
00:05:50,940 --> 00:05:55,930
super-duper, fancy special speeches, of which there can be 36.
117
00:05:55,940 --> 00:05:56,500
Ned: Sure.
118
00:05:56,880 --> 00:05:59,510
Chris: Longer name, sounds way more important that way.
119
00:06:00,090 --> 00:06:00,680
Who says no?
120
00:06:01,960 --> 00:06:03,810
Ned: Can’t we just call them, like, ‘Premier Sessions?’
121
00:06:04,330 --> 00:06:05,380
Chris: Oh, that’s good, too.
122
00:06:05,500 --> 00:06:07,219
I think mine’s better, but that’s good, too.
123
00:06:07,650 --> 00:06:08,020
Ned: All right.
124
00:06:08,090 --> 00:06:08,849
Chris: We’ll workshop it.
125
00:06:09,520 --> 00:06:10,090
Ned: Of course.
126
00:06:10,260 --> 00:06:13,230
And, you know, if listeners have suggestions, write them in.
127
00:06:13,410 --> 00:06:13,889
Let us know.
128
00:06:14,280 --> 00:06:14,599
Chris: Yeah.
129
00:06:15,440 --> 00:06:15,740
Okay.
130
00:06:16,609 --> 00:06:18,609
So, on to more serious topics.
131
00:06:19,040 --> 00:06:19,290
Ned: Okay.
132
00:06:19,700 --> 00:06:22,590
Chris: Let’s start with the obvious question that a lot of people probably
133
00:06:22,600 --> 00:06:27,890
don’t know: what the hell does RSA stand for in the name RSA Conference?
134
00:06:28,730 --> 00:06:30,719
I’m not going to lie, I have probably looked this up ten
135
00:06:30,719 --> 00:06:33,450
times, found out about it, and then forgotten it ten times.
136
00:06:34,170 --> 00:06:35,839
So, I had to look it up again.
137
00:06:36,660 --> 00:06:39,010
And it’s not as exciting as you might think.
138
00:06:39,040 --> 00:06:43,140
It’s actually just made up of the last-name initials of the cofounders of RSA.
139
00:06:44,020 --> 00:06:47,939
That would be Ron Rivest, Adi Shamir, and Leonard Adleman.
140
00:06:48,259 --> 00:06:48,590
Get it?
141
00:06:48,860 --> 00:06:49,250
R.
142
00:06:49,590 --> 00:06:49,979
S.
143
00:06:50,309 --> 00:06:50,609
A.
144
00:06:51,110 --> 00:06:51,719
Ned: I get things.
145
00:06:51,719 --> 00:06:52,869
I’m smart [laugh].
146
00:06:53,670 --> 00:06:54,259
Chris: So, there you have it.
147
00:06:54,380 --> 00:06:55,520
One mystery solved.
148
00:06:55,530 --> 00:06:58,270
[singing] And we’ve only just begun.
149
00:06:59,030 --> 00:06:59,479
Ned: Oh.
150
00:06:59,770 --> 00:07:00,460
Chris: And we’re sued.
151
00:07:01,040 --> 00:07:01,830
Ned: Eh, always.
152
00:07:02,110 --> 00:07:02,610
Always.
153
00:07:02,970 --> 00:07:06,700
Folks may be familiar with RSA if they look at different encryption
154
00:07:06,780 --> 00:07:11,830
types because RSA is one of the encipherment types, I think.
155
00:07:12,090 --> 00:07:12,599
Chris: Encipherment?
156
00:07:13,440 --> 00:07:13,860
Ned: Yes.
157
00:07:13,860 --> 00:07:14,392
It’s a word I use.
158
00:07:14,392 --> 00:07:15,580
Chris: That is absolutely not a word.
159
00:07:15,940 --> 00:07:17,450
Ned: [laugh]. It is now [laugh].
160
00:07:17,450 --> 00:07:20,640
Chris: It sounds like something you get six months in jail for.
161
00:07:22,200 --> 00:07:23,620
Ned: [laugh]. Encipherment, right next to embezzling?
162
00:07:24,330 --> 00:07:24,359
Chris: [laugh]
163
00:07:24,919 --> 00:07:27,039
Ned: He enciphered the company for thousands of dollars.
164
00:07:27,690 --> 00:07:29,049
I think it is a cipher type.
165
00:07:29,410 --> 00:07:29,870
Chris: Yes.
166
00:07:30,129 --> 00:07:31,620
Ned: So, you may have seen it there.
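For the curious: RSA is indeed a public-key cipher, named for that same trio, and the core arithmetic fits in a few lines. Here is a toy, textbook-RSA sketch in Python with absurdly small primes, purely for illustration; real implementations use large keys and padding schemes such as OAEP.

```python
# Toy, textbook RSA -- illustration only, never use for real security.
p, q = 61, 53               # two (absurdly small) primes
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)     # Euler's totient of n
e = 17                      # public exponent, chosen coprime with phi
d = pow(e, -1, phi)         # private exponent: modular inverse of e (Python 3.8+)

m = 42                      # the "message", encoded as an integer < n
c = pow(m, e, n)            # encrypt: c = m^e mod n
assert pow(c, d, n) == m    # decrypt: m = c^d mod n
```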
167
00:07:32,510 --> 00:07:35,490
Chris: Anyway, the theme of the conference—and they actually did have
168
00:07:35,490 --> 00:07:41,609
a theme—was ‘The Art of the Possible,’ which sounds wishy-washy, and
169
00:07:41,620 --> 00:07:46,019
more like, you know, the title of an airport self-help book than something
170
00:07:46,020 --> 00:07:51,099
built around relevant topics in technology, but in the way
171
00:07:51,100 --> 00:07:55,210
of the old blessing slash curse, “May you live in interesting times,”
172
00:07:55,890 --> 00:07:58,650
you get to think a little deeper about ‘The Art of the Possible’ name.
173
00:07:59,210 --> 00:08:03,809
Kind of. The idea here is not only what we as
174
00:08:04,000 --> 00:08:07,370
technologists, as, you know, security professionals,
175
00:08:07,370 --> 00:08:11,500
can do, but also what it is possible for our adversaries to do.
176
00:08:11,970 --> 00:08:12,070
Ehhh?
177
00:08:15,160 --> 00:08:15,170
Ehhhh?
178
00:08:15,170 --> 00:08:15,640
Anyway.
179
00:08:17,090 --> 00:08:21,249
Ned: As a response to whether or not this is a self-help book, it is.
180
00:08:21,429 --> 00:08:23,320
It’s called The Art of the Possible: Create An
181
00:08:23,330 --> 00:08:26,150
Organization With No Limits, written by Daniel M.
182
00:08:26,150 --> 00:08:29,450
Jacobs in 2010, and it is, quote, “An integrated
183
00:08:29,480 --> 00:08:32,080
leadership and management guide to success.”
184
00:08:32,860 --> 00:08:34,959
Chris: Why do you insist on putting the audience to sleep?
185
00:08:35,400 --> 00:08:38,610
Ned: [laugh]. Well, I didn’t actually try to read it to you, though;
186
00:08:38,650 --> 00:08:42,309
that would absolutely put everyone to sleep, probably me included.
187
00:08:42,570 --> 00:08:43,970
Chris: That’s on the Patreon page.
188
00:08:45,360 --> 00:08:46,550
Ned: [laugh]. We should have one of those.
189
00:08:47,610 --> 00:08:52,329
Chris: So, with 36 sessions listed as a keynote, it was
190
00:08:52,330 --> 00:08:55,530
a little difficult to pin down where to start [laugh] and
191
00:08:55,549 --> 00:08:59,119
what the main mission of this year was supposed to be.
192
00:09:00,320 --> 00:09:04,620
But if you look at the schedule—and that is also available on
193
00:09:04,620 --> 00:09:11,329
their website—the first one that was not weird was put on by Hugh
194
00:09:11,330 --> 00:09:16,679
Thompson, who is a big-wig at RSA and did a number of keynotes.
195
00:09:16,680 --> 00:09:20,970
But this one, I think, got as close to a main topic or a main
196
00:09:21,340 --> 00:09:24,360
overarching-theme speech type of thing as we’re going to get.
197
00:09:25,060 --> 00:09:27,750
It was titled “The Power of Community,” and is
198
00:09:27,750 --> 00:09:30,060
available for streaming on their YouTube channel.
199
00:09:30,809 --> 00:09:31,910
It’s only 18 minutes.
200
00:09:31,980 --> 00:09:33,429
It’s worth the watch, actually.
201
00:09:33,429 --> 00:09:34,150
It’s very entertaining.
202
00:09:34,160 --> 00:09:34,760
The guy’s good.
203
00:09:36,240 --> 00:09:38,540
The main theme of the speech was basically
204
00:09:39,210 --> 00:09:42,610
that none of us is smarter than all of us.
205
00:09:43,460 --> 00:09:46,100
Now, on the one hand, you might think this is a cynical
206
00:09:46,100 --> 00:09:49,120
message that only serves to reiterate the importance of
207
00:09:49,380 --> 00:09:51,449
paying a lot of money to go to a security conference.
208
00:09:52,209 --> 00:09:52,579
Ned: Fair.
209
00:09:53,070 --> 00:09:54,710
Chris: But on the other hand, he’s kind of right.
210
00:09:56,010 --> 00:10:00,630
These conferences are, in fact, a sharing of knowledge enabling every security
211
00:10:00,630 --> 00:10:05,059
practitioner to learn just a little bit more about their own areas of expertise,
212
00:10:05,450 --> 00:10:10,870
and the ones they need to know about but have never had any experience with.
213
00:10:11,780 --> 00:10:16,849
And to that end, he talked about two things that are really good life
214
00:10:16,850 --> 00:10:20,270
lessons for conference-going that we’ve actually discussed before.
215
00:10:21,260 --> 00:10:24,689
The first one is to attend sessions outside of your wheelhouse
216
00:10:25,389 --> 00:10:29,340
because sometimes it might not help, but often it actually will.
217
00:10:29,730 --> 00:10:33,349
It’s fairly surprising to hear other subject-matter experts
218
00:10:33,350 --> 00:10:36,380
talk about what they’re passionate about, and then be able
219
00:10:36,380 --> 00:10:39,790
to kind of think about your own stuff in a different way.
220
00:10:40,440 --> 00:10:40,890
Ned: Mm-hm.
221
00:10:40,990 --> 00:10:43,920
Chris: External perspectives on other technologies
222
00:10:44,360 --> 00:10:46,640
open up little neural pathways in your brain.
223
00:10:47,430 --> 00:10:48,740
It does actually help.
224
00:10:49,370 --> 00:10:49,790
Ned: It does.
225
00:10:50,030 --> 00:10:50,460
I agree.
226
00:10:51,120 --> 00:10:54,079
Chris: And the second one is a stupid thing.
227
00:10:55,380 --> 00:10:58,169
He thinks that you should talk to people.
228
00:10:58,940 --> 00:10:59,919
Ned: Bah, no.
229
00:11:00,949 --> 00:11:00,979
Chris: What?
230
00:11:00,979 --> 00:11:01,489
Ned: Thumbs down.
231
00:11:01,879 --> 00:11:02,099
Chris: You fool.
232
00:11:04,319 --> 00:11:04,349
Ned: [laugh]
233
00:11:08,050 --> 00:11:12,190
Chris: No, he enjoined the audience to, quote, “Meet at least three
234
00:11:08,050 --> 00:11:12,190
new people.” Which, to me, sounds terrifying and counterproductive.
235
00:11:12,880 --> 00:11:16,080
As all security practitioners know, new people
236
00:11:16,110 --> 00:11:18,060
have cooties, and they’re all out to get me.
237
00:11:18,410 --> 00:11:18,700
Ned: Uh-huh.
238
00:11:18,710 --> 00:11:19,480
Chris: I mean us.
239
00:11:20,049 --> 00:11:20,429
Ned: Yes.
240
00:11:20,730 --> 00:11:24,160
Chris: But if you wanted to get cooties, there were something
241
00:11:24,160 --> 00:11:27,500
like 42,000 people wandering around the halls of the RSA
242
00:11:27,500 --> 00:11:29,700
Conference, drinking stale coffee to give it a shot with.
243
00:11:30,760 --> 00:11:33,350
Ned: Wow, I did not realize it was 42,000 people.
244
00:11:33,980 --> 00:11:36,390
That’s getting up there with, like, re:Invent numbers.
245
00:11:36,840 --> 00:11:38,909
Chris: It is creeping up there, and it is a bit of a surprise.
246
00:11:38,990 --> 00:11:43,189
It also explains perhaps why there were 36 keynotes, but probably not.
247
00:11:43,760 --> 00:11:44,640
Ned: [laugh]. Still, no.
248
00:11:45,140 --> 00:11:50,179
Chris: Now, all of that being said, let’s talk about some of the themes of the
249
00:11:50,179 --> 00:11:54,740
conference, based on the material that they actually shared in the sessions.
250
00:11:55,630 --> 00:12:00,600
So first, with a bullet, you’re not going to believe this.
251
00:12:01,090 --> 00:12:02,660
Ned, I hope you’re sitting down.
252
00:12:03,230 --> 00:12:05,150
Ned: Can I sit down lower?
253
00:12:05,300 --> 00:12:05,670
Would that help?
254
00:12:05,670 --> 00:12:08,110
Chris: Can you put a chair on a chair, and then sit on that chair?
255
00:12:08,340 --> 00:12:08,530
Ned: Okay.
256
00:12:08,530 --> 00:12:09,710
Chris: Because that’s the kind of sitting down
257
00:12:09,710 --> 00:12:11,539
you’re going to need because this is a shocker.
258
00:12:12,200 --> 00:12:13,960
AI was a big old topic.
259
00:12:15,350 --> 00:12:15,460
Ned: [big silly gasp]
260
00:12:15,809 --> 00:12:16,690
Chris: I warned you.
261
00:12:17,289 --> 00:12:17,720
Ned: Damn it.
262
00:12:17,750 --> 00:12:19,310
I should have been sitting on a milk crate.
263
00:12:19,960 --> 00:12:20,590
[I need stability].
264
00:12:21,240 --> 00:12:22,730
Chris: I mean, good God, man.
265
00:12:23,290 --> 00:12:27,100
Of the 36 keynotes, 13 of them were directly AI-related.
266
00:12:27,660 --> 00:12:30,840
Of the regular sessions, it was 82 out of 298,
267
00:12:30,880 --> 00:12:33,610
and that is just the ones that had AI in the title.
268
00:12:34,240 --> 00:12:35,530
Ned: Frankly, that seems low.
269
00:12:36,289 --> 00:12:41,400
Chris: [laugh]. So, what did they actually have to say about AI?
270
00:12:42,160 --> 00:12:45,320
A lot of what they had to say was not really that surprising.
271
00:12:46,260 --> 00:12:50,190
Main themes: AI is going to be more and more utilized by both attackers
272
00:12:50,429 --> 00:12:54,079
and defenders, and it’s an open question if there’s any way to
273
00:12:54,230 --> 00:12:58,860
guarantee that the defenders will maintain an upper hand going forward.
274
00:12:59,420 --> 00:13:00,740
Ned: Almost certainly not.
275
00:13:01,470 --> 00:13:02,270
Chris: Hold that thought.
276
00:13:02,610 --> 00:13:04,319
Because there is one piece of it that
277
00:13:04,320 --> 00:13:07,540
might be just the faintest glimmer of hope.
278
00:13:07,759 --> 00:13:08,469
Ned: Oh, okay.
279
00:13:08,750 --> 00:13:11,860
Chris: So, before we get to that, there are two things that attackers
280
00:13:11,860 --> 00:13:15,840
can do now that were just flat out harder for them before AI.
281
00:13:16,570 --> 00:13:23,060
The first one is, hackers attacking from all over the globe have a problem, and
282
00:13:23,060 --> 00:13:26,849
that problem is, generally speaking, they don’t speak every language on Earth.
283
00:13:27,330 --> 00:13:27,760
Ned: True.
284
00:13:28,039 --> 00:13:30,069
Chris: But they would like to attack everybody on Earth.
285
00:13:30,490 --> 00:13:35,700
So, what you used to end up with was hilariously poorly
286
00:13:35,700 --> 00:13:42,790
Google-translated emails that were obvious scams, or attacks, or
287
00:13:43,349 --> 00:13:48,389
just loaded with malware, and grammatically incorrect things, right?
288
00:13:49,130 --> 00:13:55,550
AI can help just rewrite communications into natural-sounding language
289
00:13:55,750 --> 00:13:58,319
for whatever they are trying—whomever they are trying to attack.
290
00:13:58,960 --> 00:14:02,390
You can do that with ChatGPT, for free, right now, indefinitely.
291
00:14:03,040 --> 00:14:07,939
Now admittedly, ChatGPT still has an extremely heavy English language
292
00:14:07,959 --> 00:14:12,370
bias, but they are working on including more and more sources from
293
00:14:12,390 --> 00:14:15,819
other languages, so it’s going to continue to get better and expand.
294
00:14:16,590 --> 00:14:22,339
Ned: There’s an interesting theory that I’ve heard proposed before that the
295
00:14:22,639 --> 00:14:28,850
poor wording, syntax, and English usage in those emails is somewhat intentional.
296
00:14:29,250 --> 00:14:32,430
Could they get someone to write it in a more natural-sounding way?
297
00:14:32,430 --> 00:14:37,190
Yes, but they’re using it as a filter to capture only those people
298
00:14:37,259 --> 00:14:40,229
who would read that email and think it’s perfectly fine, because
299
00:14:40,379 --> 00:14:44,600
in order for their scam to work, they need someone who will read
300
00:14:44,600 --> 00:14:47,550
that email, think that it’s perfectly fine, and decide that they
301
00:14:47,550 --> 00:14:51,120
should click on the link and transfer $1,000 to a bank account.
302
00:14:51,770 --> 00:14:52,880
Chris: Interesting theory.
303
00:14:53,690 --> 00:14:54,750
Ned: I don’t know if it’s a hundred percent
304
00:14:54,750 --> 00:14:57,039
accurate, but I have heard it proposed.
305
00:14:57,860 --> 00:14:58,880
Chris: Well, they didn’t talk about that.
306
00:14:59,320 --> 00:14:59,760
Ned: Well fine.
307
00:15:00,000 --> 00:15:00,970
Chris: So, stop helping.
308
00:15:01,340 --> 00:15:02,490
Ned: [laugh]. Okay.
309
00:15:03,290 --> 00:15:08,069
Chris: So, the second thing that AI can do to help attackers is research.
310
00:15:08,690 --> 00:15:12,629
Thanks to the careless, wide-open way that people use the
311
00:15:12,630 --> 00:15:15,989
internet, and the total lack of data sovereignty laws that matter
312
00:15:15,990 --> 00:15:20,240
at all, there are—and I looked this up—a quatro-bajillion
313
00:15:20,700 --> 00:15:23,290
data points out there about everyone and every company.
314
00:15:24,189 --> 00:15:24,449
Ned: [laugh]. Okay.
315
00:15:25,160 --> 00:15:27,079
Chris: Now, in terms of corporate research,
316
00:15:27,220 --> 00:15:30,459
previously, you would have to rely on paid services.
317
00:15:30,799 --> 00:15:34,379
The most famous one that I know of is ZoomInfo, and what they do is collate
318
00:15:34,379 --> 00:15:38,260
all this information from LinkedIn, from blog posts, from 10-Ks, from
319
00:15:38,260 --> 00:15:42,189
wherever they can find it, to figure out: how is the company structured?
320
00:15:42,480 --> 00:15:43,870
What are they up to lately?
321
00:15:43,970 --> 00:15:45,459
How many employees do they have?
322
00:15:45,460 --> 00:15:46,419
What are their markets?
323
00:15:46,450 --> 00:15:47,560
Blahdy, blahdy, blahdy, blah.
324
00:15:48,230 --> 00:15:52,800
This may not sound like much, but remember, the human element
325
00:15:52,820 --> 00:15:57,340
is and probably will always be the biggest security risk.
326
00:15:58,130 --> 00:16:01,730
Who works where, what they’re responsible
327
00:16:01,740 --> 00:16:05,880
for, and how a company is structured, both from a personnel
328
00:16:05,880 --> 00:16:08,760
perspective and potentially from a security perspective—IT
329
00:16:09,840 --> 00:16:13,209
infrastructure, et cetera—often makes its way into press releases.
330
00:16:14,190 --> 00:16:19,620
This helps fine-tune long-running attacks, right?
331
00:16:19,620 --> 00:16:22,420
This is not the fly-by-night scams we were talking about before.
332
00:16:22,420 --> 00:16:25,280
This is APTs and state-level actors.
333
00:16:25,950 --> 00:16:29,229
But AI makes it so much easier to do this research.
334
00:16:29,229 --> 00:16:33,970
Now, in terms of people being the worst—just a little side jaunt
335
00:16:33,990 --> 00:16:40,470
here, a little venture—the 2024 Verizon Data Breach Investigations
336
00:16:40,470 --> 00:16:45,400
Report, or DBIR, came out around the same time as the RSA Conference.
337
00:16:46,080 --> 00:16:50,650
And one of the main numbers to pay attention to from this report was 68%.
338
00:16:51,870 --> 00:16:55,690
68% of breaches involve a non-malicious human element like
339
00:16:55,690 --> 00:16:58,800
a person falling victim to a social engineering attack,
340
00:16:59,230 --> 00:17:02,590
making an error in configuration, et cetera, et cetera.
341
00:17:03,360 --> 00:17:05,949
Meaning people make mistakes.
342
00:17:06,969 --> 00:17:10,050
Now, if you want more info about the report, there is a link to it in the
343
00:17:10,050 --> 00:17:14,310
[show notes]. The report is free, it is informative, and it is depressing.
344
00:17:14,920 --> 00:17:16,999
Ned: Well, you and I have both read at least
345
00:17:17,000 --> 00:17:19,670
one Kevin Mitnick book about social engineering.
346
00:17:19,799 --> 00:17:23,800
And the general way that he got into anything was to figure out
347
00:17:23,930 --> 00:17:26,939
the organizational structure, and then find a weak point in it.
348
00:17:27,589 --> 00:17:31,940
And often that weak point is going to be the personal assistant for any of
349
00:17:31,940 --> 00:17:38,449
the muckety-mucks at a company because they have a vast amount of access—much,
350
00:17:38,469 --> 00:17:43,580
much more than they’re getting paid to have—and they know where all the
351
00:17:43,580 --> 00:17:47,230
bodies are buried and where all the secrets are, and if you can compromise them,
352
00:17:47,310 --> 00:17:50,330
then it’s relatively straightforward to compromise the rest of the company.
353
00:17:51,290 --> 00:17:51,510
Chris: Right.
354
00:17:52,290 --> 00:17:54,179
Also people that are new to the company.
355
00:17:54,809 --> 00:17:56,240
Ned: That’s a good one, too, because they don’t
356
00:17:56,259 --> 00:17:58,441
know the proper procedures, and they’re report—
357
00:17:58,441 --> 00:17:59,780
Chris: And they don’t know everybody.
358
00:18:00,129 --> 00:18:00,509
Ned: Right.
359
00:18:00,810 --> 00:18:02,370
Chris: And they’re probably very stressed out
360
00:18:02,370 --> 00:18:04,820
trying to do a good job and to keep everybody happy.
361
00:18:05,280 --> 00:18:05,629
Ned: Right.
362
00:18:05,640 --> 00:18:10,660
So, you get a request from Angela over in HR to send over some
363
00:18:10,780 --> 00:18:14,370
very important information that they need to complete your benefits
364
00:18:14,370 --> 00:18:17,029
package, and you’d be like, “Sure, I’ll get right on that, Angela.”
365
00:18:17,350 --> 00:18:18,360
Chris: I like benefits.
366
00:18:18,600 --> 00:18:19,679
Ned: I—who doesn’t?
367
00:18:19,930 --> 00:18:22,740
You know, “Participate in our employee bonus
368
00:18:22,740 --> 00:18:25,020
program.” Like, yeah, I’m responding to that.
369
00:18:25,020 --> 00:18:27,115
Except Angela doesn’t work for the company [laugh].
370
00:18:27,320 --> 00:18:27,610
Chris: Right.
371
00:18:27,610 --> 00:18:29,280
Ned: And now they have all of your personal
372
00:18:29,280 --> 00:18:31,760
information and your password for some reason.
373
00:18:32,610 --> 00:18:36,379
Chris: So, let’s pivot, and we’ll talk about what they had to say about
374
00:18:36,389 --> 00:18:40,830
things from the defenders’ point of view because it’s not all a nightmare.
375
00:18:41,520 --> 00:18:41,780
Ned: Okay.
376
00:18:42,390 --> 00:18:46,289
Chris: One thing that could help tilt the balance in favor of defenders
377
00:18:46,960 --> 00:18:50,020
ties back to the whole, “We’re all in it together,” point from earlier.
378
00:18:50,799 --> 00:18:54,889
Defenders have a hell of a lot more data than attackers do.
379
00:18:55,830 --> 00:19:01,860
Bruce Schneier, IT security technology guy extraordinaire, stated it plainly.
380
00:19:02,150 --> 00:19:03,440
“We have more data than they do.
381
00:19:04,130 --> 00:19:09,990
LLMs and AI, they be trained on data, therefore, our data stuff
382
00:19:10,100 --> 00:19:15,870
is going to make our AI stuff better than their AI stuff.” Now,
383
00:19:15,870 --> 00:19:18,060
he said it a little better than that, but I’m paraphrasing.
384
00:19:18,800 --> 00:19:19,900
Actually no, I take that back.
385
00:19:19,900 --> 00:19:20,679
It was an exact quote.
386
00:19:21,170 --> 00:19:21,749
Ned: All right, fair.
387
00:19:22,230 --> 00:19:25,720
Chris: So, if you step back and think about it, it does make sense.
388
00:19:26,670 --> 00:19:31,570
Pick any random EDR company that’s large, like, say, SentinelOne—whomever;
389
00:19:31,590 --> 00:19:36,530
doesn’t matter—now, these companies, they don’t release precise specifics
390
00:19:36,550 --> 00:19:39,850
on their deployments in the field, but SentinelOne proudly states that
391
00:19:39,850 --> 00:19:43,329
they have quote, “Tens of millions,” of protected endpoints out there,
392
00:19:43,880 --> 00:19:49,209
all of which report back about attacks. SentinelOne vulnerability
393
00:19:49,210 --> 00:19:53,410
researchers can then take that data and turn it into protections.
394
00:19:53,950 --> 00:19:57,580
Thus, you have areas of security where you didn’t have
395
00:19:57,580 --> 00:20:00,139
anything before because you can see what people are doing.
396
00:20:01,399 --> 00:20:05,100
And there are even areas of security where researchers from
397
00:20:05,100 --> 00:20:07,920
multiple different companies will pool their resources to
398
00:20:07,930 --> 00:20:12,139
help increase the overall security posture of everybody.
399
00:20:12,910 --> 00:20:13,580
So, that’s good.
400
00:20:14,410 --> 00:20:17,340
Attackers, on the other hand, don’t do that.
401
00:20:18,350 --> 00:20:19,000
Ned: [laugh]. Yeah.
402
00:20:19,309 --> 00:20:21,230
Chris: They all work, more or less, alone.
403
00:20:22,170 --> 00:20:25,730
And I don’t care how popular a given piece of malware is.
404
00:20:26,050 --> 00:20:29,470
It’s, one, probably not phoning home like that because that would
405
00:20:29,480 --> 00:20:33,530
be immediately discovered by network trackers, and two, they’re not
406
00:20:33,530 --> 00:20:37,140
hitting tens of millions of anything, except maybe dollars they get
407
00:20:37,140 --> 00:20:39,669
from ransomware payments that are never disclosed to the public.
408
00:20:40,219 --> 00:20:40,400
Ned: Wa-wah.
409
00:20:40,690 --> 00:20:46,290
Chris: Quote, “We have a very rich data source which, uncommonly, is
410
00:20:46,290 --> 00:20:50,370
an asymmetry in favor of the defender that we don’t see very often,”
411
00:20:50,460 --> 00:20:56,630
unquote, stated Daniel Rohrer, VP of Software Product Security at Nvidia.
412
00:20:57,610 --> 00:20:58,890
So, what does all this add up to?
413
00:20:59,180 --> 00:21:04,670
From the AI perspective, what it adds up to is fine-tuned LLMs.
414
00:21:05,520 --> 00:21:09,899
Not general stuff like ChatGPT, but we’re talking about very focused
415
00:21:09,900 --> 00:21:15,009
things based on security, trained on that huge amount of data to
416
00:21:15,009 --> 00:21:19,860
help with things like identifying adversarial behavior, zero-day
417
00:21:19,860 --> 00:21:24,310
protections, and eventually, even things like auto-remediation.
418
00:21:25,030 --> 00:21:28,820
You can have a listener on each endpoint that tracks for all this stuff;
419
00:21:28,849 --> 00:21:33,220
if it sees something nightmarish, it could just turn off the network.
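As a rough sketch of what that endpoint listener could look like, here is a minimal Python loop; the telemetry feed, anomaly scoring, and 0.9 threshold are all invented for illustration, not any vendor’s actual agent.

```python
# Hypothetical endpoint listener -- names and threshold are invented for illustration.
import subprocess
import time

THRESHOLD = 0.9  # assumed anomaly score above which the host gets isolated

def score_events(events):
    # Stand-in for a security-tuned model; here, just the worst anomaly score seen.
    return max((event.get("anomaly", 0.0) for event in events), default=0.0)

def isolate_host():
    # Crude Linux example of "turn off the network": bring the interface down.
    subprocess.run(["ip", "link", "set", "eth0", "down"], check=False)

def listen(poll_events, interval=5):
    # Poll telemetry forever; auto-remediate the moment something looks nightmarish.
    while True:
        if score_events(poll_events()) > THRESHOLD:
            isolate_host()
            return
        time.sleep(interval)
```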
420
00:21:34,219 --> 00:21:36,800
Now, it’s going to take a long time for people to be comfortable
421
00:21:36,800 --> 00:21:41,000
with auto-remediation based on AI, but if we look far enough
422
00:21:41,000 --> 00:21:43,810
into the future, and I’m thinking probably less than five
423
00:21:43,810 --> 00:21:46,180
years, my guess is that this is going to become the norm.
424
00:21:46,870 --> 00:21:53,499
Ned: We’ve already seen a flood of security vendors adding AI into all
425
00:21:53,500 --> 00:21:57,930
their marketing materials, some of which is, you know, just marketing,
426
00:21:57,990 --> 00:22:02,989
marketecture—vaporware, if you will—but a lot of it is, “we’re bolting on a
427
00:22:02,990 --> 00:22:08,050
new feature to our existing offering that makes use of an LLM that we
428
00:22:08,080 --> 00:22:13,360
trained on the giant pool of data we’ve collected from all of our clients.”
429
00:22:13,910 --> 00:22:14,150
Chris: Right.
430
00:22:14,310 --> 00:22:18,730
Ned: So, I think, from a security perspective, that could happen pretty rapidly.
431
00:22:19,350 --> 00:22:22,449
What’s a little more difficult is putting in the detection portion that
432
00:22:22,450 --> 00:22:25,379
you’re talking about because that needs to be closer to the endpoint.
433
00:22:25,870 --> 00:22:29,669
So, we’re going to have to wait for, probably, some specialized
433
00:22:29,680 --> 00:22:36,879
hardware—the sort of TPUs, GPUs, XPUs—to be available on these endpoint devices
435
00:22:36,890 --> 00:22:40,470
to do that real-time scanning if you’re looking for that level of protection.
436
00:22:41,020 --> 00:22:43,450
Chris: TPU, of course, as everyone knows, is the Toilet Paper Unit.
437
00:22:44,030 --> 00:22:44,829
Ned: Yes [sigh].
438
00:22:45,490 --> 00:22:46,549
Chris: I got jokes, man.
439
00:22:47,139 --> 00:22:48,209
I got jokes.
440
00:22:48,629 --> 00:22:49,470
Ned: It’s fantastic.
441
00:22:49,720 --> 00:22:49,970
No notes.
442
00:22:49,970 --> 00:22:54,990
Chris: [laugh]. Oh, Bark-a-tron 3000 looks so disappointed in me.
443
00:22:56,650 --> 00:22:59,169
Ned: [laugh]. He’s kind of licking his own ass, but whatever.
444
00:23:00,509 --> 00:23:00,859
Chris: Okay.
445
00:23:01,340 --> 00:23:01,940
Moving on.
446
00:23:02,440 --> 00:23:04,300
Something else that was a big topic.
447
00:23:04,730 --> 00:23:08,430
Everybody’s favorite four-letter word: regulations.
448
00:23:09,140 --> 00:23:10,370
These are a— [mouth noises] —
449
00:23:10,370 --> 00:23:10,857
Ned: [laugh]
450
00:23:10,857 --> 00:23:12,830
Chris: That is not important.
451
00:23:12,950 --> 00:23:14,929
Just… doing the thing over here.
452
00:23:15,270 --> 00:23:15,680
Ned: All right.
453
00:23:16,300 --> 00:23:18,610
Chris: So, it was a big concern at the conference.
454
00:23:19,090 --> 00:23:20,910
Honestly, it’s a big concern all over the
455
00:23:20,910 --> 00:23:24,310
world, and it does relate to AI.
456
00:23:24,770 --> 00:23:28,220
But first, one thing that came out at the conference
457
00:23:28,220 --> 00:23:31,240
was the quote, “Secure by Design,” pledge.
458
00:23:32,200 --> 00:23:37,420
Now, this was created by CISA, and is intended to ensure that quote, “Products
459
00:23:37,530 --> 00:23:42,039
should be secure to use out of the box, with secure configurations enabled
460
00:23:42,049 --> 00:23:46,310
by default, and security features such as multifactor authentication,
461
00:23:46,570 --> 00:23:53,449
logging, and single sign on available at no additional cost.” So, this
462
00:23:53,450 --> 00:23:57,320
pledge is super great, and I hope that it gets a ton of attention.
463
00:23:57,929 --> 00:24:04,060
It is disappointing and more than a little galling that it has to exist,
464
00:24:04,830 --> 00:24:10,570
but as the SSO Wall of Shame website clearly shows, plenty of vendors have
465
00:24:10,719 --> 00:24:15,340
no qualms whatsoever charging extra to enable features like single sign-on.
466
00:24:15,900 --> 00:24:16,130
Ned: Yeah.
467
00:24:16,500 --> 00:24:20,790
Chris: Now, at the conference, 68 companies, including big-wigs
468
00:24:20,790 --> 00:24:26,610
like AWS, Google, Cisco, Microsoft, and IBM, signed the pledge.
469
00:24:27,420 --> 00:24:28,370
Kind of a big deal.
470
00:24:28,950 --> 00:24:30,450
Except, of course, that it’s not binding.
471
00:24:30,910 --> 00:24:32,249
It’s not a law.
472
00:24:33,469 --> 00:24:33,579
Ned: [laugh]. That’s true.
473
00:24:33,579 --> 00:24:38,689
Chris: But it does at least commit these companies from a PR perspective to work
474
00:24:38,720 --> 00:24:42,790
towards increased product security goals over the course of the next year.
475
00:24:43,660 --> 00:24:44,950
So, why was this important?
476
00:24:45,580 --> 00:24:50,610
In my opinion, it’s one of those ‘which way is the wind blowing’ kind of moves.
477
00:24:51,330 --> 00:24:55,130
To wit, it is likely that these companies are trying to self-regulate to
478
00:24:55,140 --> 00:24:59,929
get ahead of the world’s various governing bodies regulating for them.
479
00:25:00,770 --> 00:25:03,989
To that end, other regulations are already happening.
480
00:25:04,510 --> 00:25:05,009
Ned: Mm-hm.
481
00:25:05,240 --> 00:25:06,550
Chris: There was a lot of discussion about these.
482
00:25:06,559 --> 00:25:11,180
In Europe, there are NIS2 and DORA, which focus on cybersecurity
483
00:25:11,190 --> 00:25:15,040
mandates and operational resilience requirements, respectively.
484
00:25:15,450 --> 00:25:16,630
Ned: And also finding the map.
485
00:25:17,280 --> 00:25:17,440
Chris: Ugh.
486
00:25:17,440 --> 00:25:18,980
Ned: I’m sorry.
487
00:25:19,850 --> 00:25:20,310
So, sorry.
488
00:25:21,179 --> 00:25:24,250
[laugh]. I can’t tell if you’re frozen or just mad at me [laugh].
489
00:25:25,509 --> 00:25:28,609
Chris: [laugh]. For whatever reason, America continues to
490
00:25:28,609 --> 00:25:31,909
trail behind Europe on things like this, but it does stand
491
00:25:31,910 --> 00:25:34,610
to reason that we will have them sooner rather than later.
492
00:25:35,210 --> 00:25:39,129
Now, we already know about the cybersecurity mandate and the AI
493
00:25:39,130 --> 00:25:43,959
mandate, all basically just things that came down as executive orders.
494
00:25:44,429 --> 00:25:47,239
My assumption is that we will get a law at some point.
495
00:25:47,619 --> 00:25:48,580
Ned: It seems likely, yeah.
496
00:25:48,859 --> 00:25:54,630
Chris: Along the AI footpath to this issue, AI regulation and governance were
497
00:25:54,640 --> 00:26:03,360
big talking points, not just regulating and governing AI, but using AI to
498
00:26:03,360 --> 00:26:08,060
make sure that your company stays compliant with regulations and governance.
499
00:26:09,090 --> 00:26:14,350
So, if you’ve ever looked into any of these regulations, you will know
500
00:26:14,360 --> 00:26:21,530
that there are, I think, six times eleventy-bajillion attestations and data
501
00:26:21,530 --> 00:26:26,980
points that have to be collected in order to prove evidence of compliance.
502
00:26:27,830 --> 00:26:31,230
There’s actually a huge subcomponent of the consulting world that
503
00:26:31,230 --> 00:26:34,790
is based exclusively around helping companies prove this compliance.
504
00:26:35,759 --> 00:26:38,250
A subcomponent of the consulting world that might be a little
505
00:26:38,250 --> 00:26:41,230
worried that AI tools are going to come around and do that.
506
00:26:42,190 --> 00:26:42,710
Ned: Yeah.
507
00:26:43,270 --> 00:26:48,260
And they’re almost necessary at this point because of all the security
508
00:26:48,260 --> 00:26:54,150
frameworks that we’ve baked into various deployments of compute and whatnot.
509
00:26:54,849 --> 00:26:58,409
They all have a virtual TPM now, they all are doing attestation
510
00:26:58,410 --> 00:27:01,820
of boot—not all, but a lot of them are doing that—and something
511
00:27:01,820 --> 00:27:04,620
needs to verify all of that and keep up to date with that.
512
00:27:04,630 --> 00:27:08,970
We’re generating so many more data points, as you said; there’s no
513
00:27:08,970 --> 00:27:12,870
way a human, even a consultant that’s getting paid thousands and
514
00:27:12,870 --> 00:27:16,840
thousands of dollars an hour, can comb through all of that and prove it.
515
00:27:16,910 --> 00:27:21,800
We need something to, sort of, just gather it all up and look
516
00:27:21,800 --> 00:27:24,850
through it and then give us back a thumbs up or a thumbs down.
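That “gather it all up and give back a thumbs up or a thumbs down” step is, at heart, an aggregation problem. A minimal sketch in Python, assuming a hypothetical attestation record shape; the control names and fields are invented for illustration, not any real framework’s schema.

```python
# Hypothetical compliance roll-up -- record shape and control names are invented.
from dataclasses import dataclass

@dataclass
class Attestation:
    host: str
    control: str   # e.g., "virtual_tpm_present", "measured_boot"
    passed: bool

def evaluate(attestations, required_controls):
    # Thumbs up only if every host attests successfully to every required control.
    by_host = {}
    for a in attestations:
        by_host.setdefault(a.host, {})[a.control] = a.passed
    failures = [(host, control)
                for host, seen in by_host.items()
                for control in required_controls
                if not seen.get(control, False)]
    return len(failures) == 0, failures
```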
517
00:27:25,550 --> 00:27:25,830
Chris: Right.
518
00:27:26,860 --> 00:27:31,279
There were tons of other AI notes in dribs and drabs.
519
00:27:31,790 --> 00:27:35,880
Things like utilizing AI for data protection, utilizing AI for
520
00:27:35,889 --> 00:27:40,140
micro-segmentation, utilizing AI for log management and observability,
521
00:27:40,200 --> 00:27:45,680
using AI to make you a sandwich, et cetera, et cetera, [big gasp] et cetera.
522
00:27:47,040 --> 00:27:49,440
There’s a lot of AI, I guess is what I’m saying.
523
00:27:49,940 --> 00:27:50,559
Ned: It’s what I’m hearing.
524
00:27:51,139 --> 00:27:53,630
Chris: One thing that was notable for its
525
00:27:53,670 --> 00:27:57,550
absence was the concept of zero trust.
526
00:27:57,960 --> 00:27:58,370
Ned: Oh.
527
00:27:58,490 --> 00:28:01,399
Chris: In previous years, zero trust was
528
00:28:01,410 --> 00:28:04,850
the belle of the ball at these conferences.
529
00:28:05,500 --> 00:28:11,149
This year, it showed up in a mere two sessions, and zero keynotes.
530
00:28:11,899 --> 00:28:14,970
Ned: That is a significant drop-off from, like, two years ago.
531
00:28:15,490 --> 00:28:16,870
Chris: Even last year, yeah.
532
00:28:16,940 --> 00:28:17,610
It’s crazy.
533
00:28:18,210 --> 00:28:23,160
So, the absence is probably down to a few things.
534
00:28:24,320 --> 00:28:29,179
First of all, companies that have embraced ZT are already
535
00:28:29,190 --> 00:28:32,709
doing it, and there’s really not anything to chat about.
536
00:28:33,830 --> 00:28:37,200
It’s an understood topic, philosophically and designalophically.
537
00:28:39,300 --> 00:28:40,490
Ned: Who’s making up words now?
538
00:28:41,320 --> 00:28:45,049
Chris: [laugh]. I mean, I think that’s really the problem: it’s gone
539
00:28:45,059 --> 00:28:52,370
from marketing buzzword-y razzle-dazzle to just a standard operating procedure.
540
00:28:52,870 --> 00:28:53,370
Ned: Yeah.
541
00:28:53,860 --> 00:28:57,129
That jibes with what I’ve been hearing: the companies
542
00:28:57,130 --> 00:28:59,720
that are actually doing it are now doing it, and the ones
543
00:28:59,720 --> 00:29:02,939
where it was all just fluff, they are no longer in business—
544
00:29:03,099 --> 00:29:03,379
Chris: Right.
545
00:29:03,660 --> 00:29:05,099
Ned: Or got bought by other companies.
546
00:29:05,949 --> 00:29:09,460
Chris: I mean, and the other thing is, it’s not the new hotness anymore.
547
00:29:10,130 --> 00:29:13,890
So, it’s really hard for its Q Score to stay high.
548
00:29:14,650 --> 00:29:18,070
This is the downside of deep engineering concepts like zero trust.
549
00:29:18,799 --> 00:29:22,479
Once you get past that first wave of publicity,
550
00:29:22,830 --> 00:29:24,990
it’s just really hard to keep it top-of-mind.
551
00:29:25,540 --> 00:29:28,910
And since it is such a challenge to, you know, sit down and explain
552
00:29:28,910 --> 00:29:32,500
to somebody, things like zero trust just don’t get nearly as many
553
00:29:32,510 --> 00:29:36,450
influencers keeping up with it as whatever the latest new hotness is.
554
00:29:37,480 --> 00:29:38,770
Ned: But we’re better than that.
555
00:29:38,809 --> 00:29:40,660
Check out our previous episode on Zero Trust DNS.
556
00:29:42,170 --> 00:29:42,330
Ehh?
557
00:29:42,330 --> 00:29:42,360
Chris: [laugh]
558
00:29:43,890 --> 00:29:44,629
Ned: I worked it in there.
559
00:29:44,629 --> 00:29:45,200
That was good.
560
00:29:45,760 --> 00:29:46,629
I’m good at this [laugh].
561
00:29:47,110 --> 00:29:49,139
Chris: So, those were all the major things,
562
00:29:49,140 --> 00:29:51,260
and I could go on a little bit, but I won’t.
563
00:29:51,679 --> 00:29:55,849
One thing that was talked about that was interesting and was not necessarily
564
00:29:55,870 --> 00:30:01,780
technical in nature is the pernicious and not-talked-about issue of burnout
565
00:30:01,800 --> 00:30:06,580
in IT, particularly in security, and it was boiled down to a couple of
566
00:30:06,600 --> 00:30:10,679
things: number one, there’s not enough people doing it; number two, the
567
00:30:10,679 --> 00:30:16,160
ones that are doing it are overworked and underpaid; number three, the
568
00:30:16,160 --> 00:30:20,959
budgets are just not there for what is required to keep modern IT security
569
00:30:21,260 --> 00:30:25,810
up to date in almost any company; and number four, whenever anything goes
570
00:30:25,820 --> 00:30:32,060
bad, it gets dumped on IT security’s shoulders with basically no support.
571
00:30:32,630 --> 00:30:34,690
Ned: Yeah, that all stacks up for me.
572
00:30:34,690 --> 00:30:40,490
I mean, burnout is a well-known issue in IT, writ large.
573
00:30:41,360 --> 00:30:43,209
Security might even be worse.
574
00:30:43,950 --> 00:30:49,069
And we need more people to get into the field, which means
575
00:30:49,070 --> 00:30:52,809
the field needs to be more inviting, which is another issue
576
00:30:52,809 --> 00:30:57,560
with InfoSec as a whole: it can be rather hostile to new
577
00:30:57,560 --> 00:31:01,639
people, especially ones that don’t fit a particular stereotype.
578
00:31:02,030 --> 00:31:03,439
Chris: That’s because new people have cooties.
579
00:31:03,460 --> 00:31:04,409
We talked about this.
580
00:31:04,620 --> 00:31:04,939
Ned: Yeah.
581
00:31:04,969 --> 00:31:05,640
And they smell.
582
00:31:06,300 --> 00:31:07,799
They smell good, which we don’t like.
583
00:31:08,290 --> 00:31:10,279
No, no good smelling people in InfoSec.
584
00:31:10,420 --> 00:31:11,669
Chris: That’s how you know it’s a trap.
585
00:31:11,719 --> 00:31:14,370
Ned: [laugh]. You’re probably right.
586
00:31:14,970 --> 00:31:15,530
Oh, dear.
587
00:31:15,990 --> 00:31:19,900
If anybody is interested in learning more about the
588
00:31:19,910 --> 00:31:23,280
Secure by Design proposal, we did a whole episode on that.
589
00:31:23,360 --> 00:31:24,389
Do you remember that, Chris?
590
00:31:24,469 --> 00:31:26,670
Chris: Nope, sure don’t [sigh].
591
00:31:26,670 --> 00:31:30,469
Ned: In March, we did a whole episode on Secure by Design and an analysis
592
00:31:30,480 --> 00:31:35,679
of the proposal by CISA, and also some reactions to that proposal.
593
00:31:35,810 --> 00:31:38,879
So, if you’re interested, definitely worth checking out that episode.
594
00:31:38,889 --> 00:31:41,190
But hey, thanks for listening or something.
595
00:31:41,190 --> 00:31:43,730
I guess you found it worthwhile enough if you made it all
596
00:31:43,740 --> 00:31:46,110
the way to the end, so congratulations to you, friend.
597
00:31:46,459 --> 00:31:47,829
You accomplished something today.
598
00:31:47,850 --> 00:31:49,770
Now, you can sit on the couch, fire up
599
00:31:49,790 --> 00:31:52,520
your AI prompt and ask it about zero trust.
600
00:31:52,570 --> 00:31:53,740
You’ve earned it.
601
00:31:54,310 --> 00:31:57,910
You can find more about this show by visiting our LinkedIn page, just
602
00:31:57,910 --> 00:32:03,220
search ‘Chaos Lever,’ or go to our newish website, pod.chaoslever.com.
603
00:32:03,799 --> 00:32:06,400
You’ll find show notes, blog posts, and general tomfoolery.
604
00:32:06,620 --> 00:32:10,770
If we got something wrong, or we did something you hated, you can leave
605
00:32:10,770 --> 00:32:16,149
us a comment or leave us a voicemail, which we will proudly read, or
606
00:32:16,150 --> 00:32:20,179
maybe even play during our tech news segments [laugh], so enjoy that.
607
00:32:20,400 --> 00:32:23,249
If you want to be heard on tech news, leave us a voicemail.
608
00:32:23,430 --> 00:32:23,970
Go for it.
609
00:32:24,549 --> 00:32:27,070
We’ll be back next week to see what fresh hell is upon us.
610
00:32:27,390 --> 00:32:28,120
Ta-ta for now.
611
00:32:36,389 --> 00:32:39,650
Chris: We’re also not above doing dramatic recreations—
612
00:32:39,860 --> 00:32:40,240
Ned: Oh, yeah.
613
00:32:40,590 --> 00:32:42,920
Chris: In the style of Unsolved Mysteries, if you prefer.
614
00:32:43,510 --> 00:32:45,300
Ned: [laugh]. If you don’t want us to play
615
00:32:45,300 --> 00:32:47,720
the audio, we will do a dramatic recreation.
616
00:32:47,720 --> 00:32:49,570
We may even bring in a voice actor to do it.
617
00:32:50,110 --> 00:32:51,810
Chris: We’ll make Ned do an Irish accent.
618
00:32:52,380 --> 00:32:53,580
Ned: That goes well [laugh] for everyone.
619
00:32:53,990 --> 00:32:54,060
Chris: [laugh]