Transcript
1
00:00:00,480 --> 00:00:03,660
Ned: It is slightly more important than some half-marathon
2
00:00:03,700 --> 00:00:06,850
thing, where I’ll just overheat and then drink beer.
3
00:00:07,140 --> 00:00:10,610
Chris: I mean, I suppose… yeah, you could do that at the child’s birthday party.
4
00:00:11,179 --> 00:00:14,000
Ned: Oh… it is at a place called Urban Air.
5
00:00:14,219 --> 00:00:16,379
I guess they would have beer and maybe treadmills.
6
00:00:16,520 --> 00:00:16,959
I don’t know.
7
00:00:17,250 --> 00:00:19,770
Chris: Or it’s like a hipster place that makes mead.
8
00:00:20,110 --> 00:00:21,380
Ned: No, it’s not.
9
00:00:21,460 --> 00:00:22,710
There’s no axe throwing.
10
00:00:22,730 --> 00:00:25,870
It’s a kid’s trampoline park kind of thing.
11
00:00:25,870 --> 00:00:26,200
Chris: Whoa.
12
00:00:26,250 --> 00:00:27,340
Whoa, whoa, whoa, whoa.
13
00:00:27,500 --> 00:00:30,340
Kids on trampolines, throwing axes.
14
00:00:30,530 --> 00:00:33,110
Ned: [laugh] . That sounds great, doesn’t it?
15
00:00:33,170 --> 00:00:35,169
Chris: Tell me that is not a happy birthday.
16
00:00:35,199 --> 00:00:37,249
Ned: Let me tell you, the waiver I had to
17
00:00:37,250 --> 00:00:40,370
sign for this sucker went on for 36 pages.
18
00:00:45,580 --> 00:00:45,610
Chris: [laugh]
19
00:00:49,700 --> 00:00:52,370
Ned: Hello, alleged human, and welcome to the Chaos Lever podcast.
20
00:00:52,570 --> 00:00:54,970
My name is Ned, and I’m definitely not a robot.
21
00:00:55,150 --> 00:00:59,370
I’m a real human person who has feelings, dreams, the
22
00:00:59,370 --> 00:01:02,660
need to sleep every once in a while, possibly while some
23
00:01:02,660 --> 00:01:06,060
sort of programming tutorial is running in the background.
24
00:01:06,270 --> 00:01:09,629
Really, isn’t that the best possible white noise?
25
00:01:10,109 --> 00:01:13,630
With me is Chris, who is also white noise.
26
00:01:14,020 --> 00:01:16,200
And here [laugh] . What’s up, Chris.
27
00:01:16,650 --> 00:01:17,350
Chris: It’s amazing.
28
00:01:17,360 --> 00:01:19,809
I am both white noise and the black hole.
29
00:01:20,099 --> 00:01:24,440
Ned: [laugh] . Oh, you’re very Michael Jackson in that regard.
30
00:01:24,789 --> 00:01:25,949
Chris: Moving on as quickly as possible.
31
00:01:26,509 --> 00:01:28,719
Ned: [laugh] . Yes, I immediately regret it.
32
00:01:30,680 --> 00:01:32,580
That did not work the way I wanted it to.
33
00:01:33,540 --> 00:01:35,260
How about a nun falling down the stairs?
34
00:01:35,950 --> 00:01:37,290
What’s black and white and red all over?
35
00:01:37,290 --> 00:01:38,125
Chris: Buh-dum-shh.
36
00:01:38,960 --> 00:01:40,479
What is in that water glass?
37
00:01:41,429 --> 00:01:42,320
Ned: [laugh] . It’s not water, man.
38
00:01:43,630 --> 00:01:46,919
Okay, now that I’ve quenched my thirst for water,
39
00:01:47,469 --> 00:01:50,540
let’s quench my thirst for knowledge about security.
40
00:01:50,920 --> 00:01:54,870
Chris: Not just about security; about security conferences,
41
00:01:55,370 --> 00:01:57,170
which is apparently the only thing I talk about anymore.
42
00:01:57,170 --> 00:02:00,660
Ned: [laugh] . Well, this one’s a little bit different because—
43
00:02:00,670 --> 00:02:01,210
Chris: It is.
44
00:02:01,289 --> 00:02:04,720
Ned: You took a different approach this time, one that I have to say
45
00:02:04,760 --> 00:02:08,350
I’m a little proud of you for doing, like, actually leaving your house.
46
00:02:08,889 --> 00:02:11,910
Chris: Dude, the complaining that was involved in that effort—
47
00:02:12,210 --> 00:02:13,220
Ned: Entirely by you.
48
00:02:13,470 --> 00:02:15,120
Chris: Weeks, upon weeks.
49
00:02:16,040 --> 00:02:20,950
I went to the Gartner Security and Risk Summit, so you didn’t have to.
50
00:02:21,400 --> 00:02:22,150
Ned: Unless you did.
51
00:02:22,469 --> 00:02:23,219
Because you might have.
52
00:02:23,579 --> 00:02:24,310
Chris: It’s possible.
53
00:02:24,610 --> 00:02:25,800
There were other people there.
54
00:02:26,110 --> 00:02:27,400
Ned: Some, from what I hear.
55
00:02:27,710 --> 00:02:31,610
Chris: But, you know, if you didn’t go, you might want to consider it.
56
00:02:32,850 --> 00:02:35,309
The TL;DR here is, I was pleasantly surprised
57
00:02:35,309 --> 00:02:37,000
by how much I enjoyed this conference.
58
00:02:37,460 --> 00:02:40,149
And based on the conversations I’ve been having, you know, with
59
00:02:40,500 --> 00:02:44,040
coworkers and people that were there, that’s a pretty common thought.
60
00:02:44,910 --> 00:02:46,429
This seems like one of the good ones.
61
00:02:47,020 --> 00:02:47,580
Ned: Okay.
62
00:02:47,880 --> 00:02:50,900
Chris: So, let’s just start with the positive.
63
00:02:51,429 --> 00:02:55,790
It was held in National Harbor, Maryland, which I can’t for the life
64
00:02:55,790 --> 00:02:59,419
of me tell if this is just, like, an area that’s, like, designated,
65
00:02:59,420 --> 00:03:03,019
is it actually the name of the town, it’s all very confusing.
66
00:03:03,850 --> 00:03:06,610
But National Harbor, Maryland, is isolated by itself.
67
00:03:06,640 --> 00:03:08,960
It was like specifically purpose-built for this.
68
00:03:09,610 --> 00:03:12,559
They have a conference center, they have a lot of restaurants
69
00:03:12,559 --> 00:03:16,009
right up the street, there’s a casino, and the whole place has
70
00:03:16,029 --> 00:03:18,980
that distinct feeling of mandatory fun at high, high prices.
71
00:03:19,210 --> 00:03:21,440
Ned: The kind of prices that only the government can afford.
72
00:03:22,290 --> 00:03:27,070
Chris: [laugh] . Yes, it is quite close to a lot of government-type things.
73
00:03:27,070 --> 00:03:29,329
In fact, if you drive there in the right direction,
74
00:03:29,330 --> 00:03:31,790
you go right past the front gate to the NSA.
75
00:03:32,630 --> 00:03:33,359
Ned: I’ve been there.
76
00:03:33,830 --> 00:03:35,620
At least once of my own volition.
77
00:03:36,430 --> 00:03:39,050
Chris: [laugh] . It’s a weird area, National Harbor.
78
00:03:39,050 --> 00:03:42,540
I overheard one humble conference-goer refer to it as
79
00:03:42,540 --> 00:03:45,499
something to the effect of, “This place is crowded as all
80
00:03:45,510 --> 00:03:48,659
hell, and it smells like pot and body odor,” which, burn.
81
00:03:49,170 --> 00:03:51,760
Ned: An accurate descriptor of almost any city I’ve been in.
82
00:03:53,070 --> 00:03:55,390
Chris: Or just, like, a really expensive frat party.
83
00:03:56,450 --> 00:03:56,840
Ned: [laugh] . Fair.
84
00:03:57,160 --> 00:03:59,910
Chris: Interestingly though, if you’ve heard of this conference, and
85
00:03:59,910 --> 00:04:04,119
you’re like, “It’s not in Maryland,” and you’re now confused, don’t be.
86
00:04:04,490 --> 00:04:07,869
This conference is not held at only one time, in one place.
87
00:04:08,210 --> 00:04:13,920
There are a multitude of them held worldwide, including Dubai,
88
00:04:13,920 --> 00:04:18,750
Mumbai, Sydney, the aforementioned Maryland, Tokyo, and London.
89
00:04:19,540 --> 00:04:23,519
Okay, that’s not as many as I thought, but it’s still more
90
00:04:23,520 --> 00:04:26,759
than the single conference that RSA puts on every year.
91
00:04:27,250 --> 00:04:29,309
Ned: I love that you named off these cities.
92
00:04:30,000 --> 00:04:31,680
Chris: [laugh] . And I pronounced them all correctly.
93
00:04:31,740 --> 00:04:32,550
I think.
94
00:04:32,800 --> 00:04:37,060
Ned: And included amongst those cities, titans of—like, these are major
95
00:04:37,060 --> 00:04:41,230
cities that almost everyone on the planet has heard of, is Maryland.
96
00:04:42,300 --> 00:04:42,330
Chris: [laugh]
97
00:04:42,599 --> 00:04:46,060
Ned: We’re in Maryland, which isn’t even spelled like it
98
00:04:46,060 --> 00:04:50,430
sounds, but we’re here, and at least it’s not Delaware.
99
00:04:51,150 --> 00:04:55,940
Chris: And this is also not a spelling podcast, which, in your case, thank God.
100
00:04:56,330 --> 00:04:59,469
Ned: Ugh, valid, though I would be even more
101
00:04:59,480 --> 00:05:02,229
ill-equipped at a pronunciation podcast.
102
00:05:03,410 --> 00:05:04,960
Chris: Pronunciat-e-own, if you will.
103
00:05:05,340 --> 00:05:06,663
Ned: [French accent] hon hon hon.
104
00:05:06,900 --> 00:05:08,989
Listen, we’re not at that conference.
105
00:05:10,170 --> 00:05:13,640
Chris: Anyway, I think as a direct result of the fact that this
106
00:05:13,640 --> 00:05:17,970
conference is held so many different times, the conference is way smaller.
107
00:05:18,500 --> 00:05:21,390
They did not throw out an exact count that I caught, but I
108
00:05:21,410 --> 00:05:24,030
could have sworn that I heard the number 5000 thrown around.
109
00:05:24,379 --> 00:05:24,769
Ned: Okay.
110
00:05:24,820 --> 00:05:27,600
Chris: Which, if accurate—and it felt right—that’s a nice
111
00:05:27,600 --> 00:05:31,229
bump over the 4300 they had over the past two years.
112
00:05:31,900 --> 00:05:33,910
So, it’s a lot of people, but it’s not a number
113
00:05:33,929 --> 00:05:36,900
that feels completely overwhelming and crushing.
114
00:05:37,719 --> 00:05:41,359
Having said that, there were still times where the hallways
115
00:05:41,359 --> 00:05:45,330
were kind of crowded, and you felt like you couldn’t move
116
00:05:45,370 --> 00:05:48,270
all that well, which makes this humble conference-goer think
117
00:05:48,280 --> 00:05:50,260
they might be [stage whisper] outgrowing National Harbor.
118
00:05:50,540 --> 00:05:53,689
It’s right on that line of uncomfortability.
119
00:05:53,719 --> 00:05:54,379
Which is a word.
120
00:05:54,389 --> 00:05:55,030
Don’t look it up.
121
00:05:55,400 --> 00:05:55,979
Ned: I never do.
122
00:05:56,530 --> 00:05:58,369
Chris: So, another thing that’s different: the conference is
123
00:05:58,369 --> 00:06:03,180
targeted at leaders, primarily CISOs and strategists rather than
124
00:06:03,430 --> 00:06:06,980
hands-on keyboard engineers, all the way up to people in the C-suite.
125
00:06:07,440 --> 00:06:11,919
The fanciest-of-pants attendees also got the opportunity to speak
126
00:06:11,920 --> 00:06:16,819
one-on-one with Gartner analysts about whatever topic they chose.
127
00:06:17,490 --> 00:06:20,680
So, whether they wanted to do a breakdown of a recent publication,
128
00:06:20,690 --> 00:06:23,820
whether they wanted to talk in private about a real specific
129
00:06:23,860 --> 00:06:26,979
use case, whatever they wanted to do, they were able to do it.
130
00:06:27,440 --> 00:06:30,640
Now unsurprisingly, my pants were not that fancy.
131
00:06:31,070 --> 00:06:33,479
Y’all are lucky I was wearing pants, is all I’m going to say.
132
00:06:33,910 --> 00:06:34,280
Ned: Fair.
133
00:06:34,570 --> 00:06:34,950
Chris: Now.
134
00:06:35,379 --> 00:06:39,049
I did spend some time at the lobby bar having unofficial one-on-ones
135
00:06:39,300 --> 00:06:43,360
talking to some analysts and other Gartner employees—Gartnerites?
136
00:06:43,510 --> 00:06:43,690
Gartnerers?
137
00:06:47,160 --> 00:06:47,200
Ned: Garteners.
138
00:06:47,200 --> 00:06:51,610
Chris: Gardeners—and basically found them all insightful and delightful to talk
139
00:06:51,610 --> 00:06:55,070
to, which of course immediately made me wonder why they let me in the door.
140
00:06:55,630 --> 00:06:56,390
Ned: It’s fair.
141
00:06:56,810 --> 00:07:02,500
And I think in the past, you and I have given Gartner its fair share of lumps
142
00:07:02,860 --> 00:07:08,349
for always being one step behind in their advice and their observations, and
143
00:07:08,349 --> 00:07:13,030
charging absolutely ludicrous amounts of money for their reports or membership.
144
00:07:13,590 --> 00:07:16,200
But we can’t really pin that on the individual analysts.
145
00:07:16,260 --> 00:07:16,550
Chris: Right.
146
00:07:16,560 --> 00:07:18,260
Ned: It’s more of an institutional problem.
147
00:07:18,480 --> 00:07:21,439
Chris: And, probably in comparison to some others,
148
00:07:21,820 --> 00:07:23,870
they might actually even be reasonably priced.
149
00:07:24,360 --> 00:07:26,969
I’m not a hundred percent certain, but I think there’s a chance.
150
00:07:27,530 --> 00:07:32,670
Ned: I do recall Forrester being even more expensive and less useful.
151
00:07:33,090 --> 00:07:37,460
Chris: So anyway, “What did they actually talk about at this conference,”
152
00:07:37,500 --> 00:07:40,540
I’m sure you’re frustratedly growling to yourself under your breath.
153
00:07:41,020 --> 00:07:42,990
I suppose we can talk about that.
154
00:07:42,990 --> 00:07:45,780
Ned: I mean, we’ve only danced around it for the last ten minutes.
155
00:07:46,000 --> 00:07:49,179
Chris: But before that, let’s have a deep dive into peanut butter preferences.
156
00:07:49,469 --> 00:07:50,760
Ned: Ooh, crunchy all the way.
157
00:07:50,820 --> 00:07:51,570
No questions.
158
00:07:51,600 --> 00:07:52,180
Chris: Okay, good.
159
00:07:52,180 --> 00:07:52,915
You can stay on the podcast.
160
00:07:52,915 --> 00:07:53,340
Ned: Wooo.
161
00:07:54,140 --> 00:07:56,759
Chris: Because there was in fact, only one right answer.
162
00:07:56,970 --> 00:07:57,840
Ned: [laugh] . Oh no, I know.
163
00:07:58,770 --> 00:08:02,859
So, I will caveat and say if you were trying to bait traps, creamy is better.
164
00:08:03,130 --> 00:08:03,646
Chris: Why would I want to—
165
00:08:03,670 --> 00:08:05,179
Ned: But that’s because you’re trying to catch vermin.
166
00:08:05,410 --> 00:08:06,720
So, there’s that.
167
00:08:06,840 --> 00:08:08,050
Chris: So, what you’re saying is people
168
00:08:08,050 --> 00:08:10,070
that eat creamy peanut butter are vermin?
169
00:08:10,390 --> 00:08:11,590
Ned: I did not say that.
170
00:08:11,770 --> 00:08:17,360
But it didn’t not not [laugh] say that [laugh] . Moving on.
171
00:08:17,420 --> 00:08:18,130
Chris: Anyway.
172
00:08:18,670 --> 00:08:22,400
So, the conference was held over three days and included over
173
00:08:22,400 --> 00:08:26,150
150 sessions, so I’m not going to be able to cover everything.
174
00:08:26,670 --> 00:08:29,650
And there actually were some technical ones, like, yes, it
175
00:08:29,650 --> 00:08:32,590
was mostly aimed at leaders, but there was some really into
176
00:08:32,590 --> 00:08:36,140
the weeds type of stuff, particularly on the expo floor.
177
00:08:36,500 --> 00:08:39,749
The vendors would do 20-minute sessions, and unfortunately,
178
00:08:39,750 --> 00:08:42,880
these were not recorded, so even if Gartner does make stuff
179
00:08:42,880 --> 00:08:45,640
available, those are not going to ever be available to anybody.
180
00:08:45,900 --> 00:08:47,760
Which is annoying because I saw an awesome one about
181
00:08:48,400 --> 00:08:51,300
Passwordless by YubiKey that I did not take enough notes on.
182
00:08:52,340 --> 00:08:55,890
In terms of the formal conference sessions, however, there were
183
00:08:55,900 --> 00:09:00,719
blessedly only two formal Gartner keynotes—count them: two—
184
00:09:01,210 --> 00:09:01,620
Ned: One, two.
185
00:09:01,860 --> 00:09:04,470
Chris: —and then there were three guest keynotes.
186
00:09:05,070 --> 00:09:07,210
Now, it’s still too many keynotes, but
187
00:09:07,210 --> 00:09:10,525
it’s way more reasonable than RSA’s 36.
188
00:09:10,960 --> 00:09:14,510
If you would like to hear the full summary of the summit from
189
00:09:14,910 --> 00:09:17,049
Gartner themselves—which hilariously, they published before
190
00:09:17,049 --> 00:09:20,490
the show started—there are links in the [show notes]. Also, the two
191
00:09:20,490 --> 00:09:24,330
formal keynotes, as well as one random session about strategy
192
00:09:24,870 --> 00:09:27,689
from the CEO’s perspective, are already published on YouTube.
193
00:09:28,210 --> 00:09:31,639
Also, also, also, if you just can’t get enough Gartner—
194
00:09:32,170 --> 00:09:32,690
Ned: Who can?
195
00:09:32,940 --> 00:09:36,340
Chris: They publish, like, 40 podcasts, which is a
196
00:09:36,340 --> 00:09:38,790
fun fact that I literally only learned yesterday.
197
00:09:39,120 --> 00:09:39,550
Ned: Huh.
198
00:09:39,780 --> 00:09:42,130
Chris: And scanning through them, they’re all nicely
199
00:09:42,130 --> 00:09:44,879
organized, and there’s a lot of episode titles that sound a
200
00:09:44,880 --> 00:09:48,640
lot like session titles, is all I’m going to say about that.
201
00:09:48,980 --> 00:09:50,709
Ned: And they’re just giving those away?
202
00:09:50,709 --> 00:09:51,240
Chris: [laugh] Yeah.
203
00:09:51,670 --> 00:09:52,469
Yes, they are.
204
00:09:52,820 --> 00:09:53,530
Ned: Fascinating.
205
00:09:53,940 --> 00:09:58,590
Chris: So, the main keynote that opened the show had an
206
00:09:58,600 --> 00:10:01,780
interesting position that I think is worth exploring in depth.
207
00:10:02,440 --> 00:10:07,280
And that position is this: IT, especially IT security, is
208
00:10:07,280 --> 00:10:12,049
paranoiacally focused on a hundred percent perfect performance.
209
00:10:12,629 --> 00:10:15,960
And that is not helpful, and we need to change that expectation.
210
00:10:16,640 --> 00:10:20,579
Now, it’s probably not a natural-feeling concept because
211
00:10:20,580 --> 00:10:24,610
you’re probably asking yourself, isn’t perfection the goal?
212
00:10:25,090 --> 00:10:27,519
Well, I mean, it is, but it isn’t.
213
00:10:29,049 --> 00:10:30,650
Think about anything.
214
00:10:31,250 --> 00:10:36,209
There’s no game, no job, no hobby, no activity at all where
215
00:10:36,210 --> 00:10:40,270
you can or should expect a hundred percent perfection.
216
00:10:41,000 --> 00:10:43,020
You’re going to lose at tic-tac-toe every once in a while.
217
00:10:43,070 --> 00:10:43,939
It happens.
218
00:10:44,490 --> 00:10:47,089
You didn’t get a hundred percent perfect grades in school, right?
219
00:10:47,119 --> 00:10:48,519
I mean, obviously Ned didn’t.
220
00:10:48,970 --> 00:10:50,220
It was a generic question.
221
00:10:50,370 --> 00:10:54,180
Even your valedictorian missed a question here and there.
222
00:10:54,550 --> 00:10:55,709
Ned: I was valedictorian.
223
00:10:55,910 --> 00:10:56,730
Chris: No, you weren’t.
224
00:10:57,980 --> 00:10:58,910
Ned: I sure was.
225
00:10:58,960 --> 00:11:00,530
Chris: You were only one of those syllables.
226
00:11:01,230 --> 00:11:01,260
Ned: [laugh]
227
00:11:01,370 --> 00:11:03,910
Chris: The comparison that was made at the conference
228
00:11:03,960 --> 00:11:07,710
in multiple places, not just this keynote, was retail.
229
00:11:08,310 --> 00:11:12,560
There is no expectation that, in retail, loss prevention
230
00:11:12,560 --> 00:11:16,050
teams, which do exist and are taken very seriously, there’s
231
00:11:16,050 --> 00:11:19,010
no expectation that they’re going to stop all losses.
232
00:11:19,780 --> 00:11:22,509
In fact, they don’t even call them losses, they categorize
233
00:11:22,510 --> 00:11:25,460
it as inventory shrinkage, which is a delightful term.
234
00:11:25,980 --> 00:11:26,590
Ned: It is.
235
00:11:26,940 --> 00:11:27,840
But there’s a reason.
236
00:11:28,200 --> 00:11:28,420
Chris: Right.
237
00:11:28,430 --> 00:11:30,850
It is a common and expected measure.
238
00:11:31,540 --> 00:11:34,689
And these losses are incurred, not just from shoplifting.
239
00:11:35,090 --> 00:11:40,500
It could be internal theft, external theft, errors in shipping, vendor
240
00:11:40,500 --> 00:11:45,430
fraud or vendor mistakes, or damaged goods that can’t be sold or returned.
241
00:11:46,160 --> 00:11:49,109
Ned: Or even mistakes when taking inventory.
242
00:11:49,770 --> 00:11:50,899
Chris: That’s a good point, too.
243
00:11:52,099 --> 00:11:54,790
Stuff in retail goes missing.
244
00:11:55,420 --> 00:11:58,079
It is just the reality of the business, and if you’re thinking,
245
00:11:58,080 --> 00:12:01,330
well, this is just the dollar store that you’re talking about, no.
246
00:12:01,850 --> 00:12:05,690
Even the craziest of high-end stores have shrinkage.
247
00:12:06,270 --> 00:12:08,290
Tiffany’s has losses.
248
00:12:08,679 --> 00:12:10,230
Ferraris get stolen.
249
00:12:10,620 --> 00:12:11,250
Ned: Impressive.
250
00:12:11,500 --> 00:12:12,450
Chris: It happens.
251
00:12:13,250 --> 00:12:16,960
So, in terms of tying that back to IT security, the point was twofold.
252
00:12:17,680 --> 00:12:23,310
First, this causes us as an industry to be way too focused on prevention
253
00:12:24,010 --> 00:12:30,060
of breaches and not nearly enough on the recovery from breaches.
254
00:12:30,820 --> 00:12:35,680
The second point was that this obsession is causing people to drive
255
00:12:35,680 --> 00:12:40,410
themselves crazy with overwork, and brings about—you guessed it—burnout.
256
00:12:41,490 --> 00:12:43,430
So, let’s take the points in order.
257
00:12:43,950 --> 00:12:48,079
First off, we’re focused too much on prevention and not enough on recovery.
258
00:12:48,730 --> 00:12:51,900
And it feels weird that this is a problem because one of the major
259
00:12:51,900 --> 00:12:56,220
things that we say about security is, “It’s not if you get breached.
260
00:12:56,850 --> 00:13:00,990
It’s when you get breached.” So, on the one hand, we have
261
00:13:00,990 --> 00:13:02,839
this recognition that bad things are going to happen.
262
00:13:03,210 --> 00:13:05,290
On the other, we expect a hundred percent perfection.
263
00:13:05,929 --> 00:13:06,760
What are we doing?
264
00:13:07,219 --> 00:13:08,130
Ned: What are we doing?
265
00:13:08,740 --> 00:13:12,290
Chris: And this [sigh] paradox, dichotomy, whatever you want to
266
00:13:12,290 --> 00:13:16,380
call it, is doubly true when consistent evidence clearly shows
267
00:13:16,400 --> 00:13:19,550
companies, including CEOs and boards, are all perfectly willing
268
00:13:19,550 --> 00:13:23,349
to increase their risk exposure in order to achieve growth.
269
00:13:24,590 --> 00:13:26,940
You look like you were going to say something, but you might just be gassy.
270
00:13:27,349 --> 00:13:33,660
Ned: No, I’m thinking about the concept of risk and how it’s just, any large
271
00:13:33,660 --> 00:13:39,340
corporation, part of what sort of the financial side of the house does is
272
00:13:39,389 --> 00:13:46,359
assessing risk and determining what are the actual risks involved, whether it’s
273
00:13:46,369 --> 00:13:53,000
cyberattacks or something else; what is the expected cost of that risk, and
274
00:13:53,280 --> 00:13:59,160
what is the impact; and then, are we willing to do what needs to be done to prevent
275
00:13:59,270 --> 00:14:05,250
that risk from occurring or are we willing to just accept the risk as is, and
276
00:14:05,250 --> 00:14:10,089
pay the penalty if we think it’s sufficiently unlikely or not expensive enough?
277
00:14:10,210 --> 00:14:10,360
Chris: Right.
278
00:14:10,400 --> 00:14:13,609
Ned: So, it’s more of a financial question than anything else.
279
00:14:13,960 --> 00:14:17,470
But I don’t think that perspective tends to trickle down to the rank
280
00:14:17,470 --> 00:14:21,240
and file InfoSec people, and those are the ones who need to get the
281
00:14:21,240 --> 00:14:25,360
message that a hundred percent perfection is actually not the goal.
282
00:14:25,670 --> 00:14:28,360
It’s mitigating the risks that are worth mitigating.
283
00:14:28,730 --> 00:14:29,020
Chris: Right.
284
00:14:29,350 --> 00:14:33,349
And being prepared to react when something goes wrong.
285
00:14:34,059 --> 00:14:34,639
Ned: Right.
286
00:14:35,309 --> 00:14:38,470
There’s a certain group of people who think of it as a
287
00:14:38,490 --> 00:14:43,150
defeatist attitude to say, “It’s not about if you get breached;
288
00:14:43,150 --> 00:14:45,240
it’s about when,” and they go, “Well, that’s just defeatism.
289
00:14:45,240 --> 00:14:46,830
I can prevent everybody,” you know?
290
00:14:47,420 --> 00:14:51,860
Or, “We should strive for perfection.” And I’m not in that
291
00:14:51,860 --> 00:14:54,410
camp, but I understand where they’re coming from, where
292
00:14:54,410 --> 00:14:58,089
you don’t want to just complacently accept mediocrity.
293
00:14:58,570 --> 00:15:00,569
So, there’s a balance to be struck.
294
00:15:01,020 --> 00:15:01,400
Chris: Yeah.
295
00:15:01,880 --> 00:15:04,140
And incidentally, one thing I didn’t have time to talk about, but
296
00:15:04,140 --> 00:15:08,170
they talked about in-depth is, how do you, as a company, actually
297
00:15:08,170 --> 00:15:11,450
figure out what level of risk you’re comfortable with, you know?
298
00:15:11,450 --> 00:15:15,660
Designing a risk portfolio, and a risk registry, and all that type of stuff.
299
00:15:15,660 --> 00:15:19,720
It was an interesting topic that honestly, probably could have its own episode.
300
00:15:20,190 --> 00:15:23,080
But the point that they’re trying to make in terms of the business
301
00:15:23,100 --> 00:15:26,999
and how you operate as an IT shop is, you’ve got to focus on
302
00:15:27,000 --> 00:15:29,979
that recovery, and you’ve got to take it extremely seriously.
303
00:15:30,580 --> 00:15:33,540
So, this means things, like, actual immutable backups,
304
00:15:33,730 --> 00:15:37,120
creating and maintaining recovery runbooks, and especially
305
00:15:37,720 --> 00:15:41,460
DR practice needs to be brought further into the forefront.
306
00:15:41,670 --> 00:15:44,149
And based on what they talked about, it would
307
00:15:44,160 --> 00:15:46,310
be, frankly, irresponsible to do otherwise.
308
00:15:46,920 --> 00:15:50,500
And the other part of this is, that it’s super unfun.
309
00:15:50,500 --> 00:15:50,950
Ned: Oh, yeah.
310
00:15:51,620 --> 00:15:57,089
Chris: But it is totally necessary to practice your disaster recovery procedure.
311
00:15:58,050 --> 00:16:01,810
Everybody always thinks that recovering from an incident is as simple as
312
00:16:01,840 --> 00:16:05,510
“I’ll just follow all the steps in the document,” which is how people think.
313
00:16:05,820 --> 00:16:06,350
Ned: Right.
314
00:16:06,520 --> 00:16:06,859
Mm-hm.
315
00:16:07,250 --> 00:16:09,550
Chris: But anybody that’s ever been in a disaster situation,
316
00:16:10,059 --> 00:16:13,900
of any kind really, knows that it’s not that simple.
317
00:16:14,730 --> 00:16:16,930
You’re simply not in a state where you’re thinking clearly.
318
00:16:18,390 --> 00:16:21,960
Even the most basic of tasks is exponentially harder because the
319
00:16:21,960 --> 00:16:26,939
adrenaline is overwhelming your system, people are yelling, systems
320
00:16:26,940 --> 00:16:30,319
are slow, and oh, my God, this document hasn’t been updated since 2016.
321
00:16:32,250 --> 00:16:32,720
Jake?
322
00:16:32,900 --> 00:16:34,010
Why is Jake’s name in here?
323
00:16:34,010 --> 00:16:35,360
He doesn’t work here anymore.
324
00:16:35,560 --> 00:16:37,600
And he’s the one that knows all the passwords?
325
00:16:38,230 --> 00:16:39,970
These are not things you want to have happen at three
326
00:16:39,970 --> 00:16:42,980
o’clock in the morning when the systems are all on fire.
327
00:16:44,009 --> 00:16:49,140
Ned: My favorite was that all of the passwords were stored in a password
328
00:16:49,270 --> 00:16:54,170
security application that ran in the data center that had gone down.
329
00:16:54,600 --> 00:16:55,010
Chris: Nice.
330
00:16:55,840 --> 00:16:56,990
Ned: Sooo [laugh]
331
00:16:57,500 --> 00:17:00,420
Chris: Facebook had a little incident like that a few years ago, where—
332
00:17:00,940 --> 00:17:01,349
Ned: They sure did.
333
00:17:01,349 --> 00:17:03,710
Chris: Couldn’t get into their building because the system
334
00:17:03,710 --> 00:17:06,260
that controlled the front doors was in the building.
335
00:17:06,449 --> 00:17:06,899
Ned: Oops.
336
00:17:07,129 --> 00:17:09,970
Chris: These are the sorts of things you figure out when you practice.
337
00:17:10,349 --> 00:17:11,050
Ned: Yeah.
338
00:17:11,630 --> 00:17:14,359
Chris: So yeah, recovery has to be a priority because
339
00:17:14,380 --> 00:17:16,510
you all know eventually something is going to go wrong.
340
00:17:17,319 --> 00:17:21,599
And as I wrote in my notes about this point from when I actually
341
00:17:21,599 --> 00:17:25,339
watched the keynote, quote, “You are going to get breached at
342
00:17:25,339 --> 00:17:29,280
some point because, math.” And goddammit, I think I nailed it.
343
00:17:29,750 --> 00:17:31,919
Ned: Ah, yet another thing we have to blame math for.
344
00:17:32,769 --> 00:17:35,760
Chris: [laugh] . So, pivoting on to point two: burnout.
345
00:17:36,300 --> 00:17:40,639
This constant obsession with perfection, which we have established
346
00:17:40,639 --> 00:17:45,730
and agreed is an impossible target, is ruining people, has been
347
00:17:45,730 --> 00:17:50,940
doing so for years, and we are only now starting to reckon with it.
348
00:17:50,940 --> 00:17:53,000
It is interesting enough to me that it came up
349
00:17:53,000 --> 00:17:56,710
at RSA, and now it’s in the keynote at Gartner.
350
00:17:56,710 --> 00:17:57,819
Obviously, this is a thought.
351
00:17:57,849 --> 00:18:00,800
And there were, of course, statistics to back it up.
352
00:18:01,480 --> 00:18:04,639
Gartner—shar—bleh—Gartner sharted?
353
00:18:04,900 --> 00:18:05,190
No.
354
00:18:05,420 --> 00:18:06,010
No, no, no, no.
355
00:18:06,300 --> 00:18:06,920
Ned: Let’s hope not.
356
00:18:07,080 --> 00:18:09,030
Chris: I mean, they were all wearing dark suits.
357
00:18:09,059 --> 00:18:09,629
I’m not sure.
358
00:18:09,719 --> 00:18:10,200
Moving on.
359
00:18:10,309 --> 00:18:15,760
Gartner shared a statistic that I thought was interesting, and that is
360
00:18:15,770 --> 00:18:21,380
62% of cyber leaders reported experiencing symptoms of burnout last year,
361
00:18:21,850 --> 00:18:26,270
quote, “At least once.” And if you know anything about self-reported
362
00:18:26,280 --> 00:18:30,920
stats, you know the actual number there is probably well higher.
363
00:18:31,730 --> 00:18:32,739
Ned: Agreed, yeah.
364
00:18:33,230 --> 00:18:37,420
Chris: So, one of the guest keynotes attacked this problem head on.
365
00:18:37,990 --> 00:18:41,429
First, though, I think it’s worth grabbing a definition.
366
00:18:42,080 --> 00:18:47,089
Burnout is a specific consequence of occupational stress.
367
00:18:47,570 --> 00:18:48,629
It’s a work hazard.
368
00:18:49,000 --> 00:18:52,600
So, it’s not in the DSM, although there is some argument that it should be.
369
00:18:53,209 --> 00:18:57,580
The World Health Organization defines it as, quote, “A syndrome conceptualized
370
00:18:57,610 --> 00:19:01,730
as resulting from chronic workplace stress that has not been successfully
371
00:19:01,730 --> 00:19:06,070
managed, characterized by feelings of energy depletion or exhaustion, increased
372
00:19:06,070 --> 00:19:11,380
mental distance from one’s job, and reduced professional efficacy.” Unquote.
373
00:19:11,380 --> 00:19:15,169
It is chronic and repeated stress that depletes your
374
00:19:15,170 --> 00:19:18,950
mental energy, and that energy does not get recharged.
375
00:19:19,770 --> 00:19:22,560
And what she said was, it’s basically a permanent state of fight or flight.
376
00:19:23,210 --> 00:19:26,340
It is pernicious and damaging, and you might not recognize it when it’s
377
00:19:26,340 --> 00:19:30,079
happening at first, and it can take months to properly recover from it.
378
00:19:30,760 --> 00:19:34,640
Ned: Most people I know that have gone through severe burnout were not
379
00:19:34,640 --> 00:19:39,460
aware of the severity during the time they were actually getting burned out.
380
00:19:39,950 --> 00:19:44,610
And they all had a tipping point where they just
381
00:19:44,860 --> 00:19:48,340
collapsed, emotionally and sometimes even physically,
382
00:19:48,600 --> 00:19:52,530
and just were not physically capable of doing their job.
383
00:19:52,860 --> 00:19:57,600
And it was only when they went back and looked at what they’d been
384
00:19:57,600 --> 00:20:01,799
dealing with over the past, you know, six months, year, whatever, that
385
00:20:01,799 --> 00:20:05,450
they came to realize the level of burnout they’d been experiencing.
386
00:20:05,650 --> 00:20:08,580
And this isn’t the sort of thing where, like, Joe goes on holiday
387
00:20:08,580 --> 00:20:11,890
for, you know, a week, and now he’s fine, and everything’s cool.
388
00:20:12,060 --> 00:20:14,680
This is the sort of thing where people just move to a totally
389
00:20:14,680 --> 00:20:17,840
different industry because they, they can’t do it anymore.
390
00:20:18,200 --> 00:20:18,480
Chris: Right.
391
00:20:18,759 --> 00:20:21,260
Ned: People will drop out of InfoSec because they are
392
00:20:21,260 --> 00:20:23,230
burned out, and that’s not, like, a thing, you can
393
00:20:23,230 --> 00:20:26,100
take a week off, or even two weeks, and be fine with.
394
00:20:26,330 --> 00:20:29,040
That’s something that you’re going to have to deal with more holistically.
395
00:20:29,660 --> 00:20:30,320
Chris: Yes.
396
00:20:30,670 --> 00:20:33,020
And it’s nice that it’s being talked about at this level
397
00:20:33,030 --> 00:20:36,850
because frankly, I wasn’t sure that I was hearing correctly.
398
00:20:37,160 --> 00:20:39,519
Because as I said, they’re talking to leadership,
399
00:20:39,639 --> 00:20:42,030
and leadership doesn’t often want to hear about it.
400
00:20:43,550 --> 00:20:45,760
But they shared more stats.
401
00:20:46,370 --> 00:20:47,820
Do you want to hear more stats?
402
00:20:48,040 --> 00:20:49,030
Ned: I love stats.
403
00:20:49,120 --> 00:20:49,690
You know me.
404
00:20:49,860 --> 00:20:51,170
Stats guy, all the way.
405
00:20:51,780 --> 00:20:55,889
Chris: In the past year, burnout is estimated to have cost the world economy
406
00:20:55,900 --> 00:21:03,659
$2 trillion in lost productivity, in days off, in inefficiencies, et cetera, and
407
00:21:03,700 --> 00:21:09,730
burnout-related mistakes were a direct contributor to 83% of security breaches.
408
00:21:10,070 --> 00:21:10,550
Ned: Wow.
409
00:21:11,139 --> 00:21:13,050
Chris: And a lot of it, I think, comes down to exactly what
410
00:21:13,050 --> 00:21:15,419
you just said a second ago, which is when people start to
411
00:21:15,420 --> 00:21:18,320
suffer from burnout, they don’t recognize it right away.
412
00:21:19,110 --> 00:21:22,629
It becomes the new normal, and that normal means that
413
00:21:22,640 --> 00:21:27,690
you are now at 90% efficient, and then 80% efficient, and
414
00:21:27,690 --> 00:21:30,869
then 70% efficient, and then you feel tired all the time.
415
00:21:30,900 --> 00:21:35,510
And then everything is negative, and you are distancing yourself from your job.
416
00:21:36,000 --> 00:21:40,560
You are feeling a complete sense of hopelessness, a total lack of initiative,
417
00:21:40,580 --> 00:21:43,800
and just want to get through the week, so you can sleep all weekend.
418
00:21:44,599 --> 00:21:48,689
And as the nice lady said, you can’t fix that with a fucking pizza party.
419
00:21:50,570 --> 00:21:52,309
Ned: I hope she said that on stage.
420
00:21:52,629 --> 00:21:55,679
Chris: She didn’t really say it, but I was reading between the lines.
421
00:21:55,859 --> 00:21:57,300
I think that’s what she wanted to say.
422
00:21:57,620 --> 00:21:58,409
Ned: No, I agree.
423
00:21:58,430 --> 00:22:01,820
And I mean, to compound problems, we have sort of this
424
00:22:03,969 --> 00:22:06,790
rockstar… mentality, warrior mentality, whatever you want to
425
00:22:06,790 --> 00:22:09,519
call it, where it’s like, well, yeah, things are tough right
426
00:22:09,520 --> 00:22:12,980
now, but I’m tough, and I can work through it, you know?
427
00:22:12,980 --> 00:22:14,930
It’s on me to help save the company.
428
00:22:14,990 --> 00:22:19,390
Like, that… [sigh] it’s even self-imposed sometimes, but sort of this
429
00:22:19,520 --> 00:22:25,299
expectation where we laud the people who persevere through adversity without
430
00:22:25,299 --> 00:22:28,179
reckoning with the fact that they’re probably traumatized at the end.
431
00:22:28,529 --> 00:22:28,809
Chris: Right.
432
00:22:28,850 --> 00:22:31,050
I mean, just to make a quick connection, that’s the
433
00:22:31,050 --> 00:22:34,710
same thing where you valorize people on Instagram
434
00:22:34,720 --> 00:22:38,060
based on what they post, and not on their actual life.
435
00:22:38,090 --> 00:22:42,740
Like, you have this vision of people that is not correct, not based in reality.
436
00:22:43,349 --> 00:22:45,470
But anyway, back to burnout, in general.
437
00:22:46,059 --> 00:22:50,550
She really did say it can’t be fixed by a field trip or a wellness program.
438
00:22:51,010 --> 00:22:54,550
Not that field trips aren’t important, and not that, you know, being
439
00:22:54,550 --> 00:22:58,680
mindful, doing meditation, being careful about your mind and your body.
440
00:22:58,680 --> 00:22:59,949
And all that is really important, but it’s
441
00:22:59,960 --> 00:23:02,510
not like a—you can’t take a pill to fix it.
442
00:23:02,889 --> 00:23:03,239
Ned: Right.
443
00:23:03,639 --> 00:23:06,550
Chris: And as an industry, it is a huge problem.
444
00:23:06,619 --> 00:23:08,510
$2 trillion is a lot of dollars.
445
00:23:09,000 --> 00:23:12,440
It will only be fixed if we fundamentally rethink the way
446
00:23:12,440 --> 00:23:14,850
that we think about work, the way that we report about
447
00:23:14,850 --> 00:23:17,240
work, and the things that we make people responsible for.
448
00:23:18,480 --> 00:23:22,300
Now, the session made one more fascinating connection for me.
449
00:23:22,809 --> 00:23:27,149
I think it is pretty well known, A, that burnout exists, even if you
450
00:23:27,180 --> 00:23:29,889
don’t understand the severity of it, but it’s pretty well known that the
451
00:23:29,890 --> 00:23:33,240
first thing that goes when you start to feel burned out—I mean, hell,
452
00:23:33,240 --> 00:23:36,879
even when you start to feel tired—you lose the spark of creativity.
453
00:23:37,490 --> 00:23:38,469
But here’s the thing.
454
00:23:40,029 --> 00:23:41,579
IT work is creative in nature.
455
00:23:42,109 --> 00:23:44,959
Problem-solving is a creative endeavor.
456
00:23:45,120 --> 00:23:47,590
This is not a factory floor.
457
00:23:48,370 --> 00:23:52,590
And what is IT if it’s not a series of problems that need to be solved?
458
00:23:53,080 --> 00:23:53,379
Ned: Right.
459
00:23:53,990 --> 00:23:57,949
Chris: Now, the session with the guest speaker is not posted on the
460
00:23:57,980 --> 00:24:01,750
Gartner site, as of time of recording at least, but what I am going
461
00:24:01,750 --> 00:24:04,470
to include in the [show notes] is a link to the speaker’s previous
462
00:24:04,480 --> 00:24:07,379
five minute… it’s not even a TED Talk, it’s like a TED summary
463
00:24:08,070 --> 00:24:11,810
that both hits a lot of these points, and also has fun graphics.
464
00:24:12,170 --> 00:24:13,550
Ned: Ooh, I like those.
465
00:24:13,840 --> 00:24:16,599
Chris: One quote that came from it that really summed this up was,
466
00:24:16,609 --> 00:24:20,810
quote, “Productivity has wrapped itself up in our self-worth so much
467
00:24:20,840 --> 00:24:24,430
that it’s almost impossible to allow ourselves to stop working.”
468
00:24:25,150 --> 00:24:25,450
Ned: Oh.
469
00:24:25,900 --> 00:24:28,149
Chris: That’s the problem, and we need to stop that.
470
00:24:28,900 --> 00:24:30,549
Ned: I feel attacked [laugh]
471
00:24:31,120 --> 00:24:33,720
Chris: [laugh] . So, that’s the gist of it.
472
00:24:33,900 --> 00:24:34,980
Oh, Bear Grylls was there.
473
00:24:34,980 --> 00:24:36,290
He told us to never give up.
474
00:24:36,480 --> 00:24:37,750
Ned: Ugh, fuck that guy.
475
00:24:39,210 --> 00:24:39,899
Chris: Now, he did well.
476
00:24:39,910 --> 00:24:40,830
I mean, he was a good speaker.
477
00:24:40,830 --> 00:24:41,460
I was surprised.
478
00:24:42,110 --> 00:24:43,610
Ned: I actually don’t have anything against him.
479
00:24:43,680 --> 00:24:44,070
I just—
480
00:24:44,080 --> 00:24:44,660
Chris: You like the—
481
00:24:44,670 --> 00:24:47,639
Ned: —that archetype, I don’t jive with it.
482
00:24:47,679 --> 00:24:49,599
Chris: Would it be better if his name was like Jerome?
483
00:24:50,030 --> 00:24:50,980
Ned: Jerome Grylls?
484
00:24:51,070 --> 00:24:51,470
Chris: Mm-hm.
485
00:24:51,780 --> 00:24:52,420
Ned: Yeah, maybe.
486
00:24:52,559 --> 00:24:52,999
Okay.
487
00:24:53,369 --> 00:24:55,340
Chris: Actually, Jerome Grylls sounds like a YouTube channel.
488
00:24:55,780 --> 00:25:00,380
Ned: I will send him a handcrafted note made from my own
489
00:25:00,380 --> 00:25:04,560
pressed papyrus, explaining to him in grave detail how
490
00:25:04,560 --> 00:25:07,559
he should change his name, so that I will approve of it.
491
00:25:08,200 --> 00:25:11,300
Chris: And really, that’s, I think, the validation he needs in life.
492
00:25:11,969 --> 00:25:13,330
Ned: It’s the validation we all need.
493
00:25:13,490 --> 00:25:16,150
Wouldn’t you feel better if you got, like, a handwritten
494
00:25:16,150 --> 00:25:20,350
letter on hand-pressed papyrus from somebody else?
495
00:25:21,220 --> 00:25:22,510
Chris: I don’t think I’d know the difference.
496
00:25:22,550 --> 00:25:24,470
It just looks like flat oatmeal.
497
00:25:25,080 --> 00:25:25,750
Ned: Fine.
498
00:25:26,120 --> 00:25:28,699
There goes your Christmas present, right out the window.
499
00:25:29,549 --> 00:25:31,090
Chris: I hope you at least made it into an airplane.
500
00:25:31,940 --> 00:25:34,100
Okay, what about AI?
501
00:25:34,420 --> 00:25:37,840
Well, AI is a big thing too, as you can imagine.
502
00:25:38,179 --> 00:25:41,770
But I was impressed with how careful, how realistic, and
503
00:25:41,770 --> 00:25:45,740
some would say cynical a lot of the presenters were about AI.
504
00:25:46,010 --> 00:25:46,330
Ned: Good.
505
00:25:47,199 --> 00:25:49,840
Chris: AI, in their perspective, simply put, is
506
00:25:49,840 --> 00:25:53,370
just another tool with just another batch of risk.
507
00:25:53,930 --> 00:25:55,980
No more, no less.
508
00:25:56,740 --> 00:25:59,439
It is not anywhere near the level of taking over the
509
00:25:59,440 --> 00:26:02,109
world that the press releases would have you believe.
510
00:26:02,860 --> 00:26:06,950
Hallucination is a huge problem and a massive risk to the
511
00:26:06,950 --> 00:26:11,500
enterprise, and one that in a lot of these tools has not been solved.
512
00:26:12,210 --> 00:26:15,110
So, talking about our risk registry above, how much
513
00:26:15,110 --> 00:26:17,270
risk are you willing to take around something like that?
514
00:26:17,830 --> 00:26:21,050
Again, it’s different at the enterprise level.
515
00:26:21,410 --> 00:26:26,350
Now, one thing that they did talk about was how AI has been tied into certain
516
00:26:26,370 --> 00:26:31,560
tools that are out there, and something that it really excels at, which is
517
00:26:31,880 --> 00:26:37,609
translating a human language statement into a query that is programmatic.
518
00:26:37,650 --> 00:26:40,319
So, for example, a lot of threat-hunting software
519
00:26:40,340 --> 00:26:44,169
out there uses proprietary query languages, like KQL.
520
00:26:45,060 --> 00:26:47,879
There are more; that’s the only one I can think of because it’s the worst one.
521
00:26:48,480 --> 00:26:49,480
Ned: It really is.
522
00:26:50,420 --> 00:26:56,630
Chris: So, GenAI tools can take a human language ask and spit out a KQL query.
523
00:26:57,330 --> 00:26:59,560
Now, this use case is really helpful, especially
524
00:26:59,560 --> 00:27:02,050
for people that are already fluent in KQL, right?
525
00:27:02,050 --> 00:27:04,440
And this is not that much different than the way we talked
526
00:27:04,440 --> 00:27:06,970
about Copilot when that first came out all those years ago.
527
00:27:07,480 --> 00:27:09,590
If you know what you’re doing, you can ask it the right
528
00:27:09,590 --> 00:27:13,069
question, and you can also vet what it sends back to you.
529
00:27:13,889 --> 00:27:17,020
But usually you can do that in a matter of seconds, especially when
530
00:27:17,020 --> 00:27:20,730
we’re talking about something as simple as a one or two-line KQL query.
531
00:27:21,350 --> 00:27:22,330
That’s a time saver.
532
00:27:22,740 --> 00:27:23,180
Ned: Totally.
533
00:27:23,640 --> 00:27:29,860
Chris: But the point that they made is that the base query language is still necessary.
534
00:27:30,550 --> 00:27:32,790
All AI is doing is acting as an intermediary.
535
00:27:33,450 --> 00:27:37,890
When you ask it to interrogate that data directly, it falls
536
00:27:37,890 --> 00:27:41,660
flat on its face, then it tries to get up and it falls again.
537
00:27:42,680 --> 00:27:45,670
So, that’s where they’re at with that.
538
00:27:46,200 --> 00:27:46,720
Ned: Okay.
539
00:27:47,230 --> 00:27:50,959
Chris: They do have something out there called the AI Impact Radar,
540
00:27:51,469 --> 00:27:55,450
which is kind of a—really, it’s a graphic, basically, that helps guide
541
00:27:55,450 --> 00:28:00,020
understanding about where Gartner thinks the various pieces of AI are.
542
00:28:00,560 --> 00:28:03,070
Because there are more than just ChatGPT.
543
00:28:03,070 --> 00:28:04,980
I think there’s 30 of them on there.
544
00:28:05,610 --> 00:28:11,199
And they go out in escalating circles depending on how far Gartner thinks
545
00:28:11,230 --> 00:28:16,050
these various investments are from being valuable or being mainstream.
546
00:28:16,710 --> 00:28:20,190
But if you want the TL;DR, their recommendation is to hold off
547
00:28:20,190 --> 00:28:23,389
on long-range future GenAI technology investments at this time.
548
00:28:24,040 --> 00:28:26,110
Like I said, reasonable.
549
00:28:26,950 --> 00:28:27,430
Ned: Yeah.
550
00:28:27,679 --> 00:28:28,929
Remarkably reasonable.
551
00:28:29,459 --> 00:28:32,390
Chris: [laugh] . So, all right, we’re coming up on time.
552
00:28:32,390 --> 00:28:36,200
A couple of quick hits from some of the other sessions.
553
00:28:36,770 --> 00:28:40,690
One that came up was around security behavior and culture
554
00:28:40,690 --> 00:28:44,529
programs being more meaningful than just awareness.
555
00:28:45,210 --> 00:28:49,070
There’s a lot of evidence that shows that we have reached the rational
556
00:28:49,070 --> 00:28:52,860
limit of the amount of value that we’re going to get out of fake phishing
557
00:28:53,189 --> 00:28:58,360
campaigns, or forcing people to watch a 15-minute webinar once a year.
558
00:28:59,010 --> 00:29:02,529
What you need to do is teach your employees how to act and
559
00:29:02,530 --> 00:29:07,030
respond, not just look out for and be aware that this exists.
560
00:29:08,190 --> 00:29:13,020
Gartner does have a whole setup for this, which they called the PIPE framework.
561
00:29:13,320 --> 00:29:18,169
What that stands for, I have absolutely no idea, and I actually couldn’t
562
00:29:18,170 --> 00:29:21,320
find my notes from the session, so that’s all you’re going to get.
563
00:29:22,799 --> 00:29:25,209
I did watch a good one about zero trust.
564
00:29:25,570 --> 00:29:28,790
One of the biggest [laugh] problems that the Gartner people
565
00:29:28,790 --> 00:29:31,770
had was that they don’t think ‘zero trust’ is a great name.
566
00:29:32,270 --> 00:29:33,080
Ned: Yeah…
567
00:29:33,320 --> 00:29:34,209
Chris: It’s catchy.
568
00:29:34,719 --> 00:29:35,679
Ned: It is catchy.
569
00:29:35,910 --> 00:29:37,689
Chris: But it’s not zero trust.
570
00:29:37,719 --> 00:29:41,910
It’s just the right amount of trust, albeit that amount of trust is
571
00:29:41,949 --> 00:29:45,800
pretty effing low, and can change at any time if you decide to be naughty.
572
00:29:46,140 --> 00:29:48,800
Which is a lot of words.
573
00:29:49,190 --> 00:29:52,570
Ned: I like zero trust for what it expresses.
574
00:29:52,580 --> 00:29:56,289
I don’t love how it was slapped on everything for about two years.
575
00:29:57,259 --> 00:29:59,339
Chris: [laugh] . Yeah, I don’t think they liked that either.
576
00:30:00,049 --> 00:30:03,260
There were quantum sessions.
577
00:30:03,770 --> 00:30:05,679
So, there’s a lot of interesting stuff there that I
578
00:30:05,680 --> 00:30:08,629
am, in fact, saving for a future episode, hint, hint.
579
00:30:09,120 --> 00:30:11,429
But if you want the real fast summary, there’s
580
00:30:11,429 --> 00:30:13,110
meaningful stuff going on with quantum.
581
00:30:13,160 --> 00:30:16,310
It’s super serious, but honestly, enterprise-level stuff
582
00:30:16,310 --> 00:30:18,690
is not going to become a serious issue until around 2030.
583
00:30:19,550 --> 00:30:24,580
Which might sound like a lot until you remember that Jurassic Park came out
584
00:30:25,400 --> 00:30:29,300
in 1993, and you don’t even want to do the math on how long ago that was.
585
00:30:29,940 --> 00:30:32,269
But it wasn’t ten minutes ago like I think it is.
586
00:30:33,099 --> 00:30:36,970
Ned: Chris, there’s all these formative albums that
587
00:30:36,970 --> 00:30:40,340
I really enjoyed from the ’90s that are now starting
588
00:30:40,340 --> 00:30:44,569
their 30th anniversary tours, and it just stings, man.
589
00:30:44,980 --> 00:30:46,300
It just [laugh] —it hurts.
590
00:30:46,820 --> 00:30:47,400
Don’t like it.
591
00:30:47,790 --> 00:30:50,659
Chris: No… no, I think I told you, and this is going to be a
592
00:30:50,660 --> 00:30:53,199
reference that only makes sense to people in the Philadelphia area,
593
00:30:53,199 --> 00:30:59,400
I died a little bit inside when I heard Alanis Morissette on 98.1.
594
00:30:59,400 --> 00:30:59,540
Ned: [breathes out] ohhh.
595
00:30:59,610 --> 00:30:59,639
Chris: [laugh]
596
00:30:59,640 --> 00:31:00,940
Ned: Hey, thanks for listening or something.
597
00:31:00,940 --> 00:31:03,110
I guess you found it worthwhile enough if you’ve made it
598
00:31:03,179 --> 00:31:05,959
all the way to the end, so congratulations to you, friend.
599
00:31:06,280 --> 00:31:07,630
You accomplished something today.
600
00:31:07,670 --> 00:31:11,204
Now, you can sit on the couch, tuned into Alanis Morissette’s
601
00:31:11,359 --> 00:31:14,560
“Jagged Little Pill,” and relax for the rest of the day.
602
00:31:14,860 --> 00:31:15,639
You’ve earned it.
603
00:31:16,090 --> 00:31:18,610
You can find more about this show by visiting our LinkedIn
604
00:31:18,640 --> 00:31:21,399
page, just search ‘Chaos Lever,’ go to the website,
605
00:31:21,399 --> 00:31:26,590
pod.chaoslever.com, or you can leave us feedback and comments,
606
00:31:26,590 --> 00:31:30,729
which we will read during the Tech News of the Week portion.
607
00:31:30,820 --> 00:31:31,820
If you say it’s okay.
608
00:31:32,000 --> 00:31:34,940
Chris: Yeah, if you don’t want us to, we’ll just read it, and that’ll be it.
609
00:31:35,290 --> 00:31:35,760
Ned: Yeah.
610
00:31:35,910 --> 00:31:38,429
Chris: Or if you don’t want us to read it, we can do that too.
611
00:31:38,639 --> 00:31:41,430
Ned: [laugh] . You can just write it and never send it.
612
00:31:41,550 --> 00:31:42,300
It’s up to you.
613
00:31:43,620 --> 00:31:46,369
We’ll be back next week to see what fresh hell is upon us.
614
00:31:46,430 --> 00:31:47,929
Ta-ta for now.
615
00:31:55,640 --> 00:31:57,420
Chris: Seventeen sessions I went to.
616
00:31:57,990 --> 00:31:58,910
Ned: That’s impressive.
617
00:31:59,389 --> 00:32:01,560
Chris: And I remembered so very little.
618
00:32:02,139 --> 00:32:03,310
I did take a lot of notes though.
619
00:32:03,940 --> 00:32:04,899
But then I lost them.
620
00:32:05,170 --> 00:32:06,250
Ned: Less impressive [laugh]