Assessing Risk Assessment

Wednesday, February 20, 2019

Computer security expert Bruce Schneier wrote some time ago about how easy it can be to accuse others of misjudging risks, even though most people actually have good intuitions about risk:

You may have excellent mountaineering advice, but I can safely ignore it. (Image by aatlas, via Pixabay, license).
This struck me as I listened to yet another conference presenter complaining about security awareness training. He was talking about the difficulty of getting employees at his company to actually follow his security policies... "We have to make people understand the risks," he said.

It seems to me that his co-workers understand the risks better than he does. They know what the real risks are at work, and that they all revolve around not getting the job done. Those risks are real and tangible, and employees feel them all the time. The risks of not following security procedures are much less real. Maybe the employee will get caught, but probably not. And even if he does get caught, the penalties aren't serious.

Given this accurate risk analysis, any rational employee will regularly circumvent security to get his or her job done. That's what the company rewards, and that's what the company actually wants.

"Fire someone who breaks security procedure, quickly and publicly," I suggested to the presenter. "That'll increase security awareness faster than any of your posters or lectures or newsletters." If the risks are real, people will get it.
Coming across this post again after listening to one of Alex Epstein's podcasts on human flourishing prompted an interesting connection. (I don't specifically recall which episode it was -- my listening time is currently limited mostly to time I set aside for running errands around town.)

One of Epstein's major themes is how to evaluate the many claims to knowledge that one encounters, and two obstacles he has named to doing so are (a) that experts often don't explain things well, and (b) that the importance of many such claims is exaggerated. Here, we have an expert quite possibly not being clear enough in his explanation (of, to be fair, a topic that is difficult to begin with) addressing an audience jaded by lots of bad or over-hyped security advice. Schneier's advice cuts through both problems, and he ends his post by essentially advising computer security professionals to be sure they understand risk from their audience's perspective before giving their recommendations.

This is good communications advice, but it can also be turned around and made into good thinking advice regarding any claim to new knowledge one encounters. As with any claim, one should try to evaluate it as knowledge by asking oneself how well it integrates (or doesn't) with the rest of one's knowledge. But, assuming the claim is knowledge, how urgent is acting on it? That depends on integrating it within the full context of the rest of one's values. It can be easy to get carried away with new knowledge and forget to do this -- to assess one's own risk of not applying the knowledge. (The most obvious costs of unnecessarily acting on new knowledge are wasted time and effort.) If your primary use of a pen drive is to transfer music or video files between a couple of devices you own, the urgency of encrypting the data is probably zero. If you work in a nuclear power plant and use one at all, it is almost certainly for work, and you probably should be fired if its contents aren't encrypted. With any claim to knowledge, one faces two questions: (1) Is it true? and (2) How important is it?
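(For the curious: encrypting a file before it ever touches a pen drive requires nothing exotic. Here is a minimal sketch using the widely available openssl command-line tool -- the file names and passphrase are made up for illustration, and in practice the passphrase would never be typed into a script like this.)

```shell
# Create a sample file standing in for a sensitive work document.
printf 'quarterly maintenance notes\n' > report.txt

# Encrypt it with AES-256; -pbkdf2 hardens the passphrase-derived key,
# and -salt ensures the same input never encrypts to the same output.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in report.txt -out report.txt.enc \
    -pass pass:correct-horse-battery-staple

# Only report.txt.enc would go onto the pen drive.
# Later, -d with the same passphrase reverses the operation.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in report.txt.enc -out report-restored.txt \
    -pass pass:correct-horse-battery-staple
```

The encrypted file is useless to whoever finds the lost drive; the cost to the employee is one extra command in each direction.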

-- CAV

4 comments:

Snedcat said...

Yo, Gus, you write, If your primary use of a pen drive is to transfer music or video files between a couple of devices you own, the urgency of encrypting the data is probably zero -- if you work in a nuclear power plant, and use one at all, it is almost certainly for work, and you probably should be fired for it not being encrypted.

Great! You have a whole episode of The Simpsons in embryo right there! Heck, it's probably already been done. But if it hasn't, demand royalties when they make it, and be sure to give me a cut.

Gus Van Horn said...

Snedcat,

I don't know one way or the other about the series, but there is this.

Gus

Jennifer Snow said...

Another important note: when you're assessing risk, one of the factors people take into account is whether or not the person talking about the risk ACTS like they take it seriously.

There's a term for it in fiction, the "informed attribute", when the author tells you something (like the character is a genius), but the way the character acts contradicts it.

One of the major problems I have with businesses is that I am hypersensitive to people not "walking the talk". It leaves me baffled about how I'm actually expected to do my job.

Gus Van Horn said...

Absolutely, and I find the parallel to fiction a helpful way to think about the problem, since it, too, is a type of bad communication.