Enoch’s Thoughts

May 3, 2011

Security and cognitive biases

Filed under: Uncategorized — etblog @ 1:04 pm

We are hosting a family of two cats, two dogs, and two humans this week. The cats actually arrived last Thursday, three days before the rest of the bunch. Their behavior surprised me, even though I know cats pretty well. For the first three days, they had a quiet house completely to themselves, and yet they were very shy and skittish, almost totally unapproachable.

But when the rest of the family, including the two very active dogs, showed up, the cats underwent an almost immediate transformation, from shy to confident. They no longer ran from our touch, but would readily let us pet them. They roamed into places they had previously avoided.

For some reason, the presence of their “mommy and daddy,” and their canine “sisters” made them feel more secure than they felt in a quiet home they had all to themselves.

Many of our reactions to modern life seem just as strange to me as the cats’ behavior. Here are a couple of anecdotes for your consideration:

  • I know someone who carefully blacks out her name and address before throwing out junk mail. When I pointed out that her name and address are readily available public information, she said, “But that doesn’t mean I have to make it easy for them, putting it right there in front of them. Besides, it makes me feel safer.” I’m not sure who “them” is, why they would have access to her trash, or what they could do with her name and address.
  • I also know someone who refuses to wear seat belts. She fears being trapped in a burning car more than she fears being violently projected through the front windshield, even though the odds of the latter are much greater than the former.
  • I know someone who refuses to transact any business via the internet, for fear someone will be able to get all her money somehow. As it turns out, several recent breaches of “secure” corporate databases have made that fear considerably more understandable, perhaps even justified.
  • I know several people with burglar alarms who have decided that the embarrassment of disturbing the neighbors with a false alarm outweighs the benefits of activating the alarm, so they just never turn it on. And it is not unreasonable to assume that the alarm company sign in the yard has as much actual security effect as the alarm system itself.

Bruce Schneier is a nationally recognized cryptographer, computer security specialist, writer and expert on the topic of security. He gave a TED talk last year that was recently posted by the TED folks. His talk explains much of what I observe about our reactions to threats, danger and security. He also has something in common with Chuck Norris.

Here are some points Bruce discussed in his talk.

The term “security” can really refer to two different things: the feeling and the reality. It is possible to feel secure without actually being secure, and it is possible to be secure without feeling secure. Of course, what we all want is to both feel and actually be secure.

Security is almost always a tradeoff. You trade things like money, convenience, capabilities, or fundamental liberties for an increase in security.

And the question then becomes not whether our security efforts make us safer (they almost all have some positive effect), but whether they are worth the tradeoff.

There are rarely any clearly defined right or wrong answers, either. “Should I get a burglar alarm?” Well, it depends. What is your house like? What is your neighborhood like? How valuable are your belongings? How much risk of theft are you willing to accept? Will you even remember to activate it?
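
Just to make that “it depends” concrete, here is a tiny back-of-the-envelope sketch. Every number in it is invented purely for illustration; the point is the expected-loss-versus-cost framing, not the figures.

```python
# Back-of-the-envelope sketch of the burglar-alarm tradeoff.
# All numbers are made-up placeholders, not real burglary statistics.

def expected_annual_loss(p_burglary: float, avg_loss: float) -> float:
    """Expected yearly loss = chance of a burglary * average loss if it happens."""
    return p_burglary * avg_loss

p_without_alarm = 0.02      # guessed 2% chance of a burglary per year
p_with_alarm    = 0.01      # guess: the alarm halves that chance
avg_loss        = 5_000.0   # guessed average loss per burglary, in dollars
alarm_cost      = 360.0     # guessed yearly monitoring fee

benefit = expected_annual_loss(p_without_alarm, avg_loss) - \
          expected_annual_loss(p_with_alarm, avg_loss)

print(f"Expected benefit: ${benefit:.0f}/year versus cost: ${alarm_cost:.0f}/year")
# With these made-up numbers the alarm saves $50/year in expected loss but
# costs $360/year -- which is exactly why the answer is "it depends."
```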

In general, people have a natural intuition for security decisions. The tradeoffs around double-locking the door of your hotel room, or buckling your seat belt, are very reasonable to most people.

Security decisions can also be viewed from an evolutionary perspective. The rabbit in the field hears a noise, and must make a decision: do I keep eating, or should I flee? Make the wrong decision, and you either starve, or you get eaten.

From that we might conclude that humans are good at making correct security decisions. Unfortunately, we are not. And why not? The short answer is that we opt for the feeling of security rather than the reality.

Throughout most of human history, the perception and the reality of safety have been closely aligned. But one could argue that our reactions are still tuned for living in the East African highlands around 100,000 BC. Those instincts may not be so helpful in contemporary New York City.

We have some key biases that color our perceptions:

  1. We tend to exaggerate spectacular and rare risks, and downplay common risks. That’s why flying still seems more dangerous than driving, despite the statistics.
  2. The unknown seems riskier than the familiar. That’s why we tend to fear that our child will be kidnapped by a stranger, when it is far more likely that a kidnapper will be a relative.
  3. Personified risks are perceived to be greater than anonymous risks. That’s why we used to fear Osama Bin Laden more than terrorists in general.
  4. People underestimate risks in situations they control, and overestimate risks in situations they don’t. If you take up skydiving or smoking, you tend to downplay the risks, compared to a danger that seems outside your control, such as terrorism.

Other cognitive biases also affect our perception of security.

  • One is “availability” – we estimate the probability of something by how easily examples come to mind. If we hear a lot about tiger attacks, but not much about lion attacks, we quite reasonably fear the tiger more. And this worked fine until the invention of the newspaper. Newspapers tend to publicize rare risks out of proportion to reality. Bruce tells people, “If it’s in the news, don’t worry about it.” By definition, things that appear in the news are rare! Common and likely occurrences, such as car crashes or domestic violence, just don’t make the news.
  • Our cultural identity as storytellers produces another bias. [This is one of my favorite hot buttons.] We tend to react to, and remember, stories much more strongly than we do to data and statistics.
  • Another cognitive bias that affects our perception of security is basic innumeracy – we are pretty good at visualizing small numbers and small odds, but not at all good with the very large (a trillion dollars) or the very small (one chance in a million). The quick arithmetic below this list shows how badly those scales defeat intuition.
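
Here is what I mean, in a few lines of arithmetic. The only real figure is the rough 2011 US population; everything else just follows from it.

```python
# Putting "a trillion dollars" and "one chance in a million" on a human scale.
US_POPULATION = 310_000_000    # rough 2011 figure

trillion = 1_000_000_000_000
per_person = trillion / US_POPULATION
print(f"A trillion dollars is about ${per_person:,.0f} for every person in the US")
# -> roughly $3,200 per person

p = 1 / 1_000_000
expected_hits = p * US_POPULATION
print(f"A one-in-a-million event still happens to about {expected_hits:.0f} people")
# -> about 310 people: vanishingly rare for any individual, routine across a population
```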

All of these cognitive biases act as filters to our perception, and the result is that we often feel more or less secure than we actually are.

Schneier goes on in the talk to discuss “security theatre,” the difficulty of evaluating protection schemes for events that don’t happen very often, or for changes that span decades. He explains why and how mental models help us understand complex situations, and where those models tend to come from. He discusses the various agendas pursued by different security stakeholders, and how confirmation bias makes it very difficult for us to change our models.

Finally, he talks about our reliance on others for safety, for example, in pharmaceuticals, air safety, and building codes, and he describes his ultimate goal, which is to provide people with better security models so they can make better choices.

I think I’ll stop without trying to hammer conclusions into your head. I may revisit the topic later – you might want to find a hard hat.

Meanwhile, if you want to check out Bruce Schneier for yourself, here are some links:
