
Illusory Correlation and Security

Fear sells. Fear of missing out, fear of being an imposter, fear of crime, fear of injury, fear of sickness ... we can all think of times when people we know (or worse, people in the throes of the madness of crowds) have made really bad decisions because they were afraid of something. Bruce Schneier has documented this a number of times. For instance: "it's smart politics to exaggerate terrorist threats" and "fear makes people deferential, docile, and distrustful, and both politicians and marketers have learned to take advantage of this." There is even a paper comparing the risk of dying in a bathtub to the risk of dying in a terrorist attack: bathtubs win.

But while fear sells, the desire to appear unafraid also sells, and it conditions people's behavior more than we might think. Of surveillance, for instance, we often say, "if you have done nothing wrong, you have nothing to hide": a bit of meaningless bravado. What does this attitude, "I don't have anything to worry about," do to our security?

Several attempts at researching this phenomenon have come to the same conclusion: average users will often deliberately avoid tools they see used by someone they perceive as paranoid. According to this body of research, people will not use password managers, for example, because using one is seen as a mark of paranoia. The likely mechanism is illusory correlation: we associate an action with a kind of person (only bad or frightened people would want to carry a weapon), and because we do not want to be that kind of person, we avoid the action, even when the action makes sense.

This is just the flip side of "fear sells," of course. Just as we overestimate the chance of a terrorist attack affecting our lives in a direct, personal way, we underestimate the chance of more mundane deaths, like drowning in a tub, because we think we can control the situation, or because we don't think we'll be targeted that way, or because we want to signal to the world that we "aren't one of those people."
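The base-rate reasoning behind the bathtub comparison is simple arithmetic: divide annual deaths from a cause by the population and compare. The sketch below uses illustrative placeholder counts (not figures from the paper mentioned above) purely to show how the comparison works; the population figure and death counts are assumptions for illustration only.

```python
# Rough base-rate comparison of two risks, in the spirit of the
# bathtub-vs-terrorism comparison. All numbers below are illustrative
# placeholders, NOT figures taken from the paper cited in the article.

def annual_risk(deaths_per_year: float, population: float) -> float:
    """Per-person probability of dying from a given cause in one year."""
    return deaths_per_year / population

US_POPULATION = 330_000_000   # rough round figure, assumption

bathtub_deaths = 350          # hypothetical annual count, placeholder
terrorism_deaths = 10         # hypothetical annual count, placeholder

bathtub_risk = annual_risk(bathtub_deaths, US_POPULATION)
terrorism_risk = annual_risk(terrorism_deaths, US_POPULATION)

print(f"bathtub:   about 1 in {1 / bathtub_risk:,.0f}")
print(f"terrorism: about 1 in {1 / terrorism_risk:,.0f}")
print(f"with these numbers, the tub is ~{bathtub_risk / terrorism_risk:.0f}x riskier")
```

The point is not the specific numbers but the shape of the reasoning: rational risk assessment starts from base rates, which is exactly the step the emotional shortcuts described above skip.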

Even knowing this is true, however, how can we counter it? How can we convince people to assess risks rationally rather than emotionally? How can we convince people that the perception of control should not affect their assessment of personal security or safety?

Simplifying the design and use of the systems we build is one perhaps not-so-obvious step. The more security is simply "automatic," the more users will become accustomed to deploying it in their everyday lives. Another step is to stop trying to scare people into using these technologies.

In the meantime, be aware that if you are an engineer, using a technology "as an example" to others can backfire, causing people to avoid the very technologies you are trying to promote.

By Russ White, Infrastructure Architect at Juniper Networks


