Mogull’s Law

By Rich
I’m about to commit the single most egotistical act of my blogging/analyst career. I’m going to make up my own law and name it after myself. Hopefully I’m almost as smart as everyone says I think I am.
I’ve been talking a lot, and writing a bit, about the intersection of psychology and security. One example is my post on the anonymization of losses; another is the one on noisy vs. quiet security threats.
Today I read a post by RSnake on the effectiveness of user training and security products, which was inspired by a great paper from Microsoft: So Long, And No Thanks for the Externalities: The Rational Rejection of Security Advice by Users.
I think we can combine these thoughts into a simple ‘law’:
The rate of user compliance with a security control is directly proportional to the pain of non-compliance relative to the pain of the control.
We need some supporting definitions:
- Rate of compliance is the probability that the user will follow a required security control, as opposed to ignoring or actively circumventing it.
- The pain of the control is the time added to an established process, and/or the time to learn and implement a new process.
- The pain of non-compliance includes the consequences (financial, professional, or personal) and the probability of experiencing those consequences. Consequences exist on a spectrum – with financial as the most impactful, and personal/social as the least.
- The pain of non-compliance must be tied to the security control so the user understands the cause/effect relationship.
I could write it out as an equation, but then we’d all make up magical numbers instead of understanding the implications.
Psychology tells us that people only care about things which personally affect them; fuzzy principles like “the good of the company” rank low on the importance scale. Immediate risks hold our attention far more than long-term risks, and we rapidly de-prioritize both high-impact, low-frequency events and high-frequency, low-impact events. Economics teaches us how to evaluate these factors and use external incentives to guide behavior at scale.
Here’s an example:
Currently, most security incidents are managed out of a central response budget, rather than business units paying the response costs themselves. Economics tells us we can likely increase the rate of compliance with security initiatives if business units have to pay for the response costs they incur, forcing them to directly experience the pain of a security incident.
I suspect this is one of those posts that’s going to be edited and updated a bunch based on feedback…