I’m about to commit the single most egotistical act of my blogging/analyst career. I’m going to make up my own law and name it after myself. Hopefully I’m almost as smart as everyone says I think I am.
I’ve been talking a lot, and writing a bit, about the intersection of psychology and security. One example is my post on the anonymization of losses; another is the one on noisy vs. quiet security threats.
Today I read a post by RSnake on the effectiveness of user training and security products, which was inspired by a great paper from Microsoft: So Long, And No Thanks for the Externalities: The Rational Rejection of Security Advice by Users.
I think we can combine these thoughts into a simple ‘law’:
The rate of user compliance with a security control is directly proportional to the pain of the control vs. the pain of non-compliance.
We need some supporting definitions:
- The rate of compliance is the probability that the user will follow a required security control, rather than ignore or actively circumvent it.
- The pain of the control is the time it adds to an established process, and/or the time required to learn and implement a new one.
- The pain of non-compliance includes both the consequences (financial, professional, or personal) and the probability of experiencing them. Consequences exist on a spectrum, with financial as the most impactful and social as the least.
- The pain of non-compliance must be tied to the security control so the user understands the cause/effect relationship.
I could write it out as an equation, but then we’d all make up magical numbers instead of understanding the implications.
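That said, here’s a toy sketch of the shape of the relationship. Every variable name and number in it is a placeholder I made up for illustration (exactly the magical numbers I just warned about), so treat it as a picture of the proportionality, not a model to feed real data:

```python
# Toy sketch of the 'law' -- all names and numbers are made-up
# placeholders, not real measurements.

def compliance_rate(control_pain: float,
                    consequence_pain: float,
                    perceived_probability: float) -> float:
    """Rough 0-1 likelihood that a user follows a security control.

    control_pain          -- time/effort the control adds to a process
    consequence_pain      -- severity of the consequences of non-compliance
    perceived_probability -- user's perceived chance of suffering them

    control_pain is assumed to be positive.
    """
    noncompliance_pain = consequence_pain * perceived_probability
    # Compliance climbs as the pain of non-compliance outweighs the
    # pain of the control itself.
    return noncompliance_pain / (noncompliance_pain + control_pain)

# A painful control backed by a rarely felt penalty: users ignore it.
print(compliance_rate(control_pain=10, consequence_pain=8,
                      perceived_probability=0.1))   # ~0.07
# Same control, but the consequences are near-certain and personal.
print(compliance_rate(control_pain=10, consequence_pain=8,
                      perceived_probability=0.9))   # ~0.42
```

The exact function doesn’t matter; the direction does. Compliance rises as the perceived pain of non-compliance outweighs the pain of the control, and collapses when enforcement is rare enough that users never expect to feel it.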
Psychology tells us that people only care about things that personally affect them, and that fuzzy principles like “the good of the company” rank low on the importance scale. It also tells us that immediate risks hold our attention far more than long-term risks, and that we rapidly de-prioritize both high-impact, low-frequency events and high-frequency, low-impact events. Economics teaches us how to evaluate these factors and use external influences to guide behavior at scale.
Here’s an example:
Currently, most security incidents are handled out of a central response budget rather than paid for by the business units involved. Economics tells us we can likely increase the rate of compliance with security initiatives if business units have to pay for the response costs they incur, forcing them to directly experience the pain of a security incident.
I suspect this is one of those posts that’s going to be edited and updated a bunch based on feedback…
7 Replies to “Mogull’s Law”
Rich,
Small correction, if you don’t mind:
“The rate of user compliance with a security control is directly proportional to the pain of the control vs. the perception of the pain of non-compliance.”
The user rarely knows the real consequences and probabilities. So even if the pain of non-compliance is huge, the user may perceive it as not that bad.
Sorry for the late addition; I’ve been too busy to post earlier.
“Mogull’s Law” is a fine name, but if it were up to me I might call it “Hobbes’ Law” or “Leviathan’s Law”, after Thomas Hobbes and his opus Leviathan. His theory of the State was based on social contracts and the forces necessary to enforce compliance with those contracts; enforcement often took the form of punishment or pain.
That aside, Mogull’s Law also shows the fundamental deficiency of *any* compliance regime for security. Compliance is motivated by pain avoidance; it completely ignores the upside motivations of people and organizations, such as aspirations and taking initiative. While some may argue that ordinary user compliance doesn’t require any upside motivation, I strongly disagree.
Every user needs to constantly monitor their information and systems for anything that doesn’t look right. They need to take the initiative to ask for help, share what they’ve learned, and so on. And over months and years, they need to keep updating their knowledge of security and risk.
At managerial, executive, or organization-to-organization levels, it’s *very* important to mobilize creativity and initiative. Examples include data sharing, metrics sharing, research collaboration, threat intelligence, and other forms of continuous learning and agility.
More generally, organizations and cultures that are exclusively compliance-oriented tend to be very poor at organizational learning and agility, because they draw only on pain-avoidance motives.
Hmmm. Good point, Dave, and now that I look at it, I don’t know if you or Rich got it right. Regardless, I get the gist of ye and agree with this concept.
My previous employer was working towards a fee-based delivery program. The overarching gov’t agency dictated the baseline services that had to be provided, and funded them. Our local org went way above and beyond that baseline, on only the original budget plus what the local clientele would pay in a general budget contribution. Moving to a fee-based SLA would greatly reduce the number of unimportant requests to the IT group.
Security incidents would be a whole new subject… particularly if the fee included realistic amounts for the time and materials of incident handling specialists! One instance of plugging a personal computer into a gov’t network and the department would be bankrupt for the year. 🙂 Ok, that might be an exaggeration, but I like this concept. A lot.
David,
I’m too fuzzy-headed to think it through, and you are smart, so I will go edit.
Augusto,
Actually, I mean it the way I wrote it: they have to experience the pain… without that, they won’t ever care about the probabilities.
But I think you are right: it has to be the _perceived_ pain and probability. Starting on the first edit…
I’ll jump in first since I’ve been really quiet lately. I think you mean inversely proportional. After all, if the pain of the control is super low and the pain of non-compliance is really high, then the rate of compliance should also be really high, right? In that situation you have an inverse relationship.