Yesterday Twitter revealed they had accidentally stored plain-text passwords in some log files. There was no indication the data had been accessed, and users were advised to change their passwords. There was no known breach, but Twitter went public anyway, and was excoriated in the press and… on Twitter.

This is a problem for our profession and industry. We get locked into a cycle where any public disclosure of a breach or security mistake results in:

  • People ripping the organization apart on social media without knowing the facts.
  • Vendors issuing press releases claiming their product would have prevented the issue, without knowing the facts.
  • Press articles focusing on the worst case scenario without any sort of risk analysis… or facts.
  • Plenty of voices saying how simple it is to prevent the problem, without any concept of the complexity or scale of even simple controls (remember kids, simple doesn’t scale).

To be clear, there are cases where organizations are negligent and try to cover up their errors. When a press release says things like “very sophisticated attack”, infosec fairies deservedly lose their wings. But more often than not we focus on blame rather than cause, both in public and in internal investigations.

This is a problem many industries have faced; two in particular have performed extensive research and adopted a concept called Just Culture. It’s time for security to formally adopt Just Culture, including adding it to certifications and training programs.

Aviation and healthcare are two professions/industries which use Just Culture, to different degrees. My background and introduction are on the healthcare side, so that’s where I draw from.

First, read this paper available through the National Institutes of Health: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3776518/.

The focus in Just Culture is to identify and correct the systemic cause, not to blame the individual. Here are some choice quotes:

People make errors. Errors can cause accidents. In healthcare, errors and accidents result in morbidity and adverse outcomes and sometimes in mortality.

One organizational approach has been to seek out errors and identify the responsible individual. Individual punishment follows. This punitive approach does not solve the problem. People function within systems designed by an organization. An individual may be at fault, but frequently the system is also at fault. Punishing people without changing the system only perpetuates the problem rather than solving it.

A just culture balances the need for an open and honest reporting environment with the end of a quality learning environment and culture. While the organization has a duty and responsibility to employees (and ultimately to patients), all employees are held responsible for the quality of their choices. Just culture requires a change in focus from errors and outcomes to system design and management of the behavioral choices of all employees.

In a just culture, both the organization and its people are held accountable while focusing on risk, systems design, human behavior, and patient safety.

The focus is on systemic risk first, and the individual… later. This is something we face in healthcare/rescue every day, where many errors result from the system more than the person. For example, in some prehospital systems it isn’t uncommon to have two medications with vastly different effects in very similar packaging, resulting in medication errors which can be fatal. The answer isn’t better training but better packaging.

Fix the system – don’t expect perfect behavior.

Let’s apply this to Twitter. Plain-text passwords were stored in logs. This is bad, but there are many ways it could have happened. Think of all the levels of logging and software components they have, and all the places passwords might have fallen into logs. Using a Just Culture approach, we should reward Twitter for their honesty and learn what techniques they used to detect the exposed data, and what allowed it to be saved in those logs, undiscovered, for so long.
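To make that concrete: passwords usually land in logs through convenience, not malice, e.g., a developer logging an entire request payload while debugging. Below is a minimal Python sketch of a systemic fix rather than an individual one: a filter in the logging pipeline that scrubs sensitive fields before anything reaches disk. The field names and setup are hypothetical illustrations, not anything Twitter actually did.

```python
import logging

# Hypothetical list of field names treated as sensitive; a real system
# would maintain and test its own.
SENSITIVE_KEYS = {"password", "passwd", "secret", "token"}

def scrub(value):
    """Recursively replace values of sensitive keys in dicts."""
    if isinstance(value, dict):
        return {k: ("[REDACTED]" if k.lower() in SENSITIVE_KEYS else scrub(v))
                for k, v in value.items()}
    return value

class RedactingFilter(logging.Filter):
    """Scrub sensitive fields from log record arguments before formatting."""
    def filter(self, record):
        if isinstance(record.args, dict):        # logger.info("msg %s", {...})
            record.args = scrub(record.args)
        elif isinstance(record.args, tuple):
            record.args = tuple(scrub(a) for a in record.args)
        return True  # never drop the record, only scrub it

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("auth")
logger.addFilter(RedactingFilter())

# The systemic failure mode: someone logs a whole payload "for debugging"
# and a credential rides along into the log files.
payload = {"username": "alice", "password": "hunter2"}
logger.info("login attempt: %s", payload)
# -> INFO:auth:login attempt: {'username': 'alice', 'password': '[REDACTED]'}
```

The point of the sketch is the design choice: the safeguard lives in the logging pipeline itself, so no individual engineer has to remember to redact every payload they log.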

What system issues caused the problem, and how can we prevent them moving forward? Not “Twitter was stupid and got hacked” (because apparently they weren’t).

Just Culture is about fostering an open culture of safety where mistakes – even individual mistakes – are used to improve overall system resilience. It’s our time.
