I was pleased to see the next version of the Center for Internet Security’s Consensus Security Metrics earlier this week. Even after some groundbreaking work in this area on building metrics programs and visualizing the data, most practitioners still can’t answer the simple question: “How good are you at security?”

Of course that is a loaded question because ‘good’ is a relative term. The real point is to figure out some way to measure improvement, at least operationally. Given that we Securosis folks tend to be quant-heads, and do a ton of research defining very detailed process maps and metrics for certain security operations (Patch Management Quant and Network Security Ops Quant), we get it. In fact, I’ve even documented some thoughts on how to distinguish between metrics that are relevant to senior folks and metrics that matter to the people who need to manage (improve) operations.

So the data is there, and I have yet to talk to a security professional who isn’t interested in building a security metrics program, so why do so few of us actually do it? It’s hard – that’s why. We also need to acknowledge that some folks don’t want to know the answer. You see, as long as security is deemed necessary (and compliance mandates pretty well guarantee that) and senior folks don’t demand quantitative accountability, most folks won’t volunteer to provide it. I know, it’s bass-ackward, but it’s true. As long as a lot of folks can skate through kind of just doing security stuff (and hoping to not get pwned too much), they will.

So we have lots of work to do to make metrics easier and useful to the practitioners out there. From a disclosure standpoint, I was part of the original team at CIS that came up with the idea for the Consensus Metrics program and drove its initial development. Then I realized consensus metrics actually involve consensus, which is really hard for me. So I stepped back and let the folks with the patience to actually achieve consensus do their magic. The first version of the Consensus Metrics hit about a year ago, and now they’ve updated it to version 1.1.

In this version CIS added a Quick Start Guide, and it’s a big help. The full document is over 150 pages and a bit overwhelming. The Quick Start is less than 20 pages and defines the key metrics as well as a balanced scorecard to get things going. The Balanced Scorecard involves 10 metrics, broken out across:

  1. Impact: Number of Incidents, Cost of Incidents
  2. Performance by Function: Outcomes: Configuration Policy Compliance, Patch Policy Compliance, Percent of Systems with No Known Severe Vulnerabilities
  3. Performance by Function: Scope: Configuration Management Coverage, Patch Management Coverage, Vulnerability Scanning
  4. Financial Metrics: IT Security Spending as % of IT Budget, IT Security Budget Allocation

As you can see, this roughly equates security with vulnerability scanning, configuration, and patch management. Obviously that’s a dramatic simplification, but it’s somewhat plausible for the masses. At least there isn’t a metric on AV coverage, right? The full set of metrics adds depth in the areas of incident management, change management, and application security. Truth be told, there are literally thousands of discrete data points you could collect (and we have defined many of them via our Quant research), but that doesn’t mean you should. I believe the CIS Consensus Security Metrics represent an achievable data set to start collecting and analyzing.
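To make this a little more concrete, here is a minimal sketch of how a couple of the Performance by Function numbers (patch management coverage and patch policy compliance) might fall out of an asset inventory you probably already have. The field names and data are purely illustrative, not taken from the CIS documents.

```python
# Hypothetical sketch: computing patch coverage and patch compliance
# from a simple asset inventory. Field names and data are made up.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    managed_by_patch_tool: bool   # is the asset under patch management at all?
    fully_patched: bool           # does it currently meet your patch policy?

inventory = [
    Asset("web-01", True, True),
    Asset("web-02", True, False),
    Asset("db-01", True, True),
    Asset("legacy-app", False, False),  # not covered by the patch tool at all
]

def percent(part: int, whole: int) -> float:
    return 100.0 * part / whole if whole else 0.0

# Coverage: how much of the environment the patch process even touches.
patch_coverage = percent(
    sum(a.managed_by_patch_tool for a in inventory), len(inventory)
)

# Compliance: of the covered assets, how many actually meet the patch policy.
covered = [a for a in inventory if a.managed_by_patch_tool]
patch_compliance = percent(sum(a.fully_patched for a in covered), len(covered))

print(f"Patch management coverage: {patch_coverage:.0f}%")   # 75%
print(f"Patch policy compliance:  {patch_compliance:.0f}%")  # 67%
```

The point isn’t the code, it’s that these scorecard numbers come from data most shops already have sitting in their patch and configuration tools; the work is deciding to pull it together and track it over time.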

One of the fundamental limitations now is that there is no way to know how well your security program and outcomes compare against other organizations of similar size and industry. You may share some anecdotes with your buddies over beers, but nothing close to a quantitative benchmark with a statistically significant data set is available. And we need this. I’m not the first to call for it either, as the New School guys have been all over it for years. But as Adam and Andrew point out, we security folks have a fundamental issue with information sharing that we’ll need to overcome to ever make progress on this front.

Sitting here focusing on what we don’t have is the wrong thing to do. We need to focus on what we do have, and that’s a decent set of metrics to start with. So download the Quick Start Guide and start collecting data. Obviously if you have some automation driving some of these processes, you can go deeper sooner – especially with vulnerability, patch, and configuration management.

The most important thing you can do is get started. I don’t much care where you start – just that you start. Don’t be scared of the data. Data will help you identify issues. It will help you pinpoint problems. And most importantly, data will help you substantiate that your efforts are having an impact. Although Col. Jessup may disagree (YouTube), I think you can handle the truth. And you’ll need to if we ever want to make this security stuff a real profession.
