So you have defined your peer groups, done your analysis, and spent a bunch of time communicating what you found to your security program’s key stakeholders. Now it’s time to shift focus internally. One of the cool things about security metrics and benchmarks is the ability to analyze trends over time and use that data to track progress against your key goals. Imagine that: managing people and programs based on data, not just gut feel.
Besides being able to communicate much more authoritatively how you are doing on security, you can also focus on continuously improving your activities. That is a good thing to do, particularly if you want to keep your job. We will harp on the importance of gathering data and benchmarks consistently over a long period of time, and then on getting sustained value from the benchmark by using it to mark progress toward a better, more secure environment.
Programs and feedback loops
We don’t want to put the cart before the horse, so let’s start at a high level by describing how to structure the security program so it’s focused on improvement rather than mere survival. Here are the key steps, with a small sketch of the tracking loop after the list:
- Define success (and get buy-in up the management stack)
- Distill success characteristics into activities that will result in success
- Quantify those activities, determine appropriate metrics, and set goals for those metrics
- Set objectives for each activity and communicate those objectives
- Run your business; gather your metrics
- Analyze metrics; report against success criteria/objectives
- Identify gaps, address issues, and reset objectives accordingly
- Wash, rinse, repeat
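To make the gather/analyze/report steps a bit more concrete, here is a minimal sketch in Python of tracking metrics against the goals you set, assuming you have already agreed on metrics and targets with your stakeholders. The metric names, goal values, and observations below are purely hypothetical illustrations, not recommendations.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    goal: float             # target value agreed with stakeholders (hypothetical)
    higher_is_better: bool  # e.g. patch coverage (higher) vs. time to detect (lower)

def report(metrics: dict[str, Metric], observed: dict[str, list[float]]) -> None:
    """Compare the latest observation for each metric against its goal and show the trend."""
    for name, metric in metrics.items():
        history = observed.get(name, [])
        if not history:
            print(f"{name}: no data collected yet")
            continue
        latest = history[-1]
        trend = latest - history[0]
        on_track = latest >= metric.goal if metric.higher_is_better else latest <= metric.goal
        status = "on track" if on_track else "gap -- reset objectives or address the issue"
        print(f"{name}: latest={latest} goal={metric.goal} trend={trend:+} -> {status}")

# Hypothetical metrics and quarterly observations, for illustration only
metrics = {
    "patch_coverage_pct": Metric("patch_coverage_pct", goal=95.0, higher_is_better=True),
    "mean_time_to_detect_hrs": Metric("mean_time_to_detect_hrs", goal=24.0, higher_is_better=False),
}
observed = {
    "patch_coverage_pct": [88.0, 91.0, 93.5],
    "mean_time_to_detect_hrs": [72.0, 40.0, 30.0],
}

report(metrics, observed)
```

In a real program you would feed this from whatever tooling collects your benchmark data and report on whatever cadence your objectives are set to, but the loop is the same: compare the latest numbers to the goal, look at the trend, and flag the gaps.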
Digging deeply into security program design and operation would be out of scope, so we’ll just refer you to Mike’s methodology on building a security program: The Pragmatic CSO.
Communicating to the troops
In our last post, on Benchmarking Communication Strategies, we talked about communicating with key stakeholders in the security process. One of the primary constituencies is your security team, so let’s revisit that discussion and why it matters. Your security team needs to understand the process, how benchmark data will be used to determine success, and what the expectations will be.
Don’t be surprised to get some push-back on this new world order, and it could be quite significant. Just put yourself in your team’s shoes for a moment. For most of their careers these folks have been evaluated on squishy, subjective assessments of effectiveness and effort. Now you want to move them to something more quantified, where they can neither run nor hide. Top performers should not be worried at all, and that’s a key point to get across.
So exercise some patience in getting folks’ heads in the right spot, but remember that you aren’t negotiating here. Part of the justification for investing (rather significantly) in metrics and benchmarks is to leverage that data in operations. You can’t do that if the data isn’t used to evaluate performance, both good and bad.
It’s not a tool, it’s a lifestyle
Another point to keep in mind is that this initiative isn’t a one-time thing. It’s not something you do for an assessment and then stick in a drawer the moment the auditor leaves the building. Benchmarking, done well, becomes a key facet of managing your security program. The data becomes your North Star, providing a way to map out objectives and ensure you stay on course to reach them. We have seen organizations start with metrics as a means to an end, and later recognize that metrics can change everything about how operational efforts are managed, perceived, and supported within the organization. The lack of security data has hindered acceptance of benchmarking in the security field, but it’s time to revisit that.
As usual, there are some caveats to data-driven management. No one size fits all, and we see plenty of cultural variation, which may require you to take a less direct path to the benchmark promised land. But there can be no question that quantifying activity is more effective than not quantifying it.
If you have gotten this far, successfully implemented this kind of benchmark, and institutionalized it as a management tool, you are way ahead of the game. But what’s next? Digging into deeper, more granular metrics, such as those we defined as part of our Project Quant research. That’s where we will pick up next.