Mapping Business Drivers to Program Metrics
You wouldn’t measure the value of a house by its cubic footage; you could, but it wouldn’t tell you much about what you value in a house. Similarly, you can’t measure the success of a security program with typical IT metrics such as service level agreements (SLAs) and transaction counts, at least not in isolation.
There is a basic three-step process to identify the measures. I have taken a number of clients through this process. It’s never easy, but it is repeatable.
First, prioritize the business drivers for security. These are typically simple phrases like operational efficiency, increased security, reduced risk, enabling business growth, etc.
Second, gain agreement on the program objectives for each of the business drivers. For example, I often hear statements like, “Protect the organization’s reputation and customers’ confidence by protecting confidential and personal data.” This is often an objective for the reduced risk business driver.
Finally, select two to four tangible measurements of each objective. These are your program metrics. We frequently see the same measurement occur for multiple program objectives and across multiple security programs.
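The three steps above boil down to a nested mapping from business drivers to objectives to metrics. Here is a minimal sketch; all of the driver, objective and metric names are hypothetical examples, not prescribed values:

```python
# Hypothetical driver -> objective -> metrics mapping.
# Each objective carries two to four tangible measurements,
# per the selection rule described above.
program_metrics = {
    "reduced risk": {
        "Protect reputation and customer confidence by protecting "
        "confidential and personal data": [
            "percentage of confidential data stores encrypted at rest",
            "number of records exposed in incidents per quarter",
        ],
    },
    "operational efficiency": {
        "Reduce manual effort in access management": [
            "percentage of access requests fulfilled automatically",
            "average access-request fulfillment time in hours",
        ],
    },
}
```

A flat structure like this also makes it easy to spot the same metric appearing under multiple objectives or programs, which, as noted above, happens frequently.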
So after hours of collaboration, with input from your team and external experts, you have your list of program metrics. Excellent, but you’re not done. You need to know the thresholds that should trigger a reaction. Color-based statuses such as red, yellow and green are one way of reporting, but I prefer a maturity-based model.
This model is an acknowledgment that security is dynamic; the work is never done. As security practitioners, we should be gaining maturity with every reporting cycle. The objective of a security program should always be to improve maturity at a rate equal to the threats and the business complexity.
Each of these reporting levels needs a corresponding objective measure: a numeric threshold, clearly defined for each program metric. For example, an identity governance solution may be rated maturing if two or more applications were integrated during the cycle, or declining if more than one access certification cycle was missed. Make this clear to your reporting audience and you will gain both credibility and engagement.
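The identity governance example can be sketched as a simple threshold check. The function name, the "steady" middle level and the precedence of declining over maturing are my assumptions; the two thresholds are the hypothetical ones from the example:

```python
def rate_identity_governance(apps_integrated: int, cert_cycles_missed: int) -> str:
    """Rate the identity governance program for one reporting cycle.

    Hypothetical thresholds: two or more applications integrated
    rates as maturing; more than one access certification cycle
    missed rates as declining. Declining is checked first, so a
    missed control outweighs new integrations.
    """
    if cert_cycles_missed > 1:
        return "declining"
    if apps_integrated >= 2:
        return "maturing"
    return "steady"
```

Which condition wins when both fire (here, declining) is exactly the kind of rule to define explicitly and agree on with your reporting audience in advance.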
Providing this transparency to program stakeholders builds awareness, increases engagement and, most importantly, makes it possible to solicit help. Most IT services are reported on a monthly basis, but reporting can be a significant time burden, and stakeholders may not need frequent updates. Finding the right balance between efficiency and frequency takes time.
My typical recommendation is to report on program-level metrics monthly. But your audience may want your team to follow the cadence of other IT groups.
We Missed Achieving a Metric — So What?
Security is a board-level topic at most companies. Multiple regulatory requirements, including GLBA and SOX, require strong security and controls. If you miss a metric, and I can almost guarantee every security program will miss at least one of its metrics over the course of a year, it will get visibility. Is that a bad thing? Maybe not.
Organization leadership most often wants to know the downstream impacts and the causes of the missed metric. If your security programs have mapped program metrics to business drivers and have a strong program management structure in place, the program leader and CISO will be able to answer these questions easily.
I have seen this cycle occur many times with my clients and observed their reactions as they review these situations in hindsight. They typically find it is a healthy cycle, one that can increase the maturity of the security organization.