The practice of analyzing security data for detection and response, otherwise known as security analytics (SA), comes in many forms and flavors. The data consumed varies from organization to organization, analytic processes span a wide range of algorithms, and outputs can serve many use cases within a security team.
In early 2019, IBM Security commissioned a survey to better understand how companies currently use security analytics, identify key drivers and uncover some of the net benefits security decision-makers have experienced. The findings were drawn from more than 250 interviews with information security decision-makers around the globe.
7 Lessons From Top Performers in Security Analytics
Encouragingly, the study revealed rising levels of maturity when it comes to security analytics. Roughly 15 percent of all interviewees scored as high performers, meaning their investigation processes are well-defined and they continuously measure the effectiveness of the output. These respondents stand out for both the volume of investigations they handle (five to 10 times more than the average) and their low false positive rates (approximately 30 percent below average). Meanwhile, 97 percent of these leaders successfully built a 24/7 security operations center (SOC) with a total staffing headcount between 25 and 50.
What lessons can organizations with lower levels of SA maturity take away from this shining example? Below are seven key lessons security teams can learn from the top performers identified in the survey:
- Top SA performers have a knack for integrating security data. While many mid-performing organizations struggle with this integration and consider the task an obstacle to effective security analytics, leaders identified in the survey have streamlined the process, freeing them to focus on use case and content development.
- Nine in 10 high performers have an accurate inventory of users and assets — in other words, they understand the enterprise’s boundaries and potential attack surfaces and continuously update their inventory. This is likely a result of effective, automated discovery using a combination of collected security data and active scanning. By comparison, less than 30 percent of low-performing security teams practice this approach.
- A robust detection arsenal contains an equal mix of rule-based matching (e.g., indicators of compromise), statistical modeling (e.g., baselining) and machine learning; a minimal sketch of the first two approaches appears after this list. In stark contrast, intermediate performers rely more on existing threat intelligence as a primary detection method.
- Top performers use content provided by their security analytics vendors. In fact, 80 percent of respondents in this category indicated that the vendor-provided content is sufficient, whether sourced out of the box or via services engagements.
- Compared to intermediate performers, top performers dedicate between two and three times more resources to tuning detection tools and algorithms. To be exact, 41 percent of high performers spend 40 hours or more per week on detection tuning.
- High-performing security teams automate the output of the analytics and prioritize alerts based on asset and threat criticality. They also have automated investigation playbooks linked to specific alerts; a simple prioritization sketch follows this list.
- Finally, organizations with a high level of SA maturity continuously measure their output and understand the importance of time. Approximately 70 percent of top performers keep track of monthly metrics such as time to respond and time spent on investigation. Low-performing organizations, on the other hand, measure the volume of alerts, and their use of time-based metrics is 60 to 70 percent lower than that of high performers.
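To make the detection mix from the third lesson concrete, here is a minimal sketch that combines rule-based matching against indicators of compromise with a statistical baseline check. The field names, indicator list and threshold are illustrative assumptions, not part of the survey or any specific product.

```python
# Minimal sketch: mixing rule-based (IoC) and statistical (baseline) detection.
# Field names (dst_ip, bytes_out) and the indicator set are hypothetical examples.
from statistics import mean, stdev

KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.22"}  # example indicators of compromise

def ioc_match(event):
    """Rule-based check: flag traffic to a known-bad address."""
    return event["dst_ip"] in KNOWN_BAD_IPS

def baseline_outlier(event, history, threshold=3.0):
    """Statistical check: flag outbound volume far above the host's baseline."""
    if len(history) < 10:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and (event["bytes_out"] - mu) / sigma > threshold

def detect(event, history):
    reasons = []
    if ioc_match(event):
        reasons.append("ioc_match")
    if baseline_outlier(event, history):
        reasons.append("baseline_outlier")
    return reasons

# A host that normally sends about 1 MB suddenly sends 50 MB to a known-bad IP.
history = [1_000_000 + i * 10_000 for i in range(20)]
event = {"dst_ip": "203.0.113.7", "bytes_out": 50_000_000}
print(detect(event, history))  # ['ioc_match', 'baseline_outlier']
```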
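The sixth lesson, prioritizing alerts by asset and threat criticality and linking each alert type to an investigation playbook, can be sketched in a similar spirit. The scores, asset names and playbook labels below are hypothetical and exist only to illustrate the idea.

```python
# Minimal sketch: score alerts by asset and threat criticality, attach a playbook.
# All values here are hypothetical examples, not survey data or product behavior.
ASSET_CRITICALITY = {"domain-controller-01": 10, "dev-laptop-17": 3}
THREAT_SEVERITY = {"credential_theft": 9, "port_scan": 4}
PLAYBOOKS = {"credential_theft": "reset-and-isolate", "port_scan": "triage-scan-source"}

def prioritize(alert):
    score = ASSET_CRITICALITY.get(alert["asset"], 1) * THREAT_SEVERITY.get(alert["type"], 1)
    return {**alert, "score": score, "playbook": PLAYBOOKS.get(alert["type"], "manual-review")}

alerts = [
    {"asset": "dev-laptop-17", "type": "port_scan"},
    {"asset": "domain-controller-01", "type": "credential_theft"},
]
# Work the highest-risk alerts first.
for alert in sorted((prioritize(a) for a in alerts), key=lambda a: a["score"], reverse=True):
    print(alert)
```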
Build a Faster, More Proactive and More Transparent SOC
So what do the high performers identified in the survey have to show for their security analytics success? For one thing, they all enjoy superb visibility into the performance of their SOC. While many companies are improving, particularly in the areas of cloud and endpoint visibility, 41 percent of leaders in security analytics claim to have full SOC visibility, compared to 13 percent of intermediate and low performers.
In addition, while lower-performing organizations leverage security analytics to investigate and respond — i.e., react — to threats, high performers use SA to stay ahead of threats proactively. Finally, the leaders identified in the study generate their own threat intelligence and are experts in analyzing security data.
The key takeaway here is that security is a race against time — specifically, to outpace cyber adversaries. Leading security teams know this, which is why they continuously challenge themselves by integrating new data, extracting new insights, implementing smart automation, and, most importantly, measuring the time to detect, investigate and respond.
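As a small illustration of that kind of time-based measurement, the sketch below computes mean time to detect and mean time to respond from a couple of hypothetical incident records; the fields and values are assumptions, not survey data.

```python
# Minimal sketch: compute mean time to detect (MTTD) and mean time to respond (MTTR).
# Incident records and field names are hypothetical examples.
from datetime import datetime
from statistics import mean

incidents = [
    {"occurred": "2019-03-01T08:00", "detected": "2019-03-01T09:30", "resolved": "2019-03-01T14:00"},
    {"occurred": "2019-03-05T22:15", "detected": "2019-03-06T01:00", "resolved": "2019-03-06T03:30"},
]

def hours_between(start, end):
    fmt = "%Y-%m-%dT%H:%M"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600

mttd = mean(hours_between(i["occurred"], i["detected"]) for i in incidents)
mttr = mean(hours_between(i["detected"], i["resolved"]) for i in incidents)
print(f"Mean time to detect:  {mttd:.1f} hours")
print(f"Mean time to respond: {mttr:.1f} hours")
```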
Learn 5 Steps to Achieve a Proactive Security Intelligence Program