Ransomware and data leaks are inconvenient and costly. But what about a cyber incident that leads to mass casualties?
The notion of “black swan” events — incidents that are so rare and unusual they cannot be predicted — is a “fallacy”, according to Sarah Armstrong-Smith, Chief Security Advisor at Microsoft, during her talk at UK Cyber Week 2023. Increasingly, experts are warning about the growing possibility (or inevitability) of an infrastructure cyberattack that leads to loss of life. How can organizations prevent and/or prepare for this kind of event?
The threat of cyber catastrophe
According to the World Economic Forum’s Global Cybersecurity Outlook 2023, 93% of cybersecurity leaders and 86% of business leaders think a far-reaching, catastrophic cyber event is at least somewhat likely in the next two years. Additionally, 43% of organizational leaders think it is likely that a cyberattack will severely affect their organization in the next two years.
In addition to her work at Microsoft, Armstrong-Smith collaborates with the UK’s Ministry of Defence (MoD). During her talk, she said many agree that it’s only a matter of time before a cyberattack against critical infrastructure causes an event that leads to “multiple fatalities”. She advises security teams to treat past failures as opportunities to improve their incident response approaches.
The reasoning behind her conclusion is that attackers are increasingly infiltrating operational technology networks, which has the potential to cause far more destruction than breaching IT networks. “The capability is already there; it’s just a matter of time,” said Armstrong-Smith.
The need to learn from past mistakes
When it comes to cyberattacks and incidents, Armstrong-Smith said the cybersecurity sector is typically bad at learning lessons. “It doesn’t matter how many times we see these incidents. They continue to happen over and over again,” she stated.
Armstrong-Smith said that findings from public inquiries tell us why catastrophic events occur, and how they are often preventable. Some key themes she touched upon included:
- Design or Use Change: Over time, buildings, technologies and products undergo numerous upgrades and changes in use, yet people on the ground are often not informed of these changes. When a crisis occurs, incident responders then rely on an outdated plan.
- Communication: There is often an expectation that every decision must be communicated from the top of the organization down. This delays action and strips those decisions of context. Instead, teams on the ground need “specific and direct instructions”.
- Lack of Empowerment: An incident’s first response can vary substantially depending on time and context. Therefore, there must be clear rules about who is empowered and to what degree in events that require immediate decisions.
- Rigid Plans: Many incident response plans are too rigid, so when things go off-plan, there is panic and the plan fails. Organizations must establish a critical path and draw a clear distinction between orders and recommendations when facing incidents (one way to encode that distinction is sketched after this list).
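As a minimal, hypothetical sketch of those last two lessons, the following Python encodes an incident response playbook as data rather than a static document. The roles, actions and structure are invented for illustration and are not drawn from Armstrong-Smith’s talk; the point is only that who is empowered to act, and which steps are orders rather than recommendations, can be made explicit and reviewable.

```python
# Hypothetical sketch: an incident response playbook expressed as data, so the
# distinction between orders and recommendations -- and who may decide what --
# is explicit rather than buried in a document.
from dataclasses import dataclass, field
from enum import Enum


class Directive(Enum):
    ORDER = "order"                    # must be followed
    RECOMMENDATION = "recommendation"  # responder may adapt to context


@dataclass
class Step:
    action: str
    directive: Directive
    empowered_roles: list[str] = field(default_factory=list)  # who may decide


# Illustrative plan; the actions and roles are placeholders, not a template.
ransomware_playbook = [
    Step("Isolate affected network segment", Directive.ORDER,
         ["on-call responder", "SOC lead"]),
    Step("Notify incident commander", Directive.ORDER,
         ["on-call responder"]),
    Step("Begin forensic capture before reimaging", Directive.RECOMMENDATION,
         ["SOC lead"]),
]


def can_act(role: str, step: Step) -> bool:
    """Check whether a given role is empowered to execute a step immediately."""
    return role in step.empowered_roles


if __name__ == "__main__":
    for step in ransomware_playbook:
        print(f"{step.directive.value:>14}: {step.action} "
              f"(empowered: {', '.join(step.empowered_roles)})")
```

A plan kept as reviewable data like this is also easier to keep current when designs or uses change, which speaks to the first lesson above.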
Real-world simulated training
Armstrong-Smith said that one of the best ways to prepare for incident response in cybersecurity is regular training that replicates real-world situations. “It requires real-time training against the real-time risk that we’re trying to deal with,” she added.
Simulated training exercises should mimic previous cyber incidents as closely as possible. However, Armstrong-Smith noted that she has “never seen a company that goes anywhere near their worst-case scenario” during crisis management exercises.
For example, organizations often believe they can rely on backups to restore their systems in the event of a ransomware attack. But what if the backups are deleted? Companies should have a game plan and practice for that type of scenario.
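One way to practice that scenario is a periodic restore drill. The sketch below is only illustrative: the backup path and checksum manifest are assumptions standing in for whatever backup tooling an organization actually uses. It copies a few files out of the latest backup into a scratch directory and verifies their hashes, so a deleted or corrupted backup surfaces during an exercise rather than during an incident.

```python
# Hypothetical restore-drill sketch. BACKUP_ROOT and the manifest format are
# assumptions for illustration; the point is to prove restorability regularly
# instead of trusting that backups exist.
import hashlib
import shutil
import tempfile
from pathlib import Path

BACKUP_ROOT = Path("/mnt/offline-backups/latest")   # assumed backup mount
MANIFEST = BACKUP_ROOT / "checksums.txt"            # lines of "<sha256> <relative path>"


def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()


def restore_drill(sample_limit: int = 5) -> bool:
    """Restore a few files into a scratch directory and verify their hashes."""
    if not MANIFEST.exists():
        print("FAIL: backup manifest missing -- backups may have been deleted")
        return False
    ok = True
    with tempfile.TemporaryDirectory() as scratch:
        for line in MANIFEST.read_text().splitlines()[:sample_limit]:
            if not line.strip():
                continue
            expected, rel_path = line.split(maxsplit=1)
            src = BACKUP_ROOT / rel_path
            if not src.exists():
                print(f"FAIL: {rel_path} is listed but missing from the backup")
                ok = False
                continue
            dst = Path(scratch) / Path(rel_path).name
            shutil.copy2(src, dst)                   # the actual "restore"
            if sha256(dst) != expected:
                print(f"FAIL: {rel_path} did not restore intact")
                ok = False
    return ok


if __name__ == "__main__":
    print("restore drill passed" if restore_drill() else "restore drill FAILED")
```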
Only through realistic training exercises can security teams truly understand what they are trying to protect and why, Armstrong-Smith said. Is a crisis plan in place? Have drills been conducted that simulate a major crisis? She also pointed out that we tend to think about how security should protect infrastructure, but we forget about the impact on people.
Assessing cyber-physical risk
Although still in its infancy, the discipline of cyber-physical risk assessment is also gaining traction. According to industrial cybersecurity expert Sinclair Koelemij, a cyber-physical risk assessment extends the cybersecurity risk of a process automation system into the physical domain of the production process and its installation.
Cyber-physical risk connects the cybersecurity of process automation functions with the process safety of the entire production installation. This links human casualties, environmental damage and the applicable legal criteria to losses caused by cyberattacks.
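As a toy illustration only (this is not Koelemij’s methodology; the scenarios, scales and scoring rule below are invented), the following sketch shows what it can mean to extend a cyber risk score into the physical domain: each scenario on a process automation system carries casualty, environmental and legal consequence ratings alongside its cyber likelihood.

```python
# Illustrative scoring sketch. The scales (1-5) and the scoring rule are
# assumptions made up for this example, not an established methodology.
from dataclasses import dataclass


@dataclass
class CyberPhysicalScenario:
    name: str
    cyber_likelihood: int        # 1 (rare) .. 5 (expected), assumed scale
    casualty_severity: int       # 1 (none) .. 5 (multiple fatalities)
    environmental_severity: int  # 1 (negligible) .. 5 (major release)
    legal_exposure: int          # 1 (none) .. 5 (criminal liability)


def risk_score(s: CyberPhysicalScenario) -> int:
    # The worst physical or legal consequence drives the score, scaled by likelihood.
    worst_consequence = max(s.casualty_severity,
                            s.environmental_severity,
                            s.legal_exposure)
    return s.cyber_likelihood * worst_consequence


# Placeholder scenarios for a hypothetical plant.
scenarios = [
    CyberPhysicalScenario("Safety system logic tampered", 2, 5, 4, 5),
    CyberPhysicalScenario("Historian data exfiltrated", 4, 1, 1, 3),
    CyberPhysicalScenario("Overpressure setpoint changed", 3, 4, 5, 4),
]

for s in sorted(scenarios, key=risk_score, reverse=True):
    print(f"{risk_score(s):>2}  {s.name}")
```

The design point is simply that the consequence side of the assessment covers physical outcomes and legal exposure, not just data loss or downtime.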
The impact of company culture
In a separate BCS Member Groups talk, Armstrong-Smith pointed out that a repeated pattern can often be seen prior to major catastrophes: multiple warning signs, audit reports or smaller, similar incidents all lead up to the event.
Meanwhile, the culture of an organization is critical when things go wrong and mistakes have been made. How do leaders respond to audit reports that reveal problems? Do they proactively address them, ignore them or try to place blame?
“It’s not that you had a problem, but how you deal with it that matters,” Armstrong-Smith said. “We shouldn’t be playing the blame game. We should be looking at processes rather than blaming a person, for example, when someone makes a security mistake.”
Soft skills matter
Armstrong-Smith also pointed out that since the pandemic, and with rising risks, security teams have been under a lot of stress. This makes people more vulnerable in many ways, including to making mistakes that may lead to a cyber incident. She asked, “How are we looking after people? Do we have the right HR policies? Are we paying attention to what’s going on in a person’s life?”
In the event of a seismic incident, the human tendency is to work as much as possible until things are resolved, added Armstrong-Smith. But this can lead to exhaustion, and that’s when people are more prone to make mistakes.
Armstrong-Smith insisted that leaders must be aware of and look out for the well-being of their teams, especially in the event of a catastrophe.