In the months since we began simulating full-scale cyberattacks for customers at our IBM X-Force Command Center in Cambridge, Massachusetts, I’ve watched a steady stream of corporate security teams confront the hair-raising experience of a simulated cyberattack.
The Command Center recreates the whole crisis from start to finish, featuring a controlled, live-fire simulation of a cyberattack that enables the customer’s team to dry run a response with the help of the IBM X-Force Incident Response Intelligence Services (IRIS) team and advanced tools such as Watson for Cyber Security. It also includes some of the human issues that need to be addressed, such as calls from law enforcement authorities, angry investors, regulators and the press.
I had originally expected our customers’ greatest needs to be technology-related, but I’ve found that most of them have covered those bases pretty well. What has surprised me is how ill-prepared many are for the human decision-making part of the process. They’re pretty good at shutting down a distributed denial-of-service (DDoS) attack, but they turn to jelly when questioned by a reporter or are paralyzed by a ringing phone.
The reason? Lack of OODA loops. I’ll explain.
Two Styles of Decision-Making
When a serious cyberattack makes the headlines, companies need to take action — any action. They need to talk, even if they have no new information. As recent attacks against the airline industry have shown, silence and obfuscation are deadly in a crisis. You need to reassure stakeholders that you’re on top of the problem and doing all you can.
The boardroom thinking that guides many enterprises these days isn’t very well-tuned for rapid response. Managers who act quickly are seen as impulsive, reckless or undisciplined. In today’s big data world, our culture rewards people who gather all the facts and penalizes those who act on instinct.
But crises don’t give you the luxury of time to gather all the data. I spent 15 years as an emergency medical technician and firefighter before I joined IBM. In those roles, I came to appreciate the value of gut instinct by experienced people. There are many disciplines in which on-the-spot decision-making is critical — we don’t want trauma surgeons to have to consult manuals, or fighter pilots to wait for instructions when an enemy is on their tail.
Social scientists have a term for this style of decision-making: System 1 thinking. It’s characterized by a fast, automatic, intuitive approach, typically informed by experience. That’s distinct from System 2 thinking, which is slower, more methodical and deliberate.
Both approaches are appropriate in different situations, but System 2 thinking is currently more in favor among corporate executives, thanks to the popularity of data-driven management. I fear that the value of System 1 thinking is being shunted aside in the process, and that’s bad when a crisis hits.
Observe, Orient, Decide and Act
OODA loops are a tool for System 1 thinkers. OODA stands for “observe, orient, decide and act.” The concept was developed by U.S. Air Force Colonel John Boyd to train pilots and soldiers to make decisions when there’s no time to gather all the data. Instead of focusing on the perfect decision, the OODA loop teaches us to filter the available information, put it in context and quickly make the best decision we can, with the understanding that we can always adjust course as more data becomes available.
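To make the cycle concrete, here is a minimal illustrative sketch of an OODA loop in Python. The events, rules and action names are entirely hypothetical (they are not part of any IBM tool or real incident-response playbook); the point is simply that each pass through the loop acts on the best information available now, and the next pass can revise that decision as new data arrives.

```python
def observe(event_log):
    """Gather whatever information is currently available."""
    return event_log[-3:]  # look only at the most recent events

def orient(observations):
    """Put observations in context using experience (toy rules here)."""
    return "ddos" if any("flood" in e for e in observations) else "unknown"

def decide(assessment):
    """Pick the best available action now, not the perfect one."""
    actions = {"ddos": "rate-limit traffic", "unknown": "isolate and monitor"}
    return actions[assessment]

def act(action, taken):
    """Carry out the action (here, just record it)."""
    taken.append(action)

def ooda_loop(event_log, max_cycles=3):
    """Run repeated observe-orient-decide-act cycles over a growing log."""
    taken = []
    for _ in range(max_cycles):
        observations = observe(event_log)
        assessment = orient(observations)
        act(decide(assessment), taken)
        event_log.append("flood detected")  # new data arrives mid-crisis
    return taken
```

Run against an ambiguous starting event, the first cycle takes a cautious holding action, and later cycles revise it once the flood pattern appears in the log; no single decision is final.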
It’s like navigating a maze in a puzzle book. Some people dive right into a solution, drawing lines and inevitably running into dead ends. Others wait, analyze and start drawing only when they’re certain of the correct path. Both approaches are valid ways to solve the puzzle. We need to appreciate the value of each.
Cyberattacks rarely give us the luxury of time, which is why it’s important to have knowledgeable people who can show immediate progress. Unfortunately, that readiness is rare in my experience; our analytical culture discourages it.
Understanding OODA Loops
Every organization needs to understand how to use OODA loops. It requires rethinking some commonly held assumptions.
Know when some action is better than no action. If your house is on fire, anything you do to extinguish the blaze is better than standing by and watching it burn. The same applies to a cyberattack: Demonstrate that you’re doing something, even if that course of action doesn’t turn out to be the best one in the long run. Like a sailboat tacking toward its destination, you want to maintain forward momentum.
Accept and learn from failure. Most organizations are terrible at this, but failure is part of the process of finding a solution. Experienced people armed with facts and context will usually take constructive action. They’ll learn from their mistakes and make better decisions. The OODA loop teaches that no decision should ever be set in stone and that small failures are better than no action at all.
Finally, tell what you know. No one expects you to have a complete solution at hand in the midst of a crisis. What they do want to see is that you’re on top of the situation and working toward a resolution. In the absence of information, outsiders will tend to assume the worst, so communicate early and often. Tell them what you’re doing and be open to suggestions.
Every chief information security officer (CISO) should be familiar with the concept of OODA loops. In a cyber emergency, the people who can act decisively are a CISO’s best friends.