After such a tumultuous 2017, it’s hard to imagine things getting worse in the cybersecurity world, but one book predicted just that. While not solely focused on cybersecurity disasters, “Warnings: Finding Cassandras to Stop Catastrophes” by Richard A. Clarke and R.P. Eddy is a wake-up call for business leaders and lawmakers who often fail to heed warnings from experts about future calamities in the making, many of which are related to the evolving technology landscape.
Chief information security officers (CISOs) are sure to appreciate the many references to IT and security, and will likely want to share the book with the top leadership at their organization. In fact, The Washington Times called the book “essential reading” to understand how to improve our ability to deal with the “pervasive and continuous turbulence” of our times.
The Cassandra Coefficient
Before we dive into the authors’ advice on how to recognize upcoming disasters, just what is a Cassandra? According to Greek mythology, Cassandra had the gift of prophecy, but a curse cast upon her by Apollo ensured that her predictions always fell on deaf ears. The first half of the book offers many examples in which experts who sounded alarms about potential catastrophes were ignored. It then explains how these experts can flip the script and ensure that future warnings are given the attention they deserve.
Clarke and Eddy noted several recurring themes among the various reasons why past Cassandras were ignored — one being the character of those sounding the alarm, especially if they are deemed eccentric. Another theme is that the magnitude of what is predicted is often so large that most people cannot imagine it happening at that scale. This was true of the earthquake and subsequent tidal wave that hit the Fukushima nuclear reactor site in 2011. Similarly, Hurricane Katrina, which devastated New Orleans in 2005, proved to be far more powerful than most had expected. Such unprecedented events are difficult to process due to what the authors referred to as “first occurrence syndrome.”
Even when Cassandras are believed, the decision-maker’s own agenda or internal biases often contribute to inertia, since acting on the Cassandra’s advice would require admitting he or she was wrong. Other times, the warning is simply too complex for a single individual to fully grasp — this is often the case with cybersecurity disasters.
So how can a coefficient help? Clarke and Eddy boiled 24 factors down to four main components: the warning in question, the people issuing the warning, the leaders at the helm who would need to react, and the ever-present critics and naysayers. Each factor is scored as high, medium, low or absent to assess the likelihood of the warning becoming an event.
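To make the rubric concrete, the scoring can be sketched as a toy calculation. Note that Clarke and Eddy rate each factor qualitatively (high, medium, low or absent) and do not publish a numeric formula; the point values, the equal weighting of components and the simple averaging below are all illustrative assumptions, not the authors’ method.

```python
# Illustrative sketch only: the numeric weights and the averaging scheme
# are assumptions for demonstration, not the book's actual methodology.

# Map the book's qualitative ratings to assumed point values.
SCORES = {"high": 3, "medium": 2, "low": 1, "absent": 0}


def cassandra_coefficient(components: dict) -> float:
    """Average the factor ratings across the four components:
    the warning, the Cassandra, the decision-makers and the critics.
    `components` maps each component name to a list of factor ratings."""
    ratings = [SCORES[r] for factors in components.values() for r in factors]
    return sum(ratings) / len(ratings)


# Hypothetical assessment of a single warning (factor names omitted;
# each list entry is one rated factor within that component).
assessment = {
    "warning": ["high", "medium"],
    "cassandra": ["high", "high"],
    "decision_makers": ["low", "medium"],
    "critics": ["medium", "absent"],
}

print(round(cassandra_coefficient(assessment), 2))  # → 2.0
```

A higher average would suggest a warning deserves closer surveillance; a real assessment would weigh the 24 individual factors the book enumerates rather than this simplified average.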
Predicting Future Cybersecurity Disasters
The book specifically mentioned two potential trouble spots: the Internet of Everything — the authors’ expanded view of the Internet of Things (IoT) — and artificial intelligence (AI). In a review of the book, Dale Peterson, CEO of Digital Bond, pointed out the close parallels between the industrial control system (ICS) field and the book’s chapters on the Fukushima and mining disasters.
As Peterson noted, ICS security professionals often adopt the mindset that “nothing will change until something really bad happens.” As Stuxnet and other attacks on industrial control systems have proven, there are countless risks at the intersection of the physical and cyber worlds. “Today, there is a false sense of assurance that our safety and protection systems will effectively prevent really bad things from happening,” he wrote.
What about the IoT and AI? Cybersecurity issues stemming from the IoT, such as botnets and distributed denial-of-service (DDoS) attacks, are unfolding in front of our eyes. Regarding AI, Clarke and Eddy pointed to several experts who are worried about the potential for AI to engage in a cycle of autonomous learning, growing and connecting, which could theoretically spell trouble for its creators.
Don’t Dismiss Your Cassandras
When Harvard Business Review interviewed Clarke for its IdeaCast podcast, he noted that we still have options to ensure we’re not asleep at the wheel when someone warns of impending doom. For example, Clarke advised decision-makers not to dismiss warnings without first thoroughly investigating the issue at hand. “OK, let’s test this,” he said. “Let’s make sure we’re right. Let’s do peer review.” He went on to note that “it’s our obligation to disprove it. And if you can’t disprove it, then you’ve got to keep an eye on it.”
Surveillance can be a good interim strategy, especially compared to outright denial. For CISOs, this can take the form of providing metrics on key cyber risk indicators with the understanding that the metrics might be wrong, biased or simply measuring the wrong element. An airline pilot, for example, knows that there can be a disconnect between the readings on a dashboard and the insights gained by simply looking out the cockpit window. Watching dials is good, but Clarke and Eddy also advocated a hedging strategy to prepare a response in the event that a Cassandra’s prophecy turns out to be true.
The authors also asserted that, while controls have typically focused on reducing the likelihood of an event, decision-makers need to implement strategies to minimize the consequences of a foreseeable disaster. It’s critical to have a realistic picture of the effectiveness and reliability of the organization’s defenses — otherwise, security is just a mirage. Just as business leaders have shifted their mindset to account for the inevitability of a data breach, the many cybersecurity calamities of 2017 should influence them to reassess how they treat Cassandras and prepare their security teams for a potentially catastrophic cyber event.