January 5, 2016 | By Bruce Schneier

Professional pilot Ron Rapp has written a fascinating article about a Gulfstream jet that crashed on takeoff in 2014. The accident was 100 percent human error and entirely preventable; the pilots ignored procedures, checklists and warning signs again and again. Rapp uses it as an example of what systems theorists call the “normalization of deviance,” a term coined by sociologist Diane Vaughan:

Social normalization of deviance means that people within the organization become so accustomed to a deviant behavior that they don’t consider it deviant, despite the fact that they far exceed their own rules for elementary safety. But it is a complex process with some kind of organizational acceptance. The people outside see the situation as deviant, whereas the people inside get accustomed to it and do not. The more they do it, the more they get accustomed.

For instance, in the Challenger case there were design flaws in the famous “O-rings,” although they considered that by design the O-rings would not be damaged. In fact, they suffered recurrent damage. The first time the O-rings were damaged, the engineers found a solution and decided that the space transportation system could keep flying with ‘acceptable risk.’

The second time damage occurred, they thought the trouble came from something else. Because they believed they had fixed the newest trouble, they again defined it as an acceptable risk and just kept monitoring the problem. And as they repeatedly observed the problem with no consequence, they got to the point where flying with the flaw was normal and acceptable. Of course, after the accident, they were shocked and horrified as they saw what they had done.

The point is that normalization of deviance is a gradual process that leads to a situation where unacceptable practices or standards become acceptable and flagrant violations of procedure become normal — despite the fact that everyone involved knows better.

The Normalization of Deviance in IT Security

This is a useful term for IT security professionals. The fundamental problems in cybersecurity are not about technology; they’re about how people use technology. Security professionals have plenty of technical tools at their disposal, and if technology alone could secure networks, organizations would be in great shape. But, of course, it can’t. Security is fundamentally a human problem, and people are involved in security every step of the way. People are regularly the weakest link: IT security personnel struggle to get people to follow good security practices and not undermine them as soon as they become inconvenient. Rules are ignored.

As long as the organizational culture turns a blind eye to these practices, the predictable result is insecurity.

None of this is unique to IT. Looking at the healthcare field, John Banja identified seven factors that contribute to the normalization of deviance:

  • The rules are stupid and inefficient.
  • Knowledge is imperfect and uneven.
  • The work itself, along with new technology, can disrupt work behaviors and rule compliance.
  • I’m breaking the rule for the good of my patient.
  • The rules don’t apply to me/you can trust me.
  • Workers are afraid to speak up.
  • Leadership withholds or dilutes findings on system problems.

Dan Luu has written about this, too.

Combating Complacency in Cybersecurity

These same factors arise again and again in IT, especially in large organizations. Security teams constantly battle this culture, and they’re regularly cleaning up the aftermath of people getting things wrong. The culture of IT relies on single, expert individuals — with all the problems that come along with that. False positives can wear down a team’s diligence, bringing about complacency.

There are no magic solutions here. Banja’s suggestions are good, but general:

  • Pay attention to weak signals.
  • Resist the urge to be unreasonably optimistic.
  • Teach employees how to conduct emotionally uncomfortable conversations.
  • Make sure system operators feel safe speaking up.
  • Realize that oversight and monitoring are never-ending.

The normalization of deviance is something we have to face, especially in areas like incident response, where people must be in the loop. People believe they know better, deliberately ignore procedure and invariably forget things. Recognizing the problem is the first step toward solving it.

