I just finished a book which I think should be required reading for all security professionals, if not all persons in possession of a brain who would like to use it more effectively. The Intelligence Trap: Why Smart People Make Dumb Mistakes is the best non-security security book I have read in a long while, offering explanations of why our intelligent colleagues do things that lead to security disasters, strategies for honing your security messages to account for the unconscious biases that can foil them, and deep insight into how to improve your own thought processes, recognize your biases, and avoid falling into the Intelligence Trap. Read this book. It is worth your time.
As an unrepentant idea thief, I plan on extracting some of the things in the book which I found most striking as a security professional and sharing them with you in a series of blog posts.
Today’s idea heist… “outcome bias.”
According to the fount of all wisdom, Wikipedia, outcome bias is defined as “an error made in evaluating the quality of a decision when the outcome of that decision is already known.”
The danger of outcome bias for security professionals and the organizations we are trying to protect lies in how we unconsciously deal with security “near misses” – situations where we detect and deal with a security problem before it becomes a disaster.
While we may (and should) congratulate ourselves for acting (hopefully quickly) to recognize a vulnerability and remediate it before our company is skewered by The Register or our CEO is telling the press that we were the victim “of a highly sophisticated cyber attack,” the fact that we avoided a massive security incident can subconsciously increase our colleagues’ and organizations’ willingness to repeat the risky behaviors that almost got us into trouble.
The events leading up to the loss of the space shuttle Columbia provide a dramatic case in point. Columbia broke apart on re-entry because a piece of foam insulation shed from the external tank during launch struck the leading edge of the left wing, damaging it to the point where it failed. Foam shedding during launch was not a new issue; in fact, it had been observed on most previous missions. It became routine. After all, all of the other shuttles landed safely, so foam shedding was not a major issue… until it was. Seven astronauts died, a shuttle was lost, and the program was grounded for over two years.
When we have a “near miss,” we need to make sure that we stop and take the time to think not only about what didn’t happen, but about what could have happened. We security professionals are great at thinking of all of the bad things that could happen – and we may take it for granted that our colleagues’ minds work in the same dark and twisted ways that ours do. But they usually don’t (which is probably a net positive). We need to communicate and get them to imagine what could have happened if the stars had not been aligned in our favor. We need to inoculate them against the unconscious thought that the risky behaviors or decisions leading to the near miss were OK since nothing happened… this time.
So how do we turn our “near misses” into net positives? Some ideas…
Have a post-mortem – treat the near miss like a real incident and analyze it with the team. Identify any risky decisions or behaviors and figure out why they were made and how the processes that led to them can be improved.
Use the Swiss cheese model to analyze your near misses. Most failures and near failures happen because multiple holes in multiple layers of protection “line up” to allow a failure mode to occur. Look at your near misses and figure out what else would have had to go wrong to turn the near miss into a security disaster.
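For the quantitatively minded, the Swiss cheese idea can be sketched in a few lines of code. This is a toy simulation, not a real risk model – the layers and their failure probabilities below are made-up illustrative values – but it shows why near misses so vastly outnumber actual incidents, and why each one is evidence of real holes in your defenses:

```python
import random

random.seed(42)

# Hypothetical, illustrative failure probabilities for each defensive
# layer (the chance that layer's "hole" lines up with an attack).
layers = {
    "email filter": 0.10,
    "endpoint protection": 0.20,
    "user awareness": 0.30,
    "egress monitoring": 0.25,
}

TRIALS = 100_000
incidents = 0    # every layer failed: the holes all lined up
near_misses = 0  # some layers failed, but at least one held

for _ in range(TRIALS):
    failed = [random.random() < p for p in layers.values()]
    if all(failed):
        incidents += 1
    elif any(failed):
        near_misses += 1

print(f"incidents:   {incidents}")
print(f"near misses: {near_misses}")
# Near misses dwarf incidents -- every one of them means at least one
# layer had a hole, even though the stack as a whole held this time.
```

With these (assumed) numbers, roughly 62% of trials involve at least one failed layer while only about 0.15% become full incidents – hundreds of near misses for every disaster, each one a free lesson if you bother to analyze it.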
Celebrate and reward people who report near misses. Your organization should be encouraging people to report near misses and holding them up as examples of security championship. Analysis of near misses makes your security program stronger and prevents repeating the same mistakes over and over.
Near misses are opportunities for learning and improvement, but when they are ignored, they can set the stage for serious incidents. Use them wisely!