Bruce Schneier has an excellent opinion piece over at CNN, in which he discusses the criticism directed at security and intelligence agencies for not discovering and stopping the Boston Marathon bombing. The litany of complaint is familiar enough:
The FBI and the CIA are being criticized for not keeping better track of Tamerlan Tsarnaev in the months before the Boston Marathon bombings. How could they have ignored such a dangerous person? How do we reform the intelligence community to ensure this kind of failure doesn’t happen again?
Just as after the atrocities of 9/11, the agencies are being criticized for failing to “connect the dots” and uncover the plot.
Now, there have been specific incidents in connection with terrorism that one might think would raise some suspicions (for example, the 9/11 hijackers who took flying lessons but didn’t want to learn how to land the plane). But for the most part, as Schneier points out, “connecting the dots” is a bad and misleading metaphor.
Connecting the dots in a coloring book is easy and fun. They’re right there on the page, and they’re all numbered. … It’s so simple that 5-year-olds can do it.
After an incident has occurred, we can look back through the history of the people and things involved, and attempt to piece together a pattern. But that is possible only because we know what happened. Before the fact, real life does not number the dots or even necessarily make them visible. The problem, generally, is not that we have insufficient information. It’s that we don’t know which tiny fraction of the information that we do have is relevant, and not just noise.
In hindsight, we know who the bad guys are. Before the fact, there are an enormous number of potential bad guys.
I heard a news report a few days ago saying that Tamerlan Tsarnaev, the elder of the two brothers, had taken part in a monitored telephone call in which the term ‘jihad’ was mentioned. Lumping together telephone calls (including those by reporters, of course), radio and TV broadcasts, and other forms of electronic communication, how many times per day would you guess that word might be mentioned?
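To get a feel for the arithmetic behind this, here is a small back-of-the-envelope sketch (in Python, with entirely made-up illustrative numbers, not figures from Schneier’s essay or from that news report): even a quite accurate screening rule produces overwhelmingly false alarms when the thing being screened for is rare.

```python
# Back-of-the-envelope Bayes calculation with made-up, illustrative numbers.
# Question: if a monitored communication is flagged (say, because it mentions
# a keyword like 'jihad'), how likely is it to actually relate to a plot?

base_rate = 1e-7        # assumed fraction of communications tied to a real plot
sensitivity = 0.99      # assumed chance a plot-related message gets flagged
false_positive = 1e-4   # assumed chance an innocent message gets flagged anyway

# P(flagged) = P(flagged | plot) * P(plot) + P(flagged | innocent) * P(innocent)
p_flagged = sensitivity * base_rate + false_positive * (1 - base_rate)

# P(plot | flagged), by Bayes' theorem
p_plot_given_flag = sensitivity * base_rate / p_flagged

print(f"Probability a flagged message is actually plot-related: {p_plot_given_flag:.4%}")
# With these assumed numbers: roughly 0.1%, or about a thousand false alarms
# for every real hit.
```

The particular numbers are hypothetical, but the shape of the result is not: when genuine plots are vanishingly rare, almost everything that looks like a dot turns out to be noise.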
As Schneier goes on to point out, this is an example of a psychological phenomenon called hindsight bias, first explained by Daniel Kahneman and Amos Tversky:
Since what actually happened is so obvious once it happens, we overestimate how obvious it was before it happened.
We actually misremember what we once thought, believing that we knew all along that what happened would happen.
Telling stories is one of the primary ways that people attempt to make sense of the world around them. The stories we construct tend to be a good deal tidier and more logical than the real world. There is a strong tendency to adjust the “facts” to fit the story, rather than the other way around. (This is one reason that science is hard.)
You can observe this tendency regularly in reporting on financial markets. For example, whatever the stock market did yesterday — go up or down, a little or a lot — you can be sure that some pundits will have an eminently plausible explanation for why that happened. You are very unlikely to hear anything like, “Well, the S&P 500 went down 500 points, and we don’t have a clue why it happened.” (I have been saying for years that I will start paying attention to these stories when they are published before the fact.)
It is certainly sensible, after any incident, to look back to see if clues were missed, and to attempt to learn from any mistakes. But it is neither sensible nor realistic to expect prevention of any and all criminal or terrorist activity.
Update Tuesday, May 7, 17:05 EDT
Schneier’s essay has now also been posted at his Schneier on Security blog.