Editor's note: Bruce Schneier is a security technologist and the author of "Beyond Fear: Thinking Sensibly About Security in an Uncertain World." You can read more of his writing at http://www.schneier.com/
(CNN) -- The Underwear Bomber failed. And our reaction to the failed plot is failing as well, by focusing on the specifics of this made-for-a-movie plot rather than the broad threat. While our reaction is predictable, it's not going to make us safer.
We're going to beef up airport security, because Umar Farouk AbdulMutallab allegedly snuck a bomb through a security checkpoint. We're going to intensively screen Nigerians, because he is Nigerian. We're going to field full body scanners, because they might have noticed the PETN that authorities say was hidden in his underwear. And so on.
We're doing these things even though security worked. The security checkpoints, even at their pre-9/11 levels, forced whoever made the bomb to construct a much worse bomb than he would have otherwise. Instead of using a timer or a plunger or another reliable detonation mechanism, as would any commercial user of PETN, he had to resort to an ad hoc homebrew -- and a much more inefficient one, involving a syringe, and 20 minutes in the lavatory, and we don't know exactly what else -- that didn't explode.
At that point, AbdulMutallab's fellow passengers quickly subdued him. Yes, the screeners didn't notice any PETN in his underwear, but the system was never intended to catch that particular tactic. There probably were intelligence failures -- why wasn't his father's tip followed up on, and why wasn't his visa revoked? -- but it's always easy to connect the dots in hindsight.
We're doing these things even though this particular plot was chosen precisely because we weren't screening for it; future al Qaeda attacks rarely look like past attacks; and the terrorist threat is far broader than attacks against airplanes.
We're doing these things even though airplane terrorism is incredibly rare, the risk is no greater today than it was in previous decades, the taxi to the airport is still more dangerous than the flight, and ten times as many Americans are killed by lightning as by terrorists.
In fact, we're focusing on the specifics of the plot, not despite these facts, but because of them.
The Underwear Bomber is precisely the sort of story we humans tend to overreact to. Our brains aren't very good at probability and risk analysis, especially when it comes to rare events. Our brains are much better at processing the simple risks we've had to deal with throughout most of our species' existence, and much poorer at evaluating the complex risks modern society forces us to face. We exaggerate spectacular rare events, and downplay familiar and common ones.
We can see the effects of this all the time. We fear being murdered, kidnapped, raped, and assaulted by strangers, when it's far more likely that the perpetrator of such offenses is a relative or a friend. We fear school shootings, even though a school is almost always the safest place a child can be. We worry about shark attacks instead of fatal dog or pig attacks -- both far more common. In the U.S., over 38,000 people die each year in car crashes; that's as many deaths as 9/11 each and every month, year after year.
Overreacting to the rare and spectacular is natural. We tend to base risk analysis on personal story rather than on data. If a friend gets mugged in a foreign country, that story is more likely to affect how safe you feel in that country than abstract crime statistics.
We give storytellers we have a relationship with more credibility than we give strangers, and stories that are close to us more weight than stories from foreign lands. And who is everyone's major storyteller these days? Television.
I tell people that if it's in the news, don't worry about it. The very definition of "news" is "something that hardly ever happens." It's when something isn't in the news, when it's so common that it's no longer news -- car crashes, domestic violence -- that you should start worrying.
But that's not the way we think. The more an event is talked about, the more probable we think it is. The more vivid our thoughts about the event are -- again, think television -- the more easily we remember it and the more convincing it is. So when faced with a very available and highly vivid event like the Underwear Bomber, 9/11, or a child kidnapping in a playground, we overreact. We get scared.
And once we're scared, we need to "do something" -- even if that something doesn't make sense and is ineffective. We need to do something directly related to the story that's making us scared. We implement full body scanners at airports. We pass the Patriot Act. We don't let our children go to playgrounds unsupervised. Instead of implementing effective, but more general, security measures to reduce the overall risk, we concentrate on making the fearful story go away. Yes, it's security theater, but it makes us feel safer.
As circular as it sounds, rare events are rare primarily because they don't occur very often, and not because of any preventive security measures. If you want to do something that makes security sense, figure out what's common among a bunch of rare events, and concentrate your countermeasures there.
Focus on the general risk of terrorism, and not the specific threat of airplane bombings using PETN-filled underwear. Focus on the general risk of troubled teens, and not the specific threat of a lone gunman wandering around a school. Ignore the movie-plot threats, and concentrate on the real risks.
The opinions expressed in this commentary are solely those of Bruce Schneier.