Politics

Safe, But Also Sorry

Security expert Bruce Schneier talks about privacy and property in the information state


As Washington, D.C., gears up for the inauguration, there's one thing you're not seeing around town: shoe-checking stations. One attempted shoe bombing was enough to make all of us wander unshod through the airports of this great nation for years, yet even though there will be security checkpoints all over Capitol Hill, shoe checking will not be part of the action.

Why? It's not that the chance of a shoe bombing has somehow been definitively eliminated. It's because the costs (frostbitten toes and long delays) have been weighed against the low risk of a Richard Reid II. We probably should have reached the same conclusion about airports long ago. But this particular brand of cost-benefit analysis often eludes security officials, especially in the public sector.

Security officials at the inauguration are taking one tip from the Transportation Security Administration (TSA), though. They're dictating bag sizes. Nothing bigger than 8 inches by 6 inches by 4 inches will be allowed, according to The Washington Post. Basically, we're talking fanny packs here, people. The Washington City Paper is on the case, trying to figure out what the heck officials thought we were going to be packing our lunches in while we stand (since chairs are banned as well) along the parade route after waiting in a three-hour line to get in.

The TSA, meanwhile, is planning to phase out the small-bag-of-liquids regime this year after realizing that it may have overreacted to the single threat of a liquid-based bomb. New machines will be put in place to detect explosives, allowing the rest of us to pack our saline solution in peace. And so the security merry-go-round whirls on.

Wired columnist and BT chief security technology officer Bruce Schneier, who started out as a cryptologist and has since expanded his portfolio to all things security, is the TSA's worst nightmare. A lot of what Schneier says sounds like the common sense you'll hear at the far end of the security line in any airport—you know, the stuff people say before they get within earshot of the badged and businesslike men and women who hold the fates of our vacations in their latex-gloved hands. Schneier adds some serious geek street cred and a respect for individualism to that common sense, and comes up with analysis and suggestions guaranteed to drive the TSA up the wall.

Associate Editor Katherine Mangu-Ward interviewed Bruce Schneier last week about his book, Schneier on Security (Wiley), and his thoughts on security, privacy, and profit.

Reason: You coined the phrase "security theater" and you've been critical of the TSA's choices on priorities and tactics. What has the TSA done wrong that's fixable? What has the TSA done right?

Bruce Schneier: The TSA focuses too much on specific tactics and targets. This makes sense politically, but is a bad use of security resources. Think about the last eight years. We take away guns and knives, and the terrorists use box cutters. We confiscate box cutters and knitting needles, and they put explosives in their shoes. We screen shoes, and they use liquids. We take away liquids, and they'll do something else. This is a dumb game; the TSA should stop playing. Some screening is necessary to stop the crazy and the stupid, but it's not going to stop a professional terrorist attack. We don't need more and better screening; we need less. On the other hand, I like seeing the direction they're heading in terms of behavioral profiling, though we need to be careful. Done wrong, it's nothing more than stereotyping; but done right, it can be very effective. It needs more focus on people and less on objects. We can't manage to keep weapons out of prisons; we'll never keep them out of airports. Oh, and stop the ID checking—the notion that there is this master list of terrorists that we can check people off against is just plain silly.

Reason: What would success look like for the TSA? If you were made King of Airport Security tomorrow and given the entire current budget of the TSA to do whatever you wanted, what kind of system would you design?

Schneier: If I were in charge of the TSA's budget, I'd give most of it back. Politically, I wouldn't be able to, of course, but it would be the best thing to do. Spending money on airport/airplane security only makes sense if the bad guys target airplanes. In general, money spent defending particular targets or tactics only makes sense if we can guess them correctly. If tactics and targets are scarce, defending against specific ones makes us safer. If tactics and targets are plentiful—as they are—it only forces the bad guys to pick new ones. Spending money on intelligence, investigation, and emergency response is effective regardless of the tactic or the target. Airport security is a last line of defense, and not a very good one at that. We need to remember that at budget time.

Reason: One theme that comes up in your book and some of the interviews you've done recently is the idea that when money/profit is involved, security operations tend to be tighter and more efficient. Explain to Reason.com readers—or at least speculate on—why that's the case.

Schneier: The person or organization who is subject to the risk needs to be responsible for risk mitigation. In banking, for example, the banks need to be responsible for their own risk. They lose money if the bank is robbed, so they're in the best position to weigh the cost of security measures against the risk of robbery. Customers don't lose money when there's a bank robbery, so they can't balance the risks and costs. Conversely, it makes no sense for bank customers to be penalized for identity theft losses. They're in no position to mitigate the risks—whereas the banks are—so customers shouldn't be responsible for the losses.

This doesn't mean there's no place for government to be responsible for risks. In airline security, the risks extend far beyond any one airline. It makes no sense for airlines to hire security screeners—they can't do a proper risk analysis—and a lot of sense for the government to step in to fill that role.

Reason: In Schneier on Security, you emphasize that technology isn't the only (or even the most important) part of a security solution. Why do people tend to systematically discount cultural and economic factors in considering questions of security?

Schneier: We live in a technological world, and it's common for us to believe that technology can solve our security problems. It solves so many of our other problems that it's a plausible belief. It's also easier to believe that a shiny new piece of technology—a new ID card, a new airport scanner, a new face-recognition system—can solve our problems than to put our faith in boring old concepts like culture and economics. Admitting that technology isn't the answer is admitting that there isn't an answer that will solve the problem, and many people can't do that yet. We've forgotten that risk is an inherent part of life.

Reason: Security and privacy (or, more controversially, security and freedom) are often described as being in opposition. When is that true? When is it untrue?

Schneier: The security vs. privacy dichotomy is a false one. Only identity-based security is in opposition to privacy, and there are limitations to that approach. I believe that approximately two security improvements since 9/11 have made airplane travel safer: reinforcing the cockpit door, teaching passengers they have to fight back, and—maybe—sky marshals. None of those measures has any impact on privacy. It's things like ID cards, and wholesale eavesdropping on telephone calls and Internet conversations, and large government databases that affect privacy, and their security value is minimal. The real dichotomy is liberty vs. control. There might be less crime in a society with strong government controls and police-state-like surveillance, but I don't think anyone would feel safer in that society.

Reason: What's your reaction when you hear people say that we live in a "security state"?

Schneier: We live in an information state, which is subtly different. All computer processes produce data as a byproduct. As more parts of our lives are mediated by computers, more personal information about us is produced. This information is collected, and then bought and sold, by other institutions, both government and commercial, without our knowledge and consent. Some of this is driven by security concerns, but a lot of it is driven by economics. The problem is that personal data is looked at as property, which can be bought and sold, instead of as a right. Long term, we need to fix that.

Reason: Do you consider yourself an optimist? Why?

Schneier: I consider myself a realist. Most people who say that are really pessimists, but I'm not. Most people are honest and trustworthy; society would fall apart if that weren't the case. Attacks are rare. Ten times as many people die each year in car crashes as died on 9/11, and the most dangerous part of an airplane journey is still the taxi ride to the airport.

Security is designed to protect us from the dishonest minority. It's important to remember that. I remember being told as a child: "Never talk to strangers." That's actually stupid advice. If a child is lost or scared or alone, the smartest thing he can do is find a kindly looking stranger to talk to. The real advice is: "Don't answer strangers who talk to you first." The difference is important. In the first case, the child selects the stranger—and the odds of him selecting a bad person are pretty negligible. In the second case, the stranger selects the child; that's more dangerous. It isn't optimism to point out that most people are honest, and it isn't pessimism to figure out how best to secure ourselves against the dishonest minority; it's analytical realism.

Katherine Mangu-Ward is an associate editor of Reason magazine.