Civil Liberties

The Pinpoint Search

How super-accurate surveillance technology threatens our privacy


Anyone would consider it a stroke of bad luck to be pulled over for driving six miles per hour over the speed limit, but Roy Caballes had an additional reason to curse his ill fortune. On the November afternoon in 1998 when an Illinois state trooper stopped him, Caballes was carrying 282 pounds of marijuana in his trunk. At first it looked like he'd get off with just a warning. Then another officer pulled up and swept his car with one of the most advanced pieces of technology then available to law enforcement: a drug-sniffing dog named Krott. The pooch uncovered the dope.

Caballes thought the cops didn't have a legitimate reason to bring in Krott, and he fought the search. In 2003 the Illinois Supreme Court ruled that the officers had indeed violated the Fourth Amendment by transforming the traffic stop into a drug investigation without probable cause, or even the weaker "reasonable suspicion." But in the 2005 decision Illinois v. Caballes, the U.S. Supreme Court ruled that the dog sniff could not have rendered an otherwise lawful traffic stop unconstitutional unless the dog sniff itself violated Caballes' "constitutionally protected interest in privacy." The Court concluded it did not, citing a 1983 decision in which it ruled that, because a dog sniff reveals only the presence of contraband in which there is no "reasonable expectation of privacy," it isn't a "search" at all. The Supremes sent the case back to Illinois, and Caballes ended up with a 12-year prison sentence.

The dog sniff that caught Caballes is just one crude, old-fashioned example of the search technologies available to law enforcement. A new wave of advanced surveillance tools is capable of detecting not just drugs but weapons, explosives, and illicit computer files, potentially flying under the Fourth Amendment's radar all the while. A handheld scanner picks up stray particles of cocaine on a car during a routine traffic stop. Is that a search? A high-tech camera detects the gun one pedestrian is carrying under his jacket. Is that a search? A forensic analyst finds a single image of child pornography on a computer server containing thousands of files owned by hundreds of users, without ever seeing any other private information. Is that a search?

In a nation whose reams of regulations make almost everyone guilty of some violation at some point, Americans have grown accustomed to getting away with minor transgressions: the occasional joint or downloaded movie or high-speed dash to the airport. For at least some crimes, though, the expectation that our peccadilloes will slip through the cracks may soon be outdated. The new breed of noninvasive but deeply revealing detection tools—call them "pinpoint searches"—will require rapid adjustments in both legal rules and social mores.

Anatomically Correct Searches

The original pinpoint search, the drug dog's sniff, has built-in limits. A German shepherd is a cumbersome piece of biotechnology, making suspicionless sweeps during routine traffic stops the exception rather than the rule. But chemists and engineers are developing a variety of electronic sniffers that are competing to make Fido's schnoz obsolete.

DrugWipes, for example, are small, swab-tipped devices. Wipe the tip along a surface, or a sample of sweat or saliva, and in two to five minutes a simple indicator window reveals whether drug residue is present. Manufactured by the German firm Securetec, DrugWipes have been used by more than 2,000 law enforcement agencies in the U.S. since the late 1990s, and they're increasingly popular among schools and private employers as well.

DrugWipes have limitations: They're single-use devices, and while the basic model is inexpensive (less than $10 per unit), each picks up only one specific type of drug residue. Even with a relatively low per-unit price, the cost of sweeping a school or a parking lot can mount quickly. For more versatility, cops can turn to General Electric's VaporTracer, a seven-pound handheld particle sniffer that can test for a wide range of drugs and explosives in only a few seconds.

The VaporTracer and its nonportable cousin, the Itemiser, are already used in airports to scan luggage for explosives. With a price tag of $25,000 to $30,000, the VaporTracer is unlikely to become standard issue for beat cops in the near future. (G.E. estimates that about 4,000 are in use worldwide, most for explosive detection.) But researchers are developing ever faster, cheaper, and more sensitive electronic noses.

Among the technologies in the offing is the desorption electrospray ionization scanner. It uses charged droplets to lift particles from a surface and into a mass spectrometer, which can break down and analyze the components of any substance down to the molecular level. It's currently a desktop-sized machine, but its creators, a team of researchers at Purdue University, hope to develop, within a few years, a portable version that can fit in a backpack. The Purdue team's head, Graham Cooks, guesses such a device might cost about $4,000. That's not exactly cheap, but it's thousands of dollars less than a well-trained drug dog costs.

Meanwhile, scientists at Georgia Tech have developed prototype scanning technologies based on a penny-sized surface acoustic wave chip, which works by measuring disturbances in sound waves as they pass across small quartz crystals. This "dog on a chip" sensor is coated with a thin layer of cloned antibody proteins that bond to a specific molecule, such as cocaine or TNT. The sound waves passing through the coated sensor can then be compared with those passing through an uncoated control crystal: Differences in the waves mean the chip has picked up trace amounts—as little as a few trillionths of a gram—of the target substance.

Handheld scanners aren't the only possible application for such sniffer chips. Metro stations in Washington, D.C., have been fitted with fixed chemical weapon detectors, meant to give advance warning in case of a terrorist attack. Sensors with a range of a few feet could be combined with surveillance cameras to pinpoint passengers who might be worth extra scrutiny.

Police can use new devices to hunt not just for tiny traces of contraband but for larger objects. Millimeter wave (MMW) radiation is all around us. You're emitting it even as you read this article. More important, you're emitting it through your clothes, making it an ideal way to scan for hidden objects that distort or block those waves, whether they're made of metal, ceramic, plastic, or some other composite material—and without any of the health concerns associated with X-rays.

The Federal Aviation Administration began funding MMW research back in 1989. The technology has since been licensed to several commercial firms. Intellifit, for example, has set up MMW kiosks in several malls and clothing stores; they help people find clothing that's a good fit for their frame.

But the primary licensees have been in the security business. In the summer of 2005, a company called SafeView debuted a three-dimensional body scanner, SafeScout, for use in airports and at other security checkpoints. Think of the scene in the 1990 science fiction flick Total Recall where California's future governor races behind a panel that exposes, in real time, all the weaponry hidden away among bulging muscles.

In its most intrusive form, an MMW scanner can reveal a rough nude image of its subjects. The models being deployed for most security purposes get around that problem by projecting any objects the scanner detects onto a generic virtual mannequin.

Less intrusive is the BIS-WDS Prime, a security camera created by the Florida-based firm Brijot Imaging Systems. Unveiled last spring, the camera pinpoints weapons and suspicious objects at a range of up to 45 feet by comparing hidden objects picked up by its millimeter wave sensor to a database of weapon shapes. The detection process, Brijot claims, takes less than half a second, and the higher-end models will display up to 20 threats simultaneously. (The camera was field tested this summer at New York's Port Authority Bus Terminal and at New Jersey PATH train stations.)

If a search technology based on shape matching still seems a bit low-tech, consider Pulsed Fast Neutron Analysis, which can reveal the molecular composition of a load of cargo without opening the vehicle that carries it. In the summer of 2004, U.S. Customs and Border Protection began testing a $10 million, car wash–sized prototype facility at the Ysleta border crossing near El Paso, Texas. It bombards vehicles with high-energy neutrons, which excite the nuclei of atoms, causing the contents to emit gamma rays. Since different elements emit gamma rays at different energy levels, the scanner can infer the chemical structure of the cargo's contents, distinguishing plastic explosives from Play-Doh and table sugar from Colombian White.

Googling for Contraband

In 1996 Michael Adler posed a hypothetical question in The Yale Law Journal. Adler imagined a computer worm or virus that could quickly and unobtrusively scan thousands of computer hard drives simultaneously, looking for evidence of illicit files—classified documents, say, or pirated software or child pornography. Would this count as a "search" for Fourth Amendment purposes?

This remained a hypothetical question until 2001, when someone released that worm. Called Noped, the crude Visual Basic Script program would infect PCs by way of an email attachment and (after mailing itself out to everyone in the infected user's address book) scan for images with file names that its author considered suggestive of kiddie porn. If it found a match, it would email law enforcement a list of its findings.

A file name match is too thin a reed on which to hang an investigation, but better technologies can pinpoint specific files on a hard drive. By running a large file through a cryptographic hash algorithm, it's possible to generate a much shorter string of letters and numbers, called a hash value, that is unique to that file (or close enough to unique for any practical purpose) and can be used to determine quickly whether two files are identical. The National Drug Intelligence Center's Hashkeeper database already contains the hash values of both common commercial software programs and known images of illicit child pornography, making it easy for a trained technician to discern whether a hard drive contains a copy of a particular file.
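
The mechanics are simple enough to sketch. What follows is a minimal Python illustration of hash-based matching, not a description of Hashkeeper itself: it hashes every file under a directory and flags any file whose hash appears in a set of known values. The MD5 algorithm, the placeholder hash entries, and the evidence path are assumptions made for the example.

```python
# A minimal sketch of hash-based file matching, in the spirit of the
# Hashkeeper approach described above. The algorithm (MD5), the sample
# hash entries, and the evidence path are assumptions for illustration,
# not details of the actual database.
import hashlib
from pathlib import Path

# Hypothetical set of hash values for known files of interest.
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",  # placeholder entry
    "9e107d9d372bb6826bd81d3542a419d6",  # placeholder entry
}

def file_hash(path: Path, algorithm: str = "md5") -> str:
    """Hash a file in chunks so large files never have to fit in memory."""
    h = hashlib.new(algorithm)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_tree(root: Path) -> list:
    """Return every readable file under root whose hash is in KNOWN_HASHES."""
    matches = []
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        try:
            if file_hash(path) in KNOWN_HASHES:
                matches.append(path)
        except OSError:
            continue  # unreadable file; skip it rather than abort the sweep
    return matches

if __name__ == "__main__":
    evidence = Path("/mnt/evidence")  # hypothetical mount point
    if evidence.exists():
        for hit in scan_tree(evidence):
            print(f"Known file: {hit}")
```

The key point for the legal debate is that nothing in such a sweep ever displays a file's contents; the technician sees only whether a file's fingerprint matches an entry already in the database.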

Orin Kerr, a law professor at George Washington University, literally wrote the book on government searches of computer data: the 2001 Justice Department manual Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations. Kerr says sweeping viral searches would run afoul of existing wiretap laws, which raise more stringent barriers to "unauthorized access" than the current judicial interpretation of the Fourth Amendment imposes. Yet he says it's far less clear how the courts would treat digital searches that take place in the course of authorized searches for other material.

Imagine, for example, that a police technician has lawful physical access to a server containing thousands of users' files. He's supposed to be looking for evidence of tax fraud in a specific suspect's documents. Even though the warrant specifies only that user, will it count as an additional "search" if the officer runs a search program on the entire server, designed to alert him only when it locates known child porn? If there's "no reasonable expectation of privacy" when it comes to possessing narcotics, wouldn't the same rule apply to child pornography?

Government investigators may have already caught up with the legal theorists' hypotheticals—searching not for kiddie porn but for terrorist plots. Consider the controversial surveillance carried out by the hyper-clandestine National Security Agency (NSA) following the attacks of September 11, 2001. The first program to be revealed, in December 2005, involved conventional wiretapping, but further leaks hinted that far more sweeping surveillance might be taking place. Voice and text recognition software might be sifting through millions of communications in search of target words or phrases that raised red flags for investigators.

The NSA fired Russell Tice, an intelligence analyst at the agency, in May 2005; he now says he wants to tell Congress about NSA surveillance programs he believes are illegal. While Tice won't discuss specific programs, he notes that the technology exists to filter data and voice traffic on a mass scale, flagging communications where target words or phrases—jihad, say, or the names of known terrorists—are used. With the help of linguistic consultants, he says, intelligence agencies can even zero in on particular accents or speech patterns.
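
To make the basic idea concrete, here is a toy sketch of dictionary-style keyword flagging over text traffic. The watch-list terms and sample messages are invented for the example; the programs Tice describes are classified and presumably far more sophisticated than simple pattern matching.

```python
# A toy sketch of dictionary-style keyword flagging, to illustrate the
# kind of filtering described above. Watch-list terms and messages are
# invented for this example; real systems are classified and far more
# sophisticated than simple string matching.
import re

WATCH_LIST = {"example target phrase", "another watched name"}  # hypothetical

def flag_terms(text, watch_list=WATCH_LIST):
    """Return the watch-list terms that appear in a message as whole words."""
    lowered = text.lower()
    return {
        term for term in watch_list
        if re.search(r"\b" + re.escape(term) + r"\b", lowered)
    }

def filter_traffic(messages):
    """Yield (message, matched terms) for each message that trips the filter."""
    for msg in messages:
        hits = flag_terms(msg)
        if hits:
            yield msg, hits

if __name__ == "__main__":
    sample = ["nothing to see here", "we discussed the example target phrase"]
    for msg, hits in filter_traffic(sample):
        print(f"flagged: {msg!r} -> {sorted(hits)}")
```

Even this crude version shows why the legal question is slippery: the filter "reads" every message that passes through it, but a human analyst sees only the tiny fraction it flags.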

Privacy mavens have long whispered of a program called ECHELON, a massive signals intelligence network rumored to have been developed as a Cold War espionage tool. It supposedly used batteries of high-powered computers, called "dictionaries," to scan voice and data communications for suspicious phrases. While the NSA has never confirmed that ECHELON is real, a 2001 report by the European Parliament concluded that "the existence of a global system for intercepting communications…is no longer in doubt."

In a January 2006 speech at the National Press Club, in which he called ECHELON an "urban legend," former NSA head and current CIA chief Gen. Michael V. Hayden asserted that the NSA's warrantless wiretap program, exposed by The New York Times in late 2005, "is not a driftnet…grabbing conversations that we then sort out by these alleged keyword searches or data-mining tools or other devices.…This is targeted and focused." But reports from insiders continue to hint at something far more expansive.

A February Washington Post piece described a tiered computer filtering system that initially swept up hundreds of thousands of communications, using increasingly intrusive techniques to winnow the pool down to a far smaller number subject to human examination. In April the Electronic Frontier Foundation disclosed that a former AT&T technician, Mark Klein, had come forward with tales of a "secret room" his erstwhile employer had built at the NSA's behest, in which vast amounts of data were scrutinized by a "semantic traffic analyzer."

And in May USA Today revealed that the NSA had created a database compiling information about the calling patterns of millions of Americans. If such analysis was really limited to information about the calls—who phoned whom, when, and for how long—current Supreme Court precedent would not classify the interception as a "search." Less certain, however, is the status of intercepts using ECHELON-style "dictionaries" to probe the contents of some voice and data communications for target words or phrases. "If this approach was used, and hundreds of thousands if not millions of communications were processed in that manner," says Tice, "the argument could be made, well, if a machine was doing the looking and the sucking in, it doesn't matter because that's not monitoring until a human looks at it." Writing in The New Republic last February, Richard A. Posner, a judge on the U.S. Court of Appeals for the 7th Circuit, made exactly that argument, suggesting that automated searches do not violate the law "because a computer program is not a sentient being."

Storage Devices and Virtual Files

Courts already have started to tackle some of the questions raised by these new technologies, but not consistently enough for us to predict with confidence where they'll go in the future. In United States v. Runyan (2001), the U.S. Court of Appeals for the 5th Circuit ruled that when a woman who had found a few of her husband's child porn files turned his digital storage devices over to police, they had already been "searched." The cops, therefore, didn't perform any additional search when they did a more comprehensive analysis and found more extensive caches of similar material. But in a 10th Circuit case, United States v. Carey (1999), the appeals court held that a forensic analyst who was lawfully searching a hard drive in the course of a drug investigation did exceed the scope of the warrant when, after accidentally opening a child porn file, he abandoned the search for drug-related material and started digging for more porn.

Kerr characterizes these as "storage device" and "virtual file" approaches, respectively. The former treats a digital storage medium as though it's a single physical container, like a briefcase or a trunk: Once the lock is lawfully popped, all the contents are subject to observation. With the virtual file approach, a digital storage device is more like a warehouse containing many thousands of individual closed boxes: Police may have the authority to go looking through the warehouse for a few particular containers, but that doesn't mean they may pry anything open willy-nilly. Even under a "virtual file" approach, the logic of Illinois v. Caballes suggests that a scan for illicit files using something like the Hashkeeper database, which doesn't technically "open" the file, will not count as a search once police have lawful access to the storage medium.

Kerr has offered his solution, at least in the case of digital searches, in the Harvard Law Review. In a June 2006 article he proposes an "exposure theory" of the Fourth Amendment: Any time computer data or information about that data (such as whether it matches certain search criteria) is exposed to human observation via an output device such as a monitor or printer, those data have been "searched" for Fourth Amendment purposes.

This approach would attenuate, perhaps even eliminate, the "plain view" doctrine in the digital realm. That doctrine holds that any evidence uncovered in the course of a lawful investigation is fair game for police, even when the investigation was initiated for a different purpose—as when, for instance, police smell marijuana or spot a gun during a traffic stop. Such a principle would also, in effect, declare Caballes a dead letter online, since it would shift the legal focus to where investigators looked, rather than the amount of additional physical intrusion or the type of information uncovered.

Some want to see Caballes consigned to the dustbin of jurisprudence offline as well. Marc Rotenberg, executive director of the Electronic Privacy Information Center, proposes rolling back the Caballes exception and hewing to a strict version of the standard the Supreme Court articulated in Kyllo v. United States (see sidebar), under which any information about certain protected spheres, beyond what an unaided human observer could glean, would be regarded as presumptively private. "Your expectation of privacy really has to be measured against what an unassisted police officer might be able to obtain from you," Rotenberg argues, "not what technology might make possible." Otherwise, he suggests, that expectation will only grow ever weaker as technology improves.

There's another advantage to applying the Fourth Amendment's protections to pinpoint searches: It would create an obstacle to the use of the search power to harass, something that loomed large in the fears of the Founders. For generations, supporters of broad law enforcement powers have claimed that "if you're not guilty, you have nothing to hide." But as the Harvard law professor William J. Stuntz has noted, the Fourth Amendment—and the Fifth Amendment, which protects against self-incrimination—were intended not just as abstract procedural checks but as substantive safeguards against criminalizing certain kinds of activity, such as religious and political dissent. It's harder to prohibit a faith, for example, when police don't have the power to look through citizens' papers or burst into their homes without specific evidence of criminality to cite as grounds for a warrant.

If pinpoint searches are not subject to any judicial oversight, law enforcement agencies will have broad discretion over whom to search and how often to search them. There's ample reason to suspect that such discretion won't always be exercised equitably. Whites and blacks use illicit drugs at similar rates, for example, but blacks make up nearly half of state prison inmates convicted of drug offenses. It is easy to imagine some politically unpopular person or group subject to frequent pinpoint searches for minor drug infractions, zoning code violations, or whatever other commonplace low-grade statute violations new technologies make it possible to detect.

Should We Learn to Stop Worrying and Love Pinpoint Searches?

Despite such concerns, some civil libertarians greet these new technologies with surprising enthusiasm. Jeffrey Rosen, a law professor at George Washington University and the author of The Unwanted Gaze: The Destruction of Privacy in America (2001), stresses that such searches avoid some of the central problems the Fourth Amendment's framers worried about. "Privacy people should be unequivocally and unambiguously enthusiastic about technologies that can manage to find illegal activity without intruding on innocent privacy interests," he argues. "The paradigmatic example of an unreasonable search at the time of the framing of the Constitution was the search of private diaries, because you had to look at a lot of innocent and intimate information in order to find potentially illegal information." Pinpoint searches may allow the cop who pulls you over for speeding to scan you routinely for drugs or guns. But they may also mean he'll be less likely to invent a pretext to rifle your glove box, exposing that legal but embarrassing bottle of Viagra. And the subway cop who wants to be sure your backpack doesn't contain a bomb won't need to open it up and see what else you're carrying.

The veteran civil liberties litigator Harvey Silverglate has staked out a position between Rotenberg's and Rosen's. He notes that the Fourth Amendment's clause requiring searches to be "reasonable" is technically separate from the clause outlining the preconditions for a warrant to be issued, and that there are conditions under which courts have ruled warrantless searches to be reasonable. (In addition to the "plain view" exception mentioned earlier, there are exceptions for "exigent circumstances," as when a cop believes a dealer is about to flush his stash or a kidnapper is on the verge of killing his victim.) So you can concede that pinpoint searches really are searches subject to judicial oversight without ruling out the possibility that some searches, under some circumstances, are "reasonable" even without a warrant. Silverglate believes the law will move away from strict warrant requirements for minimally intrusive technologies, such as handheld explosive sniffers, that are geared to prevent especially severe crimes, such as terrorist attacks. "The courts," he predicts, "are going to say that if some germ or atomic weapon could kill thousands of people, then some methods are going to be 'reasonable' that wouldn't be when you're trying to find a guy smoking pot."

For the most optimistic take on pinpoint searches, turn to the futurist David Brin, author of the 1998 book The Transparent Society. Brin believes a world of more perfect enforcement will create democratic pressure to either eliminate or drastically reduce penalties for "victimless" offenses. What matters, Brin avers, is not what the government knows about you but what it can do to you. To those who fear a world in which, for instance, routine speeding infractions are invariably met with stiff fines, Brin ripostes: "Can't you trust your fellow citizens to not want that either?"

Andrew Napolitano, the author of The Constitution in Exile, is unconvinced. A legal analyst for Fox News and a former New Jersey judge, Napolitano joins Rotenberg in insisting that a "neutral magistrate" stand between police and the subjects of all government searches. He argues that it's precisely when law enforcement agencies are most tempted to bypass checks on government snooping that the public is least apt to demand adherence to the letter of the law. For proof, he points to many Americans' indifference to—or support of—the NSA's warrantless wiretaps. "When the president can go on TV and get a 57 percent approval rating saying he doesn't care about privacy, he only cares about security," Napolitano concludes, "we may have to count on my black-robed colleagues to protect privacy."

We may hope our elected representatives will exempt a pothead from pinpoint searches, lighten his punishment to compensate for the new ease of capturing him, or even abandon their long-running war on him altogether. But what about more serious crimes, such as terrorism? Should we allow electronic sniffers to troll through vast haystacks of telecommunications data searching for jihadist needles, in the hope that terrorists will not simply use encryption technology to render such surveillance useless?

Tradeoffs to Come

David Post, a cyberlaw expert at Temple University, hopes we can deploy pinpoint searches in ways that preserve the balance between security and privacy. "The kind of oversight you want in a system like this is very different from what you'd want in the ordinary warrant case," he says. "There you want someone looking at the evidence as it relates to a particular target. Here I want someone who's looking at the system as a whole, an ongoing systemic analysis of a kind that really is new." In Post's model, a panel of legal and technical experts with appropriate security clearances might be granted ongoing oversight responsibilities over an ECHELON-style vacuum-cleaner surveillance program to determine whether it was sufficiently fine-tuned.

Silverglate suggests another way to take advantage of the new surveillance tools while still protecting privacy: establish a "multi-level, tiered approach to electronic searches." At the first level, a filter system overseen by the kind of panel Post imagines sifts through communications, flagging suspicious conversations. Intelligence agents might then listen to brief snippets of conversation selected by the computers but, crucially, without learning the identities of any of the parties to the conversation.

Then, says Silverglate, "based on what the vacuum cleaner picks up, the NSA is going to have to go to a Foreign Intelligence Surveillance Act court and see if they have probable cause to find out the identity of the person on the line." Such an approach might even, paradoxically, make such secret courts, notorious for almost never rejecting wiretap applications, less inclined to defer to intelligence agencies, since instead of being asked whether they are prepared to give terror hunters the benefit of the doubt, judges will already know some of the contents of the communications for which they're being asked to authorize the release of identifying data.

That doesn't solve every problem with such systems, of course. It does not address the chilling effect that may set in when speakers begin to watch their words on the phone for fear that saying the wrong thing will trigger a computer in Fort Meade. And once the necessary infrastructure is set up to use such a system to catch terrorists, it would be both relatively simple technically and powerfully tempting politically to expand it to hunt for the least sophisticated perpetrators of whatever crime is particularly unpopular at the moment.

Whether we adopt the sanguine approach of Rosen and Brin, embrace the strong privacy protections of Napolitano and Rotenberg, or look for a middle path with Silverglate and Post, we will be forced to make difficult tradeoffs. But the debate over how to strike that balance must begin now, before today's prototype rolls off tomorrow's assembly line. These new technologies are too powerful to use thoughtlessly. We're already entering a pinpoint-search world. Now we must decide how to live in it.

Contributing Editor Julian Sanchez is writing a book on disobedience.