The surveillance state and the search for limits

July 10, 2014
Duke Law News

In his post as a fellow in national security law at the Brookings Institution, Wells Bennett ’06 is managing editor of Lawfare, an influential blog about the law and national security that is widely read by foreign policy experts. On June 6, 2013, the day after it was revealed that the National Security Agency (NSA) had been tracking the phone calls of millions of Americans, Bennett took to the blog to pose a question: “What’s the limiting principle here?”

It was not a rhetorical query. Like many Americans, Bennett was taken aback to learn of the previously secret program, which had come to light when former NSA contractor Edward Snowden shared thousands of classified documents with journalists. The initial revelation that phone companies were turning over ordinary citizens’ call records to comply with an order of the Foreign Intelligence Surveillance Court (the FISA court or FISC) finding the program legal under Section 215 of the USA Patriot Act stunned many. While the government was not monitoring the content of calls, it was collecting “metadata” that could tell it who was calling whom, and when.

“In the real world of counterterrorism, there is much to be said for the extraordinary insights that one might obtain by seeing the communications from one intelligence target to another.”
— Professor Charles Dunlap Jr.

“We have only the order itself, not the application that underlies it, but we have a hard time imagining the application that could have produced it,” Bennett wrote in a Lawfare post co-authored with Brookings Senior Fellow Benjamin Wittes.

Even if the government can argue that “the giant ongoing flood of data from the telecommunications companies” met the Section 215 standard of being relevant to an authorized investigation — perhaps of Al Qaeda or another international terrorist network, they wrote — “how is it possible that all calls between, say, a Washington, D.C. restaurant and its fish supplier are ‘relevant’ even to such a broad investigation?”

Conceding that a massive, catch-all data set might yield information relevant to a national security investigation if algorithmically analyzed, they stated the problem this way: If that constitutes relevance for the purposes of Section 215, “then isn’t all data relevant to all investigations?”
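The circularity they describe follows from how analysts actually query such a data set: "contact chaining," expanding outward from a seed identifier by hops. A minimal sketch of the idea follows; the phone numbers, record layout, and function names here are all invented for illustration and reflect nothing about the NSA's actual systems:

```python
from collections import defaultdict

# Each metadata record is (caller, callee, timestamp) -- no call content.
# All numbers are fictional.
records = [
    ("202-555-0101", "202-555-0199", "2013-05-01T09:14"),  # restaurant -> fish supplier
    ("202-555-0199", "202-555-0142", "2013-05-01T10:02"),
    ("202-555-0142", "202-555-0177", "2013-05-02T16:40"),
    ("202-555-0101", "202-555-0123", "2013-05-03T08:55"),
]

def contact_chain(records, seed, hops):
    """Return every number reachable from `seed` within `hops` calls."""
    graph = defaultdict(set)
    for caller, callee, _ in records:  # a call links both parties
        graph[caller].add(callee)
        graph[callee].add(caller)
    frontier, reached = {seed}, {seed}
    for _ in range(hops):
        frontier = {n for m in frontier for n in graph[m]} - reached
        reached |= frontier
    return reached

print(sorted(contact_chain(records, "202-555-0101", 2)))
```

Run two hops from the restaurant's number and its supplier's other contacts are swept in; raise the hop count on a realistically dense call graph and nearly every number becomes "relevant," which is exactly the problem Bennett and Wittes pose.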

More than a year later, Bennett’s question regarding limiting principles is at the heart of a debate over the size and scope of government surveillance that has sharply divided the nation’s legal and political institutions — to say nothing of public opinion. The president and Congress have opened multiple inquiries into the NSA program, while at the same time reminding Americans of the need for vigilant monitoring of suspected enemies of the state. Meanwhile, articles based on Snowden’s leaks have exposed additional secret programs for monitoring everything from e-mail and text messaging to Web browsing and video chats.

The proliferation of technology in our daily lives enables collection of massive amounts of data about individuals and their activities, information that can be of great use for tracking people and groups who pose a threat. But the tension between Americans’ need for national security and their expectation of privacy as a basic right has never been higher.

For the intelligence community, the ability to collect and consolidate large data sets from many different sources offers the promise of disrupting terrorist plots, uncovering weapons transfers, and thwarting cyber-attacks.

“In the real world of counterterrorism, there is much to be said for the extraordinary insights that one might obtain by seeing the communications from one intelligence target to another,” says Professor Charles Dunlap Jr., executive director of the Center on Law, Ethics and National Security and the former deputy judge advocate general of the United States Air Force. “A picture can be assembled from many disparate pieces of information in an unprecedented way, and this can be invaluable in tracking down those who want to harm us.”

But if such details as the duration of calls and the identification of originating and receiving telephone numbers offer promising intelligence, they also hold potential threats.

Big data sets acquired by mining average Americans’ daily use of computers and mobile devices are likely to contain large amounts of non-relevant information, along with information that has legitimate foreign intelligence value — like the fish supplier’s phone records. That’s unacceptable to civil libertarians and other critics of government overreach, who view the digital harvesting as an excessive violation of privacy and of Fourth Amendment protections. And new tools that enable capture of biometric, genetic, and cognitive data raise even thornier issues for the future.

“9/11 was a horrible event that we don’t want repeated, and the battlefield of counterterrorism is intelligence-gathering,” says Christopher Schroeder, the Charles S. Murphy Professor of Law and Public Policy Studies and co-director of the Program in Public Law.

But, notes Schroeder, who grappled with the contemporary implications of surveillance and data-gathering as assistant attorney general in the Office of Legal Policy (OLP) in the U.S. Department of Justice from 2009 to 2012, the public debate over the government’s surveillance programs is necessary and important.

“One thing the Snowden revelations revealed is that once programs like the one the NSA is running are pulled out into the light of day they are a lot harder to defend,” he says.

A legal landscape in disarray

At an April forum on national security moderated by Bennett and co-sponsored by Duke Magazine and Duke Law Magazine, Schroeder set out the legal landscape under which the NSA metadata collection program operates and which is now under close scrutiny in public and in the courts. While individuals enjoy a “reasonable expectation of privacy” in the content of their calls, he explained, metadata has not been similarly protected since 1979, when the Supreme Court considered information generated by a “pen register” — a device used to log all outgoing calls from a specific phone.

“The Supreme Court said that you or I have no reasonable expectation of privacy with respect to that pen register information and other information of what phones we were calling, essentially because we gave it up when we gave it to the phone company,” he said of the Court’s ruling in Smith v. Maryland. “So under existing Fourth Amendment constitutional law, protections do not extend to non-content data stored by third parties.” This “third-party doctrine” encompasses not just metadata that has been the subject of the NSA program, but all comparable information generated by e-mail and social media and anything stored on web pages or on servers maintained by companies like Google or Microsoft.

“That’s the legal landscape in which the NSA stands up this massive metadata collection program beginning in the early 2000s,” Schroeder told his Duke audience, saying that government lawyers reviewing the program at its inception likely thought the constitutional dimensions of the issue were fairly straightforward and settled.

“Any protections available under current law for privacy interests must be enacted by statute,” he says in an interview. “And the last significant piece of federal legislation to address this kind of non-Fourth Amendment protected data was in 1986, the Stored Communications Act,” which sets a lower standard, “relevance,” for getting court approval to access non-content metadata.

Now that statute — and the NSA programs — are facing federal court challenges. Late last year, two opposing District Court rulings starkly framed the issues that will almost certainly land before the U.S. Supreme Court — and threw the legal landscape pertaining to government surveillance into disarray.

On Dec. 16, Judge Richard J. Leon of the U.S. District Court for the District of Columbia ruled in favor of plaintiff Larry Klayman T’73 in his challenge to the NSA’s bulk telephone metadata collection, taking special notice of the scope and duration of the program. “No court has ever recognized a special need sufficient to justify continuous, daily searches of virtually every American citizen without any particularized suspicion,” Leon wrote in his decision in Klayman v. Obama.

“Records that once would have revealed a few scattered tiles of information about a person now reveal an entire mosaic — a vibrant and constantly updating picture of the person’s life.”
— Judge Richard J. Leon, Klayman v. Obama

In his examination of advances that facilitate data collection by private and government actors, Leon cited the Supreme Court’s 2012 ruling in U.S. v. Jones. He observed that the NSA was collecting information that could eventually be compiled into a detailed profile of an individual’s family connections and political, professional, religious, and sexual leanings.

“Records that once would have revealed a few scattered tiles of information about a person now reveal an entire mosaic — a vibrant and constantly updating picture of the person’s life,” he wrote. “Whereas some may assume that these cultural changes will force people to ‘reconcile themselves’ to an ‘inevitable’ ‘diminution of privacy that new technology entails,’ Jones, 132 S. Ct. at 962 (Alito, J., concurring), I think it is more likely that these trends have resulted in a greater expectation of privacy and a recognition that society views that expectation as reasonable.”

On Dec. 28, Judge William J. Pauley ’77 of the U.S. District Court for the Southern District of New York declared the NSA’s program legal in ACLU v. Clapper. “There is no evidence that the government has used any of the bulk telephony metadata it collected for any purpose other than investigating and disrupting terrorist attacks,” wrote Pauley, a member of the Duke Law Board of Visitors.

In his opinion, Pauley observed that a bulk database of calls could have helped the NSA fill in missing — and misinterpreted — information after it intercepted calls made by a 9/11 hijacker based, unbeknownst to the agency, in San Diego. That intelligence failure, he wrote, spurred a number of counter-measures that cast a wide net to “find and isolate gossamer contacts among suspected terrorists in an ocean of seemingly disconnected data.” He acknowledged the potential for the misuse of metadata, however.

“This blunt tool only works because it collects everything. Such a program, if unchecked, imperils the civil liberties of every citizen,” Pauley wrote. Still, his task was only to rule on the legality of the bulk telephone data collection, he added, acknowledging the robust debate on the subject in public, in Congress, and in the executive branch. “[T]he question of whether that program should be conducted is for the other two coordinate branches of Government to decide.”

Yet regulation has lagged far behind our technological capabilities, says Schroeder, who reviewed rules regarding surveillance and data-gathering during his OLP service.

Schroeder believes it’s the job of Congress to set limits on what the security and law enforcement agencies can do, but so far, he points out, lawmakers have seemed to favor appeals to national security interests.

 
“There is no evidence that the government has used any of the bulk telephony metadata it collected for any purpose other than investigating and disrupting terrorist attacks.”
— Judge William J. Pauley ’77, ACLU v. Clapper

U.S. v. Jones: Moving beyond the beeper

Nearly a year and a half before the word “metadata” entered common parlance thanks to the Snowden leaks, U.S. v. Jones — cited in both the Leon and Pauley rulings — put the constitutionality of high-tech surveillance before the Supreme Court. In their concurring opinions in Jones, Schroeder points out, five justices expressed “an active interest” in revisiting the third-party doctrine in light of emerging technology.

Stephen C. Leckar ’73, who argued the case on behalf of respondent Antoine Jones, a Washington, D.C. nightclub owner sentenced to life after being convicted on drug trafficking charges, knew early on that the case would be groundbreaking: the key evidence against his client came from four weeks of constant tracking by a GPS device that police installed on his car without a search warrant.

“Coming into it, what struck me as very troublesome about the case was that it seemed most at odds with the Fourth Amendment to say that you could take someone’s property and effectively use it to spy on him or her without first making sure that you had the permission of the federal magistrate judge or state judge,” says Leckar, a solo practitioner who usually focuses on federal commercial litigation and complex criminal appeals. “The notion of using someone’s property to do a dragnet search of every place they’ve been for any significant period of time was troubling.”

In 2010, an appellate court overturned Jones’ conviction, holding that his “reasonable expectation of privacy” had been violated by the GPS tracking. The government appealed to the Supreme Court. With the help of Walter Dellinger, the Douglas B. Maggs Professor Emeritus of Law, Leckar briefed and argued the case.

During oral argument, Deputy Solicitor General Michael Dreeben ’81 asserted that GPS tracking closely resembled the visual and beeper surveillance that, according to the Court’s 1983 ruling in United States v. Knotts, does not infringe the Fourth Amendment if the vehicle being followed is traveling on public roads. Chief Justice John Roberts challenged the government’s assertion, and the distinction proved to be key.

“The technology is very different, and you get a lot more informa­tion from the GPS surveillance than you do from following a beeper,” the chief justice said.

In holding, 5-4, that installing and using a GPS tracking device to extensively monitor the movements of a suspect’s car over a protracted period of time constituted a search under the Fourth Amendment and, as such, required a warrant, the Court signaled its willingness to consider how changing technology affects our understanding of a reasonable expectation of privacy; Justice Sonia Sotomayor expressly suggested it might be time to revisit the third-party doctrine. The Jones decision also challenged precedent in Smith v. Maryland.

“Justice Sotomayor said, in her concurring opinion, that we may be in a different world from when that case was decided,” says Lawfare’s Bennett.

When the Jones decision was announced, Dellinger hailed it as “a signal event in Fourth Amendment history,” and it was cheered by a coalition of supporters from across the political spectrum who had weighed in to support Jones’ position, including the American Civil Liberties Union, the Constitution Project, the libertarian Cato Institute, gun owners, independent truckers, a coalition of noted privacy scholars, and the Electronic Frontier Foundation. During a talk at Duke Law last October, Snowden’s lawyer, the ACLU’s Benjamin Wizner, went further, suggesting it “may turn out to be one of the most important Supreme Court cases of the last 50 years.” (See On the Record at Duke Law, Page 24.)

Leckar agrees with Justice Sotomayor that it’s time to reconsider the third-party doctrine for the digital age.

“When you have the capacity of the government to maximize, at marginal cost, information-gathering in a way that is so fine-grained, then there has got to be some protection,” he says. “No one expects a government agent to come looking for them in this way without a warrant. This is the type of abuse of official power that concerned the Framers. If Congress and the executive won’t step in, then the courts should.”

“When you have the capacity of the government to maximize, at marginal cost, information-gathering in a way that is so fine-grained, then there has got to be some protection. No one expects a government agent to come looking for them in this way without a warrant.”
— Stephen C. Leckar ’73

 

The end of safe harbor?

However ambiguous U.S. constitutional protection of personal information may be, Europeans see data privacy as almost sacrosanct. As a result, the Snowden leaks have reverberated across the ocean, raising concerns not just about security and diplomacy, but also the conduct of global commerce.

Recognizing how easily data crosses borders in the course of doing business, European leaders have long sought to ensure that foreign trading partners complied with their high standards for data protection. In 2000, the EU-U.S. Safe Harbor Agreement was ratified to safeguard the security of data traveling from Europe to the United States, where privacy protections are less stringent. And in March, partly in response to revelations about NSA surveillance, the European Parliament voted to implement an even more rigorous set of protections.

Sibylle Gierschmann LLM ’99, a partner and data compliance specialist at Taylor Wessing in Munich, says many of her U.S. clients, in particular, already find it challenging to comply with European data protection and retention laws.

“We have a ‘purpose-limitation principle,’ which means you may only hold data for as long as is required for the purpose for which you collected it in the first place,” says Gierschmann. “From the outset, you need to tell the data subject why you’re collecting the information, and you may not use it for other purposes later on.”

Gierschmann will be engaged in forging new rules for data collection and transfer. She fears that Europe could become uncompetitive if it adopts a unilateral and inflexible standard. “It’s no use for Europeans to say we are the strictest and we have a regulation with which everyone has to abide,” she says. “That’s not the way technology works globally. It’s not innovative.”

For David Hoffman ’93, director of security policy and global privacy officer for Intel, the challenges for American companies operating in foreign markets are clear.

“Our position is that safe harbor is an incredibly important document for trans-Atlantic economic progress,” says Hoffman, who manages a team that oversees legal support for privacy and security, compliance activities, and global public policy engagement. “We think it’s important for countries and organizations to be having conversations about what the right role is for the protection of personal data. But it’s important to keep the structure that we already have in place while we have those discussions.”

According to Hoffman, the controversy is actually spurring innovation at Intel, where security is becoming an essential aspect of design at the earliest stages of product development. “If you can include [privacy] early enough, then you truly get privacy by design,” he says. “My team works closely with Intel’s developers to get privacy included as part of the product statement up front.”

Gierschmann characterizes the broader challenge as being to “reach a global understanding of how much privacy we need and how much benefit certain data has.” She notes that many valuable applications can function using anonymous data. “A lot of big data applications have nothing to do with personalized data,” Gierschmann says, adding that even when big data is being mined for legitimate purposes, risks to the individual should be minimized by using only aggregated data or rendering the data anonymous at the earliest possible stage. She recalls a presentation by NSA officers investigating credit card fraud.

In the EU, “from the outset, you need to tell the data subject why you’re collecting the information, and you may not use it for other purposes later on.”
— Sibylle Gierschmann LLM ’99

“They argued that without being able to use big data applications, they would not have been able to find these credit card abusers — a crime that’s bigger than drug [trafficking].”
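Gierschmann's prescription, aggregating or anonymizing at the earliest possible stage, can be sketched in code. The example below shows pseudonymization applied at ingestion, a weaker cousin of true anonymization (whoever holds the salt can still re-link the tokens); the record layout, field names, and salt are all invented for illustration:

```python
import hashlib

# Hypothetical transaction records; the layout and values are invented.
transactions = [
    {"card": "4111-1111-1111-1111", "merchant": "M-204", "amount": 82.50},
    {"card": "4111-1111-1111-1111", "merchant": "M-204", "amount": 79.00},
    {"card": "4222-2222-2222-2222", "merchant": "M-781", "amount": 81.25},
]

def pseudonymize(record, salt):
    """Swap the raw card number for a salted hash at ingestion, before
    the record reaches any fraud-analysis pipeline. The same card always
    maps to the same token, so spending patterns survive."""
    token = hashlib.sha256((salt + record["card"]).encode()).hexdigest()[:12]
    return {**record, "card": token}

safe = [pseudonymize(t, salt="per-deployment-secret") for t in transactions]
# Fraud analysis (e.g., repeated similar charges on one card) still works
# on `safe`, but raw card numbers never enter the analysis data set.
```

The design choice is that pattern-detection only needs identifiers to be consistent, not readable, so the raw data can be stripped before anyone analyzes it.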

Gierschmann attributes Germans’ inherent suspicion of government surveillance to the memory of Nazism and the East German Stasi. “That’s our historical fear when it comes to massive data collection that might later on be used for purposes not made transparent at the time of collection or that are collected secretly without knowledge of the individual. It’s a Big Brother fear.”

Surveillance for a brave new world

Gierschmann’s reference to George Orwell’s Big Brother goes beyond a passing metaphor for Margaret Hu ’00, who points out that 1984 has influenced U.S. law, having been cited by the justices in U.S. v. Jones.

A scholar whose work lies at the intersection of immigration policy, national security, cybersurveillance, and civil rights, Hu draws on a decade of service in the Civil Rights Division of the U.S. Department of Justice that began on Sept. 10, 2001.

Last spring, an Indiana Law Journal article she wrote while a visiting assistant professor of law at Duke entitled “Biometric ID Cybersurveillance” garnered unexpected attention from outside of academia, resulting in more than 2,000 downloads from the Duke Law Scholarship Repository.

Examples of such biometric data include digital photos for facial recognition technology, fingerprint and iris scans, and DNA. In themselves, the techniques aren’t controversial, but Hu explored how the collection of this data is becoming an inescapable, routine part of our lives, and how government agencies are eager to capture that information for purposes ranging from immigration control to day-to-day law enforcement activities.

Hu, who is now an assistant professor of law at Washington and Lee University, writes that “emerging biometric cybersurveillance technologies, and mass biometric data collection and database screening, are adding an entirely new and unprecedented dimension to day-to-day bureaucratized surveillance.”

Immediately after the 9/11 attacks, Hu was assigned to a post-9/11 backlash discrimination task force in the Civil Rights Division and, by 2006, she was elevated to a senior management post that focused on immigration policy. Over time, she observed how the department’s counterterrorism and immigration objectives began to merge through data surveillance.

The questions for Hu started with a congressional proposal to modernize the Social Security card. “Why is the government proposing a DNA-based electronic Social Security card?” she asks. “And what are the cybersurveillance implications of that proposal?”

The stakes are high, Hu says, because the government’s analytical tools, coupled with the huge mass of available data, pose unprecedented threats to our Fourth Amendment protections.

“Based on the digital identity of who we are, they can seize and search our identity. It’s not a search and seizure that we’ve previously conceptualized. … This is a violation that is just as — if not much more — intrusive than the search and seizure of your diaries.”
— Margaret Hu ’00

“Based on the digital identity of who we are, they can seize and search our identity. It’s not a search and seizure that we’ve previously conceptualized. It’s not about someone searching your car, your house, or your diaries. This is a violation that is just as — if not much more — intrusive than the search and seizure of your diaries.”

A “siege against cognitive liberty”

And as technology begins to erode the divide between mind and body, surveillance is not far behind, says Professor Nita A. Farahany ’04, whose recent scholarship has focused on the constitutional issues that arise when machines can reach inside our brains.

Since the Supreme Court’s 1966 decision Schmerber v. California, which held that involuntarily produced blood samples in a drunk-driving case did not violate the Fourth and Fifth Amendments, courts have held to a mind-body distinction in matters of self-incrimination. But Farahany, who holds a secondary appointment in the Department of Philosophy and directs the Duke Science & Society initiative, says the debate needs to take account of recent breakthroughs in neuroscience.

In a recent article, Farahany describes the use of a brain-based polygraph test in the murder trial of a woman in India. The woman was convicted of killing her fiancé after investigators fitted her with an electrode cap and read her a series of statements while a machine analyzed her neural responses.

“The court placed great weight on the difference that emerged between these sets of measurements,” Farahany writes. “The software algorithm that interpreted the EEG signals, it reasoned, effectively divined her answers to the underlying (but technically unasked) questions of guilt that the declarative statements were designed to stir.”

In another article, Farahany refers to the coming “siege against cognitive liberty,” also the subject of a book she’s writing.

“There’s something more than mental privacy that’s at stake,” she says. “It’s a liberty interest — a broader interest that includes ideas like freedom of thought, conscience, the right to self-determination, and autonomy. It’s about trying to recognize that things people are uncomfortable with [indicate] a kind of intuition that there is a realm of privacy that goes beyond privacy of information.”

Far from intentional, the “siege,” she suggests, is an inevitable consequence of the era of big data. Yet as threatening to civil liberties as these technologies may turn out to be, she questions whether they will raise any constitutional issues at all.

“It doesn’t seem like the Constitution, as currently written, protects us from intrusions of information,” says Farahany, who serves on the Presidential Commission for the Study of Bioethical Issues. “The way it’s currently interpreted is that there is no protection against invasion of privacy or intrusion against personal information. Information might just be information and there’s nothing personal about it.”

The ongoing search for limiting principles

In a January speech, acknowledging public outcry, President Barack Obama promised several modest reforms to the bulk collection of telephone metadata, including requiring the records to remain in private hands, rather than being housed within the NSA, and requiring the NSA to obtain authorization for specific searches, rather than having blanket access. In March, following recommendations for further-reaching reforms by two expert panels, the president yielded further, announcing that he was sending legislation to Congress to end the program entirely.

Although the details still need to be worked out, Schroeder says it’s likely to produce a situation “less objectionable” to the public at large. “It aligns more closely with traditional modes of data inquiry that government agencies have undertaken, where they don’t try to amass data themselves but try to seek data that exists in the commercial world for other reasons,” he says. “Ideally, it would be retained only for the amount of time phone companies claim it is needed for business purposes.”

“It is very hard to calibrate, in terms of costs and benefits, the various counterterrorism measures the government is taking or could take. Dialing back on such measures risks being unable to prevent some otherwise preventable act of terrorism.”
— Professor Christopher Schroeder

In late May the U.S. House of Representatives passed a bill, dubbed the USA Freedom Act, designed to prohibit some forms of bulk collection, among other things by insisting that the government employ specific search terms in seeking phone and other business records from communications companies. Since then, civil liberties groups and government officials have disagreed about the degree to which the House bill would limit NSA data collection. In late June, as this issue of Duke Law Magazine was heading to press, the House passed an amendment to a defense appropriations bill that would, in part, bar the NSA from conducting warrantless searches of Americans’ communications within the data it collects on foreign targets.

At the April forum on national security, Schroeder praised the high standards of integrity under which members of the national security community operate and acknowledged the difficulty of keeping America safe.

“The national security community takes their job seriously,” he said. “It is very hard to calibrate, in terms of costs and benefits, the various counter-terrorism measures the government is taking or could take. Dialing back on such measures risks being unable to prevent some otherwise preventable act of terrorism. Yet when John Kerry, the presidential candidate, suggested that it may be impossible to prevent all such acts, he was pilloried for saying we might have to tolerate some risk of terrorism.”

However productive the recent public conversation has been on the proper calibration between privacy and national security, he says, Snowden’s leaks were illegal. “He ought not to be rewarded for the fact that the conversation we’re having after the disclosures is having some positive effect. That would set a dangerous precedent for everyone else, that people with access to secure information could use their own judgment as to the merits of revealing certain aspects of programs that are instrumental in keeping the rest of us safe.”

Dunlap also is emphatic in saying that Snowden has “grievously harmed” U.S. national security. “He compromised techniques and methodologies that will be extremely costly and difficult to replace.”

Still, a culture of even benign surveillance has a chilling effect, Dunlap says, that can discourage creativity and innovation. “It actually changes the way people think and communicate, in ways that I think will suppress what is really productive about a free society, and that’s the ability to look at the world with a blank slate and come up with new and different ways of thinking.

“I think it’s important for people to have the opportunity to develop their ideas and bounce them off their confidantes — in privacy — before the world knows about them.”


On the record at Duke Law: "Something to Hide: New Technology, Dragnet Surveillance, and the Future of Piracy"

“[H]ow does a democratic government deal with accountability over secret programs? That question is fundamental. If we think that any government activity is legitimately secret, there is going to have to be a way to create an oversight mechanism that doesn’t require the public knowing everything and then getting to deal with it through voting in elections. We’re going to need to have intelligence committees in Congress that essentially fulfill that oversight function for us. We’re going to need to have courts, like the Foreign Intelligence Surveillance Court, that fulfill that oversight function for us.
“What we’ve seen, and what the Snowden revelations have really laid bare, is that those oversight mechanisms completely failed here. … First and foremost, they failed because the amount of information that’s classified, and the subject of those top-secret stamps, massively exceeds any legitimate government secrecy interests. … [W]e have … a secret court that has effectively functioned as a rubber stamp because it’s one-sided. It hears only from the government, … from national security officials who have one interest. I don’t blame them. It’s their job to get at this information. It’s not their job to make the other side’s arguments.
“And in Congress, you have intelligence committees that have been really captured. They are captured by the agencies that they are supposed to be regulating, and they’re captured by the companies that fund their campaigns, that stand to gain the most from their appropriations. … The people on the intelligence committees who are the ones who should be most skeptical of what the national security state is doing are, in fact, the ones who are most captured by what the national security state is doing.

“… I do think that reforms to the FISC are likely to happen and should be supported. I don’t think it’s a cure-all, but I do think that having someone not on the court in the room (whose job it is to present the argument on the other side) will invariably have an effect, even if it just affects what arguments the government is willing to bring into that process.”
- Benjamin Wizner
Director of the ACLU’s Speech, Privacy & Technology Project
 

On the record at Duke Law: "Federal Courts and National Security"

“When I saw these statistics that [the Foreign Intelligence Surveillance Court is] approving 99 or whatever percent of the cases that come before us … I said I know that’s not a reflection of my experience, and I’ve been on the court for seven years. What I told my staff to do when those reports started to come out following the Snowden leak was … to start to collect our own statistics, because the statistics that you get are statistics from the Justice Department, and those are only applications when a final application is submitted to the court and rejected or approved. That’s the only statistic being captured. Well, there are a lot of things that are happening before that occurs. …
“There’s a lot of interchange that takes place between the Justice Department lawyers and our lawyers, and many times those cases don’t go any further. That, however, does not count as a rejection. Or, there may be as a result of those discussions significant alterations of what the government is seeking from the court. They may be requesting to target … seven or eight phone numbers, for example. We may only approve two. But that doesn’t count as five rejections.
“… [T]here’s another review by the judge based upon input from our lawyers. Again, we may decide that this is a problem case and therefore should not go forward. What the government may do at that point is pull that case and not seek to go forward with a final version. Again, that does not count as a rejection, and the same is true if we modify that application significantly — that doesn’t count as a rejection.
“Our [own] statistics show that about 24 percent of the applications submitted to us are either altered in some way [in] a significant manner, substantively, or … the government does not seek to go forward. So this whole myth about the FISA court being a rubber stamp is just not correct. … I think we do our job and I think we do it diligently and I’m very proud of my tenure on that court.”
 
- Judge Reggie B. Walton
U.S. District Court for the District of Columbia & former presiding judge of the U.S. Foreign Intelligence Surveillance Court
 

On the record at Duke Law: Keynote address at the Center for Law, Ethics and National Security's 20th spring conference

“[C]onsideration of the appropriate role of big data in our democratic society reflects the changing nature of privacy in the information age. Today we all share an enormous amount of personal and sensitive data with private companies and, in many instances, with the entire Internet-connected world. And yet we still want this information to be private for many purposes.
“… If we’re concerned about how government can use data, let’s craft sensible limits on the way we can use that data, what purposes we can use it for, who can use it — so that we can all be more confident that we’re protecting both national security and privacy. At the end of the day, we need to determine whether there’s a system of controls and oversight that can be put in place to give us comfort that our intelligence services are using big data appropriately.
“… We can impose strict limits on the purposes for which the data could be used. We could impose limits on the types of analytical tools that could be used against it and how information could be disseminated to others.
“We could set up approval and review processes to ensure that these limits are adhered to, and we could use technological tools of the type that now exist to restrict and monitor access to the data to further enforce the restrictions.
“We could place limits on how long data could be stored in our databases, and we could use the existing compliance and oversight framework to proactively discover mistakes and quickly fix them.

“If this approach sounds familiar, it’s because it’s exactly what the intelligence community already does. In fact, it’s pretty much the approach that we took with respect to the bulk telephone metadata program. … This regulatory framework goes far beyond the controls that most, if not all, of the private sector has with respect to its use of information. And while the NSA has had well-publicized technical challenges in implementing the bulk telephone metadata program … these problems were self-identified, self-reported, and self-remedied, thanks to the NSA’s robust compliance and oversight system.”
 
- Robert S. Litt
General counsel, Office of the Director of National Intelligence
 

Surveillance and the First Amendment

Although much of the litigation regarding NSA data collection and surveillance focuses on Fourth Amendment concerns, David Greene ’91 is engaged in a First Amendment federal court challenge to the metadata collection program brought by a Los Angeles church.

The plaintiffs in First Unitarian Church of Los Angeles v. NSA are making a freedom of association argument, says Greene, a staff attorney for the Electronic Frontier Foundation, which has been active in cases related to NSA surveillance since 2006.

“All of our clients in First Unitarian are organizations that have an interest in keeping membership lists — and records of people who associate with them on the telephone — confidential from the government,” he says. “Often these are hotlines to give confidential advice, or for people who work in politically sensitive areas where they might not want the government to know they’ve been talking to them. All of these groups should be able to shield their associations from the government.”