
But when senators asked why the Bush administration had sidestepped FISA's requirements at all, given that the law lets government agents wiretap first and seek warrants later, Gonzales claimed he couldn't elaborate for reasons of national security.

Former NSA director General Michael Hayden, who was in charge when the agency's surveillance program was initiated in 2002, was slightly more forthcoming. FISA wasn't applicable in certain cases, he told the senators, because the NSA's surveillance relied on what he called a “subtly softer trigger” before full-scale eavesdropping began. Hayden, now the nation's second-highest-ranking intelligence official as deputy director of national intelligence, said he could answer further questions only in closed session.

Gonzales's testimony that the government is making increased use of FISA, together with his refusal to explain why the law is inapplicable in some cases – even though retroactive warrants can be issued – implies that the issue isn't simply that government agents sometimes need to act quickly. FISA rules demand that old-fashioned “probable cause” be shown before the FISA court issues a warrant for electronic surveillance of a specific individual. Probable cause could not be shown if the NSA were engaged in automated analysis and data mining of telephone and e-mail communications to identify possible terrorism suspects, since no specific individual has been singled out until the analysis itself flags one.

As the Electronic Frontier Foundation's lawsuit against AT&T reveals, the NSA has access to the switches and records of most or all of the nation's leading telecommunications companies. These companies' resources are extensive: AT&T's data center in Kansas, for instance, contains electronic records of 1.92 trillion telephone calls spanning several decades. Moreover, most international telecommunications traffic now travels by undersea fiber-optic cable rather than by satellite, so many carriers route international calls through their domestic U.S. switches.

With the telecom companies' compliance, the NSA can now tap into those international communications far more easily than in the past, and in real time (or close to it). Given access to much of the world's telecom traffic, the NSA's supercomputers can digitally vacuum up every call placed on a network and apply an arsenal of data-mining tools. Traffic analysis, combined with social network theory, can reveal patterns indiscernible to human analysts and possibly suggestive of terrorist activity. Content filtering, which applies sophisticated search algorithms and statistical methods such as Bayesian analysis in tandem with machine learning, can search for particular words or language combinations that may indicate terrorist communications.
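To make those two techniques concrete, here is a purely illustrative Python sketch – not a description of any actual NSA system, whose methods are not public – showing how betweenness centrality over a toy call graph can surface a “go-between” number, and how a naive Bayes word score can flag messages whose wording resembles known examples. Every phone number, message, and keyword below is invented.

```python
# Illustrative only: toy versions of traffic analysis and Bayesian content
# filtering. All phone numbers, messages, and keywords are invented.
from collections import defaultdict
from math import log

import networkx as nx  # widely used open-source graph library

# --- Traffic analysis: which number bridges two otherwise separate calling circles? ---
call_records = [
    ("555-0101", "555-0102"), ("555-0101", "555-0103"), ("555-0102", "555-0103"),
    ("555-0201", "555-0202"), ("555-0201", "555-0203"), ("555-0202", "555-0203"),
    ("555-0103", "555-0999"), ("555-0203", "555-0999"),  # 555-0999 links both circles
]
graph = nx.Graph(call_records)
centrality = nx.betweenness_centrality(graph)  # how often a number sits on shortest paths
print(max(centrality, key=centrality.get))     # -> 555-0999, the likely go-between

# --- Content filtering: naive Bayes log-odds over the words of an intercepted message ---
flagged_examples = ["transfer the package tonight", "meet at the safehouse"]
normal_examples = ["transfer the money to savings", "meet for lunch tomorrow"]

def word_counts(docs):
    counts = defaultdict(int)
    for doc in docs:
        for word in doc.split():
            counts[word] += 1
    return counts

flagged, normal = word_counts(flagged_examples), word_counts(normal_examples)
vocab = set(flagged) | set(normal)

def log_odds(message):
    """Positive score: the wording is closer to the flagged examples than to the
    normal ones. Add-one smoothing keeps unseen words from zeroing out the score."""
    f_total = sum(flagged.values()) + len(vocab)
    n_total = sum(normal.values()) + len(vocab)
    return sum(
        log((flagged[w] + 1) / f_total) - log((normal[w] + 1) / n_total)
        for w in message.split()
    )

print(log_odds("package at the safehouse tonight"))  # positive -> flag for review
```

At the scale the article describes – trillions of call records – the same ideas would require massive distributed infrastructure and far more sophisticated models, but the basic logic of scoring connections and content would be similar in spirit.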

It has not been established whether the specific technologies developed under TIA and acquired by ARDA have actually been used in the NSA's domestic surveillance programs, rather than only for intelligence gathering overseas. Still, descriptions of the two former TIA programs that became Topsail and Basketball mirror descriptions of ARDA and NSA technologies for analyzing vast streams of telephone and e-mail communications. Furthermore, one project manager who worked on the TIA program before it was terminated has said on the record that, while TIA was still funded, its researchers communicated regularly and maintained “good coordination” with their ARDA counterparts.

It's this latter fact that is most to the point. Whether or not those specific TIA technologies were deployed for domestic U.S. surveillance, technologies very much like them were. In 2002, for instance, ARDA awarded $64 million in research contracts for a new program called Novel Intelligence from Massive Data. More broadly, a 2004 survey by the U.S. General Accounting Office, an investigative arm of Congress, found federal agencies operating or developing 199 data-mining projects, with more than 120 of them designed to collect and analyze large amounts of personal data on individuals in order to predict their behavior. Since the accounting office excluded most classified projects, the actual numbers were likely far higher.
