
The Technology That Toppled Eliot Spitzer

Anti-money-laundering software scrutinizes bank customers’ every move, no matter how small.
March 19, 2008

If there is a lesson from former New York governor Eliot Spitzer’s scandal-driven fall (aside from the most obvious one), it is this: banks are paying attention to even the smallest of your transactions.

Poetic justice: Former New York governor Eliot Spitzer’s intimate knowledge of the tools used to foil organized crime didn’t keep him from running afoul of his own bank’s anti-money-laundering software.

For this we can thank modern software, and post-9/11 U.S. government pressure to find evidence of money laundering and terrorist financing. Experts say that all major banks, and even most small ones, are running so-called anti-money-laundering software, which combs through as many as 50 million transactions a day looking for anything out of the ordinary.

In Spitzer’s case, according to newspaper reports, it was three wire transfers amounting to just $5,000 apiece that set alarm bells ringing. It helped that he was a prominent political figure. But even the most mundane activities of ordinary citizens are given the same initial scrutiny.

“All the big banks have these software systems,” says Pete Balint, a cofounder of the Dominion Advisory Group, which helps banks develop strategies for combating money laundering and fraud. “Depending on their volume, they might have thousands of alerts a month.”

Most of the systems follow fairly simple rules, looking for anomalies that trigger heightened scrutiny. Software company Metavante says that its software, for example, contains more than 70 “best-practice” rules, covering a wide variety of transaction types ranging from cash deposits to insurance purchases. The simplest rules might flag large cash transactions, or multiple transactions in a single day.
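The vendors’ actual rule sets are proprietary, but the two simplest rules mentioned above can be sketched in a few lines of Python. The thresholds here (a $10,000 line for a “large” cash deposit, five transactions in a day) are illustrative assumptions, not Metavante’s real parameters.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Transaction:
    account_id: str
    amount: float        # dollars
    kind: str            # e.g. "cash_deposit", "wire_transfer"
    posted: date

# Illustrative thresholds only; real vendor rules and cutoffs are proprietary.
LARGE_CASH_THRESHOLD = 10_000     # assumed definition of a "large" cash deposit
DAILY_COUNT_THRESHOLD = 5         # assumed cap on transactions in a single day

def flag_large_cash(txn: Transaction) -> bool:
    """Rule 1: flag any single cash deposit at or above the threshold."""
    return txn.kind == "cash_deposit" and txn.amount >= LARGE_CASH_THRESHOLD

def flag_daily_volume(txns: list[Transaction]) -> set[tuple[str, date]]:
    """Rule 2: flag (account, day) pairs with an unusually high transaction count."""
    counts: dict[tuple[str, date], int] = {}
    for t in txns:
        key = (t.account_id, t.posted)
        counts[key] = counts.get(key, 0) + 1
    return {key for key, n in counts.items() if n > DAILY_COUNT_THRESHOLD}
```

In a production system, checks like these would run against the day’s transaction feed and push alerts into a case-management queue rather than return values directly.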

In Spitzer’s case, the three separate $5,000 wire-transfer payments reported by the Wall Street Journal would likely have triggered one of the most obvious of these rules, without any recourse to more advanced capabilities.

Banks are constantly on the lookout for activity that seems to be an effort to break up large, clearly suspicious transactions into smaller ones that might fly under the radar, a practice called structuring. Spitzer’s transactions almost certainly fit that profile, says Dave DeMartino, a Metavante vice president. Newspaper reports have identified New York’s North Fork Bank, owned by Capital One, as Spitzer’s personal bank. A spokeswoman for the bank declined to identify which, if any, anti-money-laundering software the institution uses.
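A structuring check is also simple to state in code. The sketch below reuses the Transaction class from the earlier example and flags several payments that each stay under a reporting line but together exceed it within a short window. The $10,000 figure is the Bank Secrecy Act’s currency-reporting threshold; applying it to wire transfers, along with the seven-day window and minimum count of three, is an illustrative assumption rather than any bank’s actual rule.

```python
from datetime import timedelta

REPORTING_THRESHOLD = 10_000   # Bank Secrecy Act currency-reporting line, used here illustratively
WINDOW = timedelta(days=7)     # assumed look-back window
MIN_COUNT = 3                  # assumed minimum number of sub-threshold payments

def looks_like_structuring(txns: list[Transaction]) -> bool:
    """Heuristic: several payments, each under the reporting threshold,
    that together exceed it within a short window."""
    txns = sorted(txns, key=lambda t: t.posted)
    for i, start in enumerate(txns):
        cluster = [t for t in txns[i:]
                   if t.posted - start.posted <= WINDOW
                   and t.amount < REPORTING_THRESHOLD]
        if len(cluster) >= MIN_COUNT and sum(t.amount for t in cluster) >= REPORTING_THRESHOLD:
            return True
    return False
```

Three $5,000 wires inside a week would satisfy exactly this kind of check: each transfer is individually unremarkable, but together they cross the line the payments appear designed to avoid.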

But banks, and law enforcement, are also looking for things that they can’t predict and thus can’t write rules for.

“If you’re just writing scenarios, you aren’t going to find things that you didn’t know about,” says Michael Recce, chief scientist for Fortent, another prominent vendor of anti-money-laundering systems. “About 60 percent of the things our customers find are things they knew about. The rest are things they didn’t know about.”

The simplest way to identify the unexpected is by contrast to the routine. A person who deposits just two paychecks a month for two years might be flagged if he suddenly deposits six large checks in two weeks, for example.
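Measuring “sudden” against a customer’s own history is the simplest version of this. A minimal sketch, assuming a plain standard-score comparison; the scoring method and any cutoff are assumptions, not a vendor’s formula:

```python
from statistics import mean, stdev

def deposit_anomaly_score(monthly_counts: list[int], recent_count: int) -> float:
    """How far the latest month's deposit count sits above the customer's
    own historical average, in standard deviations."""
    mu = mean(monthly_counts)
    sigma = stdev(monthly_counts) or 1.0   # avoid dividing by zero for flat histories
    return (recent_count - mu) / sigma

# Two paycheck deposits a month for two years, then six deposits in one month.
history = [2] * 24
print(deposit_anomaly_score(history, 6))   # well above a typical alert cutoff
```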

But software packages also group customers and accounts into related “profiles” or “peer groups,” in order to establish more-general behavioral baselines. Some software might group together all personal checking accounts with an average balance of less than $15,000, or merchant accounts with turnover of less than $100,000 per month. Some might go deeper, grouping together all business accounts specifically tied to dry cleaners or consulting firms.

The most sophisticated software packages can sort people or accounts into several categories at once: a single customer might be compared to other schoolteachers; to people who bank mostly at a single regional branch; and to people who have stable, pension-based monthly incomes, for example.

Each category is analyzed to establish patterns of ordinary behavior. Every transaction by customers in these groups, along with patterns of transactions stretching back as far as a year, is then scrutinized for deviations from that norm, using measures such as the number, size, and frequency of transactions.
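Putting the last three paragraphs together: assign each account to a peer group, compute the same pattern measures for everyone in the group, and flag customers whose measures sit far outside the group’s norm. The group definitions below loosely follow the article’s examples, and the three-standard-deviation cutoff is an assumption.

```python
from statistics import mean, stdev

def peer_group(avg_balance: float, is_business: bool, industry: str = "") -> str:
    """Assumed peer-group assignment, loosely following the article's examples."""
    if is_business and industry:
        return f"business:{industry}"            # e.g. "business:dry_cleaner"
    if not is_business and avg_balance < 15_000:
        return "personal:checking_under_15k"
    return "other"

def peer_deviations(customer: dict[str, float],
                    peers: list[dict[str, float]],
                    z_cutoff: float = 3.0) -> list[str]:
    """Compare a customer's pattern measures (transaction count, total size,
    average gap between transactions, ...) against the same measures across
    the peer group; return the measures that look out of line."""
    flagged = []
    for measure, value in customer.items():
        values = [p[measure] for p in peers if measure in p]
        if len(values) < 2:
            continue
        sigma = stdev(values) or 1.0
        if abs(value - mean(values)) / sigma > z_cutoff:
            flagged.append(measure)
    return flagged
```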

Follow the money: The U.S. Treasury Department’s Financial Crimes Enforcement Network keeps track of all suspicious-activity reports filed by banks. Here’s the geographic distribution of reports filed by banks in New York between 1996 and 2006.

Whether a deviation is flagged will depend in part on a customer’s risk exposure score, a rating assigned by the bank according to the customer’s occupation, geographical location, and other personal details. A retired schoolteacher who has lived in the suburbs of Minneapolis her entire life might have a lower risk score than a 42-year-old import-export businessman from Sicily, for example. So-called politically exposed persons–customers such as politicians, top executives, and judges–will automatically receive a higher level of scrutiny.
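The risk score then acts as a multiplier on however the deviation is measured: the same anomaly that a reviewer never sees for a low-risk customer becomes an alert for a politically exposed one. A hedged sketch, with invented risk tables and an arbitrary cutoff standing in for a bank’s proprietary scoring model:

```python
# Invented, illustrative risk factors; real scoring models are proprietary.
OCCUPATION_RISK = {"retired_teacher": 1, "import_export": 4}
GEOGRAPHY_RISK = {"minneapolis_suburb": 1, "sicily": 3}
PEP_BONUS = 3            # extra weight for politically exposed persons
ALERT_THRESHOLD = 12.0   # assumed cutoff on the combined score

def customer_risk(occupation: str, location: str, is_pep: bool) -> int:
    return (OCCUPATION_RISK.get(occupation, 2)
            + GEOGRAPHY_RISK.get(location, 2)
            + (PEP_BONUS if is_pep else 0))

def raise_alert(deviation_score: float, risk: int) -> bool:
    """The same deviation triggers an alert sooner for a higher-risk customer."""
    return deviation_score * risk >= ALERT_THRESHOLD
```

With these made-up numbers, the retired schoolteacher (risk 2) needs a deviation score of 6 to trip an alert, while a politically exposed import-export customer (risk 10) trips one at a fraction of that.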

Every bank has a group of actual people who personally scrutinize transactions that have been flagged. The vast majority of alerts represent acceptable behavior, and nothing more is done. If the Minneapolis schoolteacher has sold her house, for example, the income will show as a clear deviation from her peer group’s norm. The human investigator will understand why and won’t pursue the matter any further.

“Banks do not want to be in the position of reporting on a customer without good reason,” says Ido Ophir, vice president of product management for Actimize, another large vendor of anti-money-laundering software. “They can’t just send in transactions that have no suspicious merits.”

However, if the human reviewers can’t explain away the activity, they will produce an official suspicious activity report (SAR), including a written narrative describing the transaction, and send it to the Internal Revenue Service and the Treasury Department’s Financial Crimes Enforcement Network (FinCen), the federal group responsible for administering the 1970 Bank Secrecy Act.
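The SAR itself is essentially a structured record plus that free-text narrative. A rough sketch of the kind of fields such a record carries; the field names here are illustrative, not the layout of FinCen’s actual form:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SuspiciousActivityReport:
    # Illustrative fields only; the official SAR form carries many more.
    filing_institution: str
    subject_name: str
    account_ids: list[str]
    activity_period: tuple[date, date]   # first and last suspicious transaction
    amount_involved: float
    activity_type: str                   # e.g. "structuring"
    narrative: str                       # the reviewer's written description
    supporting_alerts: list[str] = field(default_factory=list)
```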

Most SARs are ultimately reviewed by regional teams of investigators, drawn from the IRS, the FBI, the DEA, and the U.S. Attorney’s office. But the reports also go into a Bank Secrecy Act database, which is made available to authorized federal law-enforcement agencies. Agents can search for specific names, account numbers, and details, such as telephone numbers, to see if the subjects of their own investigations have raised any financial flags.

FinCen spokesman Steve Hudek says that automated pattern-analysis software also runs on the Bank Secrecy Act database, helping to spot patterns of activity or links between individuals that humans might miss. He declined to say which software or vendors FinCen uses, however.

As the software has gotten more sophisticated–and the government has applied more pressure to highlight suspicious activity–the number of SARs filed has risen sharply. In 2000, banks (as distinguished from securities firms or casinos) filed 121,505 SARs. In 2006, they filed 567,080, and as of last June, the most recent month for which figures are available, 2007 was on track to set a new record.

Technologists say that future software will be even better at spotting anomalies, analyzing customers’ social networks, tapping into the vast databases of information held by companies such as LexisNexis and ChoicePoint, and using that outside information to help make judgments about customer transactions.

This might be a privacy advocate’s nightmare, but it helps keep banks safe from fraud and regulatory fines.

“We’re getting to the problem of how to digest larger and larger amounts of information,” says Fortent’s Recce. “There is fundamentally an enormous amount of information, and people are trying to hide in it.”
