
The Webs We Weave

Lie detection has never been straightforward.
April 21, 2009

Methods for detecting lies have been around for as long as people have been telling them. There is something comforting in the notion that even the most skilled liar will unconsciously betray himself by some subtle cue: a reddening of the ears, a fidgeting of the hands, an uncontrollable shift of the eyes. But attempts to turn the art of lie detection into a science have always been controversial.

Nervous, pal?: A man in Greensboro, NC, being given a polygraph test in 1962

Although an automated version of psychologist Paul Ekman’s system for analyzing facial microexpressions to detect deceit may soon be deployed in counterterrorism settings (see “Telling”), and at least two companies are now marketing what they claim is a superior lie detection test based on functional magnetic resonance imaging (fMRI), the polygraph remains the best-known and most widely used lie detector nearly a hundred years after its invention. The polygraph is based on the principle that lying causes a physiological response in the teller that can be reliably measured by a machine and interpreted by a trained technician. In January 1981, a TR report on their then-prevalent use by employers described a typical examination:

The device typically monitors changes in breathing, blood pressure, pulse rate, perspiration, and electrical conductivity of the skin as the subject is asked a succession of questions by the polygraph operator.

The first questions, called “control questions,” are designed to elicit a deceptive statement from the subject and hence yield an example of a deceptive response. …Next, the polygraph operator poses questions relevant to the reason for the examination. … The subject is generally considered to be lying when the observed responses to relevant questions are similar to the subject’s reactions to control questions. The subject is deemed truthful when the responses to relevant questions are of lesser magnitude than those to control questions, and when the pattern of reaction resembles truthful responses to irrelevant questions.
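The comparison logic the examiner applies can be sketched as a simple decision rule. This is an illustrative toy only: the single "magnitude" score and the similarity comparison are hypothetical stand-ins for a trained operator's judgment over multiple physiological channels, not part of the historical test.

```python
def score_subject(relevant: float, control: float, irrelevant: float) -> str:
    """Toy sketch of the control-question comparison described above.

    Each argument is a hypothetical response magnitude (e.g., some combined
    score of breathing, pulse, and skin-conductance changes) for that class
    of question. The thresholds and scoring are illustrative assumptions.
    """
    # Reactions to relevant questions that match or exceed reactions to
    # control questions are read as deception.
    if relevant >= control:
        return "deceptive"
    # Weaker relevant reactions whose pattern is closer to the subject's
    # truthful responses to irrelevant questions are read as truthful.
    if abs(relevant - irrelevant) < abs(relevant - control):
        return "truthful"
    # Anything in between yields no clear call.
    return "inconclusive"
```

The sketch makes the report's point concrete: the verdict rests entirely on a relative comparison within one subject's responses, which is why both an unusually calm liar and an anxious truth-teller can defeat the measure.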

Questions about the validity of the polygraph have dogged it since its introduction, yet the device quickly gained popularity in the United States. Although a 1923 Supreme Court decision generally barred the use of polygraph results as evidence in court, the technology was widely used by both police and private investigators. Its use in the workplace was the most controversial application.

In recent years, the American public has expressed growing concern over the use of the polygraph, or “lie detector,” for the selection and management of workers in industry and government. …

The polygraph has become attractive to private industry because it is fast and cheap. … Employers use the polygraph primarily to curb employee theft. … Polygraphs are also used to verify employment applications and to assess periodically employee honesty, loyalty, and adherence to company policy.

Criticism of its reliability as a screening tool, combined with official outrage over President Ronald Reagan’s attempt to plug leaks of classified information through polygraph examinations of all civil servants with security clearances, led Congress to pass the Employee Polygraph Protection Act of 1988, which prohibits private companies from requiring employees to submit to the test. Government employees at the federal, state, and local level are exempt from the act, however, as are contractors in security-sensitive positions; the number of federal polygraph programs has been on the rise in recent years. Critics argue that expanding the use of polygraphs in counterintelligence is especially dangerous, because the test can be beaten. Aldrich Ames, for example, passed multiple polygraph tests while spying for the Soviet Union in the 1980s and ’90s.

But even assuming that a lie detector were perfectly accurate, its use would raise profound ethical questions. As the TR report concluded:

The use of a machine to “detect” lies is, arguably, inappropriate and impractical. More seriously, it violates our society’s cherished ideals of individual privacy and civil liberties. As former Senator Sam Ervin opined in 1974, “If the right to privacy means anything at all in our society, it means that people are entitled to have thoughts, hopes, desires, and dreams beyond the grasping reach of a bureaucrat, an employer, or an electronic technician.”
