Protecting information has become one of the most important tasks in modern society. Many people have become used to memorizing passwords and PINs, sometimes of bewildering complexity. Others use biometric indicators for protection: fingerprints, irises, and the like can all help identify individuals.
But these systems are not perfect. One significant problem is the threat of coercion—being forced to reveal a password or place a finger in the fingerprint scanner.
Today, Max Wolotsky at Cal Poly Pomona and a couple of pals have come up with a solution that can determine whether an individual is being coerced and deny authentication as a result.
The system is simple in concept. Wolotsky and co's idea is to use the body's stress levels to determine whether an individual is being coerced in any way. And they do this by measuring the individual's response to "chill" music they previously identified as relaxing.
Chill music is so called because it provokes a shiver down the spine, a response similar to feeling cold. It is the physiological effects of this shiver that Wolotsky and co set out to measure by monitoring heartbeat and brain-wave patterns.
Their hypothesis is that these signals are impossible to fake and only possible to measure when the subject is relaxed. Any duress would result in a different signal.
To find out whether this is the case, the team asked five test subjects to choose their favorite piece of chill music and then monitored their heartbeat and brain waves while they listened.
In particular, the team focused on the moments within the music that trigger the "chill" response, on the assumption that this always occurs at the same point in the score. This section of the music, less than a minute long, then becomes the key to the authentication process.
The idea is that if the subject is relaxed, he or she can experience the “chill” in the future and reproduce the physiological signals associated with this.
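In outline, this is a biometric template-matching scheme: enroll a physiological signal recorded while relaxed, then accept a fresh recording only if it closely matches the template. The sketch below illustrates that idea with a simple cosine-similarity check; the actual Chill-Pass system uses machine-learning classifiers on heartbeat and EEG features, so the similarity measure, threshold, and function names here are illustrative assumptions only.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length signal vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(enrolled, fresh, threshold=0.9):
    """Accept only if the fresh recording closely matches the
    enrolled relaxed-state template (threshold is illustrative)."""
    return cosine_similarity(enrolled, fresh) >= threshold

# Synthetic demo: a "relaxed" recording resembles the enrolled
# template, while a "stressed" one produces a different waveform.
template = [math.sin(0.1 * i) for i in range(100)]
relaxed = [x + 0.05 for x in template]          # small deviation
stressed = [math.cos(0.1 * i) for i in range(100)]  # different signal

print(authenticate(template, relaxed))   # → True
print(authenticate(template, stressed))  # → False
```

Under coercion, the hypothesis goes, the subject's signals would drift far enough from the relaxed-state template that the match fails and access is denied.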
Indeed, the team carried out a number of tests and found that their subjects were able to pass the test with a 90 percent success rate.
There are some caveats, of course. The team was unable to test its subjects' response under any kind of stress to simulate the kind of coercion that this test is designed to foil. "One reason we did not do this is because it is unethical to threaten test subjects in order to verify that our system is fully coercion-resistant, as it could leave subjects with permanent physical or psychological damage," they say.
That’s a significant limitation. If the team hasn’t checked that it works in the conditions it is designed to operate under, how can it be sure it is secure? There are other potential problems, too. The information that might benefit from this kind of increased protection is likely to be hugely valuable, things like the launch codes for nuclear weapons, perhaps. (One of the authors works at Sandia National Laboratories, which is responsible for nuclear stockpile management.)
But urgent access to this kind of information might only be necessary in times of high stress, and this could invalidate the test. The thought of somebody trying to access the launch codes as World War III unfolds, but having to chill out beforehand, has something of a black comedy about it.
Nevertheless, developing coercion-resistant passwords is an important goal. Wolotsky and co have taken some tentative steps that others can build on.
Ref: http://arxiv.org/abs/1605.01072: Chill-Pass: Using Neuro-Physiological Responses to Chill Music to Defeat Coercion Attacks