Protecting information has become one of the most important tasks in modern society. Many people have become used to memorizing passwords and PINs, sometimes of bewildering complexity. Others use biometric indicators for protection: fingerprints, irises, and the like can all help identify individuals.
But these systems are not perfect. One significant problem is the threat of coercion—being forced to reveal a password or place a finger in the fingerprint scanner.
Today, Max Wolotsky at Cal Poly Pomona and a couple of pals have come up with a solution that can determine whether an individual is being coerced and deny authentication as a result.
The system is simple in concept. Wolotsky and co’s idea is to use the body’s stress levels to determine whether an individual is being coerced in any way. And they do this by measuring the individual’s response to “chill” music they previously identified as relaxing.
Chill music is so-called because it provokes a shiver down the spine, a response that is similar to being cold. It is the physiological effects of this shiver that Wolotsky and co set out to measure by monitoring heartbeat and brain-wave patterns.
Their hypothesis is that these signals are impossible to fake and only possible to measure when the subject is relaxed. Any duress would result in a different signal.
To find out whether this is the case, the team asked five test subjects to choose their favorite piece of chill music and then monitored their heartbeat and brain waves while they listened.
In particular, the team focused on the moments within the music that trigger the “chill” response, on the assumption that this always occurs at the same point in the score. This section of the music—a minute or less—then becomes the key to the authentication process.
The idea is that if the subject is relaxed, he or she can experience the “chill” in the future and reproduce the physiological signals associated with this.
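In broad strokes, this is a template-matching scheme: a physiological trace recorded during the chill segment at enrollment is later compared against a live trace, and authentication succeeds only if the two match closely. The sketch below illustrates the idea with a simple normalized-correlation check; the comparison method, threshold, and signal names are illustrative assumptions, not the authors’ actual classifier.

```python
import numpy as np

def authenticate(enrolled: np.ndarray, live: np.ndarray,
                 threshold: float = 0.8) -> bool:
    """Compare a live physiological trace (e.g., heart-rate or EEG samples
    captured during the 'chill' segment) against the enrolled template.
    Accept only if the normalized correlation exceeds the threshold."""
    # Normalize both traces to zero mean and unit variance
    e = (enrolled - enrolled.mean()) / enrolled.std()
    v = (live - live.mean()) / live.std()
    # Mean of the elementwise product of standardized signals
    # is the Pearson correlation between the two traces
    score = float(np.dot(e, v) / len(e))
    return score >= threshold

# Toy demonstration: a relaxed replay of the enrolled response
# correlates highly, while a stressed (uncorrelated) trace does not.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 8 * np.pi, 500))
relaxed = template + 0.1 * rng.standard_normal(500)
stressed = rng.standard_normal(500)  # noisy trace under duress
print(authenticate(template, relaxed))   # True
print(authenticate(template, stressed))  # False
```

The key design point is that the secret is not the signal itself but the subject’s ability to reproduce it, which—per the team’s hypothesis—requires a relaxed state.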
Indeed, the team carried out a number of tests and found that their subjects were able to pass the test with a 90 percent success rate.
There are some caveats, of course. The team was unable to test its subjects’ response under any kind of stress to simulate the kind of coercion that this test is designed to foil. “One reason we did not do this is because it is unethical to threaten test subjects in order to verify that our system is fully coercion-resistant, as it could leave subjects with permanent physical or psychological damage,” they say.
That’s a significant limitation. If the team hasn’t checked that it works in the conditions it is designed to operate under, how can it be sure it is secure? There are other potential problems, too. The information that might benefit from this kind of increased protection is likely to be hugely valuable, things like the launch codes for nuclear weapons, perhaps. (One of the authors works at Sandia National Laboratories, which is responsible for nuclear stockpile management.)
But urgent access to this kind of information might only be necessary in times of high stress, and this could invalidate the test. The thought of somebody trying to access the launch codes as World War III unfolds, but having to chill out beforehand, has something of a black comedy about it.
Nevertheless, developing coercion-resistant passwords is an important goal. Wolotsky and co have taken some tentative steps that others can build on.
Ref: http://arxiv.org/abs/1605.01072: Chill-Pass: Using Neuro-Physiological Responses to Chill Music to Defeat Coercion Attacks