“Using these pattern-recognition-based techniques, the authors have been able to show that there is information stored there, even if on the surface it might not be obvious because the overall activity levels don’t go up,” says Haynes.
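The idea behind these techniques can be illustrated with a toy simulation (hypothetical data, not from the study): two stimulus conditions evoke the same overall activity level, yet a simple pattern classifier can still tell them apart from the spatial pattern across voxels.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials = 50, 200

# Each condition evokes a distinct spatial pattern across voxels,
# but both patterns are zero-mean, so overall activity is identical.
pattern_a = rng.standard_normal(n_voxels)
pattern_b = rng.standard_normal(n_voxels)
pattern_a -= pattern_a.mean()
pattern_b -= pattern_b.mean()

# Simulated noisy trials: half from condition A, half from B.
half = n_trials // 2
X = np.vstack([pattern_a + 0.8 * rng.standard_normal((half, n_voxels)),
               pattern_b + 0.8 * rng.standard_normal((half, n_voxels))])
y = np.array([0] * half + [1] * half)

# Mean activity per trial: nearly identical between conditions,
# so a univariate "does activity go up?" analysis finds nothing.
mean_activity_gap = abs(X[y == 0].mean() - X[y == 1].mean())

# A nearest-centroid classifier on the full voxel pattern, however,
# decodes the condition well above chance on held-out trials.
train = np.arange(n_trials) % 2 == 0
test = ~train
c0 = X[train & (y == 0)].mean(axis=0)
c1 = X[train & (y == 1)].mean(axis=0)
pred = (np.linalg.norm(X[test] - c1, axis=1)
        < np.linalg.norm(X[test] - c0, axis=1)).astype(int)
accuracy = (pred == y[test]).mean()
```

This is only a sketch of the general principle; the study itself used more sophisticated pattern classifiers on real fMRI voxel responses.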
Previous studies using fMRI have shown that it’s possible to determine which of a number of pictures a person is looking at. But the new study is unique in that it decodes not sensory information in the brain, but memory.
The researchers also found that the brain-activity patterns linked to looking at a grating and remembering it bear a striking resemblance to each other. “During working memory for visual information, it almost seemed as though these early areas are holding an echo of the initial visual response,” says Stephenie Harrison, a graduate student at Vanderbilt and the lead author on the Nature paper. “It suggests, in a way, that the memory trace itself is very similar to perception.”
It remains to be seen how the activity patterns detected by fMRI, which essentially measures blood flow in the brain, translate into actual neural signals, says Haynes. Because it measures information in chunks of three cubic millimeters, fMRI can’t gather information about what individual neurons are doing. But “it gives us a better sense of what memory is,” says Harrison. “It’s hard to know because it’s such a subjective personal experience, but this gives us a better sense of what someone might be doing: they might actually be visualizing the information.”
No need to worry yet about Big Brother reading your mind. For now, real-world applications remain limited, says Frank Tong, an associate professor of psychology and senior author on the study. The ability to reconstruct from scratch a complex memory or imagined scenario is a long way off. “We’re still just discriminating a simple binary state,” Tong says. “If you increase the number of options, this would get progressively more difficult.”