For a century and a half, the mystery of anesthesia was one of the great unexplored areas of neuroscience, as impenetrable as consciousness itself. Though drugs like nitrous oxide and propofol were widely used in operating rooms around the globe, the exact mechanism by which they rendered patients oblivious to the world around them remained a matter of speculation, owing to the logistical and safety challenges involved in scanning the brains of anesthetized patients.
Which is why Professor Emery Brown, a statistician, neuroscientist, and practicing anesthesiologist, was so excited when he took a seat in a control room in Mass General Hospital back in 2004 and watched on a monitor as his first experimental patient, an elderly woman with electrodes attached to her scalp, was wheeled into an fMRI brain imaging machine in the room next door. By recruiting patients who had already undergone tracheostomies and could be hooked up to a respirator before the experiment began, Brown and his collaborators had found a way to ensure that oxygen could be delivered artificially. They could safely allow the subjects to lose consciousness—and, most likely, stop breathing on their own—so they could capture what the unconscious brain looked like and see how it behaved. Now they were finally about to peer inside the black box for the first time.
As technicians administered propofol, a potent anesthetic, and the patient began to slip into unconsciousness, Brown and his colleagues noticed immediately that something remarkable was unfolding on the electroencephalogram (EEG) monitor, which was tracking her brain waves as her brain was being imaged. The shape of those waves transformed in tandem with a rise in the dosage, from a rapid, chaotic drumbeat of bursts to a slow, steady crawl of repetitive gentle slopes. None of those present—even Brown’s colleague who was an expert on EEG—had ever seen anything quite like it before. “We saw it all of a sudden, and we were like, ‘Wowwww,’” recalls Brown. “‘We don’t know what we’re looking at, but this is really cool.’”
Back then, Brown could never have guessed that this moment would lead him on an intellectual journey that today is revealing fundamental new insights into the workings of the human brain. Anesthesia, it turns out, is good for far more than just helping patients get through surgery. These potent compounds—a diverse group of drugs that can be inhaled or delivered intravenously—are also powerful neuroscience research tools.
Last spring, Brown stepped down after 10 years as co-director of the Harvard-MIT Program in Health Sciences and Technology (HST), which trains clinician-scientists and engineers, to focus on creating a new joint research center between MIT and MGH. The center will use anesthesia to reveal new insights into a wide array of seemingly unrelated brain diseases, potentially aiding in the development of treatments. Among other things, he hopes to shed new light on depression, insomnia, epilepsy, and Alzheimer’s disease and explore the possibility of human hibernation for medically induced comas or space travel. He and his colleagues at the center may even help unravel the mystery of consciousness itself.
Brown, who began a sabbatical in January, is now preparing for a late-spring launch of the new center, which will begin with existing collaborators but may eventually expand to include new hires.
Anesthesia, he has demonstrated, works largely by modulating the delicate biochemistry of different regions of the brain in ways that disrupt how they normally communicate with each other—allowing anesthesiologists to effectively turn down, even mute, the volume of incoming or outgoing brain signals (like signals from nerve cells that would normally cause feelings of intense pain). He and his colleagues have spent more than a decade characterizing how different types of anesthesia—including propofol, sevoflurane, ketamine, dexmedetomidine, and opioids—affect different areas of the brain. This led to studies like one that found a way to administer anesthesia that could minimize the need for post-surgical opioids. And now they plan to apply some of the non-opioid anesthetics as experimental tools to modulate the volume of different signals across the brain and manipulate various brain states in ways that could reveal new insights.
“We’re in a position to really study a lot of different phenomena,” he says. “Because we’re thinking about anesthesia as a neuroscience phenomenon, we can talk in an informed way about sleep, about depression, about Alzheimer’s, about coma recovery, about hibernation. Now that we’ve come to understand how these drugs act in the brain, we can postulate ideas about what’s being turned on, what’s being turned off in these brain states. And we can study this in a precise way.”
The opening of the center, for which Brown is still actively fundraising, comes amidst a season of triumph for the veteran MIT professor. (In addition to his multiple appointments at MIT—where he’s the Edward Hood Taplin Professor of Medical Engineering, a professor of computational neuroscience, a professor of health sciences and technology, an investigator at the Picower Institute for Learning and Memory, and a core faculty member at the Institute for Medical Engineering and Science—he’s also the Warren M. Zapol Professor of Anesthesia at Harvard Medical School and an anesthesiologist at Massachusetts General Hospital.) This fall, Brown was awarded the prestigious Gruber Neuroscience Prize, which comes with a $500,000 grant he split with three other scientists. He recently received a write-up in the New York Times for a paper that highlights similarities between the brain states of anesthetized covid patients and hibernating turtles. And once animal testing is successfully completed, he plans to apply for FDA approval to begin human trials for what he calls a closed-loop anesthesia delivery system—an autopilot technique of sorts. Based on an AI algorithm he developed, the system would calculate the correct anesthesia dosage and update it second by second, in response to minute changes in brain waves, to precisely maintain a specified level of unconsciousness.
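The feedback principle behind such a closed-loop system can be sketched in a few lines of code. What follows is an illustrative proportional-integral controller, not Brown's actual algorithm; every name and parameter here is hypothetical.

```python
# Illustrative sketch of closed-loop dosing (NOT Brown's algorithm).
# A proportional-integral controller nudges the infusion rate so that
# a measured, EEG-derived depth-of-unconsciousness marker tracks a
# target level. All names and constants are hypothetical.

def make_controller(target, kp=0.5, ki=0.1, dt=1.0):
    """Return a function mapping each marker reading to an
    infusion-rate adjustment (positive = infuse more)."""
    integral = 0.0
    def step(marker):
        nonlocal integral
        error = target - marker      # how far from the desired depth
        integral += error * dt       # accumulate any persistent offset
        return kp * error + ki * integral
    return step

controller = make_controller(target=0.8)
adjustment = controller(0.6)   # marker below target -> positive adjustment
```

The key design point, which the sketch shares with the real system, is that the dose is recomputed at every time step from the brain's measured state rather than set once at the start of surgery.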
Brown was born in Ocala, a small city in Central Florida famous for its horse farms, near where his father grew up. His parents taught science and math in the local schools. But his mother, a Pittsburgh native, felt their precocious young son needed a school that would challenge him more. So to finish high school, they sent him off to Phillips Exeter Academy, the elite prep school in New Hampshire.
He went on to Harvard, where he arrived as an undergraduate in 1974. At the time, he hadn’t thought much about neuroscience or the science of sedation. Brown had spent a summer in France and the second semester of his senior year of high school in Barcelona. He was a language whiz. So he intended to major in Romance languages as he pursued the pre-med track. Then after medical school, he’d perhaps work for Médecins Sans Frontières or the World Health Organization. But he began to have second thoughts during his sophomore year, over the course of dining hall conversations and late-night bullshit sessions with his roommates, all of whom were studying economics or government.
“They would come out of class and they’d be talking about GDP and the things you read about in the newspaper, like labor disputes or inflation—it seemed like they were studying all the relevant things,” he recalls. “Whereas I would say, like, ‘Isn’t Don Quixote interesting?’ The conversations never seemed to drift that way. It was always current events.”
Brown had grown up around math. And in those conversations with friends, he was so intrigued by discussions involving the statistics and data analysis often central to economics that he switched his concentration to applied mathematics junior year to study statistics. After graduation, he attended the Institut Fourier des Mathématiques Pures in France as a Rotary fellow and then returned to Harvard for an MD-PhD program. The PhD was in statistics. For his MD specialty, he chose anesthesiology, because he liked the real-time physiology and pharmacology.
“You have problems present themselves, and you have to rectify them right away,” he says. “You have to work with your head and your hands. In addition, the schedules were regular, as opposed to being a general practitioner. I did want to eventually have a research career, and I figured I could have a more scheduled kind of lifestyle.”
Initially Brown saw his tracks in anesthesia and statistics as independent. Though he was interested in tackling data analysis questions and modeling questions that needed to be resolved using applied mathematics, the mystery around how anesthesia worked made it a hard subject to approach that way. So for his MD thesis, he collaborated with Chuck Czeisler, a neuroscientist who wanted to quantify the effect of light exposure, at different times of day or night, on the brain oscillations, internal clocks, and circadian rhythms of humans. The study, which offered a new perspective on jet lag, required putting subjects on a 28-hour daily schedule and translating the data back into 24-hour terms. Brown designed the mathematical models required to make sense of it all. In a second study, they used his models to characterize the disrupted sleep patterns of shift workers.
Meanwhile, his medical career got underway. After an internship and research fellowship at Brigham and Women’s Hospital and a residency in anesthesiology at MGH, he became an anesthesiologist at MGH and a member of Harvard Medical School’s faculty in 1992, joined the faculty of the Harvard-MIT HST program in 1999, and was named a professor of both computational neuroscience and health sciences and technology at MIT in 2005.
By the late 1990s, Brown had teamed up with MIT neuroscience professor Matt Wilson and his grad student Loren Frank, PhD ’01 (now a physiology professor at UCSF), among others, to study areas of the brain involved in memory formation, monitoring neural activity in animals as they moved around their cages. Wilson, Brown says, had already demonstrated empirically that it was possible to determine the location of an animal simply by reading its neural activity in real time; the key is to identify and track place cells, neurons in the hippocampus that fire when an animal visits a specific location or “place field” in its environment. But Wilson’s algorithms were fairly crude. Brown built a far more sophisticated version after studying the algorithms of satellite tracking systems. (These algorithms combine mathematically noisy satellite position measurements, which are captured at tracking stations on the ground, with model-based measurements that define the satellite’s intended trajectory.) Brown translated these principles into the realm of neuroscience by tracking the spiking activity of as few as 30 place cells and feeding that information into an algorithm of his own design. (A key insight that allowed him to do this was that because neurons don’t give off continuous signals, he could treat the place neurons’ spikes as 1s and 0s—and because he was tracking these 1s and 0s from 30 or more different neurons, he could treat their activity mathematically as a multivariate point process.)
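The decoding idea can be made concrete with a toy example. The sketch below is not Brown's published algorithm; it simply shows the core move the text describes: treating each cell's spike count in a short time bin as a point-process observation and picking the position that best explains the counts. The place-field shapes, rates, and numbers are all hypothetical.

```python
# Toy place-cell decoder (illustrative, not Brown's actual method).
# Each hypothetical cell has a Gaussian "place field" on a 1-D track;
# given one time bin of spike counts, we choose the position that
# maximizes the Poisson likelihood of those counts.
import math

def decode_position(spike_counts, centers, width=0.1, peak_rate=20.0,
                    dt=0.05, grid=None):
    grid = grid or [i / 100 for i in range(101)]   # candidate positions
    def rate(center, x):
        # Expected firing rate of a cell when the animal is at x;
        # small baseline keeps the log-likelihood finite.
        return peak_rate * math.exp(-((x - center) ** 2)
                                    / (2 * width ** 2)) + 0.1
    def log_lik(x):
        # Poisson log-likelihood of all counts (constants dropped).
        return sum(n * math.log(rate(c, x) * dt) - rate(c, x) * dt
                   for n, c in zip(spike_counts, centers))
    return max(grid, key=log_lik)

centers = [0.1, 0.3, 0.5, 0.7, 0.9]     # five cells' field centers
estimate = decode_position([0, 1, 4, 1, 0], centers)
# Most spikes came from the cell whose field is centered at 0.5,
# so the decoded position lands near 0.5.
```

A real decoder of the kind described, like a satellite tracker, would also fold in a model of how position evolves over time; this sketch shows only the observation step.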
This research proved that an animal can hold a map of where it is in its mind—and Brown would further refine the idea to do things like come up with a possible way to track the neural activity that accompanies learning. To do that, he developed one model showing the probability that an animal would get a correct response on a learning task and another showing the spike rates of individual hippocampal neurons. As the animal learned the task, certain neurons’ spike rates increased in lockstep with the probability of obtaining a correct response. This suggested that the behavior of these neurons could be used as a marker for learning.
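The pairing the text describes can be illustrated with a much simpler stand-in for Brown's state-space models: a running estimate of the probability of a correct response, which could then be compared trial by trial with a neuron's spike rate. The data and smoothing rule below are hypothetical.

```python
# Toy learning-curve estimate (hypothetical data; a crude stand-in for
# Brown's state-space models). An exponentially weighted average turns
# a sequence of right/wrong trial outcomes into a smooth per-trial
# probability of responding correctly.

def running_prob(outcomes, alpha=0.3, p0=0.5):
    """Return a per-trial estimate of P(correct response)."""
    probs, p = [], p0
    for correct in outcomes:
        # Blend the previous estimate with the latest outcome.
        p = (1 - alpha) * p + alpha * (1.0 if correct else 0.0)
        probs.append(p)
    return probs

outcomes = [0, 0, 1, 0, 1, 1, 1, 1]   # the animal gradually learns
curve = running_prob(outcomes)        # estimate rises across trials
```

In the studies described above, the analogous curve would be plotted against individual neurons' spike rates; where the two rise together, the neuron's activity can serve as a marker for learning.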
The mental-map work brought him in contact with lots of neuroscientists (and would also help him to win the Gruber Prize). He recalls many saying to him at conferences or over drinks: “That was an interesting presentation about place fields. We hear you’re an anesthesiologist. Tell us about that: how does that work?”
Brown was embarrassed to admit the answer: “We don’t know.”
“You say this enough and you start to sound kind of stupid, right?” he says. “Neuroscientists want to know the details. They want to know what receptors are involved, what parts of the brain, how the circuits are changing.”
They were valid questions. Brown began to believe that anesthesia “needed neuroscience.” And he seriously began to consider, for the first time, how he might combine his two interests.
By then, technological advances had revolutionized neuroscience, allowing scientists to record neural signals at a resolution and with an accuracy that had been unimaginable when Brown first started out. He had been on the front lines of efforts to make sense of this flood of data, developing novel mathematical techniques to pick out patterns and useful insights in plenty of areas unrelated to anesthesia. Maybe, if he combined his two passions, anesthesia wouldn’t have to remain a mystery anymore.
“The effect of anesthesia on the brain is a neuroscience phenomenon,” he explains. “And whereas anesthesiologists are quite skilled at cardiovascular physiology, respiratory physiology, and let’s say renal physiology … we haven’t learned the brain physiology. We don’t know that as well as we should.”
When Brown began paying more attention to neuroscience literature in the area, he found that much of what he was reading seemed to be speculative, even at times preposterous—which only argued further for the need to apply cutting-edge techniques to find out what was really going on.
“When you say you don’t know how something works, people can just fill in whatever they want,” he says. “So you got all these stories about, well, you know, this is what’s happening, and it makes no sense whatsoever if you really think about it.”
Brown decided to find out for himself.
At Mass General Hospital, soon after seeing the strange brain waves rolling across the EEG monitor and slowing almost in direct proportion to changes in the amount of anesthesia administered, Brown and his colleagues noticed something else that intrigued them. Brain imaging captured by the fMRI revealed that the areas of the brain processing auditory signals remained active even when his patients were unconscious. This suggested the brain was still absorbing and processing sounds from the outside world. But something was preventing the signals from the auditory processing areas of the brain from getting to the frontal areas that would allow a patient to become consciously aware of the sounds and interpret them.
It didn’t take long for Brown and his colleagues to figure out why. Though brain scanning techniques such as fMRI weren’t invented until the 1990s, EEGs have been in use since the first half of the 20th century. Theorists for years had suggested that the oscillations—or “brain waves”—captured by the machinery were important for maintaining communication between different brain areas. The change in the shape of anesthetized patients’ brain waves could explain why that communication became more difficult under the drugs.
In a neuron’s resting state, the inside of the cell is negatively charged relative to the outside, thanks to an uneven distribution of charged particles: negatively charged protein molecules and ions (like chloride) and positively charged ions (like potassium and sodium). EEG captures some of what happens as the membrane potential—the difference in electrical charge between the inside and the outside—rises and falls. When the waves crest, the potential is higher (meaning positively charged ions have flowed into the cell, raising its charge in relation to the outside); when it dips into a trough, the potential is lower (meaning positive ions have flowed back out). An individual brain cell fires when its potential rises enough to surpass a specific threshold.
Brain cells are able to convey messages between brain regions because when an individual neuron fires, it releases a series of neurochemical signals that change the polarization of other neurons it is connected to. This can create a chain reaction of neuronal firing. It’s the firing of these brain cells that allows us to think, to feel, and to move.
Those slow-rolling waves, Brown would come to recognize, suggested that anesthesia was somehow affecting the membrane potential of individual brain cells in ways that made them far less likely to fire: the potential approached the firing threshold only briefly, at the crest of each slow wave, and the rest of the time it sat far below the level the neuron needs to fire. That in turn made it far less likely that any single spike would start the chain reaction of firing neurons needed to convey a signal across the brain.
“If the neurons can’t spike when they want to,” Brown explains, “they can’t transmit information from one part of the brain to the other, like from your sensory area up to your prefrontal cortex, so you can interpret it. So you’re not going to be conscious of that sound that’s coming in. Or conscious of that visual information that might be coming in.”
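The threshold behavior described above can be illustrated with a leaky integrate-and-fire neuron, a standard textbook model in computational neuroscience (a generic sketch, not a model from Brown's work): when the drive reaching a neuron is suppressed, its membrane potential settles below threshold and spiking stops entirely.

```python
# Standard leaky integrate-and-fire model (generic textbook sketch,
# not from Brown's work). The membrane potential leaks toward rest
# while input current pushes it up; a spike fires only if the
# potential crosses threshold, after which it resets.

def count_spikes(current, steps=2000, dt=0.001, tau=0.02, threshold=1.0):
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += dt * (-v / tau + current)   # leaky integration of input
        if v >= threshold:               # crossing threshold -> spike
            spikes += 1
            v = 0.0                      # reset after firing
    return spikes

strong_drive = count_spikes(100.0)   # potential crosses threshold: spikes
weak_drive = count_spikes(40.0)      # potential plateaus below it: silence
```

With these (hypothetical) constants the potential settles at `current * tau`, so a drive of 40 plateaus at 0.8, short of the threshold of 1.0, and the neuron never fires; this is the sense in which a signal simply fails to propagate.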
Brown and his colleagues began monitoring the brain waves of volunteers as they took them in and out of consciousness by administering varying levels of propofol. They found that the drug produced very prominent slow-delta oscillations and frontal alpha oscillations (which typically began as patients lost consciousness and disappeared as they regained consciousness).
They then studied the impacts of a range of drugs on oscillations in specific brain regions—which vary depending on which anesthesia drug is used. On a molecular level, the drugs were known to work by binding to specific kinds of receptors associated with different brain chemicals, or neurotransmitters. By studying the locations of these receptors in different brain regions and how these regions are connected, Brown and his colleagues were able to explain the mechanisms of the oscillations and why they could produce unconsciousness.
Over a single six-week period in the fall of 2011, Brown rotated through several different anesthesia services at MGH, which required using a wide range of drugs. He was able to identify distinct patterns for all of the most commonly used anesthesia drugs and characterize how their effects varied in patients of different ages.
“We came to understand that these oscillations were really part of the story. They changed systematically with the class of anesthetic drugs, and they also changed very systematically with age and with health state,” Brown says. “The patterns you see in young kids were quite different from the patterns you see in older patients. And a patient in the ICU who has a lot of inflammation will have slow oscillations. We believe inflammation prevents the brain from efficiently functioning.”
Brown says the MIT-MGH research center will look at what he calls a “full set of problems that relate to what we call arousal control,” including both maintaining patients’ unconsciousness and waking them up. One possible application would be to develop better sleeping medications. Many existing sleep drugs, he notes, work on the same class of receptors as propofol: GABA receptors. Suppressing brain activity in parts of the brain rich in GABA receptors promotes the resting phases of sleep because it can quiet the anxious chatter that might keep one awake. But there are unfortunate side effects: this sedation also interferes with REM sleep, a very active part of the sleep cycle, in which the brain is performing necessary maintenance and housekeeping: reinforcing certain synapses, pruning others, replaying memories that need to be stored. By studying these phenomena, he hopes to develop new ways of quieting the brain that won’t disrupt REM sleep.
Examining depression is also on the center’s list. Although the anesthetic ketamine was recently approved as an antidepressant, there’s evidence to suggest that other anesthetics with very different mechanisms of action, including xenon and nitrous oxide, may also help treat depression. Part of the center’s charge will be to figure out why.
Brown also notes that the most widely used antidepressants, known as SSRIs, are designed to increase serotonin, on the theory that people with depression lack sufficient levels of this neurotransmitter. But tools that modulate other neurotransmitters could provide valuable insights.
“In the brain stem, there are a number of neurotransmitter systems; serotonin is only one of them,” he says. “Now let’s just ask a very logical question: Why would depression be the result of a deficiency in just one of these neurotransmitters?” This is an especially important question because, as Brown points out, SSRIs don’t work reliably in all patients. “Now we know that you can give someone ketamine and a nontrivial fraction of the patients feel better right away—and that effect lasts for seven days up to two weeks. So there’s something entirely different that’s going on,” he says. “We know that other anesthetics seem to work as well. So that’s why we want to study them. We want to understand why is that the case. This is in our wheelhouse.”
At the center, Brown says, “we’re going to be working on these problems collectively.”
He adds, “I think we’re going to become far more enlightened about all these over the next two to five years.”