
Helping Doctors Feel Better

New computer simulations that re-create the sense of touch allow doctors-in-training to perform virtual procedures without risking harm to a human being’s precious skin.
April 1, 2001

The product demonstration room at Immersion Medical in Gaithersburg, MD, is a veritable arcade of medical simulation. There you can find a lineup of electromechanical, sensor-riddled, computerized devices, all coupled to virtual models of the human body. With these gadgets, students can practice the routine task of inserting a catheter into a patient’s hand, or more difficult procedures like a colonoscopy or even a lung biopsy. But these simulators don’t just provide vivid computerized visual renderings of human innards. They also re-create something equally critical: how all the injecting, cutting, inserting and palpating actually feel to the doctor performing them.

Here and in other corporate and university labs, computer simulation experts, having largely mastered visual displays and digitized sound, are demonstrating an increasing mastery over a third sensory frontier: touch. Their specialty is known as haptics, after the Greek haptikos, meaning to grasp or perceive. While the technology is still most widely known as the rudimentary shuddering of a video game joystick, more sophisticated versions are well on their way to enhancing basic medical simulation training.

Future haptic applications may even enable doctors to perform surgery over the Internet. Beyond medicine, haptics has also emerged as a tool for creating “touchable” 3-D models in the virtual world, and for conveying bumps and vibrations through the common computer mouse, so that you’d “feel” the icons on the screen (see companion article “Touchy Subjects”). But the technology is having its most palpable impact as an emerging tool for training doctors and nurses without risk to patients. “Haptics is a huge part of providing a realistic [medical] simulation experience,” says Gregory Merril, Immersion’s 32-year-old founder and self-described chief visionary officer. “When doctors are interacting with patients, a lot of it is the sense of touch.”

When I tried the company’s catheterization trainer myself, the net result of this amalgamation of hardware, software and mechanisms was a procedure that felt intuitively right: easy going through the skin, resistance as the needle popped through the blood vessel wall, and a feeling of release as the needle reached the bloodstream. Even my ears were engaged, as my maiden foray into catheterization elicited jarring “ouches” from the computer’s speakers: cries of pain from the violated virtual man.

But, hey, I’d done it. Feeling good, I decided to take a stab at something more challenging: pediatrics. But after my third try at easing a thin-gauge catheter into a vein on a newborn baby’s virtual forehead, I gave up. My angle was too low. The resistance in the needle told me as much. The virtual baby told me, too, as shrieks rattled the speakers. Maybe I wasn’t meant to be a doctor after all.

Merril characterizes his company’s simulator hardware as novel uses for existing mechanical and robotic components. With combinations of electromagnetic brakes, motors, cables and other devices, he says it’s possible to convey a wide range of tactile sensations. What makes it seem “like you are really sticking the needle in…and that you are feeling a force that corresponds to what you are seeing, is a computer model of the skin,” Merril says.

Using critiques from doctors and nurses who have done the real thing many times, software engineers tweak the simulators to improve the fidelity of the haptic sensations. “We know how much force it takes for the surface of the skin to break, and when that happens in the simulator, it tells the machine to let up on the brake,” which feels like a sudden reduction of resistance on the ingoing catheter, Merril says. Rabkin, of Boston’s Beth Israel Deaconess Medical Center, adds that today’s best devices are still a bit jerky, and fall just short of real time on the visual side.
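To make the puncture mechanic concrete, here is a minimal sketch of how such a skin model might behave, assuming a simple spring-like force that collapses once a threshold is crossed. The constants and function name are illustrative inventions, not Immersion’s actual model.

```python
# Illustrative needle-insertion force model: resistance builds as the
# needle deflects the virtual skin, then drops sharply once a puncture
# threshold is exceeded -- the "let up on the brake" moment Merril
# describes. All constants are placeholders for illustration.

SKIN_STIFFNESS = 120.0     # N/m: spring-like resistance before puncture
PUNCTURE_FORCE = 2.5       # N: force at which the virtual skin "breaks"
POST_PUNCTURE_DRAG = 0.3   # N: light friction felt after breakthrough

def resistance_force(depth_m: float, punctured: bool) -> tuple[float, bool]:
    """Return (force in newtons, updated punctured flag) for a needle
    pushed depth_m meters into the virtual skin."""
    if punctured:
        return POST_PUNCTURE_DRAG, True
    force = SKIN_STIFFNESS * depth_m
    if force >= PUNCTURE_FORCE:
        # The skin gives way: the brake releases, and the user feels
        # a sudden drop in resistance.
        return POST_PUNCTURE_DRAG, True
    return force, False
```

Each simulation tick, the device’s measured needle depth would be fed through a function like this and the result sent to the brake or motor.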

After wielding a needle, I wanted to try a fancier-looking bronchoscopy simulator, which mimics the device used to inspect the bronchial passages of the lungs. I inserted a straw-thin flexible tube through a nostril of an artificial face staring at the ceiling from a stand. This time, the haptic interface was a free-floating, joystick-like controller, identical to the real medical device and not unlike a bartender’s multiple-button, soda-dispensing head. As I snaked the camera- and tool-tipped tube inward, a computer monitor displayed a strikingly realistic rendering of the lung’s bronchial tree while the controller conveyed the haptic sensations; the simulated airways even convulsed when my “patient” coughed (I forgot to “apply” local anesthetic). I felt particularly doctorly when a menacing mass suddenly became visible. Using a clawlike tool controlled with the haptic interface, I performed a virtual biopsy. A “bloody” spot instantly appeared where I had extracted tissue.

Such were my first encounters with so-called haptic rendering. They undoubtedly won’t be the last. These simulators presage haptic things to come in arenas ranging from product design to remotely operated robotic tools, perhaps even hands-on museums where you can rub a virtual Rembrandt. The collective aim of haptics researchers is nothing less than to encode and re-create the world’s tactile features with the same breadth and fidelity that have made digital visual and audio rendering so realistic and versatile. To this research community, the tactile surface of the world, real or imagined, is something they can capture, replay, even synthesize from scratch with a combination of computers and handheld gadgetry.

Although there are lots of potential applications on the horizon, medical training appears slated to become the first killer app of haptics. The initial waves of products are already diffusing into medical settings, where students can learn procedures under normal, novel, and unexpected conditions. In the past two years, Immersion Medical says it has shipped approximately 400 medical simulators to hospitals and medical schools. “It’s common sense-if that individual can rehearse that circumstance, he will be better able to deal with it when it really happens,” says E. James Britt, professor of pulmonary and clinical medicine at the University of Maryland and a consultant to the company.

The CathSim system was launched in 1998, followed by the bronchoscopy simulator in 1999 and a sigmoidoscopy (a similar procedure for inspecting the lower colon) system last year. Immersion Medical, formerly known as HT Medical Systems, remains a wholly owned subsidiary of the San Jose, CA-based computer interface company Immersion, which bought HT Medical last year. Additional product rollouts are planned, including a colonoscopy simulator, which will replicate procedures deeper inside the colon, later this year. All this makes Immersion Medical the leader in commercializing haptics-enhanced medical simulation tools, says Daniel B. Raemer, program coordinator at the Center for Medical Simulation in Boston, MA, which is affiliated with Harvard Medical School.

The company is not alone. Medical Data International, a market analysis firm in Santa Ana, CA, estimates that last year the U.S. market for medical simulation devices, haptic and otherwise, amounted to only $23 million, a figure that is expected to rise substantially in the coming years. Among the other corporate players are MedSim, based in Israel, and Medical Education Technologies, in Sarasota, FL; but these companies’ products focus on visual simulations or on medical training mannequins that do not yet include computerized haptics. At Beth Israel Deaconess, residents must demonstrate proficiency with a MedSim device, a non-haptic dummy torso used for ultrasound training, before performing a real intravaginal ultrasound. But with haptics so new, no student is yet required to master any haptics-enhanced device before performing the real procedure, Rabkin says.

And that’s largely because the technology still needs a great deal of refinement. The apparent haptic fidelity of these simulators relies on some mental trickery: the visual input helps the brain fill in where the tactile feedback itself falls short; the haptic sensation is still a rough approximation. At the moment, says Bruce Schena, chief technology officer at Immersion, haptic interfaces can convey coarse textural differences, such as the difference between corduroy and smooth cotton. However, it’s not possible to distinguish subtler differences between, say, cotton and polyester. When it comes to haptic fidelity, “we are in the days of eight-bit color,” or roughly at the cartoon level, Schena says.

Haptic rendering has taken longer to come online than visual rendering because it is a vastly more difficult problem, says physicist and haptics engineer Ralph Hollis of the Robotics Institute at Carnegie Mellon University in Pittsburgh. Rendering visual images is a one-way street. “Eyes take in photons but don’t shoot them out,” says Hollis, who designs haptic devices for controlling factory tools and robotic machinery. But haptic devices are two-way. “A hand manipulates, but there is force feedback too. So any kind of haptic device that we use to interface with the computer must…take input from users as well as deliver output through the same mechanism.”
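A minimal sketch of the two-way contract Hollis describes, with hypothetical method names: unlike a display, the same device serves as both sensor and actuator.

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """Hypothetical interface illustrating Hollis's point: a haptic
    device is input and output through the same mechanism."""

    @abstractmethod
    def read_position(self) -> tuple[float, float, float]:
        """Input: where the user's hand has moved the tool (meters)."""

    @abstractmethod
    def apply_force(self, fx: float, fy: float, fz: float) -> None:
        """Output: force pushed back into the user's hand (newtons)."""
```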

In addition to this two-way problem, designers face a major computational challenge: simulating feeling is far more demanding than simulating seeing. Film projected at 24 frames per second creates the illusion of a continuous moving image; but, says Hollis, “You might need 1000 frames to fool the sense of touch.” This goes a long way toward explaining why haptics is just now building genuine momentum. Not until the past few years did computing power become cheap enough for Immersion Medical to bring the prices of its haptics products below $10,000.
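To put Hollis’s numbers in code, here is a hedged sketch of the kind of servo loop a haptic device might run: roughly 1,000 read-compute-actuate cycles per second, against film’s 24 visual frames. The HapticDevice interface is the hypothetical one sketched above, and scene.force_at stands in for whatever virtual-surface model supplies the force; both are assumptions, not a real API.

```python
import time

def haptic_loop(device: "HapticDevice", scene, rate_hz: int = 1000) -> None:
    """Run the read-compute-actuate cycle at ~1 kHz -- the update rate
    Hollis suggests is needed to fool the sense of touch."""
    period = 1.0 / rate_hz
    while True:
        start = time.perf_counter()
        x, y, z = device.read_position()       # input from the hand
        fx, fy, fz = scene.force_at(x, y, z)   # e.g., spring force inside a surface
        device.apply_force(fx, fy, fz)         # output to the same hand
        # Sleep off whatever remains of the ~1-millisecond budget.
        time.sleep(max(0.0, period - (time.perf_counter() - start)))
```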

Even as haptics-enhanced training devices spread within the medical community, some physicians are eyeing another haptic frontier: telesurgery. They envision surgery, complete with the sense of feel, transacted over the Internet or even via datalinks to far-flung places like the International Space Station. This would be particularly helpful in the area of minimally invasive surgery, including the so-called laparoscopic procedures. In such procedures, doctors snake tiny cameras, scoops and knives into the body through thin tubes. The doctors operate the tools with mechanical controllers and watch the action on a video screen.

Such procedures have been great for patients, who experience smaller wounds, fewer infections and quicker recoveries. But they place new demands on doctors. By not opening the abdomen, says surgery professor Richard M. Satava of the Yale University Medical School, the surgeon suffers “a loss of 3-D vision, dexterity and the sense of touch.” As medicine moves toward more laparoscopic procedures, whether performed telesurgically or on site, haptics-based simulators can help doctors overcome those sensory deficits.

Haptics also promises to push the frontiers of robotic surgery. For example, Satava says it probably will become possible for surgeons to plan minimally invasive operations, using feedback from haptic renderings of the procedure, then allow robots to do the actual cutting. Some advances in this direction have already occurred. Computer Motion, based in Santa Barbara, CA, has sold about 40 of its ZEUS Robotic Surgical Systems, in which doctors control robots that wield three tool-tipped arms inserted into the patient. One of those robotic arms transmits forces experienced by the tool back to the human surgeon’s hand. One chief competitor: Intuitive Surgical of Mountain View, CA, whose flagship product, the da Vinci Surgical System, offers a similar combination of computers and robotics.
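The force-reflection idea behind such systems can be sketched as a simple bilateral loop: hand positions flow out to the remote tool, and measured tool-tip forces flow back, typically scaled down. Everything here, from the object names to the 0.5 scale factor, is an invented illustration, not Computer Motion’s or Intuitive Surgical’s design.

```python
FORCE_SCALE = 0.5  # illustrative damping so reflected forces stay gentle

def teleoperation_step(controller, remote_arm) -> None:
    """One cycle of a simplified bilateral teleoperation loop.
    controller is the surgeon's haptic handle; remote_arm is the robot
    at the patient. Both expose hypothetical read/apply methods."""
    # Forward path: the surgeon's hand motion drives the remote tool.
    x, y, z = controller.read_position()
    remote_arm.move_to(x, y, z)
    # Return path: force sensed at the tool tip is scaled and reflected
    # back into the surgeon's hand.
    fx, fy, fz = remote_arm.read_tip_force()
    controller.apply_force(FORCE_SCALE * fx, FORCE_SCALE * fy, FORCE_SCALE * fz)
```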

This future era of haptics-enhanced telesurgery and robotic surgery, Satava says, will require new models of human beings that include haptic parameters such as the resistance that various tissues offer when a knife or needle goes through them. A good place to start, he says, would be to expand on the National Library of Medicine’s Visible Human Project, launched in the mid-1990s, in which male and female cadavers were sectioned at intervals of 1 millimeter and 0.33 millimeters, respectively. Each section was imaged by CAT scan, photography and other tools, then digitized and integrated into an enormous visual and informational database. “Now we need to add all the properties for the sense of touch,” Satava says. “That way, when a doctor gets a CT scan, he would be able to use a haptic interface to feel the 3-D model displayed on a computer screen.”
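In data terms, “adding the properties for the sense of touch” might mean mapping each labeled tissue type in such an atlas to a handful of mechanical parameters that a haptic renderer could query as a virtual instrument advances. A minimal sketch; every name and number below is a placeholder, not a measured value.

```python
from dataclasses import dataclass

@dataclass
class TissueHaptics:
    """Illustrative haptic parameters for one tissue type."""
    stiffness: float       # N/m: resistance to deformation
    puncture_force: float  # N: force needed for a needle to break through
    friction: float        # N: drag on an instrument moving through it

# Hypothetical lookup keyed by the tissue label stored at each voxel.
TISSUE_TABLE = {
    "skin":   TissueHaptics(stiffness=120.0, puncture_force=2.5, friction=0.3),
    "fat":    TissueHaptics(stiffness=40.0,  puncture_force=0.8, friction=0.1),
    "muscle": TissueHaptics(stiffness=200.0, puncture_force=3.5, friction=0.5),
    "vessel": TissueHaptics(stiffness=90.0,  puncture_force=1.2, friction=0.2),
}

def haptics_at(voxel_label: str) -> TissueHaptics:
    """What a haptic renderer would consult as a virtual needle advances."""
    return TISSUE_TABLE[voxel_label]
```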

And Satava and Merril have an even more exotic vision for robotic medical haptics. A master surgeon could haptically record an intricate brain operation on a virtual rendering of a patient. A novice surgeon could use the master’s recording to rehearse the procedure, complete with tactile feedback personalized for that patient. Add robotics to this scenario, and it might become possible for a surgeon to perform an operation on himself, by doing it first in the virtual world, then unleashing a robot to “play back” the real thing. But that’s a long way off. What’s clear, Raemer says, is that today’s applications of haptics technology “are just barely scratching the surface.”
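At its core, the haptic recording Satava and Merril imagine is a time-stamped stream of tool positions and forces that can be replayed through a device. A rough sketch under that assumption; the sample format and method names are invented.

```python
import json

def record_sample(log: list, t: float, pos: tuple, force: tuple) -> None:
    """Append one time-stamped haptic sample: where the master surgeon's
    tool was, and what force it applied or felt."""
    log.append({"t": t, "pos": pos, "force": force})

def save_recording(log: list, path: str) -> None:
    """Persist the recorded procedure for later rehearsal or playback."""
    with open(path, "w") as f:
        json.dump(log, f)

def replay(log: list, device) -> None:
    """Feed recorded forces back through a haptic device so a trainee
    can 'feel' the master's procedure (real-time pacing omitted)."""
    for sample in log:
        device.apply_force(*sample["force"])
```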
