
The CathSim system was launched in 1998, followed by the bronchoscopy simulator in 1999 and a simulator for sigmoidoscopy (a similar procedure for inspecting the lower colon) last year. Immersion Medical, formerly known as HT Medical Systems, remains a wholly owned subsidiary of Immersion, the San Jose, CA-based computer interface company that bought HT Medical last year. Additional product rollouts are planned, including a colonoscopy simulator, due later this year, that will replicate procedures deeper inside the colon. All this makes Immersion Medical the leader in commercializing haptics-enhanced medical simulation tools, says Daniel B. Raemer, program coordinator at the Center for Medical Simulation in Boston, MA, which is affiliated with Harvard Medical School.

The company is not alone. Medical Data International, a market analysis firm in Santa Ana, CA, estimates that last year the U.S. market for medical simulation devices, haptic and otherwise, amounted to only $23 million, a figure that is expected to rise substantially in the coming years. Among the other corporate players are MedSim, based in Israel, and Medical Education Technologies, in Sarasota, FL; but those companies’ products focus on visual simulations or on medical training mannequins that do not yet include computerized haptics. At Beth Israel Deaconess, residents must demonstrate proficiency with a MedSim device (a non-haptic dummy torso used for ultrasound training) before performing a real intravaginal ultrasound. But with haptics so new, no student is yet required to master any haptics-enhanced device before performing the real procedure, Rabkin says.

And that’s largely because the technology still needs a great deal of refinement. The apparent haptic fidelity of these simulators relies on some mental trickery: the visual input helps the brain fill in where the tactile feedback falls short, because the haptic sensation itself is still a rough approximation. At the moment, says Bruce Schena, chief technology officer at Immersion, haptic interfaces can convey coarse textural differences, such as the difference between corduroy and smooth cotton, but not subtler ones, such as the difference between cotton and polyester. When it comes to haptic fidelity, “we are in the days of eight-bit color,” or roughly at the cartoon level, Schena says.
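
To make Schena’s “eight-bit color” point concrete, here is a minimal sketch of one common way coarse texture is rendered haptically: the device’s output force is modulated with a spatial ripple as the probe slides across a virtual surface. Millimeter-scale patterns like corduroy survive this approximation; micro-scale weave differences between cotton and polyester fall far below it. This is an illustrative textbook technique, not Immersion’s implementation, and every parameter value here is invented.

```python
import math

def texture_force(x_mm, base_force_n=1.0, ridge_spacing_mm=2.0, ridge_gain=0.4):
    """Modulate the probe's normal force with a spatial ripple as it
    slides across a virtual surface. Millimeter-scale ridges (corduroy)
    survive this crude approximation; micron-scale weave differences
    (cotton vs. polyester) are far below what it can express."""
    ripple = math.sin(2 * math.pi * x_mm / ridge_spacing_mm)
    return base_force_n * (1.0 + ridge_gain * ripple)

# Sample the force profile as the probe slides 10 mm across "corduroy."
for x in range(11):
    print(f"x = {x:2d} mm   force = {texture_force(float(x)):.2f} N")
```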

Haptic rendering has taken longer to come online than visual rendering because it is a vastly more difficult problem, says physicist and haptics engineer Ralph Hollis of the Robotics Institute at Carnegie Mellon University in Pittsburgh. Rendering visual images is a one-way street. “Eyes take in photons but don’t shoot them out,” says Hollis, who designs haptic devices for controlling factory tools and robotic machinery. But haptic devices are two-way. “A hand manipulates, but there is force feedback too. So any kind of haptic device that we use to interface with the computer must…take input from users as well as deliver output through the same mechanism.”
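
Hollis’s two-way point can be sketched in code: in a typical impedance-style haptic loop, the very same device object is read as a sensor and driven as an actuator on every cycle. The device class below is a hypothetical stand-in for a real driver, and the spring-wall force law is the standard textbook simplification.

```python
class HapticDevice:
    """Hypothetical stand-in for a device driver. The key point from
    Hollis: one mechanism is both sensor and actuator."""
    def __init__(self):
        self._pos_m = 0.0
        self._force_n = 0.0

    def read_position(self):
        """Input side: where the user's hand has moved the stylus."""
        return self._pos_m

    def apply_force(self, newtons):
        """Output side: force pushed back at that same hand."""
        self._force_n = newtons

def impedance_step(device, wall_pos_m=0.0, stiffness_n_per_m=500.0):
    """One cycle of an impedance-style loop: read the hand's position,
    and if the virtual tool has penetrated the virtual wall, push back
    with a spring force proportional to the penetration depth."""
    penetration = device.read_position() - wall_pos_m
    force = -stiffness_n_per_m * penetration if penetration > 0 else 0.0
    device.apply_force(force)
    return force

dev = HapticDevice()
dev._pos_m = 0.002  # the hand has pushed 2 mm into the virtual wall
print(f"restoring force: {impedance_step(dev):.1f} N")  # -1.0 N
```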

In addition to this two-way problem, designers face a major computational challenge: simulating feeling is far more demanding than simulating seeing. Film projected at 24 frames per second creates the illusion of a continuous moving image; but, says Hollis, “You might need 1000 frames to fool the sense of touch.” This goes a long way toward explaining why haptics is only now building genuine momentum. Not until the past few years did computing power become cheap enough for Immersion Medical to reduce the prices of its haptics products below $10,000.
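
The arithmetic behind those numbers is stark: a 24-frame-per-second visual renderer has about 42 milliseconds per frame, while a 1000 Hz haptic loop has just 1 millisecond to read the device, update the simulation, and write a force back out. A fixed-rate loop sketch follows; the two rates come from the text above, everything else is illustrative.

```python
import time

VISUAL_HZ = 24    # enough to fool the eye
HAPTIC_HZ = 1000  # roughly what it takes to fool the hand, per Hollis

print(f"visual frame budget: {1000 / VISUAL_HZ:.1f} ms")  # ~41.7 ms
print(f"haptic frame budget: {1000 / HAPTIC_HZ:.1f} ms")  # 1.0 ms

def haptic_servo_loop(compute_force, seconds=0.01, hz=HAPTIC_HZ):
    """Fixed-rate loop: every tick must finish well inside its 1 ms
    budget, or the rendered surface starts to feel soft or buzzy."""
    period = 1.0 / hz
    next_tick = time.perf_counter()
    ticks = 0
    while ticks < int(seconds * hz):
        compute_force()  # read position, update simulation, write force
        ticks += 1
        next_tick += period
        remaining = next_tick - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
    return ticks

n = haptic_servo_loop(lambda: None)
print(f"ran {n} haptic ticks in roughly {n} ms of wall time")
```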

Even as haptics-enhanced training devices spread within the medical community, some physicians are eyeing another haptic frontier: telesurgery. They envision surgery, complete with the sense of feel, transacted over the Internet or even via datalinks to far-flung places like the International Space Station. This would be particularly helpful in minimally invasive surgery, including the so-called laparoscopic procedures. In such procedures, doctors snake tiny cameras, scoops and knives into the body through thin tubes, operating the tools with mechanical controllers while watching the action on a video screen.

Such procedures have been great for patients, who experience smaller wounds, fewer infections and quicker recoveries. But they place new demands on doctors. By not opening the abdomen, says surgery professor Richard M. Satava of the Yale University Medical School, the surgeon suffers “a loss of 3-D vision, dexterity and the sense of touch.” As medicine moves toward more laparoscopic procedures, whether performed telesurgically or on site, haptics-based simulators can help doctors overcome those sensory deficits.

Haptics also promises to push the frontiers of robotic surgery. For example, Satava says it probably will become possible for surgeons to plan minimally invasive operations using feedback from haptic renderings of the procedure, then allow robots to do the actual cutting. Some advances in this direction have already occurred. Computer Motion, based in Santa Barbara, CA, has sold about 40 of its ZEUS Robotic Surgical Systems, in which doctors control robots that wield three tool-tipped arms inserted into the patient. One of those robotic arms transmits forces experienced by the tool back to the human surgeon’s hand. A chief competitor is Intuitive Surgical of Mountain View, CA, whose flagship product, the da Vinci Surgical System, offers a similar combination of computers and robotics.

This future era of haptics-enhanced telesurgery and robotic surgery, Satava says, will require new models of human beings that include haptic parameters such as the resistance that various tissues offer when a knife or needle goes through them. A good place to start, he says, would be to expand on the National Library of Medicine’s Visible Human Project, launched in the mid-1990s, in which male and female cadavers were sectioned at intervals of 1 millimeter and 0.33 millimeter, respectively. Each section was imaged by CAT scan, photography and other tools, then digitized and integrated into an enormous visual and informational database. “Now we need to add all the properties for the sense of touch,” Satava says. “That way, when a doctor gets a CT scan, he would be able to use a haptic interface to feel the 3-D model displayed on a computer screen.”
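
One way to picture the haptic atlas Satava describes: each tissue label in the dataset would carry mechanical parameters alongside its imagery, and the simulator would turn those parameters into resistance at the tool tip. The toy sketch below shows the idea; the property names and every numeric value are invented for illustration, not drawn from any real measurement.

```python
from dataclasses import dataclass

@dataclass
class TissueHaptics:
    """Haptic parameters a Visible Human-style atlas might attach to
    each tissue label. Every number below is invented."""
    name: str
    stiffness_n_per_m: float   # spring-like resistance to indentation
    damping_n_s_per_m: float   # viscous drag as the tool moves
    puncture_force_n: float    # force at which a needle pops through

ATLAS = {
    "skin":   TissueHaptics("skin",    800.0, 2.0, 6.0),
    "fat":    TissueHaptics("fat",     200.0, 1.0, 1.5),
    "muscle": TissueHaptics("muscle", 1200.0, 4.0, 4.0),
}

def needle_resistance(tissue, depth_m, speed_m_per_s):
    """Force felt at the needle before puncture: spring term plus
    damping term, capped at the tissue's puncture threshold."""
    f = (tissue.stiffness_n_per_m * depth_m
         + tissue.damping_n_s_per_m * speed_m_per_s)
    return min(f, tissue.puncture_force_n)

for layer in ("skin", "fat", "muscle"):
    t = ATLAS[layer]
    print(f"{layer:7s} resistance at 2 mm depth, 5 mm/s: "
          f"{needle_resistance(t, 0.002, 0.005):.2f} N")
```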

And Satava and Merril have an even more exotic vision for robotic medical haptics. A master surgeon could haptically record an intricate brain operation on a virtual rendering of a patient. A novice surgeon could then use the master’s recording to rehearse the procedure, complete with tactile feedback personalized for that patient. Add robotics to this scenario, and it might even become possible for a surgeon to perform an operation on himself by doing it first in the virtual world and then unleashing a robot to “play back” the real thing. But that’s a long way off. What’s clear, Raemer says, is that today’s applications of haptics technology “are just barely scratching the surface.”
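
Stripped to its core, the record-and-playback idea is a timed trajectory of tool poses and forces that can be captured once and rendered again later. A bare-bones sketch, with a made-up data layout:

```python
import json

def record_stroke(samples):
    """Serialize a timestamped trajectory of tool positions and forces,
    as a master surgeon's haptic recording might be captured."""
    return json.dumps(
        [{"t": t, "pos": pos, "force": f} for t, pos, f in samples]
    )

def play_back(recording, render):
    """Replay the stored samples in time order. A trainee's device (or,
    in the far-off robotic scenario, a surgical robot) would render
    each sample at its recorded timestamp."""
    for s in sorted(json.loads(recording), key=lambda s: s["t"]):
        render(s["t"], s["force"])

stroke = record_stroke([
    (0.000, [0.0, 0.0, 0.00], 0.0),  # approach: no contact yet
    (0.001, [0.0, 0.0, 0.01], 1.2),  # first contact with tissue
    (0.002, [0.0, 0.0, 0.02], 1.6),  # steady cut
])
play_back(stroke, lambda t, f: print(f"t = {t * 1000:.0f} ms   force = {f:.1f} N"))
```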
