The surgeon studies the face of a teenage boy whose upper jaw and cheek were destroyed by cancer years ago. Lifting his gloved right hand, he points to an area just below one of the patient’s eyes. As if by magic, an incision appears in the boy’s cheek, revealing the area of tissue and bone to be rebuilt. Pointing again, the surgeon begins a complicated procedure for transplanting bone and tissue from the boy’s hip to his face.
In the past, plastic surgeons had to be in the operating room to try procedures like these. Now some are using an experimental computer visualization tool called the Immersive Workbench, developed by researchers from Stanford University and NASA Ames Research Center, to plan and practice difficult operations. The software program combines data from CT scans, magnetic resonance images, and ultrasound to create high-resolution pictures of individual patients and display them in a virtual environment.
Unlike other software tools developed to visualize the results of plastic surgery, which rely on standard physical models of men and women, the Immersive Workbench generates images that depict the specific deformities or injuries of particular patients. The latest prototype of the software goes further, letting doctors wearing tracked-shutter glasses and special gloves test specific surgical approaches in rapid succession to see which produces the best results.
“The whole idea is to be able to interact with the virtual environment in the same way as you interact with a patient in real life, in a way that requires almost no training for the user,” says project director Dr. Michael Stephanides of Stanford University’s Division of Plastic Surgery.
The project started in 1991, when Stanford researchers began developing two-dimensional graphic renderings of patients from imaging data. Three years ago, Stephanides asked NASA Ames to create sophisticated software for building three-dimensional patient portraits from data collected in CT scans. At that time, NASA Ames engineers were spending most of their time creating visualizations of biological systems for space-related applications, but the lab’s collaboration with Stanford has led to the creation of NASA Ames’s Biocomputation Center, a new national center for research in virtual environments for surgical planning.
Plastic surgery offers a particularly rigorous challenge for software engineers and medical researchers developing virtual reality (VR) tools, since computerized renderings of patients must look almost exactly as they do in the real world. It is no small task to display human body parts at the necessary high resolution, says Kevin Montgomery, the leader of the NASA Ames group participating in this project. According to Montgomery, a 3D rendering of a human face and head contains 8 million tiny image slices that must be updated at a speed of 10 frames per second. Those processing demands approach the theoretical limit of current computers, so the NASA Ames researchers had to find ingenious ways to discard much of the raw data from patient images. Nonetheless, Montgomery’s group has been able to generate highly resolved images detailing such subtle features as small ridges of tissue, the impression of a vein beneath the skin on a human scalp, and the fine detail of a patient’s inner ear.
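The article does not describe the actual data-reduction method the NASA Ames group used. As a minimal illustrative sketch only, assuming the simplest possible approach of uniform subsampling, here is how discarding most of a dense image volume might look; the function name and the toy data are hypothetical:

```python
# Hypothetical sketch, not the NASA Ames code: shrink a dense 3D image
# volume by keeping only every `step`-th slice, row, and column.
# This cuts the data volume by roughly a factor of step**3, at the
# cost of fine detail -- real systems use far more careful reduction.

def downsample_volume(volume, step):
    """Subsample a 3D volume (a list of 2D slices) along all axes."""
    return [
        [row[::step] for row in slice_[::step]]
        for slice_ in volume[::step]
    ]

# A toy 8x8x8 volume of made-up intensity values.
volume = [
    [[z * 64 + y * 8 + x for x in range(8)] for y in range(8)]
    for z in range(8)
]

small = downsample_volume(volume, 2)  # 4x4x4: one-eighth the data
print(len(small), len(small[0]), len(small[0][0]))
```

Real medical-imaging pipelines would use smarter techniques (averaging, mesh decimation, level-of-detail rendering) that preserve clinically relevant features, but the principle of trading resolution for frame rate is the same.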
Doctors have already used the Immersive Workbench to plan some 15 surgeries involving reconstruction of bony defects in the skeleton of the face and skull. But Montgomery and Stephanides caution that the tool is still in the experimental stage. They expect clinical deployment in three to five years, when the next generation of processors and graphics cards makes $10,000 desktop computers as fast and powerful as the $100,000 graphical workstations now needed to run the software. Between now and then, the researchers hope to improve the program by creating a more intuitive graphical user interface, depicting virtual surgical instruments more accurately, and developing the capacity to update patient images in near-real time as doctors practice their procedures.
When hardware costs are no longer a limiting factor, Stephanides believes VR technology will replace current surgical planning methods and become an important tool for educating doctors in medical schools.