Faster Maintenance with Augmented Reality
In the not-too-distant future, it might be possible to slip on a pair of augmented-reality (AR) goggles instead of fumbling with a manual while trying to repair a car engine. Instructions overlaid on the real world would show how to complete a task by identifying, for example, exactly where the ignition coil was, and how to wire it up correctly.
A new AR system developed at Columbia University starts to do just this, and testing performed by Marine mechanics suggests that it can help users find and begin a maintenance task in almost half the usual time.
AR has long shown potential for both entertainment and practical uses, and the first commercial applications are starting to appear on smart phones, thanks to cheaper, more compact computer chips, cameras, and other sensors. So far, however, these apps have been mainly limited to providing directions. But researchers are also working on many practical applications, including ways to help with specific repair and maintenance tasks.
The Columbia researchers worked with mechanics from the U.S. Marine Corps to measure the benefits of using an AR headset when performing repairs to a light armored vehicle. Currently, Marine mechanics have to refer to a technical manual on a laptop while performing maintenance or repairs inside the vehicle, which has many electric, hydraulic, and mechanical components in a tight space.
A user wears a head-worn display, and the AR system provides assistance by showing 3-D arrows that point to a relevant component, text instructions, floating labels and warnings, and animated, 3-D models of the appropriate tools. An Android-powered G1 smart phone attached to the mechanic’s wrist provides touchscreen controls for cueing up the next sequence of instructions.
The idea was to present a user with the “information they need to find and fix problems in a way that is going to be more efficient and accurate,” says Steven Feiner, a professor of computer science and director of the Computer Graphics and User Interfaces Laboratory at Columbia, who carried out the research with Steven Henderson, an assistant professor at the United States Military Academy’s Department of Systems Engineering. Henderson and Feiner presented their paper at the International Symposium on Mixed and Augmented Reality (ISMAR 09) in Orlando, FL, last Thursday, where it won the conference’s Best Paper award.
The work “provides more insights into what AR can contribute in the repair and maintenance domain, and in what specific situations AR interfaces can be helpful and advantageous,” says Tobias Höllerer, cochair of ISMAR 09 and associate professor at the University of California, Santa Barbara.
Henderson and Feiner first gathered laser scans and photographs of the inside of the vehicle. They built a 3-D model of the vehicle’s cockpit and developed software for directing and instructing users in performing individual maintenance tasks. Ten cameras inside the cockpit were used to track the position of three infrared LEDs attached to the user’s head-worn display. The team suggests that in the future it may be more practical for the cameras or sensors to be worn by the users themselves.
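The core of that kind of optical tracking is triangulation: each camera that sees an LED defines a sight ray, and the LED must lie where those rays (nearly) meet. The sketch below is a hypothetical illustration, not the authors’ code; it recovers the 3-D position of a single LED from two cameras with known positions by finding the midpoint of the shortest segment between their sight rays. The real system fuses ten cameras and three LEDs to compute the full pose of the head-worn display.

```python
# Hypothetical sketch of one triangulation step in an optical tracker:
# estimate a point's 3-D position from two sight rays with known origins.
# Small vector helpers keep the example dependency-free.

def sub(a, b): return [x - y for x, y in zip(a, b)]
def add(a, b): return [x + y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(a, s): return [x * s for x in a]

def triangulate(c1, d1, c2, d2):
    """Midpoint of the shortest segment between two sight rays.

    c1, c2 -- camera centers; d1, d2 -- unit ray directions toward the LED.
    With noisy measurements the rays rarely intersect exactly, so the
    midpoint of their closest approach is a reasonable estimate.
    """
    w0 = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only if the rays are parallel
    t1 = (b * e - c * d) / denom     # parameter of closest point on ray 1
    t2 = (a * e - b * d) / denom     # parameter of closest point on ray 2
    p1 = add(c1, scale(d1, t1))
    p2 = add(c2, scale(d2, t2))
    return [(x + y) / 2 for x, y in zip(p1, p2)]
```

With exact (noise-free) rays aimed at the same LED, both closest points coincide and the function returns the LED’s true position; repeating this per LED, then fitting the known LED geometry, yields the display’s position and orientation.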
Six participants carried out 18 tasks using the AR system. For comparison, the same participants also used an untracked headset (showing static text instructions and views without arrows or direction to components) and a stationary computer screen with the same graphics and models used in the headset. The mechanics using the AR system located and started repair tasks 56 percent faster, on average, than when wearing the untracked headset, and 47 percent faster than when using just a stationary computer screen.
“From a research point of view, [this work] is the best comparison yet of the different approaches you can take between a normal multimedia system, a wearable one, and a fully augmented-reality one,” says Georgia Institute of Technology professor Blair MacIntyre, who has worked with Feiner in the past but was not involved in this project.
Next, the team wants to expand the AR system so that it tells users how to perform a task better and faster. “We believe that by paying attention to the actual task itself, and giving advice about how to do it, we could get similar types of improvements using AR,” says Feiner. “That is something we want very much to explore.” In terms of practical AR systems for widespread use, Feiner says that having displays that aren’t too cumbersome or bulky will be important.
Even though the Columbia AR system was designed to help trained personnel repair a particular vehicle, similar technology could have a broader impact, says MacIntyre. Such a system could help regular car mechanics and eventually ordinary drivers. “If you’re going to build an elaborate system with all of the information about the engine, you can then build a stripped-down version [that] makes building those end-user systems more feasible,” he says. MacIntyre adds that a smart phone app showing how to change an engine’s oil probably isn’t far off but that headset technology will probably take longer to arrive.