
Virtual Assembly

February 1, 1997

“Some assembly required.” This phrase strikes fear into the heart of every parent faced with putting together a bicycle on the eve of a kid’s birthday. But it’s also an intimidating fact of life for manufacturing firms, perhaps none more so than aircraft makers, whose employees must often piece together bewildering assortments of parts using volumes of jumbled instructional verbiage. Wouldn’t it be nice if these complex transportation systems could tell an assembler how to put them together?

For some airplane-factory workers, certain plane parts can do just that, with the aid of augmented reality (AR). This technology melds virtual-reality viewers or other visual displays with positional trackers and ever smaller and faster computers to provide real-time assembly instruction. Unlike virtual reality, in which the user is completely immersed in an artificial world, AR lets you see the real world as well as an additional overlay of information that appears attached to the workpiece itself.

Seeing instructions and diagrams superimposed onto a workpiece beats “lining things up by sight, measuring, or trying to figure out what to do from a blueprint,” says Ulrich Neumann, a professor of computer science at the University of Southern California, who designs augmented-reality units for use in assembly operations. The technique is ideal, he says, for any jobs that are so complex that the operator is continually looking for instructions.

Anthony Majoros, a senior engineering scientist at McDonnell Douglas Aerospace in Long Beach, Calif., recently began testing one of Neumann’s prototype systems. Pointing to small sections of the fuselage of a DC-10 aircraft, he explains that the AR device can aid assembly workers by highlighting intended drilling locations or displaying explanations of where and how a particular sealant should be applied.

The prototype system consists of a video camera on a tripod that is connected to a Silicon Graphics workstation with a flat-panel color monitor, all on a roll-around cart. When a worker wheels the cart to certain sections of the aircraft and aims the camera, the computer looks for fiducial markings (pre-placed dots, targets, or crosses, or natural features like holes, seams, or bumps) and uses pattern-recognition software to determine the particular unit under construction and establish the correct spatial relationship between the camera and the object. The computer then calls up the appropriate graphics and instructions, which have been preprogrammed into the system, and superimposes them in the proper orientation over the assembly on the computer screen. Once an assembly step is accomplished, the operator triggers the next procedure using a keypad.
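The identify-then-instruct logic can be sketched in a few lines. This is only an illustrative outline of the workflow described above, not the actual McDonnell Douglas software; the marker names, unit names, and instruction strings are all made up.

```python
# Illustrative sketch: match the set of fiducials the camera detects against
# known assemblies, then step through preprogrammed instructions via keypad.
# All identifiers and data below are hypothetical.

KNOWN_UNITS = {
    frozenset({"dot_A", "cross_B", "target_C"}): "DC-10 fuselage section 41",
    frozenset({"dot_A", "seam_D"}): "DC-10 fuselage section 46",
}

INSTRUCTIONS = {
    "DC-10 fuselage section 41": [
        "Highlight drilling locations on stringer 12",
        "Apply sealant along lap joint, 3 mm bead",
    ],
}

def identify_unit(detected_markers):
    """Return the assembly whose fiducial signature matches what the camera sees."""
    return KNOWN_UNITS.get(frozenset(detected_markers))

def next_step(unit, step_index):
    """Advance to the next preprogrammed instruction when the operator hits the keypad."""
    steps = INSTRUCTIONS.get(unit, [])
    return steps[step_index] if step_index < len(steps) else "Assembly complete"
```

In the real system, the same recognition step also yields the camera-to-workpiece pose, so the retrieved graphics can be drawn in the correct orientation on screen.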

Farther up the coast in Bellevue, Wash., aircraft maker Boeing is also exploring the potential of augmented reality but with more portable, “wearable” systems. Boeing project manager David Mizell believes that these “garments” would be perfect for a number of complex manufacturing and assembly jobs, particularly those that require two free hands to reach inaccessible places.

Users of Mizell’s system wear a modular, 2.75-pound Honeywell computer around the waist like a skin diver wears a weight belt. They also don a see-through visor from Digital Vision Corp. that swivels down from a headband over one eye, and a head-mounted camera from TriSen Corp. that looks for fiducial landmarks on the assembly. When the user’s head moves side to side or back and forth, Mizell explains, the computer keeps track of the position of the fiducial markings and automatically realigns the overlaid information.

Resolving Ambiguities

Boeing’s first application of the technology has been in its “wire shop,” where workers assemble wire bundles that connect circuits from one section of an aircraft to another. Each plane has about 1,000 such bundles, each of which must be preassembled on 3-by-8-foot pressboard sheets studded with pegs. The conventional assembly technique, which relies on myriad markings on plotter paper glued to the pressboard, is cumbersome because the bundles have to be assembled in hundreds of different ways depending on the aircraft model. “Storing 1,000 unique boards for bundles requires a lot of space,” says Mizell, “and many are changed for individual customers.” With AR, wiring configurations for all models are stored in the computer. When the aircraft model is called up, the computer displays how the bundles should be assembled one wire at a time using guide lines that appear over the blank board. This eliminates ambiguity and improves efficiency, he says. Moreover, any board can now be used for any bundle.
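The space savings come from a simple substitution: instead of 1,000 physical formboards, wiring configurations live in a lookup keyed by aircraft model, and any blank board can serve any bundle. A minimal sketch of that idea, with made-up model names and coordinates:

```python
# Illustrative sketch: stored wiring configurations replace physical boards.
# Each wire is routed one at a time as guide lines over a blank board.
# Model names, wire labels, and coordinates are hypothetical.

WIRING = {
    "747-400": [
        ("W101", [(0.1, 0.2), (1.5, 0.2), (1.5, 0.9)]),
        ("W102", [(0.1, 0.4), (2.8, 0.4)]),
    ],
    "767-300": [
        ("W201", [(0.3, 0.1), (0.3, 0.7), (2.2, 0.7)]),
    ],
}

def guide_lines(model):
    """Yield each wire's label and routing path in assembly order,
    as the operator steps through the bundle on a blank board."""
    for label, path in WIRING.get(model, []):
        yield label, path
```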

As promising as the results have been so far, researchers caution that the technology has been tested only under carefully controlled conditions and is not yet ready for prime time. The main limitation, says Neumann, is with tracking. The system often has difficulty repositioning the overlaid information in exactly the right spot over the workpiece when the camera’s view is distorted by bad lighting or occluded, he explains. For example, if an operator blocks the camera lens with a wrench while tightening a bolt, the onscreen overlay could drift as the operator’s head turns. “It isn’t very helpful if an arrow is suddenly pointing to the wrong place,” he says.

Neumann says the consensus is that hybrid trackers, units that have two or more different tracking technologies built in, will be required to solve the problem. One system, which is being tested in combination with the video pattern-recognition approach used by McDonnell Douglas and Boeing, is called magnetic tracking. This technique relies on an electrical device containing huge magnetic coils that generate three magnetic fields aligned at right angles to one another in the space surrounding the work area. A sensor in the AR helmet measures the relative strength of each field to divine the camera’s precise orientation with respect to the workpiece.

Eric Foxlin, chairman and vice-president of research and development at InterSense Corp. in Cambridge, Mass., is developing prototypes of an “acoustic inertial” hybrid tracker for a range of potential AR tasks, including Boeing’s “wire shop” application. In one such unit, three ultrasonic speakers placed at right angles to each other on the helmet send out ultrasonic chirps or pulses to microphones placed around the work area. By measuring the time it takes for the pulses to reach the microphones, the computer can calculate how far the headset is from each one and determine its precise location and orientation. The tracking unit also uses gyroscopes and accelerometers to measure the worker’s head movements to help keep track of the camera’s position.
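The time-of-flight arithmetic behind the acoustic half of such a tracker is straightforward: each pulse travel time maps to a range (distance = speed of sound × time), and ranges to enough known microphone positions pin down the headset's location. The sketch below is only a minimal geometric illustration with made-up microphone positions, not InterSense's algorithm; it uses four microphones so the position falls out of a linear system.

```python
import math

# Illustrative sketch of acoustic time-of-flight positioning: convert pulse
# travel times to distances, then trilaterate from four microphones at known
# locations. Positions and the simple solver are assumptions for illustration.

SPEED_OF_SOUND = 343.0  # m/s in air, approximate

def distances_from_times(times_s):
    """Each pulse travel time maps to a range: distance = speed x time."""
    return [SPEED_OF_SOUND * t for t in times_s]

def trilaterate(mics, dists):
    """Solve |p - m_i|^2 = d_i^2 for p = (x, y, z). Subtracting the first
    equation from the rest cancels the quadratic terms, leaving a 3x3
    linear system."""
    m0, d0 = mics[0], dists[0]
    A, b = [], []
    for mi, di in zip(mics[1:], dists[1:]):
        A.append([2 * (mi[k] - m0[k]) for k in range(3)])
        b.append(d0**2 - di**2 + sum(c * c for c in mi) - sum(c * c for c in m0))
    return _solve3(A, b)

def _solve3(A, b):
    """Cramer's rule for a 3x3 linear system."""
    def det(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
              - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
              + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    d = det(A)
    out = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]
        out.append(det(Ac) / d)
    return out
```

Orientation comes from having three speakers on the helmet rather than one: three recovered positions fix the headset's attitude as well as its location.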

Any technique, by itself, has limitations, says Foxlin. Metal objects can distort magnetic fields, any solid object can distort ultrasonic signals, and slightly inaccurate readings from the gyroscopes and accelerometers can accumulate rapidly and cause drift. But with two systems providing continual updates, he says, one can compensate when the other fails.
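The compensation Foxlin describes can be illustrated with the simplest possible fusion scheme, a complementary filter: trust the fast-but-drifting inertial estimate between absolute fixes, and pull the estimate back toward each fix to cancel accumulated error. This is a generic textbook technique shown for illustration, not a description of InterSense's filter; the blend gain and rates are arbitrary.

```python
# Illustrative complementary filter: integrate a gyro rate for responsiveness,
# and correct toward an absolute (e.g. ultrasonic) fix when one arrives.
# The blend gain of 0.98 is an arbitrary illustrative choice.

def fuse(angle_est, gyro_rate, dt, absolute_fix=None, blend=0.98):
    """Dead-reckon from the gyro; when an absolute fix is available,
    nudge the estimate toward it to cancel accumulated drift."""
    angle_est += gyro_rate * dt  # integration alone drifts as bias accumulates
    if absolute_fix is not None:
        angle_est = blend * angle_est + (1 - blend) * absolute_fix
    return angle_est
```

Simulating a gyro with a constant bias shows the effect: pure integration wanders off linearly, while the fused estimate stays bounded near the truth.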

As more prototype experiments prove the concept and developers home in on remedies to technical shortcomings, interest in AR appears to be growing. The early work, though limited, says Neumann, “has brought more people and different technologies to the field.” In anticipation, AR developers such as Honeywell are now targeting a wide range of tasks beyond assembly and manufacturing, including maintenance, construction, military, and even medical and surgical applications.
