A View from Will Knight
Five Futuristic Interfaces on Display at SIGGRAPH
Some very interesting ideas are being showcased this week at SIGGRAPH 2009.
The annual meeting of the ACM’s Special Interest Group on Graphics and Interactive Techniques takes place in New Orleans this week. The event brings together some of the world’s best digital artists and computer researchers, and serves as a showcase for novel interfaces.
Here are five particularly cool ideas that will be on display at this year’s event.
1. Touchable Holography
A team of researchers at the University of Tokyo led by Hiroyuki Shinoda has developed a display that lets users “touch” objects that appear to float in space in front of them.
The virtual objects appear in midair thanks to an LCD and a concave mirror, while the sensation of touching them is created by an ultrasound device positioned below the LCD and mirror. That airborne ultrasound tactile device was itself first demoed at SIGGRAPH in 2008.
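The focusing trick behind such airborne tactile devices can be caricatured in a few lines of Python: fire each transducer in a phased array with a slightly different delay so that all the wavefronts arrive at one point in space at the same instant. This is a purely illustrative sketch, not the Tokyo team's code; the grid geometry, frequency, and constants are assumptions.

```python
import math

SPEED_OF_SOUND = 346.0  # m/s in air at roughly 25 degrees C (assumed)

def focus_delays(elements, focal_point):
    """Per-element firing delays (seconds) so waves emitted by every
    transducer arrive at the focal point simultaneously.
    `elements` and `focal_point` are (x, y, z) coordinates in metres."""
    dists = [math.dist(e, focal_point) for e in elements]
    farthest = max(dists)
    # Elements closer to the focus wait longer, so all wavefronts coincide.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# A small 3x3 transducer grid with 1 cm pitch, focusing 20 cm above its centre.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(-1, 2) for y in range(-1, 2)]
delays = focus_delays(grid, (0.0, 0.0, 0.2))
```

The centre element, being closest to the focal point, gets the longest delay, while the farthest corner elements fire immediately.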
2. Augmented Reality for Ordinary Toys
Frantz Lasorne, a student at L’École de Design in France, has invented an ingenious way to breathe new life into old toys.
Lasorne’s Scope display automatically recognizes ordinary toys that have been mounted onto platforms covered with hexagonal patterns. Viewed through the augmented reality display, these patterns become interactive buttons and can be used to make virtual modifications to the toy. As the video below shows, a Lego person can, for instance, be instantly armed with a giant virtual bazooka.
3. Hyper-Realistic Virtual Reality
A team from INRIA and Grenoble Universities in France will demo a new virtual reality system called Virtualization Gate that tracks users’ movements very accurately using multiple cameras, allowing them to interact with virtual objects with new realism.
The user wears a head-mounted display (HMD) and moves through a virtual space while several cameras track his movements. The video here shows a user kicking over virtual vases and pushing around a virtual representation of himself. A cluster of PCs is needed to perform the necessary image capture and 3D modeling.
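The geometric step at the heart of any multi-camera tracker of this kind, turning two cameras' lines of sight into one 3D position, can be sketched as a textbook triangulation. This is an illustrative calculation, not the Virtualization Gate pipeline; the camera positions below are made up.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(o1, d1, o2, d2):
    """Point closest to two rays (origin o, direction d): roughly where
    two calibrated cameras 'agree' a tracked feature must sit in 3D."""
    w0 = [p - q for p, q in zip(o1, o2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero only if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = [p + s * v for p, v in zip(o1, d1)]
    p2 = [p + t * v for p, v in zip(o2, d2)]
    # Midpoint of the shortest segment between the two rays.
    return [(x + y) / 2 for x, y in zip(p1, p2)]

# Two cameras at (0,0,0) and (4,0,0), both sighting a point at (2, 2, 0).
point = triangulate((0, 0, 0), (1, 1, 0), (4, 0, 0), (-1, 1, 0))
```

A real system does this for many cameras and many body features at once, which is why a cluster of PCs is needed.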
4. 3D Teleconferencing
Researchers at the University of Southern California will demo Headspin, a 3D teleconferencing system that maintains eye contact between a three-dimensional head and several participants on the other end of a connection.
To capture an image, a polarized beam-splitter “places” the camera virtually near the eyes of the speaker. The 3D display works by projecting high-speed video onto a rapidly spinning aluminum disk to generate an accurate image for each viewer.
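To get a feel for how a spinning-surface display addresses individual viewers, here is a toy calculation of the azimuth the surface faces as each successive video frame is projected; by rendering the right viewpoint for each angle, every viewer sees a correctly oriented face. The rotation speed and frame rate are invented for illustration and are not Headspin's actual figures.

```python
def frame_schedule(rpm, fps, n_frames):
    """Azimuth (degrees) the spinning surface faces when each of
    `n_frames` consecutive projector frames is shown."""
    degrees_per_sec = rpm * 360.0 / 60.0
    return [(i / fps * degrees_per_sec) % 360.0 for i in range(n_frames)]

# Illustrative numbers: 900 rpm spin, 4,320 frames-per-second projection
# gives one frame every 1.25 degrees of rotation.
angles = frame_schedule(rpm=900, fps=4320, n_frames=8)
```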
5. Scratchable Input
Chris Harrison, a researcher at Carnegie Mellon University whose human-computer interaction work we’ve written about previously, will demonstrate his new scratch input technology. The system turns any surface into an instant input device by sensing the unique sound produced when a fingernail is dragged across it.
The sensing hardware is small enough to fit into a mobile device, Harrison says, and could thus turn any surface the device rests on into an input area.
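The signal-processing idea is easy to caricature. The toy function below counts distinct "scratch" strokes in an audio signal by looking for bursts of loud samples separated by stretches of silence; Harrison's real system analyzes much richer acoustic features, and the threshold, gap length, and synthetic signal here are all made up.

```python
def count_strokes(samples, threshold=0.1, min_gap=200):
    """Count high-energy bursts (candidate scratch strokes) separated
    by at least `min_gap` consecutive quiet samples."""
    strokes, quiet, in_stroke = 0, min_gap, False
    for s in samples:
        if abs(s) >= threshold:
            if not in_stroke and quiet >= min_gap:
                strokes += 1       # a new stroke begins after enough silence
                in_stroke = True
            quiet = 0
        else:
            quiet += 1
            if quiet >= min_gap:
                in_stroke = False  # the current stroke has ended
    return strokes

# Synthetic signal: two bursts of "scratch" noise separated by silence.
burst = [0.5, -0.4] * 50
silence = [0.0] * 500
signal = silence + burst + silence + burst + silence
```

Counting strokes like this is enough to distinguish, say, a single scratch from a double scratch as two different commands.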