An augmented-reality headset developed at Columbia University displays virtual arrows, text labels, and warnings that change depending on where the user is looking.
Ken Crozier, an associate professor at Harvard University, shows off three optical devices he's developing in the lab. The devices use laser light to trap particles such as biological cells and might one day be integrated into clinical diagnostics.
For an easy-to-make adhesive inspired by mussels, possible applications abound.
An assistive robot at Georgia Institute of Technology, called El-E, can open drawers, doors, and even a microwave by tugging on affixed towels, much as a service dog would. El-E can also follow verbal commands, such as "tug it" or "tug it down." An assistive robot like this could benefit people with disabilities or elderly people living at home.
This video of a mouse embryo, captured 8.5 days after conception and one day after the heart begins to form, shows the visible heartbeat. The video was made using a variation on a technique called optical-coherence tomography, which researchers at the University of Houston are using to take the highest-resolution video yet of the developing mammalian heart. Their goal is a better understanding of why one percent of U.S. infants are born with cardiovascular problems.
The Semantic Web organizer Twine offers bookmarking with built-in AI.
Within this game, software automatically displaces virtual objects so that players do not intrude on one another's physical space. Called Redirected Motion, the technique could help in other situations where users share an augmented-reality space.
U.K. readers of Templar Publishing’s Drake’s Comprehensive Compendium of Dragonology can use a Web camera with their book to control an on-screen animation of a dragon.
Researchers at the University of Oxford have developed an object-recognition system that successfully tracks multiple objects in an indoor space.