IBM and the Beacon Institute, a nonprofit scientific-research organization in Beacon, NY, have announced a collaboration with several other research institutions to create an environmental-monitoring system for New York’s Hudson River. Their plan is to turn all 315 miles of the river into a distributed network of sensors that will collect biological, physical, and chemical information and transmit it to a central location, where it will be analyzed by IBM’s new data acquisition and analysis system. According to John Cronin, CEO of the Beacon Institute, the project is now in its “design phase,” which should be complete within a year and a half to two years.
The network’s sensors will be deployed in a variety of ways. Some will be mounted on a new robotic underwater vehicle developed by Rensselaer Polytechnic Institute (RPI) and the Woods Hole Oceanographic Institution, both collaborators on the project; the solar-powered vehicle will be able to operate either autonomously or under human remote control. Other sensors will be suspended from buoys or fixed in place along the riverbed.
“In terms of having an integrated network of sensors, and given the magnitude of it for the Hudson River, this project is without a doubt a huge advancement and on a much larger scale than anything that has been done before,” says Sandra Nierzwicki-Bauer, director of the Darrin Fresh Water Institute at RPI and a member of the science-research committee at the Beacon Institute.
The scale of the network and the variety of its sensors will demand a massive new data-analysis system, which IBM will provide. Comprising both distributed-processing hardware and analytical software, the system is designed to take heterogeneous data from a variety of sources and make sense of it in real time. The software learns to recognize patterns and trends in the data and prioritizes the streams that matter. If a data stream begins to exhibit even minor variations, the system automatically redirects computing resources toward it. The system will also be equipped with IBM’s visualization technologies; fed with mapping data, they can create a virtual model of the river and simulate its ecosystem in real time.
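The prioritization idea described here can be illustrated with a small sketch. This is not IBM’s actual system; it is a hypothetical example, assuming each sensor stream keeps a rolling baseline and reports a variation score (a z-score against its recent history), so that streams showing unusual behavior can be pushed to the front of the analysis queue. The class and function names (`StreamMonitor`, `prioritize`) are invented for illustration.

```python
from collections import deque
import statistics


class StreamMonitor:
    """Tracks a rolling window of readings for one sensor stream and
    scores how far a new reading deviates from the recent baseline."""

    def __init__(self, name, window=20, min_history=5):
        self.name = name
        self.window = deque(maxlen=window)  # rolling baseline of recent readings
        self.min_history = min_history

    def variation(self, value):
        """Z-score of `value` against the rolling window (0.0 if too little history)."""
        if len(self.window) < self.min_history:
            return 0.0
        mean = statistics.fmean(self.window)
        stdev = statistics.pstdev(self.window)
        if stdev == 0:
            # Constant baseline: any deviation at all is maximally surprising.
            return 0.0 if value == mean else float("inf")
        return abs(value - mean) / stdev

    def add(self, value):
        """Score the reading, then fold it into the baseline."""
        score = self.variation(value)
        self.window.append(value)
        return score


def prioritize(monitors, latest):
    """Order stream names by how anomalous their latest reading is, so
    downstream analysis can direct resources to the most variable streams."""
    scores = {name: monitors[name].add(value) for name, value in latest.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

In this sketch, a steady salinity stream would score near zero while a temperature stream that suddenly spikes would jump to the head of the priority list; a real system would add decay, per-sensor thresholds, and actual resource scheduling on top of the scoring.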
The IBM system “enables us to do a great deal of work in the area of data integration and data management for very large volumes and different types of data,” says Harry Kolar, Global Alliance executive at IBM. “Another reason we are working in this sensor area is that we can actually build end-to-end solutions, meaning from the smallest device to a large back-end system.”
Sensor networks to monitor everything from sewage systems to battlefields have been under development for many years, at companies like Intel, Sun, and Siemens and at academic institutions like the University of California, Los Angeles. But what the “research community has not had is the making-meaning part,” says David Culler, a professor of computer science at the University of California, Berkeley. That’s what the IBM system is intended to provide.
“A lot of what the research community has been focused on is getting sensors and delivering them through reliable, energy-efficient networks to the computing infrastructure,” says Culler. “But once you have the data, what do you do with it, and how do you sort it?”