
IBM’s financial stream-computing system rests on three concepts, explains Halim. The first is the use of streams, data flows that move in one direction through the system. The second is that data is processed in chunks, or windows, within each stream. The third is a collection of algorithms that track the rate at which data arrives, understand the capabilities of the hardware, and direct the streams in the most efficient way. These algorithms can take a stream and “spread it around in different ways,” Halim says, and “partition it on different kinds of hardware that are specialized to do certain tasks.”
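As a rough sketch of those three ideas, and not IBM’s actual code, the Python fragment below consumes a one-directional stream, groups items into fixed-size windows, and routes each window to a handler suited to its kind of data; the window size, handlers, and routing rule are all hypothetical.

```python
from collections import deque

WINDOW_SIZE = 100  # hypothetical window of 100 items

def summarize_text(window):
    # stand-in for text analysis, e.g. scanning news reports
    return {"type": "text", "count": len(window)}

def aggregate_numbers(window):
    # stand-in for simple math on numeric market data
    return {"type": "numeric", "sum": sum(window)}

def process_stream(stream):
    """Consume a one-directional stream, chunk it into windows,
    and dispatch each window to a specialized handler."""
    text_window, num_window = deque(), deque()
    for item in stream:
        # Partition items by kind, as a toy version of routing
        # streams to hardware specialized for certain tasks.
        if isinstance(item, str):
            text_window.append(item)
            if len(text_window) == WINDOW_SIZE:
                yield summarize_text(text_window)
                text_window.clear()
        else:
            num_window.append(item)
            if len(num_window) == WINDOW_SIZE:
                yield aggregate_numbers(num_window)
                num_window.clear()
```

In a real system the routing decision would also weigh arrival rates and hardware capabilities, which this toy example ignores.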

For instance, some cores of a supercomputer might be optimized to process and summarize the text in news reports, such as those about the failing health of a company’s popular CEO, while others are better at performing simple mathematical operations on the numbers that flow into the system. IBM has developed its own stream-computing language, called Spade, that can assess the capabilities of a supercomputer and spread the data flows around appropriately, with little input from a programmer. Spade, says Halim, makes it possible for stream computing to run on other multiprocessor systems, not just Blue Gene/P.
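Spade’s actual syntax isn’t shown in the article, so the sketch below is only a loose analogy in Python: the programmer declares operators and their kinds, and a hypothetical scheduler assigns each one to suitable hardware without any placement instructions from the programmer.

```python
# Loose analogy only: a hypothetical scheduler assigns declared
# operators to whichever workers suit them. This is not Spade
# syntax, just an illustration of declarative placement.

OPERATORS = [
    {"name": "parse_news", "kind": "text"},
    {"name": "score_risk", "kind": "math"},
    {"name": "join_feeds", "kind": "math"},
]

# Hypothetical hardware profile the runtime might discover.
WORKERS = {
    "text": ["core0", "core1"],   # cores tuned for text processing
    "math": ["core2", "core3"],   # cores tuned for arithmetic
}

def place(operators, workers):
    """Assign each operator to a worker of the matching kind,
    round-robin, without the programmer specifying placement."""
    counters = {kind: 0 for kind in workers}
    plan = {}
    for op in operators:
        pool = workers[op["kind"]]
        plan[op["name"]] = pool[counters[op["kind"]] % len(pool)]
        counters[op["kind"]] += 1
    return plan

print(place(OPERATORS, WORKERS))
# e.g. {'parse_news': 'core0', 'score_risk': 'core2', 'join_feeds': 'core3'}
```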

Stream computing is not a new idea. Concepts for processing data as it enters a computer date back to the 1960s, says Saman Amarasinghe, a professor of electrical engineering and computer science at MIT. But in recent years it has become more practical, thanks to the growing popularity of multicore chips, which have multiple processing centers that crunch numbers independently. Streams of data can be broken up and partitioned across individual cores relatively easily, says Amarasinghe.
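One simple way to exploit several cores in the manner Amarasinghe describes is to split a stream into windows and hand them to a pool of worker processes. The sketch below uses Python’s standard multiprocessing module; the feed, window size, and per-window computation are made up for illustration.

```python
from multiprocessing import Pool
from itertools import islice

def crunch(chunk):
    # Stand-in for per-window number crunching on one core.
    return sum(x * x for x in chunk)

def windows(stream, size):
    """Yield successive fixed-size windows from an iterable stream."""
    it = iter(stream)
    while True:
        window = list(islice(it, size))
        if not window:
            return
        yield window

if __name__ == "__main__":
    stream = range(1_000_000)          # hypothetical numeric feed
    with Pool() as pool:               # one worker per available core
        results = pool.map(crunch, windows(stream, 10_000))
    print(sum(results))
```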

Amarasinghe adds that IBM has improved on the more academic, theoretical stream-computing work and applied it to real-world problems. “IBM has brought stream computing to high performance,” he says. “They can make it run very fast.”

Amarasinghe suspects that the popularity of stream computing will grow due to a confluence of factors. First, the chip-making industry plans to keep increasing the number of cores that it builds on its chips. Second, stream computing is a relatively straightforward programming approach to making use of these multiple cores. Third, “there’s an explosion of data,” he says, “and it’s the type of data that streams in, like video and audio.” It could even lead to more advanced user interfaces for computers that can process real-time video and audio interactions from people, he says.
