Many websites offer application programming interfaces (APIs) that let programmers tap directly into their data and capabilities. This makes it possible to add the latest Twitter headlines to a desktop application, or combine content from several sites in a so-called “mash up.” Wikipedia Vision, for example, uses the Wikipedia API and the Google Maps API to show new entries to the online encyclopedia by the geographical location from which the entry was posted.
But many websites now offer a stream of data in near real time. Such streams offer new possibilities for developers. A company called Kynetx has developed a programming language, called the Kynetx Rule Language (KRL), to provide more sophisticated ways of using this data.
Phil Windley, Kynetx’s founder and chief technology officer, says KRL can help developers make more of a Web stream. Using an API usually involves writing a program to go and fetch data as needed. But Windley argues that it can be far more useful to track data more or less constantly. “To interact with Web streams, you need a different way to think about programming,” he says.
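The difference Windley describes can be sketched in a few lines of Python. This is an illustrative contrast between the two models, not code from KRL or any real API; all names here are invented.

```python
# Request/response model: the program asks for data only when it needs it.
def fetch_latest(source):
    """Simulate a one-off API call that returns the current value."""
    return source["current"]

# Stream model: the program registers a handler once and reacts
# continuously as new data arrives.
class Stream:
    def __init__(self):
        self.handlers = []

    def subscribe(self, handler):
        self.handlers.append(handler)

    def emit(self, event):
        for handler in self.handlers:
            handler(event)

source = {"current": 42}
print(fetch_latest(source))  # one-off fetch: 42

stream = Stream()
seen = []
stream.subscribe(seen.append)  # react to every event as it arrives
for value in (1, 2, 3):
    stream.emit(value)
print(seen)  # [1, 2, 3]
```

In the first model the program decides when to look; in the second, arriving data drives the program, which is the shift in thinking Windley argues streams require.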
Windley was inspired to develop KRL by database software that can handle floods of real-time information, such as the software that responds rapidly to data from financial markets. Since many APIs can now supply floods of data, he realized that it might be useful to handle these feeds with similar speed.
KRL allows a programmer to write rules that react when particular data appears in a feed. For example, a KRL program might be designed to check real-time data on product prices when a person enters a store, as revealed by a geolocation feed.
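The rule pattern the article describes can be sketched in Python as follows. The event names, fields, and the store-entry scenario are invented for illustration; real KRL has its own syntax and runtime, which this does not reproduce.

```python
# A registry of (event_type, condition, action) rules. A rule fires when
# an event of the right type satisfies its condition.
rules = []

def rule(event_type, condition):
    """Register an action to run when a matching event appears in the feed."""
    def register(action):
        rules.append((event_type, condition, action))
        return action
    return register

def dispatch(event):
    """Feed one event to every rule whose type and condition match."""
    for event_type, condition, action in rules:
        if event["type"] == event_type and condition(event):
            action(event)

fired = []

@rule("location", condition=lambda e: e["place"] == "store")
def check_prices(event):
    # In a real system this might query a live price feed for the store.
    fired.append(f"checking prices at {event['place']}")

dispatch({"type": "location", "place": "store"})
dispatch({"type": "location", "place": "home"})
print(fired)  # ['checking prices at store']
```

The program never polls: each incoming event is matched against the rules, and only the store-entry event triggers the price check.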
KRL makes it simple for a programmer to write applications that make use of data from many different Web services, and store that data in the cloud, Windley says. KRL is free to use and test, but Kynetx charges for it if it’s incorporated into a commercial application.
Brian Mulloy, vice president of products for Apigee, a company that helps businesses create and manage APIs, says Kynetx’s approach has promise. He notes that KRL assumes that the data it is fed will change, and can adapt accordingly. He says this will provide important new ways to think about connecting different websites. For example, he says, sufficiently intelligent interfaces might be able to guess at what a user is trying to do and suggest other complementary Web services to connect to.
Other companies are building technology to handle the data spewed out by modern Web application interfaces. Gnip, based in Boulder, Colorado, connects to a variety of services, including Twitter, Facebook, and YouTube, and repackages data and functions from these services to make them simpler for other companies to use.
Jud Valeski, CEO of Gnip, says few developers or companies are set up to handle the torrent of data many Web services provide—they don’t have the bandwidth, storage, or real-time processing power. He agrees that new approaches are needed to handle these kinds of feeds. Gnip recently started selling access to filtered portions of Twitter’s real-time stream. Valeski says programmers can apply their own rules in turn to further refine the data and how it is handled. Better tools for handling this kind of data will be part of the next wave of big change on the Web, he predicts.
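The layered filtering Valeski describes can be sketched as two stages: an upstream service delivers a pre-filtered stream, and the consuming program applies its own rules to refine it further. The keywords, message format, and function names below are invented for illustration and do not reflect Gnip's actual interface.

```python
def upstream_filter(stream, keyword):
    """First-stage filter, as a provider might apply before delivery."""
    return (msg for msg in stream if keyword in msg["text"])

def consumer_rules(stream, rules):
    """Second-stage filters applied by the consuming program."""
    for msg in stream:
        if all(rule(msg) for rule in rules):
            yield msg

# A toy stand-in for a real-time firehose of messages.
firehose = [
    {"text": "new phone released", "lang": "en"},
    {"text": "phone battery tips", "lang": "de"},
    {"text": "weather today", "lang": "en"},
]

refined = consumer_rules(
    upstream_filter(firehose, "phone"),
    rules=[lambda m: m["lang"] == "en"],
)
print([m["text"] for m in refined])  # ['new phone released']
```

Because both stages are generators, each message is filtered as it arrives rather than stored and queried later, which is the point of handling such feeds in real time.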