“This would be very useful for mainstream computing applications,” says Julian Miller, lecturer in electronics at the University of York in England, who has used FPGAs for evolutionary computing applications. Currently, for his purposes, FPGAs are simply too slow. “It’s a huge problem,” he says. Being able to reconfigure a chip within a single clock cycle would be a great benefit, he adds.
ChaoLogix has gotten to the stage where it can create any kind of gate from a small circuit of about 30 transistors. This circuit is then repeated across the chip, which can be transformed into different arrangements of logic gates in a single clock cycle, says Ditto.
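The idea of a single tiled cell that can become any gate can be pictured with a simple software analogy. The sketch below is purely illustrative and assumes nothing about ChaoLogix's actual chaos-based circuitry: it models a generic reconfigurable cell as a two-input lookup table whose small configuration word is rewritten to change the cell's function between cycles. The class and gate names are hypothetical.

```python
# Illustrative sketch only: models a generic reconfigurable logic cell as a
# 2-input lookup table (LUT), NOT ChaoLogix's chaos-based transistor circuit.
# The point it demonstrates: one physical cell can act as any gate, and its
# function is swapped by rewriting a small configuration word.

# Truth tables for some standard two-input gates, indexed by (a, b).
GATE_CONFIGS = {
    "AND":  {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1},
    "OR":   {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1},
    "XOR":  {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0},
    "NAND": {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0},
}

class ReconfigurableCell:
    """One universal cell; a chip would tile many identical copies of it."""

    def __init__(self):
        self.config = GATE_CONFIGS["NAND"]  # arbitrary default function

    def reconfigure(self, gate_name):
        # Rewriting the configuration word is the fast "morph" step.
        self.config = GATE_CONFIGS[gate_name]

    def evaluate(self, a, b):
        return self.config[(a, b)]

cell = ReconfigurableCell()
cell.reconfigure("AND")
print(cell.evaluate(1, 1))  # -> 1
cell.reconfigure("XOR")     # same hardware, different function next cycle
print(cell.evaluate(1, 1))  # -> 0
```

In a conventional FPGA, reloading such configuration words across the whole chip takes many cycles; the claim here is that the equivalent switch happens in one.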
Despite having attracted the attention of both Intel and AMD, the technology is still in its early days, says Ditto. ChaoLogix is raising $2 million to produce a range of prototypes. But even if the company captures only a tiny slice of the chip market, it “will be huge,” says Ditto.
Besides being extremely fast, the single-circuit approach has huge advantages over FPGAs, whose design consumes a great deal of silicon real estate and power. With ChaoLogix’s chips, “you have one car in a smaller garage, and it can change between one hundred different car types,” says Ditto.
This is not the first attempt to develop single-clock-cycle reconfigurable chips. “It is well-trodden ground,” says Cantle. “Most of the companies that have tried have come and gone.” One of the challenges lies in the software required to reconfigure the chip, says Mark Parsons, commercial director of the Edinburgh Parallel Computing Centre in Scotland, who is using FPGAs to build a supercomputer as part of a joint industry and academic project. “They are still very difficult to program,” he explains. Not only is it complex to design each configuration, but each software template describing a configuration consumes computational resources of its own.
Others agree. The success of a reconfigurable chip does not depend only on what it can do, says André DeHon, assistant professor of computer science at the California Institute of Technology in Pasadena. If it proves to be too complex for most programmers, it may never get off the ground.