The Future of Computing, According to Intel
Massively multicore processors will enable smarter computers that can infer our activities.
Intel recently demonstrated a new, low-power computer chip that will use as many as eight cores, or processing units. Expected in the second half of 2008, the new chip will increase the amount of data that a machine can process and enable more-realistic graphics. But Andrew Chien, the director of Intel Research, is looking beyond eight-core chips and into the range of terascale computing, in which machines with tens or hundreds of cores perform trillions of operations every second. Chien is working with computer scientists at Intel and at universities around the world to find the best uses for these future machines.
Chien is speaking at the Emerging Technologies Conference today about Intel’s exploratory research projects. Technology Review caught up with him beforehand to ask about the chip maker’s research goals.
Technology Review: What are the major projects at Intel Research?
Andrew Chien: One of the things that we’re very focused on is this idea of inference and understanding the world. The big idea is all about this question of whether inference and sensors are really the missing piece to make ubiquitous computing come to fruition. We can build small devices that fit into our pocket, but the things we’re falling short on are inference, making the devices work together well, and making them interact with us in natural ways.
Another area of research is obviously terascale computing. This has dual benefits. One very important benefit is to create the computing ability that’s going to power unbelievable applications, both in terms of visual representations, such as this idea of traditional virtual reality, and also in terms of inference. The ability for devices to understand the world around them and what their human owners care about is very exciting.
TR: Why would anyone want their gadgets to infer their behavior? Walk me through an example.
AC: One of the initial steps is to build systems that understand what we’re doing and understand the importance of different activities in our lives. Now, more than ever, we’re always connected. Imagine a phone that could be aware of when you get into a line at an airport. There’s a difference in what you want to be interrupted with when you’re idle, standing in a line, [versus] when you’re going through the security procedure. If the sensor detects your motion and other information from your environment, such as the Internet signal, and it has knowledge of your past behaviors, it can actually figure out whether it’s crucial that an incoming phone call goes through. Is it your five-year-old who’s upset, or a friend you talk to all the time? Do you need to take that call right away? The intelligent system could be using sensors, analyzing speech, gauging your mood, and determining your physical environment. Then it could decide whether and how that notification comes through in that context.
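The decision Chien describes could be sketched as a small rule-based filter. Everything below is a hypothetical illustration, not an Intel API: the class names, activity labels, and thresholds stand in for what a trained model would learn from sensor history.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Hypothetical snapshot of what the phone's sensors report."""
    activity: str           # e.g. "idle", "waiting_in_line", "security_check"
    caller_priority: float  # learned from past behavior: 0.0 (unknown) to 1.0 (family)

def should_interrupt(ctx: Context) -> bool:
    """Decide whether an incoming call rings through right now.

    The thresholds are illustrative placeholders for what an
    inference system would learn, not real tuned values.
    """
    # While being screened at security, only the most urgent calls ring.
    if ctx.activity == "security_check":
        return ctx.caller_priority > 0.9
    # Standing idle in a line, the bar for interruption is much lower.
    if ctx.activity in ("idle", "waiting_in_line"):
        return ctx.caller_priority > 0.2
    # Default: ring only for callers the system rates above average.
    return ctx.caller_priority > 0.5
```

In this toy version, a call from the upset five-year-old (high learned priority) would ring through even at the security checkpoint, while a low-priority call would be held until the user is idle in line.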
TR: The idea that you have sensors that record your activities raises quite a few privacy concerns. How is Intel addressing that?
AC: One of the things Intel is driving hard is [figuring out] how to build platforms with integrity. That means that they are securable, and someone can’t come in and take over your machines. There are also a lot of interesting questions about how much data you keep local, on your personal device, how much data you upload to the cloud, and which data you choose to destroy. It comes down to finding what people are comfortable with.
TR: Why is inference possible now?
AC: One thing is that computing systems are now able to tap into all the data that’s available on the Internet and learn from it. For instance, object recognition in machines is getting better because we are able to learn from all the pictures available on the Internet. (See “Better, More-Accurate Image Search.”) The same thing goes for language translation systems making use of the United Nations’ corpus of documents in Arabic and Chinese. This is also being fueled by disk drives getting big and cheap, and the powerful transition to nonvolatile memory. Being able to have random access to data with very low power is going to have a revolutionary impact.
TR: How does terascale computing fit into all this?
AC: In order to figure out what you’re doing, the computing system needs to be reading data from sensor feeds, doing analysis, and computing all the time. This takes multiple processors running complex algorithms simultaneously. The machine-learning algorithms being used for inference are based on rich statistical analysis of how different sensor readings are correlated, and they tease out obscure connections. Right now these algorithms run on large systems built for a specific purpose, and it takes a PhD to get them to work. We are looking forward to having these algorithms available in an API [application programming interface] that you can call on, like a platform service that is as routine to access as a file system. That way, the average programmer without a PhD can make use of these machine-learning algorithms.
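The "inference as a platform service" idea might look like the sketch below: an application programmer calls a simple API and never touches the underlying statistical machinery. All names here are hypothetical illustrations; the hand-written rule stands in for the learned models a real service would run.

```python
class InferenceService:
    """Stand-in for a platform inference API, meant to be as routine
    to call as a file system. Illustrative only, not a real interface."""

    def __init__(self):
        self._feeds = {}

    def register_feed(self, name, reader):
        """Attach a sensor feed; `reader` returns the latest reading."""
        self._feeds[name] = reader

    def infer(self, question):
        """Answer a high-level question from current sensor readings.

        A real service would run learned statistical models over the
        feeds; this toy version answers one question with one rule.
        """
        readings = {name: read() for name, read in self._feeds.items()}
        if question == "user_is_moving":
            # Treat a strong accelerometer signal as "moving".
            return readings.get("accelerometer", 0.0) > 0.5
        raise ValueError(f"unknown question: {question}")

# Usage: the application never sees the statistics, only the question.
svc = InferenceService()
svc.register_feed("accelerometer", lambda: 0.8)  # fake sensor reading
moving = svc.infer("user_is_moving")
```

The design point is the one Chien makes: hiding the statistical analysis behind a stable call interface is what would let programmers without machine-learning expertise build on it.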
TR: How far away are we from seeing this in consumer gadgets?
AC: Machine learning and inference technology have been accepted by a broad slice of the research community, but we’re mired at a moderate level of quality. It’s not unusual for these systems to get things right 80 percent of the time. The scientific community says that’s great. But it wouldn’t be helpful to have a personal assistant that looked at you and only correctly knew what you were doing 80 percent of the time. Likewise, a computer isn’t going to be helpful if it’s wrong that often.
Ultimately, I think it’s a dance between how well the algorithms will be able to work, and how people react to them being wrong. Within five years, I think you’re going to see significant advances in performance. You’ll see demonstrations in the research world that are credible. I think the mainstream marketplace could pick up on it three years later, but at that point it’s hard to predict. The precursors for this technology are all there, though, and I see a huge need for it.