Is the Death of Intel Research a Harbinger of Doom for Privately-Funded Technology Research?
Intel recently killed off its three industrial-research “lablets” in Berkeley, Seattle and Pittsburgh. Hardly anyone seems to have noticed, and that’s a terrible shame: Privately funded industrial research has given the U.S. and the world some of the most dazzling engineering innovations in history, and its slow demise could stymie innovation in ways that we may never fully appreciate.
Research at Intel proper is not going away: it’s called Intel Labs, and it is, of course, focused on innovation that is within spitting distance of affecting Intel’s bottom line. In that respect it’s like almost all corporate R&D. What is going away is an experiment Intel conducted in freeing its researchers to work a bit further afield.
In this way Intel was following in the grand tradition embodied most fully by AT&T’s Bell Labs. Supported by a monopoly, researchers at Bell Labs were free to do the kind of basic research normally supported only by governments — research that wouldn’t necessarily lead to a product on any time-scale considered reasonable by today’s next-quarter-obsessed corporate world. Bell Labs brought us Unix, the laser, the transistor, and too many other inventions to name.
Subsequent industrial research labs with similar levels of freedom led to, for example, the graphical user interface, the mouse, and the laser printer (Xerox PARC, where Steve Jobs was inspired). Microsoft Research currently serves a similar role, although its focus is software.
Matt Welsh, a senior software engineer at Google, worries that Intel’s closing of its lablets may signal that the days of industrial research are numbered:
Maybe this suggests that the conventional industrial research model is simply broken. The only (important) places left that use this model are Microsoft, IBM, and HP. These companies can afford to set up big labs with lots of PhDs and pay them to do whatever the hell they want with little accountability, but maybe this model is no longer sustainable.
Randy Katz, a professor of computer science at UC Berkeley, argues in the uniformly thoughtful comments on Welsh’s post that all industrial research centers are temporary by their very (unprofitable) nature:
A wise older colleague, Lotfi Zadeh actually, observed some years ago that “research nirvanas never last forever.” In my 35-year career, I have seen the rise (and fall) of Bell Labs, Xerox PARC, and IBM Research, which were and are no longer “unfettered research laboratories, unconstrained by the winds of the marketplace.” This is not to say that they don’t do useful and good work – just that the nature of the kind of work they do is now constrained by their internal and external patrons. The Intel Lablets were an experiment for Intel that had run its course. Ten years is actually a pretty good run these days. I would argue that [Microsoft Research] has a good chance to join this list at the next (first?) regime change in management.
If Microsoft Research goes, it will mark the end of an era in which large, highly profitable, some might say monopolistic technology empires funded basic research just because they could. Intel’s alternative, which it outlines in the press release announcing the death of its lablets, is to fund research at universities (probably at a much lower net spend than it cost to run the original lablets).
This means universities get to do what they do best – academic research – but it leaves observers asking whether there is something special about the industrial research lab that has enabled some of the most profoundly important innovations of our age. Academia, after all, is fraught with its own complications – the need to win grants, the pressure to publish, academic politics – that can distract a scientist or engineer from his or her work.
Privately-funded research centers are commonplace in the biological sciences: who will fund them in software and engineering?