Using IT to Drive Innovation
Despite the vast amounts of computing and communication power in corporate hands, companies are at the early stages of using IT to revamp business practices, become more efficient, and drive the next wave of national productivity growth.
That’s the argument made by Erik Brynjolfsson, director of the MIT Center for Digital Business at the Sloan School of Management. He says most companies still aren’t using IT effectively to do things like measure the success of promotions or the performance of supply chains—data that can inspire changes that fatten revenues and ultimately benefit consumers.
He spoke with David Talbot, Technology Review’s chief correspondent, about how leading companies use IT to test new ideas, adopt successful changes, and disseminate innovations quickly and cheaply.
TR: You’ve been making the case that businesses need to increase their “information metabolism.” What do you mean by that?
Brynjolfsson: There have been huge advances in the underlying technology of computers and communications. But to make them effective, companies have to change their business processes and the way they organize decision making. There are many high-tech companies that are effective in using IT. Amazon and Cisco come to mind as companies that have fundamentally changed their culture, are data-driven, and use the data to drive decisions. There are also companies you don’t think of as high-tech—like Harrah’s [now known as Caesars Entertainment], CVS, and Walmart—that have been aggressive. That doesn’t necessarily mean they spend more on IT. But it does mean they use IT to rethink business processes.
In what kinds of ways? What are some examples?
Amazon runs 200 experiments a day, such as trying out different algorithms for recommending products, or changing where they put the shopping cart on the screen. When they moved the shopping cart from the left to the right of the screen, there was a few tenths of a percent improvement in the rate of abandoned shopping carts. That might not seem like much, but it’s meaningful with hundreds of millions of site visits, and the cost of running the experiment was trivial.
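The reasoning behind that experiment can be sketched as a standard two-proportion comparison. The numbers below are purely illustrative (not Amazon's actual figures); they show why a few tenths of a percent becomes unambiguous once hundreds of millions of visits are involved.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is a difference in rates signal or noise?"""
    p_a = success_a / n_a
    p_b = success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z

# Hypothetical numbers: 10 million visits per arm, and the
# abandonment rate drops from 30.0% to 29.7% -- "a few tenths of a percent."
rate_left, rate_right, z = two_proportion_z(3_000_000, 10_000_000,
                                            2_970_000, 10_000_000)
print(f"abandonment: {rate_left:.1%} -> {rate_right:.1%}, z = {z:.1f}")
```

At this traffic volume the z-statistic is enormous (well above the usual 1.96 threshold), so even a tiny effect is detectable with near certainty; the same 0.3-point change on a few thousand visits would be indistinguishable from noise.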
That sounds like something that would come naturally for a big e-commerce company. But what about “traditional” companies?
Offline companies like Harrah’s are trying different promotions and incentive systems. Harrah’s has gone from being a third-tier gaming and casino company to the largest one in the world, in large part because of their use of data and analytics. They collect detailed data on customer visits with their Total Rewards card. And the culture is one of “Let’s put forward a hypothesis and test it.” What if you give a steak dinner or a straight discount? Maybe different demographics respond to different incentives. Then these different groups get slightly different offers. This requires a management that steps back from the traditional ego of “I know all the answers.”
What other methods are companies using?
Some are using IT to replicate innovations as they are developed. They take an idea that works well in one location, embed it in software, and replicate it in thousands of locations.
So is this sort of experimentation—and rapid implementation and diffusion of improved processes—widely done?
Right now relatively few companies in the U.S. economy are using this methodology. But when [Caesars CEO] Gary Loveman spoke to my class at MIT, he said that what he did at Harrah’s, he could have done at most of the companies in most other industries. You can see that as more companies do this, they will find better ways to increase customer satisfaction, increase efficiencies, and make supply chains work better. The scientific method brought amazing progress to the sciences. Now it’s being used in management, and I expect similar results.
How do improved revenues for companies like Caesars or Walmart help consumers?
In the short run, companies that use IT see higher profits and stock market appreciation. Over time, as competitors learn how to do that, profits get competed away, and most of the benefits accrue to consumers. Ultimately, productivity growth determines living standards and the wealth of nations.
How close are we to seeing these broader benefits?
It’s fair to say that in most industries 70, 80, 90 percent of companies aren’t even close to using IT to its full potential. You might have thought that companies were converging as more of them learned best practices. But we looked at the data and found that rather than firms becoming more similar, the leaders were pulling away from the laggards. That suggests to me that rather than this being a mature, stable technology, if anything, there is a new frontier opening up.
What sectors are slow to catch on?
Health care, education, and parts of manufacturing have not been as quick to embrace IT. Health care can learn lessons from other industries; I would put it 20 years behind the rest of American industry. If providers can adopt some of the practices we’ve seen in the rest of the economy, we will see dramatically lower health-care costs.
How should a business get started adopting these ideas?
In my book [Wired for Innovation: How IT Is Reshaping the Economy] I describe seven principles of digital organization. It starts with the digitization of analog business processes, but it also includes a shift toward decentralized power; broader information sharing; tighter linkage of performance to compensation; more emphasis on hiring high-quality people and screening candidates carefully; and more investment in training and education for the workforce once hired.
Right now the scarce resource is not data. We’ve got tons of data sitting around. According to [Google CEO] Eric Schmidt, there was more data created in the last two days than in all of history until 2000. The scarce resource is figuring out how to use the data efficiently—not with more computers, but by changing how companies are run.