Before Facebook and Google—even before the Internet—scientists at MIT had a radical vision they called the computer utility.
“Computing may someday be organized as a public utility just as the telephone system is a public utility,” Professor John McCarthy said at MIT’s centennial celebration in 1961. “Each subscriber needs to pay only for the capacity he actually uses, but he has access to all programming languages characteristic of a very large system … Certain subscribers might offer service to other subscribers … The computer utility could become the basis of a new and important industry.”
Those words presciently describe a phenomenon sweeping the Internet today: cloud computing. Instead of buying their own computer systems, companies, individuals, and even governments can share time on a common computing infrastructure, which consists of interchangeable parts providing computation, data storage, and communications. If one piece malfunctions or needs updating, programs and data automatically move to others. Multilevel security prevents users from interfering with one another. This vast system is cheaper to operate than many individual computers scattered among different businesses and agencies, because both the hardware and the administrative staff can be utilized much more efficiently.
What has changed since McCarthy’s time is the advent of advanced “virtualization” systems that can generate just the computing resources needed at any moment and return them to a general pool when they are no longer needed. This means that service providers such as Amazon can offer a pay-as-you-go utility billing model to customers on a very large scale. The consequences of this shift are far reaching: one of the clearest is that today there’s very little need for businesses to purchase any computer system other than PCs and laptops for employees. Whether they need a mail server or a rack of computers for a high-performance computing cluster, companies can almost always save money and get better performance by hiring a service in the cloud instead of buying their own. (See our sidebar defining key terms in cloud computing.)
Consider the economics of handling e-mail in a company. Today the cost of an entry-level Dell server to receive, store, and route the messages is less than $300. But by the time you add Windows Server software to run the machine, a second hard drive for redundancy, Microsoft’s Exchange Server 2010 to let an administrator manage the e-mail, and employee licenses at $35 each, you’re up to at least $3,250 for a department with 50 employees. Alternatively, you can have your employees use Microsoft’s cloud-based service, Exchange Online, for $10 per user per month, with unlimited storage. On the surface, a $6,000 annual cloud bill might not seem like the better deal, but doing it yourself carries high hidden costs, from hiring someone to manage e-mail servers to keeping up with security updates to paying air-conditioning bills for your IT room. The cloud service is backed up at multiple locations, and it connects to mobile phones and group calendars. Most important, you take advantage of the utility model that John McCarthy envisioned in 1961: you purchase only what you need. Microsoft has enough capacity to let you keep adding employees as quickly as you want.
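The arithmetic above can be sketched in a few lines. The dollar figures are the ones quoted in this article (2011 prices); the variable names are illustrative, and the comparison deliberately omits the hidden on-premises costs (administration, patching, cooling) discussed above, which understates the do-it-yourself total.

```python
# Rough cost comparison using the article's figures (2011 prices).
# Hidden on-premises costs (admin salary, security updates, cooling)
# are intentionally excluded, so this understates the on-prem side.

EMPLOYEES = 50
LICENSE_EACH = 35          # per-employee license, dollars

# On-premises: server, Windows Server, second drive, Exchange 2010,
# plus per-employee licenses, totaling the article's $3,250 figure.
server_and_software = 3250 - EMPLOYEES * LICENSE_EACH
licenses = EMPLOYEES * LICENSE_EACH
on_prem_upfront = server_and_software + licenses   # one-time cost

# Cloud: Exchange Online at $10 per user per month.
cloud_per_year = 10 * EMPLOYEES * 12               # recurring cost

print(f"On-premises up-front: ${on_prem_upfront}")  # $3250
print(f"Cloud, per year:      ${cloud_per_year}")   # $6000
```

Note the structural difference the numbers hide: the on-premises figure is a one-time capital outlay that grows in steps (a new server, a new license block), while the cloud figure scales smoothly with head count and already includes redundancy and administration.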
Despite such advantages, many businesses say they are avoiding the cloud because they aren’t fully confident about its security and reliability. Yes, Google has had a few Google Docs outages, and Amazon had an embarrassing situation in April 2011, when some customers lost service and data. But companies that manage their own data have downtime, too—typically more than a few hours each year. What’s more, Google and Amazon responded to these outages as only publicly traded companies would: they issued detailed reports on what happened, how big the problem was, and what they were doing to prevent it from happening again. When was the last time you got a detailed report from your IT group because you couldn’t read your e-mail?
The fact is, many companies are uncomfortable giving up control. In a March study of IT managers, sponsored by computer reseller CDW, one curious result was that most respondents said their preferred way to use the cloud would be a private one. Private clouds feel the same to end users but are run by the companies themselves, not by a third party such as Amazon. But building a private cloud is no small undertaking. Private clouds need all the capabilities of public cloud systems—virtual computing infrastructure, data centers with redundant cooling and power, off-site backup, and so on—but the costs are borne by a single organization, without the best benefit of the cloud: the utility pricing. As CDW points out in its analysis, running a private cloud means essentially “becoming a cloud hosting provider,” except you never recoup costs by selling your product. Private clouds might make sense only for organizations that have hundreds of thousands of employees or data that is so sensitive—such as military information or the financial transactions of a Swiss bank—that it can never be allowed near the public Internet.
One of the few areas where cloud-based offerings are not vastly superior to the systems that they replace is desktop productivity apps—word processing, spreadsheets, presentation software, and calendars. Yes, Google and Microsoft both offer cloud-based office applications. But the desktop versions still are faster, more flexible, and easier to use. What’s more, you can put 10 years’ worth of documents on your laptop and edit them on a cruise, on an airplane, or in one of the few remaining coffee shops that lack a decent Internet connection. But because laptops get lost, stolen, and dropped in swimming pools, be sure to encrypt the files on that laptop—and you should probably back them up to the cloud.
The facts are really simple: nearly every organization on the Internet is already using some cloud-based service, and most should be using more. The economies of scale are becoming mind-blowing. Anyone who still wants to go buy a rack of servers probably hasn’t done the math.
Simson L. Garfinkel is an author and researcher in Arlington, Virginia, who focuses on such topics as computer forensics and privacy. He is a contributing editor at Technology Review.