To bring some order to the fuzzy world of cloud computing, the U.S. government’s National Institute of Standards and Technology has created a standard definition and a Cloud Computing Reference Architecture. Both are in the form of “Special Publications,” which are not official U.S. government standards but are designed to provide guidance to specific communities of practitioners and researchers.
The NIST Definition of Cloud Computing, currently in draft form, is based on NIST-sponsored workshops and public comments. The single definition helps ensure that government workers, industry, and other groups are talking about the same thing when they use the same words.
The draft document defines cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
The definition specifies five “essential” characteristics of cloud computing: self-service; accessibility from desktops, laptops, and mobile phones; resources that are pooled among multiple users and applications; elastic resources that can be rapidly reapportioned as needed; and measured service. These characteristics combine to make cloud computing a kind of infrastructure or utility. It’s not cloud computing when a company rents a specific computer in a rack at a facility that happens to be in Denver; it is cloud computing when a company rents a virtual host generated by machines that might physically reside in Denver, Atlanta, or New York.
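The pooling-and-elasticity idea can be made concrete with a small sketch (purely illustrative, not any vendor's API): a pool of physical hosts in different cities serves virtual machines for multiple tenants, and the pool, not the customer, decides placement.

```python
# Illustrative sketch (hypothetical names, no real cloud API): a shared
# pool of physical hosts serving virtual machines for multiple tenants.
# Placement is decided by the pool -- the "utility" property above.

class HostPool:
    def __init__(self, hosts):
        # hosts: host name (here, a city) -> free capacity units
        self.free = dict(hosts)
        self.placements = {}  # tenant -> list of (host, size)

    def provision(self, tenant, size):
        """Place a virtual machine of the given size on any host with room."""
        for host, capacity in self.free.items():
            if capacity >= size:
                self.free[host] -= size
                self.placements.setdefault(tenant, []).append((host, size))
                return host
        raise RuntimeError("pool exhausted")

    def release(self, tenant):
        """Elastically return a tenant's resources to the shared pool."""
        for host, size in self.placements.pop(tenant, []):
            self.free[host] += size

pool = HostPool({"denver": 8, "atlanta": 8, "new-york": 8})
first = pool.provision("acme", 6)   # lands wherever there is room
second = pool.provision("acme", 6)  # may land in a different city
pool.release("acme")                # capacity is reapportioned to others
```

The customer's two virtual hosts may run in different cities, and released capacity immediately becomes available to other tenants, which is exactly what separates this from renting a specific rack in Denver.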
NIST defines three “service models,” or types of service, that a cloud provider might sell.
Infrastructure as a Service (IaaS) is the most basic. Customers can buy processing, storage, and network services (typically using a Web-based self-service tool) and then build their own systems on top of this infrastructure. Two of the best known IaaS providers are Amazon and Rackspace. While customers have the illusion that they are renting specific servers, hard drives, and network switches, this equipment is in fact simulated by virtualization software, allowing multiple customers to be served by the same physical device.
Platform as a Service (PaaS) is one step up: vendors provide preconfigured computers running operating systems and applications. Amazon is a player here too, with a variety of highly specialized offerings that provide scalable databases, message queues, Web-accessible storage, and the “Mechanical Turk” system for organizing human computation. Google App Engine, a platform for building and running scalable Web applications, is another example. Microsoft is also a supplier: its Azure service provides preconfigured computers running Windows and SQL Server. All of these are elastic pay-as-you-go services. For example, you can tell Microsoft you want a “small” computer (one-gigahertz CPU, 1.75 gigabytes of RAM) and a 10-gigabyte SQL database. Your cost is $190 per month. Likewise, many Web hosting providers now sell “virtual private servers” priced at $17 to $400 per month or more.
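Pay-as-you-go pricing amounts to metering: the customer is billed only for what is actually used. A minimal sketch, using the article's $190-per-month figure as an illustrative rate and assuming a common 730-hour billing month (neither is a current price list):

```python
# Illustrative metering sketch: prorate a flat monthly rate by actual
# hours of use. The $190 rate is the article's example figure; the
# 730-hour month is an assumed billing convention, not a vendor's terms.

HOURS_PER_MONTH = 730  # roughly 365 * 24 / 12

def monthly_bill(hours_used, monthly_rate):
    """Charge only for the hours an instance actually ran."""
    hourly_rate = monthly_rate / HOURS_PER_MONTH
    return round(hours_used * hourly_rate, 2)

partial = monthly_bill(100, 190.00)             # run 100 hours: ~$26
full = monthly_bill(HOURS_PER_MONTH, 190.00)    # run all month: $190
```

A customer who runs the instance for only 100 hours pays about $26 rather than $190, which is the "measured service" characteristic in practice.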
Software as a Service (SaaS) is at the top of the cloud computing stack. Here the cloud providers have created full applications running on server farms that may themselves be geographically distributed. Although Salesforce.com has long been held up as the premier SaaS provider, Facebook, Flickr, eBay, Yahoo Stores, Amazon Marketplace, the backup-storage provider Carbonite, and even the financial assistant Mint.com offer SaaS as well.
Cloud computing does not require making your data available on the public Internet. If your data is too sensitive or valuable for that—and if you’ve got a lot of money and trained staff—your organization might be a good candidate for what NIST calls the private cloud deployment model, in which an organization operates a cloud strictly for its own use. A private cloud lets an organization benefit from the flexibility of cloud technology so that it can use its equipment more efficiently, but without the risk that other users of the cloud could snoop on its data.
The community cloud, yet another model, is a private cloud that’s shared by several organizations and typically supports a specific requirement. For example, a group of health-care organizations might create a community cloud to hold patient medical and billing records.
A public cloud is a system that’s owned by the cloud provider and made available to the general public. Facebook and Google fall into this category.
Last is NIST’s hybrid cloud model, in which multiple cloud systems are connected in a way that allows programs and data to be moved easily from one deployment system to another. For example, a company might develop its system on Amazon’s IaaS but then run one version on Amazon for public data and a second on a private cloud for its sensitive information.
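The hybrid split described above can be sketched as a simple routing rule (hypothetical names throughout, no real vendor API): each workload is sent to a public or private deployment depending on the sensitivity of its data.

```python
# Illustrative hybrid-cloud sketch: route each workload to a deployment
# target based on data sensitivity, as in the public-data-on-Amazon,
# sensitive-data-on-a-private-cloud example above. All names are made up.

def choose_deployment(workload):
    """Return the deployment target for a workload in a hybrid cloud."""
    if workload.get("sensitive"):
        return "private-cloud"   # keep sensitive data in-house
    return "public-iaas"         # elastic public capacity for the rest

workloads = [
    {"name": "marketing-site", "sensitive": False},
    {"name": "patient-records", "sensitive": True},
]
targets = {w["name"]: choose_deployment(w) for w in workloads}
```

Because the two deployments run the same system, moving a workload between them is a matter of changing this routing decision, not rebuilding the application.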