2011 saw the personal computer continue to be marginalized. Although PCs are still the workhorse computing device in homes and offices, the most exciting innovations over the last 12 months were centered on very small-scale computing, very large-scale computing, and networked combinations of the two.
Developments in small-scale computing, in the shape of consumer mobile hardware such as the iPad 2 tablet or Galaxy Nexus smart phone, were naturally the most visible. Most of these lightweight devices use ARM-based chips, prompting Intel, best known for its desktop and laptop processors, to develop prototype smart phone and tablet devices that will almost certainly herald the arrival of new challengers to Apple’s iPad and iPhone in 2012 and 2013.
The software that runs on these devices saw tremendous development activity as well. Hewlett-Packard tried (and failed) to break into the mobile market with an operating system, WebOS, that was generally judged to be better than the hardware it ran on. The jury is still out on whether Research in Motion can revive the flagging fortunes of the BlackBerry following the announcement of its new mobile OS. And there were major updates from the two heavyweights in the mobile OS world, Google and Apple, with Google unveiling Android 4.0 (also known as “Ice Cream Sandwich”) in October. With Android 4.0, tablet and smart-phone makers no longer have to run separate versions of Android, and the operating system also boasts features such as facial recognition. Also in October, coinciding with the launch of the iPhone 4S, Apple released its iOS 5 operating system, featuring the Siri voice-activated digital assistant.
These new devices and operating systems have driven an explosion of mobile applications for areas such as electronic payments, health care, augmented reality, and games. The richness and range of these applications is causing a shift in corporate IT, with increasing numbers of companies allowing workers to bring their own devices to work.
But a lot of this mobile flexibility is possible only thanks to cloud computing, which lets devices seamlessly hand off complex tasks to data centers. The Kindle Fire is essentially just a front end to Amazon’s cloud services, for example, and Apple’s Siri won’t work without a network connection. Consequently, companies have been building a new generation of data centers, designed to handle cloud-computing loads as efficiently as possible. HP and Calxeda are pioneering the use in servers of low-power chips originally designed for battery-constrained mobile devices, in a bid to slash the massive electricity bills of data centers.
Tying cloud computing to consumer devices meant that a lot of personal (and corporate) information migrated into cyberspace this year, putting an even sharper emphasis on security and privacy issues. While some argue your data is better off in the cloud, researchers continue to worry about holes that can be exploited by criminals.