
Super cool: When chilled to almost absolute zero, this chip allows for quantum computing.

Source: “Implementing the Quantum von Neumann Architecture with Superconducting Circuits”
Matteo Mariantoni et al.
Science 334: 61-65

Results: For the first time, a computer processor that uses the quirks of quantum physics to handle information has been teamed up with a memory device that can store data encoded in quantum states. The memory behaves like the RAM, or random access memory, of a conventional computer. Matteo Mariantoni and collaborators at the University of California, Santa Barbara, built the system and used it to run two different algorithms. One could allow a quantum computer to crack encryption schemes that are practically impossible for a conventional computer to break; the other is a mathematical building block used in quantum error correction, a technique for fixing errors in quantum computations.

Why it matters: Quantum computers have the potential to be much more powerful than conventional ones. The components inside the computers we use today represent all data using combinations of 0s and 1s, but each quantum bit can also take on both states at the same time, a phenomenon called superposition. The UCSB design is the first to combine quantum processor and memory components in what is known as a von Neumann architecture, a milestone that conventional computing passed in the late 1940s. That architecture made computers easier to reprogram and enabled them to run more complex algorithms; its quantum equivalent could have similar effects.
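Superposition can be illustrated with a toy state-vector calculation. This is a minimal sketch in plain NumPy, not a model of the UCSB superconducting hardware; the gate and basis labels are the standard textbook ones.

```python
import numpy as np

# A qubit's state is a 2-component complex vector of amplitudes
# over the basis states |0> and |1>.
zero = np.array([1.0, 0.0], dtype=complex)  # a definite "0"

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes:
# the qubit carries both values until measured, then yields 0 or 1.
probs = np.abs(superposed) ** 2
print(probs)  # [0.5 0.5]
```

With n such qubits, the state vector has 2^n amplitudes, which is the source of a quantum computer's potential power.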

Methods: The UCSB computer’s circuits are made from metals that become superconducting when cooled almost to absolute zero, activating the quantum effects that make them useful. The computer’s processor consists of two or three “qubits,” each able to represent data as a 1, a 0, or both at once. The qubits can interact to process data and are also connected to memory elements that can store a quantum value for later use. Unlike previous quantum-computing components, this type can be made using standard chip manufacturing techniques.

Next steps: The researchers are working to run more algorithms on the new computer design and considering how they might scale it up to include more processing or memory power.


Planning by Taxi

GPS data gleaned from taxicabs can identify flaws in transportation infrastructure

Source: “Urban Computing with Taxicabs”
Yu Zheng et al.
13th ACM International Conference on Ubiquitous Computing, Beijing, China, September 17–21, 2011

Results: Scientists at Microsoft Research Asia have used GPS data from more than 33,000 taxicabs in Beijing to pinpoint problem areas in the city’s transportation network. The researchers analyzed data collected in 2009 and 2010 to find areas where roads and subway lines were overloaded. Then they evaluated the results by examining how their calculations changed as Beijing’s transportation network evolved during the two-year period they monitored. They found that where city planners added connections between regions that algorithms had identified as overloaded, conditions did improve. Where flaws were identified but not fixed, traffic got no better.

Why it matters: Big cities worldwide are struggling to keep their infrastructure in line with the needs of their populations. A way to automatically detect problems with transit systems could be valuable to overburdened city planners.

Methods: The researchers used the taxicab data to identify congestion-prone transition points between regions of the city. Further analysis helped them predict where new streets or subway lines might relieve the pressure. Analyzing complex traffic patterns this accurately would be impossible without data at the scale the GPS traces provide.
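The core flaw-detection idea can be sketched as a threshold test on observed versus free-flow travel times between regions. Everything here is illustrative, not the paper's actual algorithm: the region names, trip records, and thresholds are invented, and the real study partitioned Beijing into regions using major roads.

```python
from collections import defaultdict

# Hypothetical GPS-derived trip records:
# (origin_region, destination_region, minutes_taken).
trips = [
    ("A", "B", 35), ("A", "B", 40), ("A", "B", 38),
    ("B", "C", 12), ("C", "A", 15), ("A", "B", 42),
]

# Illustrative free-flow (uncongested) travel times per region pair.
free_flow = {("A", "B"): 15, ("B", "C"): 10, ("C", "A"): 14}

def flag_overloaded(trips, free_flow, slowdown=2.0, min_trips=3):
    """Flag region pairs where the average observed travel time is far
    above free-flow and enough taxis made the transition to trust it."""
    times = defaultdict(list)
    for origin, dest, minutes in trips:
        times[(origin, dest)].append(minutes)
    flagged = []
    for pair, ts in times.items():
        avg = sum(ts) / len(ts)
        if len(ts) >= min_trips and avg >= slowdown * free_flow[pair]:
            flagged.append(pair)
    return flagged

print(flag_overloaded(trips, free_flow))  # [('A', 'B')]
```

In this toy data, the A-to-B transition averages well over twice its free-flow time across four trips, so it is flagged; the other pairs have too few observations to judge.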

Next steps: The researchers hope to make their predictions more accurate by considering more types of data, such as information about geographical features. They might also look more closely at the congestion patterns that develop around specific occasions, such as sporting events. The researchers say their method could be applied to any major city that has a large taxi fleet, including Buenos Aires and New York City.



Credit: Erik Lucero


