
Back in the 1960s, the IBM physicist Rolf Landauer showed that computation comes with a cost: every irreversible calculation, he said, burns through a small amount of energy. That’s why silicon chips operate at temperatures hot enough to fry eggs.

It’s only in recent years, however, that computer scientists have begun to take Landauer’s work seriously. Current chips use up juice at a rate some eight orders of magnitude above the theoretical minimum, so researchers have begun to ask how close we can get to the Landauer limit. Clearly, there’s plenty of room for improvement.
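The Landauer limit itself is easy to compute: erasing one bit costs at least kT ln 2 of energy, where k is Boltzmann’s constant and T the temperature. A short sketch, using an assumed order-of-magnitude figure (not a measurement) for a conventional chip’s per-bit switching energy, shows where the eight orders of magnitude come from:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's limit: minimum energy dissipated to erase one bit
landauer = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {landauer:.2e} J per bit")

# Illustrative per-bit switching energy for a conventional chip
# (an assumed order-of-magnitude figure, not a measurement)
chip_energy = 1e-13  # J
print(f"Chip / Landauer ratio: ~10^{round(math.log10(chip_energy / landauer))}")
```

At room temperature the limit comes out to roughly 3 × 10⁻²¹ joules per bit.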

But there’s another type of computational system that works much more efficiently: life. Now physicists, biologists and computer scientists have also begun to think about what restrictions the theoretical limits of computation place on the way living things operate.

They know that any computation uses up energy, that even the simplest living things process information about their environment, and that energy is a scarce resource in many biological systems. So how do these limits shape the way cells compute?

Today, Pankaj Mehta at Boston University and David Schwab at Princeton University in New Jersey take a small but significant step in this direction.

One of the simplest information processing steps in living systems is a cell’s determination of the concentration of a chemical in its environment. 

Cells have receptors that do this job by binding to the chemicals in question. A simple proxy for concentration is the fraction of time the receptor spends bound during a given interval.
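The idea is easy to see in a toy simulation (my own illustrative two-state model, not the circuit Mehta and Schwab analyse): a receptor flips between bound and unbound, and the fraction of time it spends bound, c/(c + K_d), can be inverted to estimate the concentration c. The rates and dissociation constant below are assumed values.

```python
import random

def simulate_occupancy(c, K_d, k_off=1.0, dt=1e-3, steps=200_000, seed=0):
    """Simulate one two-state receptor; return the fraction of time bound.

    Binding rate is k_on * c with k_on = k_off / K_d, so the equilibrium
    bound fraction is c / (c + K_d). (Toy model with assumed rates.)
    """
    rng = random.Random(seed)
    k_on = k_off / K_d
    bound = False
    bound_time = 0
    for _ in range(steps):
        if bound:
            if rng.random() < k_off * dt:
                bound = False
        else:
            if rng.random() < k_on * c * dt:
                bound = True
        bound_time += bound  # True counts as 1
    return bound_time / steps

c_true, K_d = 2.0, 1.0
occ = simulate_occupancy(c_true, K_d)
c_est = K_d * occ / (1 - occ)  # invert occ = c / (c + K_d)
print(f"occupancy {occ:.3f}, estimated c {c_est:.2f} (true {c_true})")
```

The longer the receptor is watched, the better the estimate, which is exactly why the time spent measuring, and the energy spent doing it, matters.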

A receptor communicates that it has been activated by adding a phosphate group to a protein inside the cell, which converts the protein from an inactive to an active state. The activated protein can, in turn, interact with other elements in the cell, transmitting the information across the biochemical network.

How quickly these proteins switch determines the rate at which information about the external chemical concentration flows into the cell. What Mehta and Schwab have done is calculate the power consumed by this process and how it relates to the flow of information into the cell.
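To get a feel for the scale, here is a hedged back-of-the-envelope estimate (my own illustrative numbers, not values from the paper): each phosphorylation spends roughly one ATP hydrolysis’ worth of free energy, about 20 kT, so the readout’s power is the phosphorylation rate times that cost.

```python
# Order-of-magnitude sketch with assumed values, not the paper's figures.
k_B_T = 4.14e-21       # thermal energy at ~300 K, J
dG_ATP = 20 * k_B_T    # free energy per ATP hydrolysis, ~20 kT (assumed)
phospho_rate = 100.0   # phosphorylation events per second (assumed)

power = phospho_rate * dG_ATP
print(f"Power to run the readout: {power:.1e} W")
```

Even at these tiny absolute numbers, the cost is unavoidable: faster switching means more information per second, but also more ATP burned per second.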

These guys actually derive a mathematical expression for this biochemical circuit’s power consumption. The expression shows that learning about the environment always requires the network to use up energy.

There’s another process at work, however. Over time, cells always lose information, which gets destroyed by things like noise. So for a cell to maintain even the most basic knowledge of its environment, it must continually use energy. 
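How fast information leaks away is easy to illustrate with a toy model (again, my own illustration, not the paper’s network): a one-bit memory that thermal noise flips at some rate r. After time t the flip probability is p(t) = (1 − e^(−2rt))/2, and the information the bit still carries about its original state decays toward zero.

```python
import math

def binary_entropy(p):
    """Shannon entropy of a biased coin, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# One-bit memory flipped by noise at rate r (toy model, assumed rate).
r = 1.0
remaining = []
for t in [0.0, 0.5, 1.0, 2.0, 5.0]:
    p_flip = (1 - math.exp(-2 * r * t)) / 2
    bits = 1 - binary_entropy(p_flip)  # mutual info with the original state
    remaining.append(bits)
    print(f"t={t:.1f}  remaining information: {bits:.3f} bits")
```

Without periodic refreshing, which costs energy every time, the memory is essentially gone within a few flip times.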

That may have important implications for our understanding of certain types of cell behaviour. During times of environmental stress when resources are scarce, bacterial cells become metabolically dormant and can remain so for many years, a process known as sporulation.

“While sporulation is relatively well understood, the reverse process of germination is much more difficult to study,” say Mehta and Schwab. How does the cell in this state extract information about its environment and so know when to come back to life?

Many biologists have assumed that sporulation is a strategy discovered by evolution as the best way to survive hard times.

Mehta and Schwab disagree. “Our results indicate that this behavior may be due to the extreme energetic constraints imposed on a metabolically dormant spore, rather than an evolutionarily optimized strategy,” they say.

In other words, sporulation and germination are the result of the fundamental limits of computation rather than an optimal survival strategy. Cells simply have no choice.

Interesting stuff. And be assured that there’s more to come from this new area of science. 

Incidentally, the study of the way the limits of computation influence life does not yet appear to have a name. So please leave any suggestions for the title of this incipient discipline in the comments section!

Ref: arxiv.org/abs/1203.5426: The Energetic Costs of Cellular Computation

