
Emerging Technology from the arXiv

How the Cost of Computation Restricts the Processes of Life

The energy required to process information places a fundamental limit on biological processes, say scientists who are teasing apart the link between computation and life.

March 28, 2012

Back in the 1960s, the IBM physicist Rolf Landauer showed that computation comes with a cost: every (irreversible) calculation, he said, always burns through a small amount of energy. That’s why silicon chips operate at temperatures hot enough to fry eggs.

It’s only in recent years, however, that computer scientists have begun to take Landauer’s work seriously. Current chips use energy at a rate some eight orders of magnitude above the theoretical minimum, so researchers have begun to ask how close we can get to the Landauer limit. Clearly, there’s plenty of room for improvement.
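The Landauer limit itself is a one-line calculation: erasing one bit at temperature T costs at least kT ln 2 of energy. A minimal sketch, in which the per-operation chip energy is an assumed round figure chosen only to illustrate the orders-of-magnitude gap described above:

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_J_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at 300 K: {landauer_J_per_bit:.3e} J/bit")

# Assumed, illustrative energy per logic operation for a 2012-era chip
# (not a measured figure) -- just to show the scale of the gap.
chip_J_per_op = 1e-13
print(f"Gap: roughly {chip_J_per_op / landauer_J_per_bit:.0e}x above the limit")
```

At 300 K the bound works out to about 3 × 10⁻²¹ joules per bit, which is what makes the eight-orders-of-magnitude headroom plausible.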

But there’s another type of computational system that works much more efficiently: life. Now physicists, biologists and computer scientists have also begun to think about what restrictions the theoretical limits of computation place on the way living things operate.

They know that any computation uses up energy, that even the simplest living things process information about their environment, and that energy is a scarce resource in many systems. So how does the cost of computation constrain what living things can do?

Today, Pankaj Mehta at Boston University and David Schwab at Princeton University in New Jersey take a small but significant step in this direction.

One of the simplest information processing steps in living systems is a cell’s determination of the concentration of a chemical in its environment. 

Cells have receptors that do this job by binding to the chemicals in question. A simple measure of concentration is the amount of time the receptor is bound or unbound during a given interval.
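The idea can be sketched with a toy two-state receptor that binds and unbinds at random. All rate constants and the concentration below are made-up illustrative numbers, not values from Mehta and Schwab’s paper:

```python
import random

# Toy two-state receptor: binds ligand at rate k_on * c, unbinds at k_off.
# Numbers are illustrative only.
k_on, k_off = 1.0, 2.0   # binding rate per unit concentration; unbinding rate
c_true = 4.0             # true ligand concentration (arbitrary units)

random.seed(0)
bound = False
t_bound = 0.0            # total time spent in the bound state
T_total = 10000.0        # observation window
t = 0.0
while t < T_total:
    rate = k_off if bound else k_on * c_true
    dt = random.expovariate(rate)          # waiting time to the next switch
    if bound:
        t_bound += min(dt, T_total - t)    # clip the final interval
    bound = not bound
    t += dt

# The fraction of time bound estimates the occupancy p = c / (c + K_d),
# where K_d = k_off / k_on is the dissociation constant.
p_hat = t_bound / T_total
K_d = k_off / k_on
c_est = K_d * p_hat / (1.0 - p_hat)        # invert occupancy -> concentration
print(f"estimated concentration: {c_est:.2f} (true value {c_true})")
```

With these numbers the receptor is bound about two-thirds of the time, and inverting that fraction recovers a concentration close to the true value; a shorter observation window would give a noisier estimate, which is exactly the trade-off the paper is concerned with.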

A receptor communicates that it has been activated by adding a phosphate group to a protein inside the cell, which converts the protein from an inactive to an active state. The activated protein can, in turn, interact with other elements in the cell, transmitting the information across the biochemical network.

How quickly proteins switch determines the rate that information about the external chemical concentration flows into the cell. The work that Mehta and Schwab have done is to calculate the power consumed by this process and how it relates to the flow of information into the cell. 
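Mehta and Schwab derive their own expression for this trade-off. As a back-of-envelope stand-in only, Landauer’s bound already ties an information rate to a minimum power; the 10 bits-per-second rate below is an assumed number, chosen purely for scale:

```python
import math

# Generic Landauer floor on power for a given information rate.
# This is NOT the specific expression Mehta and Schwab derive for
# their biochemical circuit -- just the thermodynamic minimum.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 310.0           # physiological temperature, K

def landauer_power_floor(bits_per_second: float) -> float:
    """Lower bound on power (watts) to process information at this rate."""
    return bits_per_second * k_B * T * math.log(2)

# Assumed rate: a receptor delivering ~10 bits/s into the cell.
print(f"{landauer_power_floor(10.0):.2e} W")
```

Faster information flow demands proportionally more power, which is the qualitative shape of the result: a cell that wants to learn about its environment more quickly must pay a higher energy bill.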

These guys derive a mathematical expression for this biochemical circuit’s power consumption. The expression shows that learning about the environment always requires the network to use up energy.

There’s another process at work, however. Over time, cells always lose information, which gets destroyed by things like noise. So for a cell to maintain even the most basic knowledge of its environment, it must continually use energy. 

That may have important implications for our understanding of certain types of cell behaviour. During times of environmental stress, when resources are scarce, bacterial cells become metabolically dormant and can remain in this state for many years, a process known as sporulation.

“While sporulation is relatively well understood, the reverse process of germination is much more difficult to study,” say Mehta and Schwab. How does the cell in this state extract information about its environment and so know when to come back to life?

Many biologists have assumed that sporulation is a strategy discovered by evolution as the best way to survive hard times.

Mehta and Schwab disagree. “Our results indicate that this behavior may be due to the extreme energetic constraints imposed on a metabolically dormant spore, rather than an evolutionarily optimized strategy,” they say.

In other words, sporulation and germination are the result of the fundamental limits of computation rather than an optimal survival strategy. Cells simply have no choice.

Interesting stuff. And be assured that there’s more to come from this new area of science. 

Incidentally, the study of the way the limits of computation influence life does not yet appear to have a name. So any suggestions for the title of this incipient discipline in the comments section, please!

Ref: arxiv.org/abs/1203.5426: The Energetic Costs of Cellular Computation
