
How the Cost of Computation Restricts the Processes of Life

The energy required to process information places a fundamental limit on biological processes, say scientists who are teasing apart the link between computation and life.

Back in the 1960s, the IBM physicist Rolf Landauer showed that computation comes with an unavoidable cost: every irreversible operation, he said, must dissipate a small amount of energy as heat. That dissipation is part of the reason silicon chips run hot enough to fry eggs.

It’s only in recent years, however, that computer scientists have begun to take Landauer’s work seriously. Current chips consume energy at a rate some eight orders of magnitude above this theoretical minimum, so researchers have begun to ask how close a real device can get to the Landauer limit. Clearly, there’s plenty of room for improvement.
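
To see where the eight-orders-of-magnitude figure comes from, here’s a quick back-of-the-envelope check. The Landauer limit of kT ln 2 per erased bit is standard physics; the energy per logic operation assumed for a current chip is an illustrative round number, not a measurement of any particular device.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Landauer limit: minimum energy to erase one bit of information.
landauer_j = K_B * T * math.log(2)   # about 2.9e-21 J

# Assumed energy per logic operation for a current chip (~1e-13 J).
# This figure is illustrative, not a measurement of any real device.
chip_op_j = 1e-13

print(f"Landauer limit : {landauer_j:.2e} J per bit")
print(f"Chip operation : {chip_op_j:.0e} J (assumed)")
print(f"Gap            : about 10^{math.log10(chip_op_j / landauer_j):.0f}")
```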

But there’s another type of computational system that works much more efficiently: life. Now physicists, biologists and computer scientists have also begun to think about what restrictions the theoretical limits of computation place on the way living things operate.

They know that any computation uses up energy, that even the simplest living things process information about their environment, and that energy is a scarce resource in many biological systems. So what limits does the cost of computation place on life?

Today, Pankaj Mehta at Boston University and David Schwab at Princeton University in New Jersey take a small but significant step in this direction.

One of the simplest information processing steps in living systems is a cell’s determination of the concentration of a chemical in its environment. 

Cells have receptors that do this job by binding to the chemical in question. A simple measure of concentration is the fraction of time the receptor spends bound during a given interval.
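
Here’s a minimal simulation of that idea, with made-up rate constants: let the receptor flip between bound and unbound states, measure the fraction of time it spends bound, and invert the equilibrium binding curve to recover an estimate of the concentration.

```python
import random

# Hypothetical rate constants in arbitrary units: binding happens at a
# rate proportional to the concentration c; unbinding at a fixed rate.
K_ON, K_OFF = 1.0, 2.0

def bound_fraction(c, t_total=10_000.0, seed=1):
    """Simulate a two-state receptor and return the fraction of the
    interval t_total that it spends in the bound state."""
    rng = random.Random(seed)
    t, bound, t_bound = 0.0, False, 0.0
    while t < t_total:
        rate = K_OFF if bound else K_ON * c
        dwell = min(rng.expovariate(rate), t_total - t)  # time to next flip
        if bound:
            t_bound += dwell
        t += dwell
        bound = not bound
    return t_bound / t_total

# At equilibrium the occupancy is p = c / (c + K_d) with K_d = K_OFF/K_ON,
# so a measured bound fraction p can be inverted to estimate c.
c_true = 3.0
p = bound_fraction(c_true)
c_est = p * (K_OFF / K_ON) / (1.0 - p)
print(f"bound fraction {p:.3f} -> estimated concentration {c_est:.2f} (true {c_true})")
```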

A receptor communicates that it has been activated by adding a phosphate group to a protein inside the cell, which converts the protein from an inactive to an active state. The activated protein can, in turn, interact with other elements in the cell, transmitting the information across the biochemical network.

How quickly these proteins switch determines the rate at which information about the external chemical concentration flows into the cell. What Mehta and Schwab have done is calculate the power consumed by this process and how it relates to the flow of information into the cell.
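
To get a feel for the scale, here’s a toy energy budget, not the calculation in the paper. It rests on one standard assumption: each phosphorylation-dephosphorylation cycle hydrolyses a single ATP molecule, worth roughly 20 units of thermal energy (20 kT).

```python
# Toy energy budget for the phosphorylation circuit sketched above.
# Both constants are illustrative assumptions, not values from the paper.

ATP_KT = 20.0     # free energy of ATP hydrolysis, roughly 20 k_B*T
K_B_T = 4.1e-21   # thermal energy k_B*T at ~300 K, in joules

# One phosphorylation-dephosphorylation cycle burns one ATP, so power
# grows with the switching rate -- and the switching rate is what caps
# how fast information about the environment can flow into the cell.
for cycles_per_second in (1, 10, 100, 1000):
    power_watts = cycles_per_second * ATP_KT * K_B_T
    print(f"{cycles_per_second:>4} cycles/s -> ~{power_watts:.1e} W per protein")
```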

These guys derive a mathematical expression for the power consumption of this biochemical circuit. The expression shows that learning about the environment always requires the network to use up energy.

There’s another process at work, however. Over time, cells always lose information as noise degrades their internal states. So for a cell to maintain even the most basic knowledge of its environment, it must continually expend energy.
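
A Landauer-style way to see this (a rough gloss, not the expression derived in the paper): if noise wipes out a cell’s stored bits at a rate of $\dot{I}$ bits per second, then merely holding on to what the cell already knows requires a power of at least

$$P \;\geq\; \dot{I}\, k_B T \ln 2 .$$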

That may have important implications for our understanding of certain types of cell behaviour. During times of environmental stress, when resources are scarce, bacterial cells become metabolically dormant and can remain in this state for many years, a process known as sporulation.

“While sporulation is relatively well understood, the reverse process of germination is much more difficult to study,” say Mehta and Schwab. How does the cell in this state extract information about its environment and so know when to come back to life?

Many biologists have assumed that sporulation is a strategy discovered by evolution as the best way to ride out hard times.

Mehta and Schwab disagree. “Our results indicate that this behavior may be due to the extreme energetic constraints imposed on a metabolically dormant spore, rather than an evolutionarily optimized strategy,” they say.

In other words, sporulation and germination are the result of the fundamental limits of computation rather than an optimal survival strategy. Cells simply have no choice.

Interesting stuff. And be assured that there’s more to come from this new area of science. 

Incidentally, the study of the way the limits of computation influence life does not yet appear to have a name. So any suggestions for the title of this incipient discipline in the comments section, please!

Ref: arxiv.org/abs/1203.5426: The Energetic Costs of Cellular Computation
