For more than a century, psychologists have used reaction time as a window
into the brain. The thinking is that information processing takes time, so the
average amount of time taken to begin or complete a task reflects the duration
of the cognitive processes involved in it.
For example, a typical reaction-time experiment might ask a subject to
classify a string of letters as a word or a nonword by pressing a button.
This kind of experiment is called a visual lexical decision task.
This information-centric approach is clearly ripe for an
information-theoretic treatment. And sure enough, no sooner had Claude Shannon
published his theory of information in the 1940s than psychologists began to
apply it to the exchange of information between the environment and the brain
that goes on during reaction-time experiments.
Their approach eventually led to Hick’s Law,
one of the few laws of experimental psychology. It states that the time it
takes to make a choice is linearly related to the entropy of the possible
alternatives. Results from various reaction-time experiments seem to bear
this out. One byproduct of this approach, however, is that the results are
intimately tied to the type of experiment used to measure the reaction time,
which makes each study peculiarly vulnerable to the idiosyncrasies of that
experimental setup.
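For n equally likely alternatives, the entropy of the choice is log2(n + 1) bits (the extra option is usually taken to reflect the possibility of not responding), and Hick's Law predicts a reaction time linear in that entropy. A minimal sketch in Python, with illustrative intercept and slope values that are not fitted parameters from any study:

```python
import math

def hicks_law_rt(n_alternatives, a=0.2, b=0.15):
    """Predicted reaction time (seconds) for a choice among n equally
    likely alternatives, per Hick's Law: RT = a + b * H, where
    H = log2(n + 1) is the entropy of the choice in bits.
    The intercept a and slope b here are illustrative only."""
    entropy_bits = math.log2(n_alternatives + 1)
    return a + b * entropy_bits

# Reaction time grows with the logarithm of the number of options:
for n in (1, 3, 7, 15):
    print(f"{n} alternatives -> {hicks_law_rt(n):.3f} s")
```

The key property is the logarithmic growth: doubling the number of alternatives adds a roughly constant increment to the predicted reaction time.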
Today, Fermin Moscoso del Prado Martín from the Université de Provence in
France proposes a new way to study reaction times by analyzing the entropy of
their distribution, rather in the manner of thermodynamics.
The entropy is an estimate of the amount of information needed to specify
the state of the system.
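As a concrete illustration, the Shannon entropy of a discrete distribution, the average number of bits needed to specify an outcome, can be computed directly:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability
    distribution given as a list of probabilities summing to 1."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform choice among 4 alternatives takes 2 bits to specify;
# a certain outcome takes none.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))
print(shannon_entropy([1.0]))
```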
Moscoso del Prado says the entropy of the distribution of
reaction times is independent of the type of experiment and so provides a
better measure of the cognitive processes involved. That’s important, not least
because it provides a way to more easily compare the results from different
types of experiment.
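The idea of working with the entropy of the reaction-time distribution itself can be sketched crudely by binning observed reaction times and computing the entropy of the resulting histogram. This plug-in estimate is only an illustration; it is not the estimator used in the paper:

```python
import math

def histogram_entropy(samples, n_bins=20):
    """Crude plug-in entropy estimate (bits) for a sample of
    reaction times, via the entropy of a histogram of the data.
    Illustrative only: real estimators correct for binning and
    sample-size biases."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0  # guard against identical samples
    counts = [0] * n_bins
    for x in samples:
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)
```

With real data, the bin count and sample size strongly affect such estimates, which is one reason comparing entropies across experiments requires a careful estimator.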
Moscoso del Prado uses his method to determine how much information the brain can
process during lexical decision tasks. The answer? About 60 bits
per second. Of course, this is not the information-processing capacity of the
entire brain but one measure of the input/output capacity during a specific
task.
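Such a figure comes from relating the information conveyed per decision to the time each decision takes. A toy calculation with illustrative numbers, not values from the paper:

```python
def information_rate(bits_per_trial, mean_rt_seconds):
    """Information throughput in bits per second: bits transmitted
    per decision divided by the mean time per decision."""
    return bits_per_trial / mean_rt_seconds

# Illustrative only: ~30 bits conveyed per 0.5 s decision gives 60 bits/s.
print(information_rate(30, 0.5))
```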
Moscoso del Prado goes on to analyze the data from various types of reaction-time
experiments, in particular to determine whether information-processing speed is
constant during a particular task, as implied by Hick’s Law. Moscoso del Prado reckons it
isn’t.
“This finding suggests an adaptive system
where the processing load is dynamically adjusted to the task demands,” he
says. That makes sense. It seems crazy to assume that the brain carries on processing
data at the same rate regardless of the complexity of the task at hand.
But this has an important implication: the
linearity of Hick’s Law doesn’t always apply. So Hick’s Law will need some kind
of modification to cope with this nonlinearity.
Just how to rewrite one of the basic laws of behavioral psychology isn’t
clear yet. But it’s sure to involve a very different way of looking at the
brain than the one that prevailed when the law was formulated.
Update: Moscoso del Prado Martín responds:

“I have a small scientific comment on your post. Although I think it
represents my results very well, I find the opening sentence:
represents my results very well, I find the opening sentence:
“A new way to analyze human reaction times shows that the brain processes
data no faster than 60 bits per second.”
a bit misleading. I don’t think I have shown anything about the upper bounds
of the processing speed, in principle the curve I show in Figure 4 of the
manuscript could extend far beyond this, but I have no information to make
this extrapolation, so I would not claim (for the moment) any upper limit.”