
Scientific History and the Lessons for Today’s Emerging Ideas

A better understanding of the scientific turkeys of the 19th century may provide a stark warning about the value of mainstream scientific thought today.

In a hundred years' time, historians of science and technology will look back and marvel at the theories, experiments, and breakthroughs that characterise our age.

But they will also puzzle over the scientific cul de sacs of our time: the theories and ideas that fell by the wayside because they turned out to be misconceived, wrong or just plain mumbo jumbo.

Inevitably, this raises an interesting question: how much of what we consider mainstream investigation will fall into this category of best-forgotten science?

One way to approach this question is to examine our own attitude to science at the end of the 19th century and the beginning of the 20th.

The popular account goes a little like this. This era was characterised by a sense that the universe could be more or less completely described by Newton’s laws of mechanics, the laws of thermodynamics and Maxwell’s electromagnetic theory.

All was well, save for one or two minor cracks that everyone expected could be easily papered over. Of course, these eventually led to two of the greatest revolutions in scientific thought: Max Planck’s quantum theory in 1900 and Einstein’s theories of special and general relativity a few years later.

However, this popular account understates much of the complexity of scientific debate at the time. In particular, it fails to capture the extent to which many mainstream scientific ideas turned out to be spectacularly wrong. These ideas were widely discussed, much cherished and, in many cases, widely supported. Now these cul de sacs of science are largely forgotten.

Today, Helge Kragh at Aarhus University in Denmark puts the record straight by re-examining end-of-the-century, or fin-de-siècle, physics and the ideas that dominated it. There is much to learn from the stories he tells.

One largely forgotten episode was the general dissatisfaction at this time with the notion of ‘matter’. Various lines of thought seemed to suggest that the idea of an atomistic universe built from fundamental units of matter was flawed.

For example, the laws of thermodynamics only made sense if atoms were rigid bodies with no internal structure. And yet the evidence from spectroscopic experiments suggested that atoms must have internal structure. "Matter is dead" became a catchphrase at the time, and clearly something had to give.

One mainstream resolution of this problem was based on the idea that matter was not a fundamental property of the universe but an emergent one. This coincided with a growing understanding that various different forms of energy–kinetic, potential, chemical, thermal and so on–were manifestations of the same thing. So perhaps matter was merely another form of energy too.

This idea, which became known as energetics, enjoyed strong support for many years. It held that since Newton's laws could be described purely in terms of energy, there was no need for the hypothesis of atoms. This was a grand unified theory of the universe and one of its chief proponents was Wilhelm Ostwald, who later won a Nobel prize in chemistry for his work on catalysis.

In a talk in 1895, Ostwald said: “The most promising scientific gift that the closing century can offer the rising century is the replacement of the materialistic world view by the energeticist world view”.

Another solution came from the notion of the luminiferous ether, which dominated scientific thought in a way that is hard to imagine today. “The basic problem was not whether the ether existed or not, but the nature of the ether and its interaction with matter,” says Kragh.

The ether was widely believed to be the fundamental bedrock of the universe, from which all other things emerged. Many physicists proclaimed that the ether would be the basis for a grand unified theory of everything, among them, ironically, Albert Michelson.

One theory widely discussed for several years was put forward by William Thomson, aka Lord Kelvin, who believed that atoms were vortices in the ether. Curiously, physicists never proved this idea wrong. Instead, it simply ran out of steam.

Then there were the various discoveries that turned out to be little more than wishful thinking. The discovery of X-rays by Wilhelm Roentgen in 1895 led to the announcement of a bewildering range of other rays, for example N-rays, black light, rays of positive electricity, Moser rays, selenic rays and magnetic rays.

All of these turned out to be figments of the fertile imaginations of the physicists involved; the result of a kind of ray hysteria.

Kragh describes various other episodes in fascinating detail. What's interesting, of course, is the extent to which it is possible to draw parallels between the trends in science then and now.

In the last 20 years there has been a growing sense that various different forms of information–genetic, digital, entropic and so on–are manifestations of the same thing. What's more, there is intense interest in the role that information might play in the laws of physics. Could it be that information is more fundamental than the concepts of mass or even energy? Perhaps the laws of physics derive from its properties, if only we could decipher them.

Then there is the search for dark matter, a mysterious substance thought to fill the universe even though we cannot see it, feel it or detect it directly.

And of course there are various theories of everything that focus on uniting quantum mechanics and relativity while predicting various extra dimensions, other universes and even an infinite multitude of them.

How much of this will seem irrelevant, bizarre or wrong in a hundred years' time? It's impossible to say, but the parallels with some of the episodes from a hundred years ago make for entertaining speculation.

Kragh clearly shows that only a small fraction of the mainstream scientific debate in the 1890s is relevant today. And there's no reason to think the same won't be true when historians reassess early 21st century science in a hundred years' time.

Ref: arxiv.org/abs/1207.2016: A Sense of Crisis: Physics in the Fin-de-Siècle Era
