News of Death, Greatly Exaggerated
In his controversial book The End of Science, John Horgan suggested that the era of great scientific discoveries is over: Exploring the solar system or the human genome may keep us busy for a while, but our findings probably won’t require the invention of radical new theories on a par with those of Copernicus, Darwin or Einstein. After all, how much has happened in astronomy since the work of Edwin Hubble in the 1920s, or in genetics since James Watson and Francis Crick’s description of DNA in the 1950s, to fundamentally change the way we see the universe and our place in it?
When Horgan’s book came out in 1996, I was a reporter at Science magazine covering developmental biology, a discipline undergoing a stunning metamorphosis thanks to new techniques for manipulating genes. I knew better, therefore, than to swallow Horgan’s idea whole. I suspected that readers familiar with other fields would scoff just as loudly, but I lacked detailed evidence. Now John Maddox, one of “the last great scientific polymaths” (in the estimation of Richard Dawkins), has assembled that evidence into a captivating, highly readable book.
In many of his editorials in the prestigious research journal Nature, which he led for 23 years, Maddox played court cartographer, assembling scientists’ field reports into maps of the territories of the natural sciences that were being colonized successfully and those that remained terra incognita. In What Remains To Be Discovered, Maddox focuses on the empty parts of the map, those denoting woeful gaps in scientists’ understanding of such basic matters as the nature of the Big Bang and the connection between electromagnetism, the strong and weak nuclear forces and gravity. If many fields seem to be caught in the doldrums, with their last big organizing ideas having appeared more than a generation ago, it’s not a sign of the end of science, but merely a measure of the work left undone, Maddox chides.
“The truth is that the sheer success of science in the last half-millennium has engendered a corrosive impatience,” Maddox writes. “We too easily forget how recent are the empirical and theoretical foundations of present understanding. Prudence, or merely good manners, would dictate a seemly recognition that they may also be incomplete.” The news of the death of science has been greatly exaggerated.