Bottlenose dolphins, in addition to their remarkable intelligence and agility, are able to “see” their underwater world with sound. This ability—echolocation—is often referred to as a “sixth” sense, but it’s more like a hyperdeveloped version of hearing.
Now a team of researchers from the nonprofit SpeakDolphin.com has released an image it claims depicts a diver as seen via a dolphin’s echolocation. This claim should be taken with about as many grains of salt as there are in the sea, as we’ll see.
You may well have seen this story already—it’s been making the rounds for its undeniable “wow” factor. If not, here’s the short version: the SpeakDolphin team, led by Jack Kassewitz, recorded the echoes when a bottlenose directed her echolocation beam (high-frequency sound pulses) over a submerged diver. They sent the recordings to be processed by vibration-imaging company CymaScope, which used the series of audio “snapshots” to reconstruct a three-dimensional representation of the submerged diver. And, in a final step, these images were passed along to 3D Systems, the inventors of 3-D printing, who created a three-dimensional printout of “what the dolphin saw.”
It all sounds very, very cool. But it’s time to break out my naturalist curmudgeon hat.
First point: dolphins can see, in the traditional sense, and can see quite well. (A fact apparently lost in much of the coverage so far.) Their color vision is crap, but they have good low-light vision and their eyes are specially adapted to allow them to see above and below the surface clearly. Unlike some bats, which rely on echolocation so heavily that it’s reasonable to say they perceive their world almost exclusively via sound, dolphins use a mix of sound and sight. As far as how their brains process this unique mix of sensory information … well, there’s no mention of brains anywhere in the press release or credulous first wave of coverage.
Before I go further, credit where credit is due: Rachel Feltman at the Washington Post posted her own takedown of the release while I was working on this post, and her article is worth reading in its entirety for a thorough breakdown of the problems with the SpeakDolphin release’s credibility.
There are also more general issues with the science behind the viral image … in that there doesn’t seem to be much. Or, more specifically, the usual ways of vetting research have been ignored. The results, as Feltman notes, are not published in any journal. SpeakDolphin founder Jack Kassewitz told Tech Insider’s Rebecca Harrington that he prefers to make his research publicly available in book form instead of through journals, a sentiment I empathize with greatly even as it tickles my skeptic bone.
Even more troubling, Feltman points out, Kassewitz seems to see himself as a rebel for the dolphin-human cause, calling mainstream science “bigoted” and “racist” in its attitudes toward dolphins in a statement on his site. While it’s certainly true that science has been slow to appreciate the full intelligence of nonhuman animals, his stance seems more passionate than reasoned. Not to put too fine a point on it, but people who view themselves as maverick geniuses tend to be only half-right. Good science is communal for a reason.
Harrington asked for clarification about how CymaScope generates its images, but did not get any. Instead, she received the following response from Kassewitz: “I am totally open to criticism, but from physicists, because that’s what is going on here.”
This is undeniably a great idea for an experiment. If we can figure out how dolphins perceive the world with a mix of echolocation and vision, it will almost certainly point to new and interesting techniques for sonar and underwater imaging more generally. For now, the images may represent some interesting piece of the puzzle of how dolphins use sound and sight to navigate their world, but even if the research methodology is solid, it would be a gross overstatement to claim this is “what the dolphin saw.”