As any music lover knows, the human ear is adept at picking out subtle patterns. And the growing power of computers to translate almost any kind of information into variations in pitch, rhythm, or volume is boosting the field of sonification, the representation of data as sound. From sounding out variations on a pathologist’s tissue section slide to flagging suspicious travel activity, sonification has the potential to help scientists, doctors, and analysts spot trends and trouble spots.
Ronald Coifman, a mathematician at Yale University, and Jonathan Berger, a composer at Stanford University, have developed software that transforms light reflected off colon cells under a microscope into pulsating sounds. Under one setting, cancerous cells sound louder than healthy ones. Coifman and Berger's study is mainly aimed at discovering which sound patterns convey complex data most effectively, a goal Coifman expects the field to reach within two years.
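The core idea of sonification, mapping data values onto sound parameters such as pitch and volume, can be sketched in a few lines of code. The sketch below is purely illustrative, not the researchers' actual software: it linearly maps each value in a data series to a tone's frequency and writes the result as a WAV file using only Python's standard library. All function names and parameter choices (the 220–880 Hz range, tone duration) are the author's assumptions for the example.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second, CD-quality mono

def sonify(values, duration=0.25, low_hz=220.0, high_hz=880.0):
    """Map each data value to a sine tone: higher values -> higher pitch.

    Returns raw 16-bit little-endian mono PCM frames, one tone per value.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    frames = bytearray()
    for v in values:
        # Linear map: data value -> frequency between low_hz and high_hz.
        freq = low_hz + (v - lo) / span * (high_hz - low_hz)
        for i in range(int(SAMPLE_RATE * duration)):
            sample = math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767 * 0.8))
    return bytes(frames)

def write_wav(path, frames):
    """Wrap raw PCM frames in a playable WAV container."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)        # mono
        w.setsampwidth(2)        # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(frames)

# Example: a rising-then-falling series becomes a rising-then-falling melody.
data = [1, 3, 7, 12, 9, 4, 2]
write_wav("sonified.wav", sonify(data))
```

Real research systems go far beyond this one-dimensional mapping, layering rhythm, timbre, and loudness to encode several variables at once, but the principle is the same: a perceptible sound parameter stands in for a data dimension.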