A View from Erica Naone
Software with a Better Ear for Music
A music search engine being previewed this week analyzes the waveform patterns of songs to classify them.
A music search engine that uses a novel technique to classify songs will go into beta this week.
I wrote about the system a few months ago. It was designed by researchers from the University of California, San Diego, including assistant professor Gert Lanckriet. The researchers have trained the search engine using information contributed by Facebook users via an application called HerdIt. The goal is to train the system to tag songs automatically, using statistical analysis applied to the waveform patterns that represent each song:
About 90 percent of the time, Lanckriet says, the system identifies patterns that are ordinarily hidden. For example, the patterns that identify a hip-hop song might include a typical hip-hop beat, but also elements that the listener wouldn’t recognize as a pattern within the song. “On average, these automatic tags predict other humans’ [tags] pretty much as accurately as a given human person can do,” Lanckriet says.[…] He envisions a system that could take an unfamiliar song, from an independent band or even something recorded in a user’s garage, analyze it on the fly, and suggest appropriate tags and similar music.
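To make the idea concrete, here is a minimal sketch of waveform-based auto-tagging. This is not the UCSD researchers' actual method (which the article does not detail); it assumes a toy pipeline of my own: summarize each waveform as energy in a few frequency bands, then assign the tag whose average feature profile (centroid) is closest.

```python
# Toy sketch of automatic tagging from waveforms (NOT the UCSD system):
# extract coarse spectral features via the FFT, then score tags with a
# nearest-centroid classifier trained on labeled example waveforms.
import numpy as np

def spectral_features(waveform, n_bands=8):
    """Summarize a waveform as normalized energy in n_bands frequency bands."""
    spectrum = np.abs(np.fft.rfft(waveform))
    bands = np.array_split(spectrum, n_bands)
    feats = np.array([band.mean() for band in bands])
    return feats / (feats.sum() + 1e-12)

def train_centroids(examples):
    """examples maps tag -> list of waveforms; returns tag -> mean feature vector."""
    return {tag: np.mean([spectral_features(w) for w in ws], axis=0)
            for tag, ws in examples.items()}

def predict_tag(waveform, centroids):
    """Return the tag whose centroid is nearest to this waveform's features."""
    feats = spectral_features(waveform)
    return min(centroids, key=lambda tag: np.linalg.norm(feats - centroids[tag]))

# Hypothetical training data: "bass-heavy" tracks concentrate energy at low
# frequencies, "treble-heavy" tracks at high frequencies.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
low = [np.sin(2 * np.pi * 60 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]
high = [np.sin(2 * np.pi * 1500 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]
centroids = train_centroids({"bass-heavy": low, "treble-heavy": high})

print(predict_tag(np.sin(2 * np.pi * 80 * t), centroids))  # low-frequency test tone
```

A real system would use richer features (timbre, rhythm) and a statistical model trained on crowd-sourced labels like HerdIt's, but the shape of the problem, mapping waveform-derived features to human tags, is the same.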
I’m looking forward to trying it out. See the video below for a more detailed explanation of the project.