Musical Genres Classified Using the Entropy of MIDI Files

The automated classification of music is an important outstanding problem in computer science. Now a straightforward way of analyzing music’s information content could help.

Communication is the process of reproducing a message created at one point in space at another point in space. It has been studied in depth by numerous scientists and engineers, but it is the mathematical treatment of communication that has had the most profound influence.

To mathematicians, the details of a message are of no concern. All that matters is that the message can be thought of as an ordered set of symbols. Mathematicians have long known that this set is governed by fundamental laws first outlined by Claude Shannon in his mathematical theory of communication.

Shannon’s work revolutionized the way engineers think about communication, but it has far-reaching consequences in other areas, too. Language involves the transmission of information from one individual to another and information theory provides a window through which to study and understand its nature. In computing, data is transmitted from one location to another and information theory provides the theoretical bedrock that allows this to be done most efficiently. And in biology, reproduction can be thought of as the transmission of genetic information from one generation to the next.

Music too can be thought of as the transmission of information from one location to another, but scientists have had much less success in using information theory to characterize music and study its nature.

Today, that changes thanks to the work of Gerardo Febres and Klaus Jaffé at Simon Bolivar University in Venezuela. These guys have found a way to use information theory to tease apart the nature of certain types of music and to automatically classify different musical genres, a famously difficult task in computer science.

One reason why music is so hard to study is that it does not easily translate into an ordered set of symbols. Music often consists of many instruments playing different notes at the same time. Each of these can have various qualities of timbre, loudness, and so on.

Capturing all this in a set of symbols, along with whatever individual interpretation the musician adds, is a tricky business. That hasn’t stopped researchers trying, albeit with limited degrees of success.

Febres and Jaffé tackle this problem in a remarkably simple way using a common standard for digitizing music called MIDI. A MIDI file is a digital representation of a piece of music that can be read by a wide variety of computers, music players and electronic instruments.

Each file contains information about a piece of music’s pitch and velocity, volume, vibrato, and so on. This allows music created in one location to be reproduced accurately at another location.

But a MIDI file itself is simply an ordered series of 0s and 1s, and this gave Febres and Jaffé a way to analyze it using standard information theory. Indeed, they simply opened each file as a text file and read the resulting sequence of seemingly random symbols.
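To make that first step concrete, here is a minimal sketch in Python of treating a MIDI file as nothing more than an ordered sequence of symbols. The filename and the choice of reading single raw bytes as symbols are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter

def read_symbols(path):
    """Return the contents of a file as a list of byte-valued symbols."""
    with open(path, "rb") as f:
        return list(f.read())

# "example.mid" is a placeholder filename, not a file from the study.
symbols = read_symbols("example.mid")
print(len(symbols), "symbols,", len(set(symbols)), "distinct values")
print(Counter(symbols).most_common(5))  # the most frequent symbols in the file
```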

The beauty of information theory is that the tools developed for compressing messages sent to Mars or for analyzing the components of language can be applied equally to any set of symbols. And that is what Febres and Jaffé have done.

They began by compressing each set of symbols into the minimum number necessary to generate the original music. This fundamental set then allowed them to measure the entropy or information content associated with each piece of music.
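The authors' construction of that fundamental symbol set is more involved, but the quantity they ultimately measure rests on the familiar Shannon entropy of a symbol distribution. A rough sketch, assuming byte- or character-valued symbols as above:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy, in bits per symbol, of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy usage: an 11-symbol string with a skewed distribution.
print(shannon_entropy("abracadabra"))  # roughly 2.04 bits per symbol
```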

But they also studied the way this entropy varied over time. Indeed, they studied how this second-order entropy varied in 450 pieces from 71 composers and 15 different periods or types of music.
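The paper defines its second-order entropy in its own terms; the sketch below is only a generic windowed-entropy profile that captures the general idea of tracking how the information content of a piece changes as it unfolds. The window and step sizes are arbitrary assumptions.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_profile(symbols, window=2048, step=512):
    """Entropy over sliding windows: a rough proxy for how the
    information content of a piece varies over time."""
    return [shannon_entropy(symbols[i:i + window])
            for i in range(0, len(symbols) - window + 1, step)]
```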

To their surprise, they found that music from the same genre shared similar values for this second-order entropy. At the same time, this type of analysis shows how musical genres have evolved over time.

That’s interesting work that provides a fascinating new way of studying music. There are some caveats, of course. While some musical genres are clearly identifiable in this way, other seemingly different styles overlap.

For example, Venezuelan and Indian Raga music occupy unique regions in this parameter space. Various classical composers also occupy specific regions and so can be potentially identified by this method.

But on average, rock music and classical music strongly overlap, making it hard to identify them automatically. Future work may improve matters, perhaps by increasing the size of the database, for example.

Nevertheless, Febres and Jaffé have made significant strides using a technique that should be widely applicable. Their next task, should they choose, is to find a way to apply their method, perhaps for music recommendation systems, before somebody else gets in on the act.

Ref: arxiv.org/abs/1510.01806 : Music Viewed by Its Entropy Content: A Novel Window for Comparative Analysis
