New AI Method Predicts Music Genres by Analyzing Lyrics and Chords of Songs
According to an artificial intelligence (AI) tool created by a USC computer science student, the record-breaking track “Old Town Road” is a little bit country and a little bit rock ’n’ roll.

An interactive visualization featuring the hit songs analyzed by Greer and Narayanan. (Image credit: Timothy Greer)

The debate can finally be settled: Lil Nas X’s record-setting, chart-topping hit “Old Town Road” is indeed country. But it is also a bit rock ’n’ roll. And when its lyrics and chords are examined together, it is straight-up pop.

That, at least, is according to an AI tool created by USC computer science PhD student Timothy Greer. Greer’s technique automatically predicts music genres by examining how lyrics and chords interact with one another throughout a song.

The technique categorized “Old Town Road” as country based on the lyrics, as rock based on the chords (the song samples a Nine Inch Nails track), and as pop based on the chords and lyrics combined.

The paper, titled “Using Shared Vector Representations of Words and Chords in Music for Genre Classification,” will be presented at the Speech, Music and Mind 2019 conference on September 14th, 2019.

A Very Human Experience

“Old Town Road is an interesting song,” said Greer, a lifelong musician who currently plays saxophone and keyboard in an LA-based indie rock band.

The lyrics are steeped in the country genre, but the chords and the instrumentation don’t sound like country at all. The algorithm highlights the complexity of music, both in terms of how it is constructed and how it is perceived; in other words, how people process it.

Timothy Greer, Computer Science PhD Student, USC Viterbi School of Engineering

This line of music research, which aims to computationally understand the stories music tells and how people feel and are moved by it, is part of a larger research program in Computational Media Intelligence at the USC Signal Analysis and Interpretation Laboratory (SAIL).

Music construction and perception are related, but they are not one and the same.

Shrikanth Narayanan, Niki and Max Nikias Chair and Professor of Electrical and Computer Engineering, USC Viterbi School of Engineering

Narayanan, Greer’s supervisor and a co-author of the paper, has previously examined the vocal patterns of opera singers and beatboxers using MRI scans, predicted violence ratings from movie scripts, and created technology that judges a speaker’s emotions from their voice. He said he is excited about this new study because it offers a new way of analyzing music computationally and could uncover unexpected patterns.

“We always say there is no hard-set rule for human experiences of music,” said Narayanan, a classical music enthusiast who plays the violin and the veena, an Indian stringed instrument. “AI and machine learning can provide a lens through which to look at this very human experience.”

A New Sound

“Old Town Road,” which has topped the charts for 19 weeks, has been noteworthy for its genre-blending character. One of the most hotly debated subjects in the pop world this summer, everyone seems to have a different view: is it pop, country, rock, or a blend of all three?

In April 2019, the song was removed from the Billboard Hot Country chart because it did “not embrace enough elements of today’s country music to chart in its current version,” according to a Billboard statement.

Greer tested the song using three models he had developed to predict genre: one using only chord embeddings, one using only lyric embeddings, and one using a combination of chord-and-lyric embeddings. He trained the system on a dataset of 190,165 musical segments from 5,304 pop songs with lyrics and corresponding chords.
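The general idea of embedding-based genre classification can be illustrated with a minimal sketch. This is not Greer's actual model: here, each song is represented as the average of simple one-hot vectors over its chord and lyric tokens, and genre is predicted by nearest centroid under cosine similarity. All song data below is hypothetical, invented purely for illustration.

```python
from collections import Counter
import math

# Hypothetical toy dataset: each song is a flat list of chord tokens
# and lyric words, labeled with a genre.
TRAIN = [
    (["G", "C", "D", "G", "horse", "road", "ride"], "country"),
    (["C", "G", "Am", "F", "love", "baby", "dance"], "pop"),
    (["E5", "G5", "A5", "E5", "night", "loud", "fire"], "rock"),
]

VOCAB = sorted({tok for tokens, _ in TRAIN for tok in tokens})

def embed(tokens):
    """Average of one-hot vectors: a normalized token-frequency vector."""
    counts = Counter(tokens)
    return [counts[w] / len(tokens) for w in VOCAB]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# One centroid per genre: the mean of that genre's song vectors.
_by_genre = {}
for tokens, genre in TRAIN:
    _by_genre.setdefault(genre, []).append(embed(tokens))
CENTROIDS = {
    g: [sum(col) / len(vecs) for col in zip(*vecs)]
    for g, vecs in _by_genre.items()
}

def predict(tokens):
    """Return the genre whose centroid is most similar to the song."""
    vec = embed(tokens)
    return max(CENTROIDS, key=lambda g: cosine(vec, CENTROIDS[g]))
```

A real system would learn dense embeddings from a large corpus and use a trained classifier rather than raw token counts, but the pipeline shape (tokens to vectors to genre label) is the same.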

Nearly all genre prediction tools use a song’s entire audio file, which means retrieving and processing a high-quality recording, but Greer’s technique can categorize genre using just chords and lyrics, which are typically available online with a quick Google search.

“This interplay between chord sequences and lyric sequences may give us a better glimpse into how we perceive genre than using either alone, although both of these modalities contain useful information alone, as well,” said Greer.

The research offers better insight into how music is perceived and processed, particularly how human perception, and categorization, of music genre varies depending on the “looking glass” used.

Applications include how music content is marketed, consumed, and tagged; neuropsychology and the mechanisms of human thought; and affective computing systems that respond to human emotions.

Source: https://viterbischool.usc.edu
