From the moment Chordify launched, we’ve done our best to safeguard the simplicity and usability of the website and the apps. But make no mistake: there is some complex AI technology behind the chords at Chordify. The good news is that you don’t have to study Artificial Intelligence to get a grasp of how our algorithm spits out the chords of your favorite songs. Follow along and we’ll show you how it works. Let’s dive right in!

To make it comprehensible, we can break down how we generate the chords here at Chordify into two processes: we need to know which chords occur in a song, and we need to know where those chords fall on the beat, i.e. chord recognition and beat tracking. How does that work?

Weaving the spectrogram

We start off with a song’s audio — I hear you thinking: wait, what’s that exactly? Great question! An audio signal is a digitized waveform that describes deviations in air pressure over time, which we as humans ultimately perceive as sound. It is close to impossible to observe any meaningful information about chords or beats in a song’s waveform directly. Therefore, we convert the audio into a representation that reveals more about its musical content: a spectrogram, a time-frequency representation that shows how the signal’s energy is distributed across frequencies over time. Still, both chord recognition and beat tracking remain challenging tasks, even on this representation.
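To get a feel for what this conversion does, here is a minimal sketch of a spectrogram in plain Python. The frame size, hop size, and synthetic test tone are all illustrative toy values, not anything from Chordify’s actual pipeline:

```python
import math

def spectrogram(signal, frame_size=64, hop=32):
    """Slice a waveform into overlapping frames and take the magnitude
    of each frame's discrete Fourier transform (one row per frame)."""
    frames = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        bins = []
        for k in range(frame_size // 2 + 1):  # frequency bins up to Nyquist
            re = sum(x * math.cos(2 * math.pi * k * n / frame_size)
                     for n, x in enumerate(frame))
            im = -sum(x * math.sin(2 * math.pi * k * n / frame_size)
                      for n, x in enumerate(frame))
            bins.append(math.hypot(re, im))
        frames.append(bins)
    return frames  # a time-by-frequency matrix

# A synthetic tone with 4 cycles per 64-sample frame: its energy
# should concentrate in frequency bin 4 of every spectrogram frame.
tone = [math.sin(2 * math.pi * 4 * n / 64) for n in range(256)]
spec = spectrogram(tone)
peak_bin = max(range(len(spec[0])), key=lambda k: spec[0][k])  # → 4
```

In practice this is computed with a fast Fourier transform, and music-oriented variants (log-frequency or constant-Q representations) are common, but the idea is the same: time along one axis, frequency along the other.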

Feeding the deep neural network

To solve these tasks, we use deep neural networks. You can think of a deep neural network as a robot that you program to have a certain input-output behavior. For example, you ‘feed’ it a spectrogram and ‘ask’ it to return chords, or you feed it a spectrogram and ask it to return beat positions. Initially, the robot has no clue how to recognize a chord or how to detect a beat, so it just gives you random outputs.
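As a sketch of that input-output idea, here is an untrained ‘robot’ reduced to a single layer with random weights. The chord vocabulary, number of spectrogram bins, and the one-layer model are all assumptions for illustration, not Chordify’s real network:

```python
import random

random.seed(0)

CHORDS = ["C", "F", "G", "Am"]  # toy chord vocabulary (assumption)
N_BINS = 8                      # spectrogram bins per frame (assumption)

# One dense layer with random weights: the robot before any training.
weights = [[random.uniform(-1, 1) for _ in range(N_BINS)] for _ in CHORDS]

def predict(frame):
    """Score every chord for one spectrogram frame, pick the highest."""
    scores = [sum(w * x for w, x in zip(row, frame)) for row in weights]
    return CHORDS[scores.index(max(scores))]

frame = [random.random() for _ in range(N_BINS)]
guess = predict(frame)  # an essentially arbitrary guess before training
```

Real networks stack many such layers with non-linearities in between, but the behavior is the same: spectrogram in, chord (or beat) label out.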

Training the robot

Do you remember that one time when our algorithm went on summer bootcamp to train its ass off? Probably not; that’s why we made a report about it.

So, you start to train the robot by showing it vast numbers of spectrograms of songs, along with the chord labels you would expect it to recognize. And of course, you can do the same for beats. This way, the robot will eventually learn how to recognize a chord or how to detect a beat — i.e. how to find its way from input to output.
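The steps above can be sketched as a toy training loop: a tiny two-chord ‘dataset’ of made-up spectrogram frames, and a perceptron-style update that nudges the weights whenever the prediction is wrong. Every name and number here is illustrative, not our actual training setup:

```python
CHORDS = ["C", "G"]
# Made-up "spectrogram frames": energy in bin 0 means C, bin 1 means G.
examples = [([1.0, 0.1, 0.0], "C"), ([0.9, 0.0, 0.2], "C"),
            ([0.1, 1.0, 0.0], "G"), ([0.0, 0.9, 0.1], "G")]
weights = [[0.0] * 3 for _ in CHORDS]  # start clueless, like the robot

def predict(frame):
    scores = [sum(w * x for w, x in zip(row, frame)) for row in weights]
    return scores.index(max(scores))

# Show the examples over and over; on a mistake, reinforce the correct
# chord's weights and weaken the wrongly chosen chord's weights.
for _ in range(50):
    for frame, label in examples:
        pred, true = predict(frame), CHORDS.index(label)
        if pred != true:
            for i, x in enumerate(frame):
                weights[true][i] += 0.1 * x
                weights[pred][i] -= 0.1 * x

accuracy = sum(predict(f) == CHORDS.index(y) for f, y in examples) / 4
```

Real chord and beat networks are far deeper and learn from thousands of labelled songs via gradient descent, but the loop is the same: predict, compare with the label, adjust.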

After seeing enough examples (thousands and thousands of songs), the robot will know how to perform its task on audio it has never encountered before, and that’s when we release it on our website and apps to show off its newly learned skills to all of you!

Our research team is constantly improving the algorithm and, with it, the quality of the chords here on Chordify. Did you know that the Chordify Algorithm wants to learn from you? And that by editing songs, you help our AI grow? It’s true. Happy jamming!

Want to know more about what’s going on under the hood? Here are some suggestions:

Could artificial intelligence disrupt the music industry?

Chordify team members publish an article about subjectivity in chord recognition

Our algorithm went on summer bootcamp to track beats even better

The Chordify Algorithm wants to learn from you

Make your chords even more specific

Chordify presents new beat-tracking algorithm at ISMIR 2019

The science behind writing down chords

Check out the big brain on our new algorithm
