Words And Melodies Are Decoded In Different Sides Of The Brain

This new study shows that the brain primarily processes the words in songs on the left side, and the melody on the right. katielittle/Shutterstock

Listening to songs is a common activity, yet it relies on the brain’s ability to carry out the difficult task of processing both speech and music at the same time. A new study examined how the brain manages this, and it turns out we have different sides of the brain to thank for distinguishing between the words and melodies in songs.

Inspired by songbirds’ ability to separate sounds along two dimensions (time and frequency), Robert Zatorre, a professor at McGill University’s Montreal Neurological Institute and co-author of the study published in Science, told NPR the team wanted to see whether the same was true in humans.


To do so, they first enlisted the help of a composer and a soprano, who helped create 100 unique a cappella songs, each only a few seconds long, by pairing 10 sentences with 10 original melodies. Then the researchers had some fun. They altered the timing and frequency patterns of some of the recordings before asking 49 participants whether the melody and words in pairs of tunes were the same or different.

The researchers found that when timings were changed, participants could no longer understand the lyrics but could still recognize the melody. On the other hand, when the frequencies in the song had been distorted, the lyrics were still recognizable but the melodies no longer were.
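The study itself used more sophisticated filtering of the recordings’ spectrotemporal modulations, but the basic idea of selectively blurring timing versus frequency detail can be sketched in a few lines. The following is a rough illustration only, not the authors’ method: it blurs a spectrogram along the time axis (smearing temporal detail, as in the lyric-destroying condition) or along the frequency axis (smearing spectral detail, as in the melody-destroying condition), then resynthesizes the sound. The function name `degrade` and all parameter values are illustrative choices.

```python
import numpy as np
from scipy.signal import stft, istft
from scipy.ndimage import uniform_filter1d

def degrade(signal, fs, axis):
    """Blur the magnitude spectrogram along one axis, then resynthesize.

    axis=1 smooths across time frames (degrades temporal detail, hurting speech);
    axis=0 smooths across frequency bins (degrades spectral detail, hurting melody).
    """
    f, t, Z = stft(signal, fs=fs, nperseg=256)
    mag, phase = np.abs(Z), np.angle(Z)
    mag = uniform_filter1d(mag, size=9, axis=axis)  # blur magnitudes only
    _, y = istft(mag * np.exp(1j * phase), fs=fs, nperseg=256)
    return y

# Toy demo: a one-second chirp (rising pitch) stands in for a sung phrase.
fs = 16000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * (200 + 300 * t) * t)

temporally_blurred = degrade(x, fs, axis=1)  # timing information smeared
spectrally_blurred = degrade(x, fs, axis=0)  # frequency information smeared
```

Listening to the two outputs of a real recording processed this way gives an intuition for the experiment: one version loses intelligible words, the other loses recognizable pitch contour.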

While the participants listened to the songs, their brains were scanned using functional MRI. The results showed that different sides of the brain were more heavily involved in decoding each element: speech content was primarily processed in the left auditory cortex, whilst the melody was handled primarily in the right.

The idea that the left and right sides of the brain respond to speech and music differently is not new. Speaking to NPR, Daniela Sammler, a researcher at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, who was not involved in the study, explained that “If you have a stroke in the left hemisphere you are much more likely to have a language impairment than if you have a stroke in the right hemisphere." Other studies have also shown that damage to parts of the right hemisphere can impair a person’s ability to perceive music.


The study adds to our existing knowledge of why this specialization exists: it comes down to the type of acoustical information (in this case, timing and frequency patterns) contained in the source’s sound wave.

Whilst this study used sentences in both French and English, in the future the team would like to test tonal languages, such as Thai and Mandarin, in which pitch carries word meaning, to see how this may affect the results.