Health and Medicine / Neuroscience

Your Brain Hears Math Differently To Normal Speech


Dr. Katie Spalding

Freelance Writer

Aug 16 2021, 18:00 UTC

You can't see it, but that kid's parietal lobe is red hot right now. Image credit: Morrowind/Shutterstock

If you’re an artsy kind of person, it can sometimes feel like your brain just isn’t wired for stuff like math or science. You can rhapsodize for weeks about the transformative brutality of Picasso’s Guernica, but the second you hear something like “two plus four is six” your brain just returns the neurological equivalent of a 404 error.


Well, it turns out there might be a reason for that. New research published today in the Journal of Neuroscience has shown that even when math problems are spoken out loud, your brain processes them completely differently from normal speech.

“[It] is well known that the left hemisphere is more involved in language processing,” study co-authors Joshua Kulasingham and Jonathan Z Simon told IFLScience. “[Our] work also agrees with that, since we found sentence responses predominantly in the left temporal lobe. In contrast, responses to equations were present in both hemispheres, consistent with prior studies showing that arithmetic processing is more bilateral.”

Sample sentences, and the areas of the brain they most consistently activated. Kulasingham et al., JNeurosci, 2021

To figure out how the brain copes with spoken math, the team invited 22 test subjects to a "cocktail party" in a magnetoencephalography (MEG) machine.

Well, OK. Technically it wasn’t an actual cocktail party – it was something called a “cocktail party paradigm”. Named for the garbled, overlapping noise that you hear when multiple people are at a party all chatting to each other about various unrelated topics, this experiment involved playing the subjects two speech recordings simultaneously and seeing how their brain lit up when they tried to understand one or the other. One recording played a simple four-word sentence like “kids like sweet food” or “cats drink warm milk”, while the other played somebody reading a simple math equation – something like “two plus two is four”, or “eight less six is two”.
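The stimulus construction behind that paradigm is simple to sketch: two speech recordings of equal length are summed into a single mixture, which is then normalized so it doesn't clip. The snippet below is a minimal illustration using synthetic tones in place of real speech; it is not the authors' actual stimulus pipeline, and all function and variable names here are hypothetical.

```python
import numpy as np

def mix_streams(sentence_audio, equation_audio):
    """Sum two equal-length mono signals into one 'cocktail party' mixture,
    normalized to a peak amplitude of 1 to avoid clipping."""
    mixture = sentence_audio + equation_audio
    peak = np.max(np.abs(mixture))
    return mixture / peak if peak > 0 else mixture

# Synthetic stand-ins for the two speech recordings (1 second at 16 kHz)
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
sentence = 0.5 * np.sin(2 * np.pi * 220 * t)   # stand-in for "cats drink warm milk"
equation = 0.5 * np.sin(2 * np.pi * 330 * t)   # stand-in for "two plus two is four"

mixed = mix_streams(sentence, equation)
```

In the real experiment the two streams were, of course, actual recordings of spoken sentences and equations, and the listener's job was to attend to just one of them.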


Now, those are both easy enough to follow on their own, but combine them and you’re facing a jumble of nonsense. I mean, listen:

[Audio: the two recordings played simultaneously, as presented in the study]

Could you make anything out? Try to fix your attention on just one of the voices and listen out for anything that doesn’t make sense. You know, maybe one of them tells you that one plus two is 10, or that dogs write cold soup. Something like that.

It’s unlikely that you have a spare magnetoencephalography machine lying around, but if you did, you would have seen your brain light up just then, and in rather predictable ways. So predictable, in fact, that the team behind today’s paper would have been able to tell which voice you were listening to – and how good you were at picking out the nonsense – just from looking at your brain scans.


 “[The] brain tracks high-level sentence and equation structures, but only when attended to,” Kulasingham and Simon explained. “These high-level responses [are] in areas associated with language processing (left temporal lobe) and arithmetic processing (bilateral parietal lobes) for each case … Our work confirms that there are overlapping but distinct cortical networks involved in language and arithmetic processing and that these networks can be well separated during an auditory attention task.”

The bombardment of speech that the study participants listened to may have been discombobulating, but from a scientific perspective, it was inspired. Previous work in this area had often used more “monolithic” designs, Kulasingham and Simon say, and this meant that the brain wasn’t being forced to prioritize.

“[By] asking the brain to do more than one thing at a time … we can better distinguish which parts of the brain are most critical to the primary task,” they told IFLS. “More traditional experiments … allow brain areas that ordinarily wouldn’t be involved in the processing to engage anyways (because there’s no reason not to).”


With their “cocktail party” experiment, however, the team could watch the brain process language-independent concepts – “in this case, arithmetic,” Kulasingham and Simon said, “but there are lots of other possibilities” – on their own terms. And that opens up some very intriguing questions.

“In principle, this same technique could even be used to see if an animal can learn how to employ the rules of some abstract system … without requiring the animal to actively demonstrate the knowledge,” they told IFLS. “Investigating the details of how the brain tracks these sentences and equations is [also] interesting … can we separate out and identify possible cortical mechanisms involved in detecting equation boundaries, identifying the arithmetic operation, parsing the equation, or calculating the final result?”

