Our Brains Produce A Detectable Signal When We Understand Something We Hear

A comparison between the signals at the back of the scalp when someone understands speech (top row) and when someone doesn't. Professor Ed Lalor

Don't you sometimes wish there were a way to tell whether the person you are talking to is not just listening but actually understanding what you say, like a social media app showing when your message has been read? It turns out there is, but no one had been able to detect it until now. The technology may reduce the danger of communication breakdowns in high-stress situations.

When you think about it, the brain's ability to process complex ideas at the rapid rate at which people speak is one of the most remarkable aspects of human cognition, particularly in languages packed with words that sound identical but have very different meanings.

In an effort to understand how we do this, Professor Ed Lalor of Trinity College Dublin used datasets from previous studies in which brainwaves were tracked by electroencephalography (EEG) while people listened to audiobooks. He looked for occasions when the books contained words that didn't fit well with those that preceded them.

The frequency of word combinations can help teach language to computers: artificial intelligence systems are fed enormous quantities of text and programmed to track how often certain combinations of words appear together. Lalor reasoned that words appearing in highly unusual combinations would indicate difficult-to-understand moments in the audiobooks. He then tested the listeners on their understanding of what they had heard at these points.
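The co-occurrence idea described above can be sketched in a few lines of code. This is a minimal toy illustration, not the actual model used in the study: it counts adjacent word pairs in a tiny corpus and scores how "surprising" a word is given the word before it (the rarer the pairing, the higher the score).

```python
import math
from collections import Counter

# Toy corpus; real systems, as the article notes, use enormous text collections.
corpus = ("the dog chased the ball and the dog caught the ball "
          "the cat chased the mouse").split()

# Count how often each word, and each adjacent word pair, appears.
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def surprisal(prev, word):
    """-log2 P(word | prev); higher means the pairing is more unexpected."""
    pair_count = bigrams.get((prev, word), 0)
    if pair_count == 0:
        return float("inf")  # never seen together in this tiny corpus
    return -math.log2(pair_count / unigrams[prev])

# "dog" follows "the" twice in the corpus, "mouse" only once,
# so "mouse" gets the higher (more surprising) score.
print(surprisal("the", "dog"))
print(surprisal("the", "mouse"))
```

A word with a high score relative to its context would, on this logic, be a candidate "unexpected word" of the kind Lalor looked for in the audiobooks.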

A distinctive signal could be seen whenever someone understood what had been said, despite the unexpected word.

However, this disappeared when those involved in the trial could not understand what was being said, Lalor reports in Current Biology. Importantly, the signal's absence is independent of the reason for the communication failure. If a participant couldn't understand because other noises drowned the audiobook out, the signal disappeared, as it did if the listener was distracted, or simply couldn't get their head around what they had just heard.

How much this tells us about the methods our brains use to understand speech remains to be seen, but Lalor argued in a statement there could be plenty of other benefits. “Potential applications include testing language development in infants, or determining the level of brain function in patients in a reduced state of consciousness. The presence or absence of the signal may also confirm if a person in a job that demands precision and speedy reactions – such as an air traffic controller, or soldier – has understood the instructions they have received, and it may perhaps even be useful for testing for the onset of dementia in older people based on their ability to follow a conversation,” he said.

Electroencephalography (EEG) equipment ready to test whether someone has understood what they heard. It might need some alterations for practical applications. Professor Ed Lalor


