If you’ve ever wondered what whales talk about when they mingle deep below the surface of the ocean, then fret not – scientists are working on it. As part of this endeavor, a team of researchers from Norway and Germany has designed a computer algorithm that can decipher whale vocalizations in new ways. Their findings have been published in the journal Physical Review E.
Researchers have long known that whales produce sounds in order to communicate with each other. Sight isn’t the most useful sense in the dark blue depths, and their ability to smell is incredibly limited, so their sense of hearing is acute and highly evolved. Whales do not have external ears; instead, they use specialized structures in their jawbones to “hear” sounds as vibrations in the water around them.
Whales are also able to communicate with each other using noises – mainly clicks, whistles, and pulsed calls. Clicks feature in social interactions but serve mainly for navigation: sound waves bounce off objects, alerting the whale to their presence in much the same way a submarine uses sonar to move through the deep.
Researchers have previously attempted to isolate the individual noises that whales make in order to better understand their purpose. The authors of this new study note that this introduces an inherent bias into the work: Scientists may pick out vocalizations that sound more meaningful to the human ear than they actually are.
Humpback whales are also known to have complex vocalizations. Yann Hubert/Shutterstock
The researchers decided to approach this problem from a different angle, by designing a piece of software that looks at the overall “conversation” of a pod of whales rather than just the individual noises. By looking at how the frequencies of certain clicks, whistles, and pulses change over time, the algorithm can identify the “dialect” of the pod, which may differ from that of another pod.
The team liken this approach to determining the difference between two groups of people speaking English, but one with an American accent and the other with a British lilt. Both are the same language, but they sound different.
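The general idea described above – characterizing a pod by the overall statistics of how its call frequencies change over time, rather than by individual calls – can be illustrated with a toy sketch. Everything here is an assumption for illustration: the synthetic frequency contours, the two-number “dialect signature” (mean frequency and mean frequency drift), and the nearest-signature comparison are stand-ins, not the published method.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pod_calls(base_freq_hz, drift_hz_per_s, n_calls=50, n_samples=100):
    """Hypothetical stand-in for recordings: each call is a frequency contour
    (Hz over one second), and a pod's 'dialect' is modeled as a characteristic
    way those frequencies drift over time."""
    t = np.linspace(0.0, 1.0, n_samples)
    calls = [base_freq_hz + drift_hz_per_s * t + rng.normal(0.0, 20.0, n_samples)
             for _ in range(n_calls)]
    return np.array(calls)

def dialect_signature(calls):
    """Summarize the pod's 'conversation' as a whole: overall mean frequency
    and overall mean rate of frequency change, pooled across all calls."""
    mean_freq = calls.mean()
    mean_slope = np.gradient(calls, axis=1).mean()
    return np.array([mean_freq, mean_slope])

# Two pods with different (assumed) dialect parameters.
pod_a = simulate_pod_calls(base_freq_hz=3000.0, drift_hz_per_s=500.0)
pod_b = simulate_pod_calls(base_freq_hz=3200.0, drift_hz_per_s=-400.0)
sig_a = dialect_signature(pod_a)
sig_b = dialect_signature(pod_b)

# A new batch of recordings is attributed to whichever pod's signature it
# sits closest to in feature space.
new_calls = simulate_pod_calls(base_freq_hz=3000.0, drift_hz_per_s=500.0, n_calls=10)
sig_new = dialect_signature(new_calls)
closest = "A" if np.linalg.norm(sig_new - sig_a) < np.linalg.norm(sig_new - sig_b) else "B"
print(closest)
```

The point of the sketch is the accent analogy in action: no single call decides the match; the pooled statistics of many calls do, just as an accent emerges from many sentences rather than one word.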
The algorithm was tested on six different pods of long-finned pilot whales off the coast of Norway. By looking at the overall symphony of sounds made by each pod, then comparing them to the others, the algorithm was able to quickly tell them apart.
This research is a huge boost for marine biologists hoping to understand the complex social nature of whales. Perhaps more advanced algorithms will eventually be able to recognize that, within a pod, certain patterns of vocalizations have some inherent meaning – and then we’d really get to know what whales talk about.