Since the groundbreaking release of Songs of the Humpback Whale, a vinyl that reached number 62 on the charts, was featured in National Geographic, and even went into space aboard Voyager, humans (and possibly aliens) have been obsessed with whale song. It’s easy to see why as you take in the haunting whines and clicks that resonate through the water, even if we don’t have a clue what the whales are saying.
But what if we could translate these clicks, whistles, and whines, and — this is where it gets really wild — what if we could send messages back? Communicating with whales might sound out of reach (and like the start of the greatest disaster movie ever), but it’s actually edging closer and closer to reality.
The interspecies conversation is being led by Project CETI (Cetacean Translation Initiative), which began its ambitious endeavor back in March 2020, according to Hakai Magazine. The goal: to decode whale songs, establish their “language” and, hopefully, talk back. One can only hope someone captures a reaction video of the first whale to find itself in conversation with an ROV.
The idea, crackpot or genius depending on your feelings about striking up a dialogue with a species whose intelligence could prove to rival our own (when’s the last time you heard about a whale putting 15 boiled eggs up its rectum?), came about through a series of serendipitous conversations (with humans, not whales).
It began with computer scientist Shafi Goldwasser and marine biologist David Gruber as they discussed the similarities between sperm whale clicks and Morse code. They joined forces with computer scientist Michael Bronstein who posited that AI could be used to analyze mountains of sperm whale recordings to look for patterns comparable to speech.
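As a loose illustration of the kind of pattern-hunting Bronstein describes (and emphatically not Project CETI’s actual pipeline), one could represent each recorded coda as its sequence of inter-click intervals and group recordings with similar rhythms. Everything below, including the sample intervals, is invented for demonstration.

```python
# Illustrative sketch only: group made-up sperm whale codas by click rhythm.
# A coda is modeled as a list of inter-click intervals in seconds; real
# research data and methods are far more sophisticated than this.

def normalize(ici):
    """Scale intervals to sum to 1, capturing rhythm rather than tempo."""
    total = sum(ici)
    return [x / total for x in ici]

def distance(a, b):
    """Euclidean distance between two equal-length rhythm vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def group_codas(codas, tol=0.05):
    """Greedily group codas whose normalized rhythms fall within tol."""
    groups = []
    for coda in codas:
        rhythm = normalize(coda)
        for group in groups:
            if len(group[0]) == len(rhythm) and distance(group[0], rhythm) < tol:
                group.append(rhythm)
                break
        else:
            groups.append([rhythm])
    return groups

# Synthetic codas: two share an evenly spaced rhythm, one is front-loaded,
# so the grouping collapses the three recordings into two rhythm classes.
codas = [[0.2, 0.2, 0.2], [0.4, 0.4, 0.4], [0.1, 0.3, 0.6]]
print(len(group_codas(codas)))  # -> 2
```

Normalizing away tempo before comparing is the kind of design choice a real analysis has to make: the same "word" might plausibly be clicked faster or slower by different whales.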
The research throws into question the headache-inducing concept of when communication constitutes language, and whether language really exists outside of humans. In a recent interview with IFLScience, Dr Valeria Vergara had a lot to say about cetacean communication, having dedicated much of her career to eavesdropping on chatty beluga whales, which are known as the “canaries of the sea” for their noisy nature.
As Vergara explained, young belugas are known to exhibit their own “babble talk” as they try to learn vocalizations from their parents and wider pod. One emerging avenue of Vergara’s work focuses on recognizing unique vocal signatures among beluga whales, the decoding of which could be pivotal to establishing whether they talk and whether those exchanges constitute a conversation.
Taking the cetacean chat and turning it into something we can analyze requires processing a lot of data, far more than human researchers can manage, which is where AI comes into play. Language models like GPT-3 can effectively finish an unfinished sentence (or headline) by learning what conventionally comes next, a bit like autocomplete. That said, GPT-3 still sometimes gets it catastrophically wrong.
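The "learning what conventionally comes next" trick can be sketched in miniature with simple word-pair counting; a toy like this only hints at the idea, since GPT-3 uses an enormous neural network rather than a frequency table, and the corpus below is invented.

```python
# Toy next-word predictor: count which word most often follows each word,
# then "autocomplete" by picking the most frequent follower. This is the
# core intuition behind language models, vastly simplified.
from collections import Counter, defaultdict

def train_bigrams(text):
    """Tally, for every word, how often each other word follows it."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the most common follower of `word`, or None if unseen."""
    candidates = follows.get(word.lower())
    if not candidates:
        return None
    return candidates.most_common(1)[0][0]

corpus = "the whale sings and the whale dives and the whale sings again"
model = train_bigrams(corpus)
print(predict_next(model, "whale"))  # -> "sings" (seen twice, vs "dives" once)
```

The catastrophic failures mentioned above fall out naturally from this setup: the model only knows what tends to follow what, so a rare or ambiguous context sends it confidently down the wrong path.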
So, we have a model and we have whales — what’s missing? Tech like this is hungry for data: it has taken hundreds of billions of words of text to make it work for human language, whereas the current bank of sperm whale “codas”, the term for the sperm whale equivalent of a word, sits at a comparatively measly 100,000. The next step, therefore, is to significantly bulk up the number of sperm whale recordings so that we can adequately train an AI’s neural network.
Even once that’s done, however, it’s questionable how the tech might be received by unsuspecting whales. “Maybe they would just reply, ‘Stop talking such garbage!’” said Bronstein.
[H/T: Hakai Magazine]