Background information

How AI is used to decode animal languages

Anna Sandner
23/11/2023
Translation: Megan Cornish

An app that lets us talk to dogs, translation tools that decode whale songs, or systems that translate the dance of a honey bee into human language: artificial intelligence opens up new possibilities for understanding animal languages. Let’s explore the current state of research and development, and where the potential risks lie.

How nice would it be if we could talk to our pets like we would to a good friend? What if, thanks to a digital translator, a dog or cat could simply say what they need, what scares them, what they want and what they dream about? Most dog owners understand when their four-legged friend is hungry, afraid or wants to go out. But often enough, even this communication goes wrong, because people listen only to the sounds, while animals (also) communicate through facial expressions, posture and actions. And when it comes to rabbits, hamsters or canaries, even basic needs often aren’t clearly understood.

In the future, could we use artificial intelligence to ask our guinea pig what its mood is and what it thinks about us and the world? Well, we can’t expect philosophical conversations with rodents, but part of humanity’s long-held dream of «talking to animals» seems like it might be possible in the future.

AI and machine learning as the key to communication

What researchers, behavioural biologists, animal trainers and other animal lovers have long been trying to do is now within reach thanks to machine learning and artificial intelligence. For decades, attempts have been made to teach animals human language in various ways. Be it through sign language or picture boards, this research has sometimes yielded remarkable insights – but the laborious attempts remained isolated examples and rarely got beyond a few words.

Washoe, the signing chimpanzee

One of the most impressive examples of communication with animals came from the Washoe Project, which began in 1966. Psychologist and anthropologist Roger Fouts trained the young chimpanzee Washoe in American Sign Language (ASL) and later described his experiences in the book Next of Kin.

Washoe learned several hundred signs and was able to communicate well by independently beginning to combine individual words to create sentences. She also passed on what she had learned to her adopted son and talked to other signing apes in ASL.

Next of Kin describes the author’s impressive experiences with the chimpanzee Washoe, who spoke American Sign Language.
Source: HarperCollins Publishers Inc

There were other examples of teaching human language to apes. There was Koko, a gorilla who learned a modified form of sign language and used it to communicate successfully with people. And Kanzi, a bonobo who partially understands spoken English and communicates using a picture board containing 300 lexigrams.

But viewed through a modern lens, these attempts to teach apes human language from an anthropocentric world view aren’t only ethically controversial. Statements about an animal’s ability to learn aren’t the same as statements about its understanding of what it has learned.

Understanding animal languages and learning to «speak»

Now rapid technological advances are enabling a new approach. Instead of trying to teach animals human language, researchers around the world are decoding different animal languages. The aim is to understand how animals communicate with one another and then to «speak» to them in the animal’s own language. This is a zoocentric approach.

The breakthrough rests on several new technologies that could make possible what was long unthinkable. With ever-improving data collection options (e.g. more sensitive underwater microphones that record whale sounds around the clock), large amounts of animal vocalisations can be collected, isolated from background noise and then used as the basis for decoding. Evaluating these enormous amounts of data only becomes feasible with ever faster computing power, more sophisticated analysis programs and, ultimately, AI.
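The first step of such a pipeline – picking out the loud animal calls from hours of quiet background recording before any model sees the data – can be sketched in a few lines. This is a minimal, purely illustrative example; the frame size, threshold and toy signal are all invented, not taken from any of the projects described here.

```python
# Minimal sketch: find "call" segments in a recording by comparing each
# frame's RMS energy to a noise threshold. Real pipelines work on
# hydrophone audio and far more data; all numbers here are invented.
import math

def frame_rms(samples, frame_size):
    """Split `samples` into frames and return the RMS energy of each."""
    rms = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        rms.append(math.sqrt(sum(x * x for x in frame) / frame_size))
    return rms

def detect_calls(samples, frame_size=4, threshold=0.5):
    """Return the indices of frames whose energy exceeds the threshold."""
    return [i for i, e in enumerate(frame_rms(samples, frame_size))
            if e > threshold]

# Toy signal: quiet background with one loud burst in the middle.
signal = [0.01, -0.02, 0.01, 0.0,   # frame 0: background noise
          0.9, -0.8, 0.85, -0.9,    # frame 1: a loud "call"
          0.02, -0.01, 0.0, 0.01]   # frame 2: background noise

print(detect_calls(signal))  # → [1]
```

Only the segments flagged this way would then be passed on to the machine-learning stage for analysis.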

The Earth Species project: using AI to decode non-human language

The Earth Species project brings together different approaches with a common goal: to communicate with animals in their own language. And this doesn’t always just mean making sounds.

Depending on the animal species, the approaches vary widely, and many also take in body language, which is how many species exchange information. Bees, for example, communicate partly through complex dances, which they use to describe routes to food sources. Besides barking, dogs also use posture and facial expressions to communicate. Even the songs of many bird species are sophisticated communication systems waiting to be deciphered.

The Earth Species project aims to decode this non-human communication using AI. The non-profit organisation brings together researchers from various disciplines who believe that understanding these languages will transform our relationship with the rest of nature.

CETI: understanding the language of whales

In 1966, a US Navy employee accidentally recorded whale songs. Released as the album Songs of the Humpback Whale, the recordings sold millions of copies and were even played before the UN General Assembly. This created new awareness of the giants of the sea, and in 1982 commercial whaling was finally banned internationally. Songs that merely resemble human music, without being understood, led to better protection for the animals. What difference could it make if we actually understood animal languages?

Project CETI uses machine learning and robotics to listen to and translate sperm whale communications.
Source: Alex Boersma

CETI project lead David Gruber is working on exactly this with a team of world-leading experts: artificial intelligence and natural language processing specialists, cryptographers, linguists, marine biologists, robotics specialists and underwater acousticians. The goal is to understand how sperm whales communicate, using a system that can also serve as a template for decoding other animal languages. Incidentally, the name CETI is a nod to the SETI Institute, an NGO that searches for intelligent extraterrestrial life.

This is how CETI – the research project to decode whale language – works: by integrating biology, linguistics, robotics and machine learning, extensive communication data is collected and a model of whale language is developed using interactive experiments.
Source: Alex Boersma

In the video, CETI founder David Gruber explains how the project came about and which system will be used to decode the language of sperm whales:

Zoolingua: a translation app for dogs

Similar to the CETI project, «Zoolingua» also aims to use new technologies to create understanding and the means to talk to animals. The project is still in its early stages and is initially focusing on one species: dogs.

Will we soon be able to communicate with our dogs via the Zoolingua app? Animal behaviour researcher Dr Con Slobodchikoff believes it’s entirely possible thanks to new technologies and is working with his team on developing one such app.
Source: SG SHOT/Shutterstock

This idea builds on the work of Dr Con Slobodchikoff, an animal behaviourist and conservation biologist who has been studying the social behaviour and communication systems of prairie dogs since the mid-1980s. Through sophisticated experiments, he decoded the structure and meaning of the animals’ alarm calls, showing that animals have language suited to their needs, just as our language suits ours.

The intended result of the current research is an app that translates a dog’s facial expressions, vocalisations and actions into language. Owners film their dog when they notice it wants to communicate something. The video is then uploaded via the app and analysed by AI, which reveals what the dog wants to say.
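The final translation step of such an app can be imagined as mapping recognised cues to candidate messages. Zoolingua’s actual model is not public, so the sketch below is purely hypothetical: the cue labels and messages are invented here, and a real system would infer the cues from video rather than take them as strings.

```python
# Purely hypothetical sketch of an app's translation step: map a few
# hand-labelled cues (all invented here) to candidate dog "messages".
CUE_MESSAGES = {
    ("whine", "at the door"): "I want to go out.",
    ("bark", "at the food bowl"): "I'm hungry.",
    ("growl", "ears back"): "I'm scared or uncomfortable.",
}

def translate_dog(vocalisation, context):
    """Look up a message for a (vocalisation, context) pair of cues."""
    return CUE_MESSAGES.get((vocalisation, context),
                            "Unclear - more video needed.")

print(translate_dog("whine", "at the door"))  # → I want to go out.
```

The hard part, of course, is everything this sketch skips: reliably extracting those cues from raw video in the first place.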

In the video below, Dr Slobodchikoff explains what «Zoolingua» aims to achieve:

A dancing bee robot can reveal the way to food

But the decoding of animal languages doesn’t stop at mammals. Researchers at TU Berlin are working on decoding the dance of bees in order to better understand the collective intelligence of these social insects. With a sophisticated dance, honey bees can tell other bees exactly where to find a food source, indicating both its direction and its distance. Bees that accept the message first imitate the dance and then set off along the route described. The research team took advantage of this to communicate with the bees.

When a bee finds a suitable food source, it communicates the location to the other bees through its dance. The robot bee at TU Berlin can also communicate directions through its dance.
Source: Sushaaa/shutterstock

After deciphering exactly how the bees’ «waggle dances» describe a route, the researchers had a small «RoboBee» perform such a dance to point the honey bees down a path. And it worked: just as with a message from a live bee, some of the animals imitated the RoboBee’s dance and flew to the specified location. The bees had understood the robot bee’s directions. Why only some of the bees ever fly off – regardless of whether a bee or a RoboBee transmits the message – still needs to be researched.
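The dance code itself can be sketched roughly: the angle of the waggle run relative to «up» on the vertical comb corresponds to the flight direction relative to the sun, and the duration of the run grows with distance. The ~1 second ≈ 1 km calibration below is a rough, species-dependent assumption for illustration, not an exact constant from the TU Berlin work.

```python
# Rough sketch of the waggle dance code: run angle relative to vertical
# maps to a bearing relative to the sun; run duration maps to distance.
# The metres-per-second calibration is an assumed, approximate value.
def decode_waggle(run_angle_deg, run_duration_s, sun_azimuth_deg,
                  metres_per_second=1000.0):
    """Translate one waggle run into a compass bearing and a distance."""
    bearing = (sun_azimuth_deg + run_angle_deg) % 360  # 0 deg = toward sun
    distance_m = run_duration_s * metres_per_second
    return bearing, distance_m

# A run angled 40 deg right of vertical, lasting 0.5 s, sun at 180 deg:
bearing, distance = decode_waggle(40.0, 0.5, 180.0)
print(bearing, distance)  # → 220.0 500.0
```

A RoboBee only needs to perform a dance with the right angle and duration for this code to be read by the hive – which is exactly what the experiment exploited.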

In the video below, you can see the dancing bee robot and find out more about the research:

Speaking animal languages with AI: possibilities, limitations and concerns

There’s rapid progress in developing new technological methods to understand animal languages and to communicate with animals in their own language. But an all-powerful translation tool is unlikely ever to be created, simply because animals have everyday realities that differ from our human view of the world. Ultimately, humans and animals can only communicate about what is perceptible to both species. A chat about the meaning of life with the goldfish at the breakfast table isn’t a realistic prospect. Finding out more about the fears and wishes of a dog or cat through a reliable translation app is far more likely.

Increasing our understanding of different animal species, and of what goes on within them, can open up opportunities for animal conservation. But the new means of communication also harbour dangers – for example, if fishermen or poachers deliberately use animal languages to lure animals into traps, not to mention past attempts to use certain species for military purposes. There’s also the risk of humans using AI to talk to animals before it’s really understood what exactly is being said. So far, the research has been carried out by non-profit organisations that are also committed to protecting animals. To prevent misuse, there are already calls for guidelines on handling these new means of communication, so that the animal world isn’t put at a disadvantage.

Header image: Aleksey Boyko/Shutterstock




Science editor and biologist. I love animals and am fascinated by plants, their abilities and everything you can do with them. That's why my favourite place is always outside - somewhere in nature, preferably in my wild garden.

