In the children’s book “The Story of Doctor Dolittle” by Hugh Lofting, the good doctor is able to penetrate the mysteries of the natural world and human history by talking to animals. It’s never explained exactly how he is able to do this, but it does lead him on many adventures, spawning decades of books, films, and even stage productions.
Now, it seems that data scientists are trying to open similar communications channels with the non-human fauna of planet Earth – using artificial intelligence (AI) as their guide.
Several projects are underway to build machine learning and other intelligent models that can decipher animal sounds – everything from whale songs to chicken clucks – to see whether those sounds can be interpreted in light of the animals’ behavior.
To date, there have been no breakthroughs of note, and the idea of actually converting human thoughts and expressions into animal “language” is still rather far-fetched.
But, researchers are nonetheless hopeful that by simply fostering a greater understanding of why animals do what they do, conservation and preservation programs can be made more effective.
A Whale of a Conversation
One of the leading programs is the Cetacean Translation Initiative, aka Project CETI. The group operates out of the Caribbean island of Dominica, where it has set up a massive network of underwater recording devices and has even miked up a number of whales themselves, all in order to collect the data that will be used to train machine learning algorithms in whale-speak.
According to the New Yorker, sperm whales, in particular, have been known to relay numerous kinds of vocalizations for hours at a time, often in repetitive patterns called codas. These patterns differ from one region to another, like human languages do, and baby whales tend to utter random sounds before they, too, start expressing codas. Project CETI has already identified a number of these codas, but whether any of them qualifies as an actual language is unclear.
Another effort, the Earth Species Project, is looking to expand the dialogue to a wider range of animals, including monkeys, birds, and elephants. The idea, says Wired, is that the more we humans understand the natural world, the more awareness we will have of our collective impact, which has already diminished wildlife populations by roughly 70 percent over the past half-century.
The group is banking on the idea that new methods of AI translation between human languages might forge a link to animal sounds. Earlier translation systems took a semantic, word-for-word approach to bridge the gap between, say, French and English – essentially looking up words in a dictionary at high speed. But that approach produced marginal results, at best.
Newer models take a geometric approach instead: each language is mapped into a high-dimensional vector space, and the shapes those spaces form overlap closely enough that one can be aligned onto another, producing far more accurate translations. If this can improve communications between humans who speak different languages, then why not between humans and animals?
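The geometric idea can be sketched in a few lines of code. The following is a toy illustration, not any project’s actual method: it assumes the two “languages” share exactly the same geometry (one embedding space is literally a rotation of the other, which real embeddings only approximate), then uses orthogonal Procrustes alignment to recover that rotation from paired examples, after which “translation” reduces to a nearest-neighbour lookup.

```python
import numpy as np

# Toy "embeddings": each language maps concepts into its own vector space.
# Assumption: the geometric relationships between concepts are preserved
# across languages, so a rotation can align one space onto the other.
rng = np.random.default_rng(0)
concepts = ["water", "food", "danger", "friend", "home"]
src = rng.normal(size=(5, 3))            # source-language embeddings

# The "other language" is the same geometry under an unknown rotation.
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
tgt = src @ R_true.T                     # target-language embeddings

# Orthogonal Procrustes: recover the rotation from the paired examples.
U, _, Vt = np.linalg.svd(tgt.T @ src)
R_est = U @ Vt

# Translating a word = mapping its source vector into the target space,
# then finding the nearest neighbour among the target concepts.
query = src[2] @ R_est.T                 # map "danger" across
match = concepts[int(np.argmin(np.linalg.norm(tgt - query, axis=1)))]
print(match)                             # -> "danger"
```

The key assumption – that different languages carve up meaning into similar geometric shapes – is exactly what makes the human-to-animal leap speculative: nothing guarantees that whale codas occupy a space with any overlap at all.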
Already, the group has made progress on one of the thornier problems in deciphering animal sounds: determining who is speaking at what time. Whether it’s dolphins at sea or monkeys in the forest canopy, there tends to be a lot of jabbering at once. To counter this, the team has built a neural network that can separate overlapping sounds, although it remains to be tested in the field.
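The separation problem itself can be illustrated without a neural network. The sketch below is a deliberately crude stand-in: it mixes two synthetic “callers” that happen to occupy different frequency bands and separates them with a fixed frequency-domain mask. Learned separators do something analogous but estimate the masks from data, so overlapping callers need not be band-separated.

```python
import numpy as np

# Two "animals" vocalising at once, in different frequency ranges.
t = np.linspace(0, 1, 8000, endpoint=False)   # 1 s at 8 kHz
low = np.sin(2 * np.pi * 220 * t)             # caller A: 220 Hz
high = np.sin(2 * np.pi * 1760 * t)           # caller B: 1760 Hz
mix = low + high                              # what a hydrophone hears

# Separate in the frequency domain with a hand-built 1 kHz cutoff mask –
# the crude analogue of the masks a neural separator would learn.
spec = np.fft.rfft(mix)
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
low_rec = np.fft.irfft(np.where(freqs < 1000, spec, 0), n=t.size)
high_rec = np.fft.irfft(np.where(freqs >= 1000, spec, 0), n=t.size)

print(np.allclose(low_rec, low, atol=1e-6))   # True: caller A recovered
print(np.allclose(high_rec, high, atol=1e-6)) # True: caller B recovered
```

Real field recordings are far messier – callers overlap in frequency as well as time – which is why a learned, data-driven separator is needed rather than a fixed filter like this one.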
In Japan, meanwhile, research is underway to devise an algorithm that can assess the mental state of chickens based on the way they cluck. At the University of Tokyo, AI models based on existing Deep Emotional Analysis Learning (DEAL) techniques are being used to track the changing vocal patterns of ordinary chickens, to see whether those patterns reveal that the birds are contented, stressed, hungry, or experiencing any number of other physical or emotional states.
The team, which consists of data scientists, animal psychologists, veterinarians, and others, says it has achieved a high success rate, although its findings have not yet been peer-reviewed. If validated, though, the program could produce knowledge that can be used to guide care and feeding practices, housing, and a range of other factors.
At the moment, all of these projects are targeted at improving our understanding of animals and their habitats, with the final aim of alleviating threats to their survival. Understanding their sounds might provide clues as to why dolphins and whales beach themselves, for example, or why groups of chimpanzees wage war on each other.
But there is no reason why this cannot be used to increase the exploitation of animals as well. The more we know about animal moods and emotions, the better we can manipulate them and their environments to, say, increase egg or milk production.
Taking it a step further, researcher and author Karen Bakker noted to Scientific American recently that commercial fishing and even whaling, which is still legal in some parts of the world, could be made more effective if we knew what these animals were thinking and could signal them to move from one area to another. There is also the possibility that new ideas or memes could be introduced to entire animal populations, with unknown consequences for their behavior and social structures.
With great power comes great responsibility, of course. As governments across the globe look for ways to regulate AI, perhaps equal attention should be paid to its impact on the natural world as on the human one.
But given that there is little consensus, if any, on how AI should be governed at all, there is not likely to be much movement on its applications in deciphering the animal mind. With luck, however, it will be a while before any of this research bears actual fruit, and, intriguing as the prospect sounds, some rules will be in place before we, like Dr. Dolittle, can actually talk to the animals.