By James Myers
The race to translate animal sounds into human language, Arik Kershenbaum wrote in Wired, would answer “a question that has puzzled humans as long as we have existed: ‘What are animals saying to each other?’” And many of us especially wonder: what are the animals saying about us?
Machine learning, AI, large language models (LLMs) like ChatGPT, and other techniques are supercharging technology’s ability to capture patterns in animal vocalizations, and those patterns provide important clues to the answer. The first scientist to “crack the code” will receive not only fame but also a half-million-dollar prize.

Dr. Arik Kershenbaum. Image: University of Cambridge
As Dr. Kershenbaum, a zoologist at the University of Cambridge who studies wolves, gibbons, and dolphins, explained to our podcast, The Quantum Feedback Loop, animals are “not just automata, but living species with communicative intelligence not unlike our own.” His recent book, Why Animals Talk, describes a vast and fascinating array of information that animals are exchanging all around us.
Dr. Kershenbaum told The Guardian that there is a “weird split personality” on the question of solving the age-old puzzle of what animals are saying. “On the one hand, we want animals to talk; but on the other, we’re scared of animals talking because that would mean we’re not quite as special as we thought.”
Passive acoustic localization technology is quickly bringing humanity closer to knowing what animals are saying.
The technology uses animals’ sounds to calculate the positions of individuals and groups as they move around, applying the physics and geometry of sound travelling to multiple microphones. As Dr. Kershenbaum explained to The Guardian, zoologists and other scientists use the technology to “triangulate the position of animals using their sounds, which means we don’t need to collar them, or even see them. As long as they’re calling, we know where they are.”
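To make the idea concrete, here is a minimal sketch of time-difference-of-arrival (TDOA) localization; the microphone layout, the grid search, and the flat two-dimensional geometry are all simplifying assumptions, and real systems must also handle noise, echoes, and uncertain sound speeds.

```python
# A minimal sketch of locating a calling animal from the arrival times
# of its sound at several microphones (all values hypothetical).
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air; roughly 1,500 m/s underwater

def locate(mic_positions, arrival_times, grid_size=400, extent=100.0):
    """Grid-search the 2D point whose predicted time differences of
    arrival best match the measured ones."""
    mics = np.asarray(mic_positions, dtype=float)   # shape (M, 2)
    t = np.asarray(arrival_times, dtype=float)      # shape (M,)
    xs = np.linspace(-extent, extent, grid_size)
    ys = np.linspace(-extent, extent, grid_size)
    X, Y = np.meshgrid(xs, ys)
    # Distance from every grid point to every microphone.
    d = np.sqrt((X[..., None] - mics[:, 0])**2 + (Y[..., None] - mics[:, 1])**2)
    # Predicted and measured arrival-time differences, relative to mic 0.
    predicted = (d - d[..., :1]) / SPEED_OF_SOUND
    measured = t - t[0]
    err = ((predicted - measured)**2).sum(axis=-1)
    iy, ix = np.unravel_index(np.argmin(err), err.shape)
    return xs[ix], ys[iy]

# Example: a wolf howling at (30, 40) metres, heard by four microphones.
mics = [(0, 0), (80, 0), (0, 80), (80, 80)]
true_pos = np.array([30.0, 40.0])
times = [np.linalg.norm(true_pos - np.array(m)) / SPEED_OF_SOUND for m in mics]
print(locate(mics, times))  # approximately (30, 40)
```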
The technology has helped to pinpoint signature whistles in dolphins: each individual dolphin develops a unique whistle that identifies it, much as a name does. The ability to connect the sounds of individuals in motion, from specific locations at specific times and in specific groups, provides a major boost to scientists’ quest to crack the code of animal talk.
The technology also provides information on the different types of sounds that dolphins make. For example, bottlenose dolphins use clicks for echolocation to sense their environment, and both burst pulse calls and whistles for social communication; burst pulses appear more often in aggressive interactions, while whistles tend to be friendlier.
BBC Earth’s feature on dolphins highlights their complex behavior and communication.
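As a toy illustration only (not a real bioacoustics model), those three categories can be separated by two simple acoustic features, duration and spectral flatness; the thresholds below are invented for the example, and real classifiers are trained on far richer features.

```python
# A toy rule-of-thumb classifier for the three broad bottlenose dolphin
# sound types; thresholds are illustrative assumptions, not measured values.
import numpy as np

def spectral_flatness(waveform):
    """Geometric mean over arithmetic mean of the magnitude spectrum:
    near 1 for broadband noise/clicks, near 0 for tonal whistles."""
    spectrum = np.abs(np.fft.rfft(waveform)) + 1e-12
    return np.exp(np.mean(np.log(spectrum))) / np.mean(spectrum)

def classify_call(waveform, sample_rate):
    duration = len(waveform) / sample_rate
    flatness = spectral_flatness(waveform)
    if duration < 0.005 and flatness > 0.5:
        return "echolocation click"   # very brief and broadband
    if flatness > 0.3:
        return "burst pulse"          # longer broadband click train
    return "whistle"                  # sustained and tonal
```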
There are other animal communication technologies in use or under development, with the help of researchers in the fields of bioacoustics (which studies sounds made by living organisms) and ecoacoustics (which studies the sounds of entire ecosystems).
Miniaturization has allowed scientists to attach listening devices to animals as small as honeybees, and remote sensing technologies have made it possible to transmit animal vocalizations to scientists around the globe from microphones and drones in the animals’ natural habitats. In her book The Sounds of Life: How Digital Technology Is Bringing Us Closer to the Worlds of Animals and Plants, former University of British Columbia Geography professor Dr. Karen Bakker (1971-2023) wrote, “Combined, these digital devices function like a planetary-scale hearing aid: enabling humans to observe and study nature’s sounds beyond the limits of our sensory capabilities.”
Reuters reports that Baidu, the company that owns China’s largest search engine, has filed a Chinese patent application for a system to convert animal vocalizations into human language. Still in the research phase, the system “will collect animal data, including vocal sounds, behavioural patterns, and physiological signals, which will be preprocessed and merged before an AI-powered analysis designed to recognise the animal’s emotional state. The emotional states would then be mapped to semantic meanings and translated into human language.”
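Taking the Reuters description at face value, the pipeline might be sketched as below; every class name, feature, and mapping here is an illustrative assumption, not Baidu’s actual design.

```python
# A hypothetical sketch of the patent's described stages: merge multimodal
# data, classify an emotional state, then map it to a human phrase.
from dataclasses import dataclass

@dataclass
class AnimalObservation:
    vocal_features: list[float]          # e.g. pitch, duration, energy
    behavior_features: list[float]       # e.g. posture, movement
    physiological_features: list[float]  # e.g. heart rate

# Assumed emotional states and phrasings, purely for illustration.
EMOTION_TO_PHRASE = {
    "contentment": "I am relaxed.",
    "alarm": "Something is threatening me.",
}

def classify_emotion(obs: AnimalObservation) -> str:
    """Stand-in for the AI-powered analysis stage; a real system would
    use a model trained on labelled multimodal data."""
    merged = (obs.vocal_features + obs.behavior_features
              + obs.physiological_features)
    # Placeholder rule: high average arousal reads as alarm.
    return "alarm" if sum(merged) / len(merged) > 0.7 else "contentment"

def translate(obs: AnimalObservation) -> str:
    return EMOTION_TO_PHRASE[classify_emotion(obs)]
```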
Applications like Avisoft and DeepSqueak detect patterns in the ultrasonic vocalizations of mice, at frequencies higher than the human ear can detect. Designed by Dr. Kevin Coffey, a behavioral neuroscientist at the University of Washington, DeepSqueak has since been adapted to detect patterns in the vocalizations of marine mammals.
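At its simplest, detecting such calls means watching for energy in the ultrasonic band of a spectrogram, as in the sketch below; the band limits and threshold are assumptions, and DeepSqueak itself goes much further, applying neural-network detection to spectrogram images.

```python
# A minimal band-energy detector for ultrasonic mouse calls, which
# typically fall around 30-110 kHz; assumes a high sample rate
# (e.g. 250 kHz) and illustrative threshold values.
import numpy as np
from scipy import signal

def detect_usv_segments(audio, sample_rate, band=(30_000, 110_000),
                        threshold_db=-50.0):
    """Return (start, end) times where ultrasonic-band energy exceeds
    a threshold."""
    freqs, times, spec = signal.spectrogram(audio, fs=sample_rate,
                                            nperseg=512, noverlap=256)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    power_db = 10 * np.log10(spec[in_band].sum(axis=0) + 1e-12)
    active = power_db > threshold_db
    # Collapse consecutive active frames into (start, end) intervals.
    segments, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = times[i]
        elif not on and start is not None:
            segments.append((start, times[i]))
            start = None
    if start is not None:
        segments.append((start, times[-1]))
    return segments
```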
Since 2020, researchers at the non-profit Project CETI have been interpreting whale vocalizations using statistical analysis and other methods.
In its 2024 annual report, Project CETI describes “significant strides in our research, discovering that sperm whales possess a phonetic alphabet and are capable of social learning, encouraging us to rethink our understanding and relationship to the nonhuman world. We’ve also continued to develop our scientific technology, including custom bio-inspired suction cups for our tags, custom drones and development of a new custom glider system to support our data collection efforts.”
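One small statistical step behind such findings can be sketched simply. Sperm whale codas are short sequences of clicks, and published coda analyses describe features such as tempo (overall duration) and rhythm (normalized inter-click intervals); the clustering choice and click times below are illustrative assumptions, not Project CETI’s actual methods.

```python
# Grouping codas (click sequences) by rhythm; the example click times
# and the use of k-means are assumptions for illustration.
import numpy as np
from scipy.cluster.vq import kmeans2

def coda_features(click_times):
    clicks = np.sort(np.asarray(click_times, dtype=float))
    icis = np.diff(clicks)            # inter-click intervals
    tempo = clicks[-1] - clicks[0]    # overall coda duration
    rhythm = icis / icis.sum()        # duration-normalized rhythm
    return tempo, rhythm

# Three hypothetical five-click codas (times in seconds).
codas = [
    [0.0, 0.20, 0.40, 0.60, 0.90],
    [0.0, 0.21, 0.41, 0.62, 0.93],
    [0.0, 0.10, 0.20, 0.60, 0.90],
]
rhythms = np.array([coda_features(c)[1] for c in codas])
centroids, labels = kmeans2(rhythms, k=2, minit="++", seed=0)
print(labels)  # e.g. [0, 0, 1]: the first two codas share a rhythm type
```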
In 2020, Project CETI received $35 million in funding from a TED Audacious Project grant for a five-year research project to decode the communications of whales with advanced machine learning and state-of-the-art robotics. The organization has also partnered with New York University School of Law’s More Than Human Life (MOTH) Project with the goal of establishing legal and ethical principles for nonhuman animal communication technologies.

Project CETI researchers use underwater hydrophone arrays, swimming robotic fish, and other methods to collect data on whale vocalizations. Image from Project CETI, featuring founder David Gruber in the foreground.
A paper entitled What If We Understood What Animals Are Saying? The Legal Impact of AI-Assisted Studies of Animal Communication, published in April in Ecology Law Quarterly, sets out some legal and ethical implications of understanding communications between whales and other animals. For example, understanding the vocalizations of marine mammals like whales and dolphins could help with enforcement of existing laws against underwater noise pollution, a chronic problem caused by shipping and other human industrial activity in the water.
Knowing what animals are saying would likely provide a significant impetus to protect endangered species and reduce animal suffering from human activities. The paper raises the possibility that such knowledge could also bring into focus the question of individual animals’ legal rights and, possibly, whether animals should be accorded legal personhood.
In an interview with Vox, Karen Bakker provided an example of a looming ethical issue as we gain the ability to control animals with sounds.
Bakker stated, “A research team in Germany encoded honeybee signals into a robot that they sent into a hive. That robot is able to use the honeybees’ waggle dance communication to tell the honeybees to stop moving, and it’s able to tell those honeybees where to fly to for a specific nectar source. The next stage in this research is to implant these robots into honeybee hives so the hives accept these robots as members of their community from birth. And then we would have an unprecedented degree of control over the hive; we’ll have essentially domesticated that hive in a way we’ve never done so before. This creates the possibility of exploitive use of animals. And there’s a long history of the military use of animals, so that’s one path that I think raises a lot of alarm bells.”
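For context, the waggle dance itself is a well-studied code: the angle of the waggle run on the vertical comb corresponds to the food source’s direction relative to the sun, and the waggle duration grows with distance. A robot like the one Bakker describes needs an encoder along these lines; the calibration constant below is a rough, illustrative approximation that varies between studies and bee populations.

```python
# A sketch of encoding a target into waggle dance parameters; the
# milliseconds-per-metre calibration is an illustrative approximation.
MS_PER_METRE = 1.0  # assumed rate; real calibrations vary by colony

def waggle_dance_parameters(target_bearing_deg, sun_bearing_deg, distance_m):
    """Map a food source's bearing and distance to dance parameters."""
    # Angle of the waggle run, measured from straight up on the comb.
    run_angle_deg = (target_bearing_deg - sun_bearing_deg) % 360
    waggle_duration_ms = distance_m * MS_PER_METRE
    return run_angle_deg, waggle_duration_ms

# A nectar source 500 m away, 40 degrees clockwise of the sun:
print(waggle_dance_parameters(130.0, 90.0, 500.0))  # (40.0, 500.0)
```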
Becoming better stewards of the Earth’s environment is a key reason for learning to speak with animals.
As Dr. Kershenbaum explained to The Guardian, “Aside from AI, we are in an environmental crisis point. Increasing awareness of what our fellow creatures on the planet are going through, and how they’re changing, would be helpful. This is a time when we should be paying a lot more attention to what nature is saying, and you can’t do that if you’re just listening for what you want to hear, which is what we’ve been doing until now.”
Because they have limited ability to protect themselves, animals can suffer more severe effects from the environmental crisis than humans do. Many organizations are working to protect animals against human-induced climate change, and understanding the reactions of animals is providing a significant boost to those efforts.
Rapid advancements in animal communication technologies are driving initiatives like ElephantVoices, whose mission is to “advance the study of elephant cognition, communication and social behavior, and to promote the scientifically sound and ethical management and care of elephants.” ElephantVoices, which aims to collaborate with Project CETI, has set goals for 2025 that include adding to its database, which at the end of 2024 held 10,400 records of calls from identified individual elephants in known circumstances.
Each record includes nearly 100 data fields, and an additional 1,500 records await processing. ElephantVoices co-founder Joyce Poole notes that elephants use 250 sounds and gestures to communicate with each other, and correlating these with specific individuals and circumstances is a primary aim of the database.
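As a purely hypothetical sketch (the real schema, with its nearly 100 fields, is far richer), one record correlating a call with an individual and its circumstances might look like this:

```python
# An invented illustration of a call record; field names are assumptions,
# not the actual ElephantVoices database schema.
from dataclasses import dataclass, field

@dataclass
class ElephantCallRecord:
    call_id: str
    caller_id: str                   # identified individual elephant
    call_type: str                   # one of the catalogued sounds/gestures
    recorded_at: str                 # timestamp of the recording
    location: tuple[float, float]    # latitude, longitude
    behavioral_context: str          # e.g. "greeting", "alarm"
    group_members: list[str] = field(default_factory=list)  # others present
    audio_file: str = ""
```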
In a video recorded for the TED Audacious Project, marine biologist and Project CETI founder and project leader David Gruber states, “We’re in a situation now where our technology is pushing us against the wall of potentially driving us towards extinction, so there really is a need now more than ever to connect and learn from nature, and really this is a human journey into our connection with this planet and who we are. We’re not the only intelligent species here.”
Many advancements in interpreting animal vocalizations have been made since TQR published How Many Languages Are There? Scientists Shed Light on What Animals Other Than Humans are Saying in March 2024. With the help of new technologies, science is rapidly nearing an answer to the age-old question of what animals are saying, and perhaps the long-sought ability to communicate with them.
What would happen, for example, if we develop a translator app that could tell us what our pets are saying? What if we had a dictionary for elephant-speak or whale-talk? Would we ever see these animals the same way we do now, or would realizing that we share a planet with a vast array of communicating animals humble us?
Craving more information? Check out these recommended TQR articles.
- Thinking in the Age of Machines: Global IQ Decline and the Rise of AI-Assisted Thinking
- Everything Has a Beginning and End, Right? Physicist Says No, With Profound Consequences for Measuring Quantum Interactions
- Cleaning the Mirror: Increasing Concerns Over Data Quality, Distortion, and Decision-Making
- Not a Straight Line: What Ancient DNA Is Teaching Us About Migration, Contact, and Being Human