News · 2025-01-03
Each animal has its own unique story.
Shane Gero, a whale biologist at Carleton University in Canada, has spent 20 years trying to understand how whales communicate.
For example, sperm whales (Physeter macrocephalus) from the same family make characteristic sounds, while whales from different regions have their own "dialects."
Dolphins' whistles, elephants' rumbles, and birds' trills all have specific patterns and structures.
These subtleties can be difficult for humans to recognize and understand, but finding patterns is where AI excels.
Over the past year, AI has been helping researchers "decode" these sounds in nature.
Sperm whales gather in clans, each with its own diet, social behavior, and habitat. A clan can contain thousands of whales, with each family headed by a female.
The whales spend most of their time hunting in the depths of the ocean, as much as 2 kilometers below the surface, where sunlight cannot reach. They find their prey by echolocation.
At the surface, where echolocation is not needed, they keep in touch with other whales using series of clicks called codas, each consisting of between 3 and 40 clicks.
Whales from different clans use different rhythms and pauses in their codas, and these "dialects" mark the cultural boundaries between clans.
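The rhythm idea can be sketched in a few lines. The data below are invented for illustration, and this is not Gero's actual analysis: a coda is summarized by its inter-click intervals, and normalizing them by total duration gives a rhythm "fingerprint" that differs between stylized clan dialects.

```python
# Illustrative sketch with hypothetical click times (seconds), not real
# recordings: describe a coda by its normalized inter-click intervals.

def rhythm(click_times):
    """Each gap between clicks, as a fraction of the coda's total duration."""
    icis = [b - a for a, b in zip(click_times, click_times[1:])]
    total = sum(icis)
    return [round(i / total, 3) for i in icis]

# Two made-up codas with the same click count but different rhythms.
regular = [0.0, 0.2, 0.4, 0.6, 0.8]       # evenly spaced clicks
plus_one = [0.0, 0.15, 0.30, 0.45, 0.95]  # "4 + 1" style: long final pause

print(rhythm(regular))   # [0.25, 0.25, 0.25, 0.25]
print(rhythm(plus_one))  # [0.158, 0.158, 0.158, 0.526]
```

Two codas with identical click counts can still carry different dialect signatures, which is why rhythm and tempo, rather than raw click counts, are the features of interest.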
In the Caribbean, Gero and his colleagues spent thousands of hours collecting data from more than 30 whale families living nearby.
To study the rhythm and tempo of the codas, the team manually created spectrograms of the whale recordings, visualizing features such as amplitude and frequency.
Gero says the task is time-consuming; handing it to a machine-learning algorithm speeds things up considerably and also helps distinguish which sounds come from which animals.
Artificial intelligence also lets the research go further. Manual analysis can essentially only classify individual codas, the equivalent of single words, while AI can process the equivalent of a sentence or even an entire conversation's worth of codas. "Machine learning is very good at finding patterns that are difficult to capture with standard statistical methods."
The researchers assembled a dataset of 8,719 codas and, with the help of AI, identified a "sperm whale phonetic alphabet" that could serve as a basis for sharing complex information between whales.
Sperm whales aren't the only creatures that use specific vocalizations to identify themselves. Mickey Pardo, a behavioral ecologist who worked at Colorado State University, used AI to discover that wild African elephants have their own names.
Elephants communicate with low rumbles, which differ by context: calling over a distance, greeting face to face, or parent-child interaction.
Pardo and his colleagues found that elephants respond to certain calls and ignore others, as though some calls were addressed to specific individuals.
The researchers trained the AI model to learn the acoustic characteristics of these "calls" and predict the recipient based on the characteristics of the new call.
In the end, the model matched calls to their intended receivers with 27.5 percent accuracy. While that may not seem like a good score, elephants don't address each other by name in every call, so beating chance at all is telling.
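The basic idea behind receiver prediction can be sketched with a toy nearest-centroid classifier. Everything below is invented for illustration (the feature names, the values, and the receiver labels), and it is not Pardo's actual model: learn an average acoustic-feature vector per receiver, then assign a new call to the closest one.

```python
# Hypothetical sketch of receiver prediction from acoustic features.
# All feature values and labels are invented for illustration.

def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    return [sum(col) / len(col) for col in zip(*vectors)]

def predict(call, centroids):
    """Return the receiver whose centroid is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: dist(call, centroids[name]))

# Invented training calls: (fundamental frequency, duration, roughness).
training = {
    "receiver_A": [[14.0, 2.1, 0.30], [15.0, 2.0, 0.28]],
    "receiver_B": [[22.0, 1.2, 0.55], [21.0, 1.4, 0.60]],
}
centroids = {name: centroid(calls) for name, calls in training.items()}

print(predict([14.5, 2.2, 0.31], centroids))  # receiver_A
```

With many possible receivers, random guessing would score far below 27.5 percent, which is why an accuracy that looks modest in isolation can still be strong evidence that calls carry receiver-specific information.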
Another animal found by AI to use name-like calls is the marmoset (Callithrix jacchus).
In addition to predicting names, Pardo is also trying to use AI to decode other "elephant words," such as location terms.
Elephants make special calls when urging companions to move toward a particular location. The model identified the meaning of these calls, and the researchers verified it by playing them back and observing where the elephants went.
In a separate study on elephants, Pardo found clear differences in the calls of elephants between two populations in Kenya.
This matters for conserving endangered species: individuals cannot simply be relocated into another population, because the newcomers may face a language barrier.
Elephant calls also carry information about sex, age, physiological condition, and more, which scientists can extract through passive acoustic monitoring to understand how a particular elephant is doing.
Caroline Casey, an animal behavioral ecologist at the University of California, showed in her doctoral thesis that elephant seals (Mirounga spp.) also announce themselves with individually distinctive calls.
Casey believes that using AI-based classifiers to interpret animal calls can reduce human bias in research, but at the same time, the value of human intuition should not be ignored.
"The human brain is able to integrate our understanding of how our own world works and use that to help interpret animal behavior."
Generalizing to crows
Machine learning expert Olivier Pietquin is the director of AI research at the Earth Species Project, where the team is currently using AI to decode animal species' communication.
Pietquin hopes to exploit neural networks' ability to generalize from one dataset to another, training models not only on large numbers of sounds from different animals but also on other acoustic data, including human speech and music.
"A computer can first learn some basic properties of sound before building the understanding needed to recognize the vocal features of a particular animal. It's the same way image-recognition algorithms trained on pictures of faces learn basic features from pixels."
From pixels, such a model first learns simple shapes like ellipses, then structures like eyes. So even with faces making up most of the training data, the model can still use that basic knowledge to recognize a cat's face.
"We can imagine using human speech data and hoping that it can be transferred to any other animal with vocal cords."
Models trained this way help identify which sounds convey information and which are just noise. Figuring out what the calls refer to, though, still requires humans to observe the animals' behavior and label what the computer picks out.
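The transfer idea above can be sketched very simply. The signals and features below are toy inventions, not the Earth Species Project's pipeline: a generic feature extractor, defined once, applies unchanged to sound from any domain, and only a small domain-specific classifier would need retraining on top of it.

```python
# Toy illustration (invented data) of feature reuse across domains:
# a domain-agnostic extractor computes the same summary for any waveform.

def features(waveform):
    """Generic acoustic summary: mean level, variance, zero-crossing rate."""
    n = len(waveform)
    mean = sum(waveform) / n
    var = sum((x - mean) ** 2 for x in waveform) / n
    zero_crossings = sum(
        1 for a, b in zip(waveform, waveform[1:]) if (a < 0) != (b < 0)
    )
    return [mean, var, zero_crossings / (n - 1)]

# The same extractor handles a "speech-like" and a "birdsong-like" toy
# signal; only the label-specific classifier on top would differ.
speech_like = [0.1, -0.2, 0.3, -0.1, 0.2, -0.3]
song_like = [0.5, 0.4, -0.5, -0.4, 0.5, 0.4]
print(features(speech_like))
print(features(song_like))
```

In a real system the shared layer would be a pretrained neural network rather than three hand-picked statistics, but the division of labor is the same: generic acoustic features learned once, a thin species-specific layer trained per task.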
Researchers at the Earth Species Project have built a neural network called Voxaboxen, which they are applying to the study of crow communication.
Unlike their counterparts elsewhere in Europe, carrion crows (Corvus corone) in northern Spain share the work of caring for their young: a group of crows takes turns guarding the nest, cleaning it, and caring for the chicks, tasks they must coordinate through vocal communication.
The researchers attached tags to the crows' tail feathers containing a tiny microphone, an accelerometer, and a magnetometer to record the birds' movements and calls. The tags collect data for about six days, then drop to the ground and emit a signal so that researchers can easily retrieve them and study the data.
Despite the examples of sperm whales, African savanna elephants, marmosets, elephant seals, and crows, it is still too early for AI to produce an "animal version of Google Translate."
There is still no agreement on whether animals are capable of communicating beyond a basic level, that is, whether they have anything like language at all.
Pardo says his main goal is not to be able to talk to wild animals and pets, but to understand their minds and how they see themselves and the world.
For example, the fact that some animals appear to have names implies they can recognize other individuals as distinct entities and assign labels to them, which suggests a capacity for abstract thought.