Advances in the field of machine learning have led to improved speech recognition technology, and taking on animal speech represents an entirely new horizon for AI translation technology.
Humans communicate among themselves through different languages, which can be understood only by those who know them. Electronics has made distant communication possible and carried it to today's remarkable level. Electronic translators for different languages are flooding the market, allowing any language to be rendered into a desired one. Scientists and technologists are even working to use electronics for intercepting and translating signals, if any, sent by aliens.
We are surrounded by, and associated with, many types of animals and birds. Communication with animals has always been an integral part of our civilisation, but it happens mostly through intuition and gestures. Humans can interpret the sounds produced by animals or birds as demands or as warnings.
There are many mythological stories about people understanding the language of animals and birds, whether through meditation or spiritualism, but the practice is uncommon and unscientific. Many traditional songs suggest that certain people are masters at understanding the language of crows. It is also believed that all living things other than humans possess a sixth sense with which they interpret the sounds made by different animals and birds. Without any scientific basis, however, these beliefs have been dismissed as superstition.
While living with animals and birds, humans may recognise the intentions, emotions or thoughts behind the sounds made by these animals and birds, even if the sounds themselves are not fully understood. These interpretations can be based on eye gaze, facial expression, vocalisation, body posture (including movement of bodies and limbs) and chemical signals (scent, pheromones and taste).
At a basic level, humans communicate with animals using vocalisation, hand signals, body postures and touch to express love and care. Artificial intelligence (AI) can help humans express such feelings more effectively. This may soon become a reality, as scientists are applying AI-based electronics to decode the different dialects of animals.
There have been continuous attempts to create translation devices that can interpret animal sounds and their meanings. While some such attempts have failed or managed to simply mimic animal sounds, others have seen some success, translating a few sounds.
Although these endeavours have been exciting and novel, they have not yet reached the ability to truly translate what is going on inside the minds of animals. Much of this technology is being used to study animals in the wild, but someday it may be parlayed into a device that lets humans talk to the animals around them.
Progress made towards animal communication
The man credited with leading the charge in decoding animal sounds, Dr Constantine Slobodchikoff, is an expert in animal referential communication, with prairie dogs as his model species. He has indexed the sounds made by prairie dogs and, using electronics with AI to maintain and extend the catalogue, synthesised their language into English, allowing the computer to learn on its own and translate the animals' talk to a great extent.
Dr Slobodchikoff does not hesitate to point out a bias that exists among biologists and linguists, who hold that animals can only signal on instinct and cannot express emotions or hold conversations. There is no universal agreement on a common level of consciousness across animal species, but scientists who studied the gorilla named Koko have shown that apes are more than capable of thought and feeling.
Over the years, researchers have recorded hundreds of hours of prairie dog calls using hidden microphones. A sophisticated combination of electronics and AI is used to analyse each recording by looking at how different frequencies and overtones stack on top of one another. This is how researchers have learned that the calls can be clustered into different groups, with each cluster having its own signature set of frequencies and tones.
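The clustering step described above can be sketched in code. The following is a minimal illustration, not the researchers' actual pipeline: synthetic "calls" stand in for recordings, their magnitude spectra serve as the frequency signatures, and a small k-means routine groups calls that share a signature. All names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000          # sample rate in Hz (assumed)
DUR = 0.25         # each synthetic call lasts 250 ms

def synth_call(freq):
    """A toy 'call': a tone plus mild noise, standing in for a field recording."""
    t = np.arange(int(SR * DUR)) / SR
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(t.size)

# Three hypothetical call types, distinguished by dominant frequency
calls = [synth_call(f) for f in [500] * 10 + [1200] * 10 + [2500] * 10]

# Feature vector: the normalised magnitude spectrum of each call
feats = np.array([np.abs(np.fft.rfft(c)) for c in calls])
feats /= feats.sum(axis=1, keepdims=True)

def kmeans(X, k, iters=10):
    """Tiny k-means with farthest-point initialisation (robust when
    clusters are well separated, as these synthetic spectra are)."""
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centroids], axis=0)
        centroids.append(X[np.argmax(d)])
    centroids = np.array(centroids)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(feats, k=3)
# Calls sharing a dominant frequency end up in the same cluster
```

Real call analysis works on far messier recordings and richer features (overtone structure, duration, modulation), but the principle is the same: each cluster is defined by its own signature set of frequencies.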
Dolphins have always been one of the most fascinating animals for scientists to study, as they are considered the second most intelligent animal on Earth, next to humans. The first effective animal translator was the Cetacean Hearing and Telemetry (CHAT) device, used to communicate with dolphins. Using this translator, a team of scientists studying a particular group of dolphins decoded whistles and their meanings, and CHAT translated the dolphins' whistles into words.
Further, a company is developing a program that pairs AI analysis software with CHAT to decipher what dolphins are saying. The software, built for decoding human language, is currently used to gather information about the type of emotion a speaker is expressing, and it has already mastered 40 human languages. Companies interested in this utility believe that decoding dolphin communication will extend this capability.
The same technology is also being used to translate the sounds of other animals. Animal sounds have distinct patterns and intonations tied to different concepts and words, and scientists feed these to a computer that maps out their meaning and context. So far, the program has been refined to the point of distinguishing hundreds of different gibbon calls.
Having an AI translator does not necessarily mean humans will be able to have a heart-to-heart conversation with their pets; there are vast differences between human and animal cognition, and humans are a long way from understanding the latter. However, this kind of communication technology could help humans better understand animals and their behaviour, which would mean more than just forging closer emotional ties with them.
Simple communication with animals could eliminate guesswork in caring for animals and even save their lives. Similarly, AI-based electronics communication technology could make things easier for farmers and ranchers, for instance, by quickly identifying animals that are sick by detecting signs of pain on their faces.
Role of AI
The growing success of neural networks in applications like speech recognition, vision and autonomous navigation has created great excitement, not just within the AI community but also among the general public. Over a relatively short period, AI has automated tasks that had resisted conventional approaches for decades, and some of these achievements approach human-level performance.
Under the newly acquired label of deep learning, much of machine learning research is being channelled into neural network research. Systems such as Google Translate have made major strides in human-language translation in recent years, using neural machine translation technology to support interpretation across 103 languages. Some recently developed earpieces can even translate between various human languages instantly.
These advances in machine learning open an entirely new horizon for translation: animal speech. New algorithms can learn to interpret languages on their own by analysing massive sets of data; however, it is still unclear whether such technology can really deliver accurate interpretation of animal communication.
Scientists and technologists are experimenting with AI-based electronic devices to decode and interpret animal sounds such as barks, growls or howls into a language that humans can understand. According to researchers, this is the result of combining the latest technologies in three different areas: electroencephalography (EEG) sensing, micro-computing and special brain-computer interface software. The system relies on sensors in a headset that detect electrical signals in the animal's brainwaves. An in-built processing device then analyses the signal patterns and deciphers them into distinct states such as anger, curiosity or tiredness.
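The last step of that pipeline, mapping signal patterns to coarse emotional states, can be sketched as follows. This is a simplified illustration, not the logic of any actual device: the sample rate, the frequency-band boundaries and the band-to-state mapping are all assumptions made here for the example (real systems would learn such mappings from training data).

```python
import numpy as np

SR = 256  # samples per second, a typical rate for consumer EEG headsets

# Illustrative frequency bands (Hz) and the states mapped to them here;
# slow-wave activity is loosely tied to drowsiness, faster rhythms to alertness
BANDS = {
    "tiredness": (1, 8),
    "calm":      (8, 13),
    "alertness": (13, 30),
}

def classify(window):
    """Return the state whose frequency band holds the most signal power."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2          # power per frequency bin
    freqs = np.fft.rfftfreq(len(window), d=1 / SR)       # bin centres in Hz
    power = {
        state: spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for state, (lo, hi) in BANDS.items()
    }
    return max(power, key=power.get)

# Synthetic one-second window dominated by a 10 Hz rhythm (8-13 Hz band)
t = np.arange(SR) / SR
window = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 20 * t)
state = classify(window)
```

A real brain-computer interface would work with multiple noisy electrode channels and a trained classifier rather than fixed thresholds, but the core idea of reducing brainwave patterns to a handful of interpretable states is the same.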
Scientists are working to develop technology that aims to distinguish canine thought patterns and then voice these as short sentences through a speaker, which could soon allow dogs to communicate with humans. Even though brainwaves vary across breeds as well as between individual dogs, it is possible to detect some common patterns.
With the growth in technology, there is no doubt that this will, in the future, open an era of communication between animals and humans. How exactly sensors can be attached to an animal's brain is yet to be worked out, and such issues, along with ethical and social concerns, mean there is much more research to be done before the technology becomes commercially available. Still, AI-based electronic translation capable of interpreting at least a dog's language for human understanding could become a reality within the next ten years.
Dr S.S. Verma is professor at Department of Physics, Sant Longowal Institute of Engineering and Technology, Sangrur, Punjab