Whenever we use a device with a powerful algorithm that corrects our spelling and suggests how to finish our sentences, there is an artificial intelligence behind it, constantly improving and learning more and more about language. The structures of sentences are analyzed, the chosen words understood, the languages recognized.
This same ability could, in 2020, give us the first signs of a new one: the ability to talk to the large animals beyond humans. Don't click away, I really mean it. I'm not talking about communicating with animals in the traditional way, with strokes and coos, and I'm not Amelia Kinkade*. Perhaps this ability will come to the fore even faster than brain-computer interfaces like Neuralink and others (except perhaps CTRL-LABS, which I think will be ready first).
The language-decoding capabilities of artificial intelligence have advanced to the point where they can begin to analyze even dead languages.
MIT and Google researchers have recently applied these skills with moderate success to ancient languages: Linear B, and Ugaritic, a precursor of Hebrew and the first known alphabet. No luck so far, however, with the older and still undeciphered Linear A.
How does AI understand ancient languages?
First, word-to-word relationships are mapped within a specific language, drawing on vast text databases. The system scans the texts to see how often each word appears next to every other word. This "map" of relationships is a unique fingerprint that defines the word in a multidimensional parameter space.
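A minimal sketch of that first step might look like this. The function below (a hypothetical helper, not from any of the cited projects) counts how often each word appears within a small window of every other word, which is the raw material embedding models are built from:

```python
from collections import defaultdict

def cooccurrence_counts(sentences, window=2):
    """Count how often each word appears near each other word.

    Returns a nested dict: counts[word][neighbor] = frequency.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in sentences:
        words = sentence.lower().split()
        for i, word in enumerate(words):
            lo = max(0, i - window)
            hi = min(len(words), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[word][words[j]] += 1
    return counts

# Tiny toy corpus; real systems use billions of words.
corpus = [
    "the king spoke to the queen",
    "the queen spoke to the king",
]
counts = cooccurrence_counts(corpus)
print(counts["king"]["the"])  # → 2
```

Each word's row in this table is its "fingerprint": two words with similar rows tend to play similar roles in the language.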
Researchers estimate that languages (all of them) can best be described as structures with about 600 independent dimensions of relationships. In these structures, every word-to-word relationship can be seen as a vector, a kind of line: a specific route, like a stretch of railway with precise stops. Ultimately, this vector acts as a powerful constraint on how the word can appear in any translation the artificial intelligence produces.
These vectors obey some simple rules. For example: king - man + woman = queen. Each sentence can be described as a set of vectors, which in turn trace a trajectory through the word space.
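The famous king - man + woman = queen rule can be demonstrated with toy vectors. The 4-dimensional embeddings below are made-up illustrative values (real models learn hundreds of dimensions from huge corpora), but the arithmetic works the same way:

```python
import numpy as np

# Hypothetical toy embeddings; real systems learn these from text.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "man":   np.array([0.9, 0.0, 0.1, 0.0]),
    "woman": np.array([0.1, 0.0, 0.9, 0.0]),
    "queen": np.array([0.1, 0.8, 0.9, 0.0]),
}

def closest(target, vocab):
    """Return the word whose vector points most nearly the same way
    as target (highest cosine similarity)."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(vocab, key=lambda w: cos(vocab[w], target))

# king - man + woman lands closest to queen.
result = vectors["king"] - vectors["man"] + vectors["woman"]
print(closest(result, vectors))  # → queen
```

Subtracting "man" removes the maleness direction, adding "woman" restores femaleness, and the royalty dimension is carried along untouched: that is what it means for a relationship to be a consistent vector in the space.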
And now we can talk to the animals
Now take a leap forward. Consider the staggering speed at which an artificial learning system works. Imagine that whale songs carry meaning in a structure similar to that of words. If the relationships whales build between their ideas have dimensional structures similar to those found in human languages, we may be able to map the key elements of whale songs, understand what whales are talking about, and maybe even communicate with them.
A small reminder: some whales have brain volumes three times larger than those of adult humans, larger cortical areas, and fewer neurons, but with a similar distribution. African elephants have three times as many neurons as humans, but distributed very differently from what we see in our brains.
It seems reasonable to assume that the other large mammals on Earth have attributes of thought, communication and learning that could allow us to connect with them in some way.
What are the key elements of whale songs and elephant calls? Phonemes? Blocks of repeated sounds? Tones? Nobody knows yet, but at least the journey has begun.
The challenge: talking to animals
Projects like the Earth Species Project and the Animal Language Institute aim to rely on technological tools, in particular AI and everything we have learned using computers to understand language. The goal is ambitious: to talk to animals, and even before that, to listen to what animals say to each other, or to us.
There is something deeply comforting in the thought that the linguistic tools of artificial intelligence can do something so beautiful as to bring all thinking species together. Maybe someday we will be able to share a joke with animals, rather than drive them to extinction.