Every time one of us uses a device that corrects our spelling or suggests how to finish a sentence, there is an artificial intelligence behind it, constantly improving and learning more and more of the language. Sentence structures are analysed, word choices understood, idioms recognised.
This same ability could, in 2020, give us the first signs of something new: talking with the large animals beyond our own species. Don't click away, I mean it. I'm not talking about communicating with animals in the traditional way, with petting and cooing, and I'm not Amelia Kinkade*. This ability may even come to the fore faster than brain-computer interfaces like Neuralink and others (though not CTRL-LABS, which I think will be ready first).
The ability of artificial intelligence to decode languages has advanced to the point where it can begin to analyse even dead languages.
MIT and Google researchers have recently applied these techniques, with moderate success, to ancient languages: Linear B, and Ugaritic, a precursor of Hebrew written in one of the earliest known alphabets. No luck so far, however, with the older, still undeciphered Linear A.
How does AI understand ancient languages?
First, word-to-word relationships in a specific language are mapped, leveraging large text databases. The system searches the texts to see how often each word appears next to every other word. This “map” of relationships is a unique imprint that defines the word in a multidimensional parameter space.
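The counting step described above can be sketched in a few lines of Python. This is a minimal illustration, not any lab's actual pipeline: the tiny corpus, the window size, and the variable names are all assumptions made up for the example.

```python
from collections import Counter

# Toy corpus standing in for the "large text databases" mentioned above.
corpus = [
    "the king rules the realm",
    "the queen rules the realm",
    "a man walks the road",
    "a woman walks the road",
]

WINDOW = 2  # how many neighbouring words count as "next to" (an assumption)

cooccurrence = Counter()
for sentence in corpus:
    words = sentence.split()
    for i, word in enumerate(words):
        # Look at up to WINDOW words on each side of the current word.
        neighbours = words[max(0, i - WINDOW):i] + words[i + 1:i + 1 + WINDOW]
        for neighbour in neighbours:
            cooccurrence[(word, neighbour)] += 1

# Each word's row of counts is its crude "imprint" in word space.
print(cooccurrence[("king", "rules")])   # prints 1
print(cooccurrence[("queen", "rules")])  # prints 1
```

Real systems refine these raw counts into dense vectors, but the principle is the same: a word is defined by the company it keeps.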
Researchers estimate that languages (all languages) are best described as structures with around 600 independent dimensions of relationships. Structures in which every word-to-word relationship can be seen as a vector, a "line" if you like, a specific path such as a train route with precise stops. That vector ultimately acts as a powerful constraint on how the word can appear in any translation the artificial intelligence produces.
These vectors obey a few simple rules. For example: king – man + woman = queen. Each sentence can be described as a set of vectors, which in turn trace a trajectory through word space.
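The king – man + woman = queen rule can be checked with toy numbers. A hedged sketch: real embeddings have hundreds of dimensions (about 600 in the estimate above), while these hand-made 3-dimensional vectors are invented purely so the arithmetic is easy to follow.

```python
import math

# Hand-made toy vectors; the values are illustrative, not learned from text.
vectors = {
    "king":  (0.9, 0.8, 0.1),
    "man":   (0.9, 0.1, 0.1),
    "woman": (0.1, 0.1, 0.9),
    "queen": (0.1, 0.8, 0.9),
}

def cosine(a, b):
    # Cosine similarity: the angle between vectors matters, not their length.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# king - man + woman, component by component...
target = tuple(k - m + w for k, m, w in
               zip(vectors["king"], vectors["man"], vectors["woman"]))

# ...should land closest to queen among the remaining words.
best = max((w for w in vectors if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, vectors[w]))
print(best)  # prints "queen"
```

The same vector arithmetic, scaled up to a whole vocabulary, is what lets a system align two languages' word spaces without a dictionary.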
And now we can talk to the animals
Take a leap forward. Consider the extraordinary speed at which a machine-learning system works. Imagine that whale songs carry a structure similar to that of words. If the relationships whales build between their ideas have dimensional structures similar to those found in human languages, we may be able to map the key elements of whale songs; understand what whales are talking about, and maybe even communicate with them.
A small reminder: some whales have three times the brain volume of an adult human, larger cortical areas and fewer neurons, but with a similar distribution. African elephants have three times as many neurons as humans, but distributed very differently from ours.
It seems reasonable to assume that the other large mammals on earth have attributes of thought, communication and learning that allow us to connect in some way.
What are the key elements of whale songs and elephant calls? Phonemes? Blocks of repeated sounds? Tones? Nobody knows yet, but at least the journey has begun.
The challenge, talking to animals
Projects like the Earth Species Project or the Animal Language Institute aim to rely on technological tools (above all AI, and everything we have learned using computers to understand language). The goal is ambitious: to talk to animals and, before that, to listen to what animals say to each other, or to us.
There is something deeply comforting in the thought that the linguistic tools of artificial intelligence could do something so beautiful: bring all thinking species together. Maybe one day we will be able to share a joke with animals, rather than the cruel one of their extinction.