The technology, tested on epilepsy patients who already had electrodes implanted, is currently limited to 30-50 sentences.
The idea of communicating through a brain-machine interface has gone from science fiction to proof of concept in less than twenty years. A new study published in Nature Neuroscience shows that someone has taken the next small step, using AI to interpret brain activity and translate it into sentences. In practice: the brink of mind reading.
“We're not quite there yet, but we think this could be the basis of a speech prosthesis,” says Joseph Makin from the University of California, San Francisco.
A systematic study
Each of the four participants in the study had a history of epileptic seizures, and for that reason already had electrodes implanted in the skull to monitor seizure activity. The researchers used those same electrodes to record brain activity while 50 predetermined sentences were read aloud, providing data for the neural network to decode.
The sentences varied widely in content and construction: from “Tina Turner is a pop singer”, to “the woman is holding a broom”, through “a little bird observes the confusion”. In short, sentences unequivocally easy to tell apart.
The recordings of brain activity and the audio of the spoken sentences were fed into an algorithm, which learned to recognize how the elements of speech were formed.
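The study's actual model was a recurrent encoder-decoder network; as a much simpler stand-in for the idea of "learning which brain-activity pattern goes with which sentence", here is a toy sketch using synthetic feature vectors and a nearest-centroid classifier (all data and dimensions here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three of the study's example sentences stand in for the full set of 50.
sentences = [
    "Tina Turner is a pop singer",
    "the woman is holding a broom",
    "a little bird observes the confusion",
]

# Pretend each sentence evokes a characteristic 16-dimensional neural
# pattern, and each reading of it is that pattern plus noise.
prototypes = rng.normal(size=(len(sentences), 16))

def record(sentence_idx, noise=0.3):
    """Simulate one noisy 'recording' of a spoken sentence."""
    return prototypes[sentence_idx] + noise * rng.normal(size=16)

# "Training": average several noisy recordings per sentence into centroids.
centroids = np.stack([
    np.mean([record(i) for _ in range(20)], axis=0)
    for i in range(len(sentences))
])

def decode(trial):
    """Return the sentence whose centroid is closest to the trial."""
    dists = np.linalg.norm(centroids - trial, axis=1)
    return sentences[int(np.argmin(dists))]

print(decode(record(1)))  # "the woman is holding a broom"
```

This also hints at why decoding collapses outside the trained set: a recording of an unseen sentence can only ever be matched to the nearest of the 50 known ones.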
The initial results were wildly inaccurate, and blunders followed. For example, the brain activity generated by the sentence “she was wearing warm, soft wool overalls” was decoded as “the oasis was a mirage”.
With training, the system learned to translate with far fewer errors. Today the brain activity evoked by the sentence “the ladder was used to save the cat and the man” is decoded as “which ladder will be used to save the cat and the man”. Almost identical.
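Accuracy in this kind of work is typically measured by word error rate: the number of word insertions, deletions, and substitutions needed to turn the decoded sentence into the spoken one, divided by the length of the spoken sentence. A minimal implementation applied to the example above:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

wer = word_error_rate(
    "the ladder was used to save the cat and the man",
    "which ladder will be used to save the cat and the man",
)
print(round(wer, 2))  # 0.27: two substitutions and one insertion over 11 words
```

"Almost identical", by this metric, means roughly a quarter of the words differ; a perfect decoding would score 0.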
If you try to go beyond the 50 sentences used, the decoding gets much worse
Joseph Makin
Towards mind reading?
The AI used in the study is learning to decode individual words, not just complete sentences, which makes it more likely to decode new sentences accurately in the future.
The program also gained accuracy when moving between participants, suggesting that it can build on what it learns from multiple people.
While being able to interpret a limited set of sentences is a step forward, it's still a long way from mastering language as a whole, the authors admit.
Still, this is a significant achievement: an artificial intelligence learned to interpret speech well after less than an hour of training.
We have reached levels of accuracy that have not been achieved so far
My sense is that mind reading is only a matter of time. It's no longer an “if”: it's a “when”.