The technology, tested on epilepsy patients who already had electrodes implanted, is currently limited to a set of 30-50 sentences.
The idea of communicating through a brain-machine interface has gone from science fiction to proof of concept in less than twenty years. A new study published in Nature Neuroscience takes the next small step, using AI to interpret brain activity and translate it into sentences. In practice: the first stirrings of mind reading.
"We're not quite there yet, but we think this could be the basis of a speech prosthesis", says Joseph Makin of the University of California, San Francisco.
A systematic study
Each of the four participants in the study had a history of epileptic seizures, and for that reason already had electrodes implanted in the skull to monitor the activity. The researchers used those same electrodes to record brain activity while 50 predetermined sentences were read aloud, providing training data for the decoding neural network.
The sentences differed widely in content and construction, from "Tina Turner is a pop singer" to "the woman is holding a broom" to "a little bird is watching the commotion". In short, sentences that are unequivocally easy to tell apart.
The recordings of brain activity and the audio of the spoken sentences were fed into an algorithm, which learned to recognize how the parts of speech are produced.
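To make the idea concrete, here is a minimal sketch of this kind of sequence-to-sequence setup, written in Python with PyTorch. The class name, layer sizes and overall structure are illustrative assumptions for this article, not the architecture described in the paper.

```python
import torch
import torch.nn as nn

class NeuralToText(nn.Module):
    """Toy sequence-to-sequence model: brain-recording frames -> word tokens.

    Illustrative only: the study's actual network, sizes and training
    details differ; this just shows the encoder-decoder shape of the task.
    """
    def __init__(self, n_channels=100, hidden=256, vocab_size=250):
        super().__init__()
        # Encoder: summarize the multichannel brain recording over time.
        self.encoder = nn.GRU(n_channels, hidden, batch_first=True)
        # Decoder: emit one word token at a time from that summary.
        self.embed = nn.Embedding(vocab_size, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.to_vocab = nn.Linear(hidden, vocab_size)

    def forward(self, ecog, prev_words):
        # ecog: (batch, time, channels); prev_words: (batch, n_words)
        _, state = self.encoder(ecog)      # final hidden state of the encoder
        emb = self.embed(prev_words)       # teacher forcing during training
        out, _ = self.decoder(emb, state)
        return self.to_vocab(out)          # logits over the word vocabulary
```

The point of the encoder-decoder split is that the network never maps brain signals to text directly: it compresses the neural recording into an internal state, then generates words from that state, which is also why it can be trained against the known transcripts of the 50 sentences.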
The initial results were wildly inaccurate, and glaring errors piled up one after another. For example, the brain activity generated by the phrase "she wore warm fleecy woolen overalls" came out as "the oasis was a mirage".
Artificial intelligence helped the system learn until its translations contained far fewer errors. Today the brain activity produced in response to the phrase "the ladder was used to rescue the cat and the man" is decoded as "which ladder will be used to rescue the cat and the man". Almost identical.
If you try to go outside the 50 sentences used, the decoding gets much worse
Joseph Makin
Towards mind reading?
The AI used in the study learns to decode individual words, not just complete sentences, which makes it more likely that it will be able to decode new sentences accurately in the future.
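As a rough illustration of what word-by-word decoding looks like, here is a greedy decoding loop using the toy NeuralToText model sketched above; the start and end token indices and the fake recording are assumptions made for the example.

```python
import torch

model = NeuralToText()                   # toy model from the sketch above
model.eval()

ecog = torch.randn(1, 200, 100)          # fake (batch, time, channels) recording
tokens = [0]                             # assume index 0 = start-of-sentence
with torch.no_grad():
    for _ in range(12):                  # cap the output length
        prev = torch.tensor([tokens])
        logits = model(ecog, prev)
        next_word = int(logits[0, -1].argmax())
        if next_word == 1:               # assume index 1 = end-of-sentence
            break
        tokens.append(next_word)
print(tokens[1:])                        # decoded word indices
```

Because the model commits to one word at a time rather than picking a whole sentence from a fixed list, a sufficiently well-trained decoder could in principle recombine known words into sentences it has never seen.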
The program also became more accurate when moving from one participant to another, showing that it can build on what it learns from multiple people.
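A minimal sketch of what such cross-participant reuse could look like, again with the toy model above: freezing the language side while retraining the encoder is a common transfer-learning recipe, assumed here for illustration rather than taken from the study.

```python
import copy
import torch

# model_a: a NeuralToText instance imagined as already trained on participant A.
model_a = NeuralToText()
# Start participant B's model from A's weights instead of from scratch.
model_b = copy.deepcopy(model_a)

# Freeze the language side; adapt only the encoder, since electrode
# placement and signal statistics differ from person to person.
frozen = (list(model_b.embed.parameters())
          + list(model_b.decoder.parameters())
          + list(model_b.to_vocab.parameters()))
for p in frozen:
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model_b.parameters() if p.requires_grad), lr=1e-4)
# ...then fine-tune as usual on participant B's recordings.
```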
Being able to interpret a limited set of sentences is a step forward, but it is still a long way from mastering language as a whole, the authors admit.
However, this is a significant result: an artificial intelligence was able to interpret speech well after less than an hour of training.
We have reached levels of accuracy that had never been achieved before
My impression is that mind reading is only a matter of time. It is no longer an "if": it is a "when".