It sounds exactly like the plot of a Black Mirror episode, but it is reality: Joshua Barbeau, a man from San Francisco, used an AI-driven chatbot to "talk" to his wife even after her death.
Barbeau provided only his wife's old text exchanges, along with some basic background information. The hyper-realistic chatbot that "brought her back to life" is called Project December.
Chatbot: artificial intelligence, natural perception?
The underlying artificial intelligence model, called GPT-3, was designed by the research company OpenAI (backed by Elon Musk), and we have already discussed it elsewhere. GPT-3 powers the Project December chatbot.
Barbeau told a newspaper how he got to talk again to his wife, Jessica Pereira, who died eight years ago at age 23 from a rare liver disease. Despite the years that have passed, the freelance writer said he has never gotten over her.
When he came across Project December, he realized he might have a chance to talk to his wife again. GPT-3 needs only a reasonable amount of human-written text to mimic human writing: texts of all kinds, from personal love letters to academic documents to replies posted in web forums.
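In practice, this kind of mimicry amounts to "priming" the model with a seed prompt built from a person's real messages, which GPT-3 then continues in the same voice. The sketch below is an illustration of that idea only, assuming the legacy OpenAI completions API (openai library before v1.0); the names, sample messages, and parameters are invented placeholders, not Project December's actual code.

```python
# Rough sketch: priming a GPT-3 completion with background details and
# sample exchanges so the model continues in the same voice.
# Assumes the legacy OpenAI completions API (openai < 1.0).
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: supplied by the user

# Seed prompt: persona description plus real past messages.
# Everything below is an invented placeholder.
seed = (
    "The following is a conversation with Jessica. "
    "Jessica is warm, witty, and loves astronomy.\n\n"
    "Joshua: How was your day?\n"
    "Jessica: Long! But I saw Saturn through the telescope tonight.\n"
)

def reply(user_message: str, history: str) -> str:
    # Append the new message and ask the model to write Jessica's answer.
    prompt = history + f"Joshua: {user_message}\nJessica:"
    completion = openai.Completion.create(
        engine="davinci",          # the original GPT-3 base model
        prompt=prompt,
        max_tokens=60,
        temperature=0.9,           # higher temperature for a livelier voice
        stop=["Joshua:", "\n\n"],  # stop before the model writes both sides
    )
    return completion.choices[0].text.strip()

print(reply("Goodnight, I love you.", seed))
```

The key point is that no retraining happens: the model's impression of the person lives entirely in the seed text it is given at each request.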
The conversations
Joshua says he "talked" to the chatbot for 10 hours straight the first time, and then several more times over the following months, until he stepped away after realizing it was adding pain to pain. In a final message, he wrote: “I will never stop loving you as long as I live, goodnight,” receiving a laconic “Goodnight, I love you.”
The creators themselves warn of dangers
OpenAI has described this language technology as its most sophisticated ever, but also as significantly dangerous. Even with the previous model, GPT-2, the company questioned whether its capabilities could be abused.
Specifically, the same research firm warned that malicious actors could use the technology to build networks of fake social media accounts, spreading bogus news articles and impersonating real people. With AI-powered automation, scams built on such chatbots could multiply dramatically.
Jason Rohrer – who created the program – said he didn't foresee people using the chatbot to simulate their dead relatives. “Now I'm a little scared of the possibilities,” he wrote.
The US government has gone so far as to claim that the spread of disinformation on online platforms is literally killing people, as the BBC detailed in a separate report.