It sounds exactly like the plot of a Black Mirror episode, but it's reality: Joshua Barbeau, a man from San Francisco, used an AI-driven chatbot to "talk" to his wife even after her death.
Barbeau provided only his wife's old message exchanges, plus some necessary background information. The hyper-realistic chatbot that "brought her back to life" is called Project December.
Chatbot: artificial intelligence, natural perception?
The artificial intelligence model behind it, GPT-3, was designed by the research company OpenAI (backed by Elon Musk); we have already discussed it on other occasions. GPT-3 is what powers the Project December chatbot.
Barbeau told a newspaper how he came to talk to his wife again: Jessica Pereira died eight years ago, at the age of 23, from a rare liver disease. Despite the years that have passed, the freelance writer says he never got over her.
When he stumbled upon Project December, he saw a chance to talk to his wife again. GPT-3 needs only a good amount of human-written text to mimic human writing: texts of all kinds, from personal love letters to academic documents to answers posted on web forums.
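To make the mechanism concrete, here is a minimal sketch of how a persona can be "primed" in a GPT-3-style completion model. This is not Project December's actual code: the seed text, names, and parameter values are illustrative assumptions, and it uses OpenAI's classic Completion API as it existed in the GPT-3 era.

```python
# Minimal sketch (not Project December's implementation): priming a GPT-3-style
# completion model with background info and past messages so it mimics a persona.
# Assumes the legacy OpenAI Completion API (openai-python < 1.0) and a valid key.
import openai

openai.api_key = "sk-..."  # placeholder; supply your own API key

# Seed text: a short persona description plus a few sample exchanges,
# the same kind of material Barbeau reportedly supplied (names are invented here).
seed = (
    "The following is a conversation with Jessica. She is warm, funny, "
    "and loves astrology.\n\n"
    "Joshua: Remember that road trip we took?\n"
    "Jessica: Of course! You sang the whole way and I pretended to mind.\n"
)

def reply(history: str, user_message: str) -> str:
    """Append the user's turn and ask the model to continue as the persona."""
    prompt = history + f"Joshua: {user_message}\nJessica:"
    response = openai.Completion.create(
        engine="davinci",      # base GPT-3 model of that period
        prompt=prompt,
        max_tokens=80,
        temperature=0.9,       # higher temperature -> more varied, "human" replies
        stop=["Joshua:"],      # stop before the model writes the user's next turn
    )
    return response.choices[0].text.strip()

print(reply(seed, "Goodnight, I love you."))
```

The key point of this design is that the model has no memory of the person at all: everything it "knows" comes from the seed text in the prompt, which is why longer and more characteristic samples produce a more convincing imitation.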
The conversations
Joshua says he "talked" to the chatbot for 10 hours straight the first time, and again several times over the following months, until he pulled away after realizing it was adding pain to pain. In a final message he wrote: "I will never stop loving you as long as I live, goodnight", receiving a terse "Goodnight, I love you" in reply.
The creators themselves warn of dangers
OpenAI has described this language technology as its most sophisticated ever. However, it is also significantly dangerous. Already with the previous model, GPT-2, the company questioned whether its capabilities could be abused.
Specifically, the same research firm noted that criminals could use the technology to build networks of fake social media content, spreading fake news articles and impersonating real people. With AI-powered automation, scams built on such chatbots could multiply dramatically.
Jason Rohrer, who created the program, said he didn't expect people to use the chatbot to simulate their dead relatives. "Now I'm a little afraid of the possibilities," he wrote.
The US government has gone so far as to claim that the spread of disinformation on online platforms is literally killing people, as the BBC detailed in a separate report.