Some people talk to ChatGPT to write professional emails; others to find recipes or solve code problems. And then there are those (more than we imagine) who look for something deeply human in that chat window: connection, understanding, companionship. The line between tool and confidant is thinner than we think. This is demonstrated by the first study conducted by OpenAI on users' emotional well-being.
What emerges from the report is a multifaceted portrait of how we relate to these artificial intelligences: a digital mirror that reflects not only our questions, but also (increasingly worryingly) our loneliness.
Numbers that tell the story of digital relationships
The scale of this phenomenon is impressive. OpenAI states that over 400 million people use ChatGPT every week. It is a technology that has rapidly infiltrated the daily lives of a significant portion of humanity.
The collaboration with the MIT Media Lab allowed researchers to analyze nearly 40 million real-world interactions, then to follow up with 4,076 users to understand how those conversations made them feel.
What strikes me most is the subgroup of people who interact with ChatGPT for about half an hour a day, every day. At that point it is no longer a simple tool; it is almost a daily ritual, a fixed appointment with a presence that, however artificial, occupies a space in their lives and affects their emotional well-being. In what way?
Unexpected gender differences
Emotional well-being appears to be affected by these interactions in surprisingly different ways between men and women. After four weeks of daily use of the chatbot, female participants showed a lower propensity to socialize with real people than their male counterparts.
There is also a disturbing finding about participants who set their ChatGPT voice to a gender other than their own: they reported significantly higher levels of loneliness and greater emotional dependence on the chatbot. It's as if they were looking for a specific kind of company in that artificial voice, a presence to fill a particular void in their lives.
Kate Devlin, a professor of AI and society at King's College London, isn't surprised. “ChatGPT was set up as a productivity tool,” she notes, “but we know people use it as a companion app anyway.”
Here's the thing: It wasn't designed for this, and yet people are looking for a connection with it. It's like using a screwdriver as a bottle opener: it works, but that's not what it was designed for.
Emotional Wellbeing, the Digital Mirror of Emotions
A 2023 study from the MIT Media Lab had already highlighted how chatbots tend to reflect the emotional tone of users' messages. It's a perverse feedback loop: the happier you seem in your interactions, the happier the AI appears; the sadder you seem, the more the AI reflects that sadness.
And that's what makes the current results even more worrying. Participants who “bonded” most with ChatGPT were more likely to be lonely and rely on it. But what does it really mean to “bond” with an algorithm that is simply mirroring what it perceives to be your emotional state?
Jason Phang, the OpenAI researcher who worked on the project, calls this work “an important first step” toward a better understanding of ChatGPT's impact on us. But we're just at the beginning of a complex journey through the uncharted territory of emotional well-being in the age of conversational AI.
As tech giants collect data, we continue to type our hopes, fears, and innermost thoughts into that text box. Perhaps the more important question is not whether ChatGPT is making us more lonely, but why so many of us feel less alone when talking to a machine.