Meta is at the center of a new wave of lawsuits accusing Instagram of encouraging eating disorders, depression and even suicide in children and adolescents. At the heart of the plaintiffs' strategy is a novel legal argument, one with potentially serious implications for Mark Zuckerberg's social media empire.
The lawsuits, which contain disturbing stories of victims exposed to Instagram content promoting anorexia, self-harm and suicide, rest largely on leaks. The “Deep Throat” here is whistleblower Frances Haugen, who last year revealed internal Meta documents showing how Instagram fuels body image problems and other serious mental health issues in many teenagers. We have reported signs of this before, but the victims then were mostly adult women: it now appears the problem is far more widespread.
Meta knew it
The crux of the matter, and the riskiest part for Instagram, is that the leak shows Meta was well aware of the harm its products were causing children, but chose to put growth and profits before safety.
This is the fulcrum of nearly all the lawsuits brought against the social media giant, and a constant across them (other networks such as Snapchat and TikTok are also named): the charge of pushing products that create addiction and even lethal risks.
“In what universe can a company feed this dangerous filth to children and get away with it?” says Matthew Bergman, founder of the Social Media Victims Law Center. The center alone has filed more than half a dozen lawsuits against Instagram.
These products are causing great harm to our children
One obstacle above all in US law makes it difficult for victims to sue Instagram: Section 230 of the Communications Decency Act, a statute that effectively shields social media companies from this kind of litigation.
The evidence in Haugen's leak, however, could “force” Meta and other companies to change course.
The point is that Section 230 was meant to preserve the free speech of Internet users: it prevents web platforms from being held legally responsible for content posted by users. But the plaintiffs argue that user content is not the whole story here: it is the very way Instagram is built that steers victims toward such content. And that could render the “legal shield” ineffective.
The thesis, in short, is that Instagram is designed in a way that favors this harmful content. Meta, of course, disagrees, but that will be a question for the courts to decide.
Stories of social media victims
Sifting through the lawsuits, you read truly terrible stories. One centers on Englyn Roberts, a girl from Louisiana who took her own life at just 14 in 2020. According to the complaint filed in July, Englyn was “bombarded on Instagram, Snapchat and TikTok with harmful images and videos,” including “violent and disturbing content that advocates self-harm and suicide.”
Having captured the girl's attention, the algorithm allegedly suggested more and more similar content, pushing her into a vicious circle. The result? She began exchanging self-harm content with her friends (other potential victims). Among it, in September 2019, was a disturbing video of a woman hanging herself with an electric extension cord; the documents are part of the court record.
In August 2020, Englyn imitated that video, using an extension cord to hang herself; emergency transport to the hospital was not enough to save her life. About a year later, whistleblower Frances Haugen's leak gave her father the courage to go through his daughter's social media accounts, and he began to piece everything together, uncovering suicidal content and messages.
“What became clear in September 2021 is that Englyn’s death was the direct result of the psychological harm caused by her use of Instagram, Snapchat, and TikTok,” the lawsuit reads.
Like asbestos victims
Bergman, who spent many years representing victims of asbestos exposure, has no trouble finding parallels with the victims of social networks. “They too are unknowingly exposed to some kind of toxin,” he says.
And in some cases, they too end up dying. Of the six lawsuits he is handling, four were filed by parents of children who took their own lives.
By now it should be clear that social media can be extremely harmful, even deadly. If you or someone you know is struggling with suicidal thoughts, seek help. Faced with stories like these, you can't just swipe past and move on.