Meta-owned Instagram has often been accused of being detrimental to the mental health of many young adults and teenagers, and of not doing enough to keep certain kinds of posts out of the feeds of users in specific age brackets.
Now, a coroner has officially cited online content, including posts viewed on Instagram, as a contributing factor in the death of Molly Russell, a 14-year-old girl who died by suicide in 2017.
One of the inquest's critical findings was that Molly viewed thousands of posts promoting self-harm on platforms like Instagram and Pinterest before taking her own life.
At one point, Andrew Walker, the coroner overseeing the inquest, described the content that Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”
In his conclusions, Walker declined to rule Russell’s death a suicide. Instead, he recorded her cause of death as “an act of self-harm while suffering from depression and the negative effects of online content.”
Walker based his conclusion on Russell’s prolific use of the platforms: in the six months before her death she liked, shared, or saved roughly 16,300 posts on Instagram and about 5,793 pins on Pinterest. Combined with the way the platforms tailored content to that engagement, this contributed to Russell’s depressive state and made her situation worse.
“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help,” Walker said.
To keep users on their apps for longer, platforms like Instagram and Pinterest curate each user’s feed to show mostly things the user has shown even a slight interest in. Interest is measured by signals such as time spent on a post, whether the post was liked or saved, and whether it drew comments. Critics contend that this ranking does not weigh the nature of a post, nor does it account for the age of the user interacting with it; advocates have argued this is one of the most significant areas where content moderation has failed users. A simplified sketch of this kind of engagement-driven ranking appears below.
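The following is a hypothetical, minimal sketch of what purely engagement-driven ranking looks like, written to illustrate the critics’ point rather than to describe Instagram’s or Pinterest’s actual systems. All names, weights, and signals here are illustrative assumptions; the key detail is what the scoring function never looks at.

```python
from dataclasses import dataclass

# Illustrative sketch of engagement-driven feed ranking.
# Weights and field names are assumptions, not any platform's real code.

@dataclass
class Post:
    post_id: str
    topic: str  # e.g. "sports", "self-harm" -- never inspected by the ranker below

@dataclass
class Engagement:
    seconds_viewed: float = 0.0
    liked: bool = False
    saved: bool = False
    commented: bool = False

def interest_score(history: dict[str, Engagement], post: Post) -> float:
    """Score a candidate post purely from past engagement with its topic.

    Note what is *not* an input: the nature of the content and the age
    of the user. That omission is the core of the advocates' objection.
    """
    e = history.get(post.topic, Engagement())
    return (
        0.1 * e.seconds_viewed
        + 1.0 * e.liked
        + 1.5 * e.saved
        + 2.0 * e.commented
    )

def rank_feed(history: dict[str, Engagement], candidates: list[Post]) -> list[Post]:
    # Even slight prior interest in a topic pushes similar posts up the feed,
    # which is how "binge periods" of one kind of content can emerge.
    return sorted(candidates, key=lambda p: interest_score(history, p), reverse=True)

if __name__ == "__main__":
    history = {"self-harm": Engagement(seconds_viewed=40, saved=True)}
    feed = rank_feed(history, [Post("a", "sports"), Post("b", "self-harm")])
    print([p.post_id for p in feed])  # ['b', 'a'] -- the previously saved topic wins
```

In this toy model, one saved post is enough to push similar content to the top of the feed indefinitely, because nothing in the scoring step asks what the content is or who is seeing it.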
Walker’s conclusions reignite a question that child safety advocates have been asking for years: how responsible are social media platforms for the content their algorithms feed to minors, and should minors be allowed onto the platforms in the first place?
According to a Bloomberg report, the Russell family’s lawyer has requested that Walker “send instructions on how to prevent this happening again to Pinterest, Meta, the UK government, and the communications regulator.” In its statement, the family urged UK regulators to quickly pass and enforce the UK Online Safety Bill, which could institute “new safeguards for younger users worldwide.”
Pinterest and Meta took different approaches to defending their policies during the inquest. Pinterest said that it didn’t have the technology to moderate the content Molly was exposed to more effectively. Meta’s head of health and well-being, Elizabeth Lagone, on the other hand, told the court that the content Molly viewed was considered “safe” by Meta’s standards. Meta’s official response has irked the Russell family.
“We have heard a senior Meta executive describe this deadly stream of content the platform’s algorithms pushed to Molly as ‘SAFE’ and not contradicting the platform’s policies. If this demented trail of life-sucking content were safe, my daughter Molly would probably still be alive,” the Russell family wrote.
They also added, “For the first time today, tech platforms have been formally held responsible for the death of a child. In the future, we as a family hope that any other social media companies called upon to assist an inquest follow the example of Pinterest, which has taken steps to learn lessons and has engaged sincerely and respectfully with the inquest process.”