If there’s anything that Elon Musk’s Twitter saga and the Twitter Files have shown us, it’s that content moderation by social media platforms is anything but straightforward. Platforms like Instagram and Facebook must strike a balance between making a user’s feed as engaging as possible and keeping users, especially impressionable ones, away from harmful content. This is where most social media platforms fail miserably.
A previously unpublished document leaked from Meta shows that the company’s leadership, back when it was still called Facebook, knew that Instagram was pushing young teenage girls toward dangerous and harmful content and did nothing to stop it.
The document reveals how an Instagram team member investigated the platform’s algorithm and recommendations by posing as a 13-year-old girl looking for diet tips. Instead of surfacing content from medical or qualified fitness experts, the algorithm served up higher-engagement viral topics that were merely adjacent to dieting. These “adjacent” viral topics turned out to be content about anorexia. The test account was led to graphic content and recommendations to follow accounts with names like “skinny binge” and “apple core anorexic.”
Instagram was aware that almost 33 percent of its teenage users felt worse about their bodies because of the app’s recommended content and the algorithm it uses to curate feeds. It also knew that teens who used the app reported higher rates of anxiety and depression.
This is not the first time that Instagram’s algorithms and the content they push on users have been a point of contention for mental health experts and advocates. Earlier this year, a coroner in the UK formally cited online content as a contributing factor in the 2017 death of Molly Russell, a 14-year-old girl who took her own life.
In Molly Russell’s case, one of the critical questions the inquest focused on was whether Molly’s viewing of thousands of posts promoting self-harm on platforms like Instagram and Pinterest contributed to her death. In his ruling, coroner Andrew Walker concluded that Russell’s death couldn’t be ruled a suicide. Instead, he described her cause of death as “an act of self-harm while suffering from depression and the negative effects of online content.” Walker, at one point, described the content that Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”
“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help,” Walker said.
Cases like these have opened up the debate about social media platforms’ content moderation policies and how they play out in real life. Attorney Matt Bergman started the Social Media Victims Law Center after reading the Facebook Papers disclosed by whistleblower Frances Haugen last year. He’s working with more than 1,200 families pursuing lawsuits against social media companies.
“Time after time, when they have an opportunity to choose between the safety of our kids and profits, they always choose profits,” Bergman said in an interview with a US news agency. He argues that the design of social media platforms is ultimately hurting kids.
“They have intentionally designed a product that is addictive,” Bergman said. “They understand that if children stay online, they make more money. It doesn’t matter how harmful the material is.” Bergman argues that the apps were explicitly designed to evade parental authority, and he calls for better age and identity verification protocols.
Meta’s global head of safety, Antigone Davis, has said, “we want teens to be safe online” and that Instagram doesn’t “allow content promoting self-harm or eating disorders.” Davis also said Meta has improved Instagram’s “age verification technology.”
Several activists and advocacy groups think content moderation across platforms needs an overhaul. While the broader consensus is that social media platforms should continue to regulate content themselves, overseen by independent moderation councils, others argue for a larger, global body that sets content moderation policy.
Taking content moderation away from platforms and assigning it to an independent council that oversees all social media platforms’ moderation policies opens up a new can of worms. For example, it would make it much easier for regimes to suppress political dissidents and news unfavorable to a government. This is exactly what the Twitter Files are trying to show. However, the fact remains that content moderation, as we know it, is broken and needs to be fixed.