
Instagram is apparently recommending sexual content to teens as young as 13

Instagram has reportedly been suggesting explicit Reels to teenagers as young as 13, even when they are not actively searching for such content.

According to an investigative report by Northeastern University professor Laura Edelson and The Wall Street Journal, the Meta-owned social media platform has been suggesting sexually explicit videos to teens. During tests conducted primarily between January and April this year, both parties created new accounts with ages set to 13 to examine Instagram’s behavior.

The findings show that Instagram began suggesting moderately suggestive videos as soon as the accounts were first logged in, featuring content such as women dancing sensually or drawing attention to their bodies.
Accounts that engaged with these videos by watching them and skipping others soon started receiving recommendations for more explicit content.

Some of the recommended Reels showed women mimicking sexual acts or offering to send nude photos in exchange for user comments. The investigators also encountered videos featuring nudity and, in one case, a series of videos depicting graphic and explicit sexual acts within minutes of setting up the account.

Within just 20 minutes of initial engagement, the recommended Reels section became dominated by creators producing sexual content.

In contrast, similar tests conducted on TikTok and Snapchat did not yield recommendations for sexual content to the teenage accounts created on those platforms.

Even after the test accounts actively searched for age-inappropriate content and followed creators known for producing such videos, TikTok and Snapchat did not recommend that content to them. The Wall Street Journal notes that Meta, Instagram’s parent company, had previously identified similar issues through its own internal research.

Despite these findings, Meta spokesperson Andy Stone dismissed the report, calling the tests “artificial experiments” that do not accurately represent how teens use Instagram. He pointed to Meta’s efforts to limit the sensitive content teens see, claiming significant reductions in recent months.

In January, Meta rolled out substantial privacy updates aimed at safeguarding teen users, automatically placing them in the platform’s most restrictive content control settings, which they cannot opt out of. Despite these measures, The Wall Street Journal’s tests, conducted after the update, were able to reproduce the concerning results as recently as June. Meta had introduced the updates shortly after a previous experiment by the Journal.
