Meta will now put under-18 Insta users into new ‘teen’ accounts, give parents better control

Meta has introduced new “teen accounts” for Instagram users under 18. The goal is to give parents more control over their children’s activities on the platform, including restricting app usage during certain hours and monitoring what their children view.

This change will apply to new users and gradually extend to existing teen accounts in the coming months.

New features for parental supervision
Under the new settings, parents will have several ways to manage their teens’ Instagram usage. They can set daily time limits, block access to the app at night, and view the categories of content their children are engaging with.

Parents will also have the option to see the accounts with which their teens are exchanging messages. Teenagers signing up for Instagram are already placed into strict privacy settings by default, such as preventing adults who don’t follow them from sending messages and muting notifications at night.

However, these changes will further empower parents, especially for users under 16, who will now require parental permission to alter any privacy settings.

These features will also be applied by default for users aged 16 and 17, though they will have the freedom to adjust the settings themselves if desired. The teen account changes will be implemented in the US, UK, Canada, and Australia.

Meta responds to concerns over online safety
The announcement of these new teen account settings is seen as a response to growing concerns about online safety, especially for young people.

Parents frequently complain that they are not equipped to adequately monitor their children’s use of social media. Meta’s leadership has acknowledged that while parental controls have been available, they were often underused.

This update aims to simplify those controls and strengthen parents’ roles in managing their children’s online experiences.

Despite these efforts, some online safety advocates have expressed caution. Previous updates to enhance child safety on platforms like Instagram have not always produced the intended results.

Concerns remain over the availability of harmful content, such as material related to self-harm and mental health issues, which has been linked to tragedies in the past. The father of Molly Russell, a British teenager who died after exposure to harmful content on Instagram, hopes that this latest change will have more meaningful outcomes than Meta's previous attempts to improve safety.

Global implications and legislation
This move comes amid broader discussions on social media safety, with countries like Australia planning to raise the minimum age for social media access.

There is interest in how these changes might influence legislation in other countries, including the UK. Meta has stated that these new teen accounts are being introduced independently of government regulations, driven primarily by parental concerns.

However, the possibility of future changes to Facebook or other Meta-owned platforms is still being explored.

As discussions about online safety continue to evolve, how these new features will affect young users and their families remains to be seen.


