Instagram Introduces New Privacy Features to Protect Teen Users
Instagram is rolling out major privacy updates aimed at protecting teenage users.
What are the changes?
Introduction of teen accounts
Meta, Instagram's parent company, announced that all accounts belonging to users aged 13 to 15 will automatically become private, meaning their posts and activity will only be visible to approved followers. This is part of Meta’s broader effort to shield young people from harmful content.
The new "teen accounts" feature introduces several restrictions. Teenagers will only be able to receive messages from people they follow, and content flagged as sensitive will be filtered out. In addition, a sleep mode will be activated between 10pm and 7am, muting notifications and sending automatic replies to direct messages during that time. Teens will also get reminders to take a break from the app after an hour of use.
Changes to parental controls
Parental controls will be significantly expanded. Parents will be able to supervise their child's account, monitor who they interact with, and view the topics their child engages with. However, they won't have access to the content of private messages.
For users under 16, parental permission will be needed to adjust privacy settings, while older teens can change them independently.
When will the changes take place?
Meta's teen privacy updates will first be implemented in the US, UK, Canada, and Australia, with a broader global rollout expected later this year and in early 2025. Meta has said that users identified as teens will become teen accounts within 60 days of being informed of the changes.
Meta aims to reduce the harm of social media on mental health
Meta’s changes come in response to growing concerns about the impact of social media on young users' mental health. Meta has faced criticism and legal action for allegedly allowing harmful content to proliferate on its platforms. The company acknowledges that some teens may try to bypass the age restrictions, but says it is developing technology to identify users who lie about their age.
UK regulator Ofcom has supported the changes but called for more stringent protections across social media platforms, especially with the Online Safety Act set to take effect next year. The act will require platforms to prioritise the safety of young users and remove harmful content or face penalties.
The changes are part of Meta's ongoing attempts to compete with fast-growing platforms like TikTok, while also addressing criticism of the role social media plays in the well-being of its youngest users.