Meta Expands Teen Safety Features to Facebook and Messenger

Meta is rolling out its “Teen Accounts” to Facebook and Messenger, building on the safety tools it launched on Instagram last year. Teens under 16 will now need a parent’s approval to go live or to disable nudity protection, a feature that blurs images suspected of containing nudity in direct messages. The move comes as Meta faces mounting pressure from lawmakers and lawsuits over its platforms’ impact on kids, with more than 30 U.S. states accusing the company of misleading the public about online dangers.
The teen accounts also include stronger parental controls. Parents can see who their kids are messaging, monitor their screen time, and set time limits or block app use. Teens aged 13–15 will be placed in these accounts automatically, while 16- and 17-year-olds can opt out. Messages from unknown contacts will be restricted, and certain privacy settings can’t be changed without a parent’s approval. The features are launching first in the U.S., the U.K., Canada, and Australia.
Meta says these changes are part of an ongoing effort to “shift the balance in favour of parents,” especially as lawmakers push forward with bills like the Kids Online Safety Act. As criticism grows and regulations tighten globally, Meta is working to show it’s taking responsibility, though critics argue the changes are long overdue.