Thursday, 12 March 2026
UK Tells Social Media Platforms to Fix Child Age Checks or Face Action

UK regulators have issued a direct warning to some of the world's biggest social media platforms: tighten up age verification for children under 13, or face the consequences.

Ofcom and the Information Commissioner's Office (ICO) wrote jointly to Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X on Thursday, telling them that their current approach to keeping young children off platforms that are not designed for them simply isn't good enough. The companies have until 30th April to set out what they plan to do about it.

The central problem, both regulators say, is that most platforms still rely on users typing in their own date of birth to verify their age, which is extremely easy to get around. "As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them," the ICO said in its open letter. Ofcom research suggests that 86% of children aged 10 to 12 already have their own social media profiles, despite most platforms having a minimum age of 13.

Ofcom chief executive Dame Melanie Dawes said services were "failing to put children's safety at the heart of their products" and gave a clear ultimatum: "Platforms must now change quickly, or Ofcom will act." She added that she believed Google was "uncomfortable" with being called out: "It's putting the spotlight on them and asking them to account for what they're doing, not through a Silicon Valley press release but on the UK public's terms."

The regulators want platforms to adopt the kind of robust age checks that are currently only legally required for adult content sites, such as facial age estimation, digital ID verification or one-time photo matching. The ICO's CEO Paul Arnold noted that "new, viable and privacy-friendly solutions" already exist that can accurately determine whether a user is under 13.

Beyond age checks, Ofcom also wants platforms to limit contact between children and adults they do not know, create safer algorithm-driven content feeds for younger users, and hold off on testing new features on minors until stronger protections are in place. Under the Online Safety Act, Ofcom can fine companies up to 10% of their global revenue for non-compliance. The ICO can separately impose fines of up to 4% of global annual turnover for children's data breaches, and has already shown it will use those powers, fining Reddit £14.47 million earlier this year for failing to enforce age checks.

The platforms have pushed back to varying degrees. YouTube owner Google said it was surprised by Ofcom's "move away from a risk-based approach," arguing that the regulator should instead "focus on high risk services that are failing to comply with the codes set out in the Online Safety Act." Meta said many of Ofcom's suggestions were already in place, including AI-based age detection and facial estimation technology. Snapchat said it was testing age verification tools. TikTok pointed to its "enhanced technologies" for detecting underage accounts, saying it had removed over 90 million suspected under-13 accounts between October 2024 and September 2025. Roblox said it has rolled out 140 new safety features in the past year and "looks forward to demonstrating our efforts in our ongoing dialogue with Ofcom."

Technology Secretary Liz Kendall backed the regulators fully. "No company should need a court order to act responsibly to protect children," she said, adding that no platform would get a "free pass."

Researchers have warned that the age check issue, while important, is only part of the problem. Professor Amy Orben, a digital mental health expert at Cambridge University, said "safety must be built into products by design rather than treated as an afterthought." Social media analyst Matt Navarra put it plainly: "Knowing a user is a child is step one, but designing a platform that doesn't exploit their attention is the next step — and that step is actually much harder."

The action comes days after MPs voted against an outright ban on social media for under-16s, with ministers instead opting to wait for the results of a government consultation on the matter that was launched last week. Ofcom and the ICO are expected to publish a joint statement later this month specifically focused on age assurance requirements. Across Europe, France, Spain, Germany and Greece are all pursuing their own measures to restrict children's access to social media.
