Thursday, 24 April 2025

Ofcom Sets Out New Rules To Protect Children On Social Media

The UK’s media regulator, Ofcom, has rolled out strict new rules to protect children online, forcing platforms such as Meta, TikTok, and X to overhaul how they handle harmful content and age verification. Under the new "Children’s Codes," tech companies must filter out dangerous material such as self-harm posts, pornography, and misogynistic content, and introduce robust age checks, or face fines of up to £18 million—or even a UK ban in extreme cases. These changes come under the Online Safety Act, which aims to give young users safer digital spaces.


Ofcom describes the codes as a major reset for how children experience the internet. The rules, which take effect from 25 July, outline more than 40 specific measures: algorithms must stop pushing harmful content to children, platforms must offer better reporting tools, and children should have the option to avoid being pulled into risky group chats. Dame Melanie Dawes, Ofcom’s chief executive, said, “Unless you know where children are, you can’t give them a different experience to adults,” calling the move a “gamechanger.” But not everyone is convinced the codes go far enough.


Bereaved parents like Ian Russell and Hollie Dance, whose children died after encountering harmful online content, say the rules are too soft and leave too much power in the hands of tech companies. “I am dismayed by the lack of ambition,” said Russell. While some platforms have begun to introduce teen-specific features, like Meta’s new restricted accounts, critics argue that without stronger enforcement and transparency, real change might still be out of reach.
