Meta Drops Fact-Checkers for User-Led Moderation in the U.S.
Meta, the parent company of Facebook, Instagram, and Threads, is overhauling its content moderation strategy in the United States by replacing third-party fact-checkers with a "Community Notes" system inspired by X (formerly Twitter).
Meta CEO Mark Zuckerberg announced the change, saying the company had strayed from its mission of free expression. "Too much harmless content gets censored, too many people find themselves wrongly locked up in 'Facebook jail', and we are often too slow to respond when they do," the company said in an accompanying blog post. "It’s time to get back to our roots around free expression."
The Community Notes program will allow users to add context to posts they find misleading, similar to the system used on X. Meta plans to phase it in over the next few months, and Zuckerberg emphasized that Meta itself will not write the notes or decide which ones appear.

The decision to abandon fact-checkers is limited to the U.S. for now. Meta confirmed that it has "no immediate plans" to implement similar changes in the UK or EU, where stricter regulations require tech companies to take greater responsibility for online content.
The shift has sparked backlash from fact-checking organizations and online safety advocates. Chris Morris, CEO of Full Fact, described the decision as "disappointing and a backward step." Meta acknowledged the risks, with Zuckerberg admitting, "We’re going to catch less bad stuff, but we’ll also reduce the number of innocent people's posts and accounts we accidentally take down."
The change comes alongside broader policy updates. Meta plans to reduce automated enforcement and demotions for minor policy violations, relying instead on user reports, while high-severity issues such as terrorism and fraud will still be handled by automated systems. The company will also loosen restrictions on politically charged topics such as immigration and gender identity.
Meta’s Joel Kaplan, a prominent Republican who now leads the company’s global affairs, said the reliance on third-party fact-checkers often led to over-enforcement and censorship. "It’s not right that things can be said on TV or the floor of Congress, but not on our platforms," Kaplan added.
With these changes, Meta aims to balance its commitment to free speech with user safety. However, the company’s move away from traditional moderation has intensified debates about the role of social media in regulating online discourse.