Meta announced Tuesday that it is ditching third-party fact-checking programs on Facebook, Instagram and Threads and replacing its army of paid moderators with a Community Notes model that mimics X’s much-maligned volunteer program, which allows users to publicly flag content they believe to be incorrect or misleading.
In a blog post announcing the news, Meta’s newly appointed chief global affairs officer Joel Kaplan said the decision was made to allow more topics to be openly discussed on the company’s platforms. The change will first affect the company’s moderation in the United States.
“We will allow more speech by lifting restrictions on certain topics that are part of mainstream discourse and focusing our enforcement on illegal and serious violations,” Kaplan said, although he did not detail what topics the new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new rules would bring more political content back to users’ feeds, as well as posts on other issues that have inflamed the U.S. culture wars in recent years.
“We will simplify our content policies and get rid of a number of restrictions on topics like immigration and gender that simply deviate from mainstream discourse,” Zuckerberg said.
Meta is significantly scaling back fact-checking and getting rid of content moderation policies it put in place in the wake of 2016 revelations about influence operations on its platforms aimed at swaying elections and, in some cases, promoting violence and even genocide.
Ahead of last year’s critical elections around the world, Meta was criticized for its hands-off approach to moderating content related to those votes.
Echoing comments Zuckerberg made last year, Kaplan said Meta’s content moderation policies were put in place not to protect users but “partly in response to social and political pressure around content moderation.”
Kaplan also criticized fact-checking experts, saying their “biases and viewpoints” led to excessive moderation: “Over time, we fact-checked too much content for people to understand as legitimate political speech and debate,” Kaplan wrote.
But WIRED reported last year that harmful content, such as medical misinformation, had appeared on the platform, while groups such as anti-government militias were using Facebook to recruit new members.
Meanwhile, Zuckerberg blamed the “legacy media” for forcing Facebook to implement content moderation policies after the 2016 election. “When Trump was first elected in 2016, traditional media outlets wrote nonstop about how disinformation was a threat to democracy,” Zuckerberg said. “We tried in good faith to allay these concerns without becoming arbiters of truth, but the fact checkers were simply too politically biased and destroyed more trust than they created,” he added.
In what he sought to characterize as an effort to remove bias, Zuckerberg said Meta’s internal trust and safety team would move from California to Texas, where X is also currently headquartered. “I think because we’re working to promote free speech, it will help us build trust to do this work in places where there is less concern about our teams’ bias,” Zuckerberg said.