Meta set to embrace more “bad stuff” in preparation for Trump’s presidency
Meta (formerly known as Facebook) changes rules to curry favour with the incoming Trump administration.
Published By: Michael Adesina
Meta on Tuesday confirmed a significant overhaul of its content moderation practices, effectively ending its fact-checking program and instead relying on users to flag potentially false or misleading information.
This shift in policy is a notable departure from the company’s previous approach, which utilized third-party fact-checkers to verify the accuracy of content on its platforms.
According to Joel Kaplan, Meta’s newly installed global policy chief, the company aims to “undo the mission creep that has made our rules too restrictive and too prone to over-enforcement”.
Mark Zuckerberg, Meta’s CEO, echoed this sentiment, stating that the company is seeking to “get back to our roots around free expression” and acknowledging that the current fact-checking system has resulted in “too many mistakes and too much censorship.”
The new approach, which will be rolled out in the United States in the coming months, is similar to the Community Notes feature used by X (formerly Twitter).
This change in policy has been met with enthusiasm from conservative allies of President-elect Donald J. Trump, who have long criticized Meta’s fact-checking practices as biased against conservative users.
Notably, Meta executives have been in communication with Trump officials regarding the policy shift, and the announcement coincided with an appearance by Joel Kaplan on “Fox & Friends,” a popular show among conservatives.
The company’s decision to remove restrictions on topics like immigration and gender identity, which Zuckerberg deemed “out of touch with mainstream discourse,” has also been seen as an attempt to curry favor with the incoming Trump administration.
However, critics have raised concerns that this shift in policy may lead to an increase in misinformation and “bad stuff” on the platform, as Zuckerberg himself acknowledged.
“The reality is that this is a trade-off,” Zuckerberg stated. “It means that we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”
The company’s trust and safety and content moderation teams will be relocated from California to Texas, in an effort to address concerns around biased content moderation.
Ultimately, the efficacy of Meta’s new approach will depend on how it is enforced and on the company’s ability to strike a balance between free expression and content moderation.