SAN FRANCISCO, Jan. 12 (Xinhua) -- U.S. social media giant Meta announced Tuesday that it is ending its "Third-Party Fact-Checking" program and replacing it with a "Community Notes" system for content management. The shift will begin in the United States.
Supporters believe this step signifies Meta's attempt to strike a balance between freedom of expression, information authenticity and platform accountability. However, critics argue that this move could reopen the floodgates for misinformation such as fake news, hate speech and conspiracy theories.
WHAT IS "THIRD-PARTY FACT-CHECKING"?
The "Third-Party Fact-Checking" program was a content-review and management initiative launched by Meta's predecessor, Facebook, in 2016. It aimed to combat the spread of misinformation on its platform by collaborating with independent third-party organizations to verify the authenticity of content.
The program was introduced in response to widespread criticism during the 2016 U.S. presidential election, when the platform was accused of allowing fake news to proliferate.
Based on the fact-checkers' findings, content was categorized as false information, partially false information or lacking context.
Misinformation was flagged and accompanied by warnings or context links with factual background. The reach of such content was restricted, and users were educated about the inaccuracies. Repeat violators faced reduced account visibility or harsher penalties, such as comment restrictions or account suspensions.
WHY CANCEL?
While fact-checking is a vital tool in combating misinformation, benefiting information dissemination and social governance, its implementation at Meta has sparked controversy, with concerns over bias, misjudgment and restrictions on free speech.
Meta CEO Mark Zuckerberg said in a video on Tuesday that "fact-checkers have been too politically biased" and have "destroyed more trust than they created."
The company said it removed millions of pieces of content daily in December, but admitted that one to two out of every 10 of these actions "may have been mistakes," meaning the removed content "may not have actually violated its policies." Meta said it would expand its transparency reporting to share figures on these mistakes regularly so that people can track its progress.
When announcing the decision to end the program, Meta said its content management system and rules were overly complex and over-enforced, scrutinizing too many trivial issues and restricting legitimate political debates. "Too much harmless content gets censored," said Joel Kaplan, Meta's chief global affairs officer, adding that too many people are wrongly locked up in the "Facebook jail."
Canceling the third-party fact-checking program marks a shift in Meta's content management strategy, aiming to balance the fight against misinformation with the preservation of free expression. This change is expected to impact platform operations, user experience, public discourse and the broader information ecosystem.
WHAT IS "COMMUNITY NOTES"?
Meta announced that it would replace the "Third-Party Fact-Checking" program with a more open and decentralized "Community Notes" system. It draws inspiration from a similar system implemented on X, where it has reportedly been successful in mitigating bias.
"Community Notes" is a content management approach that involves community participation in reviewing content and supplementing it with background information. It aims to provide more comprehensive and transparent judgments and interpretations of online content through a diverse base of users.
According to Meta, "Community Notes" require users with different viewpoints to reach a consensus. This bottom-up approach contrasts with the platform-led review process, helping to reduce bias and enhance information transparency.
Meta plans to phase in Community Notes in the United States over the next couple of months, and will continue to improve the system throughout the year.
Industry experts point out that while canceling the fact-checking mechanism may restore more space for political and mainstream topic discussions and allow users to access less-curated content, it could lead to challenges in content governance if the "Community Notes" model proves ineffective. The absence of robust oversight could exacerbate the spread of misinformation, further destabilize the information ecosystem, erode public trust, and intensify political, social and cultural divisions in the United States.
GLOBAL REACTIONS
"We're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms," said Zuckerberg.
Reuters reported that this move is Meta's most significant shift in its political content management approach in recent years. It is seen as a concession to conservative criticism and an attempt to mend fences with the incoming Trump administration.
U.S. President-elect Donald Trump, banned from Facebook following the Jan. 6 Capitol riot in 2021, had accused the platform of being "an enemy of the people" and of censoring conservative voices. His account was reinstated in 2023.
In response to Meta's new content moderation policy, Trump praised Zuckerberg's decision at a news conference and said that Meta had made significant progress, suggesting the change might be a response to his prior criticism of the company. Elon Musk also praised the decision. "This is cool," he posted on his X platform.
U.S. President Joe Biden blasted Meta Friday, calling the move "really shameful."
The International Fact-Checking Network (IFCN) warned of devastating consequences if Meta broadens the policy change beyond the United States to other countries.
"Some of these countries are highly vulnerable to misinformation that spurs political instability, election interference, mob violence and even genocide," IFCN said in an open letter to Zuckerberg.
The French Foreign Ministry expressed concern over Meta's decision to end third-party fact-checking. It emphasized in a statement that freedom of expression, a fundamental right protected by France and other European countries, should not be confused with a right to virality, which would permit the dissemination of inauthentic content to millions of users without any filtering or moderation.
The ministry noted that the shift is currently limited to the United States, but said France remains vigilant and committed to ensuring that Meta and other platforms comply with their obligations under European law.
The U.S. nonprofit organization Accountable Tech said on social media that the Internet has not embraced Meta's latest move. The company cares first and foremost about maximizing profit, "even if it means sacrificing user safety, quality content and our shared sense of truth."
"Zuckerberg is re-opening the floodgates to the exact same surge of hate, disinformation and conspiracy theories that caused Jan. 6th -- and that continue to spur real-world violence," said Nicole Gill, the organization's founder and executive director. "The world will be far more dangerous as a result."■