On January 7, 2025, Meta Platforms, the company behind Facebook, Instagram, and WhatsApp, announced the termination of its third-party fact-checking program. Chief Executive Officer Mark Zuckerberg stated, “We’ve reached a point where it’s just too many mistakes and too much censorship,” referring to the program that had been in place since 2016.
Zuckerberg explained that the company will implement a community-driven moderation system similar to Community Notes on X (formerly Twitter). This approach aims to empower users to identify and address misinformation collaboratively.
However, the decision has faced criticism from various groups who argue that relying on user-driven systems may compromise the accuracy and effectiveness of content moderation.
This policy shift has raised concerns in the Philippines, where social media plays a significant role in information dissemination. The National Union of Journalists of the Philippines (NUJP) criticized Meta’s decision, warning that it could worsen the rampant spread of misinformation and disinformation.
Low media and information literacy (MIL) in the Philippines worsens the issue. MIL refers to the ability to access, analyze, evaluate, and create information across various media platforms. According to the 2019 Functional Literacy, Education, and Mass Media Survey (FLEMMS) by the Philippine Statistics Authority, 91.6% of Filipinos aged 10 to 64 are functionally literate, but this metric primarily covers basic reading and writing skills, not the critical thinking needed to assess online content effectively.
Misinformation (spreading false information without malicious intent) and disinformation (intentionally spreading false information) are pervasive in the Philippines. For example, during the 2022 elections, false claims favoring specific candidates circulated widely on Facebook, influencing voter perceptions.
Similarly, during the COVID-19 pandemic, anti-vaccine propaganda proliferated on the platform, creating confusion and undermining public health efforts.
The shift to a community-driven moderation system raises questions about its effectiveness in combating these issues. Without professional fact-checkers, users bear the burden of identifying false information, a task that demands strong media literacy skills.
In a country where MIL education is still developing, this could allow misinformation to spread unchecked, significantly shaping public opinion and behavior. Organized groups could also exploit community moderation to amplify specific narratives, further eroding trust in online platforms.