Meta, the parent company of Facebook, Instagram, Messenger, WhatsApp, and Threads, has long collaborated with third-party fact-checking organizations to combat misinformation. However, Meta's CEO, Mark Zuckerberg, recently announced that the company will end its third-party fact-checking program and relax its content moderation policies.
This decision has sparked widespread concern among fact-checking organizations and users alike. Seventy-one organizations affiliated with the International Fact-Checking Network (IFCN) have sent an open letter to Zuckerberg objecting to the policy shift.
The letter emphasized that fact-checking is fundamental to evidence-based discussion and a shared reality. It argued that Meta's decision would limit billions of users' access to accurate and reliable information and would financially strain smaller fact-checking organizations that depend on Meta's funding. Meta currently works with fact-checking organizations in more than 100 countries, across many languages and cultures.
Under the new policy, Meta plans to replace professional fact-checkers with a crowdsourced moderation system. Modeled on the "Community Notes" feature on Elon Musk's X (formerly Twitter), the system would let users attach context or corrective notes to posts, with notes shown more widely once other users rate them as helpful. Meta claims this move will enhance transparency and increase user participation in fact-checking.
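To make the contrast with professional review concrete, the sketch below shows one way such a crowdsourced system can be thought of: a note is surfaced only when raters who usually sit in different viewpoint groups agree it is helpful. This is a simplified illustration, not Meta's or X's actual algorithm; the rater clusters, thresholds, and function names are hypothetical, and real systems infer rater viewpoints statistically rather than taking them as given.

```python
# Toy "bridging consensus" sketch (hypothetical, for illustration only).
# A user-written note on a post is shown only if raters from more than one
# viewpoint cluster rate it helpful -- a crude stand-in for the idea behind
# Community Notes-style crowdsourced moderation.
from collections import defaultdict

# Hypothetical data: rater -> rough viewpoint cluster. Real systems would
# estimate this from rating history (e.g. matrix factorization), not declare it.
RATER_CLUSTER = {"alice": "A", "bob": "A", "carol": "B", "dave": "B", "erin": "B"}

def note_is_shown(ratings: dict[str, bool], min_per_cluster: int = 1,
                  min_helpful_share: float = 0.6) -> bool:
    """Show a note only if it is rated helpful overall AND by raters
    from at least two different viewpoint clusters."""
    if not ratings:
        return False
    helpful_share = sum(ratings.values()) / len(ratings)
    helpful_by_cluster = defaultdict(int)
    for rater, helpful in ratings.items():
        if helpful:
            helpful_by_cluster[RATER_CLUSTER.get(rater, "unknown")] += 1
    clusters_agreeing = sum(1 for n in helpful_by_cluster.values() if n >= min_per_cluster)
    return helpful_share >= min_helpful_share and clusters_agreeing >= 2

# A note endorsed across both clusters is shown; one endorsed by a single cluster is not.
print(note_is_shown({"alice": True, "bob": True, "carol": True, "dave": False}))  # True
print(note_is_shown({"alice": True, "bob": True, "erin": False}))                 # False
```

The point of the toy example is simply that crowd-based systems surface whatever consensus the crowd produces, whereas professional fact-checkers apply editorial standards regardless of how ratings fall.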
Critics argue that a system relying on ordinary users cannot match the expertise and accuracy of professional fact-checkers, and that the change will make misinformation harder to combat. Many instead call for a hybrid approach that pairs professional fact-checkers with user contributions.