United States of America

[US] Meta terminates its fact-checking programme

IRIS 2025-1:1/2

Amélie Lacourt

European Audiovisual Observatory

In 2016, Mark Zuckerberg, co-founder and CEO of Meta (then Facebook), announced the launch of a fact-checking programme after the platform was accused of spreading mis- and disinformation around the first election of Donald Trump. The programme consisted of partnering with independent third-party fact-checkers to review and rate the accuracy of content posted on the company's social media platforms.

On 7 January 2025, the tech giant announced that it would end these independent fact-checking efforts in the United States and rely instead on a system similar to X's (formerly Twitter's) “Community Notes”, which allows users to add contextual notes to posts they consider incomplete or misleading. Following this announcement, the US media outlets and organisations partnering with Meta on the fact-checking programme saw their collaborations terminated.

While this decision initially concerns only the United States, it has been strongly criticised by technology and media groups and associations in Europe, as it raises crucial questions about the responsibility of digital platforms in the fight against disinformation. Replacing independent experts with a collaborative, user-based moderation system entails risks, and the effectiveness of “Community Notes” has been called into question. Some organisations, such as Reporters Without Borders (RSF) and the European Federation of Journalists (EFJ), have already urged the President of the European Commission to act swiftly to prevent the closure of Meta's fact-checking programme in Europe. According to these organisations, “Meta's decision, if applied on EU territory, would constitute a serious and immediate attack on the integrity of the European information space, as well as on the democratic sovereignty of the Union”.

Mark Zuckerberg's accusation that the fact-checking partners were politically biased has added to the controversy and to fears of an increase in misinformation on Meta's platforms. In response to the accusations of censorship levelled at the EU, the European Commission's chief spokesperson, Paula Pinho, said the Commission “absolutely refute[s] any claims of censorship on [their] side”.

In their open letter to the Commission, RSF and the EFJ requested the opening of an investigation into X's Community Notes system and into the risks of ending Meta's fact-checking in Europe. Since its entry into force, the Digital Services Act (DSA) has required very large online platforms (VLOPs), i.e. platforms with over 45 million monthly active users in the EU, to “identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services” (Article 34 DSA). VLOPs are also required to put in place “reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified” (Article 35 DSA). Should it fail to comply with these obligations, Meta could face sanctions.

The end of Meta's fact-checking programme in the US marks a turning point in online content moderation.

