Latest reports under the Code of Practice on Disinformation detailing platform measures during the 2024 European Parliament elections
IRIS 2024-9:1/14
Ronan Ó Fathaigh
Institute for Information Law (IViR)
On 24 September 2024, signatories of the Code of Practice on Disinformation published reports detailing their actions to combat the spread of disinformation online, with a particular focus on the European Parliament elections held in June 2024. The Code of Practice on Disinformation was first adopted in 2018 by various industry players, including online platforms, as a self-regulatory standard to tackle online disinformation in the EU; the Code was revised in 2022, and currently has 34 signatories (IRIS 2021-6/4). The Code of Practice contains 44 commitments and 128 specific measures, covering various areas including demonetising the dissemination of disinformation; guaranteeing transparency of political advertising; enhancing cooperation with fact-checkers; and facilitating researchers' access to data.
In September 2024, over 14 reports by various signatories were published; they are available on a dedicated Transparency Centre website. Three examples are worth highlighting to illustrate the measures taken under the Code of Practice during the 2024 European Parliament elections.
First, a notable measure in Google's report was the launch of a "prebunking initiative" (a technique that forewarns people of attempts to manipulate them, equipping them to spot and refute misleading claims and building resilience) ahead of the 2024 European Parliament elections. The initiative used short video ads on social media in France, Germany, Italy, Belgium and Poland to explain techniques used to advance disinformation, including decontextualisation, scapegoating and discrediting. Google stated it was the "largest ever prebunking initiative on social media in the world to date", and the videos were translated and available across EU languages, as well as Arabic, Russian and Turkish.
Second, Meta's report detailed how it connected people with information about the election in their member states through in-app "Voter Information Units" and "Election Day Information", with which "users engaged more than 41 million times on Facebook and more than 58 million times on Instagram". Meta also detailed how it connected 23 national election authorities, as well as other competent bodies, to a dedicated "reporting channel", and 13 national Digital Services Coordinators to Meta's "government reporting channels" for reporting content. Notably, from January to June 2024, Meta reported having removed over 4.4 million ads from Facebook and Instagram in EU member states, of which over 170 000 ads were removed for violating its "misinformation policy".
Third, TikTok's report detailed how it established a "Mission Control Centre" to provide "consistent and dedicated coverage of potential elections-related issues" and participated in the Code of Practice on Disinformation's "Rapid Response System" to "streamline the exchange of information among civil society organisations, fact-checkers and platforms". TikTok also reported fact-checking coverage for "at least one official language of every EU member state" and the launch of "localised media literacy campaigns".
The European Commission’s Vice-President for Values and Transparency stated, in relation to the latest reports, that this “cooperation is a testament of the Code as an efficient instrument bringing together online platforms, civil society and fact-checkers”.
References
- European Commission, Online platforms report on measures to protect European election integrity under the Code of Practice on Disinformation, 24 September 2024, https://digital-strategy.ec.europa.eu/en/news/online-platforms-report-measures-protect-european-election-integrity-under-code-practice
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.