First annual self-assessment reports by signatories to Code of Practice on Disinformation
IRIS 2020-1:1/15
Gionata Bouchè
On 29 October 2019, the European Commission released the self-assessment reports submitted by Facebook, Google, Microsoft, Mozilla and Twitter, as well as by trade association signatories, under the Code of Practice on Disinformation (the Code), which forms part of the Commission's broader Action Plan against Disinformation (see IRIS 2019-1/7). The Commission is currently drafting an evaluation of the overall effectiveness of the Code during its first year of implementation, which will take into account both the reports of the independent bodies assisting the Commission in its analysis and the signatories' efforts to align the transparency standards of their policies against disinformation with the Code's self-regulatory criteria (see IRIS 2019-10/6). The assessments had to be conducted in light of the objectives set out across the Code's five pillars: (1) scrutiny of ad placements, (2) political and issue-based advertising, (3) integrity of services, (4) empowering consumers and (5) empowering researchers.
First, the signatories had to describe the action they had taken to disrupt the advertising and monetisation incentives that facilitate the spread of disinformation, including restrictions on advertising services that do not comply with the platforms' policies. The Commission pointed out that certain efforts had been made to secure a safe environment for ad traders, but stated that "there has been a lack thus far of joined-up efforts by platforms and other stakeholders - including fact checkers, researchers and media - to identify persistent or egregious purveyors of disinformation and to develop indicators for the trustworthiness of media sources, for the development and the deployment of ad scrutiny and brand safety measures".
Second, it observed that "[a]ll platform signatories deployed policies and systems to ensure transparency around political advertising, including a requirement that all political ads be clearly labelled as sponsored content and include a ‘paid for by’ disclaimer." However, "[a]lthough the platforms’ respective definitions of political advertising are in line with the Code, there are notable differences in scope." While some of the platforms have gone as far as banning political ads altogether, transparency around issue-based advertising remains largely neglected.
Third, all platforms showed that they had adopted measures to combat manipulative and inauthentic behaviour, including the removal of millions of fake accounts and the implementation of safeguards against malicious automated activity. However, the Commission stressed that further commitment is needed to counter such activities and preserve the integrity of their services, stating that "more granular information is needed to better assess malicious behaviour specifically targeting the EU and the progress achieved by the platforms to counter such behaviour."
In relation to the last two pillars, several tools have been developed to help consumers evaluate the reliability of information sources and to open up access to platform data for researchers. For example, users of all platforms are now better able to understand and customise their exposure to personalised ads, and researchers may access political ads through the repositories of Google, Twitter and Facebook. Nevertheless, the Commission noted that "platforms have not demonstrated much progress in developing and implementing trustworthiness indicators in collaboration with the news ecosystem" and that "some consumer empowerment tools are still not available in most EU Member States." Finally, the "provision of data and search tools to the research community is still episodic and arbitrary and does not respond to the full range of research needs."
The Commission is now awaiting the reports of the third-party organisation chosen by the signatories and of an independent consultant in order to complete its evaluation. The final results will be presented to the European Parliament and will specifically address the dynamics that emerged during the most recent European Parliament elections.
References
- European Commission, Code of Practice on Disinformation: First Annual Reports - October 2019, 29 October 2019
- https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=62698
- Facebook, Facebook report on the implementation of the Code of Practice for Disinformation, September 2019
- https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=62681
- Google, EC EU Code of Practice on Disinformation: Google Annual Report, September 2019
- https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=62680
Related articles
IRIS 2019-10:1/6 Court of Justice of the European Union: Users must actively consent to cookies
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.