Guidance Note on countering the spread of online mis- and disinformation

IRIS 2024-2:1/5

Urška Umek

Council of Europe, Directorate General of Democracy and Human Dignity

At its 24th Plenary meeting (29 November - 1 December 2023), the Steering Committee on Media and Information Society (CDMSI) adopted a Guidance Note on countering the spread of online mis- and disinformation through fact-checking and platform design solutions in a human rights compliant manner, elaborated by its subordinate body, the Committee of Experts on the integrity of online information (MSI-INF).

The Guidance Note responds to the proliferation of various kinds of mis- and disinformation in the digital space, which makes it more challenging to maintain the integrity of elections, ensure healthy pluralism, and protect the democratic process from manipulation. Malicious actors, including some foreign governments, spread disinformation online to disrupt free and fair elections and to undermine the very notion that facts matter to democracy and can be meaningfully identified and discussed. In addition, a growing amount of disinformation generated and spread with AI tools poses distinctive threats to democratic dialogue. The quality of public debate is also threatened by the propagation of false information by individuals who believe it to be true and share it in good faith.

The Guidance Note proposes strategies to counter online mis- and disinformation in a manner that is compliant with human rights standards. These strategies focus on three areas of action: fact-checking, platform design and user empowerment, which may all contribute, at different levels and in different ways, to reducing the spread and negative impacts of disinformation.

Regarding the first, the Guidance Note stresses the centrality of fact-checking as a key institution of public debate and calls for the independence of fact-checking organisations vis-à-vis states and other stakeholders with potential interests. To foster user trust, fact-checking processes should be transparent and carried out by independent fact-checking organisations. Furthermore, support from both states and digital platforms is vital to ensure the financial sustainability of fact-checking. Given the importance of fact-checking for public watchdog activities in democratic societies, such checks require continuous quality control. In this connection, it is also important for online platforms to integrate external fact-checking into their content curation systems.

Platform design should involve "human rights by design" and "safety by design" features, to promote and protect human rights and fundamental freedoms. Both states and online platforms, in designing their regulatory frameworks and implementing self-regulatory policies around platform design, should first conduct and publish human rights impact assessments, with concrete measures to prevent or mitigate risks arising from the proposed interventions.

Platform design should focus on the processes through which online platforms rank, moderate and remove content, rather than on the content itself, and granular content moderation techniques should be employed. It is, furthermore, important that online platforms invest in moderators working in languages other than English, with an in-depth understanding of different cultural contexts, so that their content moderation tools function at similar levels of efficacy across different languages. That said, any state regulatory frameworks, including co-regulation, targeting platform design in relation to mis- and disinformation should contain requirements that are proportionate to the level of risk posed by a platform's operations, based on criteria such as size (e.g. number of users and capitalisation), resources (e.g. technical and economic means) and reach (e.g. potential impact on audiences).

As part of their policies on countering the spread of online mis- and disinformation, states and platforms may, where necessary, and consistent with the right to freedom of expression, introduce appropriate and proportionate measures to ensure the prominence of public interest content online, particularly content produced by reliable and professional news organisations, in line with the principles set out in the Guidance Note on the Prioritisation of Public Interest Content Online (CDMSI(2021)009).

Finally, the section on user empowerment proposes a number of measures for building user resilience to disinformation, including by enhancing the capacity for collective action within communities. Collaboration between states, civil society organisations, platforms, public service media, news organisations, fact-checkers, user communities and other actors is necessary to maximise the impact of user empowerment initiatives.

This Guidance Note contains recommendations for member states of the Council of Europe, for online platforms, which have human rights responsibilities of their own, and for other stakeholders engaged in addressing the spread of mis- and disinformation online, in particular news organisations and fact-checkers.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.