France

[FR] Ahead of the entry into force of the DSA, ARCOM publishes review of resources deployed by online platforms to combat hate content online

IRIS 2023-8:1/7

Amélie Blocman

Légipresse

On 24 July 2023, one month before the European Digital Services Act (DSA) came into force for the largest online platforms and search engines, the Autorité de régulation de la communication audiovisuelle et numérique (the French audiovisual regulator – ARCOM) published its 2022 review of the resources deployed by providers to combat the dissemination of hate content online. Based on providers’ observations and reports submitted in response to a questionnaire, the review describes the various tools and procedures put in place to combat misuse of the largest online platforms.

Anticipating the DSA, which was still being debated at the time, the Law of 24 August 2021 reinforcing respect for the principles of the Republic increased the responsibility of the main online platforms operating in France for combating the dissemination of hateful content through their services. This national measure will be replaced in early 2024 by the DSA, which will apply to all services concerned from 17 February 2024. The new regulation reaffirms and strengthens the system of limited liability for hosted content, while at the same time introducing a series of new obligations for all providers of so-called “intermediary” services (ISPs and social networks in particular) concerning due diligence, transparency, cooperation with public authorities, civil society and users, and the moderation of illegal content.

The report shows that providers have, on the whole, become more transparent, reflecting a tangible willingness to comply with the DSA. It should be noted that, since 25 August 2023, very large online platforms and search engines have been obliged to increase their transparency and to assess and mitigate the systemic risks linked to their services. ARCOM notes that some of them (e.g. Google and TikTok) have found it difficult to collect and submit detailed data in time. It regrets the lack of transparency regarding the moderation teams responsible for combating the dissemination of online hate, as relevant information is still too often withheld under confidentiality rules (e.g. by Meta, Pinterest, Snapchat and TikTok).

The audiovisual regulator also stresses the need for platforms to continue their efforts to make their general terms and conditions clear and accessible, describing this as “both a challenge and a necessity”.

Regarding the key issue of reporting illegal content and contesting moderation decisions, ARCOM notes that, generally speaking, providers have taken steps to improve the reporting mechanisms on their platforms and to involve users. With a few exceptions (Twitter, Pinterest and LinkedIn), they also enable users who do not hold an account to report illegal content. Some services provide an ad hoc form for reporting hateful content under Article 6-4 of the Loi pour la confiance dans l’économie numérique (Law on confidence in the digital economy – LCEN) of 21 June 2004. However, the review points out that, on many platforms, the reporting mechanism is accessed by clicking on a button with an obscure title; ARCOM therefore invites the platforms to ensure that these systems are both accessible and comprehensible. Mechanisms for contesting moderation decisions are available to users almost everywhere, but the number of users who actually contest such decisions varies widely from platform to platform. Among its recommendations, the regulator repeats a suggestion previously made in its guidelines: users should be able to indicate whether they would like to receive updates on the status of their reports.

Finally, most providers work with trusted flaggers. ARCOM notes that these partnerships, which play a key role under the DSA, will be strengthened and given a clear legal framework that will require the platforms to give priority treatment to notifications submitted by their partners.

With 85% of the data requests submitted by the French authorities to identify the potential authors of online hate receiving a favourable response, the high level of cooperation between providers and the judicial and administrative authorities demonstrates an effective criminal-law response to the dissemination of illegal content. The DSA will raise these requirements further: providers will be required to respond promptly to authorities that order them to take action against illegal content or to transmit information about its author, and to justify any refusal to grant such a request. These requests, the reasons for them and the providers’ responses will be the subject of annual public reports.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.