France

[FR] Fight against fake news: first CSA report published

IRIS 2020-9:1/6

Amélie Blocman

Légipresse

On 10 September, the European Commission presented its evaluation report on the Code of Practice on Disinformation, which was launched at the end of 2018. The report takes into account the annual self-evaluation reports of the platforms that have signed the code (Google, Facebook, Twitter, Microsoft, Mozilla and TikTok). These platforms have set up policies aimed in particular at reducing advertising opportunities and economic incentives for distributors of online disinformation and at creating functionalities that give prominence to reliable information. The report notes that the quality of the information provided by the platforms concerned remains inadequate, mainly because the code is based on self-regulation. Related measures are expected by the end of the year, in particular within the framework of the Digital Services Act (DSA), which is currently being drafted.

At the end of July, the Conseil supérieur de l’audiovisuel (the French audiovisual regulator – CSA) had published its first summary of the measures taken in 2019 by online platform operators to combat the dissemination of fake news. Under the law of 22 December 2018, the CSA was required to prepare a summary of the application of these measures, based on the annual declarations submitted by the main operators. Each operator reported on the measures it had taken in accordance with the CSA’s recommendation of 15 May 2019 under its cooperation obligation. The 16 platforms concerned each have at least 5 million unique users per month and include search engines, social networks, video and audio sharing platforms, online forums and online encyclopaedias.

The CSA noted that, generally speaking, the operators had risen to the challenge of combating the dissemination of fake news. Nevertheless, the measures needed to be stepped up. All but one of the operators had set up a mechanism for reporting fake news likely to disturb public order or harm the integrity of a vote, as required by law. While most of the platforms used algorithms to organise their content, the CSA criticised a lack of transparency surrounding how these algorithms worked. It also encouraged the promotion of reliable content, in particular that published by press companies and agencies and audiovisual communication services. Finally, the CSA set out recommendations on combating accounts that spread fake news on a massive scale, and on media and information literacy.

This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.