Italy

[IT] AGCOM public consultation on a draft resolution on hate speech

IRIS 2019-4:1/25

Ernesto Apa & Marco Bassini

Portolano Cavallo & Bocconi University

On 22 January 2019, the Italian Communication Authority (AGCOM) launched a public consultation on a draft regulation aimed at fostering the protection of human dignity and the principle of non-discrimination, as well as combating hate speech. The draft regulation applies to both audiovisual media service providers and video-sharing service providers.

Article 1 first sets out certain definitions. Inter alia, “video-sharing service providers” are defined as entities operating a service that, even in part, makes programmes and user-generated videos available to the public via electronic communication networks for informational, educational or entertainment purposes, without exercising any editorial responsibility. Furthermore, according to this definition, video-sharing service providers organise content, including by automatic means or algorithms. Another key definition concerns content amounting to hate speech: instead of carving out an ad hoc concept, the draft regulation provides a blanket definition that refers to the concept of hate speech as enshrined in other legal instruments, namely Recommendation no. 20/1997 of the Committee of Ministers of the Council of Europe, Recommendation no. 15/2016 of the European Commission against Racism and Intolerance, and AGCOM Resolution no. 403/18/CONS.

The draft regulation contains two separate sets of provisions, which apply, respectively, to audiovisual media service providers and video-sharing service providers.

With respect to the rules concerning audiovisual media service providers, the draft regulation establishes general principles requiring providers to ensure the utmost respect for the fundamental guarantees afforded to users in the delivery of entertainment and informational programmes. When broadcasting news or content regarding subjects that are likely to attract discriminatory attitudes, providers must observe specific restrictions designed to prevent the fairness, accuracy and completeness of information from being undermined in a way that could pave the way to hate speech or incitement to hatred.

In particular, Article 5 of the draft regulation lays down criteria that shall be binding on audiovisual media service providers, including:

- taking into account the context and avoiding, inter alia, expressions and images that are likely, directly or indirectly, to spread, incite, promote or justify hatred or other forms of intolerance and discrimination, to harm human dignity, or to lead to violence or crimes against members of certain groups or minorities;

- handling with care news and other content likely to give rise to prejudice or stereotyping;

- when presenting an item of news, paying regard to its specific context in order to avoid generalisations and to distinguish each individual case from others;

- avoiding the circulation of images or information that are inaccurate or misleading and thus liable to give rise to unjustified social alarm;

- promptly and carefully correcting any mistake or inaccuracy in the delivery of news or content regarding groups that are likely to be affected by discrimination;

- promoting best practices for social inclusiveness, integration and diversity.

The public service broadcaster (RAI) bears a special responsibility to ensure respect for these principles.

AGCOM is authorised to monitor compliance by audiovisual media service providers with the requirements listed above. Where proceedings are commenced following a complaint lodged by an affected party, AGCOM shall require the provider concerned not to repeat the conduct at issue. It may also order the provider to publish a statement acknowledging the violation within the same timeframe as that in which the original conduct took place.

As regards video-sharing service providers, the draft regulation requires them to implement measures to combat hate speech and to identify and report users who engage in it. Specifically, a video-sharing platform shall have to report to AGCOM every three months on the measures implemented, specifying the manner in which, and the systems through which, third-party content is monitored.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.