Ireland

[IE] Irish Media Commission determines TikTok, X and Instagram are exposed to terrorist content

IRIS 2024-10:1/7

Eric Munch

European Audiovisual Observatory

On 13 November 2024, the Irish national media regulatory authority, Coimisiún na Meán (the Commission), announced that it had determined that TikTok, X and Instagram were "exposed to terrorist content".

The decision was reached in accordance with its Decision Framework on Hosting Service Provider Exposure to Terrorist Content published in June 2024. The Decision Framework defines "terrorist content", in its introduction, as content that incites, solicits, threatens or instructs on the commission of terrorist offences. A longer, more detailed definition is included in the annex to the document, as well as a reminder of the content of Article 3(1) of Directive (EU) 2017/541 on combating terrorism, which establishes what acts can be defined as terrorist offences, based on their nature and context.

The Commission was designated as a competent authority under Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online (TCOR) to oversee the specific measures taken by a hosting service provider within Irish jurisdiction when it is deemed to be exposed to terrorist content. Exposure to terrorist content is determined on the basis of objective factors, such as the hosting service provider having received two or more final removal orders over the previous 12 months. At this stage, the Commission takes a preliminary decision and engages with the hosting service provider, initially by way of a letter informing it that the Commission has become aware that it is exposed to terrorist content and granting it three weeks to submit comments to the Commission.

The Commission then takes a decision, based on all the information at its disposal, determining whether the hosting service provider is indeed exposed to terrorist content. On the basis of this decision, the hosting service provider may be required to take specific measures to prevent the dissemination of terrorist content. These include taking appropriate technical and operational measures to identify and expeditiously remove or disable access to terrorist content, setting up easily accessible and user-friendly mechanisms for users to report or flag alleged terrorist content, and reporting to the Commission on the measures taken.

TikTok, X and Meta (as the provider of Instagram) will now be obliged to take specific measures to protect their services from being used for the dissemination of terrorist content and to report to the Commission on those measures within three months of receiving the decision. Should the Commission consider that the measures do not comply with legislative requirements, it will address a further decision to the hosting service provider, requiring it to ensure that the legislative provisions are complied with. Infringements of the TCOR by hosting service providers can lead to administrative fines, including financial penalties of up to 4% of global turnover.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.