European Commission opens proceedings against TikTok

IRIS 2024-5:1/7

Amélie Lacourt

European Audiovisual Observatory

In April 2024, TikTok launched TikTok Lite in France and Spain, a new app featuring a functionality aimed at users over 18. According to the European Commission, the “Reward Program” allows users to earn points by performing certain “tasks” on TikTok, such as watching videos, liking content, following creators and inviting friends to join TikTok. These points can be exchanged for rewards, such as Amazon vouchers, gift cards via PayPal, or TikTok coins, a virtual currency that can be spent on tipping creators.

Fearing the potential impact of this new programme on the protection of minors and on the mental health of users, in particular the potential stimulation of addictive behaviour, the European Commission sent TikTok a request for information on 17 April 2024, requiring it to provide the risk assessment report for TikTok Lite within 24 hours, and details of the measures the platform had put in place to mitigate the potential systemic risks of these new functionalities by 26 April 2024. TikTok did not comply with the request, and on 22 April 2024 the Commission opened second formal proceedings against the platform under the Digital Services Act (DSA), following the first formal proceedings opened in February 2024. The opening of proceedings empowers the Commission to take further enforcement steps, such as interim measures and non-compliance decisions.

The Commission is concerned that TikTok Lite was launched without a prior diligent assessment of the risks it entails, in particular those related to the addictive effect of the platform, and without effective risk mitigation measures being taken. This is of particular concern for children, given the suspected absence of effective age verification mechanisms on TikTok.

If proven, these failures would constitute infringements of Articles 34 and 35 of the DSA. Article 34 of the DSA requires designated Very Large Online Platforms and Very Large Online Search Engines (VLOPs and VLOSEs) to diligently identify, analyse and assess any systemic risks stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services, including prior to deploying functionalities that are likely to have a critical impact on the risks identified. Pursuant to Article 35 of the DSA, such services must put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified, with particular consideration given to the impact of such measures on fundamental rights.

The Commission prolonged the deadlines for TikTok to send its risk assessment report and information on mitigation measures, but the platform’s failure to submit the required information has led the Commission to suspect an infringement of the DSA. Thierry Breton, Commissioner for the Internal Market, stated that “Unless TikTok provides compelling proof of its safety, which it has failed to do until now, we stand ready to trigger DSA interim measures including the suspension of TikTok Lite feature which we suspect could generate addiction”.

This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.