Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act)

IRIS 2021-2:1/13

Bengi Zeybek

Institute for Information Law (IViR), University of Amsterdam

On 15 December 2020, the European Commission published the much-anticipated Proposal for a Regulation on a Single Market For Digital Services (Digital Services Act, DSA). The proposed DSA introduces a harmonised set of new rules for digital services, including social media, online marketplaces, and other online platforms operating in the European Union. It builds on the Commission’s 2018 Recommendation on measures to effectively tackle illegal content online (see IRIS 2018-4/9). As the Explanatory Memorandum states, the proposed DSA "seeks to ensure the best conditions for the provision of innovative digital services in the internal market", and to "contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services."

Some general remarks can be made on the proposed DSA. First, it is lex generalis in relation to sector-specific acts of EU law regulating online intermediary services, such as the Directive on Copyright in the Digital Single Market (IRIS 2019-4/5) and the Audiovisual Media Services Directive (IRIS 2019-1/3). In addition, it does not define "illegal content" – this is left to member state law or other acts of EU law. Importantly, it does not directly regulate "harmful content". Rather, it seeks to mitigate the societal risks that harmful content may cause through the obligations imposed on "very large online platforms" concerning the management of systemic risks.

In setting out the duties and responsibilities of intermediaries, the proposed DSA adopts a differentiated approach. It introduces a new terminology for Internet intermediaries, having regard to their role, size and impact in the online ecosystem. The overarching term is "intermediary services", which consist of "mere conduit", "caching" and "hosting" services. In addition to the legal framework on the liability of providers of intermediary services, the proposed DSA lays down obligations applicable to all providers of intermediary services, such as the obligation to establish a single point of contact (Article 10) and, where necessary, a legal representative (Article 11); terms of service taking due account of fundamental rights, along with transparency reporting requirements (Articles 12 and 13); and cooperation with national authorities following orders. Hosting service providers are additionally obligated to put in place mechanisms allowing third parties to notify the presence of alleged illegal content (Article 14), and to provide affected users with a statement of reasons for decisions to remove or disable access to their information (Article 15).

Online platforms are introduced as a subcategory of hosting service providers, referring to providers that not only store information provided by the recipients of the service at their request, but also disseminate that information to the public. In addition to the obligations mentioned above, online platforms must set up internal complaint-handling systems in respect of decisions taken in relation to alleged illegal content or information incompatible with their terms and conditions (Article 17); engage with certified out-of-court dispute settlement bodies (Article 18); and treat notices submitted by trusted flaggers with priority (Article 19). Furthermore, they are obligated to publish reports on their activities relating to the removal and disabling of information considered to be illegal content or contrary to their terms and conditions (Article 23). Online platforms must also comply with transparency obligations in respect of online advertising (Article 24).

Within the category of online platforms, "very large online platforms" – those reaching more than 10% of the 450 million consumers in Europe – are subject to specific obligations, having regard to the societal risks their activities may cause. For example, they are obligated to conduct risk assessments of the systemic risks stemming from or relating to the functioning and use of their services (Article 26); to take reasonable and effective measures aimed at mitigating those risks (Article 27); and to submit themselves to independent audits (Article 28). Specific duties are imposed on very large online platforms where they use recommender systems (Article 29) or display online advertising on their online interfaces (Article 30). Importantly, the proposed DSA establishes the conditions under which very large online platforms must provide access to data to the Digital Services Coordinator of establishment or the Commission, as well as to vetted researchers (Article 31).

Furthermore, the proposed DSA contains provisions aimed at facilitating an innovative cooperation process among public authorities to ensure effective enforcement and implementation across the Single Market. Member states are obligated to designate competent authorities, one of which will act as the "Digital Services Coordinator", responsible for the application and enforcement of the proposed Regulation (Article 38). Importantly, the proposal establishes the European Board for Digital Services, an independent advisory group of Digital Services Coordinators (Article 47).

The European Parliament and the member states will now discuss the Commission’s proposal under the ordinary legislative procedure. Once adopted, the new rules will be directly applicable across the European Union.


Related articles

IRIS 2018-4:1/9 European Commission: Recommendation on measures to effectively tackle illegal content online

IRIS 2019-1:1/3 Council of the EU: Publication of the Revised AVMSD in the Official Journal

IRIS 2019-4:1/5 European Parliament: Modernising the EU copyright

This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.