European Commission opens formal proceedings against X under the Digital Services Act
European Audiovisual Observatory
On 25 April 2023, the European Commission adopted its first designation decision under the Digital Services Act (DSA) based on the user data which platforms had to publish by 17 February 2023 (see: IRIS 2023-5:1/2). With 112 million monthly active users in the EU, X (formerly Twitter) was designated as a Very Large Online Platform (VLOP).
The Commission may decide to open an investigation when it has suspicions of infringements by a VLOP or VLOSE (Very Large Online Search Engine) stemming either from its assessment of the information obtained during its monitoring, or from reliable sources.
On 12 October 2023, the Commission formally sent X a request for information under the DSA. This request followed indications received by the Commission services of the alleged spreading of illegal content and disinformation, in particular the spreading of terrorist and violent content and hate speech in the context of the Hamas terrorist attacks against Israel.
Following its preliminary investigation, the Commission decided to open formal infringement proceedings against X pursuant to Article 66 of the DSA. This decision was further based on an analysis of the risk assessment report submitted by X in September 2023, its transparency report published on 3 November 2023 and its replies to the formal request for information. According to Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age, “The higher the risk large platforms pose to our society, the more specific the requirements of the Digital Services Act are. We take any breach of our rules very seriously. And the evidence we currently have is enough to formally open a proceeding against X. The Commission will carefully investigate X's compliance with the DSA, to ensure European citizens are safeguarded online – as the regulation mandates.”
According to the Commission, the proceedings will focus in particular on:

- Compliance with the DSA obligations related to countering the dissemination of illegal content in the EU, notably in relation to the risk assessment and mitigation measures adopted by X, as well as the functioning of the notice and action mechanism for illegal content in the EU mandated by the DSA, including in light of X's content moderation resources.
- The effectiveness of measures taken to combat information manipulation on the platform, notably the effectiveness of X's so-called "Community Notes" system in the EU and the effectiveness of related policies mitigating risks to civic discourse and electoral processes.
- The measures taken by X to increase the transparency of its platform. The investigation concerns suspected shortcomings in giving researchers access to X's publicly accessible data as mandated by Article 40 of the DSA, as well as shortcomings in X's ads repository.
- A suspected deceptive design of the user interface, notably in relation to checkmarks linked to certain subscription products, the so-called Blue checks.
If proven, these failures would constitute infringements of Articles 34(1), 34(2), 35(1), 16(5), 16(6), 25(1), 39 and 40(12) of the DSA. These formal proceedings will allow the Commission to assess whether X has breached the DSA in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers.
Before adopting a non-compliance decision, a decision imposing fines or a decision imposing periodic penalty payments, the Commission will have to give X the opportunity to be heard on its preliminary findings, including any matters to which the Commission has objected and any measures that the Commission may intend to take in view of those preliminary findings.
- Press release – Commission opens formal proceedings against X under the Digital Services Act
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.