European Commission publishes guidelines on the protection of minors under the DSA
IRIS 2025-8:1/6
Eric Munch
European Audiovisual Observatory
On 14 July 2025, the European Commission published its guidelines on the protection of minors under the DSA, to ensure a safe online experience for children and young users.
The guidelines take a risk-based approach, acknowledging that platforms may pose different risks to minors depending on their size, purpose and user base, and apply to all online platforms accessible to minors. Micro enterprises (fewer than 10 employees and an annual turnover and/or annual balance sheet total not exceeding EUR 2 million) and small enterprises (fewer than 50 employees and an annual turnover and/or annual balance sheet total not exceeding EUR 10 million) are exempt, unless their platform has been designated by the Commission as a very large online platform (VLOP) under Regulation (EU) 2022/2065 (the Digital Services Act – DSA).
Recommendations made in the guidelines include:
- the use of accurate, reliable, robust, non-intrusive and non-discriminatory age assurance methods to restrict minors’ online access to age-inappropriate content (such as gambling and pornography) – according to the guidelines, age assurance comprises self-declaration, age estimation and age verification methods;
- the setting of the accounts of minors to private by default, in order to hide their personal information, data and social media content from strangers;
- the modification of platforms’ recommender systems to lower the risk of children encountering harmful content and getting stuck in rabbit holes of specific content;
- the possibility for children to block and mute any user and assurance that they cannot be added to groups without their explicit consent, in order to help prevent cyberbullying;
- the prevention of other accounts from downloading or taking screenshots of content posted by minors, in order to counter the unwanted distribution of sexualised or intimate content and sexual extortion;
- the disabling by default of features contributing to excessive use (such as “streaks” or “ephemeral content”), as well as the removal of persuasive design features aimed predominantly at engagement and the introduction of safeguards around AI chatbots integrated into online platforms;
- the assurance that children’s lack of commercial literacy is not exploited, for example through certain virtual currencies or loot boxes, which can lead to unwanted spending or addictive behaviours;
- the introduction of measures to improve moderation and reporting tools, with prompt feedback to users, and of minimum requirements for parental control tools.
The Commission will use the guidelines as a reference when assessing compliance with Article 28(1) of the DSA, that is, whether online platforms accessible to minors meet the required standard of privacy, safety and security.
References
- Communication to the Commission - Approval of Guidelines on measures to ensure a high level of privacy, safety and security for minors online
- https://ec.europa.eu/newsroom/dae/redirection/document/118226
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.