United Kingdom

[GB] Ofcom fines online discussion platform 4chan for breach of the Online Safety Act

IRIS 2026-4:1/3

Julian Wilkins

Wordley Partnership

On 19 March 2026, Ofcom determined that the online discussion platform 4chan had failed to comply with section 12 of the Online Safety Act 2023 (the Act) by not using effective age assurance systems to prevent children from encountering pornographic content. In addition to the penalty imposed, a daily penalty of GBP 500 would accrue if matters were not corrected by 1 April 2026, for a maximum of 60 days, starting from the day after the compliance deadline.

In addition, Ofcom determined that 4chan had failed, and continued to fail, to comply with its duty under section 9(2) of the Act to carry out a suitable and sufficient illegal content risk assessment in respect of 4chan.org. Ofcom fined 4chan GBP 50 000 for this breach. 4chan had to implement a suitable illegal content risk assessment by 2 April 2026, otherwise a daily penalty of GBP 200 would be imposed for continued non-compliance, for a maximum of 60 days, starting with the day after the requirements deadline.

4chan was also fined GBP 20 000 for breaching section 10(5) of the Act by not explaining in its terms of service how it protects people from criminal content. 4chan had to implement highly effective age assurance by 2 April 2026 or face a daily penalty of GBP 100.

The penalties were imposed pursuant to section 137 of the Act. Ofcom further indicated that the purpose of the Act is to protect people in the UK and that it does not require platforms to restrict what people can see in other countries. Prior to making its decision, Ofcom had investigated the following breaches: failure to respond to a statutory information request; failure to prevent children from encountering pornographic content using highly effective age assurance; failure to complete and keep a record of a suitable and sufficient illegal content risk assessment; and non-compliance with safety duties about illegal content.

Subsequently, on 25 March 2026, Ofcom and the UK data protection regulator, the Information Commissioner’s Office (ICO), published a joint statement outlining the age assurance obligations of online services under the Online Safety Act and the Data (Use and Access) Act 2025 (DUA) where services are likely to be accessed by children. The DUA came into effect on 19 June 2025, amending the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA) and the Privacy and Electronic Communications Regulations (PECR). The statement also took account of other guidance from the regulators, such as Ofcom’s Age Assurance and Children’s Access Statement published on 16 January 2025. It therefore summarises key aspects of existing Ofcom and ICO age assurance policies (including the ICO’s Children’s Code and Ofcom’s Protection of Children Codes) in a practical way to help ensure compliance with online safety and data protection obligations.

Based on this joint statement, if there is a likelihood that a child will access the service, the provider must have a highly effective age assurance process in place. If access is not permitted below a minimum age, there should be an effective age gate to prevent access and unlawful processing under the UK GDPR. Self-declaration by the person accessing the service is not sufficient for determining access based on age. Instead, sufficient data must be provided to determine whether the person meets the minimum age requirement. Any data processing must be necessary, proportionate to the risks, and compliant with data protection laws.

Furthermore, the ICO and Ofcom expect methods and systems to be accurate and robust, thereby preventing attempts to gain access through circumvention. Neither regulator expects a service to use age assurance methods that are technically unfeasible or that introduce risks to rights and freedoms which outweigh the intended benefits. For example, under section 61 and Part 5 of the Act, the publisher or provider of a user-to-user service that may have harmful or pornographic content must use highly effective age assurance measures to prevent children from accessing it.

Where a provider offers a service suitable for children, or for children above a certain age, the obligation is to ensure that the experience is age-appropriate according to the ICO’s Children’s Code and Ofcom’s Protection of Children Codes (collectively referred to as the Codes).

The statement provides further practical examples of how the various legislation, regulations and codes of practice operate in the provision of user-to-user pornography services, including large social media sites that set a minimum age of 13 for accessing content.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.