United Kingdom

[GB] Implementation of age verification requirements under the Online Safety Act 2023

IRIS 2025-7:1/6

Alexandros K. Antoniou

University of Essex

The UK’s communications and online safety regulator, Ofcom, has announced new requirements obliging all online platforms that host pornographic content to deploy ‘highly effective’ age assurance measures by 25 July 2025. This regulatory shift stems from the Online Safety Act 2023 and aims to prevent minors from accessing harmful content, including pornography, while safeguarding adult users’ right to access legal material.

Scope of application, implementation and compliance requirements

The new age verification regime will apply broadly across the digital landscape, encompassing dedicated adult content websites, as well as platforms where pornography might appear incidentally, including social media, dating platforms, and gaming services. This marks a significant tightening of the UK’s regulatory approach to content accessibility and positions age assurance as a central pillar of child protection online.

The introduction of these measures is grounded in new empirical evidence gathered by Ofcom, which indicates that 8% of children aged 8–14 in the UK accessed pornographic content online within a single month. Notably, 3% of children as young as 8–9 years old were exposed to such material. The data also highlights significant gender disparities, with 19% of boys aged 13–14 having visited pornographic services compared to 11% of girls in the same age group. These findings prompted urgent calls for intervention. Ofcom reports that 80% of UK adults support age verification for online pornography as a means to protect children.

Under the new framework, services must ensure that pornographic content cannot be accessed before users’ ages are reliably verified. Specifically, Ofcom requires that age check processes are ‘technically accurate, robust, reliable and fair’. Passive methods such as tick-box declarations or user self-certification will no longer be permissible.

Ofcom defines ‘highly effective’ age assurance to include both age verification and age estimation techniques. Acceptable methods include, for instance:

· Facial age estimation using biometric technology;

· Open banking data to verify age through financial institutions;

· Digital identity wallets;

· Credit card validation as proof of adult status;

· Email-based age estimation leveraging linked account activity;

· Mobile network operator checks confirming that no age-restriction filter applies to the user’s account;

· Photo ID matching combining document upload with facial comparison.

Providers may implement these directly or via specialist third-party age assurance services. They are also required to publish clear information about the nature of the checks used and how user data is handled.
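
To make the compliance model concrete, the sketch below (in Python) shows one possible shape of an age-gated access flow that delegates verification to a third-party service. It is illustrative only: the AgeAssuranceProvider class and its verify method are hypothetical stand-ins for a vendor’s API, since neither the Act nor Ofcom prescribes a particular integration.

from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    verified: bool   # the check completed successfully
    is_adult: bool   # the user is confirmed to be 18 or over
    method: str      # e.g. "facial_estimation", "photo_id_matching"

class AgeAssuranceProvider:
    """Hypothetical stand-in for a specialist third-party age assurance API."""
    def verify(self, session_token: str) -> AgeCheckResult:
        raise NotImplementedError  # supplied by the vendor integration

def serve_adult_content(session_token: str, provider: AgeAssuranceProvider) -> str:
    # Content must be unreachable until age is reliably established; a
    # self-declaration ("tick-box") path is deliberately absent, as it
    # would not meet Ofcom's 'highly effective' standard.
    result = provider.verify(session_token)
    if not (result.verified and result.is_adult):
        return "403: age not verified"
    return "200: content served"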

Since age assurance measures involve the handling of sensitive personal data, Ofcom emphasises that they must comply with UK data protection legislation. Oversight lies with the Information Commissioner’s Office (ICO), which will coordinate with Ofcom to ensure that users’ privacy rights are upheld. Online services must maintain internal documentation demonstrating how their practices comply with data protection rules and minimise risks to personal information. The framework is designed to mirror existing societal norms around age-restricted goods in the offline environment, such as alcohol and tobacco, with suitable adjustments for the digital context.

Industry adoption and enforcement

In advance of the deadline, major pornographic platforms with UK audiences, including Pornhub (the most visited pornographic service in the UK), RedTube and YouPorn, have confirmed their commitment to implementing compliant age checks. Several smaller platforms have reportedly followed suit, reflecting broader industry alignment with the regulatory direction.

In May 2025, ahead of the deadline, Ofcom launched investigations into pornography providers suspected of non-compliance, signalling a more proactive enforcement stance. The regulator has the authority to impose substantial financial penalties for non-compliance: up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. In more serious cases, Ofcom may seek court orders against third parties, including internet service providers, which could lead to access to infringing sites being restricted or blocked within the UK.
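
For illustration, the statutory cap works as the greater of two figures, which can be expressed in a few lines of Python (the revenue figures below are invented):

def max_penalty_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    # The greater of GBP 18 million or 10% of qualifying worldwide revenue.
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

max_penalty_gbp(50_000_000)     # -> 18000000 (the fixed £18m floor applies)
max_penalty_gbp(1_000_000_000)  # -> 100000000.0 (10% of revenue exceeds the floor)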

Extension to broader harmful content

The regulatory measures are not limited to pornography. From the same date, 25 July 2025, a wider range of online services, including social media and gaming platforms, must take steps to prevent children from encountering other categories of harmful content, such as material promoting suicide, self-harm and eating disorders.

Platforms that rely on algorithmic recommender systems and pose a medium or high risk of harm to children must configure those systems so that harmful content is not served to child users. The highest-risk services must incorporate highly effective age assurance to distinguish adult users from minors and tailor their content moderation practices accordingly.
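
A minimal sketch of what such a configuration could look like follows, assuming (purely for illustration) that each candidate item carries a harmful_to_children flag derived from the categories Ofcom identifies, such as suicide, self-harm and eating-disorder content:

def filter_for_child_users(candidates: list[dict], user_confirmed_adult: bool) -> list[dict]:
    # Adults confirmed via highly effective age assurance see the full feed;
    # everyone else is treated as a potential child user by default.
    if user_confirmed_adult:
        return candidates
    return [item for item in candidates if not item.get("harmful_to_children", False)]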

Providers may either follow Ofcom’s Children’s Codes or take alternative action that demonstrably meets the legal standard of care to mitigate the risks their services pose to children.

Ofcom has committed to publishing a detailed report in 2026 evaluating the effectiveness of current age verification technologies and providing further regulatory guidance. This ongoing review is intended to ensure that the legal framework remains effective, proportionate, and responsive to both industry capabilities and societal expectations.



This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.