United Kingdom

[GB] Ofcom reports on its first year of VSP regulation

IRIS 2023-1:1/18

Alexandros K. Antoniou

University of Essex

Ofcom, the UK’s communications regulator, has published its first report on video-sharing platforms (VSPs) since becoming the statutory regulator for such platforms established in the UK. The report is the first of its kind under the VSP regime and reveals information not previously published by in-scope regulated companies.

- Platforms’ compliance with the new VSP regime

Ofcom’s report outlines the regulator’s key outcomes from the first year of regulation (October 2021 to October 2022). Its findings stem from the use of the regulator’s statutory powers under section 368Z10(3) of the Communications Act 2003 to issue enforceable information requests to all notified VSPs.

Specifically, some platforms made positive changes to their systems and processes in light of the new VSP requirements, for example TikTok’s dedicated online safety committee providing oversight of content and safety compliance, Snapchat’s parental control feature, and OnlyFans’ age assurance tools for all new UK subscribers. However, Ofcom found that platforms provided limited evidence of how well their user safety measures operate, making it difficult to assess their effectiveness and consistency. It also emerged that some platforms are not adequately resourced, equipped and prepared for regulation, and there is a clear need for some of them to improve the quality of their responses to the regulator’s information requests. Moreover, Ofcom found that platforms did not prioritise risk assessment processes, despite their importance in proactively identifying and mitigating safety risks; risk assessments will, however, be a requirement on all regulated services under future online safety laws that will eventually supersede the VSP regime. Finally, the access control measures of some adult VSPs were not found to be sufficiently robust in preventing children from accessing pornographic content.

Moving into the second year of the regime’s implementation, Ofcom will focus most of its attention on the comprehensiveness of user policies (also known as Community Guidelines), including their application and enforcement; the availability of appropriate tools empowering users to tailor their online experience; and the implementation of suitable age verification (AV) mechanisms to protect children from harmful online content, including pornography.

To increase the transparency of platform processes and raise awareness of how VSPs protect against harmful content, Ofcom’s report also sets out the measures adopted by some platforms to protect their users. The following platforms were reviewed in particular: TikTok, Snapchat, Twitch, Vimeo, BitChute, and some smaller VSPs including Fruitlab, ReCast Sport and Thomas Cook, as well as smaller adult VSPs such as AdmireMe, FanzWorld and Xpanded. The report explains the governance processes within each regulated service (giving detail on their systems for online safety risk management) and the journey followed by users/subscribers on each of these platforms.

- Additional sets of research

Ofcom also made available a report on the VSP Landscape in the UK, describing the context in which providers apply protection measures. The report offers insights into: (a) who the notified VSP providers are; (b) how many users of VSPs there are in the UK and their demographics; (c) what the main business models used by VSP providers are; and (d) what information VSP providers make publicly available in their transparency reports.

With the aim of building its evidence base around the appropriateness of certain protection measures, Ofcom commissioned further sets of research to understand people’s experiences of using (and attitudes towards) safety measures on VSPs. The research explored a range of users’ perspectives, from parents (or carers) of children aged 6-17 to users of porn platforms.

More specifically, the VSP Parental Guidance Research looked at parents’ attitudes towards children’s online behaviours. In summary, it found that parents tended to perceive VSPs generally as having a constant and unregulated stream of content. Based on their current understanding and the information available to them, six in ten parents said they did not use parental controls on the VSPs that their child uses because their child "did not need them". Just over half of parents remembered seeing or receiving guidance on how to keep their child safe online from multiple sources (government websites being the most trusted). However, many parents described the process of finding information on online safety as overwhelming and said it was often prompted only by a specific trigger (e.g., school guidance or discovering their child was looking at inappropriate content). Parents were also appreciative of safety guidance from VSPs that was clear, digestible, accessible and easy to understand.

An additional set of research, i.e., Adult Users’ Attitudes to Age-Verification (AV) on Adult Sites, found that, although there was broad support from adult participants for age assurance measures to prevent under-18s from accessing online pornography, UK adult sites were not doing enough to protect children. The biggest adult video-sharing site, OnlyFans, introduced new age verification in response to regulation (using third-party tools) but smaller sites based in the UK did not have sufficiently robust access control measures. Subscriber sign-on processes show that smaller UK-established adult VSPs have AV measures in place when users sign up to post content, but users can generally access adult content simply by self-declaring that they are over 18. Ofcom’s research showed that 81% of participants accepted AV measures where these were expected in general (e.g., whilst purchasing alcohol online or participating in online gambling). A similar proportion (80%) felt Internet users should be required to verify their age when accessing pornography online, especially on dedicated adult sites. The use of a credit card was the preferred means of AV for paid access to pornography. Serious concerns were expressed by participants about how user data might be processed and stored during AV processes to access pornography, reflecting a very low level of trust in the data privacy practices of adult sites.

These findings will inform Ofcom’s regulation of VSPs, including the rules on the protection of children, and its engagement with notified providers.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.