United Kingdom

[GB] Ofcom paper “Addressing Harmful Online Content”

IRIS 2018-10:1/17

Julian Wilkins

Wordley Partnership and Q Chambers

On 18 September 2018, Ofcom published its discussion paper entitled “Addressing Harmful Online Content” (the report), which considers ways to regulate social media, especially in the context of children and young people, whilst respecting freedom of expression. The report follows the July 2018 interim report of the House of Commons Digital, Culture, Media and Sport (DCMS) Committee entitled “Disinformation and ‘fake news’” (see IRIS 2018-8:1/27). Furthermore, this coming winter, the UK Government will publish a White Paper setting out its intention to legislate in order to improve online safety. Ofcom’s report flagged other UK parliamentary activity concerning the Internet, including the House of Lords Communications Committee inquiry entitled “The Internet: to regulate or not to regulate?”, whilst the House of Commons Science and Technology Committee is conducting an inquiry into the impact of social media and screen use on young people’s health.

Ofcom’s report offers policymakers and legislators an insight into the current regulation of content standards for broadcast and on-demand video services, and into how these standards could be adapted to prevent harmful online content.

The report recognised that traditional broadcasting and online services were converging, but that various aspects of online content were either partially regulated or not regulated at all. Joint research by Ofcom and the Information Commissioner’s Office (ICO) revealed that seven out of ten UK adult Internet users reported concerns about harmful content or conduct online, and that a quarter said they had directly experienced some harm.

The report recognised that the sheer scale of text, audio and video generated or shared online far exceeded the output of broadcast TV and radio, making pre-publication regulation more difficult. Online content encouraged a variety of voices and opinions, and, unlike with traditional broadcasters, the public did not necessarily expect online content to be impartial.

Regulation of news and comment content may centre on transparency, so that it is clear where content on a platform comes from and whether it can be trusted. Online platforms do not commission or create content, and the sheer quantity of material may lead to regulation focusing on how quickly a platform addresses a complaint. Online viewers nonetheless expect protection in areas such as the protection of minors and protection from illegal content.

Ofcom considered that certain principles would assist policymakers in determining online protection, such as upholding freedom of expression and allowing rules to adapt over time to reflect changing technology as well as evolving consumer behaviour and expectations. Public expectations of protection, or of freedom of expression, in conversations between individuals may be very different from their expectations of traditional broadcasters and publishers. Careful consideration of the content’s context is likely to be critical to an effective, proportionate online regulatory regime, including the application of sanctions.

The regulator needs to be independent in order to build trust and credibility with the public. Ofcom also needs to build close relationships and common standards with regulators in other jurisdictions, especially given the power and global influence of Internet platforms. Ofcom works with European bodies such as EPRA (the European Platform of Regulatory Authorities) and ERGA (the European Regulators Group for Audiovisual Media Services) to monitor developments and to promote coordination and co-operation concerning online content.

Regulation will need to be flexible to cater for changing technology and services. Media literacy, that is, people’s understanding and awareness of online issues, can also help to prevent harm from online content. Ofcom has organised a conference in early 2019 for UK and international regulators, and will work closely with the UK Government, the ICO, the Competition and Markets Authority and the Advertising Standards Authority.

Ofcom’s report identified current initiatives such as the revised Audiovisual Media Services (AVMS) Directive, which extends some regulatory standards to video-sharing platforms such as YouTube and other social media services. Germany and Australia have introduced legislation requiring platform providers to remove certain types of illegal content within a specified period of its being identified by users, with sanctions applying for repeated compliance failures. France has introduced measures to target mass disinformation during general elections. The Trust Project, an initiative of 75 news organisations, kitemarks trusted sites and provides for platforms to intervene, including by verifying content, where a site is being manipulative or deceptive with its content, including with highly visible trending items. The scheme involves Google, Bing, Facebook and Twitter.

Ofcom recognised that the scope and design of new legislation are a matter for government and parliament, but hopes that its discussion paper will help policymakers to curtail harmful aspects of the Internet whilst preserving the Internet’s benefits to society, culture, trade and freedom of expression.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.