United Kingdom

[GB] Ofcom guidance to broadcasters on synthetic media

IRIS 2023-6:1/26

Alexandros K. Antoniou

University of Essex

On 3 April 2023, Ofcom (the UK’s independent communications regulator) issued guidance to broadcasters on the use of “synthetic media”, including “DeepFakes”. The regulator stated that its Broadcasting Code is adequately equipped to help maintain trust and fairness in programming and reminded licence holders to give thoughtful consideration to their compliance processes.

Ofcom understands the term “synthetic media” to mean video, images, text or voice produced (fully or partly) with the help of artificial intelligence algorithms. Synthetic media is becoming more widespread on the internet and is also used in various types of digital media, such as virtual reality, augmented reality and gaming. It is increasingly employed in industries such as marketing, advertising and entertainment, including film-making and broadcasting.

As this technology advances rapidly, synthetic media is expected to become even more prevalent in broadcast content. In its guidance, Ofcom recognised that broadcasters could reap significant benefits from the technology, such as the ability to enhance audience engagement by producing content that would be difficult or impossible to create with conventional methods. The regulator stressed that it is essential for broadcasters and audiences to have the freedom to explore new and emerging technologies of this kind, in line with their rights to freedom of expression and access to information.

However, Ofcom acknowledged the challenges synthetic media poses not only to broadcasters themselves but also to their audiences. DeepFake software, for example, can replace a person’s likeness in an existing still or moving image with someone else’s, often with highly realistic results. Ofcom identified several risks associated with the use of synthetic media, and DeepFake technology in particular. First, it has the potential to generate fake news, propaganda and other forms of disinformation, which can spread rapidly online, while broadcast journalists may find it difficult to verify the credibility of content from online sources. Second, the increased use of DeepFakes could erode trust and credibility among audiences, who may struggle to determine whether content is authentic. Third, there is a risk that audiences could mistake DeepFake footage for genuine footage of a real person, leading to the unfair treatment of individuals appearing in programming or to unwarranted infringements of their privacy.

The regulator reminded broadcasters of the rules in the existing Broadcasting Code (last revised in December 2020) that protect audiences from potential harms arising from the use of synthetic media and help maintain high levels of trust in broadcast news. These include (but are not limited to): Section Two, which protects audiences from potentially harmful or offensive material as well as from materially misleading content; Section Five, which requires that news, in whatever form, is reported with “due accuracy” and presented with “due impartiality”, “due” meaning adequate or appropriate to the subject and nature of the programme; Section Seven, which requires broadcasters to avoid “unjust or unfair treatment” of individuals or organisations in their programmes; and Section Eight, which guards against unwarranted infringements of privacy in programmes and in connection with obtaining material included in programmes. However, context and the public interest may justify derogations from the Code’s rules and practices in certain cases.

In light of this, Ofcom advised all licensees to “consider carefully whether their compliance processes need to be adapted or developed to account for the potential risks involved in the use of synthetic media technologies to create broadcast content”.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.