United Kingdom

[GB] UK Coroner orders major online platforms to provide his court with their proposals for suitable self-regulation to prevent future teenage deaths from suicide.

IRIS 2022-10:1/12

Julian Wilkins

Wordley Partnership and Q Chambers

Whilst the UK Parliament’s Online Safety Bill 2022 awaits further passage through both the House of Commons and the House of Lords, the verdict in a recent Coroner’s Court decision has invited some of the leading online platforms to consider self-regulatory measures to protect children and vulnerable adults from harmful content.

The Coroner’s hearing took place in the Coroner’s Court for the Northern District of Greater London, before HM Senior Coroner Mr Andrew Walker, and concerned the suicide of 14-year-old Molly Rose Russell, who died on 21 November 2017. Despite a promising future and loving parents, Molly had become depressed, and her depression developed into a depressive illness.

It transpired that Molly had subscribed to a number of online sites. The sites were not suitable for children, yet their content was easily available. She had access to images, video clips and text concerning self-harm and suicide, as well as other content that was negative or depressing in nature.

The platforms worked in such a way that their algorithms supplied ever more material of a similar kind, with the effect that Molly would "binge" view excessive amounts of such content.

The Coroner considered that the content had a negative effect on Molly. Some of it romanticised acts of self-harm by young people, while other content sought to isolate her and discourage discussion with those who might have been able to help, such as her parents.

Instead, the platforms encouraged Molly to approach celebrities for help, with little prospect of a reply. Some of the content was graphic, tending to portray self-harm and suicide as the inevitable consequence of an irrevocable condition.

The effect of the sites was to normalise Molly’s condition, focusing on a limited and irrational view without the counterbalance of normality.

The Coroner concluded in his report that the material viewed by Molly, someone already suffering from a depressive illness and at a vulnerable age, likely affected her mental health negatively and contributed to her death in "a more than minimal way".

The Coroner raised six matters of concern, namely: (i) there was no separation between the adult and child parts of online platforms, nor separate platforms for children and adults; (ii) there was no age verification when signing up to the online platform; (iii) content was not controlled so as to be age-specific; (iv) algorithms were used to provide content together with adverts; (v) parents, guardians or carers did not have access to, or any control over, the material being viewed; and (vi) a child's account could not be separately linked to a parent's, guardian's or carer's account for monitoring.

The Coroner recommended that the Government give consideration to reviewing the provision of internet platforms to children, with reference to: harmful online content; separate platforms for adults and children; verification of age before joining a platform; the provision of age-specific content; the use of algorithms to provide content; the use of advertising; and parental, guardian or carer control, including access to, and retention of, the material viewed by a child.

The Coroner also recommended the setting up of an independent regulatory body to monitor online platform content, taking account of his concerns, together with the enactment of such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and its effective regulation.

Although regulation would be a matter for the Government, the Coroner saw no reason why the platforms themselves should not consider self-regulation to address his concerns.

The Coroner considered that the platforms had the power to adopt suitable self-regulatory measures to prevent future deaths.

Organisations including Meta Platforms, Snap Inc and Twitter International Company were under a duty to respond to the Coroner’s report by 8 December 2022. The Coroner ordered that the responses contain details of action taken or proposed, including a timetable for that action; in the absence of such proposals, the organisations must explain why no action is proposed.


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.