United Kingdom
[GB] Parliamentary Home Affairs Committee report on abuse, hate, and extremism online
IRIS 2017-7:1/19
Julian Wilkins
Wordley Partnership and Q Chambers
On 1 May 2017, the House of Commons Home Affairs Select Committee published its report concerning how social media companies, including YouTube, Google, and Twitter, were policing their sites and taking suitable action to remove illegal content that constituted a hate crime. The Committee concluded that overall the social media companies were “shamefully far” from tackling illegal and dangerous content. The Committee recommended that the Government consult on stronger laws and a system of fines for companies failing to remove illegal content.
The Committee’s inquiry was announced in July 2016, following the murder of Jo Cox MP in June 2016 in the lead-up to the EU referendum. There had been a marked increase in hate crime, and one aspect of the inquiry concerned the role of social media in hate crime. The Committee gathered evidence from a wide range of sources, including social media companies.
The Committee recognised that various social media and technology companies had considered the impact that online hate, abuse, and extremism have upon individuals. The Committee also welcomed some companies’ efforts to reduce such online behaviour by publishing clear community guidelines, building new technologies, and promoting online safety for young people and schools. However, the evidence put before the Committee revealed that nowhere near enough was being done to tackle illegal and dangerous content published on companies’ sites and to keep their users safe.
The Committee urged the companies to improve, as a matter of urgency, the quality and speed of their responses to reports of dangerous and illegal content, wherever those reports originated.
The Committee considered it unacceptable that the companies refused to reveal the number of people they employ to safeguard users, or the amount they spend on public safety initiatives; the companies withheld this information on grounds of commercial sensitivity. The Committee’s conclusions and recommendations included the finding that “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal and dangerous content.... Given their immense size, resources and global reach, it is completely irresponsible of them to fail [to] abide by the law, and to keep their users and others safe”.
The Committee praised Twitter for some of its technological initiatives, but was disappointed overall at the slow pace at which technical solutions were being developed. It particularly criticised Google for using its technology only to identify illegal or extreme material for the benefit of advertisers, rather than to remove illegal content proactively. On the legal framework, the Committee commented in its conclusions and recommendations: “Most legal provisions in this field predate the era of mass social media use and some predate the internet itself. The Government should review the entire legislative framework governing online hate speech, harassment, and extremism and ensure that the law is up to date”.
The Committee recommended that companies which fail to proactively search for and remove illegal material should contribute to the costs of the police doing so instead, drawing an analogy with section 25 of the Police Act 1996, under which football clubs pay for the policing around a stadium. It also recommended that the companies publish quarterly reports on their safeguarding activities, including the number of staff employed, the complaints received, and the action taken; transparent reporting would encourage improved standards and competition between online platforms to find innovative solutions to these problems. If the companies refuse to do this voluntarily, the Committee recommended that the Government consult on requiring them to do so.
References
- House of Commons Home Affairs Committee, Hate Crime: abuse, hate and extremism online, Fourteenth Report of Session 2016-17, HC 609, 1 May 2017
- https://www.publications.parliament.uk/pa/cm201617/cmselect/cmhaff/609/609.pdf
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.