Germany

[DE] Social networks publish their second transparency reports

IRIS 2019-4:1/11

Agata Witkowska

Patpol

The Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Act to improve law enforcement in social networks - NetzDG), which is designed to force social networks to deal more quickly and more comprehensively with complaints about hate crime and other criminal content, entered into force in Germany on 1 October 2017 (see IRIS 2018-1/15). Under the Act, social networks must ensure, through an effective and transparent procedure, that complaints are immediately noted and checked, and that illicit content is deleted within specified deadlines. Social network providers that receive over 100 complaints about illegal content in the same calendar year are also subject to reporting obligations.

The portal operators Facebook, Twitter and Google have now published transparency reports for the second time. The reports detail the total number of complaints made, the number of complaints that resulted in content being deleted, and the number of employees dealing with complaints. A special procedure, independent of the companies’ own procedures, applies to such complaints, and a special complaint form can be used to report any of the crimes listed in the Act.

According to Facebook’s report, a total of 1 048 offences were reported in 500 separate complaints between 1 July and 31 December 2018, with users able to report more than one incident in a single complaint. As a result of these complaints, 369 posts were deleted or blocked, a deletion rate of around 35%. The company indicated that teams of trained experts and lawyers, totalling 63 people, were employed to process the complaints, although they also worked in other areas. Compared with the first transparency report, the company reported an increase in the quality of complaints, with the deletion rate rising from 21% to 35%. While the number of alleged infringements reported in the second half of 2018 had fallen, the number of deleted or blocked posts had remained more or less the same.

The company emphasised that it was taking firm action to remove hate speech as soon as it was reported. It had made considerable progress by improving and implementing its Community Standards. It had also expanded its global team responsible for dealing with complaints. In Germany, content was checked for possible breaches of Facebook’s Community Standards by around 2 000 people at its Essen and Berlin offices. In addition, Facebook reported that new technologies such as machine learning and artificial intelligence were useful tools for detecting inappropriate content more quickly and effectively than human beings. However, since errors could not be ruled out, an appeals procedure for individual posts had been introduced in 2018. Users who believed that Facebook had made a mistake could therefore ask for a decision to be reviewed.

Twitter and Google have also submitted transparency reports. Twitter received a total of 256 462 complaints in the second half of 2018, 20 140 of which were lodged by complaints bodies and the rest by individual users. Over 50 employees dealt with these complaints, 23 165 of which resulted in content being deleted, that is, around 9% of all cases reported. Meanwhile, Google received a total of 250 957 complaints concerning its YouTube video portal, 83 390 of them lodged by complaints bodies. Its 75 employees deleted 54 644 videos, that is, more than one-fifth of those reported. Observers believe the comparatively small number of complaints submitted to Facebook (500) is a result of the company’s complicated complaints procedure.


Related articles

IRIS 2018-1:1/15 [DE] Network Enforcement Act enters into force

This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.