Germany

[DE] Media regulator takes action against Twitter for pornographic content

IRIS 2020-10:1/4

Jan Henrich

Institute of European Media Law (EMR), Saarbrücken/Brussels

On 30 September 2020, the Kommission für Jugendmedienschutz (Commission for the Protection of Minors in the Media – KJM) ruled that youth protection rules had been breached on the Twitter platform. Because the account holders concerned could not be prosecuted on account of their anonymity, the KJM held Twitter itself responsible.

In six separate cases, the KJM found that Twitter users had breached the German Jugendmedienschutz-Staatsvertrag (Inter-State Treaty on the protection of minors in the media – JMStV) by making pornographic content publicly accessible. Since the service providers had distributed this content without fulfilling their legal obligation to ensure that children and young people could not access it, the KJM issued complaints and adopted appropriate measures. These will now be implemented at state level by the relevant media authorities in various legal procedures. According to the KJM chairman, Marc Jan Eumann, Twitter had the technical know-how to protect children and young people from pornographic content. The platform’s failure to proactively meet its responsibility to do so was therefore incomprehensible. The media regulators hope that the legal procedures will encourage Twitter to take corrective action more quickly when the law is clearly breached in the future.

The KJM is Germany’s central supervisory body for the protection of minors in private broadcasting and telemedia. As an organ of the state media authorities, it checks whether the JMStV has been infringed and decides on remedial measures, such as complaints, prohibition orders and fines.

In June, the KJM announced plans to take more stringent action to combat pornography on online platforms in future. It had been investigating three website operators who had provided access to pornographic material without an age verification system. In August, it also published a study on search engines’ youth protection settings and filter mechanisms. The study found that, even when security settings were activated, children and young people using search engines could easily come into contact with content that could harm their development, especially content relating to violence, self-harm and extremism (see IRIS 2020-9/21).


This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.