Germany
[DE] Youth protection organisations report high numbers of checks and complaints regarding content harmful to minors
IRIS 2025-6:1/17
Christina Etteldorf
Institute of European Media Law
On 28 April 2025, the Kommission für Jugendmedienschutz (Commission for the Protection of Minors in the Media – KJM) published its 11th activity report, which documents a significant increase in the number of cases relating to the protection of children and young people from harmful content over the past two years. Hate, incitement and disinformation are described as the dominant themes, particularly in online media. Shortly afterwards, on 13 May 2025, the Freiwillige Selbstkontrolle Multimedia (voluntary self-monitoring body for multimedia service providers – FSM) also published statistics on illegal telemedia content reported to its complaints office in 2024. These figures were also high, especially with regard to content harmful to minors, such as depictions of child sexual abuse. The annual report of the joint body for the protection of minors across all the Bundesländer (jugendschutz.net), published on 20 May 2025, confirms this picture, documenting, among other things, more than double the number of registered offences in 2024 compared with the average of the three previous years.
The KJM’s 11th activity report covers the period from March 2023 to February 2025 and primarily documents the three areas of application of the Jugendmedienschutzstaatsvertrag (Interstate Treaty on the Protection of Minors in the Media – JMStV) that form much of the KJM’s remit, as well as its wider activities, for example in international dialogue and public relations work. During the reporting period, the KJM issued 901 opinions on indexing procedures. German youth protection legislation stipulates that certain media and telemedia that seriously endanger the development of children and young people can be ‘indexed’, i.e. checked by the Bundeszentrale für Kinder- und Jugendmedienschutz (Federal Centre for the Protection of Children and Young People in the Media – BzKJ) and then, if necessary, included in the list of media harmful to minors. This results, among other things, in enforceable distribution and advertising bans or restrictions. The KJM’s opinion must be obtained and taken into account as part of the BzKJ’s checking process. The KJM decided in favour of indexing in the majority of cases examined during the reporting period. The main risks to minors were the depiction of sexualised violence against children, pornography, extremism and discrimination. The activity report also refers to the close cooperation with the voluntary self-regulatory bodies, which is firmly anchored in German youth media protection law and was pursued intensively between 2023 and 2025, as well as the procedures carried out for technical youth protection measures, which the KJM assessed positively. These included facial age estimation tools based on facial recognition software.
Particularly interesting, however, is the KJM’s report on the checks it carried out, which documents a significant overall increase in their number. In broadcasting, the number of cases fell sharply, from 26 in the 2021–2023 reporting period to just six in the current period. In the online sector, i.e. telemedia, on the other hand, the number of cases rose more than five-fold, from 99 (2021–2023) to 553. Most of these cases concerned criminal offences against minors linked to incitement to hatred (such as incitement against minorities), the use of symbols of unconstitutional organisations (such as swastikas) and Holocaust denial, i.e. mainly content from the field of political extremism. Hate, incitement and disinformation were thus the main problem areas during the reporting period. However, the KJM attributes the rapid rise in cases not (only) to an actual increase in problematic content, but also and primarily to the fact that better coordinated cooperation between supervisory bodies and the use of technical tools are making supervision more effective. In particular, it mentions the KIVI tool (an AI-supported tool used to help monitor social media) and cooperation with the Zentrale Meldestelle für strafbare Inhalte im Internet (Central Reporting Centre for Criminal Content on the Internet). The number of enquiries and complaints also more than trebled: the KJM received 2,212 enquiries and complaints in writing or by telephone (of which 1,780 related to the protection of minors in telemedia). Most concerned traditional websites (1,350 complaints) and tended to focus on eroticism/sex, (incitement to) violence or (incitement to) hate.
The FSM’s statistics for 2024, published on 13 May 2025 by the voluntary self-regulatory body involved in telemedia supervision under the JMStV, show even higher complaint figures. With 25,536 reports, the FSM complaints office recorded the second-highest number of complaints since its foundation in 1997 (after 30,573 the previous year). Pornographic content was the most common type of complaint, accounting for 8,529 reports. Alarmingly, however, the second most common category related to depictions of child sexual abuse (8,077 cases, around 32% of all complaints). The FSM highlights the phenomenon of so-called ICAP (Invite Child Abuse Pyramid) sites as particularly striking: by sharing links to depictions of abuse, users of these sites collect points in order to gain access to new content. Such links were found in large numbers on social media, among other places. The FSM immediately forwards information about such content stored on German servers to the Bundeskriminalamt (Federal Criminal Police Office – BKA) and informs the host provider in a notice-and-takedown procedure. Where child sexual abuse images are hosted abroad, the complaints office informs the host provider and forwards the report to the competent partner hotline in the international complaints office network, INHOPE.
Meanwhile, jugendschutz.net, which was set up by the federal government, federal states and state media authorities to protect children and young people on the Internet, in particular by monitoring telemedia services (including through an online reporting mechanism), recorded 17,630 infringements in its annual report. The fact that this is more than double the average number of cases over the previous three years (7,291 in 2021–2023) is primarily attributed to the enormous increase in cases related to sexual violence, which accounted for 89% of cases. The report records 14,567 cases of child pornography and 825 cases of youth pornography, as well as an increase in political extremism, which only accounted for 7% of the total number of offences, but rose from 852 cases in the previous year to 1,245. In contrast, fewer cases were reported in the areas of general pornography and violence. In over 9,700 cases, jugendschutz.net notified providers and self-regulatory bodies of infringements and demanded that they be removed quickly, which led to the deletion or blocking of the content in 99% of cases. In particularly serious cases, such as those that concerned sexual violence, it involved law enforcement authorities, the KJM and the BzKJ.
References
- Press release and 11th activity report of the KJM
- https://www.kjm-online.de/presse/pressemitteilungen/kommission-fuer-jugendmedienschutz-kjm-veroeffentlicht-11-taetigkeitsbericht/
- FSM press release and statistics 2024
- https://www.fsm.de/mitteilung/beschwerdestelle-jahresstatistik-2024/
- jugendschutz.net annual report 2024
- https://www.jugendschutz.net/fileadmin/daten/publikationen/jahresberichte/jahresbericht_2024.pdf
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.