Norway
[NO] Norwegian Media Authority: Visualising main findings from the Code of Practice on Disinformation
IRIS 2024-9:1/18
Audun Aagre
The Digital Services Act (DSA) and the Code of Practice on Disinformation (CoP) include transparency measures for search engines, social media, and other online platforms. Signing the CoP is voluntary but is considered a risk mitigation measure based on the systemic risk provisions of Articles 34 and 35 of the DSA.
The Norwegian Media Authority (NMA) has assessed the first three status reports from major providers and services, such as Google (YouTube, Google Advertising), Meta (Instagram, Facebook), ByteDance (TikTok), and Microsoft (LinkedIn, Microsoft Advertising).
The NMA has published a dashboard showing developments of selected reported indicators over time, such as the number of removed disinformation posts, fake accounts, and demonetisation efforts. The dashboard is available in English on the NMA’s website.
The reports provide valuable insight into efforts made by very large online platforms to counter disinformation. For example, TikTok reported the removal of over 90 million fake accounts in the EEA during the second half of 2023, compared with 6 million in the first half of 2023. During the same period, the number of ads removed across the analysed platforms in the EEA due to violations of disinformation policies increased from 13 million to 80 million.
However, in 2024 the reports reflect a significant decrease in the moderation of disinformation, fake accounts and fact-check labels. The number of posts removed as disinformation content is decreasing, from 330,000 in the first half (H1) of 2023, to 311,000 in the second half (H2) of 2023, and 295,000 in the first half of 2024. The number of deleted fake accounts has decreased markedly, from 101 million in H2 2023 to 55 million in H1 2024. Over the same six-month period, the volume of content labelled as fact-checked dropped from 68 million to 31 million, while the number of accepted political ads increased from 730,000 to 1 million.
In assessing the reports, the Norwegian Media Authority notably found that:
- Most platforms report that they have introduced measures to mitigate risks arising from disinformation. The quality of reporting has improved compared with the previous reports.
- The reports are strong on content moderation and user accountability but weak on indicators related to the platforms' own responsibilities, such as risks stemming from recommender systems and the algorithmic amplification of misinformation and disinformation.
- Meta's reports contain only approximate numbers. Moreover, for some indicators, the reports provide figures at the global level but not at the EEA level. Meta is the only platform that does not report on the EEA/EFTA States. According to the Norwegian Media Authority, such shortcomings make it difficult to compare measures across several indicators.
- The reports on measures to mitigate risks related to election manipulation and AI-generated disinformation provide valuable insight into how regulatory authorities and platforms can collaborate to ensure free elections.
Based on these assessments, the Norwegian Media Authority has issued several recommendations, including the following:
- Meta needs to improve the quality and accuracy of its reporting and include the EEA/EFTA States. Transforming the CoP into a Code of Conduct may contribute to improvements.
- There is a need for a methodology for the independent assessment of the effectiveness of platforms' measures, based on the systemic risk provisions of the DSA.
References
- De globale plattformenes etterlevelse av bransjenormen mot desinformasjon [Global platforms' compliance with the industry standard against disinformation]
- https://www.medietilsynet.no/globalassets/publikasjoner/bransjenorm-desinfo/240909_plattformenes_etterlevelse_bransjenormen_desinformasjon.pdf
This article has been published in IRIS Legal Observations of the European Audiovisual Observatory.