
Integration of the Code of Conduct on Countering Illegal Hate Speech Online into the Digital Services Act

The new framework of the Digital Services Act encourages voluntary codes of conduct to address online risks

Publication date:

29/01/2025

© Provincia autonoma di Trento

Description

On 20 January, the Commission and the European Digital Services Committee welcomed the integration of the revised 'Code of Conduct on Countering Illegal Hate Speech Online +' into the framework of the Digital Services Act, which encourages voluntary codes of conduct to address online risks.

The Code of Conduct+, which builds on the original 2016 Code of Conduct on Countering Illegal Hate Speech Online, was signed by Dailymotion, Facebook, Instagram, Jeuxvideo.com, LinkedIn, Microsoft-hosted consumer services, Snapchat, Rakuten Viber, TikTok, Twitch, X and YouTube.

The Code of Conduct+ will strengthen the way online platforms deal with content that EU and national laws define as illegal hate speech. The integrated Code will facilitate compliance with, and effective enforcement of, the Digital Services Act as regards the risk of dissemination of illegal content on platforms' services.

As a result of this integration, online platforms designated under the Digital Services Act can adhere to the Code of Conduct+ to demonstrate compliance with the Act's obligation to mitigate the risk of dissemination of illegal content on their services. Compliance with the commitments of the Code of Conduct+ will form part of the annual independent audit to which these platforms are subject under the Digital Services Act, which helps enhance the platforms' transparency and accountability. Concretely, the signatories of the Code of Conduct+ commit themselves, inter alia, to:

Allow a network of 'Monitoring Reporters', non-profit or public entities with expertise in illegal hate speech, to regularly monitor how signatories review hate speech notices. Monitoring Reporters may include entities designated as 'Trusted Flaggers' under the Digital Services Act.
Undertake best efforts to review at least two-thirds of hate speech notices received from Monitoring Reporters within 24 hours.
Engage in well-defined and specific transparency commitments regarding measures to reduce the prevalence of hate speech on their services, including through automated detection tools.
Participate in structured multi-stakeholder cooperation with experts and civil society organisations that can report on the trends and developments in hate speech they observe, helping to prevent waves of hate speech from going viral.
Raise users' awareness, in cooperation with civil society organisations, of illegal hate speech and of the procedures for reporting illegal content online.

As part of their respective evaluations of the Code of Conduct+, the Commission and the European Digital Services Committee encourage signatory platforms to take several recommendations into account when implementing it, including:

Provide information, as part of their reports, on the outcome of the measures taken, as well as additional data on hate speech on their platforms. This may include, for example, the role of recommender systems and the organic and algorithmic reach of illegal content prior to its removal.
Present country-level data broken down by internal classification of hate speech (such as on grounds of race, ethnicity, religion, gender identity or sexual orientation) and ensure adequate follow-up to input received through the multi-stakeholder cooperation.

The Commission and the Committee will monitor and evaluate the achievement of the objectives of the Code of Conduct+ and of its recommendations, and will facilitate the regular review and adaptation of the Code. This process will form part of the continuous monitoring of platforms' compliance with the existing rules.

Source: European Commission

Photo: EU
