Digital Services Act and Content Moderation

Guide, 2024

The Digital Services Act (DSA) aims to create online safety standards across the European Union (EU). As this new law comes into effect, this guide will help you understand what is at stake and how content moderation can help.

What is the Digital Services Act?

The DSA is a new law adopted by the European Parliament on July 5, 2022. It becomes fully applicable on February 17, 2024, and already applies to the largest platforms.

The DSA is a set of rules and regulations that applies across the EU, with the main goal of ensuring a safer online space by protecting users' rights and addressing illegal content and products. It also requires platforms to be accountable and transparent about how they collect and handle data.

Who is in scope?

The DSA applies to all digital services, i.e. all businesses or organizations offering online services to users in the EU, wherever the provider is established. Here are some examples of the types of companies in the scope of the law:

  • Intermediary services such as internet service providers, DNS providers, VoIP apps, messaging apps or email service providers

  • Hosting services such as cloud computing services

  • Online platforms such as online marketplaces, app stores, social media platforms, booking platforms or any other platform hosting user-generated content (UGC)

Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), those with more than 45 million monthly active users in the EU, are also covered by the law and must comply with additional requirements, as the obligations of online stakeholders are proportionate to their size and role.

How to prepare for the Digital Services Act?

The DSA may impact your business and operations.

Platforms have already been asked to publish their average monthly user numbers by February 17, 2023, and every 6 months after that; these figures determine which scope they belong to, and whether they face the additional requirements imposed on VLOPs and VLOSEs.

In any case, you should be ready to take some actions to ensure compliance. Here are some key steps to take:

  1. Terms of service and community guidelines:

    • Review your terms of service to ensure they are in line with the DSA, and update them if needed

    • Provide clear rules and policies on UGC: what is and is not allowed on your platform, what kind of content is considered illegal, and what the consequences are when the rules are not respected

  2. Content moderation:

    • Use a content moderation system (human or automated moderation) to ensure illegal content is detected and handled (see the sketch after this list)

    • Give your users a simple and accessible way to report illegal content

    • Allow users to dispute moderation decisions through a simple and transparent process

  3. Transparency:

    • Inform your users about how content moderation works and how it impacts their experience on the platform

    • Give users clear explanations of the content moderation decisions that affect them

    • Produce annual transparency reports on content moderation (number of orders to remove illegal content, number of reports from users, decisions made, accuracy and error rates of automated moderation systems, guidelines provided to human moderators, etc.)
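To make these steps more concrete, here is a minimal Python sketch of what a notice-and-action flow and its transparency counters could look like. It assumes a hypothetical in-memory store, and every name in it (UserReport, ModerationDecision, submit_report, transparency_summary, etc.) is illustrative, not a schema or API mandated by the DSA.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from enum import Enum
    import itertools

    class DecisionType(Enum):
        NO_ACTION = "no_action"
        CONTENT_REMOVED = "content_removed"
        VISIBILITY_RESTRICTED = "visibility_restricted"

    @dataclass
    class UserReport:
        # A user notice flagging a piece of content as illegal or rule-breaking.
        report_id: int
        content_id: str
        reporter_id: str
        reason: str  # the reporter's explanation
        created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @dataclass
    class ModerationDecision:
        # The platform's decision, with the explanation sent to the affected user.
        report_id: int
        decision: DecisionType
        statement_of_reasons: str
        automated: bool  # whether an automated system made the call
        appealed: bool = False

    _ids = itertools.count(1)
    reports: dict[int, UserReport] = {}
    decisions: dict[int, ModerationDecision] = {}

    def submit_report(content_id: str, reporter_id: str, reason: str) -> UserReport:
        # A simple, accessible entry point for users to report content.
        report = UserReport(next(_ids), content_id, reporter_id, reason)
        reports[report.report_id] = report
        return report

    def decide(report_id: int, decision: DecisionType,
               statement_of_reasons: str, automated: bool) -> ModerationDecision:
        # Record a moderation decision together with its explanation.
        d = ModerationDecision(report_id, decision, statement_of_reasons, automated)
        decisions[report_id] = d
        return d

    def appeal(report_id: int) -> None:
        # Let users dispute a decision; here we only flag it for human re-review.
        decisions[report_id].appealed = True

    def transparency_summary() -> dict:
        # Aggregate the kinds of figures an annual transparency report needs.
        actioned = [d for d in decisions.values()
                    if d.decision is not DecisionType.NO_ACTION]
        return {
            "user_reports_received": len(reports),
            "decisions_made": len(decisions),
            "content_actioned": len(actioned),
            "automated_decisions": sum(d.automated for d in decisions.values()),
            "appeals_received": sum(d.appealed for d in decisions.values()),
        }

    # Example: a report comes in, is actioned automatically, and shows up
    # in the aggregate figures.
    r = submit_report("post-123", "user-42", "Counterfeit product listing")
    decide(r.report_id, DecisionType.CONTENT_REMOVED,
           "Removed: the listing violates our counterfeit goods policy.",
           automated=True)
    print(transparency_summary())

In a real system the store would be a database, each decision would trigger a notification carrying its statement of reasons to the affected user, and the summary would be broken down further (by content category, legal basis, and so on) before going into an annual report.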

This advice covers only the basics: depending on the size, location and audience of your platform, you may need to comply with additional rules.

Keep in mind that the DSA may still evolve and that you should stay informed and up to date with its requirements.

What are the sanctions in case of non-compliance?

In case of violation of the DSA, sanctions such as fines or periodic penalty payments may be imposed.

Depending on the type of digital service, the fine imposed may reach up to 6% of the provider's annual worldwide turnover.

The same goes for periodic penalty payments, which may represent a maximum of 5% of the platform's average daily worldwide turnover, per day of delay.
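As a back-of-the-envelope illustration, here is how those two ceilings work out for a hypothetical platform with EUR 2 billion in annual worldwide turnover (the figure is made up for the example):

    # Hypothetical turnover, for illustration only.
    annual_turnover_eur = 2_000_000_000

    max_fine = 0.06 * annual_turnover_eur          # up to 6% of annual worldwide turnover
    avg_daily_turnover = annual_turnover_eur / 365
    max_daily_penalty = 0.05 * avg_daily_turnover  # up to 5% of average daily turnover

    print(f"Maximum fine: EUR {max_fine:,.0f}")                      # EUR 120,000,000
    print(f"Maximum penalty per day: EUR {max_daily_penalty:,.0f}")  # EUR 273,973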

Benefits of Content Moderation under the Digital Services Act

Compliance, accountability and transparency are the three concepts to keep in mind. Content moderation is essential to create and maintain a safe and healthy environment for users, and it becomes even more crucial under the DSA. It is a service that can:

  • Help platforms comply with the regulations by quickly and proactively detecting unwanted and illegal content

  • Provide guidance to platforms on how identified illegal UGC should be treated under the law

  • Make sure users' rights are respected and increase their trust and well-being on the platform

  • Improve transparency and accountability by providing more visibility on the type of UGC found on the platform and how it is monitored
