
What is the Digital Services Act?

Last reviewed by Moderation API

The Digital Services Act (Regulation (EU) 2022/2065) is the European Union's main law governing how online intermediaries handle illegal content, user rights, advertising, and systemic risk. It entered into force in November 2022 and became fully applicable to all in-scope services on 17 February 2024.

The DSA replaces the patchwork of national rules that previously governed online liability in Europe with a single tiered framework where obligations scale by company size and risk profile.

Who the DSA applies to

The DSA covers any online intermediary offering services to users in the EU, regardless of where the company itself is headquartered. It defines four cumulative tiers of obligation: intermediary services (baseline rules that apply to everyone), hosting services, online platforms, and Very Large Online Platforms and Search Engines, known as VLOPs and VLOSEs. That top tier covers services with more than 45 million monthly active users in the EU, and each tier's obligations stack on top of those below it.

The European Commission designates VLOPs and VLOSEs by decision. The first wave of designations in April 2023 named 19 services including Amazon, Facebook, Google Search, Instagram, TikTok, X, YouTube, and Wikipedia. The list has grown since, and the Commission publishes updates as new services cross the threshold.

Core obligations

All hosting services have to offer a notice-and-action mechanism so users can report illegal content and provide a statement of reasons whenever they restrict a user's content, while every in-scope intermediary must publish annual transparency reports. Online platforms have a longer list on top of that, including internal complaint handling, out-of-court dispute settlement, and trusted flagger status for qualifying entities whose reports must be handled with priority.

VLOPs and VLOSEs carry the heaviest load.

They have to run annual systemic risk assessments covering illegal content, fundamental rights, civic discourse, and public health, submit to independent audits, maintain an ad repository, offer a non-profiling recommender option, and share data with vetted researchers. In practice that means standing up whole teams dedicated to risk assessment and audit response, in addition to the usual moderation and policy work.

Enforcement and penalties

Enforcement is split. National Digital Services Coordinators handle smaller services, while the European Commission itself enforces against VLOPs and VLOSEs directly. Penalties for non-compliance go up to 6% of a provider's worldwide annual turnover, with periodic penalty payments of up to 5% of average daily worldwide turnover for ongoing breaches. The Commission has already opened formal proceedings against several VLOPs since 2024, on matters ranging from illegal content handling to dark patterns and minors' safety. The pace of those proceedings makes it clear the DSA is going to be enforced in earnest.

What it means for moderation teams

For T&S teams, the DSA converts a lot of former best practices into hard legal requirements. Platforms need auditable policies, documented appeal flows, detailed logging of every moderation decision, and the ability to generate a statement of reasons for each action in a standardized format the Commission will accept.
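To make the statement-of-reasons requirement concrete, here is a minimal sketch of what such a record might look like internally. The field names are hypothetical and do not mirror the official Transparency Database schema; they loosely track the elements Article 17 requires, such as the type of restriction, the facts relied on, whether automation was involved, the legal or terms-of-service ground, and the redress options available to the user.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import List, Optional

# Illustrative sketch only: these field names are assumptions for this
# example, not the DSA Transparency Database's actual schema.
@dataclass
class StatementOfReasons:
    decision_type: str            # e.g. "removal", "visibility_restriction"
    facts_and_circumstances: str  # what happened and why action was taken
    automated_detection: bool     # was the content flagged automatically?
    automated_decision: bool      # was the decision itself automated?
    legal_ground: Optional[str] = None  # cited law, if illegal content
    tos_ground: Optional[str] = None    # cited terms-of-service clause
    redress_options: List[str] = field(default_factory=list)
    issued_at: str = ""

    def __post_init__(self):
        if not self.issued_at:
            self.issued_at = datetime.now(timezone.utc).isoformat()
        # Article 17 requires citing either a legal or a contractual ground.
        if self.legal_ground is None and self.tos_ground is None:
            raise ValueError("must cite a legal or terms-of-service ground")

sor = StatementOfReasons(
    decision_type="removal",
    facts_and_circumstances="Post reported via notice-and-action; "
                            "reviewed by a human moderator.",
    automated_detection=True,
    automated_decision=False,
    tos_ground="Community Guidelines section 4: hate speech",
    redress_options=["internal complaint",
                     "out-of-court dispute settlement",
                     "judicial redress"],
)
record = asdict(sor)  # serializable dict, ready for logging or submission
```

The point of the validation step is that a statement of reasons without a stated ground is not compliant, so it is worth failing fast at record creation rather than at audit time.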

Those statements of reasons then land in the DSA Transparency Database, which is public and searchable. For the first time, anyone can compare how different platforms actually enforce their rules side by side using primary data rather than whatever the platforms choose to publish in their own transparency reports.
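Because the database's contents are public, cross-platform comparisons can be done offline. The sketch below tallies decisions per platform and decision type from a CSV export; the column names (`platform_name`, `decision_type`) are assumptions for illustration, not the database's actual export schema.

```python
import csv
from collections import Counter

def decisions_by_platform(csv_path):
    """Count (platform, decision_type) pairs in a CSV export.

    Assumes columns named "platform_name" and "decision_type";
    adjust to whatever the real export actually uses.
    """
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["platform_name"], row["decision_type"])] += 1
    return counts
```

A `Counter` keyed on the pair keeps the aggregation memory-light even for large dumps, since only one row is in memory at a time.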
