The Digital Services Act (DSA) introduces new rules and special due diligence obligations for intermediary services, in particular online platforms. The goal of the DSA is to create a safe, predictable, and trustworthy online environment. This is to be achieved through specific liability rules and special due diligence obligations for online intermediary services that offer their services to users in the EU.
The regulation has applied since August 25, 2023, to "very large online platforms and search engines" such as YouTube, Instagram, Wikipedia, or Zalando. For the vast majority of affected providers, however, the DSA does not become binding until February 17, 2024.
Who is covered by the DSA?
In general, the DSA applies to all “online intermediary services”. This term encompasses services that involve the transmission, storage, and provision of user content. The obligations of online intermediary services vary based on their role, size, and impact within the online environment.
The regulation defines an "intermediary service" as including:
- Mere transmission services, such as access providers.
- Caching services, such as proxy cache servers.
- Hosting services, including cloud computing providers or hosting services.
Additionally, there are special forms of intermediary services that also fall within the scope of the DSA:
- Online platforms, as a subcategory of hosting services, where user-provided information is not only stored but also disseminated to the public at the user's request, as with platforms like Amazon, Facebook, or Instagram.
- Online search engines as a distinct subcategory of intermediary services.
Micro and small enterprises are largely exempt from the specific obligations placed on online platforms. A company no longer qualifies as a small enterprise once it employs 50 or more persons or its annual balance sheet total or annual turnover exceeds EUR 10 million.
What does the DSA regulate in terms of content?
The DSA regulates online intermediary services with a specific liability regime and special due diligence obligations, which vary depending on the role, size, and impact of the intermediary service.
- Liability regime for unlawful content
In principle, intermediary services are not obligated to proactively monitor or investigate the transmission and storage of unlawful content from third parties. Unlawful content, as defined by the DSA, encompasses all information that does not comply with Union law or the law of a Member State. However, once an intermediary service becomes aware of the unlawfulness, or upon receiving a governmental or court order, it must promptly block or remove the content (take-down). In certain cases, intermediary services may also be required to prevent unlawful content from becoming accessible again (stay-down).
- Due diligence obligations
The new due diligence obligations in the DSA place a particular emphasis on transparency and accountability towards users. Online platforms, for instance, are required to disclose when information constitutes advertising, to identify on whose behalf (natural or legal person) the advertising is displayed, and to provide information on the criteria used to display advertising to users (transparency of recommender systems).
In addition, so-called "dark patterns" are extensively prohibited. This term refers to practices aimed at distorting or impairing a user's free and uninfluenced decision-making, such as making it difficult to cancel a subscription by inducing "click fatigue" through deliberately lengthy click paths.
- Designation of a central contact point by hosting services
To facilitate efficient communication between authorities and digital intermediary services, each intermediary service must designate a central contact point as a "Single Point of Contact" and specify the languages in which it can be contacted. This contact point must be accessible to users "directly and quickly … electronically and in a user-friendly manner" (e.g., through a chat bot).
- Disclosure of information on content moderation and user content restrictions in terms and conditions
The DSA also includes regulations on content moderation and other restrictions on user-generated content. To ensure increased transparency, enhanced user protection, and the prevention of arbitrary content moderation, the DSA imposes new obligations on hosting services to make their terms and conditions "clear, simple, and user-friendly." This can involve using graphical elements, such as images or videos, to present complex legal topics as visually as possible.
- Establishment of reporting and remedial procedures
Users must be able to easily report unlawful content in electronic form. Hosting services are required to establish appropriate reporting and remedial procedures. If, after a careful and objective review, the hosting service determines that the reported content is unlawful and implements restrictions (such as account suspension), it must also provide the reporting party with a clear and specific explanation in an easily understandable manner.
What consequences can be expected for violations of the DSA?
The DSA follows a concept similar to the General Data Protection Regulation (GDPR): for violations, the relevant authority can impose fines of up to 6% of the provider's global annual turnover of the preceding financial year.
Our New Technologies and IP Team is ready to assist you in implementing the new requirements of the DSA, such as ensuring compliant terms and conditions and fulfilling all information obligations.