Overview

The European Union’s Digital Services Act (DSA) builds on the e-Commerce Directive and regulates the obligations of digital services that act as intermediaries in connecting consumers with third-party goods, services, or content.

The DSA features due diligence and transparency obligations for all online intermediaries. It contains notice-and-takedown requirements related to government orders, as well as novel and extensive obligations related to global content moderation, advertising, data access, and product design practices. The DSA is complemented by the Digital Markets Act (DMA), which tackles economic concerns related to large online platforms that control access to digital markets for other businesses.

DSA Compliance Services

Perkins Coie’s Digital Safety & Human Rights lawyers have a deep understanding of the DSA requirements for intermediaries, hosting services, and online platforms. We counsel companies doing business in the EU to help them develop tailored DSA compliance strategies informed by business priorities, human rights, and operational constraints. We partner with clients to help them do the following:

  • Fully understand their current content moderation, advertising, data access, and product design policies, processes, and practices.
  • Identify gaps in compliance with the DSA, human rights, and industry best practices, and provide recommendations to remediate those gaps.
  • Design and implement enhanced practices, policies, and products that comply with the DSA.
  • Identify associated opportunities for global risk mitigation informed by a fast-moving regulatory landscape.

Over the last decade, we have acted as the global strategic quarterback for numerous clients facing legal and reputational exposure related to content moderation and government orders around the world. With a team bolstered by former in-house, government, and United Nations lawyers, as well as world-class privacy, data security, and litigation lawyers, we ably counsel clients regarding all aspects of the DSA.

DSA Essentials

Providers in Scope

The DSA applies to all online intermediaries that offer their digital services in the EU, including those established outside of the EU. Intermediaries established outside of the EU will have to appoint a legal representative, as they do for other EU obligations.

All intermediary providers that store, transmit, or disseminate user-generated content, and have a substantial connection to the EU, are subject to the DSA. Providers’ specific obligations depend on their size and type of service.

Content Moderation Transparency

All intermediaries must specify any restrictions imposed on the use of their service, including information on the policies, procedures, tools, and algorithms used for content moderation and internal complaint handling. Any content restrictions must be enforced in an objective and proportionate manner, with due regard for fundamental rights.

All intermediaries must produce annual reports that include (1) the number of orders received from member state authorities to provide information or to act against illegal content; and (2) information on content moderation undertaken, including on the use of account suspension, content removal, automation, downranking, and visibility filtering, categorized by the type of illegal content or terms and conditions violation.

Hosting providers must provide a clear and specific statement of reasons for any visibility filtering, downranking, content removal, demonetization, or account suspension. The statement should include the basis for the decision and, where applicable, information on the use of automation.
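
To make the required record concrete, the sketch below models a statement of reasons as a simple data structure. The class, its field names, and the restriction categories are illustrative assumptions based on the obligations summarized above, not an official DSA schema.

```python
from dataclasses import dataclass, field
from enum import Enum


class RestrictionType(Enum):
    # Illustrative categories mirroring the restrictions named above.
    VISIBILITY_FILTERING = "visibility filtering"
    DOWNRANKING = "downranking"
    CONTENT_REMOVAL = "content removal"
    DEMONETIZATION = "demonetization"
    ACCOUNT_SUSPENSION = "account suspension"


@dataclass
class StatementOfReasons:
    """Hypothetical record of a hosting provider's statement of reasons."""
    restriction: RestrictionType
    decision_basis: str    # legal ground or terms-and-conditions clause relied on
    facts: str             # facts and circumstances behind the decision
    automation_used: bool  # whether automated means were used, where applicable
    # Redress routes communicated to the user (illustrative wording).
    redress_options: list[str] = field(default_factory=lambda: [
        "internal complaint handling",
        "out-of-court dispute settlement",
    ])


# Usage: the record accompanying a hypothetical downranking decision.
sor = StatementOfReasons(
    restriction=RestrictionType.DOWNRANKING,
    decision_basis="(illustrative) terms-of-service provision on spam",
    facts="Content matched repeated bulk-posting patterns.",
    automation_used=True,
)
print(sor)
```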

In their terms and conditions, online platforms must describe any parameters used to suggest information to users, and they must disclose options to modify those parameters. They must also maintain an appeals system allowing decisions taken against allegedly illegal content or terms and conditions violations to be challenged and reversed. For issues that are not satisfactorily resolved by their internal system, platforms must participate in out-of-court dispute settlement proceedings.

Notice and Takedown Processes

All intermediaries must comply with orders from EU member state authorities to provide information or to act against illegal content.

Hosting providers must enable (1) anyone to submit notices containing a substantiated explanation as to why specific content is allegedly illegal under EU or member state law (along with additional transparency reporting on these mechanisms); and (2) adjudication of notices in a timely, diligent, nonarbitrary, and objective manner, along with notification to the submitting party of the decision made.
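
As a rough illustration of the notice-and-action flow just described, the sketch below records the adjudication of a user-submitted notice and the notification owed to the submitter. All names (`Notice`, `record_decision`, the decision values) are hypothetical; the DSA prescribes the process and its outputs, not any particular implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Notice:
    """Hypothetical notice alleging that specific content is illegal."""
    content_id: str
    explanation: str        # substantiated explanation of the alleged illegality
    legal_ground: str       # EU or member state law provision cited
    submitter_contact: str  # used to notify the submitter of the decision


def record_decision(notice: Notice, decision: str) -> dict:
    """Record the outcome of a timely, diligent, nonarbitrary, and objective
    review. The review itself is out of scope here; only the bookkeeping
    that supports notification and transparency reporting is sketched."""
    return {
        "content_id": notice.content_id,
        "decision": decision,  # e.g. "removed", "restricted", "no action"
        "decided_at": datetime.now(timezone.utc).isoformat(),
        "notify": notice.submitter_contact,
    }


# Usage: a notice is adjudicated and the submitter is told the result.
notice = Notice(
    content_id="item-123",
    explanation="Listing offers a product banned under member state law.",
    legal_ground="(illustrative) national consumer-safety statute",
    submitter_contact="reporter@example.com",
)
print(record_decision(notice, decision="removed"))
```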

Ads and Marketplace Transparency

Ad transparency. Online platforms must ensure that users are able to obtain specified information for each advertisement displayed, including the person or entity on whose behalf the ad is presented and information about the parameters used to determine the recipients of the ad.

Bans on targeted advertising. Online platforms may not present advertisements based on profiling children or based on special categories of personal data such as ethnicity, political views, or sexual orientation.

Online marketplace transparency. Online platforms offering a marketplace must obtain identifying information from traders and make it available to users. Platforms must also periodically check for availability of illegal products or services and notify users when such products or services are detected.

Safety by Design

Protection of minors. Online platforms must put in place appropriate and proportionate measures to protect the privacy, safety, and security of minors. Platforms are prohibited from serving ads based on profiling of minors.

Dark patterns. Online platforms may not design, organize, or operate their service in a way that deceives, manipulates, materially distorts, or impairs users’ ability to make free and informed decisions.

Reporting Threats to Life or Safety

Hosting providers that become aware of any information giving rise to a suspicion that a serious criminal offense involving a threat to life or safety of persons has taken place must promptly inform law enforcement or judicial authorities in the relevant EU member state and provide all information available.

Risk Governance and Additional Obligations for Very Large Online Platforms (VLOPs)

Risk governance. VLOPs must undertake annual assessments of the severity and probability of the following on their platforms: (1) dissemination of illegal content; (2) negative effects on fundamental rights, including human dignity, privacy, freedom of expression and information, freedom and pluralism of the media, nondiscrimination, rights of the child, and consumer protection; (3) negative effects on civil discourse, electoral processes, or public security; and (4) negative effects related to gender-based violence, the protection of public health and of minors, and serious negative consequences to a person’s physical and mental well-being.

VLOPs must also put in place reasonable, proportionate, and effective risk mitigation measures tailored to the specific systemic risks identified.

Crisis response. Where the European Commission determines that a threat to public security or public health exists, VLOPs may be required to assess how their platforms contribute to the threat and implement mitigation measures.

Independent audits. VLOPs must undergo annual independent audits to assess compliance with certain DSA obligations and establish an internal compliance function dedicated to monitoring DSA compliance.

Data access. VLOPs must provide member state authorities and vetted researchers with access to certain types of data, including on the design, functioning, and testing of algorithmic systems.

Additional transparency obligations. VLOPs’ additional obligations include reporting on content moderation every six months and maintaining an anonymized repository of advertising information.

Penalties

The DSA imposes steep penalties for noncompliance. The maximum penalty for a failure to comply with the DSA’s substantive obligations is 6% of a provider’s global annual gross revenues. A “periodic penalty” for ongoing noncompliance may also be imposed, not to exceed 5% of a provider’s average daily global gross revenues. Where a provider supplies incorrect, incomplete, or misleading information to a regulator, the provider may be subject to a maximum fine of 1% of global annual gross revenues.
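
To put these percentage caps in perspective, here is a short worked calculation applying them to a hypothetical provider. The revenue figure is assumed purely for illustration.

```python
# Illustrative DSA penalty caps for a hypothetical provider.
# The EUR 10 billion revenue figure is an assumption for this example.
annual_revenue = 10_000_000_000        # global annual gross revenues (EUR)
daily_revenue = annual_revenue / 365   # average daily gross revenues (EUR)

max_substantive_fine = 0.06 * annual_revenue  # 6% cap for substantive violations
max_periodic_penalty = 0.05 * daily_revenue   # 5% of average daily revenues, per day
max_information_fine = 0.01 * annual_revenue  # 1% cap for incorrect/misleading information

print(f"Substantive noncompliance cap: EUR {max_substantive_fine:,.0f}")  # EUR 600,000,000
print(f"Periodic penalty cap per day:  EUR {max_periodic_penalty:,.0f}")  # EUR 1,369,863
print(f"Information-failure fine cap:  EUR {max_information_fine:,.0f}")  # EUR 100,000,000
```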

Accolades

Industry Reputation

Our work in privacy and data security and related legal areas receives recognition from leading business and legal publications and directories, including the following:

  • Ranked nationally in Privacy and Data Security: The Elite and Privacy & Security: Litigation by Chambers USA, 2003 – 2023
  • Ranked Tier 1 nationally for both Information Technology Law and Technology Law by U.S. News—Best Lawyers®, 2022
  • Ranked in the Top 10 Best Law Firms for “Privacy and Data Security” by Vault, 2018 – 2022
  • Ranked Tier 2 nationally for Regulatory Enforcement Litigation (Telecom) by U.S. News—Best Lawyers®, 2022
  • Named as one of the top law firm “Litigation Powerhouses” by Law360, 2016
  • Named a “Leader” among tech-savvy law firms based on corporate counsel feedback to BTI Brand Elite, 2016
  • Named “Law Firm of the Year” for Technology Law by U.S. News—Best Lawyers®, 2015
