
Shaping the EU’s Digital Future: EU Parliament adopts amended draft Digital Services Act


In December 2020, the EU Commission published its first draft of the Digital Services Act (DSA), laying the foundations for its ambitious plan to reshape Europe’s digital future. The proposed legislation will bring a comprehensive package of harmonized rules for the Internet economy across Europe – building on, and adding much granularity to, the principles of the E-Commerce Directive.

Throughout 2021, policy makers in the European Parliament and the Council of the EU worked intensively on their versions of the draft regulation. The Council adopted its amendments to the Commission text on 25 November 2021. The European Parliament voted on its version on 20 January 2022, with several last-minute changes.

In this article we take a short ride through the developments since the publication of the first draft.

Recap

The Digital Services Act (DSA), alongside the Digital Markets Act (DMA), is part of the EU Commission’s digital strategy, aimed at reinforcing the single market for digital services and creating a more level playing field for businesses of all sizes across the EU. To this end, the DSA builds on the E-Commerce Directive, with its central provisions on the liability safe harbour for online intermediaries – which over time have been subjected to diverging case-law and application across the Member States.

The EU Commission (EC) presented its vision for the comprehensive reform package in a first DSA draft on 15 December 2020. While the EC decided to generally maintain the cornerstones of the existing liability regime – i.e. the country-of-origin principle, the safe harbour provisions for online intermediaries, and the ban on general monitoring obligations – it also proposed new and far-reaching staggered obligations for the different types of online intermediaries. In a nutshell, the EC draft introduces:

  • detailed rules on content moderation, including a harmonized notice and action mechanism, and internal and external redress mechanisms,
  • new rules for advertising transparency,
  • seller-vetting obligations for online marketplaces (“know your business customer”), 
  • strict obligations relating to systemic risks for online platforms with more than 45 million monthly active users (defined as “very large online platforms” / VLOPs), and
  • a comprehensive enforcement system with new oversight bodies and severe penalties for providers who fail to comply with the new regulations.

For details on the initial EC draft and subsequent developments, you can refer to our previous articles (here, here and here).

Key aspects of the European Parliament’s draft

Given the wide and ambitious reach of the DSA, it was clear from the outset that various political groups within the European Parliament (EP) were going to push for distinctly different visions. Members of the Committee on the Internal Market and Consumer Protection (IMCO), which is in charge of the legislative process in the EP, proposed over 2,200 amendments to the text over the summer of 2021. Following a consolidation by IMCO in December 2021, the EP eventually voted on its draft DSA on 20 January 2022.

The EP draft stands out in several respects. Not only does it expand the obligations under the EC draft even further; it also introduces entirely new obligations and requirements that would pose a significant compliance challenge. At the same time, the text comes with sensible technical amendments that should bring more clarity for all stakeholders. The following changes are key:

  • Geographical Scope (Art. 2d): The first important change derives from a deletion. The EC draft originally made the DSA applicable to all services that either (a) are established in the EU, (b) target their services to one or more Member States, or (c) have a “significant number of users” in the Member States. The vagueness of the latter concept brought so much uncertainty and justified criticism that the EP has now removed the requirement. The established tests for “targeting” the EU user base, as developed in case-law, will therefore remain applicable under the DSA.
  • Voluntary Measures (Art. 6): The EC had proposed in its draft that providers should not lose their liability privileges merely because they carry out voluntary own-initiative investigations into illegal content. While the EP seems to share that view, it now also requires providers to be more transparent about such measures and to maintain certain safeguards, such as human oversight and documentation.
  • Ban on General Monitoring (Art. 7): Furthermore, the EP proposes to strengthen the principle, adopted from the E-Commerce Directive, that providers must not be obligated to actively monitor the third-party information they store. The EP text now specifies that this applies to all forms of obligations, legal or factual, and all types of monitoring, whether automated or not. Similarly, providers may not be required to use automated tools for content moderation.
  • “Dark Patterns” (Art. 13a): Entirely new in the EP draft is a provision on so-called “dark patterns”. It essentially prohibits intermediary services from using the structure or functionalities of their user interfaces to unduly influence users’ decision-making. This potentially very far-reaching provision is supplemented by a non-exhaustive list of examples that indicate more clearly what the EP had in mind. For instance, when asking for user consent, providers should not give more visual prominence to any of the consent options – which would ban most standard cookie banner designs. This and other examples provided by the EP show that the new provision does not fit well into the context of the DSA, given that it would only apply to online platforms, but not to many other website operators whose user interface designs influence decision-making to the same extent.
  • Notice-and-Takedown (Art. 14): The EP draft makes some technical, but still highly important, adjustments to the rules on notice-and-takedown. First and foremost, the draft now clarifies that a notice can only trigger an obligation for the provider to act if the illegality of the notified content can be established by a diligent provider “without legal or factual examination”. Content shall also remain accessible while the provider’s assessment is pending, unless the provider’s terms and conditions stipulate otherwise. Furthermore, providers that for technical, operational or contractual reasons cannot remove a piece of content may refer the notice to the provider that has direct control over that content. Each of these adjustments brings much-needed clarification for content moderation practices.
  • Abuse (Art. 20): Similar adjustments are made for the DSA provisions on repeat infringers and abusive notice submitters. While the EC draft would have required online platform providers to (temporarily) suspend bad actors after a prior warning, the EP changed this obligation into an elective right to issue temporary or permanent suspensions.
  • KYBC (Art. 22): The obligations for online marketplaces to vet their sellers (so-called “know-your-business-customer”, or KYBC) have been significantly expanded in the EP draft. In addition to ID and trade register data, online marketplaces would, under the EP’s wording, also need to obtain data on the individual products of each trader. This information would also have to be checked continuously, with best efforts, for completeness and reliability. Online marketplaces would further have to make best efforts to identify and prevent the dissemination of illegal products – which essentially boils down to a general monitoring and stay-down obligation. These far-reaching requirements will certainly lead to heated debate in the upcoming trilogue, not least because they run counter to the ban on general monitoring obligations – which the EP itself had just reaffirmed. Compatibility with fundamental rights and earlier CJEU case-law such as SABAM v Netlog (C-360/10) may become a matter for discussion as well.
  • Information on illegal products (Art. 22a): The EP text supplements its expanded KYBC obligations with a set of information obligations for online marketplaces: upon becoming aware of illegal products, the marketplace must report them to the relevant authorities and inform all customers that have already bought the products in question. One can imagine the large adverse consequences of such notifications – which after years in court may well turn out to have been completely unfounded – on a trader’s business and reputation. More heated debate is sure to ensue!
  • Advertising (Art. 24, 33): The EP text also expands advertising obligations for online platforms. In particular, online platforms shall ensure that users can make informed choices when consenting to targeted advertising and can still access the platform if they refuse to give such consent. The EP draft also expands the scope of information that very large online platforms must store in their advertising repositories. Completely new in the DSA context is a prohibition of targeted advertising based on the personal data of minors.
  • VLOP Risk Assessments and Audits (Art. 26, 27): With respect to very large online platforms (with an active monthly user base of more than 45 million), the EP also calls for broader risk assessments. VLOP providers shall analyse and annually report on “systemic risks” of their service – ranging from the dissemination of illegal content, malfunctioning or manipulation, to negative effects on fundamental rights, public health and well-being. If providers do identify such risks, they shall put in place effective mitigation measures, to be evaluated by the EC. This system is complemented by obligations for external auditing, where the EP draft now puts more emphasis on the necessary independence of auditors.
  • Compensation (Art. 43a): Finally, and somewhat hidden in the DSA’s enforcement section, the EP has introduced a provision that entitles users and user organisations to seek compensation from any provider for any direct damage or loss suffered due to the provider’s non-compliance with a DSA obligation, without prejudice to the hosting privilege. While this new right to compensation seems to supplement the EU’s Collective Redress Directive (EU) 2020/1828, its scope, impact, and rules of procedure are not (yet) clear.
     

All things considered, the EP takes an even bolder approach to the DSA than the EC and the Council before it. The completely new provisions in particular will fuel fundamental debates about the direction that Europe should take in cementing the legal foundations for the Internet economy for decades to come.

Outlook

The DSA train races ahead at high speed, with crucial crossroads coming up in 2022.

With the EU bodies having finalized their respective starting positions, the DSA is now entering the next stage of the legislative process – the so-called trilogue. The first meetings are already scheduled, with the first substantive discussions expected in late February. The struggle over the final text will be complex and passionate, given that the respective visions of the EU bodies are hard to reconcile in several respects.

Beyond the fundamental issues we have looked at in this article, diverging views have also been voiced on supervision and enforcement. France in particular has been pushing for a country-of-destination (rather than the traditional country-of-origin) approach, essentially giving jurisdiction to the authorities of each Member State in which an online service is offered. This initiative is countered by a coalition of smaller Member States, led by Ireland, which defends the country-of-origin principle as fundamental to providing legal certainty and to the future innovation of digital services in Europe.

Given the significance and ambition of the DSA, the trilogue negotiations will likely run into the second half of 2022. We can expect the Act to enter into force around the end of 2023 or early 2024.

While this date may seem far in the future, intermediaries are well advised to review their current workflows against the – likely – set of DSA obligations right now, and to start preparing for compliance. Waiting for the final text will in most cases be too late, considering that the EC draft currently provides for an implementation period of no more than three months. The GDPR has shown how badly it can backfire to leave compliance audits to the last minute. Now is the time to evaluate current workflows, perform impact assessments and gap analyses, and plan all steps necessary for future compliance.

DSA Taskforce

Our multi-jurisdictional DSA Taskforce closely follows all developments in the legislative process and will provide regular updates on key developments on this blog. For details of our DSA Taskforce, click here; to view our topic centre, click here.

Authored by Anthonia Ghalamkarizadeh, Florian Richter.
