Digital Economy

Digital Markets Act and Digital Services Act: European Commission unveils two draft regulations to reform rules on digital economy and competition

The proposals aim to fundamentally reform the EU legal framework for the digital economy, and to subject the digital industry to significantly more regulation. The European Commission wants to increase its focus on problems observed in recent years in the field of digital services and online platforms. To this end, the European Commission presented two legislative proposals on 15 December 2020:

  • The proposed Digital Markets Act is intended to introduce ex-ante regulation for so-called “gatekeepers” of the digital economy, i.e., particularly powerful platform operators, in the form of a list of prohibited problematic practices. This list covers a wide range of behaviors typical of digital markets, some of which are familiar from proceedings of the competition authorities, such as the European Commission’s Google Shopping and Google Android decisions or its investigations against Amazon regarding the possible abuse of transactional data. In addition, the European Commission is to be given new tools to enforce the prohibitions effectively and to identify and eliminate competition problems in digital markets quickly.
  • In addition, the Digital Services Act provides for new obligations for digital service providers that connect consumers with goods, services and content (“intermediaries”). These include, in particular, measures to counter illegal online content (harmonization of the “notice and take down” procedure), the introduction of an internal complaint-handling system that allows users to challenge platform decisions to block or remove content, as well as far-reaching obligations for very large platform providers. However, the liability regime of the e-Commerce Directive, whose provisions are to be carried over into the Digital Services Act, remains substantively untouched.

The proposals are largely based on the results of public consultations in which companies and stakeholders were able to participate from June to September 2020 and which received more than 3,000 responses from various stakeholders of the (digital) economy.

I. Digital Markets Act (DMA)

The Digital Markets Act intends to create a level playing field in digital markets and give the European Commission wide-ranging enforcement powers to take immediate action against distortions of competition in digital ecosystems.

1. Gatekeepers as addressees

The proposed DMA aims at operators of so-called “core platform services” exercising a gatekeeper function. “Core platform services” include familiar (multi-sided) digital platforms, such as social networks, intermediation services, or video sharing platforms. They also include numerous digital services such as number-independent interpersonal (OTT) communications services, online search engines, operating systems, cloud computing services as well as advertising services, especially those offered by operators of the aforementioned digital platforms.

The operators of these “core platform services” are to be classified as gatekeepers if they:

  • have a significant impact on the internal market,
  • serve as a gateway for business users to reach end users, and
  • enjoy an entrenched and durable position in their operations or if it is foreseeable that they will enjoy such a position in the near future.

The draft specifies quantitative thresholds (EUR 6.5 billion in annual revenue, 45 million active users per month) that, if reached, give rise to a presumption that the three gatekeeper criteria mentioned above are fulfilled. The European Commission would also be empowered to classify companies as gatekeepers following a market investigation, even if the thresholds specified in the draft are not met. The European Commission is to define the criteria for determining gatekeeper status and would be able to adapt them to new technological or market-specific developments. In addition, companies are to perform a self-assessment of whether they meet the stated gatekeeper criteria and are to be obliged to notify the European Commission if the criteria are met.

2. Ex-ante regulation

The draft regulation contains a number of ex-ante prohibitions for gatekeepers that do not require any further specification (“blacklist”), as well as a number of additional prohibitions which are intended to be further specified by the European Commission in its proceedings against individual gatekeepers. Taken as a whole, the catalogue provides, among other things, for the following prohibitions and obligations:

  • No personal data collected by two (or more) services, e.g. a social network and an instant messaging service, may be combined by the gatekeeper (even if they are the gatekeeper's own services) without the users’ (effective) choice and consent.
  • Business users must be able to offer their products and services at different prices or under different conditions on third-party platforms, such as a hotel comparison and booking portal competing with the gatekeeper.
  • Business users must be able to offer products and services to end users acquired via the gatekeeper's platform (such as an app store) and to conclude subsequent contracts with those end users, even if this does not then take place via the gatekeeper's services.
  • Business users, when using the platform for their own commercial purposes, must not be required to use, offer or cooperate with the gatekeeper's identification services, such as the login features provided by social networks for third-party sites.
  • Business users or end users must not be required to use other core platform services of the gatekeeper to gain access to a particular core platform service, such as a technical application (prohibition of “bundling”).
  • End users must not be prevented from uninstalling any of the gatekeeper's preinstalled software or apps, such as an internet search engine. The gatekeeper must ensure that third-party software can interact with the gatekeeper's services and can also be used without the “core platform service”.
  • The gatekeeper's own products and services must not be given preferential treatment, in particular with regard to their presentation or ranking on the platform, e.g. by displaying products in a prominently placed box within search results that guides users directly to a purchase (prohibition of “self-preferencing”).
  • The portability of data generated by the activity of companies or end users on the platform must be ensured; not only in the sense of selective data transmission / data transfer, but by way of continuous real-time access.
  • To a certain extent, business users must have access to the usage and transaction data generated on the platform in relation to their products or services. In addition, search engine operators should have access on a FRAND basis to anonymized ranking, search, click, and view data generated by end users via gatekeepers’ online search engines.
  • Fair and non-discriminatory terms and conditions must be applied to companies when using online sales platforms for user software (i.e., app stores), provided these are “core platform services”.
  • In addition, there are various prohibitions and restrictions in connection with online advertising, which are intended to ensure greater transparency and traceability of the services and remuneration structure of gatekeepers in the online advertising sector.

It is envisaged that the European Commission may adapt the list of prohibitions and obligations to changing market situations and may add further prohibitions and/or obligations.

3. Market Investigation

The “New Competition Tool” discussed prior to publication of the current draft regulations, which was intended to allow for far-reaching investigations and enforcement measures in a large number of sectors, has been reduced to a platform-specific market investigation. The tool would allow the European Commission to examine whether certain companies qualify as gatekeepers, whether a platform is a “core platform service” or whether the aforementioned list of ex-ante prohibitions needs to be expanded. As a result, relatively little remains of the originally envisaged, far-reaching powers.

4. Investigative and enforcement powers of the European Commission

Under the draft, the European Commission is also to be given investigative powers to verify and enforce compliance with the provisions of the regulation. These include measures familiar from antitrust and merger control law, such as requests for information and on-site inspections at companies’ business premises.

Violations of the ex-ante provisions are subject to fines, which can amount to up to 10 percent of the total turnover of the company concerned. If a company repeatedly violates the listed prohibitions (“systematic non-compliance”), the European Commission may even impose structural remedies such as the divestment of certain business units. In contrast to EU antitrust law, gatekeepers would only be able to appeal decisions imposing fines directly to the European Court of Justice as the first and final instance.

The European Commission is also to be able to issue interim measures against gatekeepers if companies and end users are threatened with serious and irreparable damage resulting from violations of the ex-ante regulations.

5. Notification of mergers

A further new provision of the draft regulation is that gatekeepers must notify the European Commission of planned mergers with other “core platform services” or any other digital company before they are implemented. This applies regardless of whether a transaction is subject to merger control under EU or national law. However, the regulation does not entail merger control scrutiny. Rather, the purpose of the provision is to keep the European Commission informed of current developments in the relevant market segments and to contribute to its assessment of any potential modifications regarding the classification of individual platforms as gatekeepers.

II. Digital Services Act (DSA)

The DSA intends to improve the protection of consumers and their fundamental rights in the online sector and to create a uniform legal framework for the liability of online platforms for illegal content as well as more transparency for algorithms and online advertising. Violations of obligations under the regulation are to be punishable by a fine of up to 6 percent of a company's total annual turnover.

1. No reform of the liability regime of the e-Commerce Directive

Contrary to expectations, the European Commission has opted to retain the liability regime of the e-Commerce Directive, according to which platform providers, as so-called hosting providers, are only liable for third-party content if they fail to remove illegal content after becoming aware of it. The draft regulation adopts the corresponding provisions of the e-Commerce Directive largely unchanged and also clarifies that the liability privileges do not cease to apply merely because online intermediaries take measures on a voluntary basis to remove or uncover illegal content. In particular, the country of origin principle of the e-Commerce Directive remains unaffected by the DSA.

The draft also contains obligations that differentiate according to the type of online intermediary service (hosting service providers, hosting service providers that qualify as online platforms and very large online platforms).

2. Information requirements

All online intermediaries shall include in their terms and conditions information on any restrictions on the use of data provided by users, including information on procedures, measures and tools, as well as on the use of algorithmic decision-making and human review for the purpose of content moderation – both in relation to illegal content and content that violates the platform's terms of use.

3. “Notice and take down” procedure

All hosting service providers (including online platforms) are required to establish a user-friendly “notice and take down” procedure that allows users to report illegal content. In addition, the draft regulation specifies what information such a notification must contain in order for the hosting service provider to have positive knowledge of illegal content that triggers liability. Hosting service providers must provide sufficient reasons to the user for their decision to block or remove a particular piece of content. The definition of what constitutes “illegal” content is however governed by national law.

4. Additional obligations for online platforms

In addition to the aforementioned obligations, online platforms – with the exception of micro or small enterprises – shall establish an internal complaint-handling system that allows users to appeal decisions by the platform provider to block or remove allegedly illegal content. In addition, users are to have the option of referring disputes to an out-of-court dispute settlement body. Notices from so-called “trusted flaggers” are to be given preferential treatment. Repeat infringers who regularly disseminate illegal content on the platform are to be suspended for an appropriate period. The draft also includes an obligation to notify law enforcement authorities in the event of a suspicion of serious criminal offences. Online marketplaces shall publish information about traders admitted to the platform. Finally, online platforms must prepare reports on the number of out-of-court dispute settlement procedures, temporary user suspensions and automated content checks.

5. Obligations for particularly large online platforms

Particularly large online platforms with more than 45 million monthly active users within the EU shall be subject to further obligations:

  • They must assess systemic risks arising from the use of their platforms and put in place effective content moderation mechanisms to address identified risks (e.g. illegal content, privacy violations, etc.).
  • The relevant parameters of the decision-making algorithms used to serve content on the platform (i.e. ranking mechanisms) must be disclosed. Users must be offered options to change these parameters (including an option that is not based on profiling).
  • A repository must be established, accessible via APIs (Application Programming Interfaces), containing detailed information about online advertising served in the past year; this is to enable monitoring and research of emerging risks from advertising distribution (e.g., through illegal advertising and disinformation).
  • Upon request by the competent authority, very large online platforms will be required to provide that authority, as well as selected academic researchers studying systemic risks posed by online platforms, with access to the data necessary to monitor the platforms' compliance with the DSA.

The European Commission is to have supervisory and enforcement powers over very large online platforms. In addition, Member States are to appoint so-called “Digital Services Coordinators” to oversee enforcement of the regulation.

III. Gleiss Lutz analysis

The European Commission's draft regulations for the digital sector will implement some of the previously announced reform proposals. The DMA significantly tightens the EU legal framework for the digital economy. Although the European Commission has repeatedly emphasized that the new regulations are not specifically aimed at a few tech giants, the fact that the DMA's scope is restricted to “gatekeepers” makes it quite clear that the European Commission will primarily take action against major search engines, social networks and intermediary platforms.

With the extensive list of ex-ante prohibitions and the enforcement powers attached to it, the European Commission assumes the function of a regulatory authority for digital companies that – from the perspective of the European legislator – are particularly relevant for the functioning of the internal market. The DMA is intended to supplement the more protracted “classic ex-post” abuse of dominance proceedings in order to stay abreast of the rapid development of digital markets. It remains to be seen whether and to what extent the European Commission can use the DMA to effectively and quickly overcome the challenges it has identified in the digital sector in recent years (in particular lock-in effects, tipping markets and leveraging market power into neighboring markets) without unduly restricting digital business models. This presupposes reasonably speedy adoption of the act after discussion in the European Parliament and the Council, which both must approve the proposal. Many observers expect the legislative process to be rather lengthy, with the possibility of significant changes to the proposal. Realistically, the new regulations will probably not enter into force within the next two years.

In practice, the relationship of the DMA and regulatory measures of Member States with regard to digital markets will be particularly relevant. The envisaged regulations of the DMA strongly suggest that, in any case, the ex-ante regulatory competence for digital platforms will in the future lie solely at the European level, with the European Commission responsible for oversight and enforcement. National regulations, in particular those under antitrust law, must not run counter to European requirements. Against this background, the question arises as to what extent there is still room for national ex-ante regulations. In Germany, this question primarily arises with regard to Section 19a of the Act against Restraints of Competition (ARC), which is envisaged as part of the impending 10th amendment to the ARC. However, as the amendment is expected to enter into force long before the DMA, there should still be scope for the application of Section 19a ARC for at least the next two years.

The DSA is also clearly aimed at the so-called “tech giants”, with far-reaching consequences for the fulfillment of compliance requirements and with fines that even exceed the maximum amount under the GDPR. Some of the transparency requirements that will apply to very large online platforms are modeled on the Platform-to-Business Regulation (e.g. the obligation to disclose ranking mechanisms), which has applied since 12 July 2020.

However, with regard to the untouched liability regime of the e-Commerce Directive for online intermediaries, the draft is more lenient than the European Commission initially suggested.

The harmonization of the “notice and take down” procedure as well as the internal complaint system, which are to apply in principle to all online platforms, could also replace national legislation with which some Member States in recent years have reacted to hate crimes on the internet and lead to a uniform framework for online platforms – at least for cases of EU-wide significance. This could be the case, for example, with regard to the Network Enforcement Act (Netzwerkdurchsetzungsgesetz) in Germany (whose scope, however, is limited to certain criminal offenses and whose addressees are limited to online platforms with at least two million registered users), the “Avia” law in France, and the Communications Platforms Act (Kommunikationsplattformen-Gesetz) in Austria.


Authors: Dr. Moritz Holm-Hadulla, Dr. Hannah Bug, Kristina Winkelmann
