Insight

The EU’s Digital Services Act: A Primer

Executive Summary 

  • The European Union’s Digital Services Act (DSA) would introduce stricter regulations for internet intermediaries and digital platforms, requiring the implementation of a virtual complaint system, setting yearly reporting requirements, and threatening fines.
  • The DSA also includes a “very large” distinction for platforms with an active user base of over 10 percent of the European population, which brings with it more stringent reporting and monitoring standards, yearly external audits, and higher fines.
  • The DSA would not only affect companies located in the European Union; its “very large” distinction would also directly target American companies, harming consumers in the United States and around the globe.

Introduction

The Digital Services Act (DSA) is part of the European Commission’s latest effort to regulate tech companies, alongside the Digital Markets Act and the Digital Services Tax. These regulations are part of its “Shaping Europe’s Digital Future” digital strategy, which seeks to create a new regulatory framework to govern tech companies. Considered together, these policies signal that the European Union (EU) would like to aggressively regulate the tech industry. The approach taken in these proposals risks harming businesses and consumers both in Europe and around the globe, and would be particularly burdensome for American companies.

What is the DSA?

The DSA focuses on content moderation and online advertising. It seeks to regulate “online intermediaries and platforms,” which include “online marketplaces, social networks, content-sharing platforms, app stores as well as online travel and accommodation platforms.” The DSA would replace the E-Commerce Directive, initially passed in 2000. The Commission believes the current directive has a limited capacity to create a “consistent, cross-border supervision of these online intermediaries” because it is too open-ended, leading to inconsistent application by member states. To address this shortcoming, the European Commission has proposed a package of additional tech policy legislation that imposes new requirements on online intermediaries and platforms, such as algorithm disclosure, reporting, and transparency obligations.

Together with other elements of the Shaping Europe’s Digital Future strategy, the DSA would expand the EU’s prescriptive approach to tech and innovation. Such an approach is a stark contrast to the United States’ current light-touch regulatory approach to tech policy, which has resulted in the United States being home to eight of the 10 highest-valued tech companies in 2019, while no European company makes that list.

Key Elements of the DSA

The DSA would introduce regulations on a wide range of topics in the digital economy, from defining the different types of internet service providers to content moderation, digital advertising, internal complaint systems, and external audits, among others. This primer highlights five key elements of the DSA.

The “Very Large Platforms” Definition

The DSA introduces a distinction among online intermediaries by creating the category of “very large platforms,” defined as those with more than 45 million active users, a threshold equivalent to roughly 10 percent of the EU’s population. This category would include not only traditional social media giants but also newer entrants such as TikTok, Twitter, and Snapchat. Advocates for this proposal claim that, due to their size, these platforms can pose larger “societal risks,” including a higher risk of spreading illegal content, violations of fundamental rights, manipulation of their automated systems, and dominance of the digital advertising market. If realized, these risks could lead to negative impacts on public health, civic discourse, electoral processes, and national security.

Companies subject to this distinction would face additional regulations regarding content moderation and targeted advertising. These include yearly external audits at their own expense, stricter monitoring and supervision by European authorities, and higher reporting standards for their advertising and content-moderation practices. The DSA also mandates that these platforms establish a risk assessment strategy, which would be part of the auditing process.

Algorithm Disclosure

The DSA would require online intermediaries subject to its regulations to publish the parameters of the algorithms used for content moderation and targeted advertising upon request by the Commission. It also authorizes the Commission to conduct on-site inspections to review the algorithms and ask questions. Platforms that fail to publish their parameters or allow inspections would be subject to a fine of up to 10 percent of their total revenue for the fiscal year.

Reports on Content Moderation

The DSA also stipulates that platforms must release transparency reports, both yearly and upon request of the Commission. Platforms must publish “clear, easily comprehensible and detailed reports” on their content-moderation efforts. The reports must include: moderation of illegal content ordered by member states, including the average time taken to comply with each order; content removed in response to user reports; content removed under the platform’s own criteria; and the number of complaints received through the mandatory internal complaint system.

Virtual Complaint System

Finally, the DSA also mandates that online platforms establish an “effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge” for recipients of the service. This system must allow consumers to push back against certain content-moderation practices, and platforms must provide access to it for at least six months after the removal of a user’s content.

This system must allow consumers to lodge complaints against the following:

  1. decisions to remove or disable access to information;
  2. decisions to suspend or terminate the provision of service, in whole or in part, to the recipients;
  3. decisions to suspend or terminate the recipient’s account.

Additionally, platforms must ensure the complaint systems are “easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints,” and handle complaints in a “timely, diligent and objective manner.” Further, if a platform finds that removed content is neither illegal nor against its terms of service, it must reverse its decision without “undue” delay.

Fines

If platforms fail to comply with the regulations stipulated by the DSA, they could be subject to fines of up to 10 percent of global revenue in the case of repeat offenders. Some of these fines specifically target companies falling under the “very large” distinction, with fines of up to 6 percent of global revenue for first-time infractions of the act’s provisions and 1 percent for errors in reporting. These sizeable fines are apparently not meant to be simply deterrents: there are reports that the EU is already incorporating their revenue into its budget, which, as the American Enterprise Institute highlights, signals that the EU expects companies will not be able to comply with these regulations.

The Potential Impact of the DSA

The DSA would impose significant regulatory burdens on tech companies operating in Europe, including many American companies. The introduction of the “very large” distinction would directly change the regulatory regime faced by successful American companies such as Facebook, Google, and Twitter. This outcome does not appear to be coincidental. As the Competitive Enterprise Institute’s Adam Young points out, the DSA looks like another chapter of protectionism in the ongoing U.S.-EU trade tensions, placing American companies at a disadvantage through heightened scrutiny, heavier regulatory burdens, and steeper fines.

But it is not just this new classification that would impact the tech market. Requiring disclosure of algorithms also risks harm to both consumers and innovation. While the DSA does not specify the extent of the required disclosure, publication of the platforms’ algorithms could severely impact the nature of their business. Public access to these algorithms would let spammers, scammers, and other ill-intentioned individuals easily “game” them, using the parameters to promote their own content. As a result, consumers could face increased exposure to disinformation, fraud, and spam, making their online experience undoubtedly worse. Forced public disclosure would also harm platforms’ innovative processes: algorithms are usually protected as trade secrets because platforms compete on their ability to serve consumers the content most relevant to them. Once the algorithms are public, they are more easily copied by competitors, diminishing the returns on investing in a good algorithm.

The mandate on reports, while promoting the spirit of transparency and giving users access to valuable information, can also act as a double-edged sword: by defining a certain standard of “transparency,” such a mandate could incentivize platforms to limit themselves to what is on the checklist. Many large platforms (e.g., Google, Facebook) already voluntarily publish these reports. And while established players have the capacity to maintain robust reporting teams, the requirement would impose additional costs on smaller companies, forcing them either to spend scarce capital on reporting teams or to face the significant fines proposed by the DSA. Additionally, while transparency reports might be useful for conveying information about content-moderation standards, mandating them by law might violate free speech protections.

In a similar fashion to the reporting requirement, the virtual complaint system is well intentioned but faces challenges regarding compliance. The vague and open-ended language can generate confusion for businesses about what an “easy to access, user-friendly” system is. Businesses are thus left at the mercy of the regulating agencies’ interpretation, leaving the door open for regulatory overreach. While this section might be valuable in a directive, which leaves member states discretion in implementation and concentrates on encouraging good practices, its inclusion in a regulation, which is directly binding, means companies can expect its enforcement. There is no objective definition of what “easy to access, timely, or user-friendly” means, and the definition can be stretched or narrowed depending on who is in charge of evaluating compliance. This lack of certainty introduces risks for businesses that might be unfairly punished for making a wrong judgment call or deviating from a definition set by the regulators.

These measures, when considered together, could yield significant compliance costs. Establishing the complaint systems and producing the moderation reports comes at a cost in both money and human resources. Each person dedicated to creating these reports or to coding and designing the complaint system is a person not working to meet consumer needs or to improve the product the business is meant to provide. These costs are especially prohibitive for start-ups and small businesses, which are often understaffed, as they add more tasks to a workforce that tends to be overworked already. They can also affect American consumers by raising the prices of goods and services or forcing them to miss out on beneficial improvements while the businesses’ resources are tied up in compliance.

Thus, these costs might actually limit competition by burdening small businesses with costs they cannot easily absorb, thereby strengthening the position of current incumbents. This dynamic was evident with the implementation of the General Data Protection Regulation, when some businesses pulled out of the European market because complying with the regulation was too costly or impractical.

Conclusion

The DSA would put many American tech companies at a disadvantage by subjecting them to more stringent monitoring and reporting standards alongside higher fines. And with companies around the globe required to publicly disclose the key parameters of their algorithms, consumers everywhere would be more susceptible to spam, scams, and fraud, as ill-intentioned actors could tailor their content to the algorithms. The result would be to the detriment of consumers, who would have to settle for less-ideal products with more exposure to threats, or miss out on certain services due to the high compliance costs.
