
What’s Wrong with Changing Section 230?

Executive Summary

  • Recent debates around Section 230—a federal statute that shields online intermediaries that host or publish speech from legal liability—have focused on bad actors’ perceived exploitation of this liability protection, but policymakers should also consider the impact of policy changes on the beneficial uses of these technologies and on how online platforms have enabled more speech.
  • Current law already excepts violations of federal criminal law from this liability protection, and policymakers should consider whether responding further to illegal behavior online requires changes to liability or could be better addressed by giving law enforcement additional resources and tools.
  • Conditioning Section 230 on compliance with government recommendations raises concerns about potential First Amendment violations and an expansion of the power of the administrative state.

Introduction

Policymakers are debating whether one of the internet’s foundational legal statutes is allowing nefarious behavior to flourish—and thus whether that law needs to change. The latest indication of this debate is a recent Department of Justice (DoJ) workshop on whether Section 230, a federal statute that shields online intermediaries such as social media platforms from legal liability for the speech and other user-generated content they host, was “nurturing innovation or fostering unaccountability.” This workshop’s—and implicitly the DoJ’s—focus on Section 230 is not isolated: Members of Congress have also proposed changes to Section 230 in an effort to regulate online content and the platforms hosting it more tightly.

In general, proposed changes to Section 230 suggest either adding carve-outs or creating conditions for receiving this liability protection, which is critical to the ability of a wide range of online platforms to host user-generated content. While many of these proposed changes seek to address obviously harmful and already illegal online behavior such as sex trafficking or child sexual exploitation, changes to Section 230 could have broad and negative consequences for the economy and society. Changing this law could stifle both legitimate free speech and new online entrants, while doing little to further address the underlying behaviors of concern, which typically are already illegal.

What Is Section 230 and Why Are There Calls to Change It?

Section 230 has been called “the 26 words that created the internet” because of the wide variety of user-generated content that this law has enabled. Section 230 has two key elements. First, it prevents an online platform from being treated as the publisher of user-generated content. This protection applies not only to social media platforms, but also to listings on home-sharing platforms, review sites, information on Wikipedia, and the comments sections of blogs and newspaper articles online. Second, it protects such sites from liability for good-faith content moderation decisions to take down or leave up user-generated content.

There are a few carve-outs from Section 230 and its liability protection, including for violations of federal criminal law and, more recently, a carve-out specific to sex trafficking. But now there seems to be growing momentum for broader changes to the law, with critics blaming this liability protection for everything from violations of local laws by home sharing to furthering illegal sales of drugs in the opioid epidemic. In addition to the conversations around Section 230, there has also been increasing attention on end-to-end encryption and law enforcement’s concerns about its potential for abuse. Attorney General William Barr, in his opening comments before the workshop, discussed how the Department’s interest in Section 230 had arisen in the context of its review of market-leading online platforms, and how the law was relevant to law enforcement concerns about lawless places where bad actors could become invisible to law enforcement. Similar thinking, including the idea of conditioning Section 230 protection on best practices subject to the Attorney General’s determination, appears in a draft proposal to be called the EARN IT Act from Senators Lindsey Graham (R-SC) and Richard Blumenthal (D-CT). Such a proposal could tie Section 230 protection to companies’ compliance with those best practices, potentially including requirements for backdoors to encryption, and thus merge what have largely remained separate political fights. Similarly, the DoJ has expressed concerns about encryption and its potential to hide various malicious activities.

While addressing such illegal activity should be a top priority for law enforcement, and technology companies should continue to work hard to create tools that make it easier to identify and stop illegal content, creating backdoors to encryption or conditioning Section 230 protection on meeting specific government standards would carry many consequences and potential risks for both online speech and cybersecurity. Under the protections of Section 230, the internet has yielded a new platform for many voices and new economic opportunities that would have been difficult to achieve if the government treated platforms as publishers of the user-generated content they host.

Section 230 and Its Exceptions

As Stand Together’s Neil Chilson described at the DoJ workshop, recent policy proposals for changing the current liability protections under Section 230 primarily fall into two types: carve-outs and bargaining chips. At times the rhetoric and proposals for changes certainly involve both elements. The changes to Section 230 made in 2018 by the Stop Enabling Sex Trafficking Act (SESTA) are a good example of the carve-out approach: The law created a new carve-out specific to sex-trafficking-related content, allowing civil and state liability in addition to the existing exclusion of federal criminal activity from Section 230 protection. Since the passage of this law, there have been additional suggestions that Section 230 may need further carve-outs for a variety of other illegal actions, such as opioid sales, as well as for less nefarious activity, such as violations of local rules on home sharing.

SESTA exemplifies the carve-out approach’s potential impact. In the case of sex trafficking, as with many other concerns such as the sale of opioids, child sexual-abuse material, or terrorism, the underlying action is already illegal at the federal level. The federal government was able to close the website Backpage.com, notorious for claims of underage sex trafficking, before SESTA was signed into law. An additional carve-out does little to change the position of such truly bad actors, who already lacked Section 230 protection, but it creates additional burdens for platforms that are not soliciting such content and are engaged in content moderation but may sadly miss something. In a post-SESTA world, Reddit removed certain subreddits it felt were more likely to contain content for which it could be liable, and Craigslist closed its personals section.

Far from only targeting bad actors, changes to Section 230 via carve-outs require more monitoring by all websites. This exception approach removes the level playing field for small and mid-size players just getting started, as their platforms will now require much closer monitoring in certain areas in addition to general content moderation. By providing liability protection for user-generated content, Section 230 gives smaller players and their investors certainty that they will not be subject to expensive (or perhaps frivolous) litigation because of a user’s misuse of their platform. These concerns are not unfounded: Companies such as Salesforce and Mailchimp are finding themselves subject to litigation for allegedly facilitating sex trafficking based on bad actors’ use of their platforms. Even when successfully defended, such litigation is a burdensome and costly process for companies that could previously rely on Section 230.

As SESTA and its impact so far show, even for widely agreed-upon harms, carve-outs can have a much greater impact on online speech than just on the illegal and harmful behavior they target. Additional carve-outs would further complicate matters and could undermine the way Section 230 solves the “moderator’s dilemma.” In a world without Section 230, online platforms are forced to choose between not engaging in moderation at all, so as not to risk liability, and constantly policing content at a high cost, including silencing any speech that might fall into “gray areas.” The carve-out approach removes the certainty that Section 230 provides and risks a return to these two less-than-ideal choices.

Should Companies Have to “Earn” Section 230?

Other proposals suggest that tech companies should have to earn the privilege of Section 230 liability protection through certain actions in conjunction with agency mandates. This approach can be seen, for example, in proposals that would require proof of political neutrality or compliance with “best practices” that could in turn require giving law enforcement a backdoor to end-to-end encryption. There are significant reasons to worry that requiring companies to “earn” Section 230 protection would be ripe for abuse or would create sweeping government power over speech.

In some cases, such proposals raise constitutional concerns by making the government a potential regulator of speech. This concern is particularly acute for proposals that would require platforms to prove political neutrality to a government entity, as was proposed in Senator Josh Hawley’s (R-MO) Ending Support for Internet Censorship Act. Such proposals raise First Amendment concerns by having the government dictate to private companies the nature of the content they carry. Beyond that problem, these proposals should also raise concerns about increasing government intervention in the discourse between individuals, as well as in the companies whose standards the government would dictate.

But political neutrality is not the only condition for Section 230 liability protection that policymakers have proposed. The circulating drafts of the proposed EARN IT Act would make Section 230 protection contingent on compliance with a DoJ commission’s best practices. This or similar structures should raise concerns about the power given to a single individual or government entity, as well as potential violations of the non-delegation doctrine, as TechFreedom’s Berin Szoka points out. For those concerned about the growth of the administrative state, such a delegation would empower unelected bureaucrats to craft the rules rather than allowing differences in content moderation among platforms. At the same time, the current rules already create exceptions for federal criminal activity, providing law enforcement the tools to go after the truly bad actors online as well as offline.

Requiring companies to earn Section 230 protection, rather than applying the statute generally, carries many potential risks, including the politicization of the requirements or their being dictated in a way that grants the privilege only to government-selected winners. Section 230 allows a marketplace for content moderation rather than having the government dictate content. For already illegal content, policymakers should look at ensuring that law enforcement agencies have the resources to use existing tools against such behaviors, rather than placing platforms back in a difficult moderator’s dilemma.

Conclusion

It is important for law enforcement to be able to pursue bad actors that use the internet to conduct illegal behavior. But we must also consider how liability protections like Section 230 enable more voices—and thus how changing Section 230 could impact the beneficial uses of online platforms. These questions and concerns arise in a broader debate over the impacts of technology, and some proposals in this debate treat one policy as a silver-bullet solution rather than unpacking the many different policy issues involved. Companies should do all they can to address the illegal activities driving many of the calls to change Section 230. But rather than seeking to change Section 230, policymakers should better address the underlying illegal and harmful activity, provide law enforcement the resources to pursue it, and recognize the benefits of Section 230 in enabling a broad range of innovative platforms and discourse online.
