Insight

Does Content Moderation Need Changes to Section 230?

Executive Summary

  • Recent proposals to change Section 230, the liability shield that protects online intermediaries from being sued for their users’ content and allows them to make prudential decisions about content moderation, would make content moderation online more difficult.
  • Section 230 has been criticized by both the left and the right, but its underlying principles remain important and popular.
  • The potential impact of any changes to Section 230 must be considered in a far broader context than just the tech giants and social media platforms: Changing this law would stymie innovation, threaten smaller companies, and ultimately limit the options for speech online.

Introduction

Section 230 is the provision of law that has facilitated the explosion of user-generated speech on the internet. By shielding online intermediaries (from large social media companies like Facebook to the comments section of a local newspaper or personal blog) from liability for their users’ content, Section 230 gives these platforms the freedom to moderate content as they see fit. The result has been a proliferation of internet communication venues and an explosion of online speech over the last two decades. But a number of policymakers are seeking to overturn or heavily alter this protection, as two recent proposals demonstrate.

On June 17, the Department of Justice (DOJ) unveiled recommendations for changes to Section 230. The same day, Senator Josh Hawley introduced a new bill that would alter this important liability shield. Both actions came shortly after President Trump issued an executive order on social media that called for changes to Section 230. While these latest proposals may lack some of the constitutional problems of the president’s executive order, they would still undermine the explosion of free expression that has occurred in the digital era and complicate the already difficult task of content moderation.

New Proposals on Section 230 Changes

Both the DOJ and Senator Hawley’s proposals would change Section 230 from its current, rather straightforward liability protection to a standard that is much more difficult to understand and apply.

The DOJ proposal would require additional moderation of numerous illegal and illicit activities such as drug sales and child pornography. While companies should certainly do all they can to stop these atrocious activities, Section 230 immunity already does not apply to federal criminal law. The impact of lowering the standard for when companies are liable for third parties’ use of their platforms can be seen in the two years since an additional carve-out to Section 230 regarding sex trafficking (FOSTA-SESTA) was passed. Companies such as Salesforce have found themselves subject to litigation because wrongdoers abused their otherwise innocuous tools, while other websites such as Craigslist have deleted entire sections, like personals, for fear they could be abused and increase the company’s liability.

Beyond adding moderation requirements, the DOJ seeks to clarify that Section 230 is not a defense against antitrust claims. The proposal would also more strictly limit the content that platforms can moderate by asking Congress to define “good faith” and to narrow the categories of objectionable content companies can remove.

Senator Hawley’s latest proposal for changing Section 230 would significantly increase the amount of litigation companies face and the difficulty of making content moderation decisions. The bill would allow litigation, with statutory damages of $5,000 per violation, over any content removal that was not in compliance with the terms of service or done in “good faith.” As TechDirt’s Mike Masnick writes, “The shortest version of the bill’s likely impact is that it would create an army of ‘content moderation troll’ lawyers, because you could sue any platform that you felt removed your content unfairly and get $5,000 plus attorney’s fees.” Such a change would expose large platforms to numerous lawsuits, but the burden would fall even harder on smaller, newer players who may be unable to afford to defend a case even if they would succeed in court. This could deter the continued development of new options for messaging, reviews, or social media, all of which rely on user-generated content. Most fundamentally, the proposed change would undermine the intent of Section 230 to allow a wide range of content moderation decisions.

Section 230 Is Still Needed (and Wanted)

While current proposals and criticisms of Section 230 may come primarily from Republicans, misunderstanding of content moderation is bipartisan. Although both Democratic presidential candidate Joe Biden and Speaker of the House Nancy Pelosi criticized President Trump’s executive order on social media, each has at other times called for revoking or significantly changing Section 230. But regardless of who makes such calls, Section 230 remains important to sustaining a wide variety of expression and innovation.

Contrary to some arguments, Section 230 remains important not only for social media platforms but also for a wide variety of beneficial and innovative online resources. Beyond social media, information websites such as Medium and Wikipedia rely on Section 230 because their content is generated by users. Review sites such as Yelp and home rental platforms such as Airbnb likewise know that, thanks to Section 230, a dispute over a bad review cannot become a defamation case against them. Even the comments sections on articles from major newspapers receive Section 230 protection. Changing Section 230 would affect all of these beneficial uses, not just the social media platforms that critics often claim to be targeting.

Further, new Gallup and Knight Foundation polling reveals that changing Section 230 might not be as popular as some politicians think. While the vast majority of Americans surveyed were distrustful of social media companies, nearly two-thirds still supported the underlying principles of the law. Where there are concerns about specific content moderation decisions, a better answer is to encourage competition and platform-level change, not to rewrite the legal framework in which all platforms operate. Whether the concern is over-moderation or under-moderation, removing or significantly changing Section 230 would leave most platforms without a middle ground, returning them to an unfortunate dilemma between silencing legitimate speech and forgoing moderation altogether.

Conclusion

Section 230 is often misunderstood, and changing it could have consequences well beyond today’s popular social media platforms. The potential impact of any changes must be considered in a far broader context than just the tech giants, and it should also account for the benefits that the explosion of user-generated content online has brought.
