Insight

The Problems With Applying Common Carriage Regulations to Online Platforms

Executive Summary

  • Federal lawmakers, seeking to combat alleged discrimination against conservative speech, have proposed common carriage regulations that would prevent online platforms from removing or moderating hosted content.
  • Common carriage regulations historically ensured non-discriminatory public access to services such as shipping and telecommunications and could theoretically be applied to online platforms to ensure users can share information on these services regardless of their political beliefs.
  • Applying common carriage regulations to online platforms raises First Amendment concerns, could conflict with the protections of Section 230 of the Communications Act of 1934, would impose economic costs on platforms, and would harm platforms’ functionality and the user experience.

Introduction

In response to allegations of anti-conservative bias by online platforms including Google, Meta, Twitter, and Amazon, Republican Senators Bill Hagerty (TN), Roger Wicker (MS), and Lindsey Graham (SC) introduced legislation in the 117th Congress to impose common carriage regulations on online platforms, which would force platforms to carry all content indiscriminately. Currently, online platforms use algorithms to rank, recommend, and index content, serving users content they are likely to find interesting, while removing content that violates the platforms’ terms of service or that users or advertisers find objectionable. Legal protections offered by the First Amendment and by Section 230 of the Communications Act of 1934, which provides intermediary liability protection for hosting and removing user-generated content online, make this moderation and curation possible.

Yet because some conservative lawmakers see current content moderation policies as failing to protect conservative speech, they have proposed common carriage regulations, which would largely ban these platforms from moderating content by requiring that they host all user-generated content indiscriminately. Such legislation may find widespread support, as lawmakers on both sides of the aisle have expressed a desire to regulate platforms’ ability to rank and recommend content. Meanwhile, the Supreme Court’s 2022–2023 docket features a case on the legality of state-level common carriage requirements for social media companies, which could clarify the bounds of current regulation as well as future federal action.

Common carriage regulations, derived from English common law, were initially applied to a variety of firms but became concentrated in industries dealing with the transmission of goods and information, such as freight, transportation, and telecommunications. These regulations shield the carrier from liability for the contents of the goods it transports in exchange for non-discrimination by the carrier. Applied to tech companies, such regulations would thus protect online speech by shielding platforms from liability while discouraging them from removing or moderating constitutionally protected speech. Nevertheless, legislators should consider the costs of imposing common carriage regulations on online platforms: the difficulty of properly applying legacy regulations to novel technologies, the potential that common carriage regulations infringe on platforms’ editorial rights under the First Amendment and Section 230 of the Communications Act of 1934, and the potential harms to platforms’ economic model and the user experience.

Content Moderation, Common Carriage, and the Case for Regulating Online Platforms

Critics of content moderation assert the practice grants a few companies in Silicon Valley too much power to decide what speech is allowed online. Such concerns intensified after social media companies limited the reach of, and removed, posts dealing with issues such as the origins of COVID-19 and voter fraud in the 2020 election. Calls for action accelerated when Facebook and Twitter removed former President Trump from their platforms in response to his posts leading up to and during the United States Capitol attack on January 6, 2021. Since 2021, federal lawmakers have proposed laws that would limit platforms’ ability to moderate content by designating them as common carriers. These efforts coincided with laws passed in Texas and Florida, both of which restrict platforms’ ability to moderate user-generated content and impose aspects of common carriage regulation. The Supreme Court will rule on the constitutionality of the Texas law in its upcoming term, clarifying whether common carriage regulations can be applied to online platforms going forward.

Common carriage emerged in 18th-century England to ensure that the public had non-discriminatory access to essential services. Common carrier designation depended on the nature of the business: specifically, whether the service was offered indiscriminately to all members of the public, and whether the business influenced or altered the goods or simply transported them.

In the United States, regulators have applied aspects of common carriage regimes to evolving technologies, specifically telecommunications, albeit inconsistently. The Communications Act of 1934 established the Federal Communications Commission (FCC), and Title II of the act empowered it to regulate telecommunications companies as common carriers, complete with liability protection, non-discrimination requirements, rate regulation, and tariff scheduling. As a result of innovation in and deregulation of the telecommunications industry, “quasi-common-carriage,” in which some rules reserved for common carriers are applied to private firms and vice versa, has extended some of these regulations to firms that may not be common carriers. Public access requirements for cable broadcasting are a prominent example, as is the failed “Fairness Doctrine.” For proponents, this history offers a justification for applying legacy regulations to novel technologies.

Yet this history also raises questions about affixing economic regulations designed for analog and early broadcast technology to the websites and apps of today. Non-discrimination requirements would force platforms to treat user-generated content the way a telephone company treats a phone call. But while telephone networks are natural monopolies, the same cannot be said for online platforms, where barriers to entry are low and millions of websites compete with one another and with other industries for users’ attention and firms’ advertising dollars. Even advocates of common carriage concede that creating a workable regulatory regime without destroying core platform components or violating platforms’ constitutionally protected editorial rights poses a real challenge. A non-discrimination requirement would prevent platforms from sorting and recommending content that individuals find interesting or useful, a central component of competition and of improving the user experience online.

The Costs of Common Carriage

Legislation embracing common carriage could run into constitutional roadblocks depending on the outcomes of NetChoice v. Paxton and Gonzalez v. Google. The former will address non-discrimination requirements that bar content moderation based on “viewpoint,” and the latter weighs whether Section 230(c)(1) immunizes platforms when they make targeted recommendations. Currently, private firms are free to moderate content on their platforms, protected by the editorial rights guaranteed by the First Amendment and shielded from intermediary liability for the speech they host and remove by Section 230(c)(1) and (c)(2) of the Communications Act of 1934. Paxton is a straightforward test of applying common carriage regulations to online platforms, an idea many have challenged as inconsistent with First Amendment protections for publishers and private actors. Gonzalez has implications for the future of Section 230, algorithmic content moderation, and online discourse broadly. Together, these cases should clarify the reach of the First Amendment and Section 230 online, test the strength of common carriage arguments for online platforms, and provide a measuring stick for legislation aimed at algorithmic moderation and recommendations.

Imposing common carriage regulation on social media raises significant concerns about its economic impact on platforms. Content moderation is a profit-maximizing decision, and regulations that strip this authority will likely harm companies that rely on advertising revenue. Another concern is expanded authority for regulators at the FCC to impose tariffs and other economic regulations on platforms. Consistent with its Title II authority, the FCC has wide latitude to impose regulations, and as the agency pushes regulation to address social goals, platform autonomy and profit- and welfare-maximization may suffer. Compounding these harms, stripping away the protections offered by Section 230 would expose small- and medium-sized platforms to costly lawsuits whenever they curate or “edit” content in a way that a user or attorney general dislikes. Restricting moderation disincentivizes innovations that would better serve users, propping up established platforms that can coast on network effects, regulatory capture, and large legal departments. Differences in content curation drive competition among online platforms, and restricting their ability to innovate in moderation would impose economic costs on firms and their users.

Along with these economic harms, advocates of common carriage undervalue the damage regulations could do to the user experience and consumer welfare. A key feature of online platforms is the way their algorithms rank, index, and suggest content to users; TikTok’s algorithm and content curation fueled its ascent in the social media space. Laws that restrict platforms’ ability to make “editorial decisions” would deprive them of a key part of their appeal. Moderators remove posts that violate community guidelines, such as hate speech, “doxing” (sharing a person’s personal information for malicious purposes), spam, or misinformation. These editorial decisions are driven by economic incentives to raise consumer welfare, and they illustrate platforms’ “modification” of the content they provide, a departure from the non-discriminatory transmission that characterizes many common carriers.

Conclusion

Common carriage regulations apply to industries and companies that move goods indiscriminately for all individuals, regardless of their contents. While there are similarities between legacy communications services and current online platforms, editorial discretion and investment in technology to serve content to consumers are what differentiate one platform from another, a clear departure from the character of a common carrier. Even if support for common carriage continues to grow, legislators must weigh the costs these regulations can create: the difficulty of applying legacy regulations to novel and evolving technologies, the questionable legality of common carriage regulations under the First Amendment and Section 230, and the harms to platforms’ economic viability and their incentives to maximize the user experience online.
