The Kids Online Safety Act Lame Duck Push

Executive Summary

  • A bipartisan group of senators unveiled updated text of the Kids Online Safety Act, a bill that would impose a duty of care on online platforms to act in the best interest of minors and mitigate the harms minors face from using their services.
  • The new text attempts to address some previous concerns with the bill, most notably by limiting the duty of care to instances in which the platform knows or should know the user is a minor and by limiting platforms’ liability when minors actively seek out information on potentially sensitive topics.
  • The new language does not fully address the concerns surrounding age verification and free speech online, however, because the constructive knowledge standard goes beyond current federal law governing children’s privacy and adds uncertainty regarding what satisfies the duty of care.

Introduction

On December 13, 2022, Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) released updated text of the Kids Online Safety Act (KOSA) in hopes of including the legislation in the year-end omnibus spending package currently being negotiated in the lame-duck session. The bill would establish a duty of care for covered platforms, requiring them to act in the best interest of minors using their services. A previous insight from the American Action Forum (AAF) highlighted, however, that many of KOSA’s provisions could do more harm than good, such as a duty of care that would essentially require platforms to implement age verification of users and a broad scope that could cover entities the drafters did not intend. The new bill text addresses some of these concerns, though not entirely. Congress should carefully consider how to balance protecting children with allowing innovation of, and free speech on, online platforms.

This primer briefly explains the key changes to the bill and how they may affect its implementation.

Adding a Knowledge Standard

Under the original language, KOSA would have established a broad duty of care for platforms to act in the best interest of minors using their services, with no language limiting that duty to users the platform knows are minors. In other words, lacking any knowledge standard, the original language would have essentially forced all covered platforms to verify the age of all users. While age verification has a place in protecting children online, mandating it would create a variety of privacy and safety problems for users.

The new language limits the duty of care to users whom the platform “knows or should know” are minors. This limiting language would give covered platforms some additional protection and could allow them to comply with the law without verifying the age of users. For example, if a platform tends to host more adult content and a user does not behave as a typical minor would, the platform can argue that it lacked the knowledge required under the law and is therefore not liable for violating its duty of care to that minor.

The added bill text, however, does not eliminate all concerns with KOSA. Current federal law protecting children online, the Children’s Online Privacy Protection Act, operates under an actual knowledge standard, meaning the covered platform must know the user is a minor for protections to apply. Critics argue that an actual knowledge standard allows platforms to avoid responsibility by simply remaining oblivious to potential risks, and that a constructive knowledge standard would better ensure platforms consider the safety of their users. At the same time, a constructive knowledge standard would add risk and uncertainty for platforms: the Federal Trade Commission could argue, for example, that a user’s behavioral patterns suggest the user is in fact a minor, and that failing to protect that user makes the platform liable for violating its duty of care. As a result, even with the limiting language, many platforms would likely feel compelled under KOSA to require age verification to limit potential liability.

Limitations on Seeking Content

As originally drafted, KOSA would likely have encouraged covered platforms to simply prohibit the discussion of any potentially harmful topics to avoid liability. The previous AAF insight on KOSA argued that the original version of the bill could have prevented minors from finding information on important issues such as mental health and body image. Again, the limitations in the added bill text may help to alleviate some of this risk, but not all of it.

To address this concern, drafters included a limitation holding that nothing in the duty of care “shall be construed to require a covered platform to prevent or preclude any minor from deliberately and independently searching for, or specifically requesting, content.” This limitation would essentially absolve platforms of any liability for instances when the minor specifically seeks out content, but not instances when the content is recommended or suggested by the platform to the user.

Recommendations, however, are a key tool that helps users discover content they may find helpful. For example, if a minor specifically seeks out content on suicide, recommendation algorithms can be used to surface suicide prevention resources. If recommendations could lead to liability, even positive content may end up limited, as platforms refrain from making recommendations altogether rather than defend those decisions in court.

Further, a platform may find it difficult to prove that a user sought out the information independently and may judge the risk of liability too great. As a result, even with KOSA’s new limiting language, many platforms may opt to remove these discussions entirely.

Applicability to Different Services

In KOSA’s original text, the definition of a covered platform included any commercial software application or electronic service that connects to the Internet and is likely to be used by a minor. This broad definition encompassed everything from broadband providers and movie studios to video game developers and social media platforms. In the year since KOSA’s introduction, this definition has been narrowed, but not significantly. Under the new text of the bill, the definition specifically includes many of these services and excludes only traditional utility telephone and text-messaging services, nonprofits, and schools.

This widespread applicability opens the door to a range of potential negative impacts on speech. For example, a movie studio that owns a streaming service could run afoul of the law by releasing a television show that some view as glorifying drug abuse. A video game developer could run afoul of the law by developing a multiplayer game that allows voice chat between players, especially if that game is freely accessible to all users. Even a broadband provider could face legal trouble for allowing users to access a wide range of websites.

Conclusion

While the new changes to KOSA’s legislative text attempt to address some of the more significant concerns in the original bill, they do not fully resolve the problems with age verification and free speech online. Congress should carefully consider the potential tradeoffs as it works to include KOSA in lame-duck legislation.