Insight

Kids Online Safety Act Could Do More Harm Than Good

Executive Summary

  • The Senate Commerce Committee recently unanimously approved the Kids Online Safety Act (KOSA), which would require platforms to protect children from dangerous materials online.
  • KOSA would create a “duty of care” for covered platforms, ranging from social media to streaming services and video games, to act in the best interest of minors, allowing the Federal Trade Commission to seek significant fines against companies that fail to protect minors from harms experienced on those platforms.
  • The bill would likely come with significant tradeoffs, such as requiring minors to provide information to verify their age or making it more difficult for them to find information about challenges they may be facing, such as mental health struggles or addiction.

Introduction

Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) this year introduced the Kids Online Safety Act (KOSA), which would create a new “duty of care” for social media companies to prevent harm to minors, as well as require platforms to create tools for parents to better control their children’s online experience. If a company violates this duty of care, the Federal Trade Commission (FTC) and state attorneys general could threaten significant fines against the platform.

Social media companies have taken a multitude of steps to help protect children from harmful Internet content, but kids still face risks online. While legislators designed the bill to address these harms, the legislation would likely come with significant tradeoffs, and Congress should carefully consider potential unintended consequences for both children and adults using the Internet. Because the bill creates a generalized duty of care for these platforms with no bright-line rules, platforms would likely need to gather additional information to verify user identities, including their ages, to avoid potential litigation. In practice, the bill could also restrict access to important information for both children and adults.

After unanimously passing out of the Senate Commerce Committee, the bill may soon be taken up by the full Senate. In considering the legislation, lawmakers should carefully weigh the relative tradeoffs it would require, and work to limit its negative impacts while maximizing the benefits.

Kids Online Safety Act: What the Bill Does

KOSA would implement two key provisions designed to protect children online. First, the bill would create a duty for covered platforms, ranging from social media services to video games and streaming services, to act in the best interest of a minor who uses the platform’s products and services, as well as a related duty to prevent and mitigate “heightened risks of physical, emotional, developmental, or material harms.” A duty-based approach would create a fairly broad standard for platforms but would also allow for enforcement flexibility: Even if the bill does not explicitly outlaw a specific practice, courts may still find that a platform’s implementation of that practice fails to conform to the duty of care the platform owes minors using its service.

Second, KOSA would create a series of “safeguards for minors,” ranging from settings that prevent other individuals from viewing a minor’s personal data to default settings that offer the strongest protections available, along with control options that do not encourage minors to weaken or turn off these protections. These settings, paired with parental tools, are designed to give minors and their parents better control over the types of content platforms promote and to limit the data collected and shared about the minor.

In addition to these two key features, KOSA would also impose disclosure and transparency requirements on platforms. These requirements are also fairly broad, most significantly requiring a platform to explain how it uses algorithmic recommendations. The level of detail to which a platform must describe its recommendation model remains unclear, however, as the law might require only that a platform disclose that a recommendation model exists and that content is delivered to the child based on that model.

To implement these provisions, KOSA would grant the FTC authority to enforce violations of the act under its unfair or deceptive acts or practices authority. The bill would also allow the FTC to use informal rulemaking under the Administrative Procedure Act to implement the safeguards as well as the disclosure and transparency requirements. Further, state attorneys general could likewise enforce the provisions of the act and any regulations the FTC implements, though the FTC would retain the authority to intervene in state cases.

Age Verification Concerns

Unlike previous efforts to protect children online, KOSA doesn’t require a platform to know an individual is a minor for the protections to apply. Instead, the bill imposes a broad duty of care for platforms to “act in the best interests of a minor that uses the platform’s products or services” and, in doing so, prevent harms the minor may face on the platform. Nothing in this duty limits its application to cases in which the platform knows that a given user is a minor.

To avoid the risk of liability, a covered platform could independently verify the age of its users before allowing them onto the platform, ensuring that minors are identified. In fact, the bill expressly contemplates this outcome, requiring a study of technologies to verify user age at both the device and operating system levels. If a service can verify a user’s age, it can then tailor the experience for minors in an attempt to comply with the act.

While a study of age verification would yield valuable insights, covered platforms would likely begin implementing these technologies immediately, and some already do so voluntarily. Nevertheless, implementation would come with significant downsides.

When creating an account, all users, not only children, would likely need to provide sensitive information to the platform or a third-party identity verifier. This information would likely include some form of identification, such as a credit card, effectively prohibiting individuals from accessing the platform anonymously. Courts have struck down previous attempts to require age verification because adults have a right to access lawful content anonymously. This bill would not go as far as those earlier efforts, but many of the same concerns would likely arise as platforms begin to implement these services.

Further, if legislators want to protect children online, forcing children and adults to give up more information before they can access the Internet creates risks that run counter to that goal. For example, if a platform requires a parent’s credit card information or a selfie of the child to create an account, parents may not want to bear the additional risk of creating their child’s account. For lesser-known platforms, parents may not want to divulge that information at all, making it more difficult for children to access those platforms, regardless of the potential benefits their content could provide.

Generalized Duty of Care and Access to Content

Rather than verifying the age of users, platforms could simply treat all users as minors for purposes of the duty of care. Yet whether a platform applies the duty to all users or only to minors, doing so would certainly result in less information being available to users online.

In principle, platforms adhering to a generalized duty of care to all users could realize some meaningful benefits. For example, when developing and implementing a content-recommendation model, platforms would prioritize the safety of their users because failing to do so could breach the duty. At the same time, such a duty would not force platforms to eliminate discussion of sensitive topics outright so long as the platform meets the legal duty of care, allowing some leeway in the design and implementation of moderation practices.

But again, these potential benefits come with significant tradeoffs. First, the flexibility this approach provides also creates uncertainty about enforcement, and risk-averse platforms would likely overcorrect, whether inadvertently or deliberately, to avoid liability. For example, if an individual is struggling with addiction, content from users who have faced the same challenges could provide comfort and guidance, ultimately helping the user cope. For a platform, however, content relating to addiction could be construed as promoting or even glorifying it, especially because platforms already struggle to moderate content effectively. Knowing that an FTC or state AG lawsuit could always be around the corner may push some platforms to remove any mention of these topics entirely, ultimately harming many of the individuals the bill seeks to protect.

Limiting enforcement to the FTC and state AGs does alleviate some concern because individual users would be unable to bring an influx of lawsuits, but government officials are already attempting to influence moderation decisions for political ends. Content that some states see as beneficial could be seen as harmful by others, meaning platforms might be unable to satisfy their duty of care in every state at once. This bill could make that challenge even more difficult.

Second, platforms have a First Amendment right to tailor content to individual users’ needs. Allowing platforms to voluntarily develop best practices through a duty-of-care model has been proposed and could lead to enforceable obligations; forcing companies to adopt such a model, however, may exceed the government’s authority. If platforms must extend this duty to all users to avoid potentially breaching it with respect to minors, the First Amendment concerns become more pronounced.

Finally, the duty of care could apply to entities beyond those that host user-generated content, meaning a wide swath of companies, including Internet service providers, video game publishers, and movie studios, could be tasked with this duty. This breadth is likely intentional on the part of the legislation’s drafters, but its consequences may still go beyond those intentions.

Conclusion

KOSA offers an interesting approach to preventing harms to children online, but its protections would come with a host of tradeoffs. Specifically, the legislation would require that all users, not just minors, provide more personal information in order to use the Internet; it would also likely make it more difficult for children to access information on issues such as addiction and mental health. As Congress contemplates this bill, it should carefully consider how the benefits and risks can best be balanced.

 

Disclaimer