Insight

KOSA Updates Seek To Address Critics’ Concerns

Executive Summary

  • A bipartisan group of senators recently unveiled new language for the Kids Online Safety Act (KOSA), a bill designed to require covered platforms to assume a duty of care to protect children from online harms.
  • The updated language attempts to address concerns with the original bill regarding its impact on free speech and privacy online by narrowing the scope of the duty of care requirement to cover only the design features of covered platforms and limiting enforcement by state attorneys general.
  • Despite the changes to the bill text, KOSA as currently drafted still presents significant concerns for free speech and privacy online, both of which could be compounded by state enforcement efforts.

Introduction

A bipartisan group of senators recently announced updates to the Kids Online Safety Act (KOSA), a bill that would establish a duty of care requiring online platforms to better protect minors using their services. The updated language attempts to address concerns with the original bill regarding its impact on free speech and privacy online. It would do this by narrowing the scope of the duty of care requirement to cover only the design features of covered platforms – such as auto-loading videos or endless scrolling – and by limiting enforcement by state attorneys general (AGs).

While in some respects the new changes improve the legislation, the current language still largely fails to address the concerns of critics. First, the changes to the duty of care provision may broaden rather than narrow liability because the provision would now require platforms to exercise reasonable care in implementing design features, and the definition of “design features” is so broad that it could encompass almost all aspects of the use of social media platforms. As a result, platforms will almost necessarily be forced to verify the age of their users, infringing on users’ privacy and free speech rights. Second, the bill would still allow state AGs to enforce other provisions of the law and would not prevent states from bringing state law claims with inconsistent definitions of harm.

How KOSA Works

KOSA’s primary provision would create a duty for covered platforms, ranging from social media to video games and streaming, to protect minors using these services from a variety of harms. While the legislation identifies six harms (listed below), the language is very broad and could be subject to many different interpretations:

  1. Specific mental health disorders including anxiety, depression, eating disorders, substance abuse disorders, and suicidal behaviors;
  2. Patterns of use that indicate or encourage addiction-like behaviors by minors;
  3. Physical violence, online bullying, and harassment of minors;
  4. Sexual exploitation and abuse of minors;
  5. Promotion and marketing of narcotic drugs, tobacco products, gambling or alcohol; and
  6. Predatory, unfair, or deceptive marketing practices, or other financial harms.

The bill would also impose myriad other requirements on covered platforms, such as a series of “safeguards for minors” settings designed to give minors and their parents more control over how content is delivered, disclosure requirements regarding the platforms’ practices, and transparency reports for larger platforms. To implement these provisions, KOSA would grant the Federal Trade Commission (FTC) authority to enforce violations of the act under its unfair or deceptive acts or practices authority, and would include a role for state enforcement as well.

Changes to the Legislation

Critics have raised a wide range of concerns regarding KOSA and how the bill would affect speech online for all users. The changes to the bill attempt to assuage some of these concerns.

First, the drafters attempted to limit the duty of care to the “design features” of a platform to address concerns about free speech and privacy, but they retained the original categories of harm. Because the duty of care does not contain a knowledge requirement, and platforms that are used or reasonably likely to be used by minors are covered, platforms could be liable for harms to children even if a platform believes the user to be an adult. Therefore, platforms have two options: 1) verify the age of all users, raising privacy and speech concerns; or 2) restrict the delivery of all content that could potentially harm children. By specifically targeting platforms’ design features, the drafters are attempting to assuage concerns that the bill would target specific types of content, focusing instead on how the platform delivers such content.

Second, some have raised concerns about how different states may attempt to use KOSA and similar state laws to effectively block legal content. Specifically, LGBTQ+ groups have come out against the legislation out of fear that more conservative state AGs may use KOSA’s broad duty of care and safeguard provisions to target platforms that allow children to view LGBTQ+-related content, regardless of whether the content is potentially harmful. The new language, therefore, removes state AGs’ enforcement of the duty of care provision with the intent of creating one standard regime at the FTC. In response to this change, many LGBTQ+ groups no longer oppose the legislation.

Do the Changes Address Key Concerns?

Despite the updated language, KOSA still largely fails to address the concerns regarding free speech and privacy.

While the original duty of care required platforms to “take reasonable measures in the design and operation of their products and services,” the new language requires platforms to exercise reasonable care, which could be interpreted more broadly than reasonable measures and is much more akin to traditional negligence standards. Perhaps more problematic, almost any aspect of a product or service could be understood as a design feature, which the statute defines as any component that encourages the frequency of use, time spent, or activity of minors on the platform. For example, if a platform recommends video content to users, any video recommended to a minor could breach the duty of care if the FTC finds it harmful.

With an even broader duty of care, platforms will almost certainly feel the need to age-gate their services. Currently, the most accurate methods of age verification include face scanning and other biometrics, as well as collecting personal information such as government ID. As a result, users regardless of age will need to provide more information to platforms to use the service, raising additional privacy concerns. Courts have found that age-verification requirements would also limit the ability of users to speak anonymously online, a critical component of free speech. Further, because KOSA fails to narrow the actual categories of harm, broad swaths of content that may be considered harmful to children could trigger liability, leading to over-removal of such content. In effect, the duty of care will necessarily limit privacy and speech online.

The bill’s language regarding state AGs does seem to address some of the critics’ concerns, however. The updated text includes state AG enforcement for many of the provisions of the bill, such as the safeguards for minors and the disclosure requirements, but not enforcement of the duty of care. Because the duty of care is so broad, allowing states to enforce it could lead to a wide range of outcomes. By removing states from enforcement of the duty of care and creating a sole enforcer at the federal level, the new draft should create more consistency in applying the provisions. At the same time, the bill could still support similar state AG investigations, and different administrations could implement the bill in widely different fashions. For example, with the broad categories of harm and duty of care, AGs may bring more traditional negligence claims under state law, citing these harms as injuries. Further, any finding of harm by one administration’s FTC could be used by AGs to bring action under separate state law, even if a future administration’s FTC disagrees. With the looming threat of FTC and state litigation claiming that platforms harm minors, platforms may simply choose to remove any content that could trigger scrutiny and potential litigation, disproportionately impacting some groups.

Conclusion

KOSA’s new language, drafted in an attempt to resolve concerns about free speech and privacy online, is not likely to address these concerns with any degree of precision. The legislation would still leave covered platforms with an overly broad duty of care that would have a chilling effect on free speech. It would also present significant privacy concerns for users of these services, including minors, who may be required to submit sensitive information for age verification. What’s more, as currently drafted, inconsistent enforcement by the FTC and state AGs could still lead to drastically different standards depending on the jurisdiction, making it extremely difficult for platforms to maintain a consistent definition of online harm.
