The Daily Dish

The Kids Online Safety Act

Yesterday, major social media platforms argued before the Supreme Court against Texas and Florida laws intended to address conservatives’ perceptions of online censorship. These laws, the platforms contend, violate the First Amendment. It is a difficult issue, about which the Court seems torn.

In contrast, there is no disagreement about the need to protect children from online dangers. Yet drafting federal legislation to do this, and only this, has proven to be a real challenge. A case in point is the Kids Online Safety Act (KOSA), which Jeff Westling discusses thoroughly in his latest insight.

The basic idea of the bill is to create a duty for covered platforms to protect minors from the following harms:

  1. Specific mental health disorders including anxiety, depression, eating disorders, substance abuse disorders, and suicidal behaviors;
  2. Patterns of use that indicate or encourage addiction-like behaviors by minors;
  3. Physical violence, online bullying, and harassment of minors;
  4. Sexual exploitation and abuse of minors;
  5. Promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol; and
  6. Predatory, unfair, or deceptive marketing practices, or other financial harms.

It seems simple enough. But there are several problems with this language. The first is that these duties are very broad and thus subject to a variety of interpretations. If a platform is going to protect against specific harms, it needs to know exactly what harms are on the list.

The second issue is that, as Westling puts it:

Because the duty of care does not contain a knowledge requirement, and platforms that are used or reasonably likely to be used by minors are covered, platforms could be liable for harms to children even if a platform believes the user to be an adult. Therefore, platforms have two options: 1) verify the age of all users, raising privacy and speech concerns; or 2) restrict the delivery of all content that could potentially harm children.

Another problem Westling points out is that:

[S]ome have raised concerns about how different states may attempt to use KOSA and similar state laws to effectively block legal content. Specifically, LGBTQ+ groups have come out against the legislation out of fear that more conservative state AGs may use KOSA’s broad duty of care and safeguard provisions to target platforms that allow children to view LGBTQ+-related content, regardless of whether the content is potentially harmful.

While changes to state attorney general enforcement have alleviated some of these concerns, a real risk remains that states could enforce similar laws, as well as other KOSA provisions, in ways that create inconsistent speech standards online. In short, KOSA quickly ends up with the same free speech and biased content moderation concerns as other legislation aimed at online platforms. Despite having gone through two iterations, the bill seems far from achieving its goals, all while putting free speech online at risk. As Westling concludes:

The legislation would still leave covered platforms with an overly broad duty of care that would have a chilling effect on free speech. It would also present significant privacy concerns for minors who use these services, and they may be required to submit sensitive information for age verification. What’s more, as currently drafted, inconsistent enforcement by the FTC and state AGs could still lead to drastically different standards depending on the jurisdiction, making it extremely difficult for platforms to maintain a consistent definition of online harm.

Fact of the Day

Across all rulemakings this past week, agencies published $592.9 million in total costs and added 901,701 annual paperwork burden hours.
