Insight
December 18, 2025
Navigating the Tensions in Federal Child Online Legislation
Executive Summary
- In the ongoing debate over child online safety legislation, the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) have recently gained momentum, advancing out of the House Subcommittee on Commerce, Manufacturing, and Trade, yet long-standing concerns continue to complicate the path forward.
- Constitutional issues, such as free speech concerns, appear to have been largely addressed, but key points of contention remain, including the proposed disparate treatment of companies based on size, weakened duty-of-care obligations, privacy provisions that may limit the deployment of effective age-assurance techniques, and concerns about federal preemption of state laws.
- To break the legislative logjam, Congress must create real incentives for platforms to redesign products with youth safety in mind, set uniform standards for when platforms are responsible, provide targeted guidance on age assurance that protects privacy, and harmonize new child-safety rules with privacy laws.
Introduction
In the ongoing debate over child online safety legislation, lawmakers continue to assess how to address emerging digital risks. Renewed attention is on the updated versions of the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0). Both bills have advanced out of the House Subcommittee on Commerce, Manufacturing, and Trade, moving them closer to a vote by the full House. Nevertheless, long-standing concerns continue to complicate the path forward.
While constitutional issues, such as free speech concerns, appear to have been largely addressed, key points of contention remain in four core areas: the application of different compliance expectations to smaller platforms versus large social media companies, weakened duty-of-care obligations that may reduce platform accountability, privacy provisions that may limit the deployment of effective age-assurance techniques, and preemption language that could erase state-level protections. These tensions highlight the ongoing challenge of balancing privacy protections, safety, and platform liability in children’s online legislation. Failure to reach consensus risks leaving statutory protections outdated once again, even as rising digital risks demand stronger protection for youth.
To deliver effective, durable protections for young users online, future legislation must create real incentives for platforms to redesign products with youth safety in mind, set uniform standards for when platforms are responsible, provide targeted guidance on age assurance that protects privacy, and harmonize new child-safety rules with privacy laws.
The Case for Youth Protection Online and a Stalled Path Forward
The internet offers enormous benefits to children, including education, recreation, relationship-building, and more. But the need to protect young users online is increasingly urgent as new technological advances bring both opportunities and risks: young users remain exposed to privacy violations and safety threats. To address this, lawmakers have pursued two major legislative paths. One is data privacy legislation, which regulates companies’ ability to collect and retain the personal information of minors. The second centers on online safety, requiring a user’s age to be confirmed or reliably estimated and limiting access to certain services. Bipartisan coalitions in both the Senate and House have formed around varied legislative proposals, notably COPPA 2.0 and KOSA, but constitutional concerns, particularly around free speech, emerged along the way. While those concerns appear to have been addressed, long-standing challenges continue to complicate the path forward.
Efforts to Date and Remaining Challenges
COPPA and COPPA 2.0
The Children’s Online Privacy Protection Act (COPPA) represented the first major attempt by Congress to ensure the privacy of young users online. Enacted in 1998, COPPA protects the personal information of children under 13, requiring parental notice and consent before platforms can collect, use, or share a child’s personal information. In the years since COPPA was implemented and as digital platforms have evolved, policymakers have pushed to update the law and expand its protections.
COPPA 2.0 was initially introduced in 2023 and passed the Senate by a broad bipartisan vote of 91–3 in 2024. Disagreement over subsequent revisions in the House stalled the effort and prevented the bill’s enactment. COPPA 2.0 was reintroduced in the current Congress, but lawmakers disagree on its proposed disparate treatment of affected companies based on size and on privacy requirements that could affect companies’ ability to deploy age-assurance techniques. Notably, the bill does not mandate any particular age-assurance technology.

The House version of COPPA 2.0 introduces a tiered standard for platform responsibility: smaller platforms have legal obligations when they “know” a user is a minor, while larger social media platforms would also have obligations if they “willfully disregard signals” indicating a user is underage. COPPA 2.0 also expands the definition of “personal information” to explicitly include biometric identifiers, such as fingerprints and facial templates, and imposes limits on their retention. While the Senate version limits these biometrics to characteristics used to identify an individual, the House version defines them broadly, covering any collection or processing of biological traits, regardless of whether the data is intended for, or capable of, identifying an individual. This distinction is critical because age assurance often relies on biometrics to confirm a user’s age; under the broad House approach, platforms collecting biometric data for any purpose, including age assurance or internal analytics, would be subject to strict COPPA 2.0 requirements.
Some Senators have expressed concern that the House version is weaker, potentially reducing real protections for kids in order to limit the burden on tech companies. The tech industry has also warned that broad language and ambiguity could lead to unexpected application of the requirements: smaller companies that are not required to actively verify user age may simply maintain the status quo, while larger companies would have strong incentives to reduce their legal risk by collecting more data than strictly necessary to avoid missing underage users. Critics add that the bill’s ambiguity leaves uncertainty about which age-assurance methods trigger the rules, further impairing platforms’ ability to deploy effective age-assurance techniques.
Kids Online Safety Act
In addition to updating federal law to safeguard the privacy of young users online, advocates have sought to protect them from harmful online content. KOSA’s central provision would require covered platforms to assume a duty of care to protect minors from online harm. Since its introduction in 2022, the bill has been repeatedly revised, and the current House and Senate versions differ mainly in how they define that duty. The Senate version imposes a strong duty on covered platforms, requiring them to take “reasonable care” in designing features to prevent a wide range of harms to minors, including mental health impacts, addictive use patterns, cyberbullying, and more. The House version, by contrast, drops the duty-of-care language and instead requires platforms to maintain “reasonable policies, practices, and procedures” to address a narrower set of harms. Critics argue this approach weakens accountability and reduces incentives for platforms to redesign their products around youth safety.
Finally, the push for federal preemption in the House versions of both KOSA and COPPA 2.0 is a central point of debate. While the Senate bills do not include preemption language, the House versions could override state protections, potentially undermining legislation that many states have already enacted. This matters because, in the absence of comprehensive federal legislation, states such as California, New York, Virginia, Maryland, Nebraska, and Vermont have established age-appropriate design codes and limited access to certain online products.
Conclusion
Even with broad agreement that kids need stronger online protection, long-standing concerns leave the future of KOSA and COPPA 2.0 uncertain. To make a difference, policymakers should create real incentives for platforms to redesign products with youth safety in mind, set uniform standards for when platforms are responsible, provide targeted guidance on age assurance that protects privacy, and harmonize new child-safety rules with privacy laws. If lawmakers strike the proper balance among safety, privacy, and accountability, platforms will have the clarity and incentives they need to protect young users while allowing them to make the most of their online experiences.