Insight

Biden FTC’s Regulation Through Consent Decree

Executive Summary

  • The Federal Trade Commission (FTC) accused Meta of violating a consent decree governing the company’s collection and use of consumer data and proposed specific modifications preventing the company from monetizing minors’ data in the future.
  • Many of the FTC’s conclusions rely on outdated and incomplete information and do not adequately reflect Meta’s adherence to the consent decree; even if Meta violated the original consent decree, the FTC’s proposed modifications are likely unrelated to the harms cited.
  • The FTC appears to be using minor incidents as justification to impose its preferred policy regime without having to go through the required process for broad rule changes; Congress, therefore, should take oversight actions to ensure the agency stays within the authority it has been granted.

Introduction

In May, the Federal Trade Commission (FTC) accused Meta of “repeatedly violat[ing] its privacy promises” in its 2020 consent decree with the agency. According to the FTC, Meta’s (then Facebook’s) privacy program contains several gaps and weaknesses, and “the breadth and significance of these deficiencies pose substantial risks to the public.” As a result, the FTC has proposed modifications to the consent decree that would, among other things, impose a blanket prohibition preventing Meta from monetizing minors’ data.

To support these conclusions, the FTC appears to be relying on largely outdated or incomplete information. Specifically, regarding Meta’s “deficiencies,” the FTC relies on a 2021 independent assessor report that was simply the first in a biennial process, with another independent report likely coming out this summer. While the report did note that the company required additional tweaks to its privacy policies, it also made clear that the key foundational elements necessary for an effective privacy program were in place. Moreover, the report claimed that Meta had anticipated the gaps and weaknesses in its privacy program and had plans to resolve them. Rather than waiting for an updated report to provide clarity on the extent to which Meta has addressed concerns from the initial report, the FTC proposed drastic modifications to its 2020 consent decree that are largely untethered to the actual harms being claimed.

Ultimately, many of the facts were redacted in the FTC’s public order, so it may be that the FTC has a compelling argument that is not yet public. But the FTC’s actions may also be indicative of a larger pattern developing at the agency. During the Biden Administration, the FTC has made a dramatic shift from its traditional, enforcement-focused role and has instead begun an attempt to regulate broad swaths of the economy. In this case, the agency is relying on largely outdated evidence to support a blanket ban on the monetization of minors’ data. This is a policy that FTC leadership wishes to pursue but may lack the legal authority to implement. The agency has also attempted to expand the bounds of its rulemaking authority and revoked bipartisan merger guidelines because of the agency’s departure from the consumer welfare standard enshrined in the original guidelines.

While much of the information on which the FTC relied in its order isn’t available to the public, Congress can and should carefully oversee the agency throughout this and similar proceedings. If the agency goes beyond its authority, Congress will need to specifically restrict these efforts or else grant the agency the authority to pursue them.

Evidence of Harms and Proposed Modifications

Evidence of Harm

In its order, the FTC cites two main potential violations of the 2020 consent decree: failed implementation of a robust privacy protection regime and misrepresentations made regarding products and services.

In the 2020 consent decree, the FTC and Meta agreed to allow an independent, third-party assessor to evaluate the privacy protections implemented by Meta. This assessor evaluated factors such as third-party risk management, incident management, data life cycle management, security, employee training, transparency, and compliance reporting. The assessor issues a report every two years, providing insight into how Meta is adhering to the obligations in the consent decree. As the order indicates, the 2021 assessor report does identify gaps and weaknesses in many of these areas. Much of the information relied on by the FTC is non-public, however, so it is difficult to evaluate just how extensive these concerns are. As the last report came out in the summer of 2021, a follow-up report should be released later this year and will go into more detail regarding the status of Meta’s implementation of its privacy program.

Further, as part of the consent decree, Meta promised that “in connection with any product or service, [Meta] shall not misrepresent in any manner, expressly or by implication, the extent to which [Meta] maintains the privacy or security of Covered Information, including, but not limited to…[t]he extent to which [Meta] makes or has made Covered Information accessible to third parties.” First, the FTC argues that from 2018–2020, Meta represented that “Expired Apps” could retain information obtained while the user was still active but would be unable to continue obtaining non-public information. The FTC claims that “in some instances” Meta continued to share users’ information with expired apps through June 2020. Second, the FTC alleges that Meta allowed users of “Messenger Kids” – a messaging and video calling app intended for children under 13 – to participate in group chats and calls with unapproved contacts, despite representing that users could only communicate with parent-approved contacts. This coding error was fixed in July 2019.

Proposed Modifications

The FTC proposes two main modifications to the 2020 consent decree. First, the agency would impose strict limitations on Meta’s ability to use information it collects from children and teens. Meta would only be allowed to collect and use such information to provide its services or for security purposes and could not monetize that information or use it for its own commercial gain, “whether for advertising, enriching its own data models and algorithms, or providing other benefits to [Meta].” In practice, this modification prohibits collecting data on youth users outside of very specific circumstances. Second, the FTC would prohibit Meta from releasing any new or modified product, service, or feature until it can demonstrate to the third-party assessor that its privacy program fully complies with the order and has no material gaps or weaknesses.

In addition to the two main modifications, the FTC’s proposed changes would:

  • Extend existing protections to Meta’s future uses of facial recognition templates.
  • Broaden notice and affirmative consent requirements when changing data practices.
  • Expand reporting requirements.
  • Safeguard information held by businesses Meta acquires.
  • Strengthen existing privacy program provisions relating to risk assessments.

Meta must file a response to the FTC within 30 days, either accepting the proposed modifications or challenging them, which would likely result in legal action.

The FTC’s Case and Proposed Modifications May Lack Evidence

Many of the facts upon which the FTC relied in its order are largely outdated, incomplete, or irrelevant to the proposed modifications.

First, and most critically, the FTC relies on the independent assessment report from the summer of 2021, a year after Meta agreed to the 2020 consent decree. While much of the information in the report – and the FTC’s order citing it – is redacted for public viewing, the report is just the first in a biennial assessment process in which the independent assessor tracks the implementation of Meta’s privacy program. As Meta itself anticipated in filings to the FTC at the time, the company didn’t expect to implement a perfect program immediately, and the assessor’s role is to keep the agency apprised of the steps Meta was taking and whether it was complying with the consent decree. Even in the unredacted portions of that report, the independent assessor makes clear that Meta had been making significant strides to lay the groundwork for a robust privacy program and that “the overall scope of the program and structure [redacted] into which the program is organized is logical and appropriately comprehensive.”

Even if the report demonstrates serious flaws in Meta’s privacy program, the timing raises questions as well. The report cited by the FTC was simply the first assessment of Meta’s initial response to the consent decree. Because the assessment occurs every two years, another assessment will likely be released in the summer of 2023, only months after the FTC proposed its modifications. Assuming Meta continues to improve its policies, it makes little sense to rely on a report from two years ago when another report with more up-to-date information will likely be released in the coming months. Rather, the FTC should wait for that report to see if the concerns persist.

Second, the misrepresentations the FTC cited had been largely, or completely, addressed by Meta prior to the 2020 consent decree. For example, the misrepresentations regarding expired apps cited by the FTC occurred from 2018 to 2020, with the underlying data sharing ending by June 2020. Likewise, the “Messenger Kids” misrepresentations were corrected in July 2019. While these misrepresentations are serious, it is unclear how they justify a modification to the consent decree when the issues were resolved years ago.

Finally, the FTC lacks a clear case for why these proposed modifications address the specific harms cited. Again, while much of the assessor report is redacted, the FTC doesn’t draw a clear connection between the use of youth data in particular and a violation of the consent decree warranting modification. While there could be some evidence in the redacted portions of the order, statements from a sitting Democratic commissioner cast further doubt on the agency’s authority to regulate Meta’s privacy standards in this manner.

Regulation Through Threat

Regardless of the legal outcome, the proposed modifications will likely have an impact on businesses operating online. The FTC has long resolved proceedings with consent decrees, allowing both the agency and businesses to avoid costly litigation and the risk of losing that litigation. With these proposed modifications, any business under a consent decree is now potentially at risk of similar changes, meaning it must carefully consider its practices – even those unrelated to minors’ data – as the FTC could seek comparable provisions. For those companies without a consent decree regarding user privacy, the FTC has made clear its priority to target the collection and use of minors’ data and could theoretically bring enforcement actions under its unfair or deceptive acts or practices authority, leaving companies at risk if they don’t change their behavior.

While this approach comes with some benefits, it also exceeds the authority Congress gave the FTC. One reason regulation by threat works is that the FTC is inherently an enforcement agency, not a regulatory one. If the agency wanted to pass rules banning the collection of minors’ data or the use of that data for monetary gain, it would have to go through the rigorous rulemaking procedures imposed on it by Congress. Indeed, the FTC has started a proceeding looking at privacy broadly, and minors’ data may be a part of that proceeding, but it is unclear whether the agency’s eventual rules – which will take years to finalize – will survive judicial scrutiny. Instead, by simply targeting one of the largest online companies, the FTC effectively puts all businesses on notice that it sees this type of behavior as problematic and may bring enforcement actions in the future, driving changes to business practices without ever issuing a rule.

Congress can and should take oversight actions to ensure the agency stays within its congressionally designated bounds. Bringing an enforcement action is one thing, but the Biden Administration’s FTC has repeatedly pushed the bounds of its authority, whether by throwing out longstanding competition guidelines, attempting to pass unfair-methods-of-competition rules, or now modifying a consent decree based on behavior largely resolved before that decree was last modified in 2020. Even for those who wish to see a stronger FTC, allowing this type of behavior could hamper the agency for years if courts reject its massive overreach.
