The Federal Trade Commission (FTC) released an advance notice of proposed rulemaking (ANPRM) on “Commercial Surveillance and Data Security” on August 11, 2022. The ANPRM, approved on a 3-2 party-line vote, is the initial step in a process that could result in the adoption of the first federal regulation addressing privacy, data security, and algorithmic discrimination across broad sectors of the U.S. economy. From the questions posed in the ANPRM, the FTC appears to be considering approaches that could have extraordinary ramifications for businesses if adopted, such as limits on personalized advertising, privacy protections for teens and children under 13 that go beyond those of the Children’s Online Privacy Protection Act (COPPA), and limits on techniques that promote prolonged online activity by children and teens.

Rulemaking Framework

The ANPRM is the first step on the long journey that is rulemaking under Section 18 of the FTC Act, 15 U.S.C. § 57a, often referred to as Magnuson-Moss or Mag-Moss rulemaking, which imposes rulemaking obligations on the FTC that exceed the notice-and-comment requirements of the Administrative Procedure Act (APA), 5 U.S.C. § 553. Section 18 authorizes the FTC to issue trade regulation rules that “define with specificity” acts or practices that are unfair or deceptive. The FTC can only issue rules regarding practices that are either “deceptive” or “unfair” within the meaning of Section 5 of the FTC Act and that are “prevalent” in the market. 15 U.S.C. § 57a(b)(3). “Prevalence” must be based on previously issued cease-and-desist orders (i.e., litigated administrative orders) or any other information available to the FTC that indicates a “widespread pattern” of unfair or deceptive acts or practices. 15 U.S.C. § 57a(b)(3)(A) & (B). The ANPRM must “[c]ontain a brief description of the area of inquiry under consideration, the objectives which the Commission seeks to achieve, and possible regulatory alternatives under consideration by the Commission.” 15 U.S.C. § 57a(b)(2)(A)(i).

Some Noteworthy Issues

When the FTC first indicated, in December 2021, that it intended to initiate this rulemaking, it stated that it was considering a rulemaking to “curb[] lax security practices, limit[] intrusive surveillance, and ensur[e] that algorithmic decision-making does not result in unlawful discrimination.” The ANPRM has taken an expansive approach to these three already-broad areas, inviting public comment on 95 wide-ranging questions, many with numerous subparts. The following topics are of particular note:

  • Substantive restrictions on data processing. Chair Lina Khan expressed concern that notice and consent protections “tend to create process requirements while sidestepping more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.” The ANPRM reflects these concerns, posing a number of questions about the efficacy of consent-based data processing.
  • Protections for children and teens. The Commission seems especially interested in addressing issues relating to teens and children. Although COPPA provides a legal framework for the privacy and security of children under 13 and the FTC is already reviewing its COPPA rules, the ANPRM poses numerous questions about protections for children under 13 that go beyond COPPA’s scope and asks whether COPPA’s mechanisms are adequate. For example, the ANPRM asks whether parental consent—an essential component of COPPA—is “an efficacious way of ensuring child online privacy,” and whether rules should limit personalized advertising to children and teenagers “irrespective of parental consent.” The ANPRM also asks a number of questions about how services that are not directed to children or teens should address child and teen privacy, including what protections they should be required to provide for these groups, and whether they should be required to take steps to determine the age of their users. In addition, treading into topics traditionally considered beyond the privacy or data security sphere, the ANPRM reflects concern about practices that affect children and teens’ mental health, as well as practices that “manipulate” children and teens into prolonging their online activity.
  • Restrictions on personalized advertising. The ANPRM asks a number of questions about personalized advertising, including the potential harms of such advertising, potential limits on personalized advertising to teens and children, whether opt-outs from personalized advertising should be available to all consumers, and whether companies in certain sectors (e.g., finance, healthcare, search, or social media) should be barred from owning or operating a business that engages in personalized advertising. Consistent with recent claims by privacy advocates and California regulators, the ANPRM seems to advance the view that contextual advertising is as effective as personalized advertising and should be used in its place.
  • Data security. The definition of “data security” in the ANPRM is broad and includes breach notification, data minimization, and retention. Yet, as Commissioner Noah Phillips notes in his dissent, the ANPRM gives data security “short shrift,” asking only six questions in the section of the document specific to data security. (The sections of the ANPRM that ask questions about harm and children and teens also sweep “lax data security measures” into the predicate of the questions, but do not contain questions geared toward determining which measures are “lax.”) That said, even in the six questions in the data security section, the possibility for significant impact on companies is apparent, indicating potential interest in (1) mandating by rule specific measures such as encryption, (2) codifying a prohibition on deceptive statements about security so that penalties can be obtained for first-time violations, (3) applying the data security requirements of COPPA and/or the Gramm-Leach-Bliley Act (GLBA) Safeguards Rule to much wider sectors of industry, and (4) requiring certification by businesses that their practices meet certain standards. Further, it is likely that more specificity and detail will become evident in any subsequent notice of proposed rulemaking (where the FTC must publish the text of any proposed rules).
  • Algorithms and algorithmic discrimination. In soliciting comment on algorithms and algorithmic discrimination, the FTC takes a broad view of its potential remit, asking several questions about “algorithmic error” and “automated decision-making,” whether it should focus on harms to protected classes or consider harms to other underserved groups (e.g., unhoused people or rural communities), whether it should analyze proxies for protected classes, and whether it should consider rules regarding algorithmic discrimination only in established areas like housing, employment, and consumer finance or go beyond them. The ANPRM also asks whether other federal laws (the First Amendment, Section 230 of the Communications Act, 47 U.S.C. § 230, and other civil rights laws) should affect the scope of any FTC rule in these areas.
  • Biometric information. The FTC has indicated that biometric information is a priority area and such data receives special attention in the ANPRM, which includes a question that suggests the FTC is considering developing a rule that imposes substantive limits on the use of facial recognition, fingerprinting, and other biometric technologies.

Next Steps

The ANPRM will be published in the Federal Register. Following publication, the public will have 60 days to submit comments. In addition, the FTC will host a virtual public forum on September 8, 2022, where members of the public can apply to speak for up to two minutes.

Following its review of the comments on the ANPRM, if it decides to propose rules, the FTC will publish a notice of proposed rulemaking containing the text of any proposed rules and an invitation for public comment. Subsequently, the FTC must hold a hearing in which it may permit cross-examination and/or rebuttal testimony on disputed issues of material fact.

Industry participation is critical at each step of this lengthy process to build a robust and balanced record about the benefits of industry practices and other key issues. In addition to helping to ensure that any rules issued through this process are justified, the proceedings are likely to influence the FTC’s enforcement and other policy activities and may inform Congress or other policymakers as well.

* * *

Perkins Coie’s Privacy & Data Security practice works with the world’s most innovative companies on data protection issues. We are at the forefront of cutting-edge technologies and evolving privacy norms, and we have represented clients in more than one hundred regulatory investigations and in rulemaking proceedings before the FTC, the Federal Communications Commission (FCC), and state attorneys general. Please reach out to any of our attorneys with questions.

© 2022 Perkins Coie LLP