In the absence of federal privacy legislation, the FTC is widely expected to commence a rulemaking to “curb lax security practices, limit privacy abuses, and ensure that algorithmic decision-making does not result in unlawful discrimination.” This paper addresses some of the reasons why FTC rulemaking is a poor substitute for federal legislation and an inefficient allocation of limited agency resources. While the FTC has considerable power to craft rules banning unfair or deceptive practices, Magnuson-Moss rulemaking is slow and resource-intensive, may not produce enforceable final rules, and does not necessarily preempt inconsistent state law. Moreover, the limits of the FTC’s unfairness authority do not always square well with privacy. Competition rulemaking, meanwhile, would be a terrible strategic blunder for the FTC and should be avoided. The FTC should instead focus its efforts on the most egregious practices that plainly fit within the statutory rubric of unfairness.
By Ben Rossen
When the Federal Trade Commission (“FTC”) announced in December that it is considering commencing a “commercial surveillance” rulemaking to “curb lax security practices, limit privacy abuses, and ensure that algorithmic decision-making does not result in unlawful discrimination,” privacy advocates appeared to have cause for celebration. Finally, after years of stalled negotiations on comprehensive privacy legislation in Congress, a newly aggressive FTC under Chair Lina Khan was going to blow the dust off the Commission’s musty old rulemaking powers and solve America’s privacy problem once and for all.
Unfortunately, the truth is a bit more nuanced. While the FTC has considerable power to promulgate rules proscribing unfair or deceptive acts or practices (“UDAP”) under Section 18 of the FTC Act (so-called “Magnuson-Moss” rulemaking after the Magnuson-Moss Warranty – Federal Trade Commission Improvement Act of 1975), there are also significant drawbacks to this authority that may make it a poor fit for privacy regulation. Magnuson-Moss rulemaking is far from costless: it is slow and burdensome, and complicated privacy rules will likely take years to complete. It will also require the FTC to devote significant resources to rulemaking, likely at the expense of enforcement. From a policy perspective, Magnuson-Moss will force the FTC to shoehorn every privacy issue into the FTC Act’s definition of unfairness, which can be difficult when informational injuries can be quite subjective. There are also real questions as to whether FTC rulemaking is the right solution at all for a problem as complex as data privacy, where most stakeholders generally agree that Congress is better suited than unelected Commissioners to resolve the difficult policy trade-offs necessary for effective regulation.
Chair Khan has also hinted that the FTC may engage in competition rulemaking under Section 6(g) of the FTC Act to regulate “the abuses stemming from surveillance-based business models” because “it is not only consumers that are threatened by [such business models] but also competition.” Unfair methods of competition (“UMC”) rulemaking under Section 6(g) could theoretically be achieved through notice-and-comment rulemaking under the Administrative Procedure Act, a much faster process than Magnuson-Moss. But there are serious questions as to whether the FTC has any authority to issue competition rules, guaranteeing a legal challenge that would likely end poorly for the agency.
Nonetheless, the FTC appears ready to invest heavily in rulemaking. In March 2021, then-Acting Chairwoman Rebecca Slaughter announced a new rulemaking group within the FTC’s Office of the General Counsel that would be tasked with streamlining the FTC’s “planning, development, and execution” of new rules intended to “deliver effective deterrence for the novel harms of the digital economy and persistent old scams alike.” One of Khan’s first actions as Chair was to approve changes to the Commission’s procedures to “streamline” Magnuson-Moss rulemaking proceedings while giving the Chair and a majority of Commissioners more direct control over the process. Since then, the FTC has issued advance notices of proposed rulemaking for two new UDAP rules: a rule prohibiting business and government impersonation fraud and a rule prohibiting unfair or deceptive earnings claims. Both notices received bipartisan support and were approved unanimously, in part because they address relatively uncontroversial deceptive practices.
Privacy, however, will be a different story and no easy road for the Commission. This article addresses some of the reasons why FTC rulemaking is ultimately a poor substitute for federal legislation and, likely, an inefficient allocation of limited agency resources.
II. How We Got Here: The Case for FTC Rulemaking
The FTC has served as America’s de facto privacy regulator since the passage of the Fair Credit Reporting Act in the 1970s. Under Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices, the FTC has pursued privacy and data security cases in myriad areas across the digital economy. But the FTC Act was never designed to be a privacy statute, and a UDAP framework, while broad and flexible, is not always a good fit for privacy and data security. Many of the FTC’s early cases in this area focused on deception, which requires the agency to show that a representation, omission, or practice is likely to mislead consumers acting reasonably under the circumstances, and that it is material – that is, it would likely affect a consumer’s conduct or decisions with regard to a product or service. The FTC regularly used this authority to challenge deceptive claims in privacy policies – which the agency deems to be “material” despite the fact that few consumers read them. While the FTC has brought some important deception cases, the upshot of these efforts was that companies learned to say very little about their privacy practices.
Unfairness, meanwhile, requires proof that an act or practice (1) causes or is likely to cause substantial injury, (2) is not reasonably avoidable by consumers themselves, and (3) is not outweighed by countervailing benefits to consumers or competition. The FTC has long recognized that unjustified, substantial consumer injury is the primary focus of the FTC Act, and not all injuries are legally “unfair.” Historically, substantial injury meant financial harm or serious threats to health and safety, and the FTC’s longstanding policy statement provides that “[e]motional impact, and other more subjective types of harm . . . will not ordinarily make a practice unfair.” Similarly, by statute, public policy considerations cannot serve as the primary basis for a finding of unfairness. These requirements pose challenges for aggressive privacy enforcement against practices like targeted advertising, where reasonable people can and do disagree about the extent of injury and there are significant countervailing benefits from free online services. Nonetheless, the FTC has wielded its unfairness authority to stop a variety of harmful practices, including failures to reasonably secure personal information, soliciting and publicly posting nonconsensual pornography along with victims’ personal information, selling sensitive data such as Social Security numbers to third parties that did not have a legitimate business need for the information, and collecting and sharing sensitive television-viewing information without notice or consent, among others.
The limitations of the FTC’s UDAP authority have grown more apparent with the rise of the tech giants and increasing calls for aggressive regulation. Recognizing these limitations, Commissioners from both sides of the aisle have repeatedly urged Congress to enact comprehensive privacy legislation that would establish baseline privacy protections for all Americans, give the FTC stronger teeth to enforce it through civil penalty authority for first-time offenses, and authorize the FTC to hire more attorneys and technologists to enforce the law.
Nonetheless, Congress has failed to act. Despite a growing patchwork of state privacy laws that prompted industry to come to the table in favor of federal legislation – and, specifically, preemption – the prospects of federal legislation remain dim. Increasingly, privacy advocates and members of Congress have called on the FTC to enact privacy rules instead. Somewhat surprisingly, even Republican Commissioner Christine Wilson – no fan of rulemaking – reluctantly voiced her support for privacy rulemaking last year (which she has since walked back) to solve the “market failure” caused by information asymmetries between consumers and the companies that collect, use, and share consumer personal information. Alvaro Bedoya, likely to be confirmed as a fifth commissioner soon, has previously indicated that he would support privacy rulemaking, and it thus appears likely that the FTC will move quickly to start the process once he arrives.
III. Is Magnuson-Moss Rulemaking Worth All the Effort?
The most likely source of authority for privacy rulemaking is Section 18 of the FTC Act, which authorizes the agency to enact “rules that define with specificity” unfair or deceptive acts or practices in or affecting commerce. This would be the most logical route because the FTC has always treated privacy as a consumer protection issue and Congress has unambiguously delegated this authority to the FTC through Magnuson-Moss. Section 18 rulemaking would give the FTC considerable – though not unlimited – flexibility to declare a variety of privacy or security concerns to be “an unfair act or practice” under the FTC Act.
However, Magnuson-Moss rulemaking is far from costless. First, it imposes significant burdens on limited agency resources. Despite recent attempts by the FTC to streamline rulemaking procedures under Magnuson-Moss, it remains a slow, byzantine process that requires the agency to navigate a maze of bureaucratic obstacles before a final rule can become effective. The statute is particularly burdensome when it comes to complex or controversial rules, which could include dozens of mandates – each of which the FTC would need to prove addresses an unfair or deceptive practice, as defined by statute and agency guidance, that is “prevalent” in the market. It requires the FTC to hold adjudicative hearings with cross-examination and rights of rebuttal, and respond to all significant comments, proposed regulatory alternatives, and requests for exemptions. While the agency can place some limits on the extent of due process afforded to interested parties, anyone can challenge the rule on appeal if the FTC’s limits on cross-examination or rebuttal precluded disclosure of disputed material facts. A complex set of privacy and security rules would likely take years to become final. Without bipartisan consensus, a new administration could simply cancel unfinished rulemaking, potentially wasting years of effort. In the meantime, how many cases would the FTC have been able to bring if it instead focused its resources on aggressive enforcement?
Second, although there are many ways that the FTC could try to formulate rules that restrict data collection and use, the FTC’s authority to promulgate UDAP rules is limited to practices that are unfair or deceptive under the FTC Act, which, as previously discussed, does not always track neatly with privacy. It may therefore be difficult for the FTC to promulgate sweeping rules prohibiting behavioral advertising without a foundation that such practices are already recognized as unfair. It would also be difficult for the FTC to prohibit data practices to which consumers have meaningfully consented, because an act or practice can be unfair under the FTC Act only if it is not reasonably avoidable by consumers themselves, such as through clear and conspicuous disclosures or meaningful consent. The FTC will likely identify a bevy of potential harms resulting from “commercial surveillance,” such as an increased risk of data breaches, misinformation campaigns, social media’s effects on children and teens, and discrimination caused by microtargeting of protected classes. But if the FTC targets these harms with overbroad rules that simply ban digital advertising, the rulemaking record will be full of evidence of the benefits consumers receive from free ad-supported online services and the procompetitive effects of digital advertising on small publishers and niche brands that were able to flourish due to inexpensive customer acquisition through targeted ads. The FTC would need to explain why other less burdensome regulatory alternatives are inappropriate (such as opt-in consent or a universal opt-out regime), particularly when the FTC has itself repeatedly recognized the significant benefits to consumers from the collection and use of data.
Challengers would undoubtedly use the FTC’s past statements and guidance on appeal to try to invalidate the agency’s rules or reopen the rulemaking record, all of which could result in further delay and cast doubt upon the validity of any final FTC rule.
Finally, Magnuson-Moss rulemaking could further exacerbate the problem of patchwork compliance with privacy regulations because it is by no means clear to what extent such regulations would preempt state law. California, Colorado, Virginia, and Utah have all recently enacted comprehensive privacy laws, and many other states are considering similar legislation. Federal privacy legislation that provides strong baseline privacy protections while establishing a national standard could streamline compliance costs for industry while providing significant benefits to consumers. By contrast, Magnuson-Moss does not contain any express preemption clause and implied preemption is by no means guaranteed. The few cases to have considered the preemptive effect of Magnuson-Moss regulations suggest that the FTC could preempt state laws that pose a direct conflict or are inconsistent with particularized purposes of a detailed regulatory scheme, but the law of “obstacle preemption” is far from settled and requires courts to divine legislative intent. Thus, rather than creating a national standard, FTC regulations could result in competing federal and state privacy regimes, further complicating the patchwork of compliance.
In a best-case scenario, a Magnuson-Moss rulemaking might push Congress to finally pass much-needed federal privacy legislation. Alternatively, targeted rulemaking addressing egregious business practices that unquestionably injure consumers might receive bipartisan support, and relatively narrow rules could probably be completed in a year or less. On the other hand, a partisan rulemaking process that tries to mimic comprehensive legislation or ban entire industries would almost certainly result in a years-long slog, tying up limited agency resources with potentially little to show for it. And if history is any guide, agency overreach will not be received well in Congress, especially if political winds change. The end result of such a process is unlikely to justify the significant costs of rulemaking.
IV. UMC Rulemaking Is Not an Appropriate Regulatory Solution
The FTC might also try to regulate privacy through competition rulemaking under Section 6(g) of the FTC Act, but this path is far riskier due to serious questions about the FTC’s authority to promulgate substantive competition rules. Proponents of UMC rulemaking see Section 6(g) as a faster alternative to Magnuson-Moss because it would be governed by simple notice-and-comment rulemaking under the Administrative Procedure Act.
Chair Khan, who has previously expressed her support for UMC rulemaking, has already begun to pave the way for it. For instance, in July 2021, the Commission rescinded, without replacing, its bipartisan Statement of Enforcement Principles Regarding “Unfair Methods of Competition” Under Section 5 of the FTC Act, opening the door to UMC enforcement that extends beyond the constraints of other antitrust laws. More recently, a December filing describing the agency’s annual regulatory priorities stated that the FTC “in the coming year will consider developing . . . unfair-methods-of-competition rulemakings,” specifically calling out “the abuses stemming from surveillance-based business models” as a particular concern of the Commission because of threats to both consumers and competition.
As I and others have written elsewhere, broad UMC rulemaking would be a terrible strategic error for the FTC. Substantive rulemaking under Section 6(g) stands on shaky legal footing, at best. UMC rulemaking proponents point to National Petroleum Refiners Association v. FTC, a 1973 D.C. Circuit decision that upheld the FTC’s authority to issue broad legislative rules under the FTC Act – and the only decision to have considered the FTC’s UMC rulemaking power. They argue that Congress effectively ratified National Petroleum Refiners when it enacted detailed UDAP rulemaking provisions in Magnuson-Moss without addressing UMC, and that the FTC’s determination that a practice is a UMC will receive Chevron deference.
The premise of this argument is fundamentally incorrect. While a detailed analysis of National Petroleum Refiners is beyond the scope of this paper, suffice it to say that it is highly unlikely that any modern court would similarly interpret the FTC Act. The D.C. Circuit’s permissive statutory analysis effectively concluded that an ambiguous grant of rulemaking authority should be construed to give agencies the broadest possible powers so that they will have flexibility in determining how to effectuate their statutory mandates. Not only has the Supreme Court never explicitly adopted this approach, but recent decisions under the major questions doctrine strongly suggest it would decline to do so if presented the opportunity. Some scholars have gone so far as to argue that no current Supreme Court justice would approach statutory interpretation the way the D.C. Circuit did in National Petroleum Refiners.
UMC rulemaking would be an especially poor fit for privacy given that only the FTC has authority to enforce Section 5 of the FTC Act but antitrust enforcement is divided between the FTC and the Department of Justice. This would lead to obvious problems if, for example, the FTC banned behavioral advertising as UMC: companies subject to FTC oversight would then face per se liability, while those overseen by DOJ would have the exact same practices evaluated under a rule of reason analysis. Consider, for example, the absurd results that would stem from how DOJ and FTC have divided enforcement among the biggest tech companies, with the FTC handling Meta and Amazon but DOJ overseeing Google and Apple.
For all these reasons, the FTC would be foolhardy to tackle privacy through UMC rules when Magnuson-Moss, despite its drawbacks, provides clear authority to promulgate UDAP rules, does not present issues of divided enforcement, and is far more consistent with the FTC’s longstanding approach to privacy under its consumer protection authority.
Privacy regulation, if successful, could prove to be the defining consumer protection achievement of Lina Khan’s tenure as Chair of the FTC. But this outcome is far from a certainty. Privacy rulemaking will be slow and inefficient, and at the end of the day, may not even produce enforceable final rules. While some have opined that the FTC must enact privacy rules soon because the worst possible outcome would be that neither Congress nor the FTC act to protect Americans’ privacy, there are arguably worse outcomes. Setting aside the possibility of Congressional blowback reminiscent of the FTC’s darkest days after KidVid, failed rulemaking that siphons the agency’s limited resources away from case-by-case enforcement could leave consumers less protected than ever. If, as expected, the FTC commences privacy rulemaking this year, the Commission should focus its efforts on the most egregious practices that plainly fit within the rubric of unfairness and would be wise to avoid the distraction of UMC rules.
 Mr. Rossen is Special Counsel in the Antitrust & Competition Law Practice Group of Baker Botts LLP and a former senior attorney in the FTC’s Division of Privacy and Identity Protection. The views presented in this paper are those of the author and do not necessarily reflect the views of the firm or any client.
 Federal Trade Comm’n, Statement of Regulatory Priorities at 2, https://www.reginfo.gov/public/jsp/eAgenda/StaticContent/202110/Statement_3084_FTC.pdf.
 Fed. Trade Comm’n, FTC Acting Chairwoman Slaughter Announces New Rulemaking Group (Mar. 25, 2021), https://www.ftc.gov/news-events/press-releases/2021/03/ftc-acting-chairwoman-slaughter-announces-new-rulemaking-group.
 See FTC Policy Statement on Deception (Oct. 14, 1983) (appended to Cliffdale Assocs., Inc., 103 F.T.C. 110, 183 (1984)), https://www.ftc.gov/public-statements/1983/10/ftc-poliystatement-deception.
 15 U.S.C. § 45(n).
 See FTC Policy Statement on Unfairness (Dec. 17, 1980) (appended to Int’l Harvester Co., 104 F.T.C. 949, 1070 (1984)).
 See, e.g., In the Matter of InfoTrax Systems, L.C., FTC File No. 162 3130, Docket No. C-4696 (2019); FTC v. Equifax, No. 1:19-cv-03927-TWT (N.D. Ga. 2019).
 FTC v. EMP Media, Inc. (d/b/a MyEx.com), No. 2:18-cv-00035 (D. Nev. 2018); In the Matter of Craig Brittain, FTC File No. 132 3120, Docket No. C-4564 (2015).
 FTC v. Sitesearch Corp. d/b/a LeapLab, No. 2:14-cv-02750 (D. Ariz. Feb. 18, 2016).
 FTC v. Vizio, Inc., No. 2:17-cv-00758 (D.N.J. 2017).
 See Letter from Sen. Richard Blumenthal, et al., to the Hon. Lina Khan, Chair, Federal Trade Commission (Sep. 21, 2021), https://www.blumenthal.senate.gov/imo/media/doc/2021.09.20%20-%20FTC%20-%20Privacy%20Rulemaking.pdf.
 S&P Global Market Intelligence, FTC nominee signals support for privacy rules, Big Tech regulations (Nov. 17, 2021), https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/ftc-nominee-signals-support-for-privacy-rules-big-tech-regulations-67645909.
 See Fed. Trade Comm’n, FTC Votes to Update Rulemaking Procedures, Sets Stage for Stronger Deterrence of Corporate Misconduct (July 1, 2021), https://www.ftc.gov/news-events/news/press-releases/2021/07/ftc-votes-update-rulemaking-procedures-sets-stage-stronger-deterrence-corporate-misconduct.
 See, e.g., Wait But Why? Rethinking Assumptions About Surveillance Advertising, IAPP Privacy Security Risk Closing Keynote 2021, Remarks of Commissioner Rebecca Slaughter (Oct. 22, 2021).
 See, e.g., Alden Abbot, Broad-Based FTC Data-Privacy and Security Rulemaking Would Flunk a Cost-Benefit Test, Int’l Ctr. for L. & Econ. (Oct. 13, 2021), https://laweconcenter.org/resource/broad-based-ftc-data-privacy-and-security-rulemaking-would-flunk-a-cost-benefit-test/.
 See Am. Fin. Servs. Ass’n v. FTC, 767 F.2d 957, 990-91 (D.C. Cir. 1985) (upholding conflict preemption of the Credit Practices Rule where the FTC made clear the rule was not intended to occupy the field of credit regulation, and drafted the rule to be as consistent with state law as possible); Katharine Gibbs Sch., Inc. v. FTC, 612 F.2d 658, 667 (2d Cir. 1979) (invalidating overbroad preemption of the Vocational School Rule that preempted “any provision of any state law, rule, or regulations which is inconsistent with or otherwise frustrates the purpose of the provisions of this trade regulation rule.”).
 See generally Federal Preemption: A Legal Primer, Cong. Research Serv. (July 23, 2019), at 28, https://sgp.fas.org/crs/misc/R45825.pdf.
 See, e.g., Rohit Chopra & Lina Khan, The Case for “Unfair Methods of Competition” Rulemaking, 87 U. Chi. L. Rev. 357 (2020).
 Federal Trade Comm’n, Statement of Regulatory Priorities at 2, https://www.reginfo.gov/public/jsp/eAgenda/StaticContent/202110/Statement_3084_FTC.pdf.
 See, e.g., Maureen K. Ohlhausen & Ben Rossen, Dead End Road: National Petroleum Refiners Association and FTC “Unfair Methods of Competition” Rulemaking, The FTC’s Rulemaking Authority, Concurrences (forthcoming 2022); see also Maureen K. Ohlhausen & James Rill, Pushing the Limits? A Primer on FTC Competition Rulemaking, U.S. Chamber of Com. (Aug. 12, 2021), https://www.uschamber.com/assets/archived/images/ftc_rulemaking_white_paper_aug12.pdf.
 482 F.2d 673 (D.C. Cir. 1973).
 See, e.g., Nat’l Fed’n of Indep. Bus. v. Dep’t of Lab., Occupational Safety & Health Admin., 142 S. Ct. 661, 665 (2022) (per curiam).