Increased Enforcement Shows Firms Need to Plug Antitrust and Privacy Law Requirements Into Their Digital Algorithms

Posted June 9, 2022

The recent announcement by the Federal Trade Commission and the U.S. Department of Justice of a settlement with Twitter that would impose $150 million in penalties and robust compliance measures is only the latest warning to firms using algorithmic tools that such abuses face increased enforcement under both antitrust and privacy laws.

Just under three years after the FTC’s record-breaking $5 billion penalty against Meta (then Facebook) for data privacy violations, the FTC and DOJ have joined forces to remind businesses that “violating FTC orders will result in substantial penalties.”

The settlement announced by the government includes a sizable $150 million penalty—equal to 3% of Twitter’s revenues in 2021—and comprehensive compliance measures.  Twitter is accused of violating the terms of a 2011 consent agreement by collecting personal data and misusing it to sell targeted advertising services.  While this settlement serves as a warning that enforcement of privacy violations under the consumer protection laws is still a priority, businesses that utilize algorithmic tools should also prepare for increased enforcement at the intersection of privacy and antitrust, given President Biden’s recent Executive Order, and the appointment of privacy expert Alvaro Bedoya to the FTC.

Twitter was warned it might face enforcement action for a repeat offense back in 2011, when it resolved the prior FTC complaint with a settlement requiring it to honor the privacy choices exercised by its users.  The new government complaint alleges Twitter broke those promises when it collected consumers’ personal data, ostensibly to provide improved security services, but also used it to sell targeted digital advertising services.  Twitter allegedly misused users’ data for targeted advertising by combining data collected from Twitter users for security purposes with data from brokers like Acxiom and Datalogix.  This combined data fed predictive algorithms that provided advertisers with lists of “look-alike” targets who resembled existing customers in various ways, including their interests, purchase history, and income level.
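
To make the “look-alike” technique concrete, the following is a minimal, hypothetical sketch in Python.  It is not Twitter’s actual system, and every name and number in it is invented for illustration.  It represents each user as a small feature vector (interests, purchase frequency, income bracket) and ranks prospective targets by their average cosine similarity to an advertiser’s existing customers.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical combined profiles: [sports_interest, tech_interest, purchases_per_year, income_bracket].
# In the FTC's account, such profiles mixed data collected for security purposes with broker data.
existing_customers = {
    "cust_1": [0.9, 0.2, 12, 3],
    "cust_2": [0.8, 0.1, 10, 4],
}
prospects = {
    "user_a": [0.85, 0.15, 11, 3],   # closely resembles the existing customers
    "user_b": [0.05, 0.90, 2, 1],    # does not
}

def look_alike_audience(prospects, customers, top_n=1):
    """Score each prospect by average similarity to the customer base and return the best matches."""
    scores = {
        name: sum(cosine(vec, c) for c in customers.values()) / len(customers)
        for name, vec in prospects.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(look_alike_audience(prospects, existing_customers))  # ['user_a']
```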

The FTC’s Consumer Protection Bureau has been investigating data collection and usage for over a decade, entering into consent decrees with data brokers and data-driven digital advertisers.  This regulatory work seeks to address concerns about harm to consumers, including discrimination, inferior services, higher prices, and unauthorized disclosure of sensitive information.  Recently, state attorneys general have started probing data collection and targeted advertising by social media giants, with a focus on protecting younger consumers from privacy violations.  In another recent FTC case against WW (formerly Weight Watchers), the company was required not only to delete the health data that the FTC accused it of collecting without informed consent from children as young as eight, but also to destroy any algorithms derived from the data, and to pay a $1.5 million fine.  As noted in the FTC’s Business Blog: “It’s FTC 101.  Companies can’t tell consumers they will use their personal information for one purpose and then use it for another.”

In the European Union, data collection and algorithms have frequently undergone scrutiny in competition law cases.  For example, in 2020, under similar facts, a German court ruled that Meta (then Facebook) had violated competition laws and abused its dominance in social media by illegally harvesting data about its users for its digital advertising services.  In December 2021, real estate brokers were fined by the Spanish competition authority for violating Spanish competition law and the Treaty on the Functioning of the EU by engaging in a conspiracy to fix commissions through an algorithm.  This February, the UK’s Competition and Markets Authority reached a landmark agreement allowing Google, subject to numerous conditions, to proceed with its “Privacy Sandbox,” which would use algorithms to replace the third-party cookies that track users.  This blog’s recent survey listed several other abuse of dominance claims in Europe and the United Kingdom related to data collection for digital advertising.

While data privacy and security practices have mostly been challenged in the United States by enforcers under consumer protection laws, there may be a growing trend to pursue such practices under competition laws.  With the recent confirmation of Alvaro Bedoya, an expert in privacy law and the tie-breaking vote for Democrats on the Commission, the FTC is poised to act on President Biden’s Executive Order on Promoting Competition in the American Economy, directing regulators to address “unfair data collection and surveillance practices that may damage competition, consumer autonomy, and consumer privacy.”

Even without new rulemaking, there is precedent for antitrust scrutiny of the collection and transmission of data for use with digital algorithms.

In some cases, regulators look closely at data collection with concerns about abuse of market dominance.  For example, in a settlement with the FTC several years ago related to the acquisition of DataQuick, data broker CoreLogic agreed to divest itself of “bulk data,” in response to an FTC complaint that the acquisition would significantly increase concentration in the markets for national assessor and recorder data.  Zillow Group faced similar antitrust scrutiny last year for its acquisition of ShowingTime.

Other cases focus on an algorithm’s function and effects.  The DOJ Antitrust Division and the FBI teamed up a few years ago to indict a group of sellers who were charged under the Sherman Act with conspiring to fix the prices of posters sold online on Amazon Marketplace with the help of pricing algorithms.  Conversely, in a more recent private action by Epic Games accusing Apple of anticompetitive behavior related to its App Store, Apple raised a winning defense, arguing that combining its algorithm with manual human review creates a “walled garden” of privacy and security that carries weight as a procompetitive justification.
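
The poster case shows how ordinary repricing software can become the instrument of an unlawful agreement.  The sketch below is purely hypothetical, since the charging documents do not disclose the defendants’ actual code: a unilateral repricer simply undercuts the cheapest rival, while the collusive variant never lets the price fall below a floor the conspirators have agreed on.  It is that agreement, implemented through the algorithm, that the Sherman Act condemns.

```python
# Hypothetical repricing rules, for illustration only; not the defendants' actual software.

def competitive_price(rival_prices, my_cost, undercut=0.01):
    """Unilateral rule: undercut the cheapest rival, but never sell below cost."""
    return max(min(rival_prices) - undercut, my_cost)

def collusive_price(rival_prices, my_cost, agreed_floor, undercut=0.01):
    """Same rule, except the price never falls below a floor the sellers agreed on.
    The agreement embodied in `agreed_floor`, not the code itself, is what makes
    this a price-fixing mechanism."""
    return max(competitive_price(rival_prices, my_cost, undercut), agreed_floor)

rivals = [14.99, 15.49, 16.00]
print(round(competitive_price(rivals, my_cost=8.00), 2))                    # 14.98
print(round(collusive_price(rivals, my_cost=8.00, agreed_floor=15.99), 2))  # 15.99
```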

In another recent action filed by private plaintiffs in the United States District Court for the District of New Jersey, a company was accused of violating the Sherman Act by monopolizing co-location services, or low-latency access to servers located near major trading venues.  Algorithmic high-frequency trading involves trading large volumes of stock in very short windows of time, based on pre-defined criteria, to take advantage of small changes in price.  In this context, shaving a few milliseconds off the routing and execution of trades gives a firm a competitive advantage, and physical proximity to the exchange’s servers becomes paramount to trading with algorithms.  Plaintiffs alleged that they had been physically locked out of the co-location facilities necessary to compete on NASDAQ and several other exchanges for stocks and options.  This case was quickly and quietly settled.
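
The value of those few milliseconds is easy to illustrate.  The toy simulation below uses hypothetical latency figures, not numbers from the complaint: when two firms react to the same price signal, the exchange fills whichever order arrives first, so the co-located firm with the lower round-trip latency wins essentially every race.

```python
# Toy latency race, for illustration only; latency figures are hypothetical.
# Each firm sees the same price signal and sends an order; the exchange
# fills whichever order arrives first.

CO_LOCATED_LATENCY_MS = 0.1   # servers inside or next to the exchange's data center
REMOTE_LATENCY_MS = 5.0       # servers a few miles away

def winner(signal_time_ms, latencies):
    """Return the firm whose order reaches the exchange first."""
    arrivals = {firm: signal_time_ms + lat for firm, lat in latencies.items()}
    return min(arrivals, key=arrivals.get)

latencies = {"co_located_firm": CO_LOCATED_LATENCY_MS, "remote_firm": REMOTE_LATENCY_MS}

# Over many independent price moves, the lower-latency firm wins every race.
races = [winner(t, latencies) for t in range(1_000)]
print(races.count("co_located_firm") / len(races))  # 1.0
```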

Notably, President Biden’s Executive Order calls for cooperation between agencies with overlapping jurisdiction.  Given this increased interest in an interdisciplinary approach, the telecommunications industry also should be on notice that the Federal Communications Commission has the tools to address concerns about competition.  In a 2017 license-transfer proceeding before the FCC, both proponents and opponents of the transaction pointed to the parties’ use of data.  The dominant telecom provider advocating for the transaction argued that algorithms that mine caller data and surveil users to identify potentially illegal calling patterns were a public benefit.  However, opponents cited those very algorithms and data practices as a reason to reject the transaction.  In a 3-2 vote, the FCC approved the transfer of licenses, despite a dissenting opinion noting ex parte objections that the purchaser was misusing personal and confidential consumer data and harming competition.  The majority reasoned that these objections were best addressed through post-merger enforcement rather than in the review of the transaction itself.

A few months ago, the same company abandoned another acquisition after the DOJ and FCC expressed concerns.  The FCC received a comment outlining years of privacy and security violations related to the same algorithms and data collection practices in dispute during the 2017 transaction.  The DOJ noted that the proposed merger of two of the four leading companies in the market for inmate telecommunications services would have eliminated competition that benefits telecom users by improving services and terms.  With the FCC currently split 2-2 along party lines, confirmation of President Biden’s pick for the FCC, Gigi Sohn, could lead to further competition enforcement through the FCC’s public interest standard.

Litigation and enforcement efforts that focus on algorithmic data are well-established, but also quickly evolving.  State, federal and foreign enforcers, and private parties will continue to look closely at how private and confidential data is collected and protected, and at the effects of digital algorithms on markets and consumers, with an increasing interdisciplinary and inter-agency approach.

While the FTC’s settlement with Twitter serves as another warning to businesses that rely on sensitive personal data to ensure compliance with established privacy requirements under consumer protection law, there is increasing room in the U.S. for practices involving data-driven digital algorithms to be challenged under antitrust law, with the prospect of treble damages.

Written by Paulette Rodríguez López

Edited by Gary J. Malone