Google Shopping: beware of ‘self-favouring’ in a world of algorithmic nudging By Nicolo Zingales (Sussex Law School)1
On 27 June 2017, the European Commission closed its investigation in the Google Shopping case. It found a breach of article 102 TFEU in relation to Google’s “more favourable positioning and display of its own comparison shopping service compared to competing comparison shopping services” (hereinafter, “the conduct”).2 The Commission’s Decision is important for several reasons. First and foremost, it constitutes the first application of the leveraging theory in an algorithmic context, where as a result of certain algorithmic design choices3 a dominant undertaking systematically directs (“nudges”) consumers towards its own goods or services in a secondary market. Google apparently did not see it coming, as it argued both in the proceedings before the European Commission and in the appeal it lodged against the Decision4 that the Commission used a novel theory of abuse, and that, in accordance with its previous practice, it therefore should not have imposed a fine. However, the Commission rejected this argument, noting that it had already used a self-favouring theory to establish abuse in a number of cases.5 It therefore imposed a fine of almost 2.5 billion euros and ordered Google to take adequate measures to bring the conduct to an end, to refrain from repeating it, and to refrain from engaging in any act or conduct with the same or an equivalent object or effect.6
The divergence of views between Google and the Commission relates to the specificities of the application of leveraging theory (and in particular the so called ‘self-favouring’ abuse) in this particular context. In this short Comment, I acknowledge the peculiarities of algorithmic leveraging and sketch some of the implications of a broad definition of preferential treatment in a world where algorithmic mediation, and to some extent “nudging”, becomes pervasive.
The notion of preferential treatment in Google Shopping
In order to appreciate the Commission’s definition of preferential treatment, it is necessary to make a clarification about the technology under discussion: to provide users with the most relevant results, search engines undertake editorial functions in indexing, triggering, ranking and displaying content. Those choices are made primarily by designing algorithms, i.e. rules that govern the operation of Google’s crawling, triggering, ranking and displaying technologies. Because of these editorial functions, algorithms can have in-built biases which lead to systematically favouring certain content, although that may not necessarily be the result of a deliberate choice by the designer. Since the stage of algorithmic design is removed from the generation of results, it is often difficult for the designer to anticipate all the possible consequences. This holds even more true for deep learning algorithms, recently incorporated into Google Search,7 which are characterized by the ability to learn and improve from experience automatically, without being explicitly programmed. The problems of transparency, fairness and accountability of algorithmic systems are so complex and important that they have come to define an entire field of research, much of which is focused on deep learning.8 They are now an increasing source of headaches for courts and regulators.
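The notion of an “in-built bias” resulting from design choices can be illustrated with a toy example. The following Python sketch is purely illustrative: the signal names, weights and page data are invented for this Comment and bear no relation to Google’s actual ranking logic. It shows how a facially neutral scoring rule, fixed at design time, can systematically push down one class of results without any instruction that explicitly names a competitor.

```python
# Toy illustration of an in-built algorithmic bias (all signals, weights and
# page data are invented; this is NOT Google's actual ranking logic).

def score(page: dict) -> float:
    """A facially neutral 'relevance' score fixed at design time."""
    return (
        2.0 * page["freshness"]        # reward recently re-crawled pages
        + 1.0 * page["inbound_links"]  # reward popularity
        - 3.0 * page["is_aggregator"]  # demotion aimed at 'thin' aggregator pages
    )

pages = [
    {"name": "own-shopping",     "freshness": 0.9, "inbound_links": 0.5, "is_aggregator": 0},
    {"name": "rival-comparison", "freshness": 0.9, "inbound_links": 0.8, "is_aggregator": 1},
]

# The rule is applied identically to every page, yet the design-time choice
# of which pages count as 'aggregators' decides who gets demoted.
ranking = sorted(pages, key=score, reverse=True)
```

Even though `score` contains no reference to ownership, the designer’s choice of the `is_aggregator` flag and its weight determines the outcome: this is the sense in which a bias can be “in-built” rather than the product of a case-by-case decision.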
Given the challenges in predicting the nature and effects of algorithmic design decisions on the market, it is particularly significant that the Decision condemns conduct resulting from algorithmic design choices.9 Taken at face value, this could mean that a dominant company that has developed or used an algorithm is strictly liable for any possible anticompetitive effects derived therefrom. Consequently, it requires the adoption of wide-ranging measures of self-monitoring to ensure compliance by design, something that Commissioner Vestager has recently alluded to.10 However, the Commission provides no guiding principle on how far that compliance framework should go: neither in the substantive part of the decision nor in its remedial order, where it requires Google to ensure equal treatment concerning “all elements that have an impact on the visibility, triggering, ranking or graphical format of a search result in Google’s general search result pages”.11 While Google may be able to reach a workable compromise on the definition of the conduct it is required to adhere to under the remedy,12 we may ask what that high-level definition of equal treatment means for future developers of algorithms.
To compound those challenges, it is worth noting that the Decision does not specify a threshold of materiality for differential treatment by a dominant company to fall foul of Article 102. The Commission presents data showing that the conduct in question is sufficiently capable (a threshold that is notably lower than likelihood13) of driving competitors out of business, reducing incentives to innovate and consumer choice, and leading to higher prices.14 To supplement its findings, it puts forward some colourful evidence of intent by the concerned undertaking to favour its own services over those of competitors in order to leverage its position in general search into the market for shopping comparison services.15 Regrettably, however, the line between permitted and prohibited conduct remains blurred: nowhere in the Decision does the Commission detail what amounts to preferential treatment, other than stating that it involves the application of different standards for ranking and visualization to Google Shopping than to other comparison shopping services.
Most notably, the Decision raises the question whether a dominant undertaking remains free to set up its ranking and selection (“triggering”) criteria, so long as those are applied indistinctly both to its own products and services and to those of its competitors. The Commission seems to gloss over those details, affirming that “[it] does not object to Google applying certain relevance standards, but to the fact that Google’s own comparison shopping service is not subject to those same standards as competing comparison shopping services16”. This leaves us with the suspicion that a dominant undertaking such as Google could in fact be found liable for designing its algorithms in a way that leads to a disparate impact on a given class of competitors (or, in the case of the implementation of the remedy, its competing comparison shopping services), despite the indiscriminate application of those algorithms to all products and services.
However, a blanket prohibition of self-favouring formulated in these terms would be likely to impose a disproportionate burden on a range of undertakings if not accompanied by some limiting principle: much like a dominant company’s indiscriminate conditions of sale may amount to a refusal to supply in violation of Article 102 when the specific conditions established in Bronner are fulfilled,17 an algorithm with indiscriminate application but disparate impact on competitors should be held in violation of Article 102 only if it meets specific requirements serving as a proxy for consumer harm. To be clear, this is not a call for the application of the Bronner conditions, which Google unsurprisingly invokes, but rather a recognition that the Commission would be well advised to narrow the net it casts to catch anticompetitive conduct perpetrated through algorithmic nudging. Perhaps one could take the Commission’s emphasis on the “active” nature of the refusal to grant competitors access to a proportion of its general search result pages (in the sense of exempting Google Shopping from demotions and “hardcoding” its position in the ranking)18 to mean that the use of certain types of instructions or criteria would be considered “passive” refusal, and therefore escape the prohibition. However, this distinction necessitates further elaboration. In the absence of clarification on the constitutive elements of the self-favouring abuse, the extent of antitrust deference towards the design of “relevance standards” for ranking and selection algorithms remains nebulous, and therefore problematic for investment and innovation.19
Transparency is important, but not enough
Another piece of the puzzle in understanding the Commission’s stance towards algorithmic design is its concern for transparency as a means of protecting both market players and final consumers in their interactions with a dominant company. First, the Decision highlights the ample discretion that Google retains in its Webmaster Guidelines to remove or demote websites, where the company warns against certain identified practices but also reserves the right to “respond negatively to other practices not listed20”. Second, it recognizes that only a fraction of Google’s users (“the most knowledgeable users”) is likely to take the “Sponsored” label to mean that different positioning and display mechanisms are used for the corresponding search results.21 It is worth noting that the Decision does not provide empirical support for the latter position, and that this specific issue was at the core of the Dissenting Opinion to the recent Decision of the Indian Competition Commission finding that Google leveraged its dominant position in general web search to favour its own flight comparison service (Google Flights) over competing ‘travel verticals’.22 Nevertheless, these statements indicate that an important element of the Commission’s condemnation of the conduct lies in the opaqueness of Google’s prioritization and/or penalization practices, which affects the structure of competition in the market for shopping comparison services.
One may therefore expect the transparency and intelligibility of algorithmic practices to play a role in determining the scope of differential treatment that may be caught under Article 102. However, even admitting the relevance of those considerations, it remains to be seen to what extent they can serve as a defence to a self-favouring allegation. One could argue, for instance, that Google should not be allowed to escape scrutiny by making it crystal clear that its search services systematically prioritize content coming from domains starting with “Goo”, or pages displaying its official logo. Condoning such conduct would run counter to antitrust doctrine’s rejection of formalism, including the established principle that an abuse of dominant position is prohibited regardless of the means and procedure by which it is achieved.23 Following this argument, the fact that Google has consistently come out on top of the auctions run for its Shopping Unit slots as part of its remedial measures24 should at least raise some eyebrows about the adequacy of those measures, highlighting the importance of the link with a clear and consistent definition of the abuse in question.
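The “Goo” hypothetical in the preceding paragraph can be made concrete in a few lines of Python. The rule and the domain names below are invented for illustration: a perfectly transparent, formally neutral criterion that in practice selects the dominant firm’s own properties.

```python
# Sketch of the hypothetical transparent-but-self-serving rule discussed
# above: boost any domain starting with "goo". Domains are invented examples.

def boosted(domain: str) -> bool:
    """Formally neutral criterion: applies to ANY domain starting with 'goo'."""
    return domain.lower().startswith("goo")

results = ["kelkoo.example", "google.com/shopping", "idealo.example"]

# Boosted domains move to the front; Python's sort is stable, so ties keep
# their original order.
prioritised = sorted(results, key=lambda d: not boosted(d))
```

Publishing such a rule would make it fully transparent, but, as the text argues, transparency alone would not cure its self-favouring effect: the criterion remains a formalistic proxy for ownership.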
Towards a negligence-based safe harbor for impactful algorithms?
The main criticism advanced in this Comment is that the Commission must look harder into the criteria underlying algorithmic design. Attributing strict liability for any algorithmic conduct that is capable of affecting competition is extremely far-reaching. What is needed is thus a limiting principle, for example in the guise of a ‘safe harbour’, that provides legal certainty for undertakings offering ranking and selection algorithms. A safe harbour could rely on conditions similar to those set out by article 14 of the E-commerce Directive, which grants a content host immunity from liability under European law for the information stored provided that: “(a) it does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; and (b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information”.25 Those conditions could then be used to design, with appropriate institutional and procedural safeguards, a framework of ‘notice and re-adjustment’ for disproportionately and unjustifiably affected competitors. Additionally, the Commission could ensure that a dominant company is not negligent by prescribing a due diligence procedure for the design of algorithms that can effectively impact consumer choice through the selection or ranking of content. Such a procedure could, for instance, rely on established techniques to detect the existence of bias,26 maintain a record of that testing for inspection by a competition or judicial authority, and even define a threshold of adverse impact warranting a change of the existing rules or criteria. While algorithmic accountability is a regulatory challenge that is here to stay, the global antitrust community has a responsibility to clarify the scope of the nascent antitrust duty to police one’s own algorithm.
This should aim to ensure sufficient protection against unfair manipulation (a task shared with consumer and data protection authorities), but without undermining the incentives to invest and innovate in algorithmic technologies.
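The due diligence procedure sketched in this section could be operationalized as a simple audit routine. The following Python sketch is purely illustrative: the measurements are invented, and the 0.8 disparity threshold is borrowed, by loose analogy, from the ‘four-fifths rule’ used in US employment discrimination practice; it is an assumption of this example, not a legal standard.

```python
# Illustrative due-diligence audit for algorithm updates (all numbers and
# the 0.8 threshold are assumptions of this sketch, not a legal standard).

def adverse_impact(own_change: float, rival_change: float,
                   threshold: float = 0.8) -> bool:
    """Flag an update when rivals' visibility multiplier falls below
    `threshold` times the multiplier enjoyed by the firm's own service."""
    if own_change <= 0:
        return False  # own service not advantaged by this update
    return (rival_change / own_change) < threshold

audit_log = []

def record_update(name: str, own_change: float, rival_change: float) -> None:
    """Record each update's measured impact, for inspection by a competition
    or judicial authority."""
    audit_log.append({
        "update": name,
        "flagged": adverse_impact(own_change, rival_change),
    })

# Hypothetical measurements: visibility multipliers after each update.
record_update("demotion tweak", own_change=1.10, rival_change=0.60)
record_update("freshness tweak", own_change=1.05, rival_change=1.00)
```

A flagged entry would then trigger the ‘notice and re-adjustment’ step suggested above, or a reconsideration of the rule that caused the disparity.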
1 Lecturer in Competition and Information Law, Sussex Law School. Comments welcome at N.Zingales@sussex.ac.uk.
2 European Commission, Case AT.39740, Brussels, 27.6.2017, C(2017) 4444 final. Available at http://ec.europa.eu/competition/antitrust/cases/dec_docs/39740/39740_14996_3.pdf (hereinafter, “Decision”).
3 By “design choices”, I refer here to the rules and criteria embedded in the algorithm, including any subsequent changes or “updates” (as they are typically called in the context of Google search). Further, I am using a specific notion of algorithm, as a set of mathematical instructions to provide gatekeeping services.
4 See Case T-612/17, Action brought on 11 September 2017 – Google and Alphabet v Commission, OJ C 369, 30.10.2017, p. 37–38.
5 Decision, para. 649, referring to Case 311/84, Télémarketing, EU:C:1985:394; Case C-333/94 P, Tetra Pak II, EU:C:1996:436; Case T-228/97, Irish Sugar, EU:T:1999:246; Case T-201/04, Microsoft, EU:T:2007:289. It bears noting that the case-law suggests that self-favouring may be caught as a manifestation of various types of conduct prohibited by article 102: see in this regard Nicolas Petit, ‘Theories of Self-Preferencing Under Article 102 TFEU: A Reply to Bo Vesterdorf’ (April 29, 2015). Available at SSRN: https://ssrn.com/abstract=2592253 or http://dx.doi.org/10.2139/ssrn.2592253. See also the Decision by India’s Competition Commission in Cases Nos. 07 & 30 of 2012, Matrimony.com v Google LLC, Google India and Google Ireland, available at http://www.cci.gov.in/sites/default/files/07%20%26%20%2030%20of%202012.pdf (finding that Google’s leveraging amounted to an imposition of unfair conditions in the purchase or sale of goods or services, in contravention of Section 4 (2) (a) (i) of the Competition Act).
6 Decision, Art. 2-4.
7 Cade Metz, ‘AI Is Transforming Google Search. The Rest of the Web Is Next’, Wired (4 February 2016). Available at https://www.wired.com/2016/02/ai-is-changing-the-technology-behind-google-searches/.
8 See for instance the annual conferences on Fairness, Accountability and Transparency (FAT) and on Fairness, Accountability and Transparency in Machine Learning (FATML), at https://fatconference.org and https://www.fatml.org/.
9 In some instances, the preferential treatment ostensibly results from the criteria generating a given algorithmic result. A good example is the “signals” for triggering the appearance of the Product Universal, and/or its appearance in the middle to top position of the results on the first page: the number of stores and the number of shopping comparison engines in the top-3 generic search results. Decision, para. 391. In other parts of the Decision, however, the Commission merely takes issue with the exclusion of Google Shopping from the application of certain criteria that adversely affect the position of competing price comparison services (notably the […] and Panda algorithms). See Decision, para. 512.
10 Margrethe Vestager, ‘Algorithms and competition’, Speech at the Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017. Available at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/bundeskartellamt-18th-conference-competition-berlin-16-march-2017_en
11 The only limit it provides in that respect, presumably reflecting the feedback received in the ‘market-testing’ of the commitments offered to Commissioner Almunia in 2013 and 2014, is that any measure chosen by Google to comply with the order “should not lead to competing comparison shopping services being charged a fee or another form of consideration that has the same or an equivalent object or effect as the infringement established by this Decision”. Decision, para. 700.
12 As a measure implementing the remedy, since 28 September 2017 Google has shifted its shopping operations into a separate entity, with other companies now able to bid for places in the Shopping Units. Furthermore, each ad in the Shopping Unit indicates which comparison service is providing it. However, it has been reported that as many as 99% of those Shopping results are held by Google. See Searchmetrics, ‘Google Shopping: Is the Revamped Comparison Service Fairer to Competitors?’ (29 January 2018), at https://blog.searchmetrics.com/us/2018/01/29/google-shopping-revamped-fairer-to-competitors/. See also Sam Schechner and Natalia Drozdiak, ‘Google Rivals Ask EU to Toughen Measures in Antitrust Case’, Wall Street Journal (30 January 2018). Available at https://www.wsj.com/articles/google-rivals-ask-eu-to-toughen-measures-in-antitrust-case-1517334038
13 The Commission can find support for “capability” in a series of cases, many of which are listed in the Decision at para. 602; Case C-52/09, Konkurrensverket v TeliaSonera Sverige AB, EU:C:2011:83, para. 64; Case C-549/10 P, Tomra Systems and Others v Commission, EU:C:2012:221, para. 79; Case T-336/07, Telefónica SA v Commission, EU:T:2012:172, para. 272, upheld on appeal in Case C-295/12 P, EU:C:2014:2062, para. 124; Case C-23/14, Post Danmark, EU:C:2015:651, para. 66; see also Case T-286/09, Intel v Commission, ECLI:EU:T:2014:547, para. 85, on this specific point confirmed on appeal in Case C-413/14, ECLI:EU:C:2017:632, para. 149.
14 Decision, paras. 594-597.
15 In particular, the Commission found in internal documents that Google’s Engineering Director responsible for Froogle, the previous version of Google Shopping, stated that “Froogle stinks” and warned that “(1) [t]he [Froogle] pages may not get crawled without special treatment; without enough pagerank or other quality signals, the content may not get crawled. (2) If it gets crawled, the same reasons are likely to keep it from being indexed; (3) If it gets indexed, the same reasons are likely to keep it from showing up (high) in search results […] We’d probably have to provide a lot of special treatment to this content in order to have it be crawled, indexed, and rank well”. Decision, para. 491.
16 Id., para. 440 (emphasis added). By choosing to use the word ‘certain’, the Decision suggests that the use of certain other criteria may be problematic. This hypothesis appears to be confirmed by para. 537, according to which “the Commission does not object to Google applying specific criteria per se but to the fact that Google prominently positions and displays results only from its own comparison shopping service and not from competing comparison shopping services” (emphasis added).
17 Namely, that the facility that is the object of refusal is indispensable to compete on a downstream market, and that refusal is not objectively justified. See Oscar Bronner GmbH & Co. KG v. Mediaprint Zeitungs- und Zeitschriftenverlag GmbH & Co. KG, Case C-7/97, 1998 E.C.R. I-7791, 4 C.M.L.R. 112.
18 Decision, para. 650.
19 For a useful mapping of types of deference towards design choices, see Stacey Dogan, ‘The Role of Design Choice in Intellectual Property and Antitrust Law’, 15 Columbia Technology Law Journal 27 (2016).
20 Decision, para. 347.
21 Id., paras. 536 and 599.
22 Cf. Matrimony.com Decision, supra note 5, para. 248; and Dissenting Opinion, paras. 5-6.
23 Id., para. 338; Case 6/72, Europemballage and Continental Can v Commission, EU:C:1973:22, paras. 27 and 29; Case T-128/98, Aéroports de Paris v Commission, EU:T:2000:290, para. 170.
24 See supra, note 12.
25 See Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, OJ L 178, 17.7.2000, p. 1–16. Applying these conditions to the Commission’s reasoning, they could be used to give content to the notions of “active” and “passive” conduct mentioned at para. 650: see supra, note 18.
26 See Christian Sandvig et al. ‘Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms’, Data and discrimination: converting critical concerns into productive inquiry (2014), 1-23.