On 21 May, the European Commission dropped the first shoe in its antitrust investigation of Google. Joaquín Almunia, the Commission’s Vice President for competition policy, gave a public statement outlining four ‘concerns’ with Google’s practices that he said–like the very model of a cautious bureaucrat–‘may be considered’ anti-competitive. He invited Google to propose remedies for these concerns, an invitation that Google chairman Eric Schmidt promptly declined, insisting that Google has done nothing wrong. The parties are continuing their discussions, but the other shoe is poised for an early descent.
Since the Commission has now indicated the hand it intends to play when the suing starts, this is an opportune time to reflect on the cards it contains. Two of the four are small beer. Google demands exclusive advertising deals from some of its web site partners, and Google makes it more difficult than the Commission would like for advertisers to transfer their campaigns to competing ad platforms. Both are straightforward allegations of exclusionary conduct with obvious remedies; neither threatens Google’s strategic interests in search innovation. If these were all that stood between Google and the Commission, it is hard to imagine that they would not quickly reach a compromise.
The third issue is meatier, but still hardly earth-shaking. Google, the Commission alleges, ‘copies content from competing vertical search services and uses it in its own offerings’. As stated, this claim sounds in copyright, not in antitrust. Moreover, Google offers an unconditional opt-out to web sites that do not wish to be included in its search index. So this is hardly a case of wholesale misappropriation.
Instead, the real sticking point is that Google’s opt-out is all-or-nothing. As Yelp’s CEO explained at a Senate Judiciary Committee hearing last fall, Google gave Yelp the choice of having its reviews and ratings incorporated into Google’s local search, or of being excluded entirely from Google’s organic search index. Google thus dangles the carrot of the traffic it can deliver to entice sites into giving it the data it needs to build products that compete with them. Thus, the Commission’s concern here is best described as a form of tying, rather than outright theft from competitors. Again, it is hard to imagine that allowing more granular opt-outs would threaten core Google interests, although it is also clear why the company would prefer not to have to implement them.
The fourth concern is the real heart of the case. In Almunia’s words, ‘Google displays links to its own vertical search services differently than it does for links to competitors,’ thus giving its own services ‘preferential treatment’. Similar allegations against Google have a long pedigree, so it is worth understanding the theory underlying them. According to critics, Google sometimes ‘manipulates’ the order in which it presents search results, in order to promote its own services or to demote competitors.
The argument has intuitive appeal in light of Google’s many representations that its rankings are calculated ‘automatically’ and ‘objectively,’ rather than reflecting ‘the beliefs and preferences of those who work at Google’. But turning manipulation into a well-specified theory of anticompetitive conduct is surprisingly difficult. The problem is that one cannot define ‘manipulation’ without some principled conception of the baseline from which it is a deviation. To punish Google for being non-neutral, one must first define ‘neutral’.
In the first place, search engines exist to make distinctions among web sites, so equality of outcome is the wrong goal. Nor is it possible to say, except in extremely rare cases (such as, perhaps, ‘4263 feet in meters’) what the objectively best search results are. The entire basis of search is that different users have different goals, and the entire basis of competition in search is that different search engines have different ways of identifying relevant content. Courts and regulators who attempt to specify search results would merely be substituting their own judgments of quality for the search engine’s.
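Indeed, that example is the exception that proves the rule: it reduces to a single arithmetic fact, because the international foot is defined as exactly 0.3048 metres:

\[ 4263\ \text{ft} \times 0.3048\ \tfrac{\text{m}}{\text{ft}} = 1299.3624\ \text{m} \]

Almost no real query bottoms out in arithmetic this way; ‘the best pizza in Brussels’ has no such answer.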
Anti-manipulation, then, must be a process value: even-handed treatment of all web sites, whether they be the search engine’s friends or foes. Call this idea ‘impartiality’. A strong version of impartiality would be akin to Rawls’s veil of ignorance: algorithmic changes must be made without knowledge of which web sites they will help and hurt. This is probably a bad idea. Consider the DecorMyEyes scam: an unethical glasses merchant deliberately sought out scathing reviews from furious former customers, because the attention boosted his search rank. Google responded with an algorithmic tweak specifically targeted at web sites like his. Strong impartiality would break the feedback loops that let search engines find and fix their mistakes.
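Google has never published the details of that tweak, so the following is a toy sketch only, with every field and score invented here. It is meant simply to show why the fix could not have been made from behind a veil of ignorance: the change works precisely because it was designed knowing which kind of site it would hurt.

```python
# Toy illustration only -- Google's real ranking signals are secret.
# Before the fix, any inbound link raises a site's score, so scathing
# reviews perversely promote a scam merchant. The hypothetical
# 'sentiment' field models a post-DecorMyEyes adjustment: links from
# pages that excoriate the merchant stop counting in its favour.

def naive_score(site):
    # Popularity-only signal: every link helps, even angry ones.
    return len(site["inbound_links"])

def tweaked_score(site):
    # Sentiment-aware signal: hostile links no longer boost rank.
    return sum(1 for link in site["inbound_links"]
               if link["sentiment"] >= 0)

scammer = {"inbound_links": [{"sentiment": -1}] * 500}  # 500 furious reviews
honest = {"inbound_links": [{"sentiment": +1}] * 200}   # 200 happy customers

assert naive_score(scammer) > naive_score(honest)       # the scam pays
assert tweaked_score(scammer) < tweaked_score(honest)   # the fix works
```

Under strong impartiality, the second scoring rule would be forbidden, because it was adopted with full knowledge of whom it would demote.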
Thus, the anti-manipulation case hinges on a weaker form of impartiality, one that prohibits only those algorithmic changes that favour Google at the expense of its competitors. Here, however, it confronts one of the most difficult problems of hi-tech antitrust: weighing pro-competitive justifications and anti-competitive harms in the design of complicated and rapidly changing products. Many self-serving innovations in search also have obvious user benefits.
One example is Google’s treatment of product-search sites like Foundem and Ciao. Google has admitted that it applies algorithmic penalties to price-comparison sites. This may sound like naked retaliation against competitors, but the sad truth is that most of these ‘competitors’ are threats only to Google’s users, not to Google itself. There are some high-quality product-search sites, but also hundreds of me-too sites with interchangeable functionality and questionable graphic design. When users search for a product by its name, these me-too sites are trying to reintermediate a transaction that has very little need of them. Ranking penalties directed at this category share some of the pro-consumer justification of Google’s recent moves against webspam.
A slightly different practice–and the one identified specifically in Almunia’s statement–is Google’s increasing use of what it calls Universal Search, in which it offers news, image, video, local, and other specialised search results on the main results page, intermingled with the classic ‘ten blue links.’ Since Google has competition in all of these specialised areas, Universal Search favours Google’s own services over those of competitors. Universal Search is an obvious departure from neutrality, whatever your baseline – but is it bad for consumers? The inclusion of maps and local results is an overwhelming positive: it saves users a click and helps them get the address they’re looking for more directly. Other integrations, such as Google’s attempts to promote its Google+ social network by integrating social results, are more ambiguous. Some integration rather than none is almost certainly the best overall design, and any attempt to draw a line defining which integration is permissible will raise sharp questions about regulatory competence.
Some observers have suggested not that Google be prohibited from offering Universal Search, but that it be required to modularise its components, so that users could choose which source of news results, map results, and so on would be included. This idea is structurally elegant, but in-house integration also has important pragmatic benefits. Google and Bing don’t just decide which map results to show; they also decide when to show map results, and what the likely quality of any given map result is compared with other possible results. These comparative quality assessments don’t work with third-party plugin services.
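Neither company discloses its blending logic, so what follows is a minimal sketch under invented assumptions (every name, score, and threshold below is hypothetical). It illustrates the design point: deciding when map results belong on the page at all requires scoring map candidates on the same scale as ordinary web candidates, a comparison an opaque third-party plugin could not support.

```python
# Hypothetical sketch of blended Universal Search. The real blending
# logic at Google and Bing is unpublished; every name, score, and
# threshold here is invented for illustration.

from typing import Callable

# Each in-house vertical scores its candidates on the SAME scale as
# ordinary web results, so the engine can judge not just which map
# result to show but whether maps belong on this results page at all.
Vertical = Callable[[str], list[tuple[str, float]]]  # query -> [(result, score)]

def blend(query: str, web: Vertical, verticals: dict[str, Vertical],
          threshold: float = 0.5) -> list[str]:
    candidates = list(web(query))
    for name, vertical in verticals.items():
        for result, score in vertical(query):
            if score >= threshold:  # meaningful only because scores are commensurable
                candidates.append((f"[{name}] {result}", score))
    candidates.sort(key=lambda pair: pair[1], reverse=True)
    return [result for result, _ in candidates]

# Toy usage: a strong map hit outranks middling web pages for this query.
def web(q): return [("restaurant-review-blog.example", 0.62),
                    ("thin-affiliate.example", 0.31)]
def maps(q): return [("Trattoria Roma, 12 High St", 0.88)]

print(blend("trattoria roma address", web, {"maps": maps}))
# -> ['[maps] Trattoria Roma, 12 High St', 'restaurant-review-blog.example', ...]
```

Swap the in-house maps vertical for a user-chosen plugin that returns results without commensurable scores, and the threshold test (the decision whether to blend at all) has nothing to operate on.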
It makes sense for general-purpose search engines to turn their expertise as well to specialised search. Once they do, it makes sense for them to return their own specialised results alongside their general-purpose results. And once they do that, it also makes sense for them to invite users to click through to their specialised sub-sites to explore the specialised results in more depth. All of these moves are so immediately beneficial to users that the Commission and other regulators concerned about Universal Search should tread with great caution.
James Grimmelmann is a Professor of Law at New York Law School. This is an edited and expanded version of a blog post on the Antitrust and Competition Policy Blog. For more on these issues, see my papers Some Skepticism About Search Neutrality, The Google Dilemma, and The Structure of Search Engine Law.