Society of the Query. The Googlization of our Lives


Geert Lovink

A specter haunts the world’s intellectual elites: information overload.1 Ordinary people have hijacked strategic resources and are clogging up once carefully policed media channels. Before the Internet, the mandarin classes rested on the idea that they could separate “idle talk” from “knowledge”. With the rise of Internet search engines it is no longer possible to distinguish between patrician insights and plebeian gossip. The distinction between high and low, and their commingling on occasions of carnival, are from bygone times and should not concern us. Nowadays an altogether new phenomenon is causing alarm: search engines rank according to popularity, not Truth. Search is the way we now live. With the dramatic increase in accessible information, we have become hooked on retrieval tools. We look for telephone numbers, addresses, opening times, a person’s name, flight details, best deals, and in a frantic mood declare the ever-growing pile of grey matter “data trash”. Soon we will search and only get lost. Old hierarchies of communication have not only imploded, communication itself has assumed the status of cerebral assault. Not only has popular noise risen to unbearable levels, we can no longer stand yet another request from colleagues of importance. Even the benign greeting from friends and family has acquired the status of a chore with the expectation of reply. What most concerns the educated class is that chatter has entered the hitherto protected domain of science and philosophy, when instead they should be worrying about who is going to control the increasingly centralized computing grid.

What today’s administrators of noble simplicity and quiet grandeur can’t express, we should say for them: there is a growing discontent with Google and the way the Internet organizes information retrieval. The scientific establishment has lost control over one of its key research projects – the design and ownership of computer networks, now used by billions of people. How did so many people end up being that dependent on a single search engine? Why are we repeating the Microsoft saga once again? It seems boring to complain about a monopoly in the making when average Internet users have such a multitude of tools at their disposal to distribute power. One possible way to overcome this predicament would be to positively redefine Heidegger’s “Gerede” [idle talk]. Instead of a culture of complaint that dreams of an undisturbed offline life and radical measures to filter out the noise, it is time to openly confront today’s trivial forms of Dasein in blogs, text messaging and computer games. Intellectuals should no longer portray Internet users as secondary amateurs, cut off from a primary and primordial relationship with the world. There is a greater issue at stake and it requires venturing into the politics of informatic life. It is time to address the emergence of a new type of corporation that is rapidly transcending the Internet: Google.

The World Wide Web, which should have realized Borges’ infinite library, as described in his short story The Library of Babel (1941), is seen by many of its critics as nothing but a variation of the Big Brother of Orwell’s Nineteen Eighty-Four (1949). The ruler, in this case, has turned from an evil monster into a collection of cool youngsters whose corporate responsibility slogan is “Don’t Be Evil”. Guided by a much older and more experienced generation of IT gurus (Eric Schmidt), Internet pioneers (Vint Cerf) and economists (Hal Varian), Google has expanded so fast, and in such a wide variety of fields, that there is virtually no critic, academic or business journalist who has been able to keep up with the scope and speed with which Google has developed in recent years. New applications and services pile up like unwanted Christmas presents with increasing regularity. Just add Google’s free email service Gmail, the video sharing platform YouTube, the social networking site Orkut, Google Maps and Google Earth, its main revenue service AdWords with its pay-per-click advertisements, and office applications such as Calendar, Talk and Docs. Google not only competes with Microsoft and Yahoo, but also with entertainment firms, public libraries (through its massive book scanning program) and even telecom firms. Believe it or not, the Google Phone is coming soon. I recently heard a less geeky family member saying she had heard that Google was much better and easier to use than the Internet. It sounded cute, but she was right. Not only has Google become the better Internet, it is taking over software tasks from your own computer so that you can access these data from any terminal or handheld device. Apple’s MacBook Air is a further indication of the migration of data to privately controlled storage bunkers. Security and privacy of information are rapidly becoming the new economy and technology of control. And the majority of users, and indeed companies, are happily abandoning the power to self-govern their informational resources.

My interest in the concepts behind search engines was raised again while reading a book of interviews with MIT professor and computer critic Joseph Weizenbaum, known for his 1966 automatic therapy program ELIZA and his 1976 book Computer Power and Human Reason.2 Weizenbaum died on March 5, 2008 at the age of 85. A few years ago Weizenbaum moved from Boston back to Berlin, the city where he grew up before escaping with his parents from the Nazis in 1935. The interviews were conducted by Munich-based journalist Gunna Wendt. A number of Amazon reviewers complained about Wendt’s uncritical questions and the polite, superficial level of her contributions. This did not disturb me. I enjoyed the insights of one of the few insider critics of computer science. Especially interesting are Weizenbaum’s stories about his youth in Berlin, the exile to the USA and the way he became involved in computing during the 1950s. The book reads like a summary of Weizenbaum’s critique of computer science, namely that computers impose a mechanistic point of view on their users. What especially interested me was the way in which the “heretic” Weizenbaum shapes his arguments as an informed and respected insider – a position similar to the “net criticism” that I have developed with Pit Schultz ever since we started the “nettime” project in 1995.

The title and subtitle of the book sound intriguing: Wo sind sie, die Inseln der Vernunft im Cyberstrom? Auswege aus der programmierten Gesellschaft [Where are they, the islands of reason in the cyber-stream? Ways out of the programmed society]. Weizenbaum’s system of belief can be summarized roughly as follows: “Nicht alle Aspekte der Realität sind berechenbar” [Not all aspects of reality are calculable]. Weizenbaum’s Internet critique is a general one. He avoids becoming specific, and we have to appreciate this. His Internet remarks are nothing new for those familiar with Weizenbaum’s oeuvre: the Internet is a great pile of junk, a mass medium that consists of up to 95% nonsense, much like the medium of television, towards which the Web is inevitably developing. The so-called information revolution has flipped into a flood of disinformation. The reason for this is the absence of an editor or editorial principle. The book fails to address why this crucial media principle was not built in by the first generations of computer programmers, of which Weizenbaum was a prominent member. The answer probably lies in the computer’s initial employment as a calculator. Techno-determinists in Berlin’s Sophienstrasse and elsewhere insist that mathematical calculation remains the very essence of computing. The (mis)use of computers for media purposes was not foreseen by the mathematicians, and today’s clumsy interfaces and information management should not be blamed on those who designed the first computers. Once a war machine, the digital calculator faces a long and winding road before it is repurposed into a universal human device that serves our endlessly rich and diverse information and communication purposes.

On a number of occasions I have formulated a critique of “media ecology” that intends to filter “useful” information for individual consumption. Hubert Dreyfus’ On the Internet (2001) is one of the key culprits here. I do not believe that it is up to any professor, editor or coder to decide for us what is and what is not nonsense. This should be a distributed effort, embedded in a culture that facilitates, and respects, difference of opinion. We should praise the richness and make new search techniques part of our general culture. One way to go would be to further revolutionize search tools and increase the general level of media literacy. If we walk into a bookstore or library, our culture has taught us how to browse through the thousands of titles. Instead of complaining to the librarian or informing the owners that they carry too many books, we ask for assistance or work it out ourselves. Weizenbaum would like us to distrust what we see on our screens, be it television or the Internet. But he fails to mention who is going to advise us what to trust, whether something is truthful or not, or how to prioritize the information we retrieve. In short, the role of mediators is jettisoned in favor of cultivating general suspicion.

Let’s forget Weizenbaum’s info-anxiety. What makes the interview such an interesting read is its insistence on the art of asking the right question. Weizenbaum warns against an uncritical use of the word “information”. “The signals inside the computer are not information. They are nothing more than signals. There is only one way to turn signals into information, through interpretation.” For this we depend on the labor of the human brain. The problem of the Internet, according to Weizenbaum, is that it invites us to see it as a Delphic oracle. The Internet will provide the answer to all our questions and problems. But the Internet is not a vending machine in which you throw a coin and then get what you want. The key here is the acquisition of a proper education in order to formulate the right query. It’s all about how one gets to pose the right question, and for this one needs education and expertise. We do not reach a higher standard of education simply by expanding the possibility to publish. Weizenbaum: “Die Möglichkeit, dass jeder etwas ins Internet stellen kann, bedeutet nicht sehr viel. Das willkürliche Hineinwerfen bringt genauso wenig wie das willkürliche Fischen.” [The possibility that everyone can put something on the Internet does not mean much. Randomly throwing something in yields as little as randomly fishing.] In this context Weizenbaum makes the comparison between the Internet and the now-vanished CB radio. Communication alone will not lead to useful and sustainable knowledge.

Weizenbaum relates the uncontested belief in (search engine) queries to the rise of the “problem” discourse. Computers were introduced as “general problem solvers” and their purpose was to provide a solution for everything. People were invited to delegate their lives to the computer. “We have a problem,” argues Weizenbaum, “and the problem requires an answer.” But personal and social tensions cannot be resolved by declaring them a problem. What we need instead of Google and Wikipedia is the “capacity to scrutinize and think critically”. Weizenbaum explains this with reference to the difference between hearing and listening. A critical understanding requires that we first sit down and listen. Then we need to read, not just decipher, and learn to interpret and understand.

As you might expect, the so-called Web 3.0 is heralded as the technocratic answer to Weizenbaum’s criticism. Instead of Google’s algorithms based on keywords and an output based on ranking, soon we will be able to ask questions to the next generation of “natural language” search engines such as Powerset. However, we can already guess that computational linguists do not question the problem-answering approach and will be cautious about acting as a “content police force” that decides what is and what is not crap on the Internet. The same applies to Semantic Web initiatives and similar artificial intelligence technologies. We are stuck in the age of web information retrieval. Whereas Google’s paradigm was one of link analysis and PageRank, next generation search engines will, for instance, become visual and start indexing the world’s images, this time not based on the tags that users have added but on the “quality” of the imagery itself. Welcome to the Hierarchization of the Real. The next volumes of computer-use manuals will introduce programmer geeks to aesthetic culture 101. Camera-club enthusiasts turned coders will be the new polluters of bad taste.

Ever since the rise of search engines in the 1990s we have lived in the “Society of the Query”, which, as Weizenbaum indicates, isn’t that far removed from Guy Debord’s Society of the Spectacle. Written in the late 1960s, this Situationist analysis was based on the rise of the film, television and advertisement industries. The main difference today is that we are explicitly requested to interact. We’re no longer addressed as an anonymous mass of passive consumers. Instead we are “distributed actors” who are present on a multitude of channels. Debord’s critique of commodification is no longer revolutionary. The pleasure of indulging in consumerism is so widespread that it has reached the status of a universal human right. We all love the commodity fetish, the brands, and indulge in the glamour that the global celebrity class performs on our behalf. There is no social movement or cultural practice, however radical, that can escape the commodity logic. No strategy has been devised to live in the age of the post-spectacle. Concerns have instead been focusing on privacy, or what’s left of it. The capacity of capitalism to absorb its adversaries has been such that it has been next to impossible to argue why we still need criticism – in this case of the Internet – unless all your private telephone conversations and Internet traffic become publicly available. Even then, it is difficult to make the case for critique rather than mere organized complaint by a consumer lobby group. Consider this “shareholder democracy” in action. Only then will the sensitive issue of privacy become the catalyst for a wider consciousness about corporate interests, but its participants will be carefully partitioned. Entry to the shareholding masses is restricted to the middle classes and above. And this only amplifies the need for a lively and diverse public domain in which neither state surveillance nor market interests have a vital say.

Already in 2005 the president of the French Bibliothèque nationale, Jean-Noël Jeanneney, published a booklet in which he warned against Google’s claim to “organize the world’s information”.3 It is not up to any single, private corporation to assume such a role. Google and the Myth of Universal Knowledge, published in English translation by the University of Chicago Press, remains one of the few documents that openly challenge Google’s uncontested hegemony. Jeanneney targets only one specific project, Book Search, in which millions of books from American university libraries are being scanned. His argument is a very French-European one. Because of the unsystematic and unedited manner in which Google selects the books, the archive will not properly represent the giants of national literature such as Hugo, Cervantes and Goethe. Google, with its bias towards English sources, will therefore not be the appropriate partner to build a public archive of the world’s cultural heritage. Says Jeanneney: “The choice of the books to be digitized will be impregnated by the Anglo-Saxon atmosphere.” While in itself a legitimate argument, the problem here is that it is not in Google’s interest to build and administrate an online archive in the first place. Google suffers from data obesity and is indifferent to calls for careful preservation. It would be naïve to demand cultural awareness. The prime objective of this cynical enterprise is to monitor user behavior in order to sell traffic data and profiles to interested third parties. Google is not after the ownership of Emile Zola. Its intention is to lure the Proust fan away from the archive. Perhaps there is an interest in a cool Stendhal mug, the XXL Flaubert T-shirt or a Sartre purchase at Amazon. For Google, Balzac’s collected work is abstract data junk, a raw resource whose sole purpose is to make a profit, whereas for the French it is the epiphany of their language and culture. It remains an open question whether the proposed European answer to Google, the multimedia search engine Quaero, will ever become operational, let alone embody Jeanneney’s values. By the time of Quaero’s launch, the search engine market will be a generation ahead of it in media and device capabilities; some argue that Mr. Chirac was more interested in defending French pride than in the global advancement of the Internet.4

Every week we see the launch of yet another Google initiative. Even for informed insiders it is next to impossible to keep up, let alone reveal a master plan. As we write, mid-April 2008, there is the Google App Engine, “a developer tool that enables you to run your web applications on Google’s infrastructure”. It’s a perfect example of how a company that owns today’s infrastructure is able to concentrate more power. App Engine will allow startups to use Google’s web servers, APIs, and other developer tools as the primary architecture for building new web applications. As Richard MacManus remarks, “Google clearly has the scale and smarts to provide this platform service to developers. However, it begs the question: why would a startup want to hand over that much control and dependence to a big Internet company?” Computing infrastructure is rapidly turning into a utility and the Google App Engine is yet another example of this. So MacManus ends with the rhetorical question: “Would you want Google to control your entire end-to-end development environment? Isn’t that what developers used to be afraid of Microsoft for?” The answer might be simple: it is the developers’ not-so-secret wish to be bought by Google. Millions of Internet users are, willingly or not, participating in this process by freely providing these companies with their profiles and attention, the currency of the Internet. A few weeks earlier Google patented a technology that will enhance its ability to “read the user”. The intention is to decipher which page regions and topics the viewer is interested in, based on the viewer’s behavior after they have arrived at a page. This is just one example of the many analytical techniques this media company is developing to study and commercially exploit user behavior.
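To make the dependence MacManus describes concrete, consider a minimal, illustrative sketch of what an App Engine application looked like at the time, written against the Python “webapp” framework of the 2008 SDK (the handler name and greeting text here are my own, not taken from Google’s documentation). Even a trivial page handler imports Google’s own google.appengine packages, so the application is tied to Google’s runtime from its first line.

    # Illustrative sketch of a 2008-era Google App Engine handler (Python 2.5 runtime).
    # The imports come from Google's SDK, not the standard library: this code only
    # runs on Google's infrastructure or in its local development server.
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class MainPage(webapp.RequestHandler):
        def get(self):
            # Answer GET / with plain text, served from Google's servers.
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.out.write("Hello from Google's infrastructure")

    application = webapp.WSGIApplication([('/', MainPage)], debug=True)

    def main():
        run_wsgi_app(application)

    if __name__ == '__main__':
        main()

Storage, deployment and scaling are likewise handled – and owned – by Google, which is precisely the end-to-end control MacManus questions.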

It is no great surprise that Google’s fiercest critics are North Americans. So far, Europe has invested surprisingly little of its resources into the conceptual understanding and mapping of new media culture. At best, the EU is an early adopter of technical standards and products from elsewhere. But what counts in new media research is conceptual supremacy. Technology research alone will not do the job, no matter how much money the EU invests in future Internet research. As long as the gap between new media culture and major governing, private and cultural institutions is reproduced, a thriving technological culture will not be established. In short, we should stop seeing opera and the other fine arts as a form of compensation for the unbearable lightness of cyberspace. Besides imagination, a collective will and a good dose of creativity, Europeans could mobilize their unique capacity to grumble into a productive form of negativity. The collective passion to reflect and critique may well be channelled into a movement of “critical anticipation” that can overcome the outsider syndrome many feel with regard to their assigned role as mere user and consumer.

Jaron Lanier wrote in his Weizenbaum obituary: “We wouldn’t let a student become a professional medical researcher without learning about double blind experiments, control groups, placebos and the replication of results. Why is computer science given a unique pass that allows us to be soft on ourselves? Every computer science student should be trained in Weizenbaumian skepticism, and should try to pass that precious discipline along to the users of our inventions.”5

We have to ask ourselves: why are the best and most radical Internet critics US-Americans? We can no longer use the argument that they are better informed. My two examples, both following in Weizenbaum’s footsteps, are Nicholas Carr and Siva Vaidhyanathan. Carr comes from the industry (Harvard Business Review) and has established himself as the perfect insider critic. His recent book, The Big Switch, describes Google’s strategy to centralize, and thus control, the Internet infrastructure through its data centers.6 Computers are becoming smaller, cheaper and faster. This economy of scale makes it possible to outsource storage and applications at little or no cost. Businesses are switching from in-house IT departments to network services. There is an ironic twist here. While generations of hip IT gurus cracked jokes about IBM chief Thomas Watson’s prediction that the world only needed five computers, this is exactly the trend we now see. Instead of further decentralizing, Internet use is concentrated in a few, extremely energy-demanding data centers.7 Carr’s specialty is amoral observations of technology, ignoring the greedy character of the dotcom-turned-Web 2.0 class. Siva Vaidhyanathan’s project, The Googlization of Everything, has the ambition to synthesize critical Google research into a book that is due to come out in late 2009. In the meantime, he collects the raw material on one of his blogs.8

For the time being we will remain obsessed with the diminishing quality of the answers to our queries – and not with the underlying problem, namely the poor quality of our education and our diminishing ability to think in a critical way. I am curious whether future generations will embody – or shall we say design – Weizenbaum’s “islands of reason”. What is necessary is a reappropriation of time. At the moment there is simply not the time to stroll around like a flâneur. All information, any object or experience has to be instantaneously at hand. Our techno-cultural default is one of temporal intolerance. Our machines register software redundancy with increasing impatience, demanding installation of the update. And we are all too willing to oblige, mobilized by the fear of slower performance. Usability experts measure the fractions of a second in which we decide whether the information on the screen is what we are looking for. If we’re dissatisfied, we click further. Serendipity requires a lot of time. We could praise randomness, but we hardly practice this virtue ourselves. If we can no longer stumble into islands of reason through our inquiries, we may as well build them ourselves. With Lev Manovich and other colleagues I am arguing that we need to invent new ways to interact with information, new ways to represent it, and new ways to make sense of it. How are artists, designers and architects responding to these challenges? Stop searching. Start questioning. Rather than trying to defend ourselves against “information glut”, we can approach this situation creatively, as an opportunity to invent new forms appropriate for our information-rich world.

(Thanks to Ned Rossiter for his editorial assistance and ideas)


