Issue 13 Futures Spring 2004
Daniel Rosenberg and Susan Harding
The Future is not what it used to be.
We have been living through boom times for the future. Even before the escalating storms of the early 21st century, our cultures and industries collaborated in a remarkable proliferation of words and images about this impossible object. In recent years, the very idea of "the future" has been spectacularized in extraordinary ways. Whether in modes of progress or apocalypse, our media have overflowed with anticipations of things to come, with utopias, dystopias, stories of time travel and artificial intelligence, with accounts of acceleration and progress, of doom and imminent destruction, with scenarios, predictions, prophecies, and manifestos. Since the rise of the digital economy, even the benighted “science” of futurology has come back into style.1
In the first years of the 21st century, representations of the future have cycled wildly through a historical repertoire, from the ray-gun gothic of the 1930s to the noir and the endism of the 1940s and 1950s to the plastic modularity of the 1960s and back again. As if following a kind of Moore’s Law scaling principle, futures today seem to be reproducing themselves faster and more cheaply than ever. At the same time, their shelf-lives appear to be getting shorter. Any child can historicize them for you, can tell you in a minute which future is up to date and which is already over, which doesn’t run fast enough on the current microprocessor and which doesn’t run at all. In the computer world, an entire sub-industry has sprung up in what is called legacy software, programs written on old platforms, modified and translated to run on new machines as if it were still 1979 and the first wave of chunky Galaxians were twirling madly toward the missile defense systems and video arcades of our Earth.
More and more, our sense of the future is conditioned by a knowledge of futures that we have already lost. Indeed, nostalgia for the future has become so pervasive today that it has even developed a distinctive set of commercial uses. As Arjun Appadurai suggests, contemporary mass consumption “is not simply based on the functioning of simulacra in time, but also on the force of the simulacra of time.”2 If different modes of production imply different forms and experiences of temporality, our current habits of consumption appear to imply a nostalgia for productivity in general and for all of the different experiences of temporality that it might be able to generate.3 Today, our futures feel increasingly citational—each is haunted by the “semiotic ghosts” of futures past.4
The rise of this kind of nostalgia points up something both formally and historically important. The future is not an empty category. Even if we accept a skeptical critique of prophecy, we must acknowledge that for us the future is not so much undetermined as overdetermined. Our lives are constructed around knowledges of what is to come that are as full (and flawed) as our knowledges of the past. Often these future knowledges are profoundly freighted, since they involve anticipatory hopes and fears energized by pasts that are with us still. Our futures are not merely geometrical extensions of time. They haunt our presents, obeying architectural laws that look more like Gaudí than Euclid, arising in diverse and peculiar ways.
In historical terms, the development of future-nostalgia also points to a crisis in modern futurity. From the beginning, the modern was constituted through a rejection of prophecy. The philosophy of the Enlightenment required that time be open to human achievement and that events gain meaning from their interrelation, rather than from their relationship to absolute, Biblical beginnings and ends. By bracketing eschatological questions, the Enlightenment effectively “sealed off” the future from prophetic knowledge.5 But this development had paradoxical consequences. In no way did it amount to a going-out-of-business for futurological workshops. The Enlightenment proscription against traditional prophetic practices turned out to produce new and intensified imaginative demands on the future and new techniques of narration and prognosis.6 The very possibility of an open-ended time elicited an outpouring of grand narratives from Condorcet and Kant to Hegel and Comte. This effect was by no means limited to high philosophy. In the arena of fiction, for example, the late 18th century saw an efflorescence of future fantasies. And, for the first time in literary history, these futures took place not in some vague hereafter but in a chronological expanse freed from the finitude of sacred history, in the profane historical future, in the years 2440, 1850, 1900, and 7308.7
Of course, these future narratives were also morality tales for the present, but in them the present was materialized through striking new kinds of proleptic imagining. The new futurisms of the 18th and 19th centuries allowed—and even required—the thinking of alternative timelines: in them, the present was not just the past of the future, but the “past of future, contingent presents.”8 It is difficult to overestimate the implications of this new possibility. But it is equally crucial to note that its victory was only ever partial. The contingent futures that emerged during the Enlightenment never fully displaced the necessary futures of prophecy. In some instances, such as that of Auguste Comte, modern visions of progress themselves took on a providential character. In others, such as the 19th-century Uchronie of Charles Renouvier, contingencies piled on contingencies seemingly without end.9 Moreover, the religious prophets did not oblige anyone by going away. As it turns out, what most characterizes the modern problem of the future is not its historical distance from the mode of prophecy but rather its hybrid and contradictory relationship to it.
The modern period saw a proliferation of techniques for imagining, predicting, and narrating futures—many in an ambiguous terrain “between science and fiction”—and a developing cultural consciousness of the instability of this new temporal landscape. By the end of the 19th century, according to contemporary observers, time itself appeared to be accelerating, and futures—big and small alike—seemed to be coming and going with breathtaking speed. And this sense of acceleration did not abate. Instead, it became something like second nature, so that by the late 20th century, the problem was no longer how to account for historical acceleration, but how to account for the acceleration of acceleration itself.
At the same time, the coming and going of futures became such a regular feature of modern life that it has sometimes seemed as if it could have no history at all. Witness the turning of the recent millennium. Although the event itself did not occasion the level of cult activity or terrorism anticipated by many observers, it did provoke an outpouring of futurological speculation. Prophets, prognosticators, predictors, fortune-tellers, astrologers, millennialists, apocalyptics, visionaries, seers, and their journalistic and academic fellow-travelers clogged airwaves, magazines, newspapers, bookstores, and pews with their wares. As we approached 2000, the clock of discourse ticked louder and louder, and the future itself seemed to shrink to fit the narrowing frame left until the calendar turned over. When all was said and done, though, 2000 could not have been anything but an anticlimax to the countless stories in which it played an anticipatory role. There was something vampiric about the moment: a thousand flashbulbs popped, but nothing showed up in the picture. Still, invisibly, it was everywhere. It haunted us.
At the same time, the millennium set off a kind of world-wide explosion of future kitsch and marketing, of gadgets, blockbusters, and pageants, an entire world of media turned Busby Berkeley for a year. New York City took out a trademark and made itself the official world capital of the “event.” Airline tours were devised in order to allow paying passengers the experience of two or more millennial New Year’s Eves, and one South Pacific island went so far as to change its position on the international date line in order to offer wealthy tourists a guaranteed experience of arriving at the 21st century before anyone else in the world.
Even skeptics rushed into this boom future market. Rationalists assured us that “the millennium” was only a kind of folie à plusieurs based on a scientifically meaningless fascination with round numbers. But, at the same time, they traded in the fascination. In the months leading up to the turn-of-the-millennium, anticipations of the year 2000 transformed into fears of a Y2K computer bug, and for a while the future was now. As Y2K, the future acquired a technical, a rational, and especially, an economic profile. Its importance was to be measured in the amount of money spent preventing it, or cleaning up the mess that it created; Y2K gave us something to believe in and anticipate when we were barred from hoping for something mysterious. It also had the effect of spectacularizing a new world order—as, according to the experts, only the hypertechnologized and the primitive would be spared. It would be those technological and political stragglers of the second world, principally the former Communist world, who would be at risk, perhaps punished.
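For readers who remember Y2K only as a media event, the underlying defect is worth recalling. A minimal sketch, in Python, of the convention at its root: many older systems stored years as two digits to save space, so date arithmetic that worked for decades silently broke when the century rolled over. The function names here are illustrative, not drawn from any real system.

```python
# The classic Y2K defect in miniature: years stored as two digits,
# a common space-saving convention in legacy software.

def years_elapsed_2digit(start_yy, end_yy):
    """Naive elapsed-time calculation using two-digit years."""
    return end_yy - start_yy

def years_elapsed_4digit(start_year, end_year):
    """The corrected calculation using full four-digit years."""
    return end_year - start_year

# An account opened in 1965, checked in 1998: both versions agree.
print(years_elapsed_2digit(65, 98))      # 33
print(years_elapsed_4digit(1965, 1998))  # 33

# Checked in 2000 ("00"), the two-digit version goes negative:
print(years_elapsed_2digit(65, 0))       # -65 -- the "bug"
print(years_elapsed_4digit(1965, 2000))  # 35
```

Remediation, in practice, meant either widening stored dates to four digits or "windowing" two-digit years into an assumed century range; the expense of doing this at scale is what gave Y2K its economic profile.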
At Y2K, the big story turned out to be the non-story. As hours passed on New Year’s Eve and nations of the Earth passed in cohorts from the 20th to the 21st century, CNN and the networks reported “success” in nearly every locality. There were scattered reports of problems released from the bunker-style headquarters of our own Federal “Y2K Preparedness Center,” but none of these turned out to be serious. Certainly none approached the level of crisis created months later by the hacker-induced failures of several major web portals or the simple computer virus called the Love Bug.
But the failure of the Y2K apocalypse did not lessen its historical importance. Like any other national pageantry, Y2K in all its dimensions—cultural, commercial, political, and technological—energized an entire economy of anticipation and produced a powerful expressive performance of a still-unstable global culture business vying for metanarrative control over the future. The events of Y2K lavishly demonstrated that the future in the modern West is always already dense with meaning. “The future” is a placeholder, a placebo, a no-place, but it is also a commonplace that we need to understand in all of its cultural and historical density.
To this end, the articles and artifacts gathered here highlight everyday future-making practices: each works to illustrate and to understand the how of our anticipations as much as the what. The following section is a hypertext. While its subjects are diverse, they are also pervasively linked—technologies of time and trauma; the hope and hubris of the manifesto; conspiracy, prophecy, and utopia—subjects both deeper and more mundane than we usually recognize.
Susan Harding is professor of anthropology at the University of California, Santa Cruz. She is the author of The Book of Jerry Falwell: Fundamentalist Language and Politics (Princeton, 2000), among other works. Daniel Rosenberg is assistant professor of history in the Robert D. Clark Honors College at the University of Oregon. His next book concerns the history of the past.
Cabinet is published by Immaterial Incorporated, a non-profit 501(c)(3) organization. Cabinet receives generous support from the Lambent Foundation, the Orphiflamme Foundation, the Andy Warhol Foundation for the Visual Arts, the Opaline Fund, the New York City Department of Cultural Affairs, the Danielson Foundation, the Katchadourian Family Foundation, The Edward C. Wilson and Hesu Coue Wilson Family Fund, and many individuals. All our events are free, the entire content of our many sold-out issues is on our site for free, and we offer our magazine and books at prices that are considerably below cost. Please consider supporting our work by making a tax-deductible donation. Thank you for your consideration.
© 2004 Cabinet Magazine