Fall 2004

Standard Deviation: An Interview with Jürgen Link

Modernity and the reign of normalism

Anne Mihan, Thomas O. Haakenson, and Jürgen Link

It is commonly understood that what constitutes the “normal” is in a constant state of flux, but according to Jürgen Link, the very framework through which the normal is produced is itself mutable. A professor of Modern German Literature and Discourse Theory at the University of Dortmund, Link has produced a wide-ranging and sophisticated analysis of normalism and normativity in his Versuch über den Normalismus (“Inquiry into Normalism”). In a recent telephone interview with Link, Anne Mihan and Thomas O. Haakenson discussed the average, the “War on Terror,” the data-fication of society, and the meaning of normalism in contemporary culture.


Anne Mihan and Thomas O. Haakenson: Could you explain your use of the term “normalism” and its particular relevance for the years 1968 and 1989?

Jürgen Link: By normalism I understand the sum of all social apparatuses, or what Michel Foucault calls dispositifs, through which normalities are produced and reproduced in Western societies, equally on the side of objects and on the side of subjects: how normal bodies and normal souls are produced and reproduced. This normalism exists in two “ideal-typical varieties,” which differ according to their respective limits of normality, i.e., the border between what is still regarded as normal and what is constructed as abnormal. The first variety, “protonormalism,” appeared in the 19th century and drew the limits of normality tightly and firmly; the second, “flexible normalism,” was consolidated in the 1960s and draws them loosely. Flexible normalism is something entirely different from a strict protonormalism with commandments and bans and clear yes-no norms. Flexible normalism allows a greater range of normal behavior. In differentiating between these two types of normalism, I would note that Foucault’s notion of normaliser has been mistranslated. Normaliser does not mean “to normalize” but “to standardize,” as in the standardization of industrial norms. For Foucault, normaliser was a “manipulative conditioning” designed to standardize people’s behavior during the period of 19th-century protonormalism, and it has a very strong authoritarian character.

Retrospectively, one could say that 1968 was the final transition from “protonormalism” to “flexible normalism” in Western societies. In 1968 the tendency toward “flexible normalism” had already been underway for decades, but was not yet consolidated in its entirety. The significance of 1968 was that this change came in the form of a cultural revolution with numerous explosions. It happened like a build-up of water, as if a dyke had broken open in a very short period of time, in many different areas: in scientific production, partly in industrial production, and especially in subjectivity and sexuality. Another example is the contact between generations, between older people and the young. Also important was the overcoming of nationalism and the beginning of internationalization on a massive scale. All of this occurred in the form of an explosion.

By “explosion” I mean a process that occurs simultaneously on different levels, but most of all a rapidly escalating process of denormalization, in which normality goes off course and is broken. In France in 1968, it began with demands for the abolition of gender-segregation in students’ dormitories, out of which very shortly grew utopias of a free sexuality according to Reich and to Deleuze and Guattari. It began with demands for greater academic freedom like the kind practiced in Germany and quickly escalated toward demands for universities governed by the students. In the workers’ sphere it started with calls for pay increases and quickly led to a weeklong general strike and demands for production to be organized and supervised by the workers themselves.

The 1968 movement pressed against the boundaries of any normalism. That was the exceptional quality of the movement: it bordered on what one could call with Ernst Bloch “concrete utopia,” that is, a different, “transnormalistic” society, one that takes a critical stance toward all forms of normalism. Deleuze and Guattari’s Anti-Oedipus was a manifesto of such a “transnormalistic” way of life.

Because the transition happened in a short period of time, with unforeseeable proliferation and massive complexity, it reached its limits very quickly. The movement became afraid of its own courage and knew it had no answer to the question of what an entirely different, transnormalistic society should look like. Thus, very soon an “ebbing away” of this movement occurred. But society had changed fundamentally as a result; “flexible normalism” was realized in the areas I described, especially with respect to subjectivity.

And the fall of the Iron Curtain in 1989?

Nineteen eighty-nine was similar, but it was a belated revolution. In the West, everything was already operating on the principles of flexibilization; the East wanted to and had to catch up. The difference from 1968 was that this need to catch up was confronted with an already widely consolidated form of flexible normalism in the West. Therefore, questions about the experience of limits, about normalism in general, and especially about transnormalism were practically not even raised. There were at most a few minor tendencies, but they had no chance of being realized, so that a quick decrease in intensity of the revolutionary impetus and a rapid normalization occurred.

How do concepts like “normal,” “normalcy,” “normalism” relate to the “average,” the “mediocre,” the “banal,” and the “everyday”?

Normalism has an essential historical a priori, which I call “data-fication” or the “data-processing, data-producing society.” These are societies that continually, exhaustively, and routinely make themselves statistically transparent. By transparency I mean that such a society claims to determine statistically, as exactly as possible, the relevant dispersions within its population—of health, capital, knowledge, intelligence, quality of living, sexuality, and so on—and then to communicate the results to all its members, so that individuals learn their respective positions in the distribution and can develop strategies to improve those positions—diet, training, learning, psychotherapy, and so forth. Such a thing never existed in Antiquity or in the Middle Ages. It appeared during the 18th century in Western Europe and North America and prevailed in the course of the 19th century. Continually more areas of life became statistically transparent. It began with economic and medical data. Bodily data such as height and weight were recorded by statistical means. It then extended to questions of sexuality and to what one calls intelligence, the I.Q.

In normalism, averages are not per se normative norms in the sense of commandments. We do not all have to have an average weight or an average intelligence. Rather, averages—just like the boundaries of normalcy—are signposts for our orientation: between the averages and the limits of normalcy stretches a very broad normal range covering between two-thirds and 95% of the population. This broad normalistic normal range exudes a great feeling of safety—“With an I.Q. of 85 I am still far removed from the lower border”—and it tolerates very diverse data, that is, a kind of pluralism.
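The proportions cited here track the familiar properties of the Gaussian, or “normal,” distribution that underlies most normalistic statistics: roughly two-thirds of a normally distributed population lies within one standard deviation of the mean, and about 95% within two. As a minimal sketch, assuming the conventional standardization of I.Q. scores to a mean of 100 and a standard deviation of 15:

$$P(\mu - \sigma \le X \le \mu + \sigma) \approx 0.68, \qquad P(\mu - 2\sigma \le X \le \mu + 2\sigma) \approx 0.95$$

With $\mu = 100$ and $\sigma = 15$, the one-sigma band runs from 85 to 115 and the two-sigma band from 70 to 130; an I.Q. of 85 thus sits at the lower edge of the narrower band but well inside the broader “normal range” Link invokes.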

Other concepts of normalcy simply refer to “banality” or “the everyday,” but we should understand the concept more precisely, in relation to the “data-processing society.” There have always been averages, for the ancient Greeks or for ancient Jewry as well. But they were not conscious of them; averages were not part of their epistemology. For example, the average number of whippings a slave received in a week in ancient Greece was not a relevant datum—it played absolutely no role. For us it would immediately play a role, because we are oriented toward turning everything into data.

Again I refer to two very prominent examples in the US. The first is body weight. I believe this is an enormously important thing for subjectivities in our kind of society. It concerns the question of what one calls normal weight: how one determines it, where it is located in the dispersal of the spectrum, where the limits of normalcy are, at what point obesity becomes abnormal and absolutely to be addressed on a personal level and, on the other side, at what point one is underweight. The second example is the I.Q., the intelligence quotient. Here the question is whether one can quantify intelligence on a graduated scale and whether it actually follows a normal, or so-called Gaussian, distribution.

Rico Lins’s cover image for the “normalism” issue of kultuRRevolution, a journal co-edited by Jürgen Link.

Does making something statistically quantifiable thus imply a kind of generally accepted average or even a sort of common sense?

Let us take two examples: are female soldiers in the US Army a normalcy? In order to answer this question, the percentage—additionally broken down by rank—is decisive. If it is roughly above 5%, it is commonly agreed that there is statistical normalcy. Running parallel to such exact statistics is an impression that forms in the public’s mind, for example through images in the media: I see female soldiers everywhere, so there must be many of them; it must be normal. For the incidents of torture of Iraqi prisoners we will probably never receive exact statistics. Thus the official announcement declaring those incidents to be extreme exceptions from the norm runs counter to the public’s vague impressions, which are fueled by the media coverage.

Is the tendency toward normalism sometimes frustrated either by its own logic or by other means?

I have established six inequations, not all of which I will go through here. The first is: “normalcy is not the same as normativity.” The second is: “normalism is not the same as anthropological everydayness, or everydayness in general.” On the contrary, only in modernity is everyday life the result of normalization. That means, for example, that our transportation system is part of everyday normalcy because it must be continually normalized. In this respect “data-fication” plays an enormous role. One measures exactly how many cars travel on a certain street at a certain time and whether the street is wide enough for that amount of traffic. If it is not, it has to be widened. That, in my opinion, is normalistic everyday life.

A further inequation is “normalism is not the same as the aesthetically banal.” Modern art chafes against norms and strives to break, for instance, the classical norms. But that for me is not normalism. Art simply hates banality. Only when art positions itself against normalized everyday life does this lead to an interlinking with normality. If you reject radio music on aesthetic grounds because it is banal, and if “banal” simply means a large majority that is statistically formidable, only then would banality indeed be linked with normalism.

What is the relationship then between normalcy and capitalism?

Normalism is not a concept that is in competition with or on the same level as concepts such as capitalism, modernism in the sense of “modernity,” industrialism or technocratism. Normalism is not a totalizing concept but, on the contrary, it indicates a discontinuous and ruptured net of social apparatuses or dispositifs.

But normalism has clearly been linked closely with capitalism since its beginnings. The first great data collections were the mass measurements of body weight and height in medical examinations connected with the military. The others were collections of economic data: amounts of imports and exports, budget deficits, and also demographic data such as deaths, births, and so forth. If today one looks at the “Wall Street syndrome,” one sees actual “data-fication” in every walk of life. Everything must be turned into data as completely and precisely as possible so that prognostic curves can somehow be calculated. In this respect there is a very close linkage between normalism and capitalism.

This “data-fication” also holds for subjectivity. Normalism atomizes, that is, isolates and segregates, the individual, simply owing to statistical conditions. This has often gone unreflected. If one considers the origins of modern individualism, one points to Protestantism or the Enlightenment or capitalism, among other things, and forgets the question of statistics. Statistical data processing presupposes that I treat every individual the same. If I want to determine an average weight or an average I.Q., I cannot treat particular categories of people—say kings, aristocrats, or capitalists—as special. On the contrary, I have to treat everyone the same; otherwise I cannot calculate an average. This is normalism’s very important contribution to the atomization of society.

But if society is confronted with the constant pressure of atomization, how is it held together at all?

One possibility is the so-called achievement principle, the principle of competition applied to individuals. If I place all individuals in competition with one another, measure their performance, and say there is an average, that these people are at the top level, those are in the middle, and here is the border of normalcy, I once again associate these groups of atomized and isolated individuals.[1] That, in my opinion, is the main point in the linking of capitalism and normalism at the level of the subject.

In your book Versuch über den Normalismus you refer to “aporias” in your discussion of normalistic association types. Many of these associations have been confronted with the limits of their own inclusiveness. In the demand for gay marriage, have lesbians and gay men succumbed to the pull of normalization?

This is precisely that tension, an aporia. During the protonormalism of the 19th century, when the ranges of normalcy were limited as firmly and strictly as possible, only heterosexual, procreative sexuality was considered normal. That means that there was a distribution curve in which the “normal range” was rather narrowly defined and many things were considered abnormal.

In flexible normalism’s attitude toward homosexuality, one can see very well how the limits of normalcy are extending further outward. More of the abnormalities of earlier protonormalism have been integrated into the zone of normalcy. I believe that the Kinsey reports have played a great role in this process, because they proposed a veritable “data revolution” with respect to homosexuality. Lesbian and gay individuals are integrated into normalism, into the “normal range.” Once they are in the “normal range,” they are, however, in a certain way once again atomized and re-dispersed.

Let me explain: At the beginning of all minority movements there is a strong transnormalistic tendency that is critical of all forms of normalism: one would say, “We are a community on the grounds of certain characteristics that run counter to specific societal norms.” Then something occurs that is to be expected within the theory of normalism: one cannot isolate oneself fully from society. One continues to be part of a society that operates normalistically. This results in the integration of a few individuals from minority groups: certain women become managers, some blacks move up the social ladder. Atomization or separation cannot be prevented easily. In their communities, minorities are thus confronted incessantly with this tension, this aporia. They have to safeguard their association and, at the same time, they cannot completely withdraw from the filtering and redistributing movement of society-at-large.

Is the result of normalization a stripping away of difference?

Foucault already dealt with this to some extent, but I believe that it is not quite clear in his work. He said that people are individualized in disciplinary societies. But what is this form of individualization? It is something like a computer search. If we include fingerprints and biometric data in passports—not only in the US but also in Germany, where Minister of the Interior Otto Schily wants to introduce this—such a process conforms to this concept of individualization. It means the possibility of being identified, “checked out” immediately by a computer. That is not in the interest of difference at all. On the contrary, this form of individualization occurs on the basis of an enforced conformity.

The so-called War on Terror is linked for many with a “de-democratization” of society resulting in restrictive political measures and tendencies toward self-censorship on the part of the media. Is this a struggle for normalcy?

Yes, there is certainly a connection. In his first big speech after September 11, in which George W. Bush declared the so-called War on Terror, he said, among other things, “It is my hope that in the months and years ahead life will return almost to normal.” That was, I believe, a very important sentence. At that time one hoped it would take just a few months. Three years later, I believe we do not have the impression that we ever really returned to normalcy.

These years have set free what I describe as a “fear of denormalization,” a fear of losing normalcy, of being taken out of normalcy. Here again I see a form of aporia, because I believe that the concept of a “Global War on Terror” is the programmatic attempt to produce normalcy “preemptively”—with force where necessary, and globally across the entire planet. A global escalation of normalization, so to speak. It affects the entire world, including numerous countries with which we are not at all familiar. We know nothing about the types of people and groups that exist there. I believe that even the proponents of this concept now see that it is a very risky strategy. There is the enormous risk that instead of heading toward normalization this campaign might somehow derail and head toward denormalization. This fear, I believe, is at present shared more or less by the majority of people in the US as well as in Europe. It is a vague fear that somehow an escalation of denormalization could be unleashed.

This fear is in part massively repressed by political leaders and most of the major media when they say, “We have everything under control. It is continually getting better.” But the repressed fear of denormalization leads to panic reactions such as the reduction of democratic rights and other extreme tendencies. There is a massive contradiction between the fear of denormalization and the promise of normalization. The underlying fear of denormalization is there; it has to be worked out somehow by each of us.

The interviewers would like to thank Mirko M. Hall and the senior editors of Cultural Critique—Jochen Schulte-Sasse, Keya Ganguly, and John Mowitt—for access to advance copies of Link’s translated essays and to the translator’s notes to these texts.

  1. Link hyphenates the term As-soziation in order to combine the ideas of a type of community with that of building a “social body” (“socius” in the terminology of Deleuze and Guattari).

Jürgen Link is professor of modern German literature and discourse theory 
at the University of Dortmund. His recent books include Nationale Mythen und Symbole (1991); Versuch über den Normalismus (1996); and Hölderlin-Rousseau: Inventive Rückkehr (1999).

Anne Mihan studied German and English at Humboldt University in Berlin and at the University of Minnesota, Minneapolis. She is a doctoral candidate in the American Studies Institute at Humboldt University and is completing a dissertation in literary studies.

Thomas O. Haakenson is a doctoral candidate in the Department of Cultural Studies and Comparative Literature at the University of Minnesota. He is 
completing a dissertation in Berlin, Germany, through the auspices of the Max Planck Institute for the History of Science and the Berlin Program for Advanced German and European Studies.
