5/ Social Meanings

Like perception, the process of classification clearly underscores the role of meaning in human cognition. Nowhere, however, is that role as explicitly evident as when we examine the way we use symbols. Using symbols presupposes a mental association of two elements, one of which (the "signifier") is regarded as representing, or "standing for," the other (the "signified").1 The meaning of a present, thus, is the personal affection it is supposed to represent. Convertible cars likewise may signify free-spiritedness, whereas cigars are often associated in our minds with virility.

When we regard something as a symbol, we are primarily concerned with what it represents, to the point where what it actually "means" sometimes overrides any functional significance it may otherwise have for us. Champagne, for example, is thus associated in our minds far more with celebrating than with simply quenching thirst. Long, polished fingernails likewise function primarily not as claws but as symbolic evidence of having reached a certain social status that protects one from the need to perform hard physical labor.2

Along similar lines, clothes become for us more than simply practical responses to weather conditions. Differences in clothing, for example, often signify differences between contrasting types of social space (home versus work),3 time (sacred versus profane),4 and ambiance (formal versus casual). Loosening one's tie or removing one's shoes may thus signify a switch from formality to informality, whereas opening a button in one's blouse may be "read" as an implicit invitation to one's bed. We likewise use clothes to signify various professional (military uniform), ethnic (sari), gender (nylon stockings), and political (pro-life T-shirt) identities.5

Like clothes, time, too, is loaded with extra-functional meanings. It is the distinctly symbolic significance of "lead time" (rather than just the practical fact that one may have other plans), for example, that may make one decline a "last-minute" invitation to a wedding or a prom, and the political symbolism of waiting that sometimes leads people to be purposefully late and let others "cool their heels" in order to humiliate them. When we specifically refrain from contacting someone "too soon," it is usually because of the way we associate frequency of contact with intimacy (which is also why switching from getting together every day to every other week is likely to be interpreted as indicative of some significant cooling off in a relationship).6

The mental association of a particular signifier with a particular signified may very well be the work of a particular individual. The meaning of our dreams, for example, is for the most part personal, and no respectable psychoanalyst would ever attempt to "decode" a dream without some significant input from the particular patient who dreamt it. Most symbolic associations, however, involve shared meanings and, as such, are not just personal. At the same time, they are not natural either. In fact, most meanings rest on conventional, sociomental associations of particular signifiers with particular signifieds.

To fully appreciate the unmistakably social nature of symbols, we need to compare them with other kinds of signs. It would be particularly useful, in this regard, to examine the extent to which the mental association of a particular signifier with the particular signified it represents is natural.
That, of course, would allow us to establish "where" the meaning of any given sign falls between inevitability and conventionality. As Figure 5.1 shows, approaching signification from this angle basically yields three major types of signs—the ones embodying the two polar extremes of absolute inevitability (indicators) and pure conventionality (symbols), and an in-between composite of major elements of both (icons).7

The most striking contrast is the one between symbols and indicators, which are distinctively characterized by the intrinsic (and thus inevitable) nature of their association with what they represent. The semiotic relation between an indicator and what it signifies to us is absolutely natural and does not require any artificial mediation in the form of social convention. Consider medical symptoms, those pieces of physical evidence we regard as "symptomatic," or "indicative," of particular diseases. The mental association of rectal bleeding with cancer of the colon or of certain blisters with the viral condition we call chicken pox, for example, has absolutely nothing to do with social convention. Nor does the association of smoke with fire, of daffodils with spring, of a fast heartbeat with excitement, or of the height of a mercury column in a glass tube with the temperature. Such connections are certainly not something for which we ourselves are in any way responsible.

Most of the signs we use in our daily life, however, are not indicators. Their association with what they are supposed to represent to us involves a certain element of human intervention and, as such, is far from inevitable. The meaning of these signs is clearly extrinsic to them, an unmistakably artificial property added by social convention alone.

[Figure 5.1. Types of signification: indicator, icon, and symbol arrayed along a continuum running from inevitability (a natural association of signifier with signified) to conventionality (an artificial association of signifier with signified).]

The artificial, conventional nature of all signs other than indicators is evident even in iconic representation, which still presupposes some physical resemblance between signifier and signified. After all, despite the obvious physical affinity between the cross and the actual shape of the crucified body of Christ, its mental association with Christianity is by no means inevitable and rests at least partly on social convention.8 And whereas the smell of a rabbit inevitably denotes to predators an actual rabbit, the mental association of a conventional outline drawing of a rabbit with an actual rabbit already presupposes an element of socialization. (In fact, slightly shorter ears and a somewhat longer tail would immediately "transform" such a conventional rabbit into a conventional cat, despite the fact that no actual cat ever looks like a long-tailed, short-eared rabbit.)9 Only humans, of course, seem to undergo such semiotic socialization, which explains why, despite the obviously great arousal power of nude photographs for humans, we would never expect a fox, for example, to actually be turned on by a mere picture (or any other iconic representation) of a vixen's body.

The contrast between indicators and signs whose meaning is less inevitable and more conventional becomes even more pronounced as we proceed from merely iconic to strictly symbolic modes of representation (whereby sexual arousal, for example, is generated not even by an actual picture of a naked body but through the mere use of "erotic" language to describe it).
Unlike icons, symbols do not presuppose any physical affinity whatsoever between a signifier and what it is supposed to represent to us. It is the difference between an outline drawing of a rabbit and the word "rabbit," between the Roman numeral "III" and the mathematical sign "3," between a cross and the sound of a pipe organ. In fact, the meaning of symbols is completely dissociated from their physical properties. Had we not been semiotically socialized to "read" clocks, for example, we would never be able to tell the time from the angle formed by a clock's hands. Nor would we be able to figure out that a rectangle with white, green, and red horizontal stripes "means" Bulgaria, or that the dove, which is in fact a rather unfriendly bird, has anything to do with peace. Of all the signs we use, the mental association between signifier and signified is the least inevitable in symbols. Indeed, it is their absolutely artificial, conventional nature that distinguishes symbols from all other signs.10

Thus, if we are to understand the full meaning of a particular symbol, we cannot afford to consider its physical properties alone. A symbolic analysis of snakes that focuses strictly on their iconic "phallic" essence, for example, is inherently limited. Nor can we fully understand why widows wear black dresses if we focus only on the seemingly inevitable macabre nature of the color black. In fact, the actual color of the dresses widows wear is not as symbolically significant as the fact that it so sharply contrasts with the color of the dresses brides wear. Indeed, the meaning of the colors of the dresses both brides and widows wear is the message they convey together about the fundamental cultural contrast between the social states of entering and exiting marriage for women.11

The meaning of symbols generally derives not from their own inherent properties but from the way they are semiotically positioned in our minds vis-à-vis other symbols. Thus, in order to understand how a particular symbol comes to be mentally associated with what it represents to us, we must first understand how it is related to other symbols we use. The critical relation, in this regard, is contrast or opposition, since a symbol basically derives its meaning from its distinctive features, those properties that distinguish it from other symbols.12 In order to know what it is, we must therefore first find out what it is not. I once saw a man looking at two public restroom doors labeled "Bucks" and "Does" who muttered to himself, "I am not a doe, so I guess I am a buck." In order to understand the meaning of the word "man," for example, we first need to know whether it is used in contrast to "woman" (as in "this is definitely a man's job"), "child" (as in "for God's sake, act like a man"), "animal" (as in "when man started using language"), or "nature" (as in "man-made fibers"). Semantics, in short, is inseparable from syntactics.
In order to fully understand the meaning of a symbol, we must transcend the narrow confines of a strictly semantic analysis and consider also the syntactic context within which it is structurally embedded (that is, the way it is semiotically contrasted in our minds with other symbols).13

A "semiotic square" may help us see more clearly the relational nature of symbols, the fact that they basically derive their distinctive meanings from the way they are semiotically contrasted in our minds with other symbols.14 The color blue, for example, is conventionally associated with baby boys. As evident from Figure 5.2, this association can be fully understood only within the context of the essentially homologous conventional association of the color pink with baby girls [a]. The mental association of handwriting with informality, or of the French tu with intimacy, can likewise be understood only within the context of the somewhat parallel conventional association of printing with formality [b] and of vous with social distance [c]. And before we rush to the conclusion that Sunday was chosen by the Church to represent Christianity mainly because it happened to be the day of the Resurrection, we should note that, in their effort to establish their distinctiveness vis-à-vis Jews, the early Christians were primarily looking for some day other than Saturday as the pivot of their week [d].15 (For the same reason, they later also proceeded to arrange their calendar so that Easter would never fall on the same day as the first evening of Passover.)16

[Figure 5.2. The semiotic square, showing the signifier pairs (a) pink/blue, (b) printing/handwriting, (c) tu/vous, (d) Sunday/Saturday, and (e) caviar/cereal and the corresponding signified pairs (a) female/male, (b) formal/informal, (c) intimate/distant, (d) Christian/Jewish, and (e) special/ordinary; each signifier is linked to its signified by a semantic association (+), while the members of each pair stand in syntactic contrast (−) to one another.]

Of course, if baby girls were all dressed in blue, baby boys could very well wear pink, as the basic structural color contrast between the sexes would still be maintained! The cultural meanings we conventionally attach to caviar and cereal [e] or to champagne and coffee17 would likewise be completely reversed if we had the former every morning for breakfast and reserved the latter only for special occasions. What is at stake here is a fundamental generic cultural distinction between the ordinary (the semiotically "unmarked") and the special (the semiotically "marked"). The actual contents of each of those two categories are of somewhat secondary significance.18 Thus, for shift workers, receiving an unexpected telephone call from one's boss at home in the middle of the morning basically has the same meaning as receiving such a call in the middle of the night would for a regular daytimer.19 By the same token, for prostitutes, kissing johns on the mouth is often considered a much more significant (and therefore dangerous) token of intimacy than fellating them. Given the semiotic role of the week as a cycle of periodic alternation between marked and unmarked time periods,20 which particular days we choose to mark is likewise somewhat secondary. Thus, if it were Monday rather than Saturday nights that we normally reserved as "special" nights for going out, switching from seeing others on Saturday nights to Monday nights would then convey the message of becoming more, rather than less, committed to them!

Given all this, it is hardly surprising that meanings may vary from one social environment to another.
Indeed, the same symbol often means different things in different social contexts (which ought to remind us, of course, that neither of those meanings is natural). The same act or object often has different meanings in different cultures. The very existence of the Scottish kilt, for example, helps dispel the notion that skirts are inherently feminine, whereas everyday encounters among Arabs remind us that "staring" at someone is not as inherently offensive as many Americans might believe.21 By the same token, the word met has quite different meanings in English and in Hebrew (where it actually means "dead"). Consider also the "language" of hair. On the Polynesian island of Tikopia, for example, men usually allow their hair to hang free on formal rather than informal occasions, certainly helping to dispel the Western notion that loose hair inherently signifies informality. The fact that on that island it is also women who cut their hair short and men who wear theirs long likewise reminds us that our conventional association of long hair with femininity is not as natural (and therefore inevitable) as we might think. Indeed, for women in Tikopia, the act of growing long hair is as symbolically defiant as the act of cutting their hair short has often been for women in the West.22

The fact that meanings often change considerably over time even within the same culture further underscores their unmistakably conventional nature. The Beatles' 1964 haircut, which at the time seemed so wild and disheveled, actually looked quite tame (and even short) only four years later! And although long hair seemed to be intrinsically associated in the West during the late 1960s and early 1970s with subversiveness, it certainly did not have that meaning at the time of Benjamin Franklin, Louis XIV, or Johann Sebastian Bach.

Furthermore, meanings sometimes vary considerably across contemporary cognitive subcultures within the same society. (As evident from the current cultural battles in America over the meaning of abortion, of sexual harassment, and of owning a gun, such diversity often generates discord.) Thus, for example, unlike the Christians around them, Hindus in England today clearly do not regard Easter, the cross, or the Gospels as sacred. By the same token, in the United States, whereas Jews and Muslims associate the act of eating pork with sin, neither Methodists nor Presbyterians attach any moral significance to it.

Indeed, meanings vary considerably even within the same thought community, and the same act or object can actually have very different meanings when it is socially "packaged" differently. The meaning we attach to the act of killing someone, for example, certainly changes dramatically when it happens on the battlefield rather than on the subway, and handling other people's genitals obviously means something quite different in a gynecological examination and in bed.23 A $150 ticket for a concert likewise changes its meaning entirely when the proceeds go to AIDS victims.

By now it is probably evident that an essentialist view of symbolic meaning as an inherent property of the acts or objects with which it is conventionally associated is inadequate. It is quite clear that the "meaning" of the Finnish national anthem, for example, lies not in its actual sounds but in the mental associations they seem to evoke for some listeners as members of a particular thought community.
In other words, if many Finns indeed respond to the symbolism of their national anthem, they certainly do so not as human beings but as Finns. By the same token, while the sanctity of the Gospels, the Pope, or the cross is clearly not just the product of the mind of a particular individual, nor is it an inherent property of those "sacred" objects themselves.24 Rather, it is a product of the fact that a particular thought community has chosen to sanctify them. This is obviously also true of the "virtuous" nature of chastity, the "criminal" nature of treason,25 the "perverse" nature of bestiality, or the "noble" nature of self-sacrifice.

It is also true of the "edibility" of the things we conventionally eat. The fact that it is quite possible to eat (and even enjoy) food which we normally consider repulsive as long as we do not know exactly what it is that we are actually eating,26 for example, shows that revulsion has more to do with our minds than with our stomachs (which should not surprise us given the fact that our vomiting center is indeed located in the brain). Yet it is obviously not just individuals' own minds. While the fact that we are usually revolted by the same food objects that many others around us also find revolting suggests that revulsion is more than just a personal response of particular individuals, the fact that those objects often vary from one culture to another clearly suggests that neither is it an entirely natural response. After all, unlike many Americans, for example, only few Japanese find raw fish revolting. Nor, for that matter, are most Frenchmen revolted by frog legs or snails. Since neither the gastrointestinal tracts nor the vomiting centers of Frenchmen who do not find snails revolting are fundamentally different from those of Americans who do, it is quite clear that even a seemingly natural act such as vomiting is nonetheless affected by conventional, social definitions of what is repulsive. This is also true of many other seemingly natural acts such as laughing, blushing, trembling, or crying, which, in a somewhat similar fashion, are at least partly affected by unmistakably social definitions of what is "funny," "embarrassing," "dangerous,"27 or "sad."

Thus, rather than accept the common behaviorist view of human action in terms of natural responses to stimuli, we certainly need to pay more attention to the distinctly social process of meaning attribution that, through unmistakably conventional rules of mental association, actually links symbolic signifiers to the particular signifieds they come to represent to us. In other words, we need to recognize the cognitive role of society as a critical mediator between reality and our minds.28

Meaning, in short, is an unmistakably social relation between signifier and signified. Rather than an inherent property of the symbol itself, it is a product of the particular sociomental connection between the symbol and the particular thought community that uses it. The meaning of symbols, thus, is a property of the way they are socially used.

Despite all this, however, we often disregard the conventional nature of symbols, thereby basically reifying them. Since their meaning is clearly not just personal (after all, it is not just a few unrelated individuals who happen to link Sunday to Christianity or who in the 1960s came to associate long hair with subversiveness), we often assume that it must therefore be natural.
We thus forget that in Tunisia or Afghanistan, for example, the Finnish national anthem does not evoke any of the associations and images it so effectively evokes in Finland. The tendency to mistake intersubjectivity for objectivity is quite evident, for example, in the way we attribute natural, "intrinsically" erotic qualities to essentially nonsexual objects such as nylon stockings, sports cars, cigarettes, and jewelry. It can also be seen in the way we sometimes confuse totemic representations of collectivities with those collectivities themselves,29 which explains people's readiness to give their lives in order to protect their national flag as well as the 1982 war between Britain and Argentina over the barren Falkland Islands or the long-standing conflict between Arabs and Jews over Jerusalem.

Reifying the meaning of symbols basically blurs the fundamental distinction between signifier and signified, a major logical fallacy exemplified by the act of mistaking a map for the actual territory it is supposed to represent30 or the word rabbit for what it happens to mean in English. As such, it denies their unmistakably conventional nature. After all, despite our fetishistic attachment to nylon stockings, their "inherent" sexiness is only a product of the minds of the advertisers who are trying to get us to buy them.31 Nor, for that matter, is the "holy city" of Jerusalem intrinsically sacred.

The tendency to reify the meaning of symbols is eerily evocative of the highly nonreflective manner in which people process hypnotic suggestions or use language in Orwellian dystopias.32 Sadly enough, it promotes a way of thinking that is a far cry from the remarkably sophisticated level to which human cognition has evolved since the invention of language. The move from being able to invoke the mental image of a rabbit only by producing the actual smell of a rabbit to being able to do so by merely uttering the word rabbit has certainly increased our cognitive flexibility. Whereas the meaning of indicators is basically fixed, symbols can mean practically anything. Having a fine appreciation of their "algebraic" quality, Humpty Dumpty was indeed quite right when he told Alice: "When I use a word ... it means just what I choose it to mean ..."

One of the most important features of strictly symbolic systems of signification such as language is the fact that, unlike clothes, for example, which have other functions besides being symbols, words are intrinsically worthless. Yet it is precisely their lack of any intrinsic meaning that allows us to attach any meaning we wish to them, thereby providing them with a virtually unlimited signifying potential. Indeed, "the more barren and indifferent the symbol, the greater is its semantic power."34 The word rabbit can thus come to mean a doorknob, a feather, or a laptop computer if we so wish. By the same token, we can designate mere pieces of otherwise-useless paper as being worth $100 (or even thousands of dollars in the case of checks) precisely because, unlike milk or eggs, for example, money is valuable only as a means of exchange and is therefore intrinsically worthless!35

Such cognitive flexibility, in fact, is one of the great advantages we have over other animals, which, being seriously constricted semiotically by the fixed meanings of the signs they use, are clearly unable to transcend their strictly indicative function. A lion cannot decide what a particular roar it produces would mean.
And while dogs can certainly respond to words we teach them (sit, stay, no) as indicators, they obviously lack the cognitive ability to actively determine (and possibly even change) their meaning, which we, of course, are free to do with those same words as symbols.36 Reifying the meaning of symbols essentially reduces them to mere indicators and therefore implies a readiness to give up the greatest advantage that being able to use symbols offers us. It basically means trading the cognitive freedom that typically comes with flexible-mindedness37 for the inevitably constrictive way of thinking promoted by the rigid mind. Given the virtually unlimited signifying potential of symbols, it also means a terrible waste of our distinctively human capacity to think creatively.38

6/ Social Memories

Not only does our social environment influence the way we mentally process the present, it also affects the way we remember the past. Like the present, the past is to some extent also part of a social reality that, while far from being absolutely objective, nonetheless transcends our own subjectivity and is shared by others around us.

As evident from the universalistic tendency of those who study memory today to focus primarily on the formal aspects of the processes of organizing, storing, and accessing memories which we all share, they are largely interested in how humans remember past events. And yet, when they come to examine the actual contents of those memories, they usually go to the other extreme and focus on the individual. Nowhere is this individualistic bent more glaringly evident than in psychoanalysis, which deals almost exclusively with our distinctly personal memories.

Once again one can identify a relatively unexplored intellectual terrain made up of various "remembrance environments" lying somewhere between the strictly personal and the absolutely universal. These environments (which include, for example, the family, the workplace, the profession, the fan club, the ethnic group, the religious community, and the nation) are all larger than the individual yet at the same time considerably smaller than the entire human race.

Admittedly, there are various universal patterns of organizing, storing, and accessing past experiences that indeed characterize all human beings and actually distinguish human memory from that of dogs, spiders, or parrots. At the same time, it is also quite clear that we each have our own unique autobiographical memories, made up of absolutely personal experiences that we share with nobody else. Yet we also happen to have certain memories which we share with some people but not with others. Thus, for example, there are certain memories commonly shared by most Guatemalans or art historians yet only by few Australians or marine biologists. By the same token, there are many memories shared by nearly all Beatles fans, stamp collectors, or longtime readers of Mad Magazine, yet by no one else besides them. The unmistakably common nature of such memories indicates that they are clearly not just personal. At the same time, the fact that they are almost exclusively confined to a particular thought community shows that they are not entirely universal either. Such memories constitute the distinctive domain of the sociology of memory, which, unlike any of the other cognitive sciences, focuses specifically on the social aspects of the mental act of remembering.
In doing so, it certainly helps us gain a finer appreciation of the considerable extent to which our social environment affects the way we remember the past.

The work on memory typically produced by cognitive psychologists might lead one to believe that the act of remembering takes place in a social vacuum. The relative lack of explicit attention to the social context within which human memory is normally situated tends to promote a rather distorted vision of individuals as "mnemonic Robinson Crusoes" whose memories are virtually free of any social influence or constraint. Such a naive vision would be quite inappropriate even within the somewhat synthetic context of the psychological laboratory, where much of the research on memory today (with the notable exception of "ecologically" oriented work)1 typically takes place. It is even less appropriate, however, within the context of real life.

Consider the critical role of others as witnesses whose memories help corroborate our own.2 No wonder most courts of law do not give uncorroborated testimony the same amount of credence and official recognition as admissible evidence that they normally give to socially corroborated testimony. After all, most of us tend to feel somewhat reassured that what we seem to remember indeed happened when there are others who can verify our recollections and thereby provide them with a stamp of intersubjectivity. The terribly frustrating experience of recalling people or events that no one else seems to remember strongly resembles that of seeing things or hearing sounds which no one else does.3

Furthermore, there are various occasions when other people have even better access to certain parts of our past than we ourselves do and can therefore help us recall people and events which we have somehow forgotten. A wife, for example, may remind her husband about an old friend of his whom he had once mentioned to her yet has since forgotten.4 Parents, grandparents, and older siblings, of course, often remember events from our own childhood that we cannot possibly recall. In fact, many of our earliest "memories" are actually recollections of stories we heard from them about our childhood.5 In an odd way, they remember them for us!

Yet such social mediation can also assume a somewhat negative form, since such "mnemonic others" can also help block our access to certain events in our own past, to the point of actually preventing some of them from becoming memories in the first place! This is particularly critical in the case of very young children, who still depend on others around them to define what is real (as well as "memorable") and what is not. A 35-year-old secretary whose boss tells her to "forget this ever happened" will probably be psychologically independent enough to store that forbidden memory in her mind anyway. However, a five-year-old boy whose mother flatly denies that a certain event they have just experienced together ever took place will most likely have a much harder time resisting her pressure to suppress it from his consciousness and may thus end up repressing it altogether. Such instances remind us, of course, that the reasons we sometimes tend to repress our memories may not always be internal and that our social environment certainly plays a major role in helping us determine what is "memorable" and what we can (or even should) forget. Needless to say, they further demonstrate the ubiquity of sociomental control.
The notion that there are certain things that one should forget also underscores the normative dimension of memory, which is typically ignored by cognitive psychology. Like the curricular institutionalization of required history classes in school, it reminds us that remembering is more than just a spontaneous personal act, as it also happens to be regulated by unmistakably social rules of remembrance that tell us quite specifically what we should remember and what we must forget.

Such rules often determine how far back we remember. In the same way that society helps delineate the scope of our attention and concern through various norms of focusing, it also manages to affect the extent of our mental reach into the past by setting certain historical horizons beyond which past events are regarded as somehow irrelevant and, as such, often forgotten altogether.6

The way society affects the "depth" of individuals' memory by relegating certain parts of the past to official oblivion is often quite explicit, as in the case of the 1990 ruling by the Israeli broadcasting authorities prohibiting television and radio announcers from referring to places in present-day Israel by their old Arab names. Just as blatant is the aptly named statute of limitations, the ultimate institutionalization of the idea that it is actually possible to put certain things "behind us." The very notion of such a statute implies that even events that we all agree happened can nonetheless be mentally banished to a "pre-historical" past that is considered legally irrelevant, and thereby officially forgotten. The unmistakably conventional nature of any statute of limitations, of course, reminds us that it is very often society that determines which particular bygones we let be bygones.

Yet the extent to which our social environment affects the "depth" of our memory is also manifested somewhat more tacitly in the way we conventionally begin historical narratives.7 By defining a certain moment in history as the actual beginning of a particular historical narrative, it implicitly defines for us everything that preceded that moment as mere "pre-history" which we can practically forget. Thus, for example, when the founders of Islam established the flight of the Prophet from Mecca to Medina in A.D. 622 as the pivot of the conventional Mohammedan chronological dating system, they implicitly defined everything that had ever happened prior to that momentous event as a mere prelude to the "real" history that every Muslim ought to remember.8 By the same token, when sociologists say (as they often do) that sociology was "born" in the 1830s with the work of Auguste Comte (who was indeed the first ever to use the term "sociology"), they are implicitly saying that their students need not really read the work of Aristotle, Hobbes, or Rousseau, which is somehow only "pre-sociological."9

Nowhere is the unmistakably social partitioning of the past into a memorable "history" and a practically forgettable "pre-history" more glaringly evident than in the case of so-called discoveries. When the New York Times, for example, offers its readers a brief historical profile of Mozambique that begins with its "discovery" by the Portuguese in 1498 and fails to remind them that that particular moment marks only the beginning of the European chapter in its history, it relegates that country's entire pre-European past to official oblivion.
A similar example of such "mnemonic decapitation" is the way Icelanders begin the official history of their island. Both the Book of Settlements (Landnámabók) and the Book of Icelanders (Íslendingabók) mention in passing the fact that when the first Norwegians arrived on the island in the ninth century, they found Irish monks already living there, yet their commitment to Iceland's Scandinavian identity (and therefore origins) leads them to present those Norwegians as its first settlers!10 While not trying to explicitly conceal the actual presence of Celts prior to Iceland's official "discovery" by Scandinavians, they nevertheless treat them as irrelevant to its "real" history.

Consider also the way we conventionally regard Columbus's first encounter with America as its official "discovery," thereby suppressing the memory of the millions of native Americans who were already living there. The notion that Columbus "discovered" America goes hand in hand with the idea that American history begins only in 1492 and that all events in the Western Hemisphere prior to that year are just part of its "pre-American" past. From this historiographic perspective, nothing that predates 1492 truly belongs in "American history." Indeed, it is conventionally considered part of a mere "pre-Columbian" prologue. America's "pre-history" includes not only its own native past but also earlier, "pre-Columbian" European encounters with it, which explains why the Norse voyages across the Atlantic (to Greenland, Newfoundland, and possibly Nova Scotia) in the late tenth and early eleventh centuries are still not considered part of the official narrative of "the discovery of America."11 Despite the fact that most of us are fully aware of the indisputable Norse presence on the western shores of the Atlantic almost five centuries before Columbus, we still regard his 1492 landfall in the Bahamas as the official beginning of American history. After all, if "America" was indeed born only on October 12, 1492 (a notion implicitly supported by the official annual celebration of its "birthday" on Columbus Day), nothing that had happened there prior to that date can be considered truly part of "American history."12

Needless to say, this grand division of the past into a memorable "history" and an officially forgettable "pre-history" is neither logical nor natural. It is an unmistakably social, normative convention. One needs to be socialized to view Columbus's first voyage to the Caribbean as the beginning of American history. One certainly needs to be taught to regard everything that had ever happened in America prior to 1492 as a mere prelude to its "real" history. Only then, indeed, can one officially forget "pre-Columbian" America.

We usually learn what we should remember and what we can forget as part of our mnemonic socialization, a process that normally takes place when we enter an altogether new social environment, such as when we get married, start a new job, convert to another religion, or emigrate to another country.13 (It is a subtle process that usually happens rather tacitly: listening to a family member recount a shared experience, for example, implicitly teaches one what is considered memorable and what one can actually forget.) In acquainting us with the specific rules of remembrance that operate in that environment, it introduces us to a particular "tradition" of remembering.
A mnemonic tradition includes not only what we come to remember as members of a particular thought community but also how we remember it. After all, much of what we seem to "remember" is actually filtered (and often inevitably distorted) through a process of subsequent interpretation, which affects not only the actual facts we recall but also the particular "light" in which we happen to recall them. Thus, it is hardly surprising that a girl who grows up in a highly traditionalistic family which tends to embellish and romanticize the past would come to "remember" her great-grandfather as a larger-than-life, almost mythical figure. Indeed, that is why Americans who grow up today in liberal and conservative homes "remember" so differently the great social upheavals of the 1960s and 1970s.

As the very first social environment in which we learn to interpret our own experience, the family plays a critical role in our mnemonic socialization. In fact, most subsequent interpretations of early "recollections" of particular events in one's life are only reinterpretations of the way they were originally experienced and remembered within the context of one's family! That explains why we often spend a lot of mental effort as we grow up trying to "reclaim" our own personal recollections from our parents or older siblings. Indeed, what is often experienced in intensive psychotherapy is the almost inevitable clash between recalling certain people and events through the mnemonic lenses provided by our immediate family and recalling those same people and events by gradually regaining contact with deeper layers of our selves.

Yet mnemonic traditions affect our memory even more significantly by prompting us to adopt a particular cognitive "bias"14 that leads us to remember certain things but not others. As an increasing body of research on memory seems to indicate, familiarity usually breeds memorability, as we tend to remember information that we can somehow fit into ready-made, familiar schematic mental structures that "make sense" to us15 (the same structures that, as we have already seen, affect the way we mentally process our perceptual experience). That is why it is usually much easier to recall that a particular character in a story we have read happened to wear glasses when she is a librarian, for example, than when she is a waitress or a nurse. This tendency to remember things schematically applies not only to actual facts but also to the way we recall the general "gist" of events (which is often all we can remember of them)16 as well as to the way we interpret those memories.17

To further appreciate such a tendency to remember events that proceed according to a certain schematic set of prior expectations, consider also the formulaic, script-like "plot structures"18 we often use to narrate the past, a classic example of which is the traditional Zionist view of the history of Jews' "exilic" life outside the Land of Israel almost exclusively in terms of persecution and victimization.19 I find it quite interesting, in this regard, that only in my late thirties did I first realize that Captain Alfred Dreyfus, who I had always "remembered" languishing in the penal colony on Devil's Island until he died (following the infamous 1894 trial at which he was wrongly convicted for treason against France), was actually exonerated by the French authorities and even decorated with the Legion of Honor twelve years later!
Having grown up in Israel during the 1950s and having been socialized into the Zionist mnemonic tradition of narrating European Jewish history, it is hardly surprising that that is how I "remembered" the end of the famous Dreyfus Affair.

Needless to say, the schematic mental structures on which mnemonic traditions typically rest are neither "logical" nor natural. Most of them are either culture-specific or subculture-specific,20 and therefore something we acquire as part of our mnemonic socialization. Thus, if we tend to remember so much better situational details that are salient in our own culture or subculture,21 it is mostly because so many of our pre-existing expectations are based on conventionalized, social typifications.22 Once again we are seeing indisputable evidence of society's ubiquitous cognitive role as a mediator between individuals and their own experience. In fact, since most of the schematic mental structures that help us organize and access our memories are part of our unmistakably social "stock of knowledge,"23 much of what we seem to recall is only socially, rather than personally, familiar to us! Indeed, it is what we come to "remember" as members of particular thought communities.

The fact that I can actually "recall" the Dreyfus Affair also reminds us that what we remember includes far more than just what we have personally experienced. In other words, it underscores the unmistakably impersonal aspect of memory. I was already forty-three when I first saw Venice, yet I soon realized that it was actually quite familiar to me. The majestic Grand Canal, for example, was something I had already "seen" on the cover of an album of brass concerti by Venetian composer Antonio Vivaldi when I was eighteen. And when I saw the infamous "Lion's Mouth" (where anonymous accusers once dropped their denunciations of fellow Venetians to the secret police) in the Palace of the Doges, I was actually seeing something I remembered from a book I had read some twenty years earlier.

Stored in my mind are rather vivid "recollections" of my great-grandfather (whom I never even met and about whom I know only indirectly from my mother's, grandmother's, and great-aunt's accounts), the Crucifixion (the way I first "saw" it in Nicholas Ray's film The King of Kings when I was twelve), and the first voyage around the world (the way I first envisioned it when I read Stefan Zweig's biography of Ferdinand Magellan as a teenager). I have somewhat similar "memories" of the Inca Empire, the Punic Wars, and Genghis Khan, despite the fact that I personally experienced none of them.

In fact, neither are my recollections of most of the "historical" events that have taken place in my own lifetime entirely personal.24 What I usually remember of those events is how they were described by others who did experience them personally! They are socially mediated memories that are based entirely on secondhand accounts of others.25 Thus, for example, I "remember" the French pullout from Algeria and the Soviet invasion of Prague mainly through the way they were reported at the time in the newspapers. I likewise "recall" the Eichmann trial, the Cuban missile crisis, and the landing of Apollo 11 on the moon mainly through radio and television reports.26 In fact, much of what we seem to "remember" we did not actually experience personally. We only do so as members of particular families, organizations, nations, and other mnemonic communities27 to which we happen to belong.
Thus, for example, it is mainly as a Jew that I "recall" so vividly the Babylonian destruction of the First Temple in Jerusalem more than twenty-five centuries before I was born. By the same token, it is as a member of my family that I "remember" my great-great-grandmother (whose memory is probably no longer carried by anyone outside it), and as a soccer fan that I recall Uruguay's historic winning goal against Brazil in the 1950 World Cup final. Consider also the special place of the Stonewall riots and of Charlie Parker's early gigs with Dizzy Gillespie at Minton's Playhouse in the respective memories of homosexuals and jazz aficionados.

Indeed, being social presupposes the ability to experience events that happened to groups and communities long before we even joined them as if they were somehow part of our own past, an ability so perfectly captured by the traditional Jewish claim, explicitly repeated every Passover, that "we were slaves to Pharaoh in Egypt, and God brought us out of there with a mighty hand." (On Passover Jews also recite the following passage from the Haggadah: "In every generation, a man should see himself as though he had gone forth from Egypt. As it is said: And you shall tell your son on that day, it is because of what God did for me when I went forth from Egypt.'")28 Such existential fusion of one's own biography with the history of the groups or communities to which one belongs is an indispensable part of one's unmistakably social identity as an anthropologist, a Mormon, a Native American, a Miami Dolphins fan, or a member of the U.S. Marine Corps.

In marked contrast to our strictly autobiographical memory, such sociobiographical memory29 also accounts for the sense of pride, pain, or shame we sometimes experience as a result of things that happened to groups and communities to which we belong long before we even joined them.30 Consider the national pride of present-day Greeks, much of which rests on the glorious accomplishments of fellow Greek scholars, artists, and philosophers some twenty-four centuries ago, or the institutional arrogance of many current faculty of academic departments that were considered great forty years ago but have since been in decline. Consider also the long tradition of pain and suffering carried by many present-day American descendants of nineteenth-century African slaves, or the great sense of shame that pervades the experience of many young Germans born many years after the collapse of the Nazi regime.

Indeed, identifying with a particular collective past is an important part of the process of acquiring a particular social identity (hence the appeal for some students of African-American and Women's Studies programs in universities, for example). Familiarizing new members with their collective past is an important part of groups' and communities' general efforts to incorporate them. Business corporations, colleges, and army battalions, for example, often introduce new members to their collective history as part of their general "orientation." Children whose parents came to the United States from Ghana, Ecuador, or Cambodia are likewise taught in school to "remember" Paul Revere and the Mayflower as part of their own past.
From Poland to Mexico, from Israel to Taiwan, the study of national history plays a major role in the general effort of the modern state to foster a national identity.31 At the same time (and for precisely the same reasons), exiting a group or a community typically involves forgetting its past. Children who are abandoned by one of their parents, for example, rarely carry on the memories of his or her family. Children of assimilated immigrants likewise rarely learn much from their parents about the history of the societies they chose to leave, both physically and psychologically, behind them.

Given its highly impersonal nature, social memory need not even be stored in individuals' minds. Indeed, there are some unmistakably impersonal "sites"32 of memory. It was the invention of language that first freed human memory from the need to be stored in individuals' minds. As soon as it became technically possible for people to somehow "share" their personal experiences with others, those experiences were no longer exclusively theirs and could therefore be preserved as somewhat impersonal recollections even after they themselves were long gone. In fact, with language, memories can actually pass from one person to another even when there is no direct contact between them, through an intermediary. Indeed, that has always been one of the main social functions of the elderly, who, as the de facto custodians of the social memories of their communities, have traditionally served as "mnemonic go-betweens," essentially linking historically separate generations who would otherwise never be able to mentally "connect" with one another.

Such "mnemonic transitivity" allows for the social preservation of memories in stories, poems, and legends that are transmitted from one generation to the next. One finds such oral traditions33 in practically any social community—from families, churches, law firms, and college fraternities to ethnic groups, air force bases, basketball teams, and radio stations. It was thus an oral tradition that enabled the Marranos in Spain, for example, to preserve their secret Jewish heritage (and therefore identity) for so many generations. It was likewise through stories that the memory of their spectacular eleventh-century encounter with America was originally preserved by Icelanders, more than a century before it was first recorded in their famous sagas and some 950 years before it was first corroborated by actual archaeological finds in Newfoundland.34

Furthermore, ever since the invention of writing several thousand years ago, it is also possible to actually bypass any oral contact, however indirect, between the original carrier of a particular recollection and its various future retrievers. Present-day readers of Saint Augustine's Confessions can actually "share" his personal recollections of his youth despite the fact that he has already been dead for more than fifteen centuries!
Doctors can likewise share patient histories readily, since the highly impersonal clinical memories captured in their records are accessible even when those who originally recorded them there are not readily available for immediate consultation.35 That explains the tremendous significance of documents in science (laboratory notes, published results of research), law (affidavits, contracts), diplomacy (telegrams, treaties), business (receipts, signed agreements), and bureaucracy (letters of acceptance, minutes of meetings), as well as of the archives, libraries, and computer files where they are typically stored.36 It also accounts for the critical role of history textbooks in the mnemonic socialization of present and future generations.

Yet preserving social memories requires neither oral nor written transmission. Given the inherent durability of material objects as well as the fact that they are mnemonically evocative in an immediate, "tangible" manner, they too play an important role in helping us retain memories.37 Hence the role of ruins, relics, and old buildings as social souvenirs. A visit to the National Museum of Anthropology in Mexico City, for example, helps "connect" modern Mexican "pilgrims" to their Toltec, Maya, and Aztec origins. A walk through the old neighborhoods of Jerusalem likewise allows present-day Jews a quasi-personal "contact" with their collective past.38

As evident from the modern advent of preservationism39 as well as from the modern state's political use of archaeology as part of its general effort to promote nationalism,40 we are certainly more than just passive consumers of such quasi-physical mnemonic links to our collective past. Numerous medals, plaques, tombstones, war memorials, Halls of Fame, and other commemorative monuments (and the fact that we make them from stone or metal rather than paper or wood)41 serve as evidence that we purposefully design such future sites of memory well in advance. Like souvenirs, class yearbooks, and antiques,42 such objects have a purely commemorative value for us, and we design them strictly for the purpose of allowing future generations mnemonic access to their collective past.43 The entire meaning of such "pre-ruins" derives from the fact that they are mnemonically evocative and will therefore help us in the future to recover our past.

The self-conscious effort to preserve the past for posterity is manifested even more poignantly in the statues, portraits, stamps, coins, and paper money we produce as social souvenirs. The visual images so vividly captured on them represent an ambitious attempt to somehow "freeze" time and allow future generations the fullest possible mnemonic access to major individuals and events from their collective past. National galleries that try to offer posterity a comprehensive visual encapsulation of a nation's history (the collection of paintings displayed in the U.S. Capitol building,44 Diego Rivera's murals at the National Palace in Mexico City) are the culmination of such artistic endeavors.

Since the invention of the camera (as well as its two major offspring, the motion-picture and television cameras), these more traditional means of "capturing" the past have gradually given way to photographs and films.45 The family photo album and the television archive, indeed, are among the major modern sites of social memory.
In fact, it is primarily through snapshots, home movies, and television footage that most of us nowadays remember old relatives, family weddings, or the Gulf War. As evident from the rapid evolution of audio-recording technology from the phonograph to the portable cassette-recorder, in our attempt to somehow "freeze" time we actually try to capture not only visual images but also the very sounds of the past. Historic recordings of Winston Churchill's speeches and Vladimir Horowitz's concerts, for example, underscore the growing significance of tapes, cassettes, and compact discs as modern sites of social memory.

Video technology, of course, represents the modern attempt to integrate such graphic and sonic efforts to preserve the past. The ultimate progeny of the camera and the phonograph, the camcorder generates remarkably vivid audio-visual memories that are virtually independent of any individual carrier! The famous videotaped beating of Rodney King by members of the Los Angeles police, for example, is the epitome of such absolutely disembodied and therefore truly impersonal memory. As evident from its repeated use in court, it may very well represent (not unlike the increasingly common use of instant video replay in televised sports)46 the ultimate victory of social and therefore "official" over purely personal memory.

Not only are many of our recollections impersonal, they are often also collective. My memory of the first mile ever run under four minutes, for example, is actually shared by the entire track world. So are some of the memories I share with other sociologists, Jews, or Rutgers University employees. In each of these cases my own recollections are part of a collective memory47 shared by an entire community as a whole.

The collective memory of a mnemonic community is quite different from the sum total of the personal recollections of its various individual members,48 as it includes only those that are commonly shared by all of them (in the same way that public opinion, for example, is more than just an aggregate of individuals' personal opinions).49 In other words, it involves the integration of various different personal pasts into a single common past that all members of a community come to remember collectively. America's collective memory of the Vietnam War, for example, is thus more than just an aggregate of all the war-related recollections of individual Americans, just as Israel's collective memory of the Holocaust50 is more than the mere sum of the personal recollections of all the Holocaust survivors living in Israel.

We must be particularly careful not to mistake personalized manifestations of a mnemonic community's collective memory for genuinely personal recollections.51 When asked to list the first names that come to their minds in response to the prompt "American history from its beginning through the end of the Civil War," Americans usually list the same people—George Washington, Abraham Lincoln, Thomas Jefferson, Benjamin Franklin, Robert E. Lee, John Adams, and Ulysses S. Grant.52 The fact that so many different individuals happen to have the same "free" associations about their nation's past shows that their memories are not as independent as we might think but merely personalized manifestations of a single common collective memory. In so doing, it also underscores the tremendous significance of mnemonic socialization.
Yet the notion of a "collective memory" implies a past that is not only commonly shared but also jointly remembered (that is, "co-memorated"). By helping ensure that an entire mnemonic community will come to remember its past together, as a group, society affects not only what and who we remember but also when we remember it! Commemorative anniversaries such as the 1992 Columbus quincentennial, the 1995 fiftieth anniversary of the end of World War II, and the 1976 American bicentennial are classic manifestations of such mnemonic synchronization. Yet we also "co-remember" past events by associating them with holidays and other "memorial days" which we jointly celebrate on a regular annual53 (or even weekly, as in the case of both the Sabbath and the Lord's Day)54 basis. Fixed in a mnemonic community's calendar, such days ensure members' synchronized access to their collective past. Indeed, keeping certain past events in our collective memory by ensuring their annual commemoration is one of the main functions of the calendar.55 Thus, on Easter, millions of Christians come to remember their common spiritual origins together, as a community. By the same token, every Passover, Jews all over the world jointly remember their collective birth as a people. The annual commemoration of the French Revolution on Bastille Day and of the European colonization of New England on Thanksgiving Day plays similar "co-evocative" roles for Frenchmen and Americans, respectively.

That also explains various attempts throughout history to remove certain holidays from the calendar in an effort to obliterate the collective memories they evoke. The calendrical dissociation of Easter from Passover, for example, was thus part of a conscious effort by the Church to "decontaminate" Christians' collective memory from somewhat embarrassing Jewish elements,56 whereas the calendar of the French Revolution represented an attempt to establish a mnemonically sanitized secular holiday cycle that would be devoid of any Christian memories.57 Given all this, it is also clear why the recent political battle over the inclusion of Martin Luther King Jr.'s birthday in the American calendar was actually a battle over the place of African Americans in America's collective memory. It is but one of numerous battles fought between as well as within mnemonic communities over the social legacy of the past.

The very existence of such mnemonic battles further underscores the social dimension of human memory. The most common mnemonic battles are the ones fought over the "correct" way to interpret the past. As we develop a collective sense of history, we may not always agree on how a particular historical figure or event ought to be remembered.
While many Americans regard Columbus as a hero who embodies the modern Western quest for knowledge and spirit of free enterprise, there are many others who claim that he should actually be remembered as the villainous spearhead of the modern Western expansionist spirit that is responsible for both colonialism and the massive destruction of the environment.58 By the same token, whereas many Israelis still accept the official Zionist view of the fall of Masada and the Bar-Kokhba rebellion nineteen centuries ago as exemplary heroic events, a growing number of others are voicing the concern that they are actually symptomatic of a rather myopic stubbornness that resulted in terrible national disasters that could have been avoided by a more politically expedient way of dealing with the Romans who occupied Judaea.59 Consider also the cultural battles fought among Americans over the "correct" interpretation of Watergate,60 or the debate among historians over whether the origins of Greek (and therefore Western) civilization are Indo-European or African,61 as well as everyday marital battles over past infidelities.

Mnemonic battles are also fought over what ought to be collectively remembered in the first place. Eurocentrists, multiculturalists, and feminists, among others, battle over the literary tradition into which young members of society ought to be mnemonically socialized. Consider also the problem of delineating the historical narratives that are to be remembered. Given the inherently conventional nature of any beginning,62 "where" a particular historical narrative ought to begin is by no means self-evident.63 After all, even people who are trying to recount an event they have just witnessed together often disagree on the precise point at which their account ought to begin. It is not at all clear, for example, whether we should begin the "story" of the Vietnam War during the Johnson or Kennedy years. Nor is it absolutely clear whether the narrative of the events leading to the Gulf War ought to begin in August 1990, when Iraq invaded Kuwait (which is the standard American version), or several decades earlier, when both were still part of a single, undivided political entity (which is the standard Iraqi version).

As we might expect, such narratological pluralism often generates discord. Japan and the United States wage an ongoing mnemonic battle over the inclusion of the Japanese attack on Pearl Harbor in 1941 in the narrative of the events leading to the atomic bombings of Hiroshima and Nagasaki by the United States four years later. Consider also the Arab-Israeli dispute over the point at which a fair narration of the history of the West Bank ought to begin, or the strong objection of Native Americans to the Eurocentric depiction of 1492 as the beginning of American history. After all, for anyone whose ancestors lived in America thousands of years before it was "discovered" by Europe, that date certainly constitutes more of an ending than a beginning.

Like Akira Kurosawa's Rashomon, such discord reminds us that our memory of the past is not entirely objective, since we evidently do not all remember it the same way. Yet mnemonic battles usually involve not just individuals but entire communities, and are typically fought in the public arena (in newspaper editorials and radio talk shows, for example), which suggests that the past is not entirely subjective either.
That remembering is more than just a personal act is also evident from the fact that major changes in the way we view the past (such as our growing sensitivity to multiculturalist historiographic concerns) usually correspond to major social changes that affect entire mnemonic communities.64 This, again, underscores the intersubjective, unmistakably social dimension of human memory.

7/ Standard Time

Not only the content of our memories but also the way we mentally "place" them in the past is affected by our social environment, as evident from the way that we so often use unmistakably impersonal temporal reference frameworks1 for dating even absolutely personal events in our own past. I thus recall having fractured my elbow "in 1985," for example, or having my house painted "just before the Gulf War." Admittedly, we often date the occurrence of past events in terms of strictly personal dating frameworks used only by ourselves, as when I remember something as having taken place "250,000 cigarettes ago" or "3,000 quarts of booze ago"2 or around the time I discovered chamber music. These are all instances of a strictly personal manner of dating that is absolutely meaningless to anyone other than the person using it. Yet we also happen to date past events in unmistakably impersonal, intersubjective terms that are meaningful to others besides us as well. Thus, when couples recall something as having occurred on their second date, a week before they moved to San Francisco, or the year they bought their Chevy, for example, they are actually using social dating frameworks based on temporal landmarks derived from their collective life as a couple. So do college professors who date past departmental events in terms such as "a couple of years before we hired Gordon," "during Carol's first semester as chair," or "the year Nick came up for tenure."

The standard chronological "eras" we use for (formally as well as informally) dating past events likewise revolve around essentially impersonal, collectively significant temporal milestones.3 The birth of Christ (for Christians) and the flight of Mohammed from Mecca to Medina in A.D. 622 (for Muslims) are classic examples of such sociotemporal landmarks. So, for that matter, are the wars, revolutions, and various calamities (earthquakes, hurricanes, droughts, fires, epidemics)4 we often use as temporal anchors for dating even strictly personal events in our past. Unlike the last time one had one's period or changed the oil filter in one's car, such events are the foundations of unmistakably impersonal dating frameworks used not only by specific individuals but also by entire mnemonic communities. Thus, for example, when I mention in a lecture that something happened "in 1628," my entire audience is jointly transported mentally to the very same point in history. It is such frameworks that make it possible to integrate several different personal pasts into a single common past. Indeed, with the Christian Era having attained practically universal status (after all, the date "1628" has the exact same chronological meaning in Switzerland, Costa Rica, and Angola),5 such a past is increasingly becoming a global one as well.

Yet it is not only the past that we date in an intersubjective, social manner, but the future and the present as well. The way we date future events is, again, partly personal, as when we plan to take a shower, make a particular telephone call, or get married in such terms as "soon," "later," or "someday."
Yet it is also partly social, as when we start training for "the 2004 Olympics," plan a vacation for "August," or schedule an appointment for "next Thursday at 6:00." By the same token, when an anorectic patient is told by her doctor that she will be discharged from the hospital as soon as she weighs one hundred pounds, they both situate that moment in a single, common future.6 So do a professor and a student who plan to meet as soon as a paper on which the latter is currently working is completed.

The unmistakably social nature of the manner in which we mentally "place" events in time is also evident from the way we "date" the present. Admittedly, we often "date" the present in strictly personal terms such as the number of college credits we have already completed, the number of the page we are on in the book we are currently reading, or the number of onions we still have to peel and slice for the soup we are making. Yet we also do it in standard terms that are shared by others besides us as well—"a quarter to seven," "Saturday," "August 20," "two games before the end of the season," "three weeks before the Illinois primaries," and so on. (We usually "date" the present either in terms of our temporal distance from a specific historical landmark or by "anchoring" it within a standard calendrical cycle such as the day, the week, the month, or the year.) Doing this implies being able to convert strictly personal forms of time reckoning into standard temporal designations that have the exact same meaning for everyone using them.

Such integration of various different personal "times" into a single common time (made up of a common past, a common present,7 and a common future) presupposes unmistakably impersonal, standard time-reckoning frameworks such as clock time and the calendar. In other words, it presupposes a standard language of reckoning time in which one says "next Thursday at seven" rather than "soon" and "in 1506" instead of "long ago," a language that allows us to agree that it is now "11:25" and that today is "Wednesday, November 26, 1997." Standardizing the way we reckon time is a necessary prerequisite for participation in a world that is also shared by others besides us. It is at the basis of any effort to coordinate human action at the level of families, organizations,8 communities,9 or entire societies. Even a simple act such as making an appointment with someone would be practically impossible without it! Standard time is one of the pillars of the intersubjective, social world. Indeed, social life would not have been possible were it not for our ability to reckon time in a standard, common fashion. If we "did not have a homogeneous conception of time ... all consensus among minds, and thus all common life, would become impossible."10

Given all this, it is hardly surprising that not knowing what day of the week or what year it is is often regarded as indicative of some serious socio-clinical problem, as evident from some of the routine questions typically asked during psychiatric evaluation. People who do not use standard time clearly do not inhabit the same phenomenal world shared by those around them. They are confined to their own inner worlds and cannot "enter" the intersubjective, social world.11
It is the anxiety about being barred from mental participation in the social world that accounts for the somewhat uneasy feeling that usually accompanies the realization that our watch is standing still or that we cannot recall what day it is,12 a rather disorienting experience that strongly resembles that of waking from a deep sleep.13 It is the dreadful prospect of "mental exile" from the social world that explains why castaway sailors and prisoners in solitary confinement would try to keep count of the days of the week even when they are all by themselves, far away from human civilization.14 And when Leo Tolstoy's Ivan Ilych does not seem to care whether it is Friday or Sunday,15 he is obviously dying, since the living would rarely risk ignoring standard time.

Being "sociotemporally disoriented" is a rather common experience during vacations, when we are somewhat less compulsive about wearing a watch and often lose count of the days of the week.16 Yet even on such occasions rarely ever are we truly free from the temporal grip of our social environment. Even vacationers need to know what day or time it is to avoid going to museums on the days they are closed, being late for breakfast, or missing their return flight home. This is why getting one's first watch is often regarded as a ritual of practical as well as symbolic initiation into the social world of adults. Wearing this portable, miniature version of the town-square tower clock17 greatly facilitates our mental participation in a world commonly shared by others besides us. Even when I am alone, a tiny machine attached to my arm nevertheless connects me to others in my social milieu!18

In fact, along with the Gregorian calendar, the Christian Era, and the seven-day week, clock time is part of what is nowadays becoming an essentially global time-reckoning system.19 After all, 1996 was 1996 in Armenia as well as in Peru, and when it is Sunday in Cape Town it is also Sunday in both Damascus and Madrid. (By the same token, when it is November 26 in Dublin it is also November 26 in both Tunis and Prague.) And though the possibility that it would be midnight at the same moment all over the world is obviously precluded by the spherical shape of the earth, the fact that when it is 8:56 a.m. in Rio de Janeiro it is exactly 3:56 a.m. in Vancouver and 1:56 p.m. in Tel Aviv suggests that people around the globe use the same standard time-reckoning system.
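What sharing one standard time-reckoning system amounts to in practice can be illustrated with a few lines of code. The minimal sketch below (an illustration of my own, not anything from the text) uses Python's standard zoneinfo module to render one and the same instant as a local clock reading in three arbitrarily chosen zones; the exact offsets for any given date come from the system's time-zone database, daylight saving rules included, so the particular readings shown by the book's Rio/Vancouver/Tel Aviv example will only be reproduced on dates when those offsets happen to apply.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# One and the same instant, expressed first in universal time...
instant = datetime(1997, 11, 26, 11, 56, tzinfo=timezone.utc)

# ...and then rendered as a local clock reading in three zones.
# (Rio de Janeiro falls under the "America/Sao_Paulo" tz-database zone.)
for zone in ("America/Sao_Paulo", "America/Vancouver", "Asia/Jerusalem"):
    local = instant.astimezone(ZoneInfo(zone))
    print(f"{zone:20s} {local.strftime('%a %Y-%m-%d %H:%M')}")
```

Whatever local readings appear, everyone who uses the system agrees on which single instant they all designate.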
Yet the almost-universal status of this system does not mean that it is therefore also a natural one. Just because we all happen to use clock time, the Gregorian calendar, and the seven-day week, for example, does not mean that we should therefore also reify their existence. Based on unmistakably sociotemporal arrangements,20 they are certainly not as natural as they may appear to us at first glance. The introduction of standard clock time, for example, marks a most significant turn away from reckoning time in accordance with nature and its cycles. Since we no longer set our clocks and watches by the sun, as we once did, the time they indicate is far less grounded in nature (and therefore also less inevitable). After all, within each of the standard "time zones" we have been using for the past 115 years,21 there is only one meridian where clock time actually corresponds to solar time (so that 12:00 p.m. indeed marks the exact moment when the sun reaches its zenith). With the exception of that single meridian, there is always at least some discrepancy between the two. Indeed, in communities that are located seven and a half degrees of longitude east or west of that meridian, clock time may differ by as much as thirty minutes from solar time. (Since the earth completes a full rotation on its axis every twenty-four hours, actual solar time varies by four minutes for every degree of longitude.)

To further appreciate the unmistakably conventional nature of clock time, consider also the existing one-hour time differentials between neighboring time zones. With standard time, we have managed to establish mathematically elegant, rounded-off clock-time differentials between almost any two points around the globe. (In the few exceptional cases when the existing clock-time differential is not an exact number of hours, it is nonetheless designated in terms of a certain number of hours and thirty [as in the cases of Iran, Afghanistan, India, or Newfoundland] or forty-five [as in the cases of Guyana and Nepal] minutes from Greenwich Mean Time.)22 And yet, in marked contrast to the awkward though honest solar-time differentials (such as thirty-eight minutes and twenty-six and a quarter seconds) that actually exist between almost any two points on earth, such mathematically "neat" clock-time differentials are inaccurate from a purely physiotemporal (and thus natural) standpoint. That is also true of the abrupt one-hour clock-time differentials often created by time-zone boundaries between communities that are actually within walking distance of each other.

The differences between clock time and solar time are further complicated by the way we actually divide the world into time zones. Each country can practically choose its own standard (or standards) of time, which creates many situations that, while politically (and thus sociotemporally) understandable, are nonetheless quite awkward from a strictly physiotemporal standpoint. Thus, as a result of the fact that China, which is about as wide as the United States, chooses to squeeze what could be four different time zones into a single one, the standard time for western Tibet is two and a half hours ahead of that for Calcutta in neighboring India, despite the physiotemporally embarrassing fact that it is about half an hour behind it in actual solar time!23 (In a similar vein, as a result of the particular way in which the boundary between the Pacific and Mountain time zones happens to cut across North America, one has to move one's watch one hour forward when one travels from Las Vegas to Boise despite the fact that one is actually traveling westward rather than eastward.) By the same token, when it is 9:20 a.m. along most of Argentina's western border, it is still only 7:20 in the southeastern provinces of Colombia, which lie exactly on the same meridian. Even more striking is the twenty-four-hour clock-time differential between the islands of Tonga and Midway (which are actually only two degrees of longitude away from each other), the obvious result of the particular way in which the International Date Line happens to zigzag across the Pacific.
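The clock-versus-solar arithmetic underlying this whole discussion is easy to make concrete. The short sketch below is an illustration only: it applies the four-minutes-per-degree rule within an idealized fifteen-degree-wide zone centered on 75 degrees west, uses hypothetical longitudes, and ignores the additional "equation of time" wobble of the real sun.

```python
# The earth turns 360 degrees in roughly 24 hours, so mean solar time
# shifts by 4 minutes for every degree of longitude.

def solar_minus_clock_minutes(longitude_deg, zone_meridian_deg):
    """Minutes by which local mean solar time runs ahead of (+) or
    behind (-) the zone's standard clock time."""
    return 4.0 * (longitude_deg - zone_meridian_deg)

# An idealized zone centered on 75 degrees W: the discrepancy is zero on
# the central meridian and grows to about 30 minutes at the zone's edges,
# 7.5 degrees to either side.  (The sample longitudes are hypothetical.)
for lon in (-75.0, -71.0, -78.7, -82.5):
    print(f"{lon:7.1f} deg   {solar_minus_clock_minutes(lon, -75.0):+6.1f} min")
```

The mathematically "neat" whole-hour steps between zones are thus purchased at the price of a continuously varying, and usually awkward, discrepancy from the sun.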
Consider also, in this regard, the unmistakably conventional common practice of advancing standard clock time by an hour for part of the year. While it may seem quite natural (and therefore inevitable) given the seasonal differences in the length of daylight, the idea of introducing daylight saving time was a social decision. Furthermore, even when we make a conscious effort to be physiotemporally sensitive and accommodate nature and its rhythms, we nevertheless end up choosing to advance standard clock time by an unmistakably conventional sociotemporal interval.

The other main constituents of our standard time-reckoning system are just as conventional as clock time. Despite our common tendency to reify them, they all represent unmistakably sociotemporal (rather than strictly physiotemporal or biotemporal) arrangements and are therefore by no means inevitable. Consider, for example, the hour, the minute, and the second. As fractions of the day, they are essentially mathematical (and therefore absolutely artificial) cycles that do not correspond to any natural periodicity. Nor, for that matter, does the week, which represents the boldest human effort to calendrically ignore nature altogether.24

The other major standard cycles we use for reckoning the time are, likewise, only rough approximations of the actual natural periodicities to which they correspond. As such, they are certainly not as inevitable as they may seem to us at first glance. Even the day, arguably the most natural of the cycles that constitute our standard system of reckoning the time, does not always correspond to the actual period of a full rotation of the earth on its axis. After all, there are two calendar days every year that are not twenty-four hours long, but twenty-three (the day on which we switch to daylight saving time) or twenty-five (the day on which we switch back).

The calendar year, another pillar of our standard system of reckoning the time, is also a rough approximation of the actual 365 days, 5 hours, 48 minutes, and 46 seconds it takes the earth to complete a full revolution around the sun. Though it is obviously much more convenient mathematically, being a precise multiple of the day, it nonetheless distorts the actual physiotemporal relations between the earth and the sun. Indeed, it is in order to somehow make up for these nearly six hours which we omit from our calendar every year that we add an extra 366th day every four years at the end of February. Mathematical convenience alone also accounts for the fact that we add an extra day every four years rather than six extra hours every year, as well as for the fact that it is a full twenty-four-hour day rather than the actual extra twenty-three hours, fifteen minutes, and four seconds that accumulate every four years. While certainly more convenient from a strictly mathematical standpoint, such a calendrical arrangement creates a physiotemporal distortion, which indeed called for the suppression of ten calendar days from the year 1582 by Pope Gregory XIII and also accounts for the fact that, ever since then, we add an extra 366th day only to century years that are also precise multiples of 400 (thereby skipping 1700, 1800, and 1900). By thus omitting three calendar days every four hundred years, we manage to get rid of the three superfluous days that would otherwise have accumulated over that period, given that every quadrennial leap year is in fact 44 minutes and 56 seconds (the actual difference between 23 hours, 15 minutes, and 4 seconds and a full calendar day) longer than it would be if we were to measure it strictly according to nature and its periodicities.
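The arithmetic behind these corrections can be checked in a few lines. The back-of-the-envelope sketch below is an illustration only, using the year length quoted above; it reproduces the 44-minute-and-56-second overcorrection per leap cycle, the roughly three days it adds up to over four hundred years, and the small residual error that the Gregorian rule of 97 leap days per 400 years still leaves.

```python
# Check of the leap-year arithmetic described above, using the
# tropical year quoted in the text: 365 d 5 h 48 min 46 s.

DAY = 24 * 60 * 60                                   # seconds in a calendar day
tropical_year = 365 * DAY + 5 * 3600 + 48 * 60 + 46  # seconds

# Julian-style rule: one extra day every four years (365.25-day average year).
overshoot_per_4_years = 4 * 365.25 * DAY - 4 * tropical_year
print(overshoot_per_4_years / 60)         # about 44.9 minutes (44 min 56 s)

# Over 400 years the overcorrection amounts to roughly three days --
# hence the skipped leap days in 1700, 1800, and 1900.
print(100 * overshoot_per_4_years / DAY)  # about 3.1 days

# Gregorian rule: 97 leap days per 400 years (365.2425-day average year).
gregorian_year = (400 * 365 + 97) * DAY / 400
print(gregorian_year - tropical_year)     # residual drift: about 26 s per year
```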
Like the calendar year, the calendar month, too, is only a rough approximation of the actual natural cycle on which it was originally based. (This is true even in the Jewish, Mohammedan, and other so-called "lunar" calendars.) A precise multiple of the day, it is certainly more convenient mathematically than the actual period of twenty-nine days, twelve hours, forty-four minutes, and three seconds that elapses between any two successive new moons. In fact, only by dissociating it from the lunation have we managed to actually synchronize the month with both the day (so that every new calendar month would begin at midnight, along with a new calendar day) and the year (so that the beginning of a new calendar year would always coincide with that of a new calendar month).25 However, from a strictly physiotemporal standpoint, the conventional thirty-, thirty-one-, twenty-eight-, or twenty-nine-day calendar month clearly distorts the actual relations between the earth and the moon.

Practically none of the points where these cycles conventionally begin has any natural significance. There is nothing in nature that regularly corresponds to midnight, Sunday, August 1, or New Year's Day. The fact that the points at which our days, weeks, months, and years actually "begin" are utterly conventional further reminds us that none of the foundations of our standard system of reckoning the time is truly inevitable.26

The unmistakably conventional nature of our standard time-reckoning system is also evident from the fact that a standardized method of reckoning time has not always existed.27 The Gregorian calendar has only relatively recently become more than just a European and American calendar.28 Societies that now reckon the time in accordance with the seemingly natural (or divine) seven-day week have not always done so.29 Furthermore, the very effort to standardize the mental process of reckoning the time is quite recent. Prior to the official introduction of the International Date Line in 1884, the same day that was considered Sunday by anyone who came to Alaska from the United States was still considered Monday by those who came there from Russia.30 (Indeed, that was precisely the kind of problem that inspired the ending of Jules Verne's novel Around the World in Eighty Days, written in 1873.)31 In fact, even standard clock time is a relatively modern phenomenon, and there is far more temporal coordination between people who are living in Denver and Nairobi today than there was between people who were living in Philadelphia and Baltimore only a little more than a century ago. Not until 1840, when railroad companies began using Greenwich time throughout Britain, was the first serious attempt made to standardize clock-time reckoning beyond the strictly local level.32 Only as they became parts of a single transportation network (which understandably called for a single timetable) did local communities that had until then led a rather insular existence reach a point where they could no longer afford to reckon the time independently of one another. Nor could they afford to do so once instantaneous telecommunication became a technological reality in the mid-1830s. A person who is trying to place a telephone call from Caracas to Paris or Singapore cannot possibly be oblivious to the local times there. Nor, for that matter, can a stockbroker or television reporter in London today afford not to be fully synchronized with his associates in Tokyo or New York. No such concerns, however, could have even existed prior to the invention of the telegraph only 160 years ago.
Despite its obvious ubiquity, our standard system of reckoning the time is still not absolutely universal even today. Its international status notwithstanding, the Gregorian calendar is still not even the most significant framework within which Orthodox Jews, Muslims, or Baha'is, for example, reckon the time. Nor, for that matter, is it all that clear how relevant clock time or the week are to people who are retired or unemployed.33 While perhaps somewhat exceptional, such cases nevertheless help us separate the merely conventional from the truly inevitable. People who do not know "the time" or have absolutely no idea what day or year "it is" may very well be considered cognitive deviants, but they certainly keep reminding us that thinking in a social manner is by no means natural.

8/ Conclusion

The six mental processes I have examined here (perception, attention, classification, semiotic association, memory, and time reckoning) certainly do not exhaust the phenomenon we call "thinking." Nonetheless, probing their social underpinnings gives us at least a general idea of what cognitive sociology has to offer the modern science of the mind.1

There are numerous matters which cognitive science has thus far been unable to address. For example, it cannot explain why the "cubist" style of perceiving objects evolved only in the twentieth century, or how secretaries figure out which of the things that are said at meetings ought to be included in the minutes and which ones can be officially ignored. Nor can it account for the strong aversion of Gypsies to animals that shed their skin, or for cognitive battles over the mental delineation of "science" and "work." Addressing such matters certainly calls for a cognitive sociology. By the same token, only a sociology of perception can account for the fact that we now notice "Freudian" slips that would have been ignored a hundred years ago. Only a sociology of attention would dwell on the striking contrast between the rigid style of mental focusing so prevalent among lawyers and the more "fluid" style so distinctively characteristic of detectives. And only a sociology of classification can account for the fact that, by the time she was three,