RULE I
Do not carelessly denigrate social institutions or creative achievement.

Loneliness and Confusion

For years, I saw a client who lived by himself.5 He was isolated in many other ways in addition to his living situation. He had extremely limited family ties. Both of his daughters had moved out of the country, and did not maintain much contact, and he had no other relatives except a father and sister from whom he was estranged. His wife and the mother of his children had passed away years ago, and the sole relationship he endeavored to establish while he saw me over the course of more than a decade and a half terminated tragically when his new partner was killed in an automobile accident.

When we began to work together, our conversations were decidedly awkward. He was not accustomed to the subtleties of social interaction, so his behaviors, verbal and nonverbal, lacked the dancelike rhythm and harmony that characterize the socially fluent. As a child, he had been thoroughly ignored as well as actively discouraged by both parents. His father—mostly absent—was neglectful and sadistic in his inclinations, while his mother was chronically alcoholic. He had also been consistently tormented and harassed at school, and had not chanced upon a teacher in all his years of education who paid him any genuine attention. These experiences left my client with a proclivity toward depression, or at least worsened what might have been a biological tendency in that direction. He was, in consequence, abrupt, irritable, and somewhat volatile if he felt misunderstood or was unexpectedly interrupted during a conversation. Such reactions helped ensure that his targeting by bullies continued into his adult life, particularly in his place of work.

I soon noticed, however, that things worked out quite well during our sessions if I kept mostly quiet. He would drop in, weekly or biweekly, and talk about what had befallen and preoccupied him during the previous seven to fourteen days. If I maintained silence for the first fifty minutes of our one-hour sessions, listening intently, then we could converse, in a relatively normal, reciprocal manner, for the remaining ten minutes. This pattern continued for more than a decade, as I learned, increasingly, to hold my tongue (something that does not come easily to me).

As the years passed, however, I noticed that the proportion of time he spent discussing negative issues with me decreased. Our conversation—his monologue, really—had always started with what was bothering him, and rarely progressed past that. But he worked hard outside our sessions, cultivating friends, attending artistic gatherings and music festivals, and resurrecting a long-dormant talent for composing songs and playing the guitar. As he became more social, he began to generate solutions to the problems he communicated to me, and to discuss, in the latter portion of the hours we shared, some of the more positive aspects of his existence. It was slow going, but he made continual incremental progress. When he first came to see me, we could not sit together at a table in a coffee shop—or, indeed, in any public space—and practice anything resembling a real-world conversation without his being paralyzed into absolute silence.
By the time we finished, he was reading his original poetry in front of small groups, and had even tried his hand at stand-up comedy.

He was the best personal and practical exemplar of something I had come to realize over my more than twenty years of psychological practice: people depend on constant communication with others to keep their minds organized. We all need to think to keep things straight, but we mostly think by talking. We need to talk about the past, so we can distinguish the trivial, overblown concerns that otherwise plague our thoughts from the experiences that are truly important. We need to talk about the nature of the present and our plans for the future, so we know where we are, where we are going, and why we are going there. We must submit the strategies and tactics we formulate to the judgments of others, to ensure their efficiency and resilience. We need to listen to ourselves as we talk, as well, so that we may organize our otherwise inchoate bodily reactions, motivations, and emotions into something articulate and organized, and dispense with those concerns that are exaggerated and irrational. We need to talk—both to remember and to forget.

My client desperately needed someone to listen to him. He also needed to be fully part of additional, larger, and more complex social groups—something he planned in our sessions together, and then carried out on his own. Had he fallen prey to the temptation to denigrate the value of interpersonal interactions and relationships because of his history of isolation and harsh treatment, he would have had very little chance of regaining his health and well-being. Instead, he learned the ropes and joined the world.

5 I have modified the accounts drawn from my clinical practice enough to ensure the continuing privacy of my clients while endeavoring to maintain the essential narrative truth of what I am relating.

Sanity as a Social Institution

For Drs. Sigmund Freud and Carl Jung, the great depth psychologists, sanity was a characteristic of the individual mind. People were well-adjusted, in their views, when the subpersonalities existing within each of them were properly integrated and balanced in expression. The id, the instinctive part of the psyche (from the German “it,” representing nature, in all its power and foreignness, inside us); the superego (the sometimes oppressive, internalized representative of social order); and the ego (the I, the personality proper, crushed between those two necessary tyrants)—all had their specialized functions for Freud, who first conceptualized their existence. Id, ego, and superego interacted with each other like the executive, legislative, and judicial branches of a modern government.

Jung, although profoundly influenced by Freud, parsed the complexity of the psyche in a different manner. For him, the ego of the individual had to find its proper place in relationship to the shadow (the dark side of the personality), the anima or animus (the contrasexual and thus often repressed side of the personality), and the self (the internal being of ideal possibility). But all these different subentities, Jungian and Freudian alike, share one thing in common: they exist in the interior of the person, regardless of his or her surroundings. People are social beings, however—par excellence—and there is no shortage of wisdom and guidance outside of us, embedded in the social world. Why rely on our own limited resources to remember the road, or to orient ourselves in new territory, when we can rely on signs and guideposts placed there so effortfully by others?
Freud and Jung, with their intense focus on the autonomous individual psyche, placed too little focus on the role of the community in the maintenance of personal mental health. It is for such reasons that I assess the position of all my new clinical clients along a few dimensions largely dependent on the social world when I first start working with them: Have they been educated to the level of their intellectual ability or ambition? Is their use of free time engaging, meaningful, and productive? Have they formulated solid and well-articulated plans for the future? Are they (and those they are close to) free of any serious physical health or economic problems? Do they have friends and a social life? A stable and satisfying intimate partnership? Close and functional familial relationships? A career—or, at least, a job—that is financially sufficient, stable, and, if possible, a source of satisfaction and opportunity? If the answer to any three or more of these questions is no, I consider that my new client is insufficiently embedded in the interpersonal world and is in danger of spiraling downward psychologically because of that.

People exist among other people and not as purely individual minds. An individual does not have to be that well put together if he or she can remain at least minimally acceptable in behavior to others. Simply put: We outsource the problem of sanity. People remain mentally healthy not merely because of the integrity of their own minds, but because they are constantly being reminded how to think, act, and speak by those around them. If you begin to deviate from the straight and narrow path—if you begin to act improperly—people will react to your errors before they become too great, and cajole, laugh, tap, and criticize you back into place. They will raise an eyebrow, or smile (or not), or pay attention (or not). If other people can tolerate having you around, in other words, they will constantly remind you not to misbehave, and just as constantly call on you to be at your best. All that is left for you to do is watch, listen, and respond appropriately to the cues. Then you might remain motivated, and able to stay together enough so that you will not begin the long journey downhill. This is reason enough to appreciate your immersion in the world of other people—friends, family members, and foes alike—despite the anxiety and frustration that social interactions so often produce.

But how did we develop the broad consensus regarding social behavior that serves to buttress our psychological stability? It seems a daunting task—if not impossible—in the face of the complexity that constantly confronts us. “Do we pursue this or that?” “How does the worth of this piece of work compare to the worth of that?” “Who is more competent, or more creative, or more assertive, and should therefore be ceded authority?” Answers to such questions are largely formulated in consequence of intensive negotiation—verbal and nonverbal—regulating individual action, cooperation, and competition. What we deem to be valuable and worthy of attention becomes part of the social contract; part of the rewards and punishments meted out respectively for compliance and noncompliance; part of what continually indicates and reminds: “Here is what is valued. Look at that (perceive that) and not something else.
Pursue that (act toward that end) and not some other.” Compliance with those indications and reminders is, in large measure, sanity itself—and is something required from every one of us right from the early stages of our lives. Without the intermediation of the social world, it would be impossible for us to organize our minds, and we would simply be overwhelmed by the world.

The Point of Pointing

I have the great good fortune of a granddaughter, Elizabeth Scarlett Peterson Korikova, born in August 2017. I have watched her carefully while she develops, trying to understand what she is up to and playing along with it. When she was about a year and a half old, she engaged in all manner of unbearably endearing behaviors—giggling and laughing when she was poked, high-fiving, bumping heads, and rubbing noses. However, in my opinion, the most noteworthy of all the actions she undertook at that age was her pointing.

She had discovered her index finger, using it to specify all the objects in the world she found interesting. She delighted in doing so, particularly when her pointing called forth the attention of the adults surrounding her. This indicated, in a manner not duplicable in any other way, that her action and intention had import—definable at least in part as the tendency of a behavior or attitude to compel the attention of others. She thrived on that, and no wonder. We compete for attention, personally, socially, and economically. No currency has a value that exceeds it. Children, adults, and societies wither on the vine in its absence. To have others attend to what you find important or interesting is to validate, first, the importance of what you are attending to, but second, and more crucially, to validate you as a respected center of conscious experience and contributor to the collective world.

Pointing is, as well, a crucial precursor to the development of language. To name something—to use the word for the thing—is essentially to point to it, to specify it against everything else, to isolate it for use individually and socially. When my granddaughter pointed, she did it publicly. When she pointed to something, she could immediately observe how the people close to her reacted. There is just not that much point, so to speak, in pointing to something that no one else cares about. So, she aimed her index finger at something she found interesting and then looked around to see if anyone else cared. She was learning an important lesson at an early age: If you are not communicating about anything that engages other people, then the value of your communication—even the value of your very presence—risks falling to zero. It was in this manner that she began to more profoundly explore the complex hierarchy of value that made up her family and the broader society surrounding her.

Scarlett is now learning to talk—a more sophisticated form of pointing (and of exploration). Every word is a pointer, as well as a simplification or generalization. To name something is not only to make it shine forth against the infinite background of potentially nameable things, but to group or categorize it, simultaneously, with many other phenomena of similar broad utility or significance. We use the word “floor,” for example, but do not generally employ a separate word for all the floors we might encounter (concrete, wood, earth, glass), much less all the endless variations of color and texture and shade that make up the details of the floors that bear our weight.
We use a low-resolution representation: If it holds us up, we can walk on it, and it is situated inside a building, then it is a “floor,” and that is precise enough. The word distinguishes floors, say, from walls, but also restricts the variability in all the floors that exist to a single concept—flat, stable, walkable indoor surfaces. The words we employ are tools that structure our experience, subjectively and privately—but are, equally, socially determined. We would not all know and use the word “floor” unless we had all agreed that there was something sufficiently important about floors to justify a word for them. So, the mere fact of naming something (and, of course, agreeing on the name) is an important part of the process whereby the infinitely complex world of phenomena and fact is reduced to the functional world of value. And it is continual interaction with social institutions that makes this reduction—this specification—possible.

What should we point to? The social world narrows and specifies the world for us, marking out what is important. But what does “important” mean? How is it determined? The individual is molded by the social world. But social institutions are molded, too, by the requirements of the individuals who compose them. Arrangements must be made for our provisioning with the basic requirements of life. We cannot live without food, water, clean air, and shelter. Less self-evidently, we require companionship, play, touch, and intimacy. These are all biological as well as psychological necessities (and this is by no means a comprehensive list). We must signify and then utilize those elements of the world capable of providing us with these requirements. And the fact that we are deeply social adds another set of constraints to the situation: We must perceive and act in a manner that meets our biological and psychological needs—but, since none of us lives or can live in isolation, we must meet them in a manner approved of by others. This means that the solutions we apply to our fundamental biological problems must also be acceptable and implementable socially.

It is worth considering more deeply just how necessity limits the universe of viable solutions and implementable plans. First, as we alluded to, the plan must in principle solve some real problem. Second, it must appeal to others—often in the face of competing plans—or those others will not cooperate and might well object. If I value something, therefore, I must determine how to value it so that others potentially benefit. It cannot just be good for me: it must be good for me and for the people around me. And even that is not good enough—which means there are even more constraints on how the world must be perceived and acted upon. The manner in which I view and value the world, integrally associated with the plans I am making, has to work for me, my family, and the broader community. Furthermore, it needs to work today, in a manner that does not make a worse hash of tomorrow, next week, next month, and next year (even the next decade or century). A good solution to a problem involving suffering must be repeatable, without deterioration across repetitions—iterable, in a word—across people and across time. These universal constraints, manifest biologically and imposed socially, reduce the complexity of the world to something approximating a universally understandable domain of value.
That is exceptionally important, because there are unlimited problems and there are hypothetically unlimited potential solutions, but there are a comparatively limited number of solutions that work practically, psychologically, and socially simultaneously. The fact of limited solutions implies the existence of something like a natural ethic—variable, perhaps, as human languages are variable, but still characterized by something solid and universally recognizable at its base. It is the reality of this natural ethic that makes thoughtless denigration of social institutions both wrong and dangerous: wrong and dangerous because those institutions have evolved to solve problems that must be solved for life to continue. They are by no means perfect—but making them better, rather than worse, is a tricky problem indeed.

So, I must take the complexity of the world, reduce it to a single point so that I can act, and take everyone else and their future selves into consideration while I am doing so. How do I manage this? By communicating and negotiating. By outsourcing the terribly complex cognitive problem to the resources of the broader world. The individuals who compose every society cooperate and compete linguistically (although linguistic interaction by no means exhausts the means of cooperation and competition). Words are formulated collectively, and everyone must agree on their use. The verbal framework that helps us delimit the world is a consequence of the landscape of value that is constructed socially—but also bounded by the brute necessity of reality itself. This helps give that landscape shape, and not just any old shape.

This is where hierarchies—functional, productive hierarchies—more clearly enter the picture. Things of import must be done, or people starve or die of thirst or exposure—or of loneliness and absence of touch. What needs to be done must be specified and planned. The requisite skills for doing so must be developed. That specification, planning, and development of skills, as well as the implementation of the informed plan, must be conducted in social space, with the cooperation of others (and in the face of their competition). In consequence, some will be better at solving the problem at hand, and others worse. This variance in ability (as well as the multiplicity of extant problems and the impossibility of training everyone in all skilled domains) necessarily engenders a hierarchical structure—based ideally on genuine competence in relation to the goal. Such a hierarchy is in its essence a socially structured tool that must be employed for the effective accomplishment of necessary and worthwhile tasks. It is also a social institution that makes progress and peace possible at the same time.

Bottom Up

The consensus making up the spoken and unspoken assumptions of worth characterizing our societies has an ancient origin, developing over the course of hundreds of millions of years. After all, “How should you act?” is just the short-term, immediate version of the fundamental long-term question, “How should you survive?” It is therefore instructive to look into the distant past—far down the evolutionary chain, right to the basics—and contemplate the establishment of what is important.
The most phylogenetically ancient multicellular organisms (that is far enough for our purposes) tend to be composed of relatively undifferentiated sensorimotor cells.[3] These cells map certain facts or features of the environment directly onto the motor output of the same cells, in an essentially one-to-one relationship. Stimulus A means response A, and nothing else, while stimulus B means response B. Among more differentiated and complex creatures—the larger and commonly recognizable denizens of the natural world—the sensory and motor functions separate and specialize, such that cells undertaking the former functions detect patterns in the world and cells in the latter produce patterns of motor output. This differentiation enables a broader range of patterns to be recognized and mapped, as well as a broader range of action and reaction to be undertaken.

A third type of cell—neural—emerges sometimes, as well, serving as a computational intermediary between the first two. Among species that have established a neural level of operation, the “same” pattern of input can produce a different pattern of output (depending, for example, on changes in the animal’s environment or internal psychophysical condition). As nervous systems increase in sophistication, and more and more layers of neural intermediation emerge, the relationship between simple fact and motor output becomes increasingly complex, unpredictable, and sophisticated. What is putatively the same thing or situation can be perceived in multiple ways, and two things perceived in the same manner can still give rise to very different behaviors. It is very difficult to constrain even isolated laboratory animals, for example, so thoroughly that they will behave predictably across trials that have been made as similar as possible.

As the layers of neural tissue mediating between sensation and action multiply, they also differentiate. Basic motivational systems, often known as drives, appear (hunger, thirst, aggression, etc.), adding additional sensory and behavioral specificity and variability. Superseding motivations, in turn—with no clear line of demarcation—are systems of emotion. Cognitive systems emerge much later, first taking form, arguably, as imagination, and later—and only among human beings—as full-fledged language. Thus, in the most complex of creatures, there is an internal hierarchy of structure, from reflex through drive to language-mediated action (in the particular case of human beings), that must be organized before it can function as a unity and be aimed at a point.[4]

How is this hierarchy organized—a structure that emerged in large part from the bottom up, over the vast spans of evolutionary time? We return to the same answer alluded to earlier: through the constant cooperation and competition—the constant jockeying for resources and position—defining the struggle for survival and reproduction. This happens over the unimaginably lengthy spans of time that characterize evolution, as well as the much shorter course of each individual life. Negotiation for position sorts organisms into the omnipresent hierarchies that govern access to vital resources such as shelter, nourishment, and mates. All creatures of reasonable complexity and even a minimally social nature have their particular place, and know it.
All social creatures also learn what is deemed valuable by other group members, and derive from that, as well as from the understanding of their own position, a sophisticated implicit and explicit understanding of value itself. In a phrase: The internal hierarchy that translates facts into actions mirrors the external hierarchy of social organization. It is clear, for example, that chimpanzees in a troop understand their social world and its hierarchical strata at a fine level of detail. They know what is important, and who has privileged access to it. They understand such things as if their survival and reproduction depend upon it, as they do.[5]

A newborn infant is equipped with relatively deterministic reflexes: sucking, crying, startling. These nonetheless provide the starting point for the immense range of skills in action that develop with human maturation. By the age of two (and often much earlier than that, for many skills), children can orient with all their senses, walk upright, use their opposable-thumb-equipped hands for all sorts of purposes, and communicate their desires and needs both nonverbally and verbally—and this is of course a partial list. This immense array of behavioral abilities is integrated into a complex assortment of emotions and motivational drives (anger, sadness, fear, joy, surprise, and more) and then organized to fulfill whatever specific, narrow purpose inspires the child for the moment and, increasingly, over longer spans of time.

The developing infant must also hone and perfect the operation of his or her currently dominant motivational state in harmony with all his or her other internal motivational states (as, for example, the separate desires to eat, sleep, and play must learn to coexist so each can manifest itself optimally), and in keeping with the demands, routines, and opportunities of the social environment. This honing and perfecting begin within the child’s maternal relationship and the spontaneous play behavior within that circumscribed but still social context. Then, when the child has matured to the point where the internal hierarchy of emotional and motivational functions can be subsumed, even temporarily, within a framework provided by a conscious, communicable abstract goal (“let us play house”), the child is ready to play with others—and to do so, over time, in an increasingly complex and sophisticated manner.[6]

Play with others depends (as the great developmental psychologist Jean Piaget observed[7]) upon the collective establishment of a shared goal with the child’s play partners. The collective establishment of a shared goal—the point of the game—conjoined with rules governing cooperation and competition in relationship to that goal or point, constitutes a true social microcosm. All societies might be regarded as variations upon this play/game theme—E pluribus unum6—and in all functional and decent societies the basic rules of fair play, predicated upon reciprocity across situation and time, come inevitably to apply. Games, like solutions to problems, must be iterable to endure, and there are principles that apply to and undergird what constitutes that iterability. Piaget suspected, for example, that games undertaken voluntarily will outcompete games imposed and played under threat of force, given that some of the energy that could be expended on the game itself, whatever its nature, has to be wasted on enforcement.
There is evidence indicating the emergence of such voluntary game-like arrangements even among our nonhuman kin.[8] The universal rules of fair play include the ability to regulate emotion and motivation while cooperating and competing in pursuit of the goal during the game (that is part and parcel of being able to play at all), as well as the ability and will to establish reciprocally beneficial interactions across time and situation, as we already discussed. And life is not simply a game, but a series of games, each of which has something in common (whatever defines a game) and something unique (or there would be no reason for multiple games). At minimum, there is a starting point (kindergarten, a 0–0 score, a first date, an entry-level job) that needs to be improved upon; a procedure for enacting that improvement; and a desirable goal (graduation from high school, a winning score, a permanent romantic relationship, a prestigious career). Because of that commonality, there is an ethic—or more properly, a meta-ethic—that emerges, from the bottom up, across the set of all games. The best player is therefore not the winner of any given game but, among many other things, he or she who is invited by the largest number of others to play the most extensive series of games. It is for this reason, which you may not understand explicitly at the time, that you tell your children: “It’s not whether you win or lose. It’s how you play the game!”7

How should you play, to be that most desirable of players? What structure must take form within you so that such play is possible? And those two questions are interrelated, because the structure that will enable you to play properly (and with increasing and automated or habitual precision) will emerge only in the process of continually practicing the art of playing properly. Where might you learn how to play? Everywhere . . . if you are fortunate and awake.

6 “Out of many, one.”

7 Even rats understand this. Jaak Panksepp, one of the founders of the psychological subfield called affective neuroscience and an extremely creative, courageous, and gifted researcher, spent many years analyzing the role of play in the development and socialization of rats (see J. Panksepp, Affective Neuroscience: The Foundations of Human and Animal Emotions [New York: Oxford University Press, 1998], particularly the chapter on play, 280–99). Rats like to play. They particularly enjoy rough-and-tumble play, particularly if they are male juvenile rats. They enjoy it so much that they will voluntarily work—pulling a lever repeatedly, say—to gain the opportunity to enter an arena where another juvenile waits to play. When two juvenile strangers meet for the first time in this situation, they size each other up and then establish dominance. If one rat is only 10 percent larger than the other, he can pretty much win every physical contest, every rat wrestling match—but they still wrestle to find out, and the larger rat almost inevitably pins the smaller. If you were inclined to view the establishment of hierarchy as equivalent to dominance by power, that would be game over. The larger, more powerful rat won. End of story. But that is by no means the end of the story, unless the rats meet only once. Rats live in social environments, and they interact with the same individuals over and over. Thus, the game, once started, continues—and the rules have to govern not so much the single game as the repeating one. Once dominance is established, the rats can play—something they do in a manner very different from genuine fighting (just like play fighting with a pet dog is very different from being attacked by a dog). Now, the larger rat could pin the smaller rat every time. However, that breaks the rules (really, the meta-rules: those that are observable only over the course of repeated games). The purpose of the repeated game is not dominance, but continuing play. This is not to say that the initial dominance is without significance. It matters, not least in the following manner: When the two rats meet a second time, they will both adopt a unique role. The smaller rat is now duty bound to invite his larger friend to play, and the larger rat is duty bound to accept the invitation. The former will jump around playfully, to indicate his intent. The larger rat might hang back and act cool and a bit dismissive (as is now his prerogative); but if he’s a decent sort he will join in the fun, as in his heart of hearts he truly wants to play. However—and this is the critical issue—if the larger rat does not let the smaller rat win the repeated wrestling matches some substantial proportion of the time (Panksepp estimated 30 to 40 percent of the time), the smaller rat will stop exhibiting invitations to play. It is just not any fun for the little guy. Thus, if the larger rat dominates with power (like a bully), as he could, then he will lose at the highest level (the level where the fun continues for the longest possible time), even while he “wins” more frequently at the lower. What does this imply? Most important, that power is simply not a stable basis upon which to construct a hierarchy designed to optimally govern repeated interactions. And this is not just true for rats. Alpha males among at least certain primate groups are far more prosocial than their lesser comrades. Power doesn’t work for them, either (see F. B. M. de Waal and M. Suchak, “Prosocial Primates: Selfish and Unselfish Motivations,” Philosophical Transactions of the Royal Society of London: Biological Science 365 [2010]: 2711–22. See also F. de Waal, The Surprising Science of Alpha Males, TEDMED 2017, bit.ly/primate_ethic).

The Utility of the Fool

It is useful to take your place at the bottom of a hierarchy. It can aid in the development of gratitude and humility. Gratitude: There are people whose expertise exceeds your own, and you should be wisely pleased about that. There are many valuable niches to fill, given the many complex and serious problems we must solve. The fact that there are people who fill those niches with trustworthy skill and experience is something for which to be truly thankful. Humility: It is better to presume ignorance and invite learning than to assume sufficient knowledge and risk the consequent blindness. It is much better to make friends with what you do not know than with what you do know, as there is an infinite supply of the former but a finite stock of the latter. When you are tightly boxed in or cornered—all too often by your own stubborn and fixed adherence to some unconsciously worshipped assumptions—all there is to help you is what you have not yet learned.

It is necessary and helpful to be, and in some ways to remain, a beginner. For this reason, the Tarot deck beloved by intuitives, romantics, fortune-tellers, and scoundrels alike contains within it the Fool as a positive card, an illustrated variant of which opens this chapter. The Fool is a young, handsome man, eyes lifted upward, journeying in the mountains, sun shining brightly upon him—about to carelessly step over a cliff (or is he?). His strength, however, is precisely his willingness to risk such a drop; to risk being once again at the bottom. No one unwilling to be a foolish beginner can learn. It was for this reason, among others, that Carl Jung regarded the Fool as the archetypal precursor to the figure of the equally archetypal Redeemer, the perfected individual.

The beginner, the fool, is continually required to be patient and tolerant—with himself and, equally, with others. His displays of ignorance, inexperience, and lack of skill may still sometimes be rightly attributed to irresponsibility and condemned, justly, by others. But the insufficiency of the fool is often better regarded as an inevitable consequence of each individual’s essential vulnerability, rather than as a true moral failing.

Much that is great starts small, ignorant, and useless. This lesson permeates popular as well as classical or traditional culture. Consider, for example, the Disney heroes Pinocchio and Simba, as well as J. K. Rowling’s magical Harry Potter. Pinocchio begins as a woodenheaded marionette, the puppet of everyone’s decisions but his own. The Lion King has his origin as a naive cub, the unwitting pawn of a treacherous and malevolent uncle. The student of wizarding is an unloved orphan, with a dusty cupboard for a bedroom, and Voldemort—who might as well be Satan himself—for his archenemy. Great mythologized heroes often come into the world, likewise, in the most meager of circumstances (as the child of an Israelite slave, for example, or newborn in a lowly manger) and in great danger (consider the Pharaoh’s decision to slay all the firstborn male babies of the Israelites, and Herod’s comparable edict, much later). But today’s beginner is tomorrow’s master. Thus, it is necessary even for the most accomplished (but who wishes to accomplish still more) to retain identification with the as yet unsuccessful; to appreciate the striving toward competence; to carefully and with true humility subordinate him or herself to the current game; and to develop the knowledge, self-control, and discipline necessary to make the next move.

I visited a restaurant in Toronto with my wife, son, and daughter while writing this. As I made my way to my party’s table, a young waiter asked if he might say a few words to me. He told me that he had been watching my videos, listening to my podcasts, and reading my book, and that he had, in consequence, changed his attitude toward his comparatively lower-status (but still useful and necessary) job. He had ceased criticizing what he was doing or himself for doing it, deciding instead to be grateful and seek out whatever opportunities presented themselves right there before him. He made up his mind to become more diligent and reliable and to see what would happen if he worked as hard at it as he could. He told me, with an uncontrived smile, that he had been promoted three times in six months.

The young man had come to realize that every place he might find himself in had more potential than he might first see (particularly when his vision was impaired by the resentment and cynicism he felt from being near the bottom). After all, it is not as if a restaurant is a simple place—and this was part of an extensive national organization, a large, high-quality chain.
To do a good job in such a place, servers must get along with the cooks, who are by universal recognition a formidably troublesome and tricky lot. They must also be polite and engaging with customers. They have to pay attention constantly. They must adjust to highly varying workloads—the rushes and dead times that inevitably accompany the life of a server. They have to show up on time, sober and awake. They must treat their superiors with the proper respect and do the same for those—such as the dishwashers—below them in the structure of authority. And if they do all these things, and happen to be working in a functional institution, they will soon render themselves difficult to replace. Customers, colleagues, and superiors alike will begin to react to them in an increasingly positive manner. Doors that would otherwise remain closed to them—even invisible—will be opened. Furthermore, the skills they acquire will prove eminently portable, whether they continue to rise in the hierarchy of restaurateurs, decide instead to further their education, or change their career trajectory completely (in which case they will leave with laudatory praise from their previous employers and vastly increased chances of discovering the next opportunity).

As might be expected, the young man who had something to say to me was thrilled with what had happened to him. His status concerns had been solidly and realistically addressed by his rapid career advance, and the additional money he was making did not hurt, either. He had accepted, and therefore transcended, his role as a beginner. He had ceased being casually cynical about the place he occupied in the world and the people who surrounded him, and accepted the structure and the position he was offered. He started to see possibility and opportunity, where before he was blinded, essentially, by his pride. He stopped denigrating the social institution he found himself part of and began to play his part properly. And that increment in humility paid off in spades.

The Necessity of Equals

It is good to be a beginner, but it is a good of a different sort to be an equal among equals. It is said, with much truth, that genuine communication can take place only between peers. This is because it is very difficult to move information up a hierarchy. Those well positioned (and this is a great danger of moving up) have used their current competence—their cherished opinions, their present knowledge, their current skills—to stake a moral claim to their status. In consequence, they have little motivation to admit to error, to learn or change—and plenty of reason not to. If a subordinate exposes the ignorance of someone with greater status, he risks humiliating that person, questioning the validity of the latter’s claim to influence and status, and revealing him as incompetent, outdated, or false. For this reason, it is very wise to approach your boss, for example, carefully and privately with a problem (and perhaps best to have a solution at hand—and not one proffered too incautiously).

Barriers exist to the flow of genuine information down a hierarchy, as well. For example, the resentment people lower in the chain of command might feel about their hypothetically lesser position can make them loath to act productively on information from above—or, in the worst case, can motivate them to work at counterpurposes to what they have learned, out of sheer spite.
In addition, those who are inexperienced or less educated, or who newly occupy a subordinate position and therefore lack knowledge of their surroundings, can be more easily influenced by relative position and the exercise of power, instead of by quality of argumentation and observation of competence. Peers, by contrast, must in the main be convinced. Their attention must be carefully reciprocated. To be surrounded by peers is to exist in a state of equality, and to manifest the give-and-take necessary to maintain that equality. It is therefore good to be in the middle of a hierarchy.

This is partly why friendships are so important, and why they form so early in life. A two-year-old, typically, is self-concerned, although also capable of simple reciprocal actions. The same Scarlett whom I talked about earlier—my granddaughter—would happily hand me one of her favorite stuffed toys, attached to a pacifier, when I asked her to. Then I would hand it, or toss it, back (sometimes she would toss it to me, too—or at least relatively near me). She loved this game. We played it with a spoon, as well—an implement she was just beginning to master. She played the same way with her mother and her grandmother—with anyone who happened to be within playing distance, if she was familiar enough with them not to be shy. This was the beginning of the behaviors that transform themselves into full-fledged sharing among older children.

My daughter, Mikhaila, Scarlett’s mother, took her child to the outdoor recreational space on top of their downtown condo a few days before I wrote this. A number of other children were playing there, most of them older, and there were plenty of toys. Scarlett spent her time hoarding as many of the playthings as possible near her mother’s chair, and was distinctly unimpressed if other children came along to purloin one for themselves. She even took a ball directly from another child to add to her collection. This is typical behavior for children two and younger. Their ability to reciprocate, while hardly absent (and able to manifest itself in truly endearing ways), is developmentally limited.

By three years of age, however, most children are capable of truly sharing. They can delay gratification long enough to take their turn while playing a game that everyone cannot play simultaneously. They can begin to understand the point of a game played by several people and follow the rules, although they may not be able to give a coherent verbal account of what those rules are. They start to form friendships upon repeated exposure to children with whom they have successfully negotiated reciprocal play relationships. Some of these friendships turn into the first intense relationships that children have outside their family. It is in the context of such relationships, which tend strongly to form between equals in age (or at least equals in developmental stage), that a child learns to bond tightly to a peer and starts to learn how to treat another person properly while requiring the same in return. This mutual bonding is vitally important.
A child without at least one special, close friend is much more likely to suffer later psychological problems, whether of the depressive/anxious or antisocial sort,[9] while children with fewer friends are also more likely to be unemployed and unmarried as adults.[10] There is no evidence that the importance of friendship declines in any manner with age.8 All causes of mortality appear to be reduced among adults with high-quality social networks, even when general health status is taken into consideration. This remains true among the elderly in the case of diseases such as hypertension, diabetes, emphysema, and arthritis, and for younger and older adults alike in the case of heart attacks. Interestingly enough, there is some evidence that it is the provision of social support, as much or more than its receipt, that provides these protective benefits (and, somewhat unsurprisingly, that those who give more tend to receive more).[11] Thus, it truly seems that it is better to give than to receive.

Peers distribute both the burdens and joys of life. Recently, when my wife, Tammy, and I suffered serious health problems, we were fortunate enough to have family members (my in-laws, sister and brother; my own mother and sister; our children) and close friends stay with us and help for substantial periods of time. They were willing to put their own lives on hold to aid us while we were in crisis. Before that, when my book 12 Rules for Life became a success, and during the extensive speaking tour that followed, Tammy and I were close to people with whom we could share our good fortune. These were friends and family members genuinely pleased with what was happening and following the events of our lives avidly, and who were willing to discuss what could have been the overwhelming public response. This greatly heightened the significance and meaning of everything we were doing and reduced the isolation that such a dramatic shift in life circumstances, for better or worse, is likely to produce.

The relationships established with colleagues of similar status at work constitute another important source of peer regulation, in addition to friendship. To maintain good relationships with your colleagues means, among other things, to give credit where credit is due; to take your fair share of the jobs no one wants but still must be done; to deliver on time and in a high-quality manner when teamed with other people; to show up when expected; and, in general, to be trusted to do somewhat more than your job formally requires. The approval or disapproval of your colleagues rewards and enforces this continual reciprocity, and that—like the reciprocity that is necessarily part of friendship—helps maintain stable psychological function. It is much better to be someone who can be relied upon, not least so that during times of personal trouble the people you have worked beside are willing and able to step in and help.

Through friendship and collegial relationships we modify our selfish proclivities, learning not to always put ourselves first. Less obviously, but just as importantly, we may also learn to overcome our naive and too empathic proclivities (our tendency to sacrifice ourselves unsuitably and unjustly to predatory others) when our peers advise and encourage us to stand up for ourselves.
In consequence, if we are fortunate, we begin to practice true reciprocity, and we gain at least some of the advantage spoken about so famously by the poet Robert Burns:

O wad some Pow’r the giftie gie us
To see oursels as ithers see us!
It wad frae mony a blunder free us,
An’ foolish notion:
What airs in dress an’ gait wad lea’e us,
An’ ev’n devotion![12]

8 This makes the July 30, 2019, poll from YouGov, “Millennials Are the Loneliest Generation” (bit.ly/2TVVMLn), indicating that 25 percent have no acquaintances and 22 percent no friends, particularly ominous, if true.

Top Dog

It is a good thing to be an authority. People are fragile. Because of that, life is difficult and suffering common. Ameliorating that suffering—ensuring that everyone has food, clean water, sanitary facilities, and a place to take shelter, for starters—takes initiative, effort, and ability. If there is a problem to be solved, and many people involve themselves in the solution, then a hierarchy must and will arise, as those who can, do, and those who cannot, follow as best they can, often learning to be competent in the process. If the problem is real, then the people who are best at solving the problem at hand should rise to the top. That is not power. It is the authority that properly accompanies ability.

Now, it is self-evidently appropriate to grant power to competent authorities, if they are solving necessary problems; and it is equally appropriate to be one of those competent authorities, if possible, when there is a perplexing problem at hand. This might be regarded as a philosophy of responsibility. A responsible person decides to make a problem his or her problem, and then works diligently—even ambitiously—for its solution, with other people, in the most efficient manner possible (efficient, because there are other problems to solve, and efficiency allows for the conservation of resources that might then be devoted importantly elsewhere).

Ambition is often—and often purposefully—misidentified with the desire for power, and damned with faint praise, and denigrated, and punished. And ambition is sometimes exactly that wish for undue influence on others. But there is a crucial difference between sometimes and always. Authority is not mere power, and it is extremely unhelpful, even dangerous, to confuse the two. When people exert power over others, they compel them, forcefully. They apply the threat of privation or punishment so their subordinates have little choice but to act in a manner contrary to their personal needs, desires, and values. When people wield authority, by contrast, they do so because of their competence—a competence that is spontaneously recognized and appreciated by others, and generally followed willingly, with a certain relief, and with the sense that justice is being served.

Those who are power hungry—tyrannical and cruel, even psychopathic—desire control over others so that every selfish whim of hedonism can be immediately gratified; so that envy can destroy its target; so that resentment can find its expression. But good people are ambitious (and diligent, honest, and focused along with it) instead because they are possessed by the desire to solve genuine, serious problems. That variant of ambition needs to be encouraged in every possible manner.
It is for this reason, among many others, that the increasingly reflexive identification of the striving of boys and men for victory with the “patriarchal tyranny” that hypothetically characterizes our modern, productive, and comparatively free societies is so stunningly counterproductive (and, it must be said, cruel: there is almost nothing worse than treating someone striving for competence as a tyrant in training). “Victory,” in one of its primary and most socially important aspects, is the overcoming of obstacles for the broader public good. Someone who is sophisticated as a winner wins in a manner that improves the game itself, for all the players. To adopt an attitude of naive or willfully blind cynicism about this, or to deny outright that it is true, is to position yourself—perhaps purposefully, as people have many dark motives—as an enemy of the practical amelioration of suffering itself. I can think of few more sadistic attitudes.

Now, power may accompany authority, and perhaps it must. However, and more important, genuine authority constrains the arbitrary exercise of power. This constraint manifests itself when the authoritative agent cares, and takes responsibility, for those over whom the exertion of power is possible. The oldest child can take accountability for his younger siblings, instead of domineering over and teasing and torturing them, and can learn in that manner how to exercise authority and limit the misuse of power. Even the youngest can exercise appropriate authority over the family dog.

To adopt authority is to learn that power requires concern and competence—and that it comes at a genuine cost. Someone newly promoted to a management position soon learns that managers are frequently more stressed by their multiple subordinates than subordinates are stressed by their single manager. Such experience moderates what might otherwise become romantic but dangerous fantasies about the attractiveness of power, and helps quell the desire for its infinite extension. And, in the real world, those who occupy positions of authority in functional hierarchies are generally struck to the core by the responsibility they bear for the people they supervise, employ, and mentor.

Not everyone feels this burden, of course. A person who has become established as an authority can forget his origins and come to develop a counterproductive contempt for the person who is just starting out. This is a mistake, not least because it means that the established person cannot risk doing something new (as it would mean adopting the role of despised fool). It is also because arrogance bars the path to learning. Shortsighted, willfully blind, and narrowly selfish tyrants certainly exist, but they are by no means in the majority, at least in functional societies. Otherwise nothing would work. The authority who remembers his or her sojourn as voluntary beginner, by contrast, can retain his or her identification with the newcomer and the promise of potential, and use that memory as the source of personal information necessary to constrain the hunger for power.

One of the things that has constantly amazed me is the delight that decent people take in the ability to provide opportunities to those over whom they currently exercise authority. I have experienced this repeatedly: personally, as a university professor and researcher (and observed many other people in my situation doing the same); and in the business and other professional settings I have become familiar with.
There is great intrinsic pleasure in helping already competent and admirable young people become highly skilled, socially valuable, autonomous, responsible professionals. It is not unlike the pleasure taken in raising children, and it is one of the primary motivators of valid ambition. Thus, the position of top dog, when occupied properly, has as one of its fundamental attractions the opportunity to identify deserving individuals at or near the beginning of their professional life, and provide them with the means of productive advancement.

Social Institutions Are Necessary - but Insufficient

Sanity is knowing the rules of the social game, internalizing them, and following them. Differences in status are therefore inevitable, as all worthwhile endeavors have a goal, and those who pursue them have different abilities in relationship to that goal. Accepting the fact of this disequilibrium and striving forward nonetheless—whether presently at the bottom, middle, or top—is an important element of mental health.

But a paradox remains. The solutions of yesterday and today, upon which our current hierarchies depend, will not necessarily serve as solutions tomorrow. Thoughtless repetition of what sufficed in the past—or, worse, authoritarian insistence that all problems have been permanently solved—therefore means the introduction of great danger when changes in the broader world make local change necessary. Respect for creative transformation must in consequence accompany appropriate regard for the problem-solving hierarchical structures bequeathed to us by the past. That is neither an arbitrary moral opinion nor a morally relative claim. It is something more akin to knowledge of twin natural laws built into the structure of our reality. Highly social creatures such as we are must abide by the rules, to remain sane and minimize unnecessary uncertainty, suffering, and strife. However, we must also transform those rules carefully, as circumstances change around us.

This implies, as well, that the ideal personality cannot remain an unquestioning reflection of the current social state. Under normal conditions, it may nonetheless be said that the ability to conform unquestioningly trumps the inability to conform. However, the refusal to conform when the social surround has become pathological—incomplete, archaic, willfully blind, or corrupt—is something of even higher value, as is the capacity to offer creative, valid alternatives. This leaves all of us with a permanent moral conundrum: When do we simply follow convention, doing what others request or demand; and when do we rely on our own individual judgment, with all its limitations and biases, and reject the requirements of the collective? In other words: How do we establish a balance between reasonable conservatism and revitalizing creativity?

First and foremost on the psychological front is the issue of temperament. Some people are temperamentally predisposed to conservatism, and others to more liberal creative perception and action.[13] This does not mean that socialization has no ability to alter that predisposition; human beings are very plastic organisms, with a long period of preadult development, and the circumstances we find ourselves in can change us very drastically. That does not alter the fact, however, that there are relatively permanent niches in the human environment that different modes of temperament have adapted to fill. Those who tend toward the right, politically, are staunch defenders of all that has worked in the past.
And much of the time, they are correct in being so, because of the limited number of pathways that produce personal success, social harmony, and long-term stability. But sometimes they are wrong: first, because the present and the future differ from the past; second, because even once-functional hierarchies typically (inevitably?) fall prey to internal machinations in a manner that produces their downfall. Those who rise to the top can do so through manipulation and the exercise of unjust power, acting in a manner that works only for them, at least in the short term; but that kind of ascendance undermines the proper function of the hierarchy they are nominally part of. Such people generally fail to understand, or do not care, what function the organization they have made their host was designed to fulfill. They extract what they can from the riches that lie before them and leave a trail of wreckage in their wake. Those on the liberal/left side of the political spectrum object strongly to this corruption of power, and rightly so. But it is critically important to distinguish between a hierarchy that is functional and productive (and the people who make it so) and the degenerate shell of a once-great institution. Making that distinction requires the capacity and the willingness to observe and differentiate, rather than mindless reliance on ideological proclivity. It requires knowing that there is a bright side to the social hierarchies we necessarily inhabit, as well as a dark side (and the realization that concentrating on one to the exclusion of the other is dangerously biased). It also requires knowledge that on the more radical, creative side—the necessary source of revitalization for what has become immoral and outdated—there also lurks great danger. Part of the danger is that very tendency of those who think more liberally to see only the negative in well-founded institutions. The further danger stems from the counterpart to the corrupt but conservative processes that destabilize and destroy functional hierarchies: there are unethical radicals, just as there are crooked administrators, managers, and executives. These individuals tend to be profoundly ignorant of the complex realities of the status quo, unconscious of their own ignorance, and ungrateful for what the past has bequeathed to them. Such ignorance and ingratitude are often conjoined with the willingness to use tired clichés of cynicism to justify refusal to engage either in the dull but necessary rigors of convention or the risks and difficulties of truly generative endeavor. It is this corruption of creative transformation that renders the conservative—and not only the conservative—appropriately cautious of change.

A few years before writing this, I had a discussion with a young woman in her early twenties—the niece of someone who emailed me after watching some of my online lectures. She appeared severely unhappy, and said that she had spent much of the past six months lying in bed. She came to talk to me because she was becoming desperate. The only thing that stood between her and suicide, as far as she was concerned, was the responsibility she still maintained for an exotic pet, a serval cat. This was the last remaining manifestation of an interest in biology that had once gripped her, but which she had abandoned, much to her current regret, when she dropped out of high school. 
She had not been well attended to by her parents, who had allowed her to drift, over the span of several years, in a manner that had become disastrous. Despite her decline, she had formulated a bit of a plan. She said she had thought about enrolling in a two-year program that would enable her to finish high school, as a prerequisite for applying to a veterinary college. But she had not made the necessary detailed inquiries into what would be required to carry out this ambition. She lacked a mentor. She had no good friends. It was far too easy for her to remain inactive and disappear into her isolation. We had a good conversation, for about three quarters of an hour. She was a nice kid. I offered to discuss her future in more detail if she would complete an online planning program designed by my professorial colleagues and me.9 All was going well until the discussion twisted toward the political. After discussing her personal situation, she began to voice her discontent with the state of the world at large—with the looming catastrophe, in her opinion, of the effects of human activity on the environment. Now, there is nothing wrong, in principle, with the expression of concern for planet-wide issues. That is not the point. There is something wrong, however, with overestimating your knowledge of such things—or perhaps even considering them—when you are in your early twenties with nothing positive going on in your life and you are having great difficulty even getting out of bed. Under those conditions, you need to get your priorities straight, and establishing the humility necessary to attend to and solve your own problems is a crucial part of doing just that. As the verbal exchange continued, I found myself no longer engaged in a genuine conversation with a lost young woman who had come to speak with me. Instead, I became a hypothetically equal partner in a debate with an ideologue who knew what was wrong, globally speaking; who knew who was at fault for those global problems; who knew that participating in the continuing destruction by manifesting any personal desire whatsoever was immoral; and who believed, finally, that we were all both guilty and doomed. Continuing the conversation at that point meant that I was (1) speaking not with this young woman so much as with whatever or whoever had taken possession of her while she was in the grip of generic, impersonal, and cynical ideas, and (2) implying that discussion of such topics under the circumstances was both acceptable and productive. There was no point in either outcome. So, I stopped (which did not mean that the entire meeting had been a waste). It was impossible for me not to conclude that some of what had reduced her to her months-long state of moral paralysis was not so much guilt about potentially contributing to the negative effects of human striving on the broader world as it was the sense of moral superiority that concern about such things brought her (despite the exceptional psychological danger of embracing this dismal view of human possibility). Excuse the cliché, but it is necessary to walk before you can run. You may even have to crawl before you can walk. This is part of accepting your position as a beginner, at the bottom of the hierarchy you so casually, arrogantly, and self-servingly despise. 
Furthermore, the deeply antihuman attitude that often accompanies tears shed for environmental degradation and man’s inhumanity to man cannot help but have a marked effect on the psychological attitude that defines a person’s relationship to him or herself. It has taken us since time immemorial to organize ourselves, biologically and socially, into the functional hierarchies that both specify our perceptions and actions, and define our interactions with the natural and social world. Profound gratitude for that gift is the only proper response. The structure that encompasses us all has its dark side—just as nature does, just as each individual does—but that does not mean careless, generic, and self-serving criticism of the status quo is appropriate (any more than is knee-jerk objection to what might be necessary change).

9 This is part of the Self-Authoring Suite, a set of individual programs designed to help people write about the troubles of their past (Past Authoring), the faults and virtues of their present personality (Present Authoring, in two parts), and their desires and wishes for the future (Future Authoring). I specifically recommended the latter.

The Necessity of Balance

Because doing what others do and have always done so often works, and because, sometimes, radical action can produce success beyond measure, the conservative and the creative attitudes and actions constantly propagate themselves. A functional social institution—a hierarchy devoted to producing something of value, beyond the mere insurance of its own survival—can utilize the conservative types to carefully implement processes of tried-and-true value, and the creative, liberal types to determine how what is old and out of date might be replaced by something new and more valuable. The balance between conservatism and originality might therefore be properly struck, socially, by bringing the two types of persons together. But someone must determine how best to do that, and doing so requires a wisdom that transcends mere temperamental proclivity. Because the traits associated with creativity, on the one hand, and comfort with the status quo, on the other, tend to be mutually exclusive, it is difficult to find a single person who has balanced both properly, who is therefore comfortable working with each type, and who can attend, in an unbiased manner, to the necessity of capitalizing on the respective forms of talent and proclivity. But the development of that ability can at least begin with an expansion of conscious wisdom: the articulated realization that conservatism is good (with a set of associated dangers), and that creative transformation—even of the radical sort—is also good (with a set of associated dangers). Learning this deeply—truly appreciating the need for both viewpoints—means at least the possibility of valuing what truly diverse people have to offer, and of being able to recognize when the balance has swung too far in one direction. The same is true of knowledge of the shadow side of both. To manage complex affairs properly, it is necessary to be cold enough in vision to separate the power-hungry and self-serving pseudoadvocate of the status quo from the genuine conservative; and the self-deceptive, irresponsible rebel without a cause from the truly creative. And to manage this means to separate those factors within the confines of one’s own soul, as well as among other people. And how might this be accomplished? 
First, we might come to understand consciously that these two modes of being are integrally interdependent. One cannot truly exist without the other, although they exist in genuine tension. This means, for example, that discipline—subordination to the status quo, in one form or another—needs to be understood as a necessary precursor to creative transformation, rather than its enemy. Thus, just as the hierarchy of assumptions that make up the structure organizing society and individual perceptions is shaped by, and integrally dependent on, restrictions, so too is creative transformation. It must strain against limits. It has no use and cannot be called forth unless it is struggling against something. It is for this reason that the great genie, the granter of wishes—God, in a microcosm—is archetypally trapped in the tiny confines of a lamp and subject, as well, to the will of the lamp’s current holder. Genie—genius—is the combination of possibility and potential, and extreme constraint. Limitations, constraints, arbitrary boundaries—rules, dread rules, themselves—therefore not only ensure social harmony and psychological stability, they make possible the creativity that renews order. What lurks, therefore, under the explicitly stated desire for complete freedom—as expressed, say, by the anarchist, or the nihilist—is not a positive desire, striving for enhanced creative expression, as in the romanticized caricature of the artist. It is instead a negative desire—a desire for the complete absence of responsibility, which is simply not commensurate with genuine freedom. This is the lie of objections to the rules. But “Down with Responsibility” does not make for a compelling slogan—being sufficiently narcissistic to negate itself self-evidently—while the corresponding “Down with the Rules” can be dressed up like a heroic corpse. Alongside the wisdom of true conservatism is the danger that the status quo might become corrupt and its corruption self-servingly exploited. Alongside the brilliance of creative endeavor is the false heroism of the resentful ideologue, who wears the clothes of the original rebel while undeservedly claiming the upper moral hand and rejecting all genuine responsibility. Intelligent and cautious conservatism and careful and incisive change keep the world in order. But each has its dark aspect, and it is crucial, once this has been realized, to pose the question to yourself: Are you the real thing, or its opposite? And the answer is, inevitably, that you are some of both—and perhaps far more of what is shadowy than you might like to realize. That is all part of understanding the complexity we each carry within us.

Personality as Hierarchy - and Capacity for Transformation

How, then, is the personality that balances respect for social institutions and, equally, creative transformation to be understood? It is not so easy to determine, given the complexity of the problem. For that reason, we turn to stories. Stories provide us with a broad template. They outline a pattern specific enough to be of tremendous value, if we can imitate it, but general enough (unlike a particular rule or set of rules) to apply even to new situations. In stories, we capture observations of the ideal personality. We tell tales about success and failure in adventure and romance. Across our narrative universes, success moves us forward to what is better, to the promised land; failure dooms us, and those who become entangled with us, to the abyss. 
The good moves us upward and ahead, and evil drags us backward and down. Great stories are about characters in action, and so they mirror the unconscious structures and processes that help us translate the intransigent world of facts into the sustainable, functional, reciprocal social world of values.10 The properly embodied hierarchy of values—including the value of conservatism and its twin, creative transformation—finds its expression as a personality, in narrative—an ideal personality. Every hierarchy has something at its pinnacle. It is for this reason that a story, which is a description of the action of a personality, has a hero (and even if that central figure is an antihero, it does not matter: the antihero serves the function of identifying the hero through contrast, as the hero is what the antihero is most decidedly not). The hero is the individual at the peak, the victor, the champion, the wit, the eventually successful and deserving underdog, the speaker of truth under perilous circumstances, and more. The stories we create, watch, listen to, and remember center themselves on actions and attitudes we find interesting, compelling, and worthy of communication as a consequence of our personal experience with both admirable and detestable people (or fragments of their specific attitudes and actions), or because of our proclivity to share what has gripped our attention with those who surround us. Sometimes we can draw compelling narratives directly from our personal experience with individual people; sometimes we create amalgams of multiple personalities, often in concert with those who compose our social groups.

10 You can see this played out, for example, in the proclivity of American evangelical Protestants to ask, when faced with a novel existential problem, “What would Jesus do?” It is an easy approach to parody, but it indicates precisely the value of stories: once a narrative has been internalized, it can be used as a template to generate new perceptions and behaviors. It might seem naive or presumptuous to imagine what actions the archetypal Savior Himself might undertake in the confines of a normal life, but the fundamental purpose of religious narratives is in fact to motivate imitation.

The client whose story was told in part earlier offers a useful example of the necessity of social engagement. That tale did not, however, exhaust the significance of his transformed attitudes and actions. While he was reconstructing his social life, becoming an active participant in a range of collective activities, he simultaneously developed a certain creative expertise that was equally unexpected. He had not benefited from formal education beyond the high school level, and did not have a personality that immediately struck the external observer as markedly creative. However, the personally novel social pursuits that attracted him were in the main oriented toward aesthetic endeavor. He first developed his eye for form, symmetry, novelty, and beauty as a photographer. The social advantages of this pursuit were manifold: he joined a club whose members attended biweekly photography walks, on which they would venture as a group of twenty or so to parts of the city that were visually interesting, either for their natural beauty or uniqueness or for the attraction they held as industrial landscapes. In consequence, he also learned a fair bit about the technical side of photographic equipment. 
The group members also critiqued one another’s work—and they did this constructively, indicating not only what errors had been made but also what of value had been achieved. This all helped my client learn to communicate in a productive manner about topics that might otherwise have been psychologically difficult (touching as they did on criticisms that, because of their association with creative vision, could easily have generated counterproductively sensitive overreactions) and, as well, to increasingly distinguish between visual images that were trite or dull or conformist and those of genuine quality. After a few months, his perception had developed sufficiently that he began to win local contests and generate small professional commissions. I had believed from the beginning that his participation in the photography club was well advised from the perspective of personality development, but I was genuinely struck by the rapid development of his visual and technical ability, and I very much enjoyed the times we spent in our sessions reviewing his work. After a few months of work on the photography front, my client began to produce, and to show me, other images he had created as well—in their first incarnation, decidedly amateurish abstract line drawings done in pen. These consisted essentially of loops of various sizes, joined continuously, on a single page: scribbles, really, although more controlled and evidently purposeful than mere scribbles. As I had with the photographs (and the photography club), I regarded these as psychologically useful—as an extension of creative ability—but not as worthwhile artistic endeavors in their own right. He kept at it, however, generating several drawings a week, all the while bringing what he had created to our sessions. What he produced increased in sophistication and beauty with dramatic rapidity. Soon, he was producing complex, symmetrical, and rather dramatic black-and-white pen-and-ink drawings of sufficient intrinsic beauty to serve as commercially viable T-shirt designs. I had seen this sort of development clearly in the case of two other clients, both characterized by intrinsically creative temperaments (very well hidden in one of the cases; more developed, nurtured, and obvious in the other). In addition, I had read accounts of clinical cases and personal development by Carl Jung, who noted that the production of increasingly ordered and complex geometrical figures—often circles within squares, or the reverse—regularly accompanied an increase in the organization of the personality. This certainly seemed true not only of my client, as evidenced by his burgeoning expertise at photography and the development of his skill as a graphic artist, but also of the two others I had the pleasure of serving as a clinical therapist. What I observed repeatedly was, therefore, not only the reconstruction of the psyche as a consequence of further socialization (and the valuation of social institutions) but the parallel transformation of primarily interior processes, indicated by a marked increase in the capacity to perceive and to create what was elegant, beautiful, and socially valued. My clients had learned not only to submit properly to the sometimes arbitrary but still necessary demands of the social world, but also to offer to that world something it would not have had access to had it not been for their private creative work. 
My granddaughter, Scarlett, also came to exhibit behaviors that were indicative of, if not her creative ability, then at least her appreciation for creative ability, in addition to her socialization as an agent of socially valued pointing. When people discuss a story—presented as a movie, or a play, or a book—they commonly attempt to come to a sophisticated consensus about its point (sophisticated because a group of people can generally offer more viewpoints than a single individual; consensus because the discussion usually continues until some broad agreement is reached as to the topic at hand). Now, the idea that a story is a form of communication—and entertainment—is one of those facts that appears self-evident upon first consideration, but that becomes more mysterious the longer it is pondered. If it is true that a story has a point, then it is clear that it is pointing to something. But what, and how? What constitutes pointing is obvious when it is an act by a particular person specifying a particular thing or person, but much less obvious when what does the pointing is, shall we say, the cumulative behavior of a character in a story. The actions and attitudes of J. K. Rowling’s heroes and heroines once again provide popular examples of precisely this process. Harry Potter, Ron Weasley, and Hermione Granger are typified in large part by the willingness and ability to follow rules (indicating their expertise as apprentices) and, simultaneously, to break them, while those who supervise them are inclined, equally, to reward both apparently paradoxical forms of behavior. Even the technologies used by the young wizards during their apprenticeship are characterized by this duality. The Marauder’s Map, for example (which provides its bearer with an accurate representation of explored territory in the form of the physical layout or geography of Hogwarts, the wizarding school, as well as the locale of all its living denizens), can be activated as a functional tool only by uttering a set of words that seem to indicate the very opposite of moral behavior: “I solemnly swear that I am up to no good,” and deactivated, so that its function remains secret, with the phrase “Mischief managed.” It is no easy matter to understand how an artifact that requires such statements to make it usable could possibly be anything but “no good”—a tool of evil purpose, apparently. But, like the fact that Harry and his friends regularly but carefully break rules, and are equally regularly and carefully rewarded for doing so, the Marauder’s Map varies in its ethical desirability with the intent of its users. There is a strong implication throughout the series that what is good cannot be simply encapsulated by mindless or rigid rule following, no matter how disciplined that following, or how vital the rules so followed. What this all means is that the Harry Potter series does not point to drone-like subservience to social order as the highest of moral virtues. What supersedes that obedience is not so obvious that it can be easily articulated, but it is something like “Follow rules except when doing so undermines the purpose of those selfsame rules—in which case take the risk of acting in a manner contrary to what has been agreed upon as moral.” This is a lesson that seems more easily taught by representations of the behaviors that embody it than transmitted by, say, rote learning or a variant rule. 
Meta-rules (which might be regarded as rules about rules, rather than rules as such) are not necessarily communicated in the same manner as simple rules themselves. Scarlett, with her emphasis on pointing, learned, soon after mastering the comparatively straightforward physical act, to grasp the more complex point of narratives. She could signify something with her index finger at the age of a year and a half. By two and a half years, however, she could understand and imitate the far more intricate point of a story. For a period of approximately six months, at the latter age, she would insist, when asked, that she was Pocahontas, rather than Ellie (the name preferred by her father) or Scarlett (preferred by her mother). This was a staggering act of sophisticated thought, as far as I was concerned. She had been given a Pocahontas doll, which became one of her favorite toys, along with a baby doll (also very well loved), whom she named after her grandmother, my wife, Tammy. When she played with the infant doll, Ellie was the mother. With Pocahontas, however, the situation differed. That doll was not a baby, and Ellie was not its mother. My granddaughter regarded herself, instead, as the grown Pocahontas—mimicking the doll, which was fashioned like a young woman, as well as the character who served as the lead in the Disney movie of the same name, which she had raptly observed on two separate occasions.

The Disney Pocahontas bears marked similarities to the main protagonists of the Harry Potter series. She finds herself promised by her father to Kocoum, a brave warrior who embodies, in all seriousness, the virtues of his tribe, but whose behavior and attitudes are too rule bound for the more expansive personality of his bride-to-be. Pocahontas falls in love, instead, with John Smith, captain of a ship from Europe and representative of that which falls outside of known territory but is (potentially) of great value. Paradoxically, Pocahontas is pursuing a higher moral order in rejecting Kocoum for Smith—breaking a profoundly important rule (value what is most valued in the current culture’s hierarchy of rules)—very much in the same manner as the primary Potter characters. That is the moral of both narratives: follow the rules until you are capable of being a shining exemplar of what they represent, but break them when those very rules now constitute the most dire impediment to the embodiment of their central virtues. And Elizabeth Scarlett, not yet three years of age, had the intrinsic wisdom to see this as the point of what she was watching (the Disney movie) and using as a role-playing aid (the doll Pocahontas). Her perspicacity in this regard bordered on the unfathomable. The same set of ideas—respect for the rules, except when following those rules means disregarding or ignoring or remaining blind to an even higher moral principle—is represented with stunning power in two different Gospel narratives (which serve, regardless of your opinion about them, as central traditional or classical stories portraying a personality for the purposes of evoking imitation). In the first, Christ is presented, even as a child, as a master of the Jewish tradition. This presentation establishes Him as fully informed as to the value of the past, and as characterized by the respect typical, say, of the genuine conservative. 
According to the account in Luke 2:42–52,11 Jesus’s family journeyed to Jerusalem every year at the Jewish holiday of Passover:

And when he was twelve years old, they went up to Jerusalem after the custom of the feast. And when they had fulfilled the days, as they returned, the child Jesus tarried behind in Jerusalem; and Joseph and his mother knew not of it. But they, supposing him to have been in the company, went a day’s journey; and they sought him among their kinsfolk and acquaintance. And when they found him not, they turned back again to Jerusalem, seeking him. And it came to pass, that after three days they found him in the temple, sitting in the midst of the doctors, both hearing them, and asking them questions. And all that heard him were astonished at his understanding and answers. And when they saw him, they were amazed: and his mother said unto him, Son, why hast thou thus dealt with us? behold, thy father and I have sought thee sorrowing. And he said unto them, How is it that ye sought me? wist ye not that I must be about my Father’s business? And they understood not the saying which he spake unto them. And he went down with them, and came to Nazareth, and was subject unto them: but his mother kept all these sayings in her heart. And Jesus increased in wisdom and stature, and in favour with God and man.

11 All biblical citations are from the King James Version unless otherwise noted.

A paradox emerges, however, as the entirety of the Gospel accounts is considered—one closely associated with the tension between respect for tradition and the necessity for creative transformation. Despite the evidence of His thorough and even precocious understanding and appreciation of the rules, the adult Christ repeatedly and scandalously violates the Sabbath traditions—at least from the standpoint of the traditionalists in His community, and much to His own peril. He leads His disciples through a cornfield, for example, plucking and eating the grains (Luke 6:1). He justifies this to the Pharisees who object by referring to an account of King David acting in a similar manner, feeding his people, when necessity demanded it, on bread that was reserved for the priests (Luke 6:4). Christ tells his interlocutors, quite remarkably, “that the Son of man is Lord also of the sabbath” (Luke 6:5). An ancient document known as the Codex Bezae,12 a noncanonical variant of part of the New Testament, offers an interpolation just after the section of the Gospel of Luke presented above, shedding profound light on the same issue: the complex and paradoxical relationship between respect for the rules and the creative moral action that is necessary and desirable despite manifesting itself in apparent opposition to those rules. It contains an account of Christ addressing someone who, like Him, has broken a sacred rule: “On that same day, observing one working on the Sabbath, [Jesus] said to him, O Man, if indeed thou knowest what thou doest, thou art blest; but if thou knowest not, thou art accursed, and a transgressor of the Law.”[14] What does this statement mean? It sums up the meaning of Rule I perfectly. 
If you understand the rules—their necessity, their sacredness, the chaos they keep at bay, how they unite the communities that follow them, the price paid for their establishment, and the danger of breaking them—but you are willing to fully shoulder the responsibility of making an exception, because you see that as serving a higher good (and if you are a person with sufficient character to manage that distinction), then you have served the spirit, rather than the mere law, and that is an elevated moral act. But if you refuse to realize the importance of the rules you are violating and act out of self-centered convenience, then you are appropriately and inevitably damned. The carelessness you exhibit with regard to your own tradition will undo you, and perhaps those around you, fully and painfully across time. This is in keeping with other sentiments and acts of Christ described in the Gospels. Matthew 12:11 states: “And he said unto them, What man shall there be among you, that shall have one sheep, and if it fall into a pit on the sabbath day, will he not lay hold on it, and lift it out?” Luke chapter 6 describes Him healing a man with a withered hand on another Sabbath, asking, “Is it lawful on the sabbath days to do good, or to do evil? to save life, or to destroy it?” (Luke 6:9). This psychologically and conceptually painful juxtaposition of two moral stances (the keeping of the Sabbath versus the injunction to do good) is something else that constantly enrages the Pharisees, and is part of the series of events that eventually leads to Christ’s arrest and Crucifixion. These stories portray the existential dilemma that eternally characterizes human life: it is necessary to conform, to be disciplined, and to follow the rules—to do humbly what others do; but it is also necessary to use judgment, vision, and the truth that guides conscience to tell what is right, when the rules suggest otherwise. It is the ability to manage this combination that truly characterizes the fully developed personality: the true hero. A certain amount of arbitrary rule-ness must be tolerated—or welcomed, depending on your point of view—to keep the world and its inhabitants together. A certain amount of creativity and rebellion must be tolerated—or welcomed, depending on your point of view—to maintain the process of regeneration. Every rule was once a creative act, breaking other rules. Every creative act, genuine in its creativity, is likely to transform itself, with time, into a useful rule. It is the living interaction between social institutions and creative achievement that keeps the world balanced on the narrow line between too much order and too much chaos. This is a terrible conundrum, a true existential burden. We must support and value the past, and we need to do that with an attitude of gratitude and respect. At the same time, however, we must keep our eyes open—we, the visionary living—and repair the ancient mechanisms that stabilize and support us when they falter.

12 A codex is a book composed of sheets of vellum, papyrus, or, most commonly, paper. The term is now generally reserved for manuscripts that have been handwritten, as in the case of the Codex Bezae. The Codex Bezae contains Greek and Latin versions of Acts and most of the four Gospels that are unique in what they additionally include, what they omit, and, often, the style in which they are written. 
Thus, we need to bear the paradox that is involved in simultaneously respecting the walls that keep us safe and allowing in enough of what is new and changing so that our institutions remain alive and healthy. The very world depends for its stability and its dynamism on the subsuming of all our endeavors under the perfection—the sacredness—of that dual ability. Do not carelessly denigrate social institutions or creative achievement.