CHAPTER NINE
The Process of Brainwashing, Psychological Coercion, and Thought Reform
MARGARET THALER SINGER

Leaders of cults and groups using thought-reform processes have taken in and controlled millions of persons to the detriment of their welfare. Sometimes such influence is called coercive persuasion or extraordinary influence, to distinguish it from everyday persuasion by friends, family, and other influences in our lives, including the media and advertising. The key to successful thought reform is to keep the subjects unaware that they are being manipulated and controlled – and especially to keep them unaware that they are being moved along a path of change that will lead them to serve interests that are to their disadvantage. The usual outcome of thought-reform processes is that a person or group gains almost limitless control over the subjects for varying periods of time.

When cultic groups using this level of undue influence are seen in the cold light of day, uninformed observers often cannot grasp how the group worked. They wonder how a rational person would ever get involved. Recently, because of the media attention garnered by the actions of certain groups, the world has become somewhat more aware of thought reform, but most people still don't know how to deal with situations of extraordinary influence.

A number of terms have been used to describe this process, including brainwashing, thought reform, coercive persuasion, mind control, coordinated programs of coercive influence and behavior control, and exploitative persuasion . . . Perhaps the first and last terms convey something of the crux of what I will be describing in this chapter.

When I ask ordinary people what they think brainwashing is, they correctly grasp that it refers to the exploitative manipulation of one person by another. They usually describe a situation in which a person or group has conned others into going along with a plan put in place by the instigator. Conned has a widely understood meaning in our informal conversation and our streets, which is why it is generally difficult to manipulate street-smart kids. They already know to look for a double agenda, calling it a con game, snow job, scam, jiving someone, putting someone on, and many other names.

A certain type of psychological con game is exactly what goes on in a thought-reform environment. A complex set of interlocking factors is put into place, and these factors, either quickly or slowly, depending on the situation and the subject, bring about deep changes in the mind-set and attitudes of the targeted individual. Through the manipulation of psychological and social factors, people's attitudes can indeed be changed, and their thinking and behavior radically altered.

Historical Examples of Brainwashing

. . . In just the last sixty years, the world has seen numerous examples of how easily human conduct can be manipulated under certain circumstances. During the 1930s purge trials in the former Soviet Union, men and women accused of committing crimes against the state were maneuvered into both falsely confessing to and falsely accusing others of these crimes. The world press expressed bewilderment and amazement at the phenomenon but, with few exceptions, soon lapsed into silence. Then in the late 1940s and early 1950s, the world witnessed personnel at Chinese revolutionary universities implement a thought-reform program that changed the beliefs and behaviors of the citizens of the largest nation in the world.
This program, which Mao Tse-tung wrote about as early as the 1920s, was put into place when the communist regime took power in China on October 1, 1949. Chairman Mao had long planned how to change people's political selves – to achieve "ideological remolding," as he called it – through the use of a coordinated program of psychological, social, and political coercion. As a result, millions of Chinese citizens were induced to espouse new philosophies and exhibit new conduct.

The term brainwashing was first introduced into the Western world in 1951, when American foreign correspondent Edward Hunter published a book titled Brainwashing in Red China. Hunter was the first to write about the phenomenon, based on his interviews of both Chinese and non-Chinese coming across the border from China into Hong Kong. His translator explained to him that the communist process of ridding people of the vestiges of their old belief system was called colloquially hsi nao, which literally means "wash brain," or "cleansing the mind."

The 1950s also brought the Korean War. North Korea's intensive indoctrination of United Nations prisoners of war showed the world the extent to which captors would go to win converts to their political cause. The Korean program was based on methods used by the Chinese, combined with other social and psychological influence techniques. Later in the same decade, Cardinal Mindszenty, the head of the Roman Catholic Church in Hungary and a man of tremendous personal forcefulness, strength of convictions, and faith in God, ended up being so manipulated and processed by his Russian captors that he – like the earlier purge trial victims – both falsely confessed and falsely accused his colleagues.

These extremes of social and psychological manipulation of thought and conduct were, and sometimes still are, disregarded by Americans because the events occurred far away and could be dismissed as merely foreign propaganda and political acts. Such reasoning is a variation of the "not me" myth: not in our land could such a thing happen. But later, certain events occurred in California that forced many to see that extremes of influence and manipulation were possible in the United States, too.

In 1969, Charles Manson manipulated a band of middle-class youths into believing his mad version of Helter Skelter. Under his influence and control, his followers carried out multiple vicious murders. Not long after, the Symbionese Liberation Army (SLA), a ragtag revolutionary group, kidnapped newspaper heiress Patricia Hearst and abused her psychologically and otherwise. The SLA used mind manipulations as well as gun-at-the-head methods to coerce Patty into compliance. They manipulated and controlled her behavior to the extent that she appeared with them in a bank robbery and feared returning to society, having been convinced by the SLA that the police and the FBI would shoot her.

This series of events from the 1930s to the present demonstrates that individual autonomy and personal identity are much more fragile than was once commonly believed, and that certain venal types have gotten hold of and perfected techniques of persuasion that are wreaking havoc in our society . . .

Packaged Persuasion

Several years ago, a colleague and I interviewed a young couple at the request of their attorneys. The couple, who had once been good citizens and loving parents, had been accused of a spanking that allegedly led to their son's death.
While they were members of a cult in West Virginia with a female leader, their 23-month-old son allegedly had either hit or pushed the leader's grandchild during play. The parents were ordered to get the child to apologize; otherwise, according to the irate leader, no one would go to heaven. The boy was beaten with a wooden board by his father, with his mother in the room, for more than two and a half hours. The boy's blood pooled in the bruises on his buttocks and legs, and he died. In court, I described how the leader had slowly gained control of the members of her group and how the beating evolved from her teaching and control.

In another case, Ron Luff, a former Navy career petty officer with a series of commendations for excellent conduct and performance, was convinced by his cult leader to follow that leader's orders. These orders were to help the leader kill an Ohio family of five, including three young daughters, dump the bodies into a lime pit in a barn, then go off on a long wilderness trek with the leader and his two dozen followers. Ron Luff was found guilty of aggravated murder and kidnapping and sentenced to 170 years in prison. The cult leader, Jeffrey Lundgren, was sentenced to die in Ohio's electric chair. Both cases are on appeal at the time of this book's writing.

People repeatedly ask me how cult leaders get their followers to do such things as give their wives to a child-molesting cult leader, drop out of medical school to follow a martial arts guru, give several million dollars to a self-appointed messiah who wears a wig and has his favorite women dress like Jezebel, or practice sexual abstinence while following a blatantly promiscuous guru. Because of the great discrepancies between individuals' conduct before cult membership and the behavior exhibited while in the cult, families, friends, and the public wonder how these changes in attitude and behavior are induced.

How cult leaders and other clever operators get people to do their bidding seems arcane and mysterious to most persons, but I find there is nothing esoteric about it at all. There are no secret drugs or potions. It is just words and group pressures, put together in packaged forms. Modern-day manipulators use methods of persuasion employed since the days of the cavemen, but the masterful con artists of today have hit upon a way to put the techniques together in packages that are especially successful. As a result, thought reform, as a form of influence and persuasion, falls on the extreme end of a continuum that also includes education as we typically see it, advertising, propaganda, and indoctrination . . .

There is a mistaken notion that thought reform can only be carried out in confined places and under threat of physical torture or death. But it is important to remember that the brainwashing programs of the forties and fifties were applied not only to military or civilian prisoners of war but also to the general population. In all our research, I and others who study these programs emphasize over and over that imprisonment and overt violence are not necessary and are actually counterproductive when influencing people to change their attitudes and behaviors. If one really wants to influence others, various coordinated soft-sell programs are cheaper, less obvious, and highly effective. The old maxim "Honey gathers more flies than vinegar" remains true today.
Attacking the Self

There is, however, an important distinction to be made between the version of thought reform prevalent in the 1940s and 1950s and the version used by a number of contemporary groups, including cults, large group awareness training programs, and assorted other groups. These latter-day efforts have built upon the age-old influence techniques to perfect amazingly successful programs of persuasion and change. What's new – and crucial – is that these programs change attitudes by attacking essential aspects of a person's sense of self, unlike the earlier brainwashing programs that primarily confronted a person's political beliefs.

Today's programs are designed to destabilize an individual's sense of self by undermining his or her basic consciousness, reality awareness, beliefs and worldview, emotional control, and defense mechanisms. This attack on a person's central stability, or self-concept, and on a person's capacity for self-evaluation is the principal technique that makes the newer programs work. Moreover, this attack is carried out under a variety of guises and conditions – and rarely does it include forced confinement or direct physical coercion. Rather, it is a subtle and powerful psychological process of destabilization and induced dependency.

Thankfully, these programs do not change people permanently. Nor are they 100 percent effective. Cults are not all alike, thought-reform programs are not all alike, and not everyone exposed to specific intense influence processes succumbs and follows the group. Some cults try to defend themselves by saying, in effect, "See, not everyone joins or stays, so we must not be using brainwashing techniques." Many recruits do succumb, however, and the better organized the influence processes used, the more people will succumb.

What is of concern, then, is that certain groups and training programs that have emerged in the last half-century represent well-organized, highly orchestrated influence efforts that are widely successful in recruiting and converting people under certain conditions for certain ends. My interest has been in how these processes work, in the psychological and social techniques that produce these behavioral and attitudinal changes. I am less interested in whether the content of the group centers around religion, psychology, self-improvement, politics, lifestyle, or flying saucers. I am more interested in the widespread use of brainwashing techniques by crooks, swindlers, psychopaths, and egomaniacs of every sort.

How Thought Reform Works

Brainwashing is not experienced as a fever or a pain might be; it is an invisible social adaptation. When you are the subject of it, you are not aware of the intent of the influence processes that are going on, and especially, you are not aware of the changes taking place within you. In his memoirs, Cardinal Mindszenty wrote, "Without knowing what had happened to me, I had become a different person." And when asked about being brainwashed, Patty Hearst said, "The strangest part of all this, however, as the SLA delighted in informing me later, was that they themselves were surprised at how docile and trusting I had become . . . It was also true, I must admit, that the thought of escaping from them later simply never entered my mind. I had become convinced that there was no possibility of escape . . . I suppose I could have walked out of the apartment and away from it all, but I didn't.
It simply never occurred to me."

A thought-reform program is not a one-shot event but a gradual process of breaking down and transformation. It can be likened to gaining weight, a few ounces, a half pound, a pound at a time. Before long, without even noticing the initial changes, we are confronted with a new physique. So, too, with brainwashing. A twist here, a tweak there – and there it is: a new psychic attitude, a new mental outlook.

These systematic manipulations of social and psychological influences under particular conditions are called programs because the means by which change is brought about is coordinated. And it is because the changes cause the learning and adoption of a certain set of attitudes, usually accompanied by a certain set of behaviors, that the effort and the result are called thought reform. Thus, thought reform is a concerted effort to change a person's way of looking at the world, which will change his or her behavior. It is distinguished from other forms of social learning by the conditions under which it is conducted and by the techniques of environmental and interpersonal manipulation that are meant to suppress certain behavior and to elicit and train other behavior. And it does not consist of only one program – there are many ways and methods to accomplish it.

The tactics of a thought-reform program are organized to
• Destabilize a person's sense of self.
• Get the person to drastically reinterpret his or her life's history and radically alter his or her worldview and accept a new version of reality and causality.
• Develop in the person a dependence on the organization, and thereby turn the person into a deployable agent of the organization.

Thought reform can be profitably looked at in at least three ways (summarized in Table [9.1]). Robert Lifton has recognized eight themes of thought reform, I have identified six conditions, and Edgar Schein has named three stages. The themes and stages outlined by Lifton and Schein focus on the sequence of the process, while the circumstances I have outlined suggest the conditions needed in the surrounding environment if the process is to work.

Table [9.1] Criteria for thought reform

Conditions (Singer)                                Themes (Lifton)               Stages (Schein)
1 Keep the person unaware of what is going on                                    1 Unfreezing.
  and the changes taking place.
2 Control the person's time and, if possible,      1 Milieu control.
  physical environment.                            2 Loading the language.
3 Create a sense of powerlessness, covert fear,    3 Demand for purity.
  and dependency.                                  4 Confession.
4 Suppress much of the person's old behavior
  and attitudes.
5 Instill new behavior and attitudes.              5 Mystical manipulation.      2 Changing.
                                                   6 Doctrine over person.
6 Put forth a closed system of logic; allow no     7 Sacred science.
  real input or criticism.                         8 Dispensing of existence.    3 Refreezing.

Singer's six conditions

The following conditions create the atmosphere needed to put thought-reform processes into place. The degree to which these conditions are present increases the level of restrictiveness enforced by the cult and the overall effectiveness of the program.

1 Keep the person unaware that there is an agenda to control or change the person.
2 Control time and physical environment (contacts, information).
3 Create a sense of powerlessness, fear, and dependency.
4 Suppress old behavior and attitudes.
5 Instill new behavior and attitudes.
6 Put forth a closed system of logic.
The trick is to proceed with the thought-reform process one step at a time so that the person does not notice that she or he is changing. I will explain more fully how each step works.

(1) Keep the person unaware of what is going on and how she or he is being changed a step at a time. Imagine you are the person being influenced. You find yourself in an environment to which you are forced to adapt in a series of steps, each sufficiently minor so that you don't notice the changes in yourself and do not become aware of the goals of the program until late in the process (if ever). You are kept unaware of the orchestration of psychological and social forces meant to change your thinking and your behavior. The cult leaders make it seem as though what is going on is normal, that everything is the way it's supposed to be. This atmosphere is reinforced by peer pressure and peer-modeled behavior, so that you adapt to the environment without even realizing it.

For example, a young man was invited to a lecture. When he arrived, he noticed many pairs of shoes lined against the wall and people in their stocking feet. A woman nodded at his shoes, so he took them off and set them with the others. Everyone was speaking in a soft voice, so he lowered his voice. The evening proceeded with some ritual ceremonies, meditation, and a lecture by a robed leader. Everything was paced slowly and led by this man, with the rest quietly watching and listening. The young man also sat docilely, even though he wanted to ask questions. He conformed to what the group was doing. In this case, however, at the end of the evening when he was asked to come back to another lecture, he said, "Thanks, but no thanks," at which point two men quickly ushered him out a back door, so others wouldn't hear his displeasure.

The process of keeping people unaware is key to a cult's double agenda: the leader slowly takes you through a series of events that on the surface look like one agenda, while on another level, the real agenda is to get you, the recruit or member, to obey and to give up your autonomy, your past affiliations, and your belief systems. The existence of the double agenda makes this process one of noninformed consent.

(2) Control the person's social and/or physical environment; especially control the person's time. Cults don't need to have you move into the commune, farm, headquarters, or ashram and live within the cult environment twenty-four hours a day in order to have control over you. They can control you just as effectively by having you go to work every day with instructions that when not working – on your lunch hour, for example – you must do continuous mind-occupying chanting or some other cult-related activity. Then, after work, you must put all your time in with the organization.

(3) Systematically create a sense of powerlessness in the person. Cults create this sense of powerlessness by stripping you of your support system and your ability to act independently. Former friends and kinship networks are taken away. You, the recruit or follower, are isolated from your ordinary environments and sometimes removed to remote locations. Another way cults create a sense of powerlessness is by stripping people of their main occupation and sources of wealth. It is to achieve this condition that so many cultic organizations have members drop out of school, quit their jobs or give up their careers, and turn over their property, inheritances, and other resources to the organization.
It is one of the steps in creating a sense of dependency on the organization and a continuing sense of individual powerlessness. Once stripped of your usual support network and, in some cases, means of income, your confidence in your own perceptions erodes. As your sense of powerlessness increases, your good judgment and understanding of the world are diminished. At the same time as you are destabilized in relation to your ordinary reality and worldview, the cult confronts you with a new, unanimously (group-)approved worldview. As the group attacks your previous worldview, causing you distress and inner confusion, you are not allowed to speak about this confusion, nor can you object to it, because leadership constantly suppresses questions and counters any resistance. Through this process, your inner confidence is eroded. Moreover, the effectiveness of this approach can be speeded up if you are physically tired, which is why cult leaders see to it that followers are kept overly busy.

(4) Manipulate a system of rewards, punishments, and experiences in such a way as to inhibit behavior that reflects the person's former social identity. The expression of your beliefs, values, activities, and characteristic demeanor prior to contact with the group is suppressed, and you are manipulated into taking on a social identity preferred by the leadership. Old beliefs and old patterns of behavior are defined as irrelevant, if not evil. You quickly learn that the leadership wants old ideas and old patterns eliminated, so you suppress them. For example, the public admission of sexual feelings in certain groups is met with overt disapproval by peers and superiors, accompanied by a directive to take a cold shower. An individual can avoid public rebuke by no longer speaking about the entire topic of sexuality, warmth, or interest in another human being. The vacuum left is then filled with the group's ways of thinking and doing.

(5) Manipulate a system of rewards, punishments, and experiences in order to promote learning of the group's ideology or belief system and group-approved behaviors. Once immersed in an environment in which you are totally dependent on the rewards given by those who control the setting, you can be confronted with massive demands to learn varying amounts of new information and behaviors. You are rewarded for proper performance with social and sometimes material reinforcement; if slow to learn or noncompliant, you are threatened with shunning, banning, and punishment, which includes loss of esteem from others, loss of privileges, loss of status, and inner anxiety and guilt. In certain groups, physical punishment is meted out.

The more complicated and filled with contradictions the new system is and the more difficult it is to learn, the more effective the conversion process will be. For example, a recruit may constantly fail at mastering a complicated theology but can succeed and be rewarded for going out to solicit funds. In one cultic organization, the leadership introduces the new recruits to a complicated dodge-ball game. Only long-term members know the complex and ever-changing rules, and they end up literally leading and pushing the new recruits through the game. This, then, is followed by a very simple exercise in which members get together to "share." Older members stand up and share (that is, confess) some past bad deed.
The new members, who failed so badly at the bewildering dodge-ball game, now can feel capable of succeeding by simply getting up and confessing something about their past that was, by group standards, bad. Since esteem and affection from peers are so important to new recruits, any negative response is very meaningful. Approval comes from having your behaviors and thought patterns conform to the models put forth by the group. Your relationship with peers is threatened whenever you fail to learn or display new behaviors. Over time, an easy solution to the insecurity generated by the difficulties of learning the new system is to inhibit any display of doubt and, even if you don't understand the content, to merely acquiesce, affirm, and act as if you do understand and accept the new philosophy or content.

(6) Put forth a closed system of logic and an authoritarian structure that permits no feedback and refuses to be modified except by leadership approval or executive order. If you criticize or complain, the leader or peers allege that you are defective, not the organization. In this closed system of logic, you are not allowed to question or doubt a tenet or rule or to call attention to factual information that suggests some internal contradiction within the belief system or a contradiction with what you've been told. If you do make such observations, they may be turned around and argued to mean the opposite of what you intended. You are made to feel that you are wrong. In cultic groups, the individual member is always wrong, and the system is always right.

For example, one cult member complained privately to his immediate leadership that he doubted he'd be able to kill his father if so instructed by the cult, even though that act was to signify true adherence to the cult's system. In response, he was told he needed more courses to overcome his obvious weakness because by now he should be more committed to the group. In another case, a woman objected to her fund-raising team leader that it would be lying to people to say cult members were collecting money for a children's home when they knew the money went to the leader's headquarters. She was told, "That's evidence of your degraded mentality. You are restoring to our leader what's rightfully his, that's all!" Another woman who wanted to go home to see her dying grandmother was refused her request. "We're strengthening you here," she was told. "This request is a sign of your selfishness. We're your new family, and we're right to not let you go."

The goal of all this is your conversion or remolding. As you learn to modify your former behaviors in order to be accepted in this closed and controlled environment, you change. You affirm that you accept and understand the ideology by beginning to talk in the simple catchphrases particular to the group. This "communication" has no foundation since, in reality, you have little understanding of the system beyond the catchphrases. But once you begin to express your seeming verbal acceptance of the group's ideology, then that ideology becomes the rule book for the subsequent direction and evaluation of your behavior. Also, using the new language fosters your separation from your old conscience and belief system. Your new language allows you to justify activities that are clearly not in your interests, perhaps not even in the interests of humankind.
Precisely those behaviors that lead to criticism from the outside world because they violate the norms and rules of the society as a whole are rationalized within the cult community through use of this new terminology, this new language. For example, "heavenly deception" and "transcendental trickery" (terms used by two of the large cultic groups) are not called what they are – lying and deceptive fund-raising. Nor is the rule "do not talk to the systemites" called what it is – a way to isolate members from the rest of the world.

Lifton's eight themes

Paralleling Singer's six conditions are the eight psychological themes that psychiatrist Robert Lifton has identified as central to totalistic environments, including the communist Chinese and Korean programs of the 1950s and today's cults. Cults invoke these themes for the purpose of promoting behavioral and attitudinal changes.

(1) Milieu control. This is total control of communication in the group. In many groups, there is a "no gossip" or "no nattering" rule that keeps people from expressing their doubts or misgivings about what is going on. This rule is usually rationalized by saying that gossip will tear apart the fabric of the group or destroy unity, when in reality the rule is a mechanism to keep members from communicating anything other than positive endorsements. Members are taught to report those who break the rule, a practice that also keeps members isolated from each other and increases dependence on the leadership. Milieu control also often involves discouraging members from contacting relatives or friends outside the group and from reading anything not approved by the organization. They are sometimes told not to believe anything they see or hear reported by the media. One left-wing political cult, for example, maintains that the Berlin Wall is still standing and that the "bourgeois capitalist" press wants people to think otherwise in order to discredit communism.

(2) Loading the language. As members continue to formulate their ideas in the group's jargon, this language serves the purpose of constricting members' thinking and shutting down critical thinking abilities. At first, translating from their native tongue into "groupspeak" forces members to censor, edit, and slow down spontaneous bursts of criticism or oppositional ideas. That helps them to cut off and contain negative or resistive feelings. Eventually, speaking in cult jargon is second nature, and talking with outsiders becomes energy-consuming and awkward. Soon enough, members find it most comfortable to talk only among themselves in the new vocabulary. To reinforce this, all kinds of derogatory names are given to outsiders: wogs, systemites, reactionaries, unclean, of Satan.

One large international group, for example, has dictionaries for members to use. In one of these dictionaries, criticism is defined as "justification for having done an overt." Then one looks up overt and the dictionary states: "overt act: an overt act is not just injuring someone or something; an overt act is an act of omission or commission which does the least good for the least number of dynamics or the most harm to the greatest number of dynamics." Then the definition of dynamics says: "There could be said to be eight urges in life . . ." And so, one can search from term to term trying to learn this new language.
One researcher noted that the group's founder has stated that "new followers or potential converts should not be exposed to [the language and cosmology of the group] at too early a stage. 'Talking whole track to raw meat' is frowned upon."

When cults use such internal meanings, how is an outsider to know that the devil disguise, just flesh relationships, and polluting are terms for parents? That an edu is a lecture by the cult leader or that a mislocation is a mistake? A former cult member comments, "I was always being told, 'You are being too horizontal.'" Translated, this meant she was being reprimanded for listening to and being sympathetic to peers.

A dwindling group in Seattle, the Love Family, had a "rite of breathing." This sounds ordinary, but in fact for some members it turned out to be a lethal euphemism. The leader, a former California salesman, initiated this rite, in which members sat in a circle, passing around and sniffing a plastic bag containing a rag soaked with toluene, an industrial solvent. The group called the chemical "tell-u-all."

(3) Demand for purity. An us-versus-them orientation is promoted by the all-or-nothing belief system of the group: we are right; they (outsiders, nonmembers) are wrong, evil, unenlightened, and so forth. Each idea or act is good or bad, pure or evil. Recruits gradually take in, or internalize, the critical, shaming essence of the cult environment, which builds up lots of guilt and shame. Most groups put forth that there is only one way to think, respond, or act in any given situation. There is no in between, and members are expected to judge themselves and others by this all-or-nothing standard. Anything can be done in the name of this purity; it is the justification for the group's internal moral and ethical code. In many groups, it is literally taught that the end justifies the means – and because the end (that is, the group) is pure, the means are simply tools to reach purity. If you are a recruit, this ubiquitous guilt and shame creates and magnifies your dependence on the group. The group says in essence, "We love you because you are transforming yourself," which means that any moment you are not transforming yourself, you are slipping back. Thus you easily feel inadequate, as though you need "fixing" all the time, just as the outside world is being denounced all the time.

(4) Confession. Confession is used to lead members to reveal past and present behavior, contacts with others, and undesirable feelings, seemingly in order to unburden themselves and become free. However, whatever you reveal is subsequently used to further mold you and to make you feel close to the group and estranged from nonmembers. (I sometimes call this technique purge and merge.) The information gained about you can be used against you to make you feel more guilty, powerless, fearful, and ultimately in need of the cult and the leader's goodness. And it can be used to get you to rewrite your personal history so as to denigrate your past life, making it seem illogical for you to want to return to that former life, family, and friends.

Each group will have its own confession ritual, which may be carried out either one-on-one with a person in leadership or in group sessions. Members may also write reports on themselves and others. Through the confession process and by instruction in the group's teachings, members learn that everything about their former lives, including friends, family, and nonmembers, is wrong and to be avoided.
Outsiders will put you at risk of not attaining the purported goal: they will lessen your psychological awareness, hinder the group's political advancement, obstruct your path toward ultimate knowledge, or allow you to become stuck in your past life and incorrect thinking.

(5) Mystical manipulation. The group manipulates members to think that their new feelings and behavior have arisen spontaneously in this new atmosphere. The leader implies that this is a chosen, select group with a higher purpose. Members become adept at watching to see what particular behavior is wanted, learning to be sensitive to all kinds of cues by which they are to judge and alter their own behavior. Cult leaders tell their followers, "You have chosen to be here. No one has told you to come here. No one has influenced you," when in fact the followers are in a situation they can't leave owing to social pressure and their fear. Thus they come to believe that they are actually choosing this life. If outsiders hint that the devotees have been brainwashed or tricked, the members say, "Oh, no, I chose voluntarily." Cults thrive on this myth of voluntarism, insisting time and again that no member is being held against his or her will.

(6) Doctrine over person. As members retrospectively alter their accounts of personal history, having been instructed either to rewrite that history or simply to ignore it, they are simultaneously taught to interpret reality through the group concepts and to ignore their own experiences and feelings as they occur. In many groups, from the days of early membership on, you will be told to stop paying attention to your own perceptions, since you are "uninstructed," and simply to go along with and accept the "instructed" view, the party line. The rewriting of personal history more often than not becomes a re-creating, so that you learn to fit yourself into the group's interpretation of life. For example, one young man recently out of a cult reported to me that he was "a drug addict, violent, and irresponsible." It soon became clear from our discussions that none of this was true. His drug addiction amounted to three puffs of marijuana a number of years ago; his violence stemmed from his participation on a high school wrestling team; and his irresponsibility was based on his not having saved any money from his very small allowance as a teenager. However, the group he had been in had convinced him that these things represented terrible flaws.

(7) Sacred science. The leader's wisdom is given a patina of science, adding a credible layer to his central philosophical, psychological, or political notion. He can then profess that the group's philosophy should be applied to all humankind and that anyone who disagrees or has alternative ideas is not only immoral and irreverent but also unscientific. Many leaders, for example, inflate their curricula vitae to make it look as though they are connected to higher powers, respected historical leaders, and so forth. Many a cult leader has said that he follows in the tradition of the greatest – Sigmund Freud, Karl Marx, the Buddha, Martin Luther, or Jesus Christ.

(8) Dispensing of existence. The cult's totalistic environment clearly emphasizes that members are part of an elitist movement and are the select of the world. Nonmembers are unworthy, lesser beings.
Most cults teach their members that "we are the best and only one," saying, in one way or another, "We are the governors of enlightenment and all outsiders are lower beings." This kind of thinking lays the foundation for dampening the good consciences members brought in with them and allows members, as agents or representatives of a "superior" group, to manipulate nonmembers for the good of the group. Besides reinforcing the us-versus-them mentality, this thinking means that your whole existence centers on being in the group. If you leave, you join nothingness. This is the final step in creating members' dependence on the group.

Numerous former cult members report that, when they look back at what they did or would have done at the command of the group, they are appalled and stricken. Many have said they would have killed their own parents if so ordered. Hundreds have told me of countless deceptions and lies, such as shortchanging donors on the street, using ruses to keep members from leaving, and urging persons who could ill afford it to run their credit cards up to their limit in order to sign up for further courses.

Schein's three stages

Next, we consider the stages people go through as their attitudes are changed by the group environment and the thought-reform processes. These were labeled by psychologist Edgar Schein as the stages of "unfreezing, changing, and refreezing."

(1) Unfreezing. In this first stage, your past attitudes and choices – your whole sense of self and notion of how the world works – are destabilized by group lectures, personal counseling, rewards, punishments, and other exchanges in the group. This destabilization is designed to produce what psychologists call an identity crisis. While you are looking back at your own world and behavior and values (that is, unfreezing them), you are simultaneously bombarded with the new system, which implies that you have been wrong in the past. This process makes you uncertain about what is right, what to do, and which choices to make. As described earlier, successful behavioral change programs are designed to upset you to the point that your self-confidence is undermined. This makes you more open to suggestion and also more dependent on the environment for cues about "right thinking" and "right conduct." Your resistance to the new ideas lessens when you feel yourself teetering on an edge with massive anxiety about the right choices in life on the one side, and the group ideas that offer the way out of this distress on the other side.

Many groups use a "hot seat" technique or some other form of criticism to attain the goal of undercutting, destabilizing, and diminishing. For example, "Harry" had been in the Army, was approaching his late twenties, and was always very sure of himself. But when he joined a Bible cult, the leaders said he wasn't learning fast enough to speak in tongues. He was told that he was resistant, that this was a sign of his evil past. He was told this over and over, no matter how hard he tried. Before long, Harry seemed to lose confidence in himself, even in his memories of his Army successes. His own attitude about himself as well as his actual behavior was unfreezing.

(2) Changing. During this second stage, you sense that the solutions offered by the group provide a path to follow. You feel that anxiety, uncertainty, and self-doubt can be reduced by adopting the concepts put forth by the group or leader.
Additionally, you observe the behavior of the longer-term members, and you begin to emulate their ways. As social psychology experiments and observations have found for decades, once a person makes an open commitment before others to an idea, his or her subsequent behavior generally supports and reinforces the stated commitment. That is, if you say in front of others that you are making a commitment to be "pure," then you will feel pressured to follow what others define as the path of purity.

If you spend enough time in any environment, you will develop a personal history of experience and interaction in it. When that environment is constructed and managed in a certain way, then the experiences, interactions, and peer relations will be consistent with whatever public identity is fostered by the environment and will incorporate the values and opinions promulgated in that environment. Now, when you engage in cooperative activity with peers in an environment that you do not realize is artificially constructed, you do not perceive your interactions to be coerced. And when you are encouraged but not forced to make verbal claims to "truly understanding the ideology and having been transformed," these interactions with your peers will tend to lead you to conclude that you hold beliefs consistent with your actions. In other words, you will think that you came upon the beliefs and behaviors yourself.

Peer pressure is very important to this process:
• If you say it in front of others, you'll do it.
• Once you do it, you'll think it.
• Once you think it (in an environment you do not perceive to be coercive), you'll believe that you thought it yourself.

(3) Refreezing. In this final phase, the group reinforces you in the desired behavior with social and psychological rewards, and punishes unwanted attitudes and behaviors with harsh criticism, group disapproval, social ostracism, and loss of status. Most of the modern-day thought-reform groups seek to produce smiling, nonresistant, hardworking persons who do not complain about group practices and do not question the authority of the guru, leader, or trainer. The more you display the group-approved attitudes and behavior, the more your compliance is interpreted by the leadership as showing that you now know that your life before you belonged to the group was wrong and that your new life is "the way" . . .

The degree to which a group or situation is structured according to these conditions, themes, and stages will determine the degree to which it is manipulative. Not all cults or groups that use thought-reform processes implement their mind-bending techniques in the same way or to the same extent. The implementation varies both within individual groups and across groups. Often, the peripheral members will have no awareness of the kinds of manipulations that go on in the upper or inner levels of a particular group or teaching. Thought reform is subtle, fluid, and insidious – and sometimes hard to identify, particularly for the novice or the overly idealistic. But when it is present, it has powerful repercussions.

Producing a New Identity

As part of the intense influence and change process in many cults, people take on a new social identity, which may or may not be obvious to an outsider. When groups refer to this new identity, they speak of members who are transformed, reborn, enlightened, empowered, rebirthed, or cleared.
The group-approved behavior is reinforced and reinterpreted as demonstrating the emergence of "the new person." Members are expected to display this new social identity. However, the vast majority of those who leave such groups drop the cult content, and the cult behavior and attitudes, and painstakingly take up where they left off prior to joining. Those who had been subjected to thought-reform processes in the Far East, for example, gradually dropped the adopted attitudes and behaviors and returned to their former selves as soon as they were away from the environment.

We see from years of research with prisoners of war, hostages, battered wives, former cult members, and other recipients of intense influence that changes made under this influence are not stable and not permanent. The beliefs a person may adopt about the world, about a particular philosophy, and even about himself or herself are reversible when the person is out of the environment that induced those beliefs.

We might ask ourselves – and surely many former cult members have – how a person can display reprehensible conduct under some conditions, then turn around and resume normal activities under other conditions. The phenomenon has been variously described as doubling or as the formation of a pseudopersonality (or pseudoidentity), superimposed identity, a cult self, or a cult personality. What is important about these labels is that they call attention to an important psychological and social phenomenon that needs to be studied more carefully – namely, that ordinary persons, with their own ideas and attitudes, can be rapidly turned around in their social identity but later can recover their old selves and move forward.

By this, I am not saying that people in cults or groups that use thought-reform processes are just faking it by role-playing, pretending, or acting. Anyone who has met a former friend who's been transformed into a recruiting zealot for a New Age transformational program, for example, knows that something more profound than role-playing is operating as that old friend defends her or his new self and new group, speaking single-mindedly, spouting intense, firmly stated dogma. This is not play-acting. It is far more instinctive and experienced as real.

Doubling, or the formation of a pseudopersonality, has become a key issue. It is a factor that ultimately allows cult members to leave their groups and permits us to understand why exit counseling works as a means of reawakening a person who has been exposed to thought-reform processes. The central fact is this: the social identity learned while a person is in a thought-reform system fades, much as a summer tan does when a person is no longer at the beach. The process is far more complicated than this analogy, of course, but I want to emphasize that cult thinking and behaviors are adaptive and not stable. It is the cult environment that produces and keeps in place the cult identity. Some persons stay forever in the group, but the vast majority leave at some point, either walking away or being lured out by family and friends.

An understanding of thought-reform phenomena is vital to learning more about the role that group social support or pressure plays for all of us. It is important not only for families with relatives in cultic groups but also for ex-members wondering if there are psychological and social theories to explain what happened to them, and for everyone who wants to learn something about how we all operate.