
Coercive Persuasion and Attitude Change

Encyclopedia of Sociology Volume 1, Macmillan Publishing Company, New York

By Richard J. Ofshe, Ph.D.

Coercive persuasion and thought reform are alternate names for programs of social influence capable of producing substantial behavior and attitude change through the use of coercive tactics, persuasion, and/or interpersonal and group-based influence manipulations (Schein 1961; Lifton 1961). Such programs have also been labeled “brainwashing” (Hunter 1951), a term more often used in the media than in scientific literature. However identified, these programs are distinguishable from other elaborate attempts to influence behavior and attitudes, to socialize, and to accomplish social control. Their distinguishing features are their totalistic qualities (Lifton 1961), the types of influence procedures they employ, and the organization of these procedures into three distinctive subphases of the overall process (Schein 1961; Ofshe and Singer 1986). The key factors that distinguish coercive persuasion from other training and socialization schemes are:

  1. The reliance on intense interpersonal and psychological attack to destabilize an individual’s sense of self to promote compliance
  2. The use of an organized peer group
  3. Applying interpersonal pressure to promote conformity
  4. The manipulation of the totality of the person’s social environment to stabilize behavior once modified

Thought-reform programs have been employed in attempts to control and indoctrinate individuals, societal groups (e.g., intellectuals), and even entire populations. Systems intended to accomplish these goals can vary considerably in their construction. Even the first systems studied under the label “thought reform” ranged from those in which confinement and physical assault were employed (Schein 1956; Lifton 1954; Lifton 1961, pp. 19-85) to applications that were carried out under nonconfined conditions, in which nonphysical coercion substituted for assault (Lifton 1961, pp. 242-273; Schein 1961, pp. 290-298). The individuals to whom these influence programs were applied were in some cases unwilling subjects (prisoner populations) and in other cases volunteers who sought to participate in what they believed might be a career-beneficial, educational experience (Lifton 1961, p. 248).

Significant differences existed between the social environments and the control mechanisms employed in the two types of programs initially studied. Their similarities, however, are of more importance in understanding their ability to influence behavior and beliefs than are their differences. They shared the utilization of coercive persuasion’s key effective-influence mechanisms: a focused attack on the stability of a person’s sense of self; reliance on peer group interaction; the development of interpersonal bonds between targets and their controllers and peers; and an ability to control communication among participants. Edgar Schein captured the essential similarity between the types of programs in his definition of the coercive-persuasion phenomenon. Schein noted that even for prisoners, what happened was a subjection to “unusually intense and prolonged persuasion” that they could not avoid; thus, “they were coerced into allowing themselves to be persuaded” (Schein 1961, p. 18).

Programs of both types (confined/assaultive and nonconfined/nonassaultive) cause a range of cognitive and behavioral responses. The reported cognitive responses vary from apparently rare instances, classifiable as internalized belief change (enduring change), to a frequently observed transient alteration in beliefs that appears to be situationally adaptive and, finally, to reactions of nothing less than firm intellectual resistance and hostility (Lifton 1961, pp. 117-151, 399-415; Schein 1961, pp. 157-166).

The phrase situationally adaptive belief change refers to attitude change that is not stable and is environment dependent. This type of response to the influence pressures of coercive-persuasion programs is perhaps the most surprising of the responses that have been observed. The combination of psychological assault on the self, interpersonal pressure, and the social organization of the environment creates a situation that can only be coped with by adapting and acting so as to present oneself to others in terms of the ideology supported in the environment (see below for discussion). Eliciting the desired verbal and interactive behavior sets up conditions likely to stimulate the development of attitudes that are consistent with, and that function to rationalize, the new behavior in which the individual is engaging. Models of attitude change, such as the theory of Cognitive Dissonance (Festinger 1957) or Self-Perception Theory (Bem 1972), explain the tendency for consistent attitudes to develop as a consequence of behavior.

The surprising aspect of the situationally adaptive response is that the attitudes that develop are unstable. They tend to change dramatically once the person is removed from an environment that has totalistic properties and is organized to support the adaptive attitudes. Once removed from such an environment, the person is able to interact with others who permit and encourage the expression of criticisms and doubts, which were previously stifled because of the normative rules of the reform environment (Schein 1961, p. 163; Lifton 1961, pp. 87-116, 399-415; Ofshe and Singer 1986). This pattern of change, first in one direction and then the other, dramatically highlights the profound importance of social support in the explanation of attitude change and stability. This relationship has for decades been one of the principal interests in the field of social psychology.

Statements supportive of the proffered ideology that indicate adaptive attitude change during the period of the target’s involvement in the reform environment and immediately following separation should not be taken as mere playacting in reaction to necessity. Targets tend to become genuinely involved in the interaction. The reform experience focuses on genuine vulnerabilities as the method for undermining self-concept: manipulating genuine feelings of guilt about past conduct; inducing the target to make public denunciations of his or her prior life as unworthy; and carrying this forward through interaction with peers for whom the target develops strong bonds. Involvement developed in these ways prevents the target from maintaining either psychological distance or emotional independence from the experience.

The reaction pattern of persons who display adaptive attitude-change responses is not one of an immediate and easy rejection of the proffered ideology. This response would be expected if they had been faking their reactions as a conscious strategy to defend against the pressures to which they were exposed. Rather, they appear to be conflicted about the sentiments they developed and about their reevaluation of these sentiments. This response has been observed in persons reformed under both confined/assaultive and nonconfined/nonassaultive reform conditions (Schein 1961, pp. 163-165; Lifton 1961, pp. 86-116, 400-401).

Self-concept and belief-related attitude change in response to closely controlled social environments have been observed in other organizational settings that, like reform programs, can be classified as total institutions (Goffman 1957). Thought-reform reactions also appear to be related to, but are far more extreme than, responses to the typically less-identity-assaultive and less-totalistic socialization programs carried out by organizations with central commitments to specifiable ideologies and which undertake the training of social roles (e.g., military academies and religious-indoctrination settings) (Dornbusch 1955; Hulme 1956).

The relatively rare instances in which belief changes are internalized and endure have been analyzed as attributable to the degree to which the acquired belief system and imposed peer relations function fully to resolve the identity crisis that is routinely precipitated during the first phase of the reform process (Schein 1961, p. 164; Lifton 1961, pp. 131-132, 400). Whatever the explanation for why some persons internalize the proffered ideology in response to the reform procedures, this extreme reaction should be recognized as both atypical and probably attributable to an interaction between long-standing personality traits and the mechanisms of influence utilized during the reform process.

Much of the attention to reform programs was stimulated because it was suspected that a predictable and highly effective method for profoundly changing beliefs had been designed, implemented, and was in operation. These suspicions are not supported by fact. Programs identified as thought reforming are not very effective at actually changing people’s beliefs in any fashion that endures apart from an elaborate supporting social context. Evaluated only on the criterion of their ability genuinely to change beliefs, the programs have to be judged abject failures and massive wastes of effort.

The programs are, however, impressive in their ability to prepare targets for integration into and long-term participation in the organizations that operate them. Rather than assuming that individual belief change is the major goal of these programs, it is perhaps more productive to view the programs as elaborate role-training regimes: that is, as resocialization programs in which targets are prepared to conduct themselves in a fashion appropriate for the social roles they are expected to occupy following conclusion of the training process.

If identified as training programs, such programs clearly have the goal of reshaping behavior, and they are organized around issues of social control important to the organizations that operate them. Their objectives then appear to be behavioral training of the target, resulting in an ability to present self, values, aspirations, and past history in a style appropriate to the ideology of the controlling organization; an ability to reason in terms of the ideology; and a willingness to accept direction from those in authority with minimum apparent resistance. Belief changes that follow from successfully coercing or inducing the person to behave in the prescribed manner can be thought of as by-products of the training experience. As attitude-change models would predict, they arise “naturally” as a result of efforts to reshape behavior (Festinger 1957; Bem 1972).

The tactical dimension most clearly distinguishing reform processes from other sorts of training programs is the reliance on psychological coercion: procedures that generate pressure to comply as a means of escaping a punishing experience (e.g., public humiliation, sleep deprivation, guilt manipulation, etc.). Coercion differs from other influencing factors also present in thought reform, such as content-based persuasive attempts (e.g., presentation of new information, reference to authorities, etc.) or reliance on influence variables operative in all interaction (status relations, demeanor, normal assertiveness differentials, etc.). Coercion is principally utilized to gain behavioral compliance at key points and to ensure participation in activities likely to have influencing effects; that is, to engage the person in the role training activities and in procedures likely to lead to strong emotional responses, to cognitive confusion, or to attributions to self as the source of beliefs promoted during the process.

Robert Lifton labeled the extraordinarily high degree of social control characteristic of organizations that operate reform programs as their totalistic quality (Lifton 1961). This concept refers to the mobilization of the entirety of the person’s social, and often physical, environment in support of the manipulative effort. Lifton identified eight themes or properties of reform environments that contribute to their totalistic quality:

  1. Control of communication
  2. Emotional and behavioral manipulation
  3. Demands for absolute conformity to behavior prescriptions derived from the ideology
  4. Obsessive demands for confession
  5. Agreement that the ideology is faultless
  6. Manipulation of language in which clichés substitute for analytic thought
  7. Reinterpretation of human experience and emotion in terms of doctrine
  8. Classification of those not sharing the ideology as inferior and not worthy of respect

(Lifton 1961, pp. 419-437, 1987).

Schein’s analysis of the behavioral sequence underlying coercive persuasion separated the process into three subphases: unfreezing, change, and refreezing (Schein 1961, pp. 111-139). Phases differ in their principal goals and their admixtures of persuasive, influencing, and coercive tactics. Although others have described the process differently, their analyses are not inconsistent with Schein’s three-phase breakdown (Lifton 1961; Farber, Harlow, and West 1956; Meerloo 1956; Sargant 1957; Ofshe and Singer 1986). Although Schein’s terminology is adopted here, the descriptions of phase activities have been broadened to reflect later research.

Unfreezing is the first step in eliciting behavior and developing a belief system that facilitates the long-term management of a person. It consists of attempting to undercut a person’s psychological basis for resisting demands for behavioral compliance with the routines and rituals of the reform program. The goals of unfreezing are to destabilize a person’s sense of identity (i.e., to precipitate an identity crisis), to diminish confidence in prior social judgments, and to foster a sense of powerlessness, if not hopelessness. Successful destabilization induces a negative shift in global self-evaluations and increases uncertainty about one’s values and position in society. It thereby reduces resistance to the new demands for compliance while increasing suggestibility.

Destabilization of identity is accomplished by bringing into play varying sets of manipulative techniques. The first programs to be studied utilized techniques such as repeatedly demonstrating the person’s inability to control his or her own fate, the use of degradation ceremonies, attempts to induce reevaluation of the adequacy and/or propriety of prior conduct, and techniques designed to encourage the reemergence of latent feelings of guilt and emotional turmoil (Hinkle and Wolff 1956; Lifton 1954, 1961; Schein 1956, 1961; Schein, Cooley, and Singer 1960). Contemporary programs have been observed to utilize far more psychologically sophisticated procedures to accomplish destabilization. These techniques are often adapted from the traditions of psychiatry, psychotherapy, hypnotherapy, and the human-potential movement, as well as from religious practice (Ofshe and Singer 1986; Lifton 1987).

The change phase allows the individual an opportunity to escape punishing destabilization procedures by demonstrating that he or she has learned the proffered ideology, can demonstrate an ability to interpret reality in its own terms, and is willing to participate in competition with peers to demonstrate zeal, through displays of commitment. In addition to study and/or formal instruction, the techniques used to facilitate learning and the skill basis that can lead to opinion change include scheduling events that have predictable influencing consequences, rewarding certain conduct, and manipulating emotions to create punishing experiences. Some of the practices designed to promote influence might include requiring the target to assume responsibility for the progress of less-advanced “students,” to become the responsibility of those further along in the program, to assume the role of a teacher of the ideology, or to develop ever more refined and detailed confession statements that recast the person’s former life in terms of the required ideological position. Group structure is often manipulated by making rewards or punishments for an entire peer group contingent on the performance of the weakest person, requiring the group to utilize a vocabulary appropriate to the ideology, making status and privilege changes commensurate with behavioral compliance, subjecting the target to strong criticism and humiliation from peers for lack of progress, and peer monitoring for expressions of reservations or dissent. If progress is unsatisfactory, the individual can again be subjected to the punishing destabilization procedures used during unfreezing to undermine identity, to humiliate, and to provoke feelings of shame and guilt.

Refreezing denotes an attempt to promote and reinforce behavior acceptable to the controlling organization. Satisfactory performance is rewarded with social approval, status gains, and small privileges. Part of the social structure of the environment is the norm of interpreting the target’s display of the desired conduct as demonstrating the person’s progress in understanding the errors of his or her former life. The combination of reinforcing approved behavior and interpreting its symbolic meaning as demonstrating the emergence of a new individual fosters the development of an environment-specific, supposedly reborn social identity. The person is encouraged to claim this identity and is rewarded for doing so.

Lengthy participation in an appropriately constructed and managed environment fosters peer relations, an interaction history, and other behavior consistent with a public identity that incorporates approved values and opinions. Promoting the development of an interaction history in which persons engage in cooperative activity with peers that is not blatantly coerced, and in which they are encouraged but not forced to make verbal claims to “truly understanding the ideology and having been transformed,” will tend to lead them to conclude that they hold beliefs consistent with their actions (i.e., to make attributions to self as the source of their behaviors). These reinforcement procedures can result in a significant degree of cognitive confusion and an alteration in what the person takes to be his or her beliefs and attitudes while involved in the controlled environment (Bem 1972; Ofshe et al. 1974).

Continuous use of refreezing procedures can sustain the expression of what appears to be significant attitude change for long periods of time. Maintaining compliance with a requirement that the person display behavior signifying unreserved acceptance of an imposed ideology and gaining other forms of long-term behavioral control requires continuous effort. The person must be carefully managed, monitored, and manipulated through peer pressure, the threat or use of punishment (material, social, and emotional) and through the normative rules of the community (e.g., expectations prohibiting careers independent of the organization, prohibiting formation of independent nuclear families, prohibiting accumulation of significant personal economic resources, etc.) (Whyte 1976; Ofshe 1980; Ofshe and Singer 1986).

The rate at which a once-attained level of attitude change deteriorates depends on the type of social support the person receives over time (Schein 1961, pp. 158-166; Lifton 1961, pp. 399-415). In keeping with the refreezing metaphor, even when the reform process is to some degree successful at shaping behavior and attitudes, the new shape tends to be maintained only as long as the temperature is appropriately controlled.

One of the essential components of the reform process in general and of long-term refreezing in particular is monitoring and limiting the content of communication among persons in the managed group (Lifton 1961; Schein 1960; Ofshe et al. 1974). If successfully accomplished, communication control eliminates a person’s ability safely to express criticisms or to share private doubts and reservations. The result is to confer on the community the quality of being a spy system of the whole, upon the whole.

The typically observed complex of communication-controlling rules requires people to self-report critical thoughts to authorities or to make doubts known only in approved and readily managed settings (e.g., small groups or private counseling sessions). Admitting “negativity” leads to punishment or reindoctrination through procedures sometimes euphemistically termed “education” or “therapy.” Individual social isolation is furthered by rules requiring peers to “help” colleagues to progress by reporting their expressions of doubt. If discovered, failure to make such a report is itself punishable, because it reflects on the low level of commitment of the person who did not “help” a colleague to make progress.

Controlling communication effectively blocks individuals from testing the appropriateness of privately held critical perceptions against the views of even their families and most-valued associates. Community norms encourage doubters to interpret lingering reservations as signs of a personal failure to comprehend the truth of the ideology; if involved with religious organizations, to interpret doubt as evidence of sinfulness or the result of demonic influences; if involved with an organization delivering a supposed psychological or medical therapy, as evidence of continuing illness and/or failure to progress in treatment.

The significance of communication control is illustrated by the collapse of a large psychotherapy organization in immediate reaction to the leadership’s loss of effective control over interpersonal communication. At a meeting of several hundred members of this “therapeutic community,” clients were allowed openly to voice privately held reservations about their treatment and exploitation. They had been subjected to abusive practices that included assault, sexual and economic exploitation, and extremes of public humiliation. When members discovered the extent to which their sentiments about these practices were shared by their peers, they rebelled (Ayalla 1985).

Two widespread myths have developed from misreading the early studies of thought-reforming influence systems (Zablocki 1991). These studies dealt in part with the use of such systems to elicit false confessions in the Soviet Union after the 1917 revolution; with their use against American and United Nations forces held as POWs during the Korean War; and with their application to Western missionaries held in China following Mao’s revolution.

The first myth concerns the necessity and effectiveness of physical abuse in the reform process. The myth is that physical abuse is not only necessary but is the prime cause of apparent belief change. Reports about the treatment of POWs and foreign prisoners in China documented that physical abuse was present. Studies of the role of assault in the promotion of attitude change and in eliciting false confessions, even from U.S. servicemen, revealed, however, that it was ineffective. Belief change and compliance were more likely when physical abuse was minimal or absent (Biderman 1960). Both Schein (1961) and Lifton (1961) reported that physical abuse was a minor element in the theoretical understanding of even prison reform programs in China.

In the main, efforts at resocializing China’s nationals were conducted under nonconfined/nonassaultive conditions. Millions of China’s citizens underwent reform in schools, special-training centers, factories, and neighborhood groups in which physical assault was not used as a coercive technique. One such setting, for which many participants actively sought admission, the “Revolutionary University,” was classified by Lifton as the “hard core of the entire Chinese thought reform movement” (Lifton 1961, p. 248).

Attribution theories would predict that if there were differences between the power of reform programs to promote belief change in settings that were relatively more or less blatantly coercive and physically threatening, the effect would be greatest in less-coercive programs. Consistent with this expectation, Lifton concluded that reform efforts directed against Chinese citizens were “much more successful” than efforts directed against Westerners (Lifton 1961, p. 400).

A second myth concerns the purported effects of brainwashing. Media reports about thought reform’s effects far exceed the findings of scientific studies, which show coercive persuasion’s upper limit of impact to be that of inducing personal confusion and significant, but typically transitory, attitude change. Brainwashing was promoted as capable of stripping victims of their capacity to assert their wills, thereby rendering them unable to resist the orders of their controllers. People subjected to “brainwashing” were not merely influenced to adopt new attitudes but, according to the myth, suffered essentially an alteration in their psychiatric status from normal to pathological, while losing their capacity to decide whether to comply with or resist orders.

This lurid promotion of the power of thought reforming influence techniques to change a person’s capacity to resist direction is entirely without basis in fact: No evidence, scientific or otherwise, supports this proposition. No known mental disorder produces the loss of will that is alleged to be the result of brainwashing. Whatever behavior and attitude changes result from exposure to the process, they are most reasonably classified as the responses of normal individuals to a complex program of influence.

The U.S. Central Intelligence Agency seems to have taken seriously the myth about brainwashing’s power to destroy the will. Perhaps due to concern that an enemy had perfected a method for dependably overcoming the will, or perhaps in hope of being the first to develop such a method, the Agency embarked on a research program, code-named MKULTRA. It became a pathetic and tragic failure. On the one hand, it funded some innocuous and uncontroversial research projects; on the other, it funded or supervised the execution of several far-fetched, unethical, and dangerous experiments that failed completely (Marks 1979; Thomas 1989).

Although no evidence suggests that thought reform is a process capable of stripping a person of the will to resist, a relationship does exist between thought reform and changes in psychiatric status. The stress and pressure of the reform process cause some percentage of psychological casualties. To reduce resistance and to motivate behavior change, thought-reform procedures rely on psychological stressors, induction of high degrees of emotional distress, and other intrinsically dangerous influence techniques (Heide and Borkovec 1983). The process has a potential to cause psychiatric injury, which is sometimes realized. The major early studies (Hinkle and Wolff 1956; Lifton 1961; Schein 1961) reported that during the unfreezing phase individuals were intentionally stressed to a point at which some persons displayed symptoms of being on the brink of psychosis. Managers attempted to reduce psychological pressure when this happened, to avoid serious psychological injury to those obviously near the breaking point.

Contemporary programs speed up the reform process through the use of more psychologically sophisticated and dangerous procedures to accomplish destabilization. In contemporary programs the process is sometimes carried forward on a large-group basis, which reduces the ability of managers to detect symptoms of impending psychiatric emergencies. In addition, in some of the “therapeutic” ideologies espoused by thought-reforming organizations, extreme emotional distress is valued positively, as a sign of progress. Studies of contemporary programs have reported a variety of psychological injuries related to the reform process. Injuries include psychosis, major depressions, manic episodes, and debilitating anxiety (Glass, Kirsch, and Parris 1977; Haaken and Adams 1983; Heide and Borkovec 1983; Higgitt and Murray 1983; Kirsch and Glass 1977; Yalom and Lieberman 1971; Lieberman 1987; Singer and Ofshe 1990).

Contemporary thought-reform programs are generally far more sophisticated in their selection of both destabilization and influence techniques than were the programs studied during the 1950s (see Ofshe and Singer 1986 for a review). For example, hypnosis was entirely absent from the first programs studied but is often observed in modern programs. In most modern examples in which hypnosis is present, it functions as a remarkably powerful technique for manipulating subjective experience and for intensifying emotional response. It provides a method for influencing people to imagine impossible events such as those that supposedly occurred in their “past lives,” the future, or during visits to other planets. If persons so manipulated misidentify the hypnotically induced fantasies, and classify them as previously unavailable memories, their confidence in the content of a particular ideology can be increased (Bainbridge and Stark 1980).

Hypnosis can also be used to lead people to allow themselves to relive actual traumatic life events (e.g., rape, childhood sexual abuse, near-death experiences, etc.) or to fantasize the existence of such events and, thereby, to stimulate the experience of extreme emotional distress. When embedded in a reform program, repeatedly leading the person to experience such events can function simply as punishment, useful for coercing compliance.

Accounts of contemporary programs also describe the use of sophisticated techniques intended to strip away psychological defenses, to induce regression to primitive levels of coping, and to flood targets with powerful emotion (Ayalla 1985; Haaken and Adams 1983; Hockman 1984; Temerlin and Temerlin 1982). In some instances stress and fatigue have been used to promote hallucinatory experiences that are defined as therapeutic (Gerstel 1982). Drugs have been used to facilitate disinhibition and heightened suggestibility (Watkins 1980). Thought-reform subjects have been punished for disobedience by being ordered to self-inflict severe pain, justified by the claim that the result will be therapeutic (Bellack et al. v. Murietta Foundation et al.).

Programs of coercive persuasion appear in various forms in contemporary society. They depend on the voluntary initial participation of targets. This is usually accomplished because the target assumes that there is a common goal that unites him or her with the organization or that involvement will confer some benefit (e.g., relief of symptoms, personal growth, spiritual development, etc.). Apparently some programs were developed based on the assumption that they could be used to facilitate desirable changes (e.g., certain rehabilitation or psychotherapy programs). Some religious organizations and social movements utilize them for recruitment purposes. Some commercial organizations utilize them as methods for promoting sales. Under unusual circumstances, modern police-interrogation methods can exhibit some of the properties of a thought-reform program. In some instances, reform programs appear to have been operated for the sole purpose of gaining a high degree of control over individuals to facilitate their exploitation (Ofshe 1986; McGuire and Norton 1988; Watkins 1980).

Virtually any acknowledged expertise or authority can serve as a power base to develop the social structure necessary to carry out thought reform. In the course of developing a new form of rehabilitation, psychotherapy, religious organization, utopian community, school, or sales organization it is not difficult to justify the introduction of thought-reform procedures.

Perhaps the most famous example of a thought-reforming program developed for the ostensible purpose of rehabilitation was Synanon, a drug treatment program (Sarbin and Adler 1970; Yablonsky 1965; Ofshe et al. 1974). The Synanon environment possessed all of Lifton’s eight themes. It used as its principal coercive procedure a highly aggressive encounter/therapy group interaction. In form it resembled the “struggle groups” observed in China (Whyte 1976), but it differed in content. Individuals were vilified and humiliated not for past political behavior but for current conduct, as well as for far more psychologically intimate subjects, such as early childhood experiences, sexual experiences, and degrading experiences as adults. The coercive power of the group experience to affect behavior was substantial, as was its ability to induce psychological injury (Lieberman, Yalom, and Miles 1973; Ofshe et al. 1974).

Allegedly started as a drug-rehabilitation program, Synanon failed to accomplish significant long-term rehabilitation. Eventually, Synanon’s leader, Charles Dederich, promoted the idea that any degree of drug abuse was incurable and that persons so afflicted needed to spend their lives in the Synanon community. Synanon’s influence program was successful in convincing many that this was so. Under Dederich’s direction, Synanon evolved from an organization that espoused nonviolence into one that was violent. Its soldiers were dispatched to assault and attempt to murder persons identified by Dederich as Synanon’s enemies (Mitchell, Mitchell, and Ofshe 1981).

The manipulative techniques of self-styled messiahs, such as People’s Temple leader Jim Jones (Reiterman 1982), and influence programs operated by religious organizations, such as the Unification Church (Taylor 1978) and Scientology (Wallis 1977; Bainbridge and Stark 1980), can be analyzed as thought-reform programs. The most controversial recruitment system operated by a religious organization in recent American history was that of the Northern California branch of the Unification Church (Reverend Mr. Moon’s organization). The influence program was built directly from procedures of psychological manipulation that were commonplace in the human-potential movement (Bromley and Shupe 1981). The procedures involved various group-based exercises as well as events designed to elicit from participants information about their emotional needs and vulnerabilities. Blended into this program was content intended slowly to introduce the newcomer to the group’s ideology. Typically, the program’s connection with the Unification Church or any religious mission was denied during the early stages of the reform process. The target was monitored around the clock and prevented from communicating with peers who might reinforce doubt and support a desire to leave. The physical setting was an isolated rural facility far from public transportation.

Initial focus on personal failures, guilt-laden memories, and unfulfilled aspirations shifted to the opportunity to realize infantile desires and idealistic goals, by affiliating with the group and its mission to save the world. The person was encouraged to develop strong affective bonds with current members. They showed unfailing interest, affection, and concern, sometimes to the point of spoon-feeding the person’s meals and accompanying the individual everywhere, including to the toilet. If the unfreezing and change phases of the program succeeded, the individual was told of the group’s affiliation with the Unification Church and assigned to another unit of the organization within which refreezing procedures could be carried forward.

Influence procedures now commonly used during modern police interrogation can sometimes inadvertently manipulate innocent persons’ beliefs about their own innocence and, thereby, cause them falsely to confess. Confessions resulting from accomplishing the unfreezing and change phases of thought reform are classified as coerced-internalized false confessions (Kassin and Wrightsman 1985; Gudjonsson and MacKeith 1988). Although they rarely come together simultaneously, the ingredients necessary to elicit a temporarily believed false confession are erroneous police suspicion, the use of certain commonly employed interrogation procedures, and some degree of psychological vulnerability in the suspect. Philip Zimbardo (1971) has reviewed the coercive factors generally present in modern interrogation settings. Richard Ofshe (1989) has identified those influence procedures that, if present in a suspect’s interrogation, contribute to causing unfreezing and change.

Techniques that contribute to unfreezing include falsely telling a suspect that the police have evidence proving the person’s guilt (e.g., fingerprints, eyewitness testimony, etc.). Suspects may be given a polygraph examination and then falsely told (due either to error or design) that they failed and that the test reveals their unconscious knowledge of guilt. Suspects may be told that their lack of memory of the crime was caused by an alcohol- or drug-induced blackout, was repressed, or is explained by the suspect’s being a multiple personality.

The techniques listed above regularly appear in modern American police interrogations. They are used to lead persons who know that they have committed the crime at issue to decide that the police have sufficient evidence to convict them or to counter typical objections to admitting guilt (e.g., “I can’t remember having done that.”). In conjunction with the other disorienting and distressing elements of a modern accusatory interrogation, these tactics can sometimes lead innocent suspects to doubt themselves and question their lack of knowledge of the crime. If innocent persons subjected to these sorts of influence techniques do not reject the false evidence and realize that the interrogators are lying to them, they have no choice but to doubt themselves.

Tactics used to change the suspect’s position and elicit a confession include maneuvers designed to intensify feelings of guilt and emotional distress following from the suspect’s assumption of guilt. Suspects may be offered an escape from the emotional distress through confession. It may also be suggested that confession will provide evidence of remorse that will benefit the suspect in court.

Thought reform is not an easy process to study for several reasons. The extraordinary totalistic qualities and hyperorganization of thought-reforming environments, together with the exceptional nature of the influence tactics that appear within them, put the researcher in a position roughly analogous to that of an anthropologist entering into or interviewing someone about a culture that is utterly foreign. The researcher cannot assume that he or she understands or even knows the norms of the new environment. This means that until the researcher is familiar with the constructed environment within which the reform process takes place, it is dangerous to make the routine assumptions about context that underlie research within one’s own culture. This problem extends to vocabulary as well as to norms and social structure.

The history of research on the problem has been one in which most of the basic descriptive work has been conducted through post-hoc interviewing of persons exposed to the procedures. The second-most frequently employed method has been that of participant observation. Recently, in connection with work being done on police interrogation methods, it has been possible to analyze contemporaneous recordings of interrogation sessions in which targets’ beliefs are actually made to undergo radical change. All this work has contributed to the development of an understanding of the thought-reform phenomenon in several ways.

Studying the reform process demonstrates that it is no more or less difficult to understand than any other complex social process, and such study produces no results suggesting that something new has been discovered. The only aspect of the reform process that one might suggest is new is the order in which the influence procedures are assembled and the degree to which the target’s environment is manipulated in the service of social control. This is at most an unusual arrangement of commonplace bits and pieces.

Work to date has helped establish a dividing line between the lurid fantasies about mysterious methods for stripping one’s capacity to resist control and the reality of the power of appropriately designed social environments to influence the behavior and decisions of those engaged by them. Beyond debunking myths, information gathered to date has been used in two ways to further the affirmative understanding of thought reform: It has been possible to develop descriptions of the social structure of thought-reforming environments, of their operations, and to identify the range of influence mechanisms they tend to incorporate; the second use of these data has been to relate the mechanisms of influence present in the reform environment to respondents’ accounts of their reactions to these experiences, to increase understanding of both general response tendencies to types of influence mechanisms and the reactions of particular persons to the reform experience.

As with all complex, real-world social phenomena that cannot be studied experimentally, understanding information about the thought-reform process proceeds through the application of theories that have been independently developed. Explaining data that describe the type and organization of the influence procedures that constitute a thought-reform process depends on applying established social-psychological theories about the manipulation of behavior and attitude change. Assessing reports about the impact of the experience on the personalities of those subjected to intense influence procedures depends on the application of current theories of personality formation and change. Understanding instances in which the reform experience appears related to psychiatric injury requires proceeding as one would ordinarily in evaluating any case history of a stress-related or other type of psychological injury.

Captive Hearts, Captive Minds

Freedom and Recovery from Cults and Abusive Relationships

By Madeleine L. Tobias and Janja Lalich

Chapter one excerpts – The Cultic Relationship.

Cults may be large or small. What defines them is not their size but their behavior. In addition to the larger, more publicized cults, there are small cults of fewer than a dozen members who follow a particular “guru”; “family cults,” where the head of the family uses deceptive and excessive persuasion and control techniques; and, probably the least acknowledged, the one-on-one cult.

The one-on-one cult is a deliberately manipulative and exploitative intimate relationship between two persons, often involving physical abuse of the subordinate partner. In the one-on-one cult, which we call a cultic relationship, there is a significant power imbalance between the two participants. The stronger uses his (or her) influence to control, manipulate, abuse, and exploit the other. In essence the cultic relationship is a one-on-one version of the larger group. It may even be more intense than participation in a group cult since all the attention and abuse is focused on one person, often with more damaging consequences.

Many marriages or domestic partnerships where there is spousal abuse may be characterized and explained in this way. Other one-on-one cults may be found in boss/employee situations, in pastor/worshipper milieus, in therapist/client relationships, in jailor/prisoner or interrogator/suspect situations, and in teacher/student environments (including academic, artistic, and spiritual situations – for example, a school professor, a yoga master, a martial arts instructor, or an art mentor). It is our hope that those who have suffered such individualized abuse will find much in this book to identify with and use in healing their pain.

Since the upsurge of both public and professional interest in the issue of domestic violence, there has been some recognition of the link between mind control and battering. Men or women who batter their partners sometimes use manipulative techniques similar to those found in cults. The most common include “isolation and the provocation of fear; alternating kindness and threat to produce disequilibrium; the induction of guilt, self-blame, dependency, and learned helplessness.” The degree to which these features are present in a relationship affects the intensity of control and allows the relationship to be labeled cultic.

The similarities between cultic devotion and the traumatic bonding that occurs between battered individuals and their abusers are striking. An abused partner is generally made to submit to the following types of behaviors:

  • early verbal and/or physical dominance
  • isolation/imprisonment
  • fear arousal and maintenance
  • guilt induction
  • contingent expressions of “love”
  • enforced loyalty to the aggressor and self-denunciation
  • promotion of powerlessness and helplessness
  • pathological expressions of jealousy
  • hope-instilling behaviors
  • required secrecy (13)

When psychological coercion and manipulative exploitation have been used in a one-on-one cultic relationship, the person leaving such a relationship faces issues similar to those encountered by someone leaving a cultic group.

Thought Reform and the Psychology of Totalism

The University of North Carolina Press/Chapel Hill and London
By Robert Jay Lifton, M.D.

 

Below is an edited excerpt from Chapter 22 of Robert Jay Lifton’s book, “Thought Reform and the Psychology of Totalism: A Study of ‘Brainwashing’ in China.” Lifton, a psychiatrist and distinguished professor at the City University of New York, has studied the psychology of extremism for decades. He testified at the 1976 bank robbery trial of Patty Hearst about the theory of “coercive persuasion.” First published in 1961, his book was reprinted in 1989 by the University of North Carolina Press.

Chapter 22: Ideological Totalism

Topics

Milieu Control
Mystical Manipulation
The Demand for Purity
The Cult of Confession
The “Sacred Science”
Loading the Language
Doctrine Over Person
The Dispensing of Existence

A discussion of what is most central in the thought reform environment can lead us to a more general consideration of the psychology of human zealotry. For in identifying, on the basis of this study of thought reform, features common to all expressions of ideological totalism, I wish to suggest a set of criteria against which any environment may be judged – a basis for answering the ever-recurring question: “Isn’t this just like ‘brainwashing’?”

These criteria consist of eight psychological themes which are predominant within the social field of the thought reform milieu. Each has a totalistic quality; each depends upon an equally absolute philosophical assumption; and each mobilizes certain individual emotional tendencies, mostly of a polarizing nature. In combination they create an atmosphere which may temporarily energize or exhilarate, but which at the same time poses the gravest of human threats.

Milieu Control

The most basic feature of the thought reform environment, the psychological current upon which all else depends, is the control of human communication. Through this milieu control the totalist environment seeks to establish domain over not only the individual’s communication with the outside (all that he sees and hears, reads or writes, experiences, and expresses), but also – in its penetration of his inner life – over what we may speak of as his communication with himself. It creates an atmosphere uncomfortably reminiscent of George Orwell’s 1984.

Such milieu control never succeeds in becoming absolute, and its own human apparatus can – when permeated by outside information – become subject to discordant “noise” beyond that of any mechanical apparatus. To totalist administrators, however, such occurrences are no more than evidences of “incorrect” use of the apparatus. For they look upon milieu control as a just and necessary policy, one which need not be kept secret: thought reform participants may be in doubt as to who is telling what to whom, but the fact that extensive information about everyone is being conveyed to the authorities is always known. At the center of this self-justification is their assumption of omniscience, their conviction that reality is their exclusive possession. Having experienced the impact of what they consider to be an ultimate truth (and having the need to dispel any possible inner doubts of their own), they consider it their duty to create an environment containing no more and no less than this “truth.” In order to be the engineers of the human soul, they must first bring it under full observational control.

Mystical Manipulation

The inevitable next step after milieu control is extensive personal manipulation. This manipulation assumes a no-holds-barred character, and uses every possible device at the milieu’s command, no matter how bizarre or painful. Initiated from above, it seeks to provoke specific patterns of behavior and emotion in such a way that these will appear to have arisen spontaneously from within the environment. This planned spontaneity, directed as it is by an ostensibly omniscient group, must assume, for the manipulated, a near-mystical quality.

Ideological totalists do not pursue this approach solely for the purpose of maintaining a sense of power over others. Rather they are impelled by a special kind of mystique which not only justifies such manipulations, but makes them mandatory. Included in this mystique is a sense of “higher purpose,” of having “directly perceived some imminent law of social development,” and of being themselves the vanguard of this development. By thus becoming the instruments of their own mystique, they create a mystical aura around the manipulating institutions – the Party, the Government, the Organization. They are the agents “chosen” (by history, by God, or by some other supernatural force) to carry out the “mystical imperative,” the pursuit of which must supersede all considerations of decency or of immediate human welfare. Similarly, any thought or action which questions the higher purpose is considered to be stimulated by a lower purpose, to be backward, selfish, and petty in the face of the great, overriding mission. This same mystical imperative produces the apparent extremes of idealism and cynicism which occur in connection with the manipulations of any totalist environment: even those actions which seem cynical in the extreme can be seen as having ultimate relationship to the “higher purpose.”

At the level of the individual person, the psychological responses to this manipulative approach revolve about the basic polarity of trust and mistrust. One is asked to accept these manipulations on a basis of ultimate trust (or faith): “like a child in the arms of its mother.” He who trusts in this degree can experience the manipulations within the idiom of the mystique behind them: that is, he may welcome their mysteriousness, find pleasure in their pain, and feel them to be necessary for the fulfillment of the “higher purpose” which he endorses as his own. But such elemental trust is difficult to maintain; and even the strongest can be dissipated by constant manipulation.

When trust gives way to mistrust (or when trust has never existed) the higher purpose cannot serve as adequate emotional sustenance. The individual then responds to the manipulations through developing what I shall call the psychology of the pawn. Feeling himself unable to escape from forces more powerful than himself, he subordinates everything to adapting himself to them. He becomes sensitive to all kinds of cues, expert at anticipating environmental pressures, and skillful in riding them in such a way that his psychological energies merge with the tide rather than turn painfully against himself. This requires that he participate actively in the manipulation of others, as well as in the endless round of betrayals and self-betrayals which are required.

But whatever his response – whether he is cheerful in the face of being manipulated, deeply resentful, or feels a combination of both – he has been deprived of the opportunity to exercise his capacities for self-expression and independent action.

 

The Demand for Purity

In the thought reform milieu, as in all situations of ideological totalism, the experiential world is sharply divided into the pure and the impure, into the absolutely good and the absolutely evil. The good and the pure are of course those ideas, feelings, and actions which are consistent with the totalist ideology and policy; anything else is apt to be relegated to the bad and the impure. Nothing human is immune from the flood of stern moral judgments. All “taints” and “poisons” which contribute to the existing state of impurity must be searched out and eliminated.

The philosophical assumption underlying this demand is that absolute purity is attainable, and that anything done to anyone in the name of this purity is ultimately moral. In actual practice, however, no one is really expected to achieve such perfection. Nor can this paradox be dismissed as merely a means of establishing a high standard to which all can aspire. Thought reform bears witness to its more malignant consequences: for by defining and manipulating the criteria of purity, and then by conducting an all-out war upon impurity, the ideological totalists create a narrow world of guilt and shame. This is perpetuated by an ethos of continuous reform, a demand that one strive permanently and painfully for something which not only does not exist but is in fact alien to the human condition.

At the level of the relationship between individual and environment, the demand for purity creates what we may term a guilty milieu and a shaming milieu. Since each man’s impurities are deemed sinful and potentially harmful to himself and to others, he is, so to speak, expected to expect punishment – which results in a relationship of guilt with his environment. Similarly, when he fails to meet the prevailing standards in casting out such impurities, he is expected to expect humiliation and ostracism – thus establishing a relationship of shame with his milieu. Moreover, the sense of guilt and the sense of shame become highly valued: they are preferred forms of communication, objects of public competition, and the basis for eventual bonds between the individual and his totalist accusers. One may attempt to simulate them for a while, but the subterfuge is likely to be detected, and it is safer to experience them genuinely.

People vary greatly in their susceptibilities to guilt and shame, depending upon patterns developed early in life. But since guilt and shame are basic to human existence, this variation can be no more than a matter of degree. Each person is made vulnerable through his profound inner sensitivities to his own limitations and to his unfulfilled potential; in other words, each is made vulnerable through his existential guilt. Since ideological totalists become the ultimate judges of good and evil within their world, they are able to use these universal tendencies toward guilt and shame as emotional levers for their controlling and manipulative influences. They become the arbiters of existential guilt, authorities without limit in dealing with others’ limitations. And their power is nowhere more evident than in their capacity to “forgive.”

The individual thus comes to apply the same totalist polarization of good and evil to his judgments of his own character: he tends to imbue certain aspects of himself with excessive virtue, and to condemn even more excessively other personal qualities – all according to their ideological standing. He must also look upon his impurities as originating from outside influences – that is, from the ever-threatening world beyond the closed, totalist ken. Therefore, one of his best ways to relieve himself of some of his burden of guilt is to denounce, continuously and hostilely, these same outside influences. The more guilty he feels, the greater his hatred, and the more threatening they seem. In this manner, the universal psychological tendency toward “projection” is nourished and institutionalized, leading to mass hatreds, purges of heretics, and to political and religious holy wars. Moreover, once an individual person has experienced the totalist polarization of good and evil, he has great difficulty in regaining a more balanced inner sensitivity to the complexities of human morality. For there is no emotional bondage greater than that of the man whose entire guilt potential – neurotic and existential – has become the property of ideological totalists.

 

The Cult of Confession

Closely related to the demand for absolute purity is an obsession with personal confession. Confession is carried beyond its ordinary religious, legal, and therapeutic expressions to the point of becoming a cult in itself. There is the demand that one confess to crimes one has not committed, to sinfulness that is artificially induced, in the name of a cure that is arbitrarily imposed. Such demands are made possible not only by the ubiquitous human tendencies toward guilt and shame but also by the need to give expression to these tendencies. In totalist hands, confession becomes a means of exploiting, rather than offering solace for, these vulnerabilities.

The totalist confession takes on a number of special meanings. It is first a vehicle for the kind of personal purification which we have just discussed, a means of maintaining a perpetual inner emptying or psychological purge of impurity; this purging milieu enhances the totalists’ hold upon existential guilt. Second, it is an act of symbolic self-surrender, the expression of the merging of individual and environment. Third, it is a means of maintaining an ethos of total exposure – a policy of making public (or at least known to the Organization) everything possible about the life experiences, thoughts, and passions of each individual, and especially those elements which might be regarded as derogatory.

The assumption underlying total exposure (besides those which relate to the demand for purity) is the environment’s claim to total ownership of each individual self within it. Private ownership of the mind and its products – of imagination or of memory – becomes highly immoral. The accompanying rationale (or rationalization) is familiar: the milieu has attained such a perfect state of enlightenment that any individual retention of ideas or emotions has become anachronistic.

The cult of confession can offer the individual person meaningful psychological satisfactions in the continuing opportunity for emotional catharsis and for relief of suppressed guilt feelings, especially insofar as these are associated with self-punitive tendencies to get pleasure from personal degradation. More than this, the sharing of confession enthusiasms can create an orgiastic sense of “oneness,” of the most intense intimacy with fellow confessors and of the dissolution of self into the great flow of the Movement. And there is also, at least initially, the possibility of genuine self-revelation and of self-betterment through the recognition that “the thing that has been exposed is what I am.”

But as totalist pressures turn confession into recurrent command performances, the element of histrionic public display takes precedence over genuine inner experience. Each man becomes concerned with the effectiveness of his personal performance, and this performance sometimes comes to serve the function of evading the very emotions and ideas about which one feels most guilty – confirming the statement by one of Camus’ characters that “authors of confessions write especially to avoid confessing, to tell nothing of what they know.” The difficulty, of course, lies in the inevitable confusion which takes place between the actor’s method and his separate personal reality, between the performer and the “real me.”

In this sense, the cult of confession has effects quite the reverse of its ideal of total exposure: rather than eliminating personal secrets, it increases and intensifies them. In any situation the personal secret has two important elements: first, guilty and shameful ideas which one wishes to suppress in order to prevent their becoming known by others or their becoming too prominent in one’s own awareness; and second, representations of parts of oneself too precious to be expressed except when alone or when involved in special loving relationships formed around this shared secret world. Personal secrets are always maintained in opposition to inner pressures toward self-exposure. The totalist milieu makes contact with these inner pressures through its own obsession with the exposé and the unmasking process. As a result old secrets are revived and new ones proliferate; the latter frequently consist of resentments toward or doubts about the Movement, or else are related to aspects of identity still existing outside of the prescribed ideological sphere. Each person becomes caught up in a continuous conflict over which secrets to preserve and which to surrender, over ways to reveal lesser secrets in order to protect more important ones; his own boundaries between the secret and the known, between the public and the private, become blurred. And around one secret, or a complex of secrets, there may revolve an ultimate inner struggle between resistance and self-surrender.

Finally, the cult of confession makes it virtually impossible to attain a reasonable balance between worth and humility. The enthusiastic and aggressive confessor becomes like Camus’ character whose perpetual confession is his means of judging others: “[I]…practice the profession of penitent to be able to end up as a judge…the more I accuse myself, the more I have a right to judge you.” The identity of the “judge-penitent” thus becomes a vehicle for taking on some of the environment’s arrogance and sense of omnipotence. Yet even this shared omnipotence cannot protect him from the opposite (but not unrelated) feelings of humiliation and weakness, feelings especially prevalent among those who remain more the enforced penitent than the all-powerful judge.

 

The “Sacred Science”

The totalist milieu maintains an aura of sacredness around its basic dogma, holding it out as an ultimate moral vision for the ordering of human existence. This sacredness is evident in the prohibition (whether or not explicit) against the questioning of basic assumptions, and in the reverence which is demanded for the originators of the Word, the present bearers of the Word, and the Word itself. While thus transcending ordinary concerns of logic, however, the milieu at the same time makes an exaggerated claim of airtight logic, of absolute “scientific” precision. Thus the ultimate moral vision becomes an ultimate science; and the man who dares to criticize it, or to harbor even unspoken alternative ideas, becomes not only immoral and irreverent, but also “unscientific.” In this way, the philosopher kings of modern ideological totalism reinforce their authority by claiming to share in the rich and respected heritage of natural science.

The assumption here is not so much that man can be God, but rather that man’s ideas can be God: that an absolute science of ideas (and implicitly, an absolute science of man) exists, or is at least very close to being attained; that this science can be combined with an equally absolute body of moral principles; and that the resulting doctrine is true for all men at all times. Although no ideology goes quite this far in overt statement, such assumptions are implicit in totalist practice.

At the level of the individual, the totalist sacred science can offer much comfort and security. Its appeal lies in its seeming unification of the mystical and the logical modes of experience (in psychoanalytic terms, of the primary and secondary thought processes). For within the framework of the sacred science there is room both for precise logic and for sweeping, non-rational “insights.” Since the distinction between the logical and the mystical is, to begin with, artificial and man-made, an opportunity for transcending it can create an extremely intense feeling of truth. But the posture of unquestioning faith – both rationally and non-rationally derived – is not easy to sustain, especially if one discovers that the world of experience is not nearly as absolute as the sacred science claims it to be.

Yet so strong a hold can the sacred science achieve over an individual’s mental processes that if he begins to feel attracted to ideas which either contradict or ignore it, he may become guilty and afraid. His quest for knowledge is consequently hampered, since in the name of science he is prevented from engaging in the receptive search for truth which characterizes the genuinely scientific approach. And his position is made more difficult by the absence, in a totalist environment, of any distinction between the sacred and the profane: there is no thought or action which cannot be related to the sacred science. To be sure, one can usually find areas of experience outside its immediate authority; but during periods of maximum totalist activity (like thought reform) any such areas are cut off, and there is virtually no escape from the milieu’s ever-pressing edicts and demands. Whatever combination of continued adherence, inner resistance, or compromise co-existence the individual person adopts toward this blend of counterfeit science and back-door religion, it represents another continuous pressure toward personal closure, toward avoiding, rather than grappling with, the kinds of knowledge and experience necessary for genuine self-expression and for creative development.

 

Loading the Language

The language of the totalist environment is characterized by the thought-terminating cliché. The most far-reaching and complex of human problems are compressed into brief, highly reductive, definitive-sounding phrases, easily memorized and easily expressed. These become the start and finish of any ideological analysis. In [Chinese Communist] thought reform, for instance, the phrase “bourgeois mentality” is used to encompass and critically dismiss ordinarily troublesome concerns like the quest for individual expression, the exploration of alternative ideas, and the search for perspective and balance in political judgments. And in addition to their function as interpretive shortcuts, these clichés become what Richard Weaver has called “ultimate terms”: either “god terms,” representative of ultimate good; or “devil terms,” representative of ultimate evil. In [Chinese Communist] thought reform, “progress,” “progressive,” “liberation,” “proletarian standpoints” and “the dialectic of history” fall into the former category; “capitalist,” “imperialist,” “exploiting classes,” and “bourgeois” (mentality, liberalism, morality, superstition, greed) of course fall into the latter. Totalist language, then, is repetitiously centered on all-encompassing jargon, prematurely abstract, highly categorical, relentlessly judging, and to anyone but its most devoted advocate, deadly dull: in Lionel Trilling’s phrase, “the language of nonthought.”

To be sure, this kind of language exists to some degree within any cultural or organizational group, and all systems of belief depend upon it. It is in part an expression of unity and exclusiveness: as Edward Sapir put it, “‘He talks like us’ is equivalent to saying ‘He is one of us.'” The loading is much more extreme in ideological totalism, however, since the jargon expresses the claimed certitudes of the sacred science. Also involved is an underlying assumption that language – like all other human products – can be owned and operated by the Movement. No compunctions are felt about manipulating or loading it in any fashion; the only consideration is its usefulness to the cause.

For an individual person, the effect of the language of ideological totalism can be summed up in one word: constriction. He is, so to speak, linguistically deprived; and since language is so central to all human experience, his capacities for thinking and feeling are immensely narrowed. This is what Hu meant when he said, “using the same pattern of words for so long…you feel chained.” Actually, not everyone exposed feels chained, but in effect everyone is profoundly confined by these verbal fetters. As in other aspects of totalism, this loading may provide an initial sense of insight and security, eventually followed by uneasiness. This uneasiness may result in a retreat into a rigid orthodoxy in which an individual shouts the ideological jargon all the louder in order to demonstrate his conformity, hide his own dilemma and his despair, and protect himself from the fear and guilt he would feel should he attempt to use words and phrases other than the correct ones. Or else he may adopt a complex pattern of inner division, and dutifully produce the expected clichés in public performances while in his private moments he searches for more meaningful avenues of expression. Either way, his imagination becomes increasingly dissociated from his actual life experiences and may tend to atrophy from disuse.

 

Doctrine Over Person

This sterile language reflects another characteristic feature of ideological totalism: the subordination of human experience to the claims of doctrine. This primacy of doctrine over person is evident in the continual shift between experience itself and the highly abstract interpretation of such experience – between genuine feelings and spurious cataloguing of feelings. It has much to do with the peculiar aura of half-reality which the totalist environment seems, at least to the outsider, to possess.

The inspiriting force of such myths cannot be denied; nor can one ignore their capacity for mischief. For when the myth becomes fused with the totalist sacred science, the resulting “logic” can be so compelling and coercive that it simply replaces the realities of individual experience. Consequently, past historical events are retrospectively altered, wholly rewritten, or ignored, to make them consistent with the doctrinal logic. This alteration becomes especially malignant when its distortions are imposed upon individual memory, as occurred in the false confessions extracted during thought reform.

The same doctrinal primacy prevails in the totalist approach to changing people: the demand that character and identity be reshaped, not in accordance with one’s special nature or potentialities, but rather to fit the rigid contours of the doctrinal mold. The human is thus subjected to the ahuman. And in this manner, the totalists, as Camus phrases it, “put an abstract idea above human life, even if they call it history, to which they themselves have submitted in advance and to which they will decide arbitrarily, to submit everyone else as well.”

The underlying assumption is that the doctrine – including its mythological elements – is ultimately more valid, true, and real than is any aspect of actual human character or human experience. Thus, even when circumstances require that a totalist movement follow a course of action in conflict with or outside of the doctrine, there exists what Benjamin Schwartz described as a “will to orthodoxy” which requires an elaborate facade of new rationalizations designed to demonstrate the unerring consistency of the doctrine and the unfailing foresight which it provides. But its greater importance lies in more hidden manifestations, particularly the totalists’ pattern of imposing their doctrine-dominated remolding upon people in order to seek confirmation of (and again, dispel their own doubts about) this same doctrine. Rather than modify the myth in accordance with experience, the will to orthodoxy requires instead that men be modified in order to reaffirm the myth.

The individual person who finds himself under such doctrine-dominated pressure to change is thrust into an intense struggle with his own sense of integrity, a struggle which takes place in relation to polarized feelings of sincerity and insincerity. In a totalist environment, absolute “sincerity” is demanded; and the major criterion for sincerity is likely to be one’s degree of doctrinal compliance – both in regard to belief and to direction of personal change. Yet there is always the possibility of retaining an alternative version of sincerity (and of reality), the capacity to imagine a different kind of existence and another form of sincere commitment. These alternative visions depend upon such things as the strength of previous identity, the penetration of the milieu by outside ideas, and the retained capacity for eventual individual renewal. The totalist environment, however, counters such “deviant” tendencies with the accusation that they stem entirely from personal “problems” (“thought problems” or “ideological problems”) derived from untoward earlier influences. The outcome will depend largely upon how much genuine relevance the doctrine has for the individual emotional predicament. And even for those to whom it seems totally appealing, the exuberant sense of well-being it temporarily affords may be more a “delusion of wholeness” than an expression of true and lasting inner harmony.

 

The Dispensing of Existence

The totalist environment draws a sharp line between those whose right to existence can be recognized, and those who possess no such right.

Are not men presumptuous to appoint themselves the dispensers of human existence? Surely this is a flagrant expression of what the Greeks called hubris, of arrogant man making himself God. Yet one underlying assumption makes this arrogance mandatory: the conviction that there is just one path to true existence, just one valid mode of being, and that all others are perforce invalid and false. Totalists thus feel themselves compelled to destroy all possibilities of false existence as a means of furthering the great plan of true existence to which they are committed.

For the individual, the polar emotional conflict is the ultimate existential one of “being versus nothingness.” He is likely to be drawn to a conversion experience, which he sees as the only means of attaining a path of existence for the future. The totalist environment – even when it does not resort to physical abuse – thus stimulates in everyone a fear of extinction or annihilation. A person can overcome this fear and find (in Martin Buber’s term) “confirmation,” not in his individual relationships, but only from the fount of all existence, the totalist Organization. Existence comes to depend upon creed (I believe, therefore I am), upon submission (I obey, therefore I am), and, beyond these, upon a sense of total merger with the ideological movement. Ultimately, of course, one compromises and combines the totalist “confirmation” with independent elements of personal identity; but one is ever made aware that, should he stray too far along this “erroneous path,” his right to existence may be withdrawn.

The more clearly an environment expresses these eight psychological themes, the greater its resemblance to ideological totalism; and the more it utilizes such totalist devices to change people, the greater its resemblance to thought reform. But facile comparisons can be misleading. No milieu ever achieves complete totalism, and many relatively moderate environments show some signs of it. Moreover, totalism tends to be recurrent rather than continuous. But if totalism has at any time been prominent in the movement, there is always the possibility of its reappearance, even after long periods of relative moderation.

Then, too, some environments come perilously close to totalism but at the same time keep alternative paths open; this combination can offer unusual opportunities for achieving intellectual and emotional depth. And even the most full-blown totalist milieu can provide (more or less despite itself) a valuable and enlarging life experience – if the man exposed has both the opportunity to leave the extreme environment and the inner capacity to absorb and make inner use of the totalist pressures.

Also, ideological totalism itself may offer a man an intense peak experience: a sense of transcending all that is ordinary and prosaic, of freeing himself from the encumbrances of human ambivalence, of entering a sphere of truth, reality, and sincerity beyond any he had ever known or even imagined. But these peak experiences carry a great potential for rebound, and for equally intense opposition to the very things which initially seem so liberating. Such imposed peak experiences – as contrasted with those more freely and privately arrived at by great religious leaders and mystics – are essentially experiences of personal closure. Rather than stimulating greater receptivity and “openness to the world,” they encourage a backward step into some form of “embeddedness” – a retreat into doctrinal patterns more characteristic (at least at this stage of human history) of the child than of the individuated adult.

And if no peak experience occurs, ideological totalism does even greater violence to the human potential: it evokes destructive emotions, produces intellectual and psychological constrictions, and deprives men of all that is most subtle and imaginative – under the false promise of eliminating those very imperfections and ambivalences which help to define the human condition. This combination of personal closure, self-destructiveness, and hostility toward outsiders leads to the dangerous group excesses so characteristic of ideological totalism in any form. It also mobilizes extremist tendencies in those outsiders under attack, thus creating a vicious circle of totalism.

What is the source of ideological totalism? How do these extremist emotional patterns originate? These questions raise the most crucial and the most difficult of human problems. Behind ideological totalism lies the ever-present human quest for the omnipotent guide – for the supernatural force, political party, philosophical ideas, great leader, or precise science – that will bring ultimate solidarity to all men and eliminate the terror of death and nothingness. This quest is evident in the mythologies, religions, and histories of all nations, as well as in every individual life. The degree of individual totalism involved depends greatly upon factors in one’s personal history: early lack of trust, extreme environmental chaos, total domination by a parent or parent-representative, intolerable burdens of guilt, and severe crises of identity. Thus an early sense of confusion and dislocation, or an early experience of unusually intense family milieu control, can produce later a complete intolerance for confusion and dislocation, and a longing for the reinstatement of milieu control. But these things are in some measure part of every childhood experience; and therefore the potential for totalism is a continuum from which no one entirely escapes, and in relationship to which no two people are exactly the same.

It may be that the capacity for totalism is most fundamentally a product of human childhood itself, of the prolonged period of helplessness and dependency through which each of us must pass. Limited as he is, the infant has no choice but to imbue his first nurturing authorities – his parents – with an exaggerated omnipotence, until the time he is himself capable of some degree of independent action and judgment. And even as he develops into the child and the adolescent, he continues to require many of the all-or-none polarities of totalism as terms with which to define his intellectual, emotional, and moral worlds. Under favorable circumstances (that is, when family and culture encourage individuation) these requirements can be replaced by more flexible and moderate tendencies; but they never entirely disappear.

During adult life, individual totalism takes on new contours as it becomes associated with new ideological interests. It may become part of the configuration of personal emotions, messianic ideas, and organized mass movement which I have described as ideological totalism. When it does, we cannot speak of it simply as ideological regression. It is partly this, but it is also something more: a new form of adult embeddedness, originating in patterns of security-seeking carried over from childhood, but with qualities of ideas and aspirations that are specifically adult. During periods of cultural crisis and of rapid historical change, the totalist quest for the omnipotent guide leads men to seek to become that guide.

Totalism, then, is a widespread phenomenon, but it is not the only approach to re-education. We can best use our knowledge of it by applying its criteria to familiar processes in our own cultural tradition and in our own country.

 

Influence

By Robert B. Cialdini, Ph.D.

See Dr. Robert Cialdini’s Web site “Influence at Work”

Introduction

Robert Cialdini is a Professor of Psychology at Arizona State University and has spent many years devoted to the scientific investigation and research of persuasion techniques. His book “Influence” has become a classic. Within his book Cialdini lists six basic social and psychological principles that form the foundation for successful strategies used to achieve influence.

Those six principles are the Rule of Reciprocity, Commitment and Consistency, Social Proof, Liking, Authority, and Scarcity. Each is discussed below.

Rule of Reciprocity

According to sociologists and anthropologists, one of the most widespread and basic norms of human culture is embodied in the Rule of Reciprocity. This rule requires that one person try to repay what another person has provided. By obligating the recipient to an act of repayment in the future, the rule allows one individual to give something to another with confidence that it is not being lost.

This sense of future obligation according to the rule makes possible the development of various kinds of continuing relationships, transactions, and exchanges that are beneficial to society. Consequently, virtually all members of society are trained from childhood to abide by this rule or suffer serious social disapproval.

The decision to comply with someone’s request is frequently based upon the Rule of Reciprocity. A profitable compliance tactic, therefore, is to give something to someone before asking for a favor in return.

The opportunity to exploit this tactic is due to three characteristics of the Rule of Reciprocity:

  1. The rule is extremely powerful, often overwhelming the influence of other factors that normally determine compliance with a request.
  2. The rule applies even to uninvited first favors, which reduces our ability to decide whom we wish to owe and puts the choice in the hands of others.
  3. The rule can spur unequal exchanges. That is–to be rid of the uncomfortable feeling of indebtedness, an individual will often agree to a request for a substantially larger favor than the one he or she first received.

Another way in which the Rule of Reciprocity can increase compliance involves a simple variation on the basic theme: instead of providing a favor first that stimulates a returned favor, an individual can make instead an initial concession–that stimulates a return concession.

One compliance procedure, called the “rejection-then-retreat technique”, or door-in-the-face technique, relies heavily on the pressure to reciprocate concessions. By starting with an extreme request that is sure to be rejected, the requester can then profitably retreat to a smaller request–the one that was desired all along. This request is now likely to be accepted because it appears to be a concession. Research indicates that, aside from increasing the likelihood that a person will say yes to a request, the rejection-then-retreat technique also increases the likelihood that the person will carry out the request and will agree to future requests.

The best defense against manipulation through the Rule of Reciprocity is not the total rejection of initial offers by others, but rather accepting initial favors or concessions in good faith while remaining prepared to see through them as tricks should they later prove to be such. Once they are seen in this way, there is no longer any need to respond with a favor or concession.

Commitment and Consistency

People have a desire to look consistent in their words, beliefs, attitudes, and deeds, and this tendency is supported or fed by three sources:

  1. Good personal consistency is highly valued by society.
  2. Consistent conduct provides a beneficial approach to daily life.
  3. A consistent orientation affords a valuable shortcut through the complexity of modern existence. That is–by being consistent with earlier decisions, we reduce the need to process all the relevant information in future similar situations; instead, we need merely recall the earlier decision and respond consistently.

The key to using the principles of Commitment and Consistency to manipulate people lies in the initial commitment. That is–after making a commitment (taking a stand or position), people are more willing to agree to requests that are consistent with their prior commitment. Many compliance professionals will therefore try to induce others to take an initial position that is consistent with a behavior they will later request.

Commitments are most effective when they are active, public, effortful, and viewed as internally motivated and not coerced. Once a stand is taken, there is a natural tendency to behave in ways that are stubbornly consistent with the stand. The drive to be and look consistent constitutes a highly potent tool of social influence, often causing people to act in ways that are clearly contrary to their own best interests.

Commitment decisions, even erroneous ones, have a tendency to be self-perpetuating–they often “grow their own legs.” That is–those involved may add new reasons and justifications to support the wisdom of commitments they have already made. As a consequence, some commitments remain in effect long after the conditions that spurred them have changed. This phenomenon explains the effectiveness of certain deceptive compliance practices.

To recognize and resist the undue influence of consistency pressures upon our compliance decisions, we can listen for signals coming from two places within us: our stomach or “gut reaction” and our heart.

  • A bad feeling in the pit of the stomach may appear when we realize that we are being pushed by commitment and consistency pressures to agree to requests we know we don’t want to perform.
  • Our heart may bother us when it is not clear that an initial commitment was right.

At such points it is meaningful to ask a crucial question, “Knowing what I know now, if I could go back, would I have made the same commitment?”

Social Proof

One means of determining what is correct is to find out what others believe is correct. People often view a behavior as more correct in a given situation to the degree that they see others performing it.

This principle of Social Proof can be used to stimulate a person’s compliance with a request by informing him or her that many other individuals, perhaps including role models, are or have been performing this behavior. This tool of influence provides a shortcut for determining how to behave, but at the same time it can make those who rely on the shortcut vulnerable to the manipulations of others who seek to exploit such influence through such things as seminars, group introduction dinners, retreats, etc. Group members may then provide the models for the behavior that each group plans to produce in its potential new members.

Social proof is most influential under two conditions:

  1. Uncertainty–when people are unsure and the situation is ambiguous, they are more likely to observe the behavior of others and to accept that behavior as correct.
  2. Similarity–people are more inclined to follow the lead of others who are similar.

One recommendation for reducing susceptibility to contrived social proof is a greater sensitivity to clearly counterfeit evidence. That is–what others are doing should not form the sole basis for decision-making.

Liking

People prefer to say yes to individuals they know and like. This simple rule helps explain how Liking can create influence and how compliance professionals may emphasize certain factors and/or attributes to increase their overall attractiveness and subsequent effectiveness. Compliance practitioners regularly use several such factors.

Physical attractiveness–is one feature of a person that can help to create influence. Although it has long been suspected that physical beauty provides an advantage in social interaction, research indicates that this advantage may be greater than once supposed. Physical attractiveness seems to engender a “halo” effect that extends to favorable impressions of other traits such as talent, kindness, and intelligence. As a result, attractive people are more persuasive both in terms of getting what they request and in changing others’ attitudes.

Similarity–is a second factor that influences both Liking and compliance. That is–we like people who are like us and are more willing to say yes to their requests, often without much critical consideration.

Praise–is another factor that produces Liking, though compliments can backfire when they are crudely transparent. Generally, however, compliments enhance liking and can be used as a means to gain compliance.

Increased familiarity–through repeated contact with a person or thing is yet another factor that normally facilitates Liking. But this holds true principally when that contact takes place under positive rather than negative circumstances. One positive circumstance that works well is mutual and successful cooperation.

A final factor linked to Liking is association. By associating themselves with products or other positive things, those who seek influence frequently share in a halo effect. Other individuals as well appear to recognize the positive effect of simply associating themselves with favorable events and distancing themselves from unfavorable ones.

A potentially effective response that reduces vulnerability to the undue influence of Liking upon decision-making requires recognizing how Liking and its attendant factors may impact our impression of someone making requests and soliciting important decisions. That is–noticing that a requester seems to be doing inordinately well with us under certain circumstances should cause us to step back from the social interaction and objectively separate the requester from his or her offer or request. We should make decisions and commitments, and offer compliance, based upon the actual merits of the offer or request.

Authority

In the seminal studies conducted by Milgram regarding obedience there is evidence of the strong pressure within our society to comply when requested by an authority figure. The strength of this tendency to obey legitimate authorities derives from systematic socialization practices designed to instill in society the perception that such obedience constitutes correct conduct. Additionally, it is frequently adaptive to obey the dictates of genuine authorities because such individuals usually possess high levels of knowledge, wisdom, and power. For these reasons, deference to authorities can occur in a mindless fashion as a kind of decision-making shortcut. When reacting to authority in this automatic fashion, there is a tendency to respond to the mere symbols of authority rather than to its substance.

Three types of symbols have been demonstrated through research as effective in this regard:

  1. Titles
  2. Clothing
  3. Automobiles

In separate studies investigating the influence of these symbols, individuals who possessed one or another of them, even without other legitimizing credentials, were accorded more deference or obedience by those they encountered. Moreover, in each instance, those who deferred or obeyed underestimated the effect of authority pressures upon their behavior.

A meaningful defense against the detrimental effects of undue influence gained through authority can be attained by asking two questions:

  1. Is this authority truly an expert?
  2. How truthful can we expect this expert to be?

The first question directs our attention away from symbols and toward actual evidence for authority status. The second advises us to consider not just the expert’s knowledge in the situation, but also his or her trustworthiness. With regard to this second consideration, we should be alert to the trust-enhancing tactic in which a communicator may first provide some mildly negative information about himself or herself. This can be seen as a strategy to create the perception of honesty–making subsequent information seem more credible to those listening.

Scarcity

According to the Principle of Scarcity, people assign more value to opportunities when they are less available. The use of this principle for profit can be seen in such high-pressure sales techniques as offering only a “limited number” or setting a “deadline” for an offer. Such tactics attempt to persuade people that number and/or time restrict access to what is offered. The scarcity principle holds true for two reasons:

  1. Things that are difficult to attain are typically more valuable, and the availability of an item or experience can serve as a shortcut cue to its quality.
  2. When something becomes less accessible, the freedom to have it may be lost.

According to psychological reactance theory, people respond to the loss of freedom by wanting to have it more. This includes the freedom to have certain goods and services. As a motivator, psychological reactance is present throughout the great majority of a person’s life span. However, it is especially evident at a pair of ages: “the terrible twos” and the teenage years. Both of these periods are characterized by an emerging sense of individuality, which brings to prominence such issues as control, individual rights, and freedoms. People at these ages are especially sensitive to restrictions.

In addition to its effect on the valuation of commodities, the Principle of Scarcity also applies to the way that information is evaluated. Research indicates that the act of limiting access to a message may cause individuals to want it more and to become more favorable toward it. The latter of these findings–that limited information is more persuasive–seems the most interesting. In the case of censorship, this effect occurs even when the message has not been received. When a message has been received, it is more effective if it is perceived to consist of some type of exclusive information.

The scarcity principle is most likely to hold true under two optimizing conditions:

  1. Scarce items are heightened in value when they are newly scarce. That is–things have higher value when they have recently become restricted than things that were restricted all along.
  2. People are most attracted to scarce resources when they compete with others for them.

It is difficult to prepare ourselves cognitively against scarcity pressures because they have an emotional quality that makes thinking difficult. In defense, we might try to be alert to the sudden rush of emotion in situations involving scarcity. This awareness may allow us to remain calm and to assess the merits of an opportunity in terms of why we really want it and what we objectively need.

This is based upon summary notes from the book Influence by Robert B. Cialdini, Ph.D. (Quill, New York, 1984; revised 1993).