Cultic Studies Review, 4(2), 2005, Pages 146-168

Cult Awareness Groups and NRM Scholars: Toward Depolarization of Key Issues

Michael D. Langone, Ph.D.

Abstract

Since the 1970s, scholars and professionals studying cultic groups have been divided into two camps with sometimes widely differing views. Although the phrases “anti-cultists” and “pro-cultists” are commonly used to describe the two camps, this article refers to them as sympathizers and critics. The article discusses the history of the academic disputes, attempts over the years to encourage dialogue between the two camps, and changes in the population of help seekers. Dialogue in recent years has decreased the polarization and increased communication between the two “camps.”

What has sometimes been called "the cult wars" began in the 1970s. At that time ad hoc grassroots organizations began to form in response to what was perceived to be a sudden increase in the number of young people joining nonmainstream religious groups, i.e., "cults." Virtually all of these organizations were founded and supported by parents of cult-involved young people, mostly of college age. The term “cult” has been defined in many different ways. I now prefer a modification of a definition advanced by sociologist Benjamin Zablocki (cf. Rosedale & Langone, 1998): "an ideological organization held together by charismatic relationships and demanding a high level of commitment." However, most of the early critics of cultic groups, including me, placed exploitative manipulation at the core of our definitions. A popular definition in this tradition is:

Cult (totalist type): A group or movement exhibiting a great or excessive devotion or dedication to some person, idea, or thing and employing unethically manipulative techniques of persuasion and control (e.g. isolation from former friends and family, debilitation, use of special methods to heighten suggestibility and subservience, powerful group pressures, information management, suspension of individuality or critical judgment, promotion of total dependency on the group and fear of leaving it, etc.), designed to advance the goals of the group's leaders, to the actual or possible detriment of members, their families, or the community. (West & Langone, 1986, pp. 119-120)

The late Henrietta Crampton, who founded a parent organization concerned about Children of God and was one of the early leaders of the Citizens Freedom Foundation (later renamed the Cult Awareness Network), told me that in the very early days of this movement (early 1970s), she and other parents were able to visit the headquarters of cults and talk to their young adult children. As a result, many of the young people left the group. According to Crampton, the groups then tightened their boundaries and made contact between families and members more and more difficult. What little communication was allowed was closely monitored by group leaders. Parents in these interactions saw their children's conversations as stilted, rehearsed, and under the control of their "handlers." Analogies with Korean War era brainwashing were easy to make, and the nascent parent organizations began to say that their children had been brainwashed, had been "programmed" to behave in predetermined ways.

In response to these closed boundaries, desperate parents, according to Crampton, attempted to remove their children from groups through trickery or force. Once away from their handlers, the young adult members were persuaded to listen to information censored by their groups. Most then left. The concept of "deprogramming" came into being, and a small number of paid consultants began to specialize in getting kids out of cults. The excitement and risk associated with deprogramming energized the parent organizations. It also stimulated journalists, who wrote many emotional stories about the process and the groups from which people had been "rescued." Although deprogramming resulted in exit more often than not (studies by Bromley and me both indicated about a 60% exit rate – Bromley, 1988; Langone, 1984), the failure rate, and consequent legal risks, were sufficiently high to cause many parents to agonize over the decision (many also were troubled about the ethics of intervening) and to feel much anxiety before and during the intervention.

The groups that deprogrammers were most likely to target were the Unification Church, ISKCON, Scientology, Children of God, The Way International, and Divine Light Mission. Not surprisingly, the groups fought back through public relations campaigns and special training for their members. I remember, for example, hearing former Moonies report that they had been told to slash their wrists if abducted by deprogrammers, so that they could get to a hospital, where their fellow Moonies could "rescue" them. I also remember reading lurid, anti-deprogramming propaganda, which parents, deprogrammers, and former members who had been deprogrammed derided, much as current members scoffed at some of the media portrayals of cult life.

Parents’ frustration and anger led them to do what angry citizens often do: they turned to the government and tried to have conservatorship laws passed in various states so that they would be able to force the groups to turn the parents’ adult children over to mental health authorities. Fortunately, in my opinion, none of these laws ever passed because the potential for abuse outweighed the potential benefits. Nevertheless, it took several years before the parent organizations gave up their legislative endeavors.

The combination of deprogramming and attempts to pass conservatorship laws seems to have motivated some scholars of cults (who began using the term "new religious movement" to distinguish their work from the media stories about “cults”) to advocate on behalf of the targeted groups. Their advocacy included writing derogatory articles and books (e.g., Bromley & Shupe, 1981) about the so-called "anti-cult movement." As a result, "anti-cultists" labeled these scholars "pro-cultists."

With few exceptions, members of the two camps talked only with their compatriots. Simple-minded stereotypes flourished. And research to which everybody should have paid attention went unnoticed outside the camp in which it originated. Burke Rochford's research on child abuse in ISKCON and on ISKCON's development from a communal to a congregational organization (Rochford, 1983, 1997, 1998; Rochford & Heinlein, 1998), for example, should have been required reading for any parent or professional concerned about a Krishna case. Instead, hardly anybody in the so-called "anti-cult movement" even knew about his research until ICSA (then AFF) engaged in substantial dialogue with so-called “pro-cultists” toward the end of the 1990s. I believe that scholars in the so-called "pro-cult" camp were equally ignorant or dismissive of our work (Kropveld, 2003).

In the late 1990s a dialogue began between a handful of critics and sympathizers (much credit goes to Dr. Eileen Barker for initiating this dialogue). This dialogue has resulted in many private conversations (personal and e-mail) and talks at the “other camp’s” conferences. ICSA conference speakers in recent years have included Nancy Ammerman, Dick Anthony, Massimo Introvigne, Phil Lucas, Gordon Melton, Burke Rochford, and, of course, Eileen Barker. These exchanges have, I hope, caused at least some of us to revise our thinking about contentious issues.

There have certainly been changes within the critical community, although only some of these changes can be attributed to the academic dialogue. Changes in the population of people seeking information and assistance from the critical community have also contributed significantly to the growing recognition that the cult issue is much more complex and nuanced than was once thought. Two surveys (see Table 1) strikingly reveal these population changes (Conway, Siegelman, & Coggins, 1986; Langone, 1992). Both surveys used similar snowball methods. Conway et al.’s subjects had closer ties to the parent organizations, as demonstrated by the fact that about 70% had been deprogrammed. In my study 60% of the subjects had left on their own without any outside intervention, 9% had been ejected from their groups, and only 13% had been deprogrammed. Although Conway et al. had 426 subjects and I had only 308, their 426 subjects came from a total of 48 groups, whereas my smaller number of subjects came from a total of 101 groups. Seventy-six percent of their subjects came from only 5 of the 48 groups (the Unification Church, Scientology, The Way, Divine Light Mission, and ISKCON), and 44% of their subjects came from the Unification Church. The five largest groups in my sample comprised only 33% of the total subject population. Of the "big five" in Conway et al.’s survey, only Scientology had a significant representation in my sample (16%).

Table 1: Comparison of Conway, Siegelman, & Coggins with Langone

Thus, nearly half of the Conway et al. subjects were former Moonies. Moreover, because the Unification Church focused on college recruitment during this period, the average age of joining for Conway et al.’s subjects was less than in my survey and still in the college years: 21 vs. 24.8. Many, if not most, of the Conway and Siegelman Moonie subjects probably dropped out of college after joining. In my study, 38% of those who were students when they joined (43% of the total) dropped out of school after joining. A son or daughter's sudden dropping out of college was probably one of the factors that most mobilized parents during the late 70s and early 80s.

The dominance of Unification Church members in the Conway et al. study clearly affected how parents and helping professionals in the critical community thought about “cult” issues during the late 1970s and 1980s. The Moonie recruitment program as practiced at Booneville in California was seen to fit the “brainwashing” model far better than the practices of any other group, mainly because its methods were systematic and relatively effective (Scharff, 1985). Some English Moonies believed that their American counterparts were “fanatics” (Donna Collins, personal communication), so Booneville may have reflected an extreme even within the Unification Church. Because parents of Moonies and former Moonies constituted such a high percentage of the early parent organizations (and so many of the parents’ children had gone through Booneville), it is no wonder that many people in these organizations thought "Moonie" and "brainwashed" when they thought "cult." Whereas a new psychology Ph.D., such as I was at the time, had his head full of concepts such as "random samples" and "external validity" (I used to talk about the "Moonification" of the cult phenomenon in those days), parents without training in statistical methodology (and unfortunately some mental health professionals who didn't apply their research training) tended to base their conclusions on personal experience rather than on systematic methods of data collection (much as, by the way, sociologists appear to have based their conclusions about the so-called "ACM" – Kropveld, 2003).

I would like to think that the lectures and writings of people like me, who cautioned against overgeneralizing from selected samples, caused the decline in power of the "Moonie model" of cult recruitment and conversion. But the change in views within the critical community probably resulted more from the decline of Moonie cases to a trickle by the late 80s and the influx into the critical network of the ex-members represented in my survey—people who had walked out on their own at a relatively advanced age from a large variety of groups bearing no resemblance to the Moonie Booneville program. As these ex-members described their group lives and talked to others, the personal experience of people within the critical network changed, and their views on cults became more nuanced.

Hence, by the time the academic dialogue between the two “camps” began in the late 1990s most experts in the critical community knew in their bones, even if they didn’t articulate it, that the cult world was a lot more complex than the once dominant Moonie model implied. Moreover, most of the people attending our conferences hadn't been around during the late 70s and early 80s and only knew vaguely, if at all, of the "cult wars." Thus, changes in the critical community made it receptive to the dialogue that I hope will continue well into the future.

Scholarship on “Brainwashing”

Until Ben Zablocki (1997) began to defend the brainwashing notion, the sociological literature tended to portray "brainwashing" in extreme terms, conflating lay and professional presentations. Dr. Barker captured the essence of what the sociologists said we said:

Recruitment that employs deception should, however, be distinguished from "brainwashing" or "mind control." If people are the victims of mind control, they are rendered incapable of themselves making the decision as to whether or not to join a movement—the decision is made for them. (Barker, 1989/1995, p. 17)

Dr. Barker's encapsulation of what she thought "ACMers" said certainly was accurate to a point, especially for certain vocal laymen, such as Ted Patrick (1977). Dr. Barker did not quote any "ACM" experts on the definition of "brainwashing"; my impression is that sociologists hardly ever quoted us when they were telling their readers what we said.

Most of the mental health professionals of this era, who preferred to eschew the term "brainwashing," had views that were more nuanced and interactional than the “robot” view of “brainwashing” implied in Dr. Barker’s quote above. Even some of the early writings of critics, in my opinion, set the stage for, or at least provided some theoretical framework for, the changes within the critical community that I alluded to above. Let me simply give a few quotes as examples:

The cults these people belonged to maintain intense allegiance through the arguments of their ideology, and through social and psychological pressures and practices that, intentionally or not, amount to conditioning techniques that constrict attention, limit personal relationships, and devalue reasoning. (Singer, 1979, January, p. 75)

Clearly, the groups are far from uniform, and what goes on in one may or may not go on in another. Still, when in the course of research on young adults and their families over the last four years, I interviewed nearly 300 people who were in or who had come out of such cults, I was struck by similarities in their accounts. For example, the groups’ recruitment and indoctrination procedures seemed to involve highly sophisticated techniques for inducing behavioral change. (Singer, 1979, January, p. 72)

With respect to a specific susceptibility to conversion, the usual psychiatric categories do not entirely satisfy and indeed are confusing. A great variety of persons from the early teens to the 50s, with a wide variety of personality strengths and weaknesses, have entered these groups. The cults themselves select a segment of the marketplace and, as with any new enterprise, thrive only if they develop technical skills to build a core group and maintain internal congruity. The attempts of many observers to describe salient personality traits that render converts vulnerable and place them in pathological categories have been misleading, because they have tended to obscure the fact of nearly universal susceptibility to sudden change in the general population. In my studies of more than 60 subjects in all stages of involvement, about 60% by examination and by history obtained from relatives have been substantially and chronically disturbed and unhappy for many years. A large share of this group actively had sought conversions repeatedly. About 40%, however, were by history and examination essentially normal, maturing persons. Their susceptibility to conversion was either an artifact of the aggressive manipulation of a proselytizer or the result of a normal painful crisis of maturation. Singer’s estimate indicates an even larger percentage of normal persons (75%), while Galanter et al., in their questionnaire study of 237 members of the Unification Church, found “9% who felt they had serious emotional problems in the past” that had “led to professional help (30%) and even hospitalization (6%).” (Clark, 1979, 279-281.)

The actual act of cult affiliation frequently occurs within a group setting. This aspect of the complex problem of affiliation and conversion is often dealt with under the rubric of “group pressures,” without attempting to define or understand the dynamics that are operative within the group setting. This article explores the process of cult affiliation and the maintenance of cult membership from the vantage point of group dynamics. The recent additions to our understanding of group processes provided by object relations theory and the work of Bion (Kernberg, 1977) and Ezriel (1952) are particularly important as they reflect upon the inherent potential within all groups for cultlike transformation and on the regressive potential within individuals for loss of individuality and autonomy. (Halperin, 1982, p. 13)

But cults do not step into a vacuum. Each recruit brings a unique personal history that includes character, life aspirations, and family background, as well as distinctive qualities of development and maturation. Many cults are skilled at assessing these subtleties and tailoring an enticing approach around them. Through cult indoctrination, a person’s sense of self is often drastically altered. Recruits are likely to have internal experiences they cannot explain. They frequently form intense new friendships and assimilate a new language as well as new beliefs and mannerisms. The changes resulting from cult indoctrination can leave family and friends frightened and bewildered. The respect and mutuality which normally sustain their bonds with the recruit are often ruptured by the cultist’s new identity. Familiar and habitual patterns of communication no longer seem to affect the cult member. This often leads families to feel that “programming” has replaced the individual they once knew. (Scharff & Zeitlin, 1984, pp. 1-2)

In treating individuals and/or families reporting adverse cult involvement, clinicians should not assume that all such involvements are necessarily harmful or a result of unethical persuasion. Rather, they should remain alert to alternative explanations of and responses to reported distress. Sometimes, for example, family members aver, through ignorance rather than motivated distortion, that a convert has been “brainwashed” by a group when, in fact, he or she has made an informed, unpressured decision to join. Sometimes family members and/or former cultists avoid confronting family or personal distress by attributing all problems to the cult, which may be at most only partly responsible. And sometimes family members and/or clinicians may try to persuade a convert, particularly one who had psychological problems before conversion, to leave a cult which, even though being harmful in some respects, offers him or her more psychological support (e.g., friends, a simplified environment, a vocational niche) than the non-cult environment provides. On the other hand, clinicians should not adopt a “blame-the-parents” posture, for this can easily lead to a false dismissal of cult-related harm. (Langone, 1984, p. 6)

We hasten to add, however, that our argument should not be construed as condemning all cults and cult-like organizations as unhealthy. Different groups present different settings – some harmful, some benign, others perhaps constructive – to potential converts, each of whom in turn presents a unique personality – sometimes healthy, sometimes troubled, always more or less vulnerable – to the proselytizers with whom he comes into contact. We are concerned here exclusively with cults which regularly effect drastic and destructive personality changes in many converts. (Clark, Langone, Schecter, & Daly, 1981, p. 6)

To those of us working with families and former group members it appeared that sociologists of religion ignored these distinctions and attributed Ted Patrick's simplistic views to everybody associated with the so-called "anti-cult movement." Sociologists seemed to confuse the processes or conditions associated with "brainwashing" with one of many possible outcomes of a "brainwashing" process, namely, the creation of deployable agents. They depicted brainwashing in the starkest, black-and-white terms. The possibility that one brainwashing program might be ineffective, another moderately effective, and still another very effective seemed to have escaped notice. Either one was brainwashed or one was free. Ted Patrick might have agreed. But most of the professionals in the field saw the phenomenon as a more complex process and viewed "brainwashing" as a term that referred to a continuum of influence. Indeed, most of my colleagues, including exit counselors who stress information sharing with cultists, would probably endorse what Dr. Barker said after her quotation cited above:

If, on the other hand, it is just deception that is being practiced, converts will be perfectly capable of making a decision—although they might make a different decision were they basing their choice on more accurate information. It can be argued, and it is indeed frequently the case, that when people who have joined a movement realize that the movement’s beliefs or practices are not what they had initially thought them to be, they leave the movement. It can also be argued that once some people have actually joined a movement on the basis of false information, they are more likely to stay because they have become subjected to further influences; they may, for example, have formed strong emotional attachments to members of the movement during the “extra time” that was gained through the deception. (Barker, 1989/1995, pp. 17-18)

I have elsewhere elaborated upon Dr. Barker’s propositions in a Web article entitled, “Deception, Dependency, and Dread”:

Although the process here described is complex and varied, the following appears to occur in the prototypical cult conversion:

  • A vulnerable prospect encounters a cultic group.

  • The group (leader[s]) deceptively presents itself as a benevolent authority that can improve the prospect's well-being.

  • The prospect responds positively, experiencing an increase in self-esteem and security, at least some of which is in response to what could be considered a "placebo" effect. The prospect can now be considered a "recruit."

  • Through the use of "sharing" exercises, "confessions," and skillful individualized probing, the group [leader(s)] assesses the recruit's strengths and weaknesses.

  • Through testimonies of group members, the denigration of the group's "competitors" (e.g., other religious groups, other therapists), the tactful accentuation of the recruit's shameful memories and other weaknesses, and the gradual indoctrination of the recruit into a closed, nonfalsifiable belief system, the group's superiority is affirmed as a fundamental assumption.

  • Members' testimonies, positive reinforcement of the recruit's expressions of trust in the group, discreet reminders about the recruit's weaknesses, and various forms of group pressure induce the recruit to acknowledge that his/her future well-being depends upon adherence to the group's belief system, more specifically its "change program."

  • These same influence techniques are joined by a subtle undermining of the recruit's self-esteem (e.g., by exaggerating the "sinfulness" of experiences the recruit is encouraged to confess), the suppression or weakening of critical thinking through fatiguing activity, near-total control of the recruit's time, trance-induction exercises (e.g., chanting), and the repetitive message that only disaster results from not following the group's "change program." These manipulations induce the recruit to declare allegiance to the group and to commit to change him/herself as directed by the group. He or she can now be considered a convert embarking on a path of "purification," "enlightenment," "self-actualization," "higher consciousness," or whatever. The recruit's dependency on the group is established and implicitly, if not explicitly, acknowledged. Moreover, he/she has accepted the group's authority in defining what is true and good, within the convert's heart and mind as well as in the world.

Although, like most of my colleagues, I have avoided the term “brainwashing,” if pressed I would say that the above bulleted statements more or less describe the first phase of what one might call “brainwashing.” This process does not make “robots.” It does not eliminate choice, the capacity to make decisions. The process manipulates choice, resulting in what Dr. Janja Lalich, in a new theoretical formulation, calls “bounded choice” (Lalich, 2004). Moreover, the tension inherent in yielding to group pressures—what some have called a “pseudo-identity” (West & Martin, 1996)—sets the stage for future conflicts and disillusionment, which may eventually lead to departure from the group. Why most members of most groups eventually leave while others remain is a question for which, in my opinion, we still don’t have an adequate answer. My impression is that there are many answers, reflecting many kinds of interactions between individuals and powerful environments.

Scholarship on the “Anti-Cult Movement” (ACM)

Kropveld (2003) summarizes the main problem with sociological research on the so-called ACM:

The sociological literature on the “anti-cult movement” repeatedly makes the mistake of presuming that all organizations and individuals, who express concerns about cults, have uniform objectives, a common agenda, and close, interlinking relationships. In fact, there are numerous differences, and most “ACM” groups know very little about other groups and individuals. Here is a partial list of organizations that Dr. Barker might categorize as “cult awareness groups” (all are from North America unless otherwise indicated):

Info-Cult/Info-Secte

American Family Foundation

Cult Information Service

Freedom of Mind Foundation

New England Institute of Religious Research

Maine Cult Information Network

Cult Hotline and Clinic of the New York Jewish Board of Family and Children’s Services

Cult Awareness and Information Center (Australia)

Cult Awareness Center

Cult Information & Family Support (Australia)

Edmonton Society Against Mind Abuse

Ex-Cult Resource Center

FACTNet

FAPES (Argentina)

Forum Against Cults (Israel)

Free Minds

Wellspring Retreat and Resource Center

reFOCUS

Religious Movement Resource Center

REVEAL

The Ross Institute (RI)

Saskatchewan Citizens Against Mind Control

Mind Control Research Center (Japan)

I have not listed the dozens of organizations that fall under Dr. Barker’s “Countercult Groups.” Nor have I listed European organizations that are members of FECRIS (Fédération Européenne des Centres de Recherche et d'Information sur le Sectarisme – European Federation of Centres of Research and Information on Sectarianism) and other European organizations. Moreover, I could have listed hundreds of individuals who have written about cultic groups and/or who offer services to people believing such groups have harmed them. Hence, scholars who generalize about “the” “anti-cult movement,” when they have had at best superficial contact with only a few organizations and individuals, make the same error as laymen and helping professionals who generalize from their limited experience to the wide world of cults/NRMs, in which there are thousands of groups. (Kropveld, p. 134)

I agree completely with Kropveld. I have worked in this field for over 20 years as a member of the so-called "anti-cult movement." Yet I hardly know anything at all about most of the organizations on Kropveld's list. I feel confident writing about ICSA (formerly AFF - Langone, 2002), but I would not feel qualified to write about other organizations in the list unless, at a minimum, I conducted in-depth interviews with people familiar with those organizations. And I certainly wouldn't generalize from my N of 1 (ICSA/AFF) to the entire set of organizations.

What body of evidence have sociologists used to make generalizations about the "ACM"? Kropveld and a colleague searched approximately 20 databases "for studies on individual 'ACM' groups, and were unable to find even one sociological study that was systematically researched" (Kropveld, 2003, p. 132). This lack of empirical data (surveys or in-depth field work) is consistent with my own N-of-one experience at AFF. Until Eileen Barker began attending our conferences a few years ago, the only sociologist who ever visited AFF, so far as I can remember, was Dr. Larry Shinn, back in the early 80s, and he only stayed for a couple of hours, mostly talking to my colleague Dr. Robert Schecter. I had some long and interesting conversations with Dr. Thomas Robbins in the early 80s, but these were usually about intellectual issues, not the "ACM" as an organization.

Sociologists have correctly criticized my colleagues and me for sometimes overgeneralizing from limited clinical observations to the broad population of "cults." But in my opinion sociologists have had a huge blind spot when it comes to their own overgeneralizations about the so-called "ACM."

Scholarship on NRMs

Sociological research on NRMs has been reviewed elsewhere (e.g., Ofshe, 1991; Robbins, 1988).

Clinical experience, for the most part, has shaped the opinions of mental health professionals, for better and worse. On the plus side, clinical experience can provide profound insights into the dynamics of the individual case, which may sometimes illuminate other cases. On the negative side, clinical experience tends to reflect selection factors (e.g., only psychologically troubled people come for counseling) that limit the generalizability of the clinical findings.

I have argued elsewhere (Langone, 2002) that the driving force behind clinical interest in cults/NRMs is the fact that some groups under some circumstances harm some people. We can debate the merits of various theories advanced to explain this phenomenon. Given the current state of empirical research in the area, we can debate the prevalence of harm within and across groups. But that harm exists and is not inconsequential seems to me to be beyond dispute. Collectively, my colleagues and I have worked with thousands of former group members. The "bad" things that they report cannot all be simplistically attributed to "face-saving" or "sour grapes." "Atrocity tales" (a term some sociologists have used to denigrate former member reports – Bromley, Shupe, & Ventimiglia, 1979) are often "atrocity facts." Moreover, psychological and physical pain can sometimes be severe without being an "atrocity."

During the past 10-15 years research-oriented clinicians and a few academic psychologists have conducted empirical studies to help us better understand the nature and magnitude of harm. There has also been some effort devoted to the question of prevalence (Bloomgarden & Langone, 1984; Hulet, 1984; ICR Survey Research Group, 1993; Lottick, 1993; Zimbardo & Hartley, 1985).

Nature and Magnitude of Harm

In a review of the psychological literature, Aronoff-McKibben, Lynn, and Malinoski (2000) conclude:

The available evidence warrants three conclusions: (a) persons entering cults do not necessarily exhibit psychopathology; (b) current cult members appear psychologically well-adjusted generally, and demonstrate few conspicuous symptoms of psychopathology. However, pathology may be masked by conformity pressures and demand characteristics associated with the cultic environment; (c) a small but growing body of research indicates that at least a substantial minority of former cult members experience significant adjustment difficulties. There also are indications that these difficulties cannot be ascribed to demand characteristics. (p. 91)

I was involved in one of the few studies that have been able to include comparison groups (Langone, 1996; Malinoski, Langone, & Lynn, 1999). In this study I compared former members of the International Churches of Christ, former Catholics, and graduates of InterVarsity Christian Fellowship. One component of the study used the Group Psychological Abuse Scale to compare these three groups' perceptions of their group environments. I reasoned that if the negative reports of former cult members resulted merely from their being "disgruntled" and "ex," then former Catholics should rate the Catholic Church as former ICC members rate their group. Although the ex-Catholics rated their former church more negatively than IV graduates (who left the group because of graduation, not because they were unhappy with it), their ratings were still well below the abuse-nonabuse cutoff point and were several standard deviations below the ratings of ex-ICC members. A recent, as yet unpublished, study in Mexico, using a Spanish version of the GPA, obtained virtually identical ratings from former Mexican Catholics and ICC members.

Counseling: Who Needs It and What Do They Need?

The psychological research findings on pre-cult counseling indicate that a sizeable minority of cult joiners sought help before their group experience, but it is far from certain that they are much different on this dimension from the general population:

Post-cult distress may at least in part reflect pre-cult psychopathology. This suggestion is bolstered by findings concerning the percentage of cultists reporting pre-cult psychological counseling: 7% (Barker, 1984), 30% (Spero, 1982), 30% (Galanter, Rabkin, Rabkin, & Deutsch, 1979), 38% (Galanter & Buckley, 1978), 42% (Langone, unpublished), 59% (Knight, 1986), and 62% (Sirkin & Grellong, 1988). Averaged out, these studies indicate that approximately one-third of former cultists had had counseling before joining the cult, a finding that is very similar to Sirkin & Grellong’s (1988) non-cult comparison group of Jewish youth, 33% of whom had had counseling. It is certainly possible that cult joiners tend to be somewhat more troubled psychologically than nonjoiners. However, since a comprehensive National Institute of Mental Health epidemiological study found that approximately 20% of the general population suffer from at least one psychiatric disorder (Freedman, 1986), the level of psychopathology in the cult joiner population may not be much greater than that of the population as a whole. Cult joiners may simply be more willing to seek help, which could possibly contribute to their susceptibility to cultic recruitment. Moreover, even if on the whole cultists are more disturbed psychologically, a majority appear to have been within the normal range psychologically before they joined their group. (Martin, Langone, Dole, & Wiltrout, 1992, p. 222)

Langone (1992) found that 70% of his 308 subjects sought counseling after leaving their groups, while 42% had received counseling prior to joining. Thus, it seems likely that a significant percentage of former group members seek professional help after their group experience.

Even if my prevalence estimate of 85,000 people joining and leaving cultic groups is much higher than the reality (Langone, no date), it is clear, given the small number of professionals with cult expertise, that the vast majority of former cultists in psychotherapy obtain help from professionals with no particular expertise related to cults. Although, all else being equal, I believe it is preferable for a therapist treating a former group member to be familiar with the cult literature, I have often said that I would rather refer an ex-cultist to a competent therapist who knew nothing about cults, particularly one with experience treating traumatized populations such as battered women (see Ramirez-Boulette & Andersen, 1986; Ward, 2000), than to an incompetent therapist who had read everything ever published on the subject.

Former cult members, like other psychotherapy populations, seek help because they are distressed, because they have "symptoms." Specific symptoms can be treated in different ways (e.g., depression can be treated successfully with pharmacotherapy, cognitive therapy, or interpersonal therapy). So it would not surprise me if most former cult members benefited enough from psychotherapy that did not specifically address cult issues to get on with their lives, even if they still have unfinished “cult business” in one or more of their mind’s “compartments.”

Nonetheless, we do regularly encounter cases in which years of traditional therapy proved ineffective. Sometimes people who have been out of their groups for 20 years or more come to an ex-member workshop and experience an epiphany that leads to a therapeutic breakthrough. They come to see that their enduring distress results in part from what was "done to them," not only from what they did. They will typically say something such as, "Now I understand what happened to me." They become liberated from the self-blaming patterns into which their group had indoctrinated them.

Research Needs

Although we have learned a lot about cultic groups over the past 25 years, there are still major gaps in our understanding. We need to carry out coordinated programs of research that address the following areas:

  • Further development of measures that assess group environments and people's perceptions of those environments.

  • More effective measures of harm, particularly measures sensitive to the identity and dissociative disturbances reported in the clinical literature.

  • More complex assessment protocols designed to enhance treatment as well as to collect data useful in research (Wellspring Retreat and Resource Center, for example, has been exemplary in integrating research and clinical goals).

  • Psychologically sensitive participant observational studies of specific groups.

  • Group-specific surveys (even simple ones can be valuable) that illuminate the great variation, even among ex-member respondents, and that challenge us to ask why X% of subjects said "no" as well as why Y% of subjects said "yes."

  • Surveys of parents.

  • Surveys of scholars and helping organizations.

  • We always need more well-written personal accounts, clinical case studies, and clinical guidelines – but we need papers that grapple with greater levels of complexity.

References

Aronoff-McKibben, J., Lynn, S. J., & Malinoski, P. T. (2000). Are cultic environments psychologically harmful? Clinical Psychology Review, 20, 91-111. (Reprinted in Cultic Studies Review, 1(3), 2002. Available at http://www.culticstudiesreview.org/csr_member/mem_articles/mckibben_jodi_csr0103.htm.)

Barker, E. (1984). The making of a Moonie: Choice or brainwashing? Oxford: Basil Blackwell.

Barker, E. (1989/1995). New religious movements: A practical introduction (Fifth impression). London: HMSO.

Bromley, D. G. (1988). Deprogramming as a mode of exit from new religious movements: The case of the Unification Movement. In D. G. Bromley (Ed.), Falling from the faith: Causes and consequences of religious apostasy (pp. 185-204). Beverly Hills: Sage.

Bromley, D. G., Shupe, A. D., & Ventimiglia, J. C. (1979). Atrocity tales, the Unification Church and the social construction of evil. Journal of Communication, 29, 42-53.

Bromley, D. & Shupe, A. (1981). Strange gods: The great American cult scare. Boston: Beacon Press.

Kropveld, M. (2003). An example for controversy: Creating a model for reconciliation. Cultic Studies Review, 2(2), 130-150. Available at http://www.culticstudies.org/infoserv_articles/kropveld_michael_anexampleforcontroversy.htm.

Conway, F., Siegelman, J. H., & Coggins, J. (1986). Information disease: Effects of covert induction and deprogramming. Update, 10(2), 45-57.

ICR Survey Research Group. (1993, Aug. 4-8). Cult screening test. Media, PA: AUS Consultants.

Lalich, J. (2004). Bounded choice: True believers and charismatic cults. Berkeley: University of California Press.

Langone, M. D. (no date). Prevalence. http://www.icsahome.com/infoserv_articles/_add/langone_michael_prevalence.htm

Langone, M. D. (1992). Questionnaire study: preliminary report. Unpublished manuscript. Available at: http://www.csj.org/infoserv_articles/langone_michael_questionnairesurvey.htm

Langone, M. D. (1993). Recovery from cults: Help for victims of psychological and spiritual abuse. New York: Norton.

Langone, M. D. (1996). An investigation of a reputedly psychologically abusive group that targets college students. Report prepared for the Danielsen Institute, Boston University. Available at: http://www.culticstudies.org/infoserv_articles/langone_michael_bu_bcc_study.htm

Langone, M. D. (2002). Deception, Dependency, and Dread. Unpublished manuscript. Available at: http://www.csj.org/infoserv_articles/langone_michael_ddd.htm

Langone, M. D. (1984). Deprogramming: An analysis of parental questionnaires. Cultic Studies Journal, 1, 63-78.

Langone, M. D. (2002). Cults, conversion, science, and harm. Cultic Studies Review, 1(2), 178-186. Available at http://www.culticstudiesreview.org/csr_member/mem_articles/langone_michael_cultconv_csr0102.htm

Langone, M. D. (2002). History of the American Family Foundation. Cultic Studies Review, 1(1), 3-50.

Malinoski, P., Langone, M. D., & Lynn, S. J. (1999). Psychological distress in former members of the International Churches of Christ and noncultic groups. Cultic Studies Journal, 16(1), 33-52.

Martin, P. R., Langone, M. D., Dole, A. A., & Wiltrout, J. (1992). Post-cult symptoms as measured by the MCMI before and after residential treatment. Cultic Studies Journal, 9(2), 219-250.

Ofshe, R. (1991). Coercive persuasion and attitude change. In E. F. Borgatta & M. L. Borgatta (Eds.). Encyclopedia of sociology (pp. 212-224). New York: Macmillan.

Patrick, T. (1977). Let our children go! New York: Ballantine Books.

Ramirez-Boulette, T., & Andersen, S. (1986). "Mind control" and the battering of women. Cultic Studies Journal, 3(1), 25-35.

Robbins, T. (1988). Cults, converts, and charisma: The sociology of new religious movements. London: Sage Publications.

Rochford, E. B. (1983). Recruitment strategies, ideology, and organization in the Hare Krishna Movement. In E. Barker (Ed.), Of gods and men: New religious movements in the West (pp. 283-302). Macon, GA: Mercer University Press.

Rochford, E. B. (1997). Family formation, culture and change in the Hare Krishna Movement. ISKCON Communications Journal, 5(2), 61-82. Also available at: http://www.culticstudiesreview.org/csr_articles/rochford_burke.htm

Rochford, E. B. (1998). Further reflections on child abuse within ISKCON. ISKCON Communications Journal, 6(2), 64-67. Also available at: http://www.culticstudiesreview.org/csr_articles/rochford_burke2.htm

Rochford, E. B., & Heinlein, J. (1998). Child abuse in the Hare Krishna Movement: 1971-1986. ISKCON Communications Journal, 6(1), 41-69. Also available at: http://www.culticstudiesreview.org/csr_articles/rochford_burke_heinlein_full.htm

Rosedale, H., & Langone, M. D. (1998). On using the term “cult.” In American Family Foundation, Cults and psychological abuse: A resource guide (pp. 22-28). Bonita Springs, FL: American Family Foundation. Also available at http://www.culticstudies.org/infoserv_articles/langone_michael_term_cult.htm

Scharff, G. (1985). Autobiography of a former Moonie. Cultic Studies Journal, 2(2), 252-258. (Reprinted from The Center Journal, March/April, 1982.)

Ward, D. (2000). Domestic violence as a cultic system. Cultic Studies Journal, 17, 42-55.

West, L. J., & Langone, M. D. (1986). Cultism: A conference for scholars and policy makers. Cultic Studies Journal, 3, 117-134.

West, L. J., & Martin, P. R. (1996). Pseudo-identity and the treatment of personality change in victims of captivity and cults. Cultic Studies Journal, 13(2), 125-152.

Zablocki, Benjamin. (1997). The blacklisting of a concept: The strange history of the brainwashing conjecture in the sociology of religion. Nova Religio: The Journal of Alternative and Emergent Religions, 1(1), 96-121.

This article is based on a presentation at the Society for the Scientific Study of Religion, October 24-26, 2003, Norfolk, Virginia.

Michael D. Langone, Ph.D., a counseling psychologist, is ICSA’s Executive Director. He was the founding editor of Cultic Studies Journal (CSJ), the editor of CSJ’s successor, Cultic Studies Review, and the editor of Recovery from Cults. He is co-author of Cults: What Parents Should Know and Satanism and Occult-Related Violence: What You Should Know. Dr. Langone has spoken and written widely about cults. In 1995, he received the Leo J. Ryan Award from the "original" Cult Awareness Network and was honored as the Albert V. Danielsen Visiting Scholar at Boston University. (aff@affcultinfoserve.com)