Academic Disputes and Dialogue

ICSA e-Newsletter, Vol. 4, No. 3, 2005

Academic Disputes and Dialogue Collection: Preface

Michael D. Langone, Ph.D.

Over the years ICSA has published a number of articles addressing academic disputes and dialogue regarding cults and brainwashing (frequently called “thought reform,” “coercive persuasion,” or “mind control”). In order to illuminate the history of these disputes and the issues they have addressed, we have put together an online collection, “Academic Disputes and Dialogue” (see linked items in reference list).

The disputes became conspicuous in the late 1970s and early 1980s, when the Jonestown tragedy of 1978 made cults front-page news and when parents of some cult-involved youth were resorting to “deprogramming” their children in order to persuade them to leave the controversial groups. Some parents were also lobbying legislators to pass conservatorship legislation that would facilitate parents’ attempts to get their children out of cultic groups. Some academicians, mostly sociologists and religious scholars, were highly critical of these activists, while some professionals and academicians, mostly psychologists and psychiatrists, were sympathetic to the parents to varying degrees. This paper focuses on the disputes and dialogue among academicians and professionals. However, it should be noted that lay activists have played important roles in various disputes.

The academic disputants were often referred to as “pro-cultists” and “anti-cultists.” However, I prefer the terms “cult critics” and “cult sympathizers” in order to indicate that the disagreement is more a question of how much critical emphasis a scholar or professional deems appropriate, rather than whether or not he or she is “for” or “against” “cults.” Whatever terms one prefers, it is clear that by the early 1980s there were indeed two “camps” within the cultic studies field.

Two early and influential books expressing the sympathizers’ position were those of Bromley & Shupe (1981) and Robbins & Anthony (1981). Clark, Langone, Schecter, & Daly (1981) and Keiser & Keiser (1987) provided two of the more balanced critical perspectives on the cult issue during these early years. I have previously summarized the issues fueling the debate between the two camps:

Sympathizers viewed cultists as “seekers” who freely and rationally chose to join their groups. Critics viewed cult joining as a process dependent upon deception and manipulation, that is, as an illusory or an uninformed choice, as a more intense and enduring form of the psychosocial influence studied by social psychologists. Sympathizers, nevertheless, often misrepresented the critics’ position by portraying them as advocates of a robotization theory of cult conversion based on The Manchurian Candidate. Sympathizers saw cultists’ families as threatened by cults and desirous of gaining control over their cultist children. Critics saw families as worried and anxious to save their loved ones from cult harm. Sympathizers considered cults to be “innovative” groups and cult leaders to be “entrepreneurial.” Critics viewed cults as destructive and their leaders as deceitful and hypocritical. Sympathizers tended to accept at face value cultists’ reports while doubting the accuracy of ex-cultists and their reports, sometimes pejoratively referring to them as “apostates” (Lewis, 1989; Shupe & Bromley, 1981) and “atrocity tales” (Bromley, Shupe, & Ventimiglia, 1979), respectively. Critics tended to doubt the accuracy of the reports of cultists, whom they considered to be deceived and manipulated, and looked favorably on ex-cultists’ reports. Lastly, sympathizers condemned deprogramming and guardianship proposals, sometimes with a level of passion inconsistent with their official persona of “dispassionate scientists.” Critics, although not usually in favor of deprogramming, tended to sympathize with parents who attempted to deprogram their loved one and to be at least open to considering guardianship proposals. (Langone, 1993, p. 32)

Some attempts at dialogue between the two camps were made in the early 1980s. Volume 2, Number 1 (1983) of Cultic Studies Newsletter (precursor to Cultic Studies Journal) contained an interesting collection of essays that reflected the disputes of that day. Robbins (1983) commented on an article in the previous issue of that newsletter and, in so doing, expressed the substance and tone of many sympathizer critiques of cult critics during that time period (see also Robbins, 1985). Schuller (1983), in a review of Bromley and Shupe (1981), similarly reflected the critics’ critiques of sympathizers. In a tongue-in-cheek essay, I reflected on how we might enhance dialogue between the “two tribes” of cultic studies researchers (Langone, 1983).

Kilbourne (1985) edited the proceedings of an American Association for the Advancement of Science, Pacific Division conference in which critics and sympathizers participated. Held in Logan, Utah, this conference’s purpose “was to offer ‘neutral territory’ for the proponents of the different perspectives to present their views on the key methodological and related issues concerning new religions, the intention being to facilitate communication between and understanding of the different perspectives” (p. 7). My late colleague, Dr. John Clark, and I contributed to this conference (Langone & Clark, 1985).

These attempts at dialogue, along with more productive dialogue that began in the late 1990s, contributed to at least some critics’ and sympathizers’ adopting a more nuanced perspective on the issues than that portrayed in the longer quotation above. Dialogue, however, was hindered by the fact that in the mid-1980s former members of cultic groups began to sue successfully for infliction of emotional distress and on other grounds (see ICSA’s collection of legal articles). Richardson (1996), for example, noted: “My initial concern changed quickly into worry as it became obvious that the brainwashing claims were being attended to by the legal system and policy makers. A pattern quickly developed in the legal arena, with advocates of the brainwashing theories winning in jury trials” (p. 118).

Shinn (1992), who wrote a thoughtful essay on the conflict he felt as an academician in the courtroom, says:

Early in my eight hours on the Robin George witness stand, I learned to adjust my standards of research reporting to allow me to give yes and no answers much of the time and to ask questions in return when I needed time to explain an answer. Since our academic profession is grounded in the fundamental premise of accuracy in doing research and honesty in reporting what we find, the ethical crisis caused by having to distill my answers to simplistic truisms—thereby distorting my complex research findings—was more than a little disturbing.

Perhaps a more subtle and insidious effect of such experiences was that before long I found myself believing the simplistic pictures I was forced to paint in the courtroom and in the media. These left me less inclined to inject the negative information I had unearthed in my interviews and observations into my own analysis and conclusions. Fortunately, early on in my research I met a fellow Krishna researcher nearly as biased in the negative direction as I was in the positive one. Subsequently, in my debates with this colleague I realized that the most balanced analysis of the information we shared likely lay between our divergent analyses. This encounter also enabled me constantly to question the relatively simplistic analysis I was forced to give in the courtroom and in the media and to allow my own negative critical judgments to come forward along with more positive interpretations of the faith and practice of the Hare Krishnas. Nonetheless, the courtroom and, to a lesser extent, the public media, in their insistence upon simplicity and absoluteness with respect to truth, challenge any scholar’s “truthfulness.” Even when religious studies scholars reach definite or stark conclusions, they usually include considerable nuance. Such is not the luxury of the courtroom—or the television studio. (p. 283)

What Shinn perhaps did not realize is that some cult critics felt the same kinds of conflicts. My late colleague, Dr. John Clark, for example, had made very similar statements in private conversations. He essentially said: “Those are the rules of the legal profession. If you don’t play by those rules, you won’t be effective.”

These legal battles, and the academic distortions they tended to encourage, increased the polarization of the two “camps.” Also contributing significantly to the increased polarization was the Frye standard, which governed admissibility of expert testimony at that time. Frye required that a scientific theory be generally accepted by the scientific community. Thus, a few short years after the “divergent perspectives” volume was published (Kilbourne, 1985), various documents, conferences, articles, and books (see http://www.cesnur.org/testi/se_brainwash.htm for a collection of articles on this subject by sympathizers) claimed that “the scientific community had rejected brainwashing theory” — precisely what the sympathizers’ lawyers wanted to hear! Never has “science” witnessed such rapid “progress”! In only a few years the “scientific community” supposedly concluded that there were no longer two camps; there was one camp and a handful of renegade scientists. The fact that some of these “renegades” wrote articles for prestigious publications, such as the Encyclopedia of Sociology (Ofshe, 1992), The Merck Manual of Diagnosis and Therapy (Singer, 1986), and the Comprehensive Textbook of Psychiatry (West & Singer, 1980), seemed to have no effect on the droning chant, “the scientific community has rejected brainwashing.”

The phenomenon bore an eerie resemblance to a political campaign managed by public relations consultants and implemented by “spin doctors.” The so-called “Hadden memo” (Hadden, 1989, December 20), when it was leaked to the public, caused quite a stir within the critical community, for it reinforced the notion that “pro-cultists” were indeed scheming on behalf of cults. But the memo also generated some controversy and concern among sympathizers because it claimed to speak for Drs. Eileen Barker and David Bromley, when in fact they had disagreed with a number of points that Jeffrey Hadden (now deceased) made. I know and respect all the parties involved. I’ve read the memo in the context of the “cult wars” that I have been describing. I don’t believe it is as damning as some have made it out to be. Cult sympathizers stand for certain principles, just as cult critics do. Sometimes these principles place the parties on different sides of an issue in the courtroom. The courtroom, unfortunately, can tempt scholars to move into an inappropriate advocacy mode. This advocacy mentality is what is most striking in the Hadden memo. Here are a few selections from the memo:

Singer's position is typically couched in the notion that brainwashing is "irresistible, irreversible, and that it takes place subtly without the 'victim' really being aware of what is happening." It seems to us fairly clear that this does not happen. BUT, Singer's testimony weaves back and forth between this proposition and "normal" social influence theory.

If she, and/or others, were to back away from the "irresistible, irreversible and subtle" definition, how does this change the battleground? Would our task be easier or more difficult? . . .

On the issue of the value of research and litigation, our legal consultant (Lieberman) was not particularly sanguine about the prospects of social scientists coming up with findings that would be of great value. In so many words, he told us that the most important think [sic] we could do is prepare a statement that refutes the claim that social science can be helpful. I interpreted this as the agnostic statement we discussed in Salt Lake. . .

AGENDA ITEM # 3 - Preparation of the "agnostic" resolution and development of a strategy for encouraging the governing bodies of ASA, APA (or perhaps Sect 38), SSSR, ASR, RRR and CISR to adopt same.

In my opinion, much of Hadden’s concern resulted from sensationalized media reports and recognition that “cults” are much more diverse than these reports (and some of the early professional articles) implied. He feared, as did I during the late 1970s and early 1980s, that an undiscerning view of the cult phenomenon would result in abuses. That is why I opposed conservatorship laws. Hadden (and some of his colleagues), however, exaggerated the threat that legal cases posed to freedom of religion (see Rosedale, 1993). As a result, they attributed the caricature of brainwashing theory (“irresistible, irreversible”) that might have appeared from time to time in courtrooms (recall the Shinn quotation above) to all so-called “anti-cultists,” further exacerbating the distrust between the two camps. These caricature-based attacks against brainwashing theory have, I believe, been decisively refuted (see Amitrani & Di Marzio, 2000a; 2000b; Bardin, 1994; Martin, Pile, Burks, & Martin, 1998; Rosedale, 1993; Zimbardo, 1997) and the biases against brainwashing theories within the sociology of religion community exposed (Balch & Langdon, 1998; Beit-Hallahmi, 2001; Kent & Krebs, 1998a, 1998b; Zablocki, 1996, 1997).

Moreover, the sympathizers appear to presume that the relevant scientific community consists of sociologists of religion and religious studies scholars. However, if a court is investigating whether or not a particular group harmed a particular person (see Aronoff-McKibben, Lynn, & Malinoski, 2000), one might conclude that psychology is at least as relevant a scientific discipline as sociology. There has been a general understanding within the mental health field that groups can gain high levels of influence over people. A recent survey of 700 psychologists (Lottick, 2005), for example, found that over 50% strongly support a law against “brainwashing” (therefore, they must believe that “brainwashing” exists!), a figure that ironically might be higher than what one would find among psychologists who are cult experts, perhaps because the latter might be more sensitive to how such a law could be abused. Thus, if one expands the “scientific community” beyond the rather small group of sociologists and religious studies scholars specializing in new religious movements, the notion that the “scientific community rejects brainwashing theory” becomes more difficult to defend.

The general acceptability criterion of Frye became less important in 1993, when Daubert v. Merrell Dow Pharmaceuticals, Inc. altered the criteria for admissibility of scientific expert testimony.

Daubert overturned the seventy-year-old threshold federal standard for admitting scientific evidence which was established in Frye. At the same time, the state court standard was left in limbo as many state courts reexamine their rules in light of Daubert.

As Justice Blackmun’s opinion in Daubert explains, until 1993 most courts, federal and state, followed the Frye rule that psychiatric, psychological, or other scientific evidence could be offered in the courtroom only upon the showing that the type of evidence was generally accepted in the field. That general principle has been called into question with the determination that, at least in the federal courts, Federal Rule 702 supersedes Frye and does away with the “general acceptance” prerequisite. Instead, the new standard for federal courts is whether “scientific, technical, or other specialized knowledge will assist the trier of fact to understand the evidence or to determine a fact in issue.” (Hominik, 1995, p. 43 – also available in Appendix I of the Web posting of this article)

Not surprisingly, since Daubert the emphasis among sympathizers in the courtroom has shifted to the alleged methodological deficiencies of various theories of extreme influence that have been lumped together under the term “brainwashing.” In particular, psychologist Dick Anthony has argued that “brainwashing” theories are not scientific (Anthony, 1996, 2001). Anthony contends that so-called brainwashing testimony should not even be allowed in the courtroom. However, despite some victories for those who argue this point, judges in numerous cases have allowed, and continue to allow, testimony concerning the use of powerful influence techniques (i.e., “brainwashing”) in group situations, nonreligious as well as religious.

Before he died, Herb Rosedale, ICSA’s late president, had been working on a paper on scholarship and advocacy, in which he analyzed how expert testimony in highly charged ideological areas, such as cults, can affect the scholarship of academics and professionals (Rosedale, unpublished). Footnote 86 of his draft paper rebuts arguments to disallow brainwashing testimony:

A brief comment directed to the erosion of responsibility on the part of experts and others in describing the state of the law in the U.S. courts with respect to the admission of brainwashing, mind control and coercive persuasion testimony is also apposite. In numerous papers and submissions to courts, advocates for new religions state unequivocally that American courts do not admit testimony of brainwashing or mind control. Factually, that is false. Many statements have been made by those seeking to block admission of expert testimony as to mind control, brainwashing and coercive persuasion asserting that the decision of the Federal District Court in the Northern District of California in 1990, U.S. v. Fishman, 743 F.Supp. 713, rejected the theory of coercive persuasion and precluded admission of expert testimony with respect to it and that settled the issue. That view was asserted in an affidavit submitted to a Florida court by a well-recognized expert within the last few months, and a similar statement was made by another expert academic in a presentation in Europe last year. However, in Hejl v. Sands, et al., Case #3PA-94-1035 in the Superior Court of the State of Alaska in February 2000, Judge Eric Smith wrote an extensive decision in which he expressly permitted Dr. Paul Martin to testify as to coercive persuasion over objection supported by an affidavit of Dick Anthony. In its decision, the court pointed out that the theory of coercive persuasion was not susceptible to being barred from evidence merely because it did not meet a falsifiability test under Daubert. The court stated that there are many soft sciences (such as psychiatry) which are both reliable and not falsifiable, and testimony is admissible if supported by peer review. Hejl v. Sands, et al., supra, opinion on motion seeking to preclude evidence, p. 11 [Smith]. The opinion went on to state that the theory is supported in a number of scholarly books and journals which have been peer reviewed and is also supported by the personal experience of experts in the field. The court pointed out that the Fishman case was not dispositive and was, in fact, inconsistent with the decision of the California Supreme Court in Molko v. Holy Spirit Association, 762 P.2d 46 (Cal. 1988), and other cases in which evidence of non-physical coercion led to admissibility of evidence of coercive persuasion (Ibid., p. 13, and cases cited at Footnote 3). In a further opinion, in which the defendants attacked the admissibility of not only the testimony of Dr. Paul Martin but also the testimony of a psychiatrist, Dr. John Hochman, the attack also failed (see opinion on various of plaintiffs' motions in limine dated February 25, 2000). Defendant's motion 4 involves Dr. Martin's testimony; motions 6 and 7, Dr. Hochman's testimony; motion 9 deals with exclusion of testimony relating to religious beliefs (at pp. 8-12, 13-20, 20-33). While both Fishman and Hejl are decisions of lower courts, I have never seen Hejl cited in any submissions made or papers offered by the supporters of new religious movements on the issues of mind control, brainwashing or coercive persuasion, although it is clearly the most recent comprehensive and receive [sic] analysis of the issues. Annexed hereto as Exhibit A are the relevant pages from the decision of that court. (Rosedale, unpublished paper, footnote 86)

It seems to me that professionals on both sides of the debate will continue to have divergent opinions, which courts may or may not be interested in considering in specific cases—in conjunction with other evidence and opinions. It is presumptuous for expert witnesses to think that the future of religious freedom hangs on their or their opponents’ testimony. If a certain line of testimony is not persuasive, there is no harm in its being heard. If that same line of testimony is persuasive, then why, other than the practical desire to win, keep it out of the courtroom? Because one “camp” of experts has decided that the importance of religious freedom demands that we protect judges and juries from their own gullibility? Is there a fear that judges and juries will be “brainwashed” by “brainwashing testimony”? I think this kind of blacklisting is silly. Ultimately, intelligent ideas will defeat stupid ones. And the republic will endure.

The courtroom debates will go on. These debates, however, should not prevent us from continuing to dialogue with those who might hold different opinions on issues in the cultic studies field (see Kropveld, 2003; Langone, 1995, 2000). A book entitled Misunderstanding cults: Searching for objectivity in a controversial field (Zablocki & Robbins, 2001) is an especially noteworthy attempt to foster dialogue. This book tries “to restore a moderate perspective to the social scientific study of cults” (p. xiii). Zablocki (2001) presents his theory of brainwashing in a chapter entitled “Towards a Demystified and Disinterested Scientific Theory of Brainwashing.” In that same volume Dick Anthony was given an opportunity to critique brainwashing theories. Anthony’s chapter (2001), “Tactical Ambiguity and Brainwashing Formulations: Science or Pseudo Science,” takes up 103 pages of this 515-page book—far and away the longest chapter in the book.

Zablocki felt that Anthony’s critique was so deficient that it demanded a long, detailed, point-by-point refutation. We have published Zablocki’s rejoinder to Anthony in order to continue ICSA’s historical documentation of the critic-sympathizer disputes and dialogue (Zablocki, 2005).

Although scholars will undoubtedly continue to debate these issues in and outside the courtroom, I believe that the passions that once characterized this field have diminished considerably. Recent years have witnessed numerous civil and productive exchanges between critics and sympathizers. Our disagreements are more informed and respectful. And there is a growing recognition that we have more common ground than we once thought. I believe that this trend will continue and will do what I can to encourage it.

References

Amitrani, Alberto, & Di Marzio, Raffaella. (2000a). Blind or just don’t want to see? Brainwashing, mystification, and suspicion. Cultic Studies Journal, 17, 122-142.

Amitrani, Alberto, & Di Marzio, Raffaella. (2000b). “Mind control” in new religious movements and the American Psychological Association. Cultic Studies Journal, 17, 101-121.

Anthony, Dick. (1996). Brainwashing and totalitarian influence: An exploration of admissibility criteria for testimony in brainwashing trials. Ann Arbor: UMI Dissertation Services.

Anthony, Dick. (2001). Tactical ambiguity and brainwashing formulations: Science or pseudo science. In Benjamin Zablocki & Thomas Robbins (Eds.), Misunderstanding cults: Searching for objectivity in a controversial field (pp. 215-317). Toronto: University of Toronto Press.

Aronoff-McKibben, J., Lynn, S. J., & Malinoski, P. T. (2000). Are cultic environments psychologically harmful? Clinical Psychology Review, 20, 91-111. (Reprinted in Cultic Studies Review, 1(3), 2002.)

Balch, Robert W., & Langdon, Stephen. (1998). How the problem of malfeasance gets overlooked in studies of new religions: An examination of the AWARE study of Church Universal and Triumphant. In Anson Shupe (Ed.), Wolves within the fold: Religious leadership and abuses of power (pp. 191-211). New Brunswick, NJ: Rutgers University Press.

Bardin, David. (1994). Psychological coercion and human rights: Mind control (“brainwashing”) exists.

Beit-Hallahmi, Benjamin. (2001). “O tuant muse”: Collaborationism and research integrity. In B. Zablocki & T. Robbins (Eds.), Misunderstanding cults: Searching for objectivity in a controversial field (pp. 35-70). Toronto: University of Toronto Press. (See http://www.apologeticsindex.org/b35.html for online versions of the conference paper on which this chapter was based.)

Bromley, David, & Shupe, Anson. (1981). Strange gods: The great American cult scare. Boston: Beacon Press.

Bromley, D. G., Shupe, A., & Ventimiglia, J. C. (1979). Atrocity tales, the Unification Church, and the social construction of evil. Journal of Communication, 29, 42-53.

Clark, John, Langone, Michael, Schecter, Robert, & Daly, Roger. (1981). Destructive cult conversion: Theory, research, and treatment. Weston, MA: American Family Foundation.

Daubert v. Merrell Dow Pharmaceuticals, Inc., 113 S.Ct. 2786 (1993).

Frye v. United States, 293 F. 1013 (D.C. Cir. 1923).

Hadden, Jeffrey K. (1989, December 20). To: Social scientists concerned about forensic and related issues dealing with new religious movements. Posted at http://www.apologeticsindex.org/h14a01.html.

Hominik, David. (1995). Cults in American society: A legal analysis of undue influence, fraud, and misrepresentation. Cultic Studies Journal, 12(1), 1-48.

Keiser, Thomas, & Keiser, Jacqueline. (1987). The anatomy of an illusion: Religious cults and destructive persuasion. Springfield, IL: Charles C. Thomas.

Kent, Stephen A., & Krebs, Theresa. (1998a). Academic compromise in the social scientific study of alternative religions. Nova Religio, 2(1), 44-54. See also: http://www.apologeticsindex.org/c26.html

Kent, Stephen A., & Krebs, Theresa. (1998b). When scholars know sin: Alternative religions and their academic supporters. Skeptic, 6(3), 36-44. (A discussion forum on this article was published in Volume 7, Number 1 of Skeptic, pp. 14-25.)

Kilbourne, Brock. (Ed.). (1985). Scientific research and new religions: Divergent perspectives. San Francisco: Pacific Division, American Association for the Advancement of Science.

Kropveld, Michael. (2003). An example for controversy: Creating a model for reconciliation. Cultic Studies Review, 2(2), 130-150.

Langone, Michael D. (1983). On dialogue between the two tribes of cultic researchers. Cultic Studies Newsletter, 2(1), 11-15.

Langone, Michael D. (1993). Recovery from cults: Help for victims of psychological and spiritual abuse. New York: Norton.

Langone, Michael D. (1995). Secular and religious critiques of cults: Complementary visions, not irresolvable conflicts. Cultic Studies Journal, 12(2), 166-186.

Langone, Michael D. (2000). The two “camps” of cultic studies: Time for a dialogue. Cultic Studies Journal, 17, 79-100.

Langone, Michael D., & Clark, John G. (1985). New religions and public policy: Research implications for social and behavioral scientists. In Brock Kilbourne (Ed.), Scientific research and new religions: Divergent perspectives. San Francisco: Pacific Division, American Association for the Advancement of Science.

Lewis, James. (1989). Apostates and the legitimation of repression: Some historical and empirical perspectives on the cult controversy. Sociological Analysis: A Journal in the Sociology of Religion, 49, 386-397.

Lottick, Edward. (2005, July). Prevalence of cults in the U.S.A. Paper presented at the conference Psychological Manipulation, Cultic Groups and Other Alternative Movements, conducted by the International Cultic Studies Association and the Universidad Autónoma de Madrid, July 14-16, 2005, Madrid, Spain.

Malinoski, Peter, Langone, Michael D., & Lynn, Stephen J. (1998). Psychological distress in former members of the International Churches of Christ and noncultic groups. Cultic Studies Journal, 16(1), 33-52.

Martin, Paul R., Pile, Lawrence A., Burks, Ron, & Martin, Stephen D. (1998). Overcoming the bondage of revictimization: A rational/empirical defense of thought reform. Cultic Studies Journal, 15(2), 151-190.

Ofshe, Richard. (1992). Coercive persuasion and attitude change. In E. F. Borgatta & M. L. Borgatta (Eds.). Encyclopedia of sociology (pp. 212-224). New York: Macmillan.

Richardson, James T. (1996). Sociology and the new religions: “Brainwashing,” the courts, and religious freedom. In P. J. Jenkins & S. Kroll-Smith (Eds.), Witnessing for sociology: Sociologists in court. Westport, CT: Praeger.

Robbins, Thomas. (1983). Cults, coercion, and dialogue. Cultic Studies Newsletter, 2(1), 1-5.

Robbins, Thomas. (1985). Objectionable aspects of “cults”: Rhetoric and reality. Cultic Studies Journal, 2(2), 358-370.

Robbins, Thomas, & Anthony, Dick. (1981). In gods we trust: New patterns of religious pluralism in America. New Brunswick, NJ: Transaction Books.

Rosedale, Herbert L. (1993). Cult litigation doesn’t threaten religion. Cult Observer, 10(2).

Rosedale, Herbert L., & Langone, Michael D. (1998). On using the term “cult.” In American Family Foundation, Cults and psychological abuse: A resource guide (pp. 22-28). Bonita Springs, FL: American Family Foundation.

Schuller, J. (1983). Review of Strange gods: The great American cult scare. Cultic Studies Newsletter, 2(1), 8-11.

Shinn, Larry D. (1992). Cult conversion and the courts: Some ethical issues in academic expert testimony. Sociological Analysis, 53(3), 273-285.

Singer, Margaret T. (1986). Group psychodynamics. In R. Berkow (Ed.), The Merck manual of diagnosis and therapy (15th edition) (pp. 1467-1471). Rahway, NJ: Merck.

West, Louis J., & Singer, Margaret T. (1980). Cults, quacks, and nonprofessional psychotherapies. In A. M. Freedman, H. I. Kaplan, & B. J. Sadock (Eds.), Comprehensive textbook of psychiatry, III (pp. 3245-3258). Baltimore: Williams & Wilkins.

Zablocki, Benjamin. (1996). Reliability and validity of apostate accounts in the study of religious communities. Paper presented at the Association for the Sociology of Religion in New York City, Saturday, August 17, 1996.

Zablocki, Benjamin. (1997). The blacklisting of a concept: The strange history of the brainwashing conjecture in the sociology of religion. Nova Religio: The Journal of Alternative and Emergent Religions, 1(1), 96-121.

Zablocki, Benjamin. (2001). Towards a demystified and disinterested scientific theory of brainwashing. In Benjamin Zablocki & Thomas Robbins (Eds.), Misunderstanding cults: Searching for objectivity in a controversial field (pp. 159-214). Toronto: University of Toronto Press.

Zablocki, Benjamin. (2005). Methodological fallacies in Anthony’s critique of exit cost analysis. Cultic Studies Review, 4(2). Available at http…

Zablocki, Benjamin, & Robbins, Thomas (Eds.). (2001). Misunderstanding cults: Searching for objectivity in a controversial field. Toronto: University of Toronto Press.

Zimbardo, P. (1997, May). What messages are behind today’s cults? Monitor. Washington, DC: American Psychological Association.
