Groucho Marx and Cult Recovery
ICSA Today, Volume 7, Number 2, 2016, pages 2-5
Michael D. Langone
“Who are you going to believe, me or your lying eyes?” Attributed to Groucho Marx, this quote succinctly and entertainingly articulates a recovery issue many former members of high-control, cultic groups confront: the question of authority.
Most of what we believe rests on faith in what other people say or in what we read. The evening news proclaims that an earthquake has damaged Jerusalem. I’m not in Jerusalem. I haven’t seen or felt the earthquake. But I believe that the earthquake has rocked that city because I attribute authority to (i.e., believe) the evening news broadcast.
If, however, I were standing in the middle of Jerusalem and could see no sign of an earthquake, I’d have a conflict. The US evening news that I access on the Web says an earthquake struck the place where I am standing, but my eyes tell me differently. I see people all around me living life normally, honking the horns on their cars, riding bicycles, looking in shop windows. “The media report must be mistaken,” I think. I believe my eyes, not somebody to whom I normally would attribute a high level of authority, of trustworthiness.
In many forms of cult conversion, one of the most fundamental changes is a change in authority. For example,
I used to believe the evening news, but cult leader X, who I believe has a special pipeline to God, tells me that journalists are a conspiratorial cabal that manufactures false news all the time. He says that Jerusalem is holy ground and could not be struck by an earthquake. Therefore, the journalists are lying. I feel some doubt as I watch the news in my living room in the United States of America, but I want to stay in my leader’s good graces, so I believe him, even if I have to will that belief a bit.
My conflict becomes more acute when my friend, who happens to be in Jerusalem, sends pictures of the devastation to my phone. Then he calls me and describes the destruction to me. Even if I no longer trust the evening news, I trust my friend. I fearfully tell my leader about my friend’s photos and report. Big mistake! The full force of my leader’s anger reduces me to a shaking, cowering child. Even my good friend in Jerusalem is evil. I have an intense mixture of conflicting emotions. But my desire to please my leader and avoid his wrath wins out. I push aside questions and doubts, fanatically focusing on the thought-stopping chant my leader taught me. I must believe him, not my lying eyes.
Though my example is admittedly extreme, the principle it illustrates sheds light on the often-asked question, “How can intelligent people believe that rubbish?!” They can believe because the leaders and members of cultic groups know how to denigrate a person’s current authorities (i.e., those whose statements the person trusts) and transfer that trust to themselves. Furthermore, cultic groups know how to reinforce this change in authority through the clever and careful use of psychological manipulation—through guilt, intimidation, ridicule, isolation, threat, and so on.
When persons who have been under a cult’s influence for a long time leave their group (for whatever reason), they do not automatically return to being the person they were before the cult. They inevitably take a part of the cult with them. That is why people in this field will say things such as “He is physically out of the group, but not mentally.”
Some of the cult residue relates to sources of authority. Although former members may have rejected the group's authority, they may still give authority to information sources favored by the group (e.g., a natural-healing website). In other words, former cult members may retain some of the delusions, illusions, and misconceptions of their group’s members and leaders, sometimes even for many years after they have left. Moreover, the tendency toward magical thinking, which is common in cultic groups, may incline former members to seek out other sources of magical thinking. So even though they are out of the cult, these former members are still in the mental box of their group. It is not surprising, then, that former members of cultic groups may be prone to cognitive errors.
Some of the most harmful of these mistakes cluster around medicine, family, psychotherapy, and religion:
A leader and other members of a group insisted incessantly that modern medicine is a sham and that only ancient medical teachings are trustworthy. If you accepted that belief as a corollary of the belief in the leader’s divine authority, you may continue to hold the belief even years after you have left the group because you still see ancient medicine as an authority.
The group drilled into your mind the false belief that your family hates you. So even after leaving the group, you avoid your family. You continue to attribute credibility to that belief because you do not realize the degree to which the group leader, whom you may now hate, is still your authority concerning beliefs about family.
If during your years in a group the leader and everybody around you lived as though only prayer could solve personal problems, and they portrayed psychotherapy and psychiatry as agents of the devil, mental-health professionals would have little authority in your eyes. If this attitude toward the mental-health profession sticks with you after you leave, you may experience much avoidable psychological pain.
If you spent many years in a Bible-based group that attributed authority only to the King James Version of the Bible, you may find it difficult when you leave to question this authority and the hermeneutics favored by the group. You may, for example, find yourself struggling with spiritual and theological questions that you could more effectively address if you attributed authority to other religious sources.
These examples illustrate why it may be important for former members to actively and vigorously question the residue of authority that remains with them after they have left their group. They do not necessarily have to reject every secondary authority favored by their former leader; some information sources may warrant respect. It may, however, be wise for them to review systematically these secondary sources of authority in order to adapt more effectively to life in the mainstream world.
To help young people resist the seductions of cultic groups, experts in this field often say, “Question everything and everybody.” The same advice may be offered to former cult members, although with some caveats. In a manipulative, high-control group, somebody with a fairly solid identity is coaxed into questioning that identity and ultimately abandoning it in exchange for a new one. (Those born into or raised in a group, of course, develop the group-sanctioned personality as part of their socialization.) When former members question the central authorities that they accepted in the group, the entire identity that the cult imposed on them may begin to crumble. They may feel themselves in free fall, questioning everything and not knowing what to believe.
There are two ways to deal with the risk of free fall. First, find a therapist1 or support group of persons who understand what you are going through. They may be able to provide you with the authority of common sense until you are ready to dive deeper into your disillusionment. They may tell you that it is okay to feel lost, that the confusion will pass, that you will learn how to believe in people again.
Second, keep reminding yourself that you don’t have to find new moorings right away. Prioritize your problems. If feasible, reconnect to family and friends, but don’t feel obligated to “spill your guts” to them if doing so makes you feel unsafe. It’s okay to tell them that you’re hurting and don’t want to talk much about your group experience. Sometimes practical issues need to be a first priority, particularly finding a job and a place to live. Having a job can anchor you and give you social connections that can make it less lonely and frightening to be separated from your group. Reconfiguring your identity and clarifying spiritual questions will take time, and often it is wise to put these problem areas aside temporarily.
A Practical Approach
When you are ready to address your belief system, which is so essential to reconfiguring your identity,2 try to approach the task as a practical scientist might. First, list the most important beliefs that you imbibed as a result of being a member of a group. Then examine these beliefs as objectively as you can, as would a scientist. Scientific methodology rests on studying what others have said about a topic, logic (or reasoning), observation, and the testing of hypotheses. Let’s look at each of these approaches briefly.
Studying What Others Have Said
This step is more difficult than it may at first appear because it often is impossible to study everything that has been written or posted about a subject. Therefore, one must select from a potentially long list of resources. How does one determine which resources to focus on?
First, consult with people who understand cultic dynamics; they may be able to direct you to helpful resources. ICSA’s website, icsahome.com, is a good place to begin. In particular, see our study guides, topic collections, links, videos (a link to ICSA’s YouTube channel is at the bottom of every page on the website), and FAQs—all listed under Information, which is one of the top navigation links. Our links page, for example, includes links to the James Randi Educational Foundation and Skeptic (The Skeptics Society), which offer critiques of paranormal beliefs, astrology, and other pseudoscientific ideas that some groups promulgate, and Quackwatch, which focuses on dubious medical claims. We also have links to sites focused on Bible-based groups. Also, feel free to write ICSA (Contact Us is at the bottom of every page on the site).
Second, seek out critics of your group, especially those who have relevant professional credentials. Keep in mind that critics do not necessarily reject everything that your group may have taught you to believe. Many Bible-based groups, for example, teach elements of orthodox Christianity, even though their core message may be incompatible with Christianity.
If you begin to read criticisms, you may find yourself being triggered or feeling anxious and spacey. This is a common reaction, for these critics may have been vilified in your group, and you may have developed an automatic psychological aversion to them. See ICSA’s Support page (top navigation link), which directs you to workshops, support groups, and other resources that may help you deal with such reactions.
Logic
The level of sophistry (“the use of reasoning or arguments that sound correct but are actually false”—merriam-webster.com/dictionary/sophistry) put forth by some cult leaders is astounding. Buttressed by psychological influence tactics (e.g., guilt induction, conformity pressures), sophistry can persuade cult members to believe notions that cannot withstand critical scrutiny. A favorite tactic of cult leaders is to set up nonfalsifiable systems, in which a proposition is deemed true whatever the outcome (heads, I win; tails, you lose). A group’s meditation technique may, for example, claim to cure an ailment. If a member practices the technique and does not experience a cure, then the member is obviously doing it wrong. If a cure follows the meditation practice (even though the result may be coincidence or a placebo effect), then the technique is proven. Either way, the leader wins. Such nonfalsifiable systems are a hallmark of pseudoscience.3
Observation
All science depends upon observation. Because we can be fooled, however, the various sciences advocate detailed methodologies for observation. Nevertheless, common-sense observation can often reveal the inconsistencies and falsehoods in a cultic group’s teachings. A common falsehood is that dire consequences will befall anybody who leaves the group. If you leave, or if you are considering leaving, talk to people who have left. Odds are quite high that, although they may struggle with recovery issues, the cult’s dark predictions did not come true. If this prediction is false, others may be, as well.
When exercising your powers of observation, keep in mind the concept of confirmation bias, which “refers to the tendency to selectively search for and consider information that confirms one's beliefs” (psychologyandsociety.com/confirmationbias.html). Fight the confirmation-bias tendency that we all have by deliberately looking for information that challenges your preconceptions. If your preconception can stand up to critical scrutiny, then you can have more confidence in its validity.
Test Your Beliefs
The first step in testing a belief is to treat it as a hypothesis, not a fact. Any belief can become a hypothesis when you search for evidence that will either support or contradict it. You may, for example, think that people are cold and mean. To defend yourself against the hostile stares you expect from people as you walk down the street, you take on the demeanor of a cold and mean person! That demeanor is not likely to elicit smiles and warmth from the people you pass. This is what we mean by a self-fulfilling prophecy: your expectation comes to pass because you behave as though it already has come to pass.
If instead you treated the proposition that most people are mean and cold as a hypothesis, you would vary your behavior and observe the results. As you walk down the street, try smiling and saying, “Good morning.” Then count the positive and negative reactions that you receive. Chances are there will be more positive responses than you initially expected. There will, of course, be some negative reactions. But the hypothesis “people are cold and mean” (which implies always or usually cold and mean) may prove to be false when you test it in a systematic way. (Of course, this experiment wouldn’t work on a super-busy street, such as one finds in Manhattan. You’d be saying so many “good mornings” so quickly that you’d probably be deemed crazy! Therefore, use common sense when testing hypotheses.)
I wish I could do scientific studies on every question that I have! In the day-to-day world, unfortunately, we may have to make do with informal hypothesis testing, such as in the example of saying “good morning” to people as we walk down a street. The main point to keep in mind is that what we believe to be true may be false. So let’s cultivate a healthy humility and test our beliefs as best we can.
Mainstream education, if practiced properly, teaches us how to question and analyze ideas. A cultic group typically does not teach its children how to do this and works hard at persuading its converts to set aside what their education taught them. This is why it is often important for former members, especially those born or raised in groups, to pursue their education after they leave the group.
If you left a group that seemed to say, “believe me, not your lying eyes,” you may have a residue of dubious beliefs and unworthy sources of authority that may be worth examining at some point in your recovery. I hope the reflections in this essay will help you in that endeavor. And always remember, unless you have very good reason to think otherwise, trust your eyes!
Notes
1. ICSA can sometimes help you find therapists familiar with cult issues. Cult expertise, however, isn’t necessary, although it may be desirable. For example, therapists who have worked with traumatized persons (e.g., survivors of spousal abuse) often understand the dynamics that have adversely affected cult members.
2. See Gillie Jenkinson’s article, “An Investigation into Cult Pseudo-Personality,” in which she says that the cult pseudopersonality “needs chewing over and digesting, allowing what is nourishing to remain and eliminating the rest” (icsahome.com/articles/pseudopersonality).
3. Some resources to help you spot sophistry and pseudoscience include ICSA’s Education study guide, particularly our Pseudoscience Fact Sheets (icsahome.com/elibrary/studyguides/education); ICSA’s conference video on Critical Thinking Skills by Wendy and Doug Duncan (youtube.com/watch?v=GynqJ9uOGh8); and ICSA’s Links page, particularly the links Real-World Reasoning, Skeptic, and The Reasoning Page (icsahome.com/elibrary/links).
About the Author
Michael D. Langone, PhD, received a doctorate in Counseling Psychology from the University of California, Santa Barbara in 1979. Since 1981 he has been Executive Director of the International Cultic Studies Association (ICSA). He was the founding editor of Cultic Studies Journal (CSJ); the editor of CSJ’s successor, Cultic Studies Review; and editor of Recovery From Cults: Help for Victims of Psychological and Spiritual Abuse (an alternate of the Behavioral Science Book Service). He is coauthor of Cults: What Parents Should Know and Satanism and Occult-Related Violence: What You Should Know. Dr. Langone, ICSA Today’s Editor-in-Chief, has been the chief designer and coordinator of ICSA’s international conferences, which have taken place in Barcelona, New York, Rome, Philadelphia, Geneva, Denver, Brussels, Atlanta, Madrid, and Stockholm. In 1995, he was honored as the Albert V. Danielsen Visiting Scholar at Boston University. He has authored numerous articles in professional journals and books, spoken widely to lay and professional groups and university audiences, and appeared on numerous radio and television programs, including the MacNeil/Lehrer News Hour and ABC 20/20.