Wall Street Journal - February 25, 2011
When Does a Religion Become a Cult?
America has long been a safe harbor for experimental faiths. But the unorthodox can descend into something darker
By MITCH HOROWITZ | Opinion
America has probably supplied the world with more new religions than any other nation. Since the first half of the 19th century, the country's atmosphere of religious experimentation has produced dozens of movements, from Mormonism to a wide range of nature-based practices grouped under the name Wicca.
Around 1970, the religious scholar Jacob Needleman popularized the term "New Religious Movements" (NRM) to classify the new faiths, or variants of old ones, that were being embraced by the Woodstock generation. But how do we tell when a religious movement ceases to be merely novel or unusual and becomes a cult?
It's a question with a long history in this country. The controversy involving Hollywood writer-director Paul Haggis is only its most recent occurrence. Mr. Haggis left the Church of Scientology and has accused it of abusive practices, including demands that members disconnect from their families, which the church vigorously denies.
To use the term cult too casually risks tarring the merely unconventional, for which America has long been a safe harbor. In the early 19th century, the "Burned-over District" of central New York state—so named for the religious passions of those who settled there following the Revolutionary War—gave rise to a wave of new movements, including Mormonism, Seventh-Day Adventism and Spiritualism (or talking to the dead). It was an era, as historian Sydney E. Ahlstrom wrote, when "Farmers became theologians, offbeat village youths became bishops, odd girls became prophets."
When the California Gold Rush of 1849 enticed settlers westward, the nation's passion for religious novelty moved with them. By the early 20th century, sunny California had replaced New York as America's laboratory for avant-garde spirituality. Without the weight of tradition and the ecclesiastical structures that bring some predictability to congregational life, some movements were characterized by a make-it-up-as-you-go approach that ultimately came to redefine people, money and propriety as movable parts intended to benefit the organization.
Many academics and observers of cult phenomena, such as psychologist Philip G. Zimbardo of Stanford, agree on four criteria to define a cult. The first is behavior control, i.e., monitoring of where you go and what you do. The second is information control, such as discouraging members from reading criticism of the group. The third is thought control, placing sharp limits on doctrinal questioning. The fourth is emotional control—using humiliation or guilt. Yet at times these traits can also be detected within mainstream faiths. So I would add two more categories: financial control and extreme leadership.
Financial control translates into levying ruinous dues or fees, or effectively hiring members and placing them on stipends or sales quotas. Consider the once-familiar image of Hare Krishna devotees selling books in airports. Or a friend of mine—today a respected officer with a nonprofit organization—who recalls how his departure from the Rev. Sun Myung Moon's Unification Church was complicated by the problem of a massive hole in his résumé, reflecting the years he had financially committed himself to the church.
Problems with extremist leadership can be more difficult to spot. The most tragic cult of the last century was the Rev. Jim Jones's Peoples Temple, which ended with mass murder and suicide in the jungles of Guyana in 1978. Only a few early observers understood Jones as dangerously erratic. Known for his racially diverse San Francisco congregation, Jones was widely feted on the local political scene in the 1970s. He was not some West Coast New Ager gone bad. He emerged instead from the mainstream Christian Church (Disciples of Christ) pulpit, which sometimes lent a reassuringly Middle-American tone to his sermons.
Yet every coercive religious group harbors one telltale trait: untoward secrecy. As opposed to a cult, a religious culture ought to be as simple to enter or exit, for members or observers, as any free nation. Members should experience no impediment to relationships, ideas or travel, and the group's finances should be reasonably transparent. Its doctrine need not be conventional—but it should be knowable to outsiders. Absent those qualities, an unorthodox religion can descend into something darker.
Mr. Horowitz, the editor in chief of Tarcher/Penguin in New York and the author of "Occult America" (Bantam), is writing a history of the positive-thinking movement.
From Rwanda to Libya: Who's Calling Whom a Cockroach Now?
by Steven Hassan | Cult Expert, Counselor, Author, Media Spokesperson
Political cult leader Muammar Gaddafi has been in the news recently for calling the Libyan people "cockroaches" -- the very term used by radio hosts to dehumanize the Tutsi in Rwanda on the cusp of the hundred-day slaughter in 1994. How could people turn on their friends and neighbors, and murder them?
I recently saw a brilliant new documentary film by Adam Mazo called Coexist, which tells the story of victims, perpetrators, and survivors of the 1994 genocide in Rwanda, and what they face as they try to rebuild shattered trust in their country. A key theme in the film is the government-mandated process of reconciliation, and the need to break the cycle of violence so today's victims don't become tomorrow's perpetrators.
The film shows viewers two faces of forgiveness: its healing power and agonizing limitations. It also lays bare the challenges of coexistence between victims and perpetrators, the ongoing risk of revenge, and the lengthy process of rehumanizing the killers, many of whom have been released from prison and are returning to the villages where they raped, destroyed, and murdered.
I can't get out of my head the testimony of one of the perpetrators featured in the film, a man named Gregoire. He is serving a life sentence for ordering the killing of thousands of Tutsi in his district. From his prison compound he says, "Maybe some people blame it on the government, but the government is not inside our hearts. I brainwashed people to kill Tutsi. I never killed Tutsi but... if I'd wanted to stop it, nothing would have happened."
There is a powerful connection between Gregoire's statement about the brainwashing of killers in Rwanda and my own work with cult members and other victims of mind control. In both cases people are dealing with obedience to authority and conformity to peer pressure. They were operating in a "closed system" in which reality was dictated by those in authority to everyone else, who lived in abject fear.
I have developed the BITE model of mind control, based on the work of former military intelligence researchers like the eminent psychiatrist Robert Jay Lifton and psychologist Margaret Singer. The BITE model (control of Behavior, Information, Thoughts, and Emotions) holds that when an authoritarian person or regime uses these components to create a new "identity" that is dependent and obedient, you have the essential components of mind control.
The film left me wondering whether Rwandans who participate in reconciliation processes would be less likely to get swept up ever again in the madness of mass killing.
Victims of cults and those impacted by genocide (whether perpetrators or victims) need to learn about conformity and obedience to deepen their understanding of how susceptible human beings are to authority and persuasion. The groundbreaking experiments of social psychologists Solomon Asch in the 1950s, Stanley Milgram in the 1960s, and Philip Zimbardo in the 1970s reveal the depths of human subservience to authority and the sway of groups over individuals. Interestingly, across all three experiments, roughly sixty percent of subjects abandoned their own beliefs and caved in to group pressure, even when it meant doing something they knew was wrong, or worse: something they realized was harmful to others.
As I watched Coexist, I found myself thinking about the work of Asch, Milgram, and Zimbardo, and about my own work on cult mind control and brainwashing. In this age of social networking via the Internet, the world audience should take the time to understand more about social influence. As adults, we must always remain vigilant to the social influence pressures we are under and use our critical thinking to periodically "reality-test" our decisions and actions. Information control is what all dictators and cult leaders use to establish and maintain power. Deception, spying, the silencing of dissidents (ex-members, critics), and intensive propaganda are just some of the sub-components of information control. The Internet has proven to be the most powerful vehicle for social change because, in my opinion, people do want to know the "truth" and will seek out other facts and opinions so they can think for themselves.
Given Rwanda's commitment to unity and reconciliation -- the crux of official government policy -- there is a crying need for these lessons to be learned by both victims and perpetrators. And this is one of the main points of Coexist and the insightful Viewer's Guide written by the film's learning director, Dr. Mishy Lesser: the cycle of violence could be reignited if Rwandans in the future fail to stand firm in their beliefs and, if necessary, disobey authority to protect one another. In a country with so much human suffering, ongoing fear, and lingering trauma, I would suggest that the best way to ensure a brighter future is to incorporate into the country's educational system -- at all levels -- the study of conformity, obedience, and subservience, as well as strategies to combat harmful and illegitimate applications of those human tendencies. That way, Rwandans can show the world how passive bystanders can become active upstanders committed to the protection of all vulnerable people.
The Coexist documentary and its Viewer's Guide should become part of a mandated international educational curriculum that shows a vivid example of "how good people do evil things," to quote my mentor Dr. Zimbardo. His Heroic Imagination Project is an ambitious effort to seed heroism throughout the world, encouraging people to do the right thing not because there is something in it for them, but because it is the right thing to do. I am working with my co-director Alan Scheflin to create an academic think tank at Santa Clara Law School to study all aspects of social influence.