1 Nov 2010

Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts [book review]


I am reposting this article from a year and a half ago because of its relevance to recent events involving official apologies from destructive religious groups, including those by various Catholic leaders as well as by the leaders of The Family International cult, a.k.a. the Children of God. Those leaders recently bragged in a Salt Lake Tribune article that they have issued seven official apologies. However, they only admit that "mistakes were made," while still clinging to the official doctrines that led to and enabled those "mistakes" -- which are more accurately described as abuses and crimes.


Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts

by Greta Christina at Greta Christina's Blog - January 29, 2008

I am totally having fits about this book. Everyone reading this blog has to read it. Everyone not reading this blog has to read it. I was already more or less familiar with the concepts in it before I started reading... and I am nevertheless finding it a life-changer.

And in particular, anyone interested in religion has to read it. It doesn't talk much about religion specifically; but the ideas in it are spot-on pertinent to the topic.

For believers... and for atheists.

A quick summary. Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts is about cognitive dissonance: the uncomfortable-at-best feeling you get when things you do, or things that happen, contradict your beliefs -- about yourself or the world. It's about the unconscious justifications, rationalizations, and other defense mechanisms we use to keep that dissonance at bay. It's about the ways that these rationalizations perpetuate and entrench themselves. And it's about some of the ways we may be able to derail them. The book is fascinating and readable; it's clear, well-written, well-researched, loaded with examples, and often very funny.

The basic idea: When we believe something that turns out to be untrue, it conflicts with our concept of ourselves as intelligent. When we make a decision that turns out badly, it conflicts with our concept of ourselves as competent. And when we do something that hurts someone, it conflicts with our concept of ourselves as good. That's the dissonance.

And what we do, much if not most of the time, is rationalize. We come up with reasons why our mistake wasn't really a mistake; why our bad deed wasn't really so bad.

"I couldn't help it." "Everyone else does it." "It's not that big a deal." "I was tired/sick." "They made me do it." "I'm sure it'll work out in the long run." "I work hard, I deserve this." "History will prove me right." "I can accept money and gifts and still be impartial." "Actually, spending fifty thousand dollars on a car makes a lot of sense." "When the Leader said the world was going to end on August 22, 1997, he was just speaking metaphorically."

In fact, we have entire social structures based on supporting and perpetuating each other's rationalizations -- from patriotic fervor in wartime to religion and religious apologetics.

More on that in a bit.

I could summarize the book ad nauseam, and this could easily turn into a 5,000-word book review. But I do have my own actual points to make. So here are, IMO, the most important pieces of information to take from this book:

1) This process is unconscious. It's incredibly easy to see when someone else is rationalizing a bad decision. It's incredibly difficult to see when we're doing it ourselves. The whole way that this process works hinges on it being unconscious -- if we were conscious of it, it wouldn't work.

2) This process is universal. All human beings do it. In fact, all human beings do it pretty much every day. Every time we take a pen from work and think, "Oh everyone does it, and the company can afford it"; every time we light a cigarette after deciding to quit and think, "Well, I only smoke half a pack a day, that's not going to kill me"; every time we eat a pint of Ben and Jerry's for dinner and think, "It's been a long week, I deserve this"; every time we buy consumer products made in China (i.e., by slave labor) and think, "I really need new sneakers, and I just can't afford to buy union-made"... that's rationalization in action. It is a basic part of human mental functioning. If you think you're immune... I'm sorry to break this to you, but you're mistaken. (See #1 above, re: this process being unconscious, and very hard to detect when we're in the middle of it.)

3) This process is self-perpetuating. The deeper we get into a rationalization, the more likely we are to repeat the bad decision, hang on to the mistaken belief, continue to do harm to others.

This is probably the scariest part of the book. When we hurt someone and convince ourselves that they deserved it, we're more likely to hurt them -- or other people like them -- again. Partly because we've already convinced ourselves that they're bad, so why not... but also, in large part, to bolster our belief that our original decision was right.

The most chilling examples of this are in the justice system and international relations. In the justice system, cops and prosecutors are powerfully resistant to the idea that they might have made a mistake and put the wrong person in prison. As a result, they actively resist revisiting cases, even when new evidence turns up. And the justice system is, in far too many ways, structured to support this pattern.

As for this process playing out in international relations, I have just three words: "The Middle East." Any time you have a decades- or centuries-old "they started it" vendetta, you probably have one of these self-perpetuating rationalization processes on your hands. On all sides.

But this happens on a small scale as well, with individuals. I know that I've said snarky, mean things behind people's backs, for no good reason other than that friends of mine didn't like them and were being mean and snarky about them... and I've then convinced myself that I really couldn't stand that person, and gone on to say even more mean things about them. And I've more than once tried to convince my friends to dislike the people that I disliked... because if my friends liked them, it was harder to convince myself that my dislike was objectively right and true. All unconsciously, of course. It's taken time and perspective to see that that's what I was doing.

4) The more we have at stake in a decision, the harder we hang on to our rationalization for it.

This is a freaky paradox, but it makes a terrible kind of sense when you think about it. The further along we've gone with a bad decision, and the more we've committed to it, the more likely we are to justify it -- and to stick with it, and to invest in it even more heavily.

A perfect example of this is end-of-the-world cults. When people quit their jobs and sell their houses to follow some millennial leader, they're more likely to hang on to their beliefs, even though the world conspicuously did not end on August 22, 1997 like they thought it would. If someone doesn't sell their house to prepare for the end of the world -- if, say, they just take a week off work -- they'll find it easier to admit that they made a mistake.

And this is true, not just for bad decisions and mistaken beliefs, but immoral acts as well. Paradoxically, the worse the thing is that you've done, the more likely you are to rationalize it, and to stick to your rationalization like glue. As I wrote before when I mentioned this book: It's relatively easy to reconcile your belief that you're a good person with the fact that you sometimes make needlessly catty remarks and forget your friends' birthdays. It's a lot harder to reconcile your belief that you're a good person with the fact that you carved up a pregnant woman and smeared her blood on the front door. The more appalling your immoral act was, the more likely you are to have a rock-solid justification for it... or a justification that you think is rock-solid, even if everyone around you thinks it's transparently self-serving or batshit loony.

5) This process is necessary.

This may be the hardest part of all this to grasp. As soon as you start learning about the unconscious rationalization of cognitive dissonance, you start wanting to take an icepick and dig out the part of your brain that's responsible for it.

But in fact, rationalization exists for a reason. It enables us to make decisions without being paralyzed about every possible consequence. It enables us to have confidence and self-esteem, even though we've made mistakes in the past. And it enables us to live with ourselves. Without it, we'd be paralyzed with guilt and shame and self-doubt. Perpetually. We'd never sleep. We'd be second-guessing everything we do. We'd be having dark nights of the soul every night of our lives.

So that's the gist of the book. Cognitive dissonance, and the unconscious rationalizations and justifications we come up with to deal with it, are a basic part of human consciousness. It's a necessary process... but it also does harm, sometimes great harm. So we need to come up with ways, both individually and institutionally, to minimize the harm that it does. And since the process is harder to stop the farther along it's gone, we need to find ways to catch it early.

That's the concept. And I think it's important.

It's important because, in a very practical and down-to-earth way, this concept gives us a partial handle on why dumb mistakes, absurd beliefs, and harmful acts get perpetuated. And it gives us -- again, in a very practical, down-to-earth way -- a handle on what we can do about it.

We have a tendency to think that bad people know they're bad. Our popular culture is full of villains cackling over their beautiful wickedness, or trying to lure their children to The Dark Side. It's a very convenient way of positioning evil outside ourselves, as something we could never do ourselves. Evil is Out There, something done by The Other. (In fact, I'd argue that this whole cultural trope is itself a very effective support for rationalization. "Sure, I set the stove on fire/ shagged the babysitter/ gave my money to a con artist... but it's not like I'm Darth Vader.")

But reality isn't like that. Genuine sociopaths are rare. Most people who do bad things -- even terrible, appalling, flatly evil things -- don't think of themselves as bad people. They think of themselves as good people, and they think of their evil acts as understandable, acceptable, justifiable by the circumstances. In some cases, they even think of their evil acts as positive goods.

If we want to mitigate the effects of foolish beliefs, bad decisions, and hurtful acts, we need to look at the reality of how these things happen. We need to be vigilant about our own tendency to rationalize our mistakes. We need to be knowledgeable about how to effectively deal with other people's rationalizations. We need to create institutional structures designed to catch both our mistakes and our rationalizations, and to support us in acknowledging them. (The scientific method is a pretty good model of this.) And especially in America, we need to create a culture that doesn't see mistakes as proof of incompetence, misconceptions as proof of stupidity, and hurtful acts as proof of evil.

And this book offers us ways to do all of that.

The book isn't perfect. There are, for instance, some very important questions that it neglects to answer. Specifically, I kept finding myself wondering: What's the difference between rationalization and simple optimism, or positive thinking? What's the difference between rationalizing a bad decision, and just having a silver-lining, "seeing the bright side" attitude? And if there is a difference, how can you tell which one you're doing?

And, as a commenter here in the blog asked when I mentioned this book earlier: What's the difference between justifying why your bad behavior wasn't really bad -- and genuinely changing your mind about what is and isn't bad? Think of all the people who believed that homosexual sex was wrong, and that they were bad people for even thinking about it... until they actually did it, and spent time with other people who did it, and realized that there wasn't actually anything wrong with it. How do you tell the difference between a rationalization and a genuine change of heart?

Somewhat more seriously, the section on "What can we actually do about this?" is rather shorter than I would have liked. The authors do have some excellent practical advice on dealing with cognitive dissonance and rationalization. But while their advice on dealing with other people's rationalizations is helpful, and their ideas on creating institutional structures to nip the process in the bud are inspired, their advice for dealing with one's own dissonance/ rationalization pretty much comes down to, "Just try to be aware of it." Problematic -- since as they themselves point out, rationalization and justification are singularly resistant to introspection.

But it's a grand and inspiring start, an excellent foundation on an important topic. It's been a life-changer, and I recommend it passionately to everyone.

So what does it have to do with religion?

The most obvious relevance is this: For those of us who don't believe in it, religion clearly looks like a prime example of rationalization and justification of a mistaken belief. Religious apologetics especially. Since there's no hard evidence in the world to support the beliefs, the entire exercise -- all the explanations and defenses, all the "mysterious ways"es and "this part isn't meant literally"s and "you just have to take that on faith"s -- it all looks from the outside like one gigantic rationalization for a mistaken belief. It looks like a well-oiled mechanism for refusing to accept that you hold a belief -- and have based your life and your choices on a belief -- that is illogical and unsupported by evidence.

And it looks like a classic example of a social structure built to support one another in maintaining these rationalizations: supporting one another in rejecting alternatives, and repeating the beliefs to one another over and over until they gain the gravitas of authoritative truth.

(This is what I was trying to get at when I called religion a self-referential game of Twister. I dearly wish I'd read this book when I wrote that piece; it would have given me much clearer language to write it in.)

And the more contrary a belief is to reality, the more entrenched this mechanism becomes. The non-literal, science-appreciating, "God is love" believers are usually more ecumenical, better able to think that they don't know everything and that different beliefs may have some truth and validity. It's the literalists, the fundamentalists, the ones who deny well-established realities like evolution and the sanity of gay people and the geological age of the planet, who have the seriously entrenched rationalizations for their beliefs... and the powerful institutional structures for deflecting questions and evidence and doubt. ("Those questions come from Satan" is my current favorite.)

So that's the obvious relevance.

But there's a less obvious relevance as well. This is an important book for believers... but it's also an important book for atheists. And not just as a source of ammunition for our debates.

It's an important book for atheists because of its ideas on how to deal with people who are entrenched in rationalization -- and how really, really not to. One of the most important points this book makes is that there are useful ways to point out other people's rationalizations to them... and some not-so-useful ways. And screaming at someone, "What were you thinking? How could you be so stupid?" is one of the not-so-useful methods. In fact, it usually has the exact undesired effect -- it makes people defensive, and drives them deeper into their rationalizations.

Now, many atheists may decide that screaming, "How could you be so stupid?" is still a valid strategy. And in a larger, long-term sense, it may well be. If religion is the emperor's new clothes, having an increasingly large, increasingly vocal community of people chanting, "Naked! Naked! Naked!" may, in the long run, be quite effective in chipping away at the complicity that religion depends on, and making it widely known that there is an alternative. Especially with younger people, who aren't yet as entrenched in their beliefs. And it's already proven effective in inspiring other atheists to come out of the closet.

In one-on-one discussions and debates, though, it's not going to achieve much. And we need to be aware of that. If we're going to be all rational and evidence-based, we need to accept the reality of what forms of persuasion do and don't work.

But it's not just important for atheists to read this book to learn how to deal with believers' fallibility. It's important for atheists to read it to learn how to deal with our own.

Atheists, oddly enough, are human. And we therefore share the human tendency to rationalize and justify our beliefs and behavior. No matter how rational and evidence-based we like to think of ourselves as, we are not immune to this pattern.

And of particular relevance, I think, is one of the book's main themes: the human tendency to reject any and all ideas coming from people we disagree with. The more entrenched we get in a belief, the more unwilling we are to acknowledge that our opponents have any useful ideas whatsoever, or any valid points to make.

And I've definitely seen that play out in the atheosphere. I've seen an unfortunate tendency among some atheists to tag all believers as stupid; to reject religion as having nothing even remotely positive or useful to offer; to explain the widespread nature of religious belief by saying things like, "People are sheep."

I don't exempt myself from this. I think I've mostly been good about critiquing ideas rather than people; but I have gotten my back up when I thought someone was being unfair to me, and have refused to acknowledge that maybe I was being unfair as well. And I've definitely fallen prey to the error of thinking, "give 'em an inch and they'll take a mile"; of thinking that any concession at all is the first step to appeasement, and I have to stick to my guns like a mule. A mule with guns.

But this tendency isn't helpful. The issue of religion and not-religion is already polarizing enough on its own, without us artificially divvying the world into Us and Them.

If I'm right, and religion really is (among other things) an elaborate rationalization for hanging on to a mistaken belief... well, that doesn't make believers ridiculous and atheists superior. It puts us all in the same human boat. It puts religion in the same category as hanging onto ugly clothes and shoes that gave me blisters, for years, because I didn't want to admit that I'd made a mistake when I bought them. It puts it in the same category as going through with a disastrous marriage, because I didn't want to admit I'd made a mistake when I got engaged. It puts religion into a particular category of human fallibility... a fallibility that we all fall prey to, every day of our lives.

I'm not saying religion is okay. Let me be very clear about that. I think religion is a mistake; I think it's a harmful mistake; and I'm not going to stop speaking out against it. And I'm not asking anyone else to stop speaking out against it.

But for my own peace of mind, I'm making a sort of New Year's Resolution about cognitive dissonance. I'm resolving to be better about acknowledging when I make mistakes, and correcting them. I'm resolving to be better about acknowledging when people I disagree with make good points. And when I'm in one-on-one debates with people, I'm resolving to think, not just about why I'm right and they're wrong, but about what kind of argument is likely to persuade them.
