Cognitive Dissonance: Stop Lying to Yourself
It takes courage to face up to the lies we tell ourselves, but the work is worth it
Have you ever noticed that some people don't seem to be straight with themselves? And 'some people' could well include you and me.
Sometimes people seem to be in denial. It's like one part of them knows the truth but another part of them is so invested in seeing things a different way that they don't allow themselves to see reality.
Cognitive dissonance is worth knowing about because it explains so much human behavior. And not just other people's!
I want to tell you a quick but amazing story about the birth of the term 'cognitive dissonance'.
The flying saucer that never showed up
Imagine this: You've sold your home, quit your job, shunned your colleagues, and abandoned your friends and family.
Why? Because you are one of the select few who know the truth!
And what is this truth you know? The end of the world is nigh, and you know for a fact that you are one of the chosen ones who will be swept up from the 'great flood' at the stroke of midnight on 21 December 1954.
But midnight on 21 December comes around and there is no flood. No end of the world. No flying saucer to the rescue. No nothing (to use a double negative).
What do you do? Admit you were wrong? Acknowledge that you gave up position, money, friends... for nothing? Tell yourself and others you've been a gullible schmuck?
Not on your life.
But what do you do?
On being stood up by the aliens
This is where a certain social psychologist, Leon Festinger, enters our story.
Back in the mid-1950s Festinger infiltrated a flying-saucer doomsday cult. The members of this cult had given up everything on the premise that the world was about to self-destruct and that they, because of their faith, would be the sole survivors.
Their leader, ex-Dianetics (the movement that later became Scientology) enthusiast Dorothy Martin, had been receiving messages from aliens on the Planet Clarion telling her of the approaching worldwide deluge and stating that she and her followers would be rescued and carried off by flying saucer to the aliens' home planet.
Festinger posed as a cultist and was present when the spaceship failed to show up. He was curious about what would happen. How would the disappointed cultists react to the failure of their prophecy? Would they be embarrassed and humiliated?
What actually happened amazed him.
Cognitive dissonance: Who are you kidding?
In the lead-up to the fateful day, the cult had shunned publicity and shied away from journalists. Now, after the non-event, the cultists suddenly wanted publicity. They wanted media attention and coverage. Why? So they could explain how their faith and obedience had helped save the planet from the flood. The aliens had spared planet Earth for their sake - and now their role was to spread the word and make us all listen.
This fascinated Festinger. He observed that the real driving force behind the cultists' apparently inexplicable response was not the need to face the awkward and uncomfortable truth and change their minds, but rather the need to 'make minds comfortable' - to smooth over the unacceptable inconsistencies.
This is an extreme example, but I think cognitive dissonance goes on all the time in small but significant ways.
You can't handle the truth!
Festinger coined the term 'cognitive dissonance' to describe the uncomfortable tension we feel when we experience conflicting thoughts or beliefs (cognitions) or engage in behaviour that stands in opposition to our stated beliefs.
Think of the woman who states she believes in faithfulness but has an affair with a neighbour's husband. Now, to help balance the books and feel that she is consistent with her own values, she may have to convince herself that she is helping him because his wife is abusive or doesn't understand him. I have seen this happen.
Or take the 60-a-day smoker who tells you as they wheeze their guts up that smoking hasn't affected their health but they really do feel they are spending too much money on it. I've seen that happen too.
What about those people who fail to take any action towards some goal they had loudly and publicly proclaimed they were going to achieve, citing some lame reason as to why it wasn't a good idea after all? I have done that myself!
The woman doesn't say, "I am behaving in a way that contradicts my stated value system!" The smoker doesn't say, "Who am I kidding? This smoking is stealing my life from me!" And I've never said, "I am just too lazy to follow this goal!"
It can seem easier to employ a bit of comfortable denial. It's a shame, because that kind of honesty might just get us somewhere. And some people are that honest - or learn to be when they have to.
So why do people lie to themselves and others?
If you convince yourself of a lie it's no longer a lie
What is particularly interesting is the lengths to which people will go to reduce their inner tension rather than concede that they might, in fact, be wrong. They will accept almost any form of relief if it means they don't have to admit being at fault, or mistaken.
This is effectively choosing short-term comfort over long-term happiness. Cognitive dissonance isn't about deliberate lying, but rather about becoming convinced of something because it suits you to be convinced of it. One part of the self has lied to another part.
Festinger quickly realized that our tendency towards 'cognitive dissonance' could explain many mysteries of human behaviour. He wanted to see whether the type of denial he saw among members of the flying-saucer cult might also occur in everyday situations.
How much is a lie worth?
In a fascinating experiment, Festinger and a colleague had subjects perform a deliberately tedious task, then paid some of them 20 dollars to tell the next participant that the task had been enjoyable, while paying another group only one dollar to tell the same lie (Festinger & Carlsmith, 1959). Those who were paid just one dollar were far more likely to later claim that they had actually believed the lie they had told. Why?
Well, because it's just so much harder to justify having done something that conflicts with your own sense of being 'an honest person' for a mere pittance. If you get more money, you can tell yourself, "Yeah, I lied, but I got well paid! It was justified." But for one dollar? That's not a good enough reason to lie, so what you were saying must have been true in the first place... right?
Amazingly, this style of cognitive dissonance (what you might call 'not worth the candle' dissonance) was the very type instigated and manipulated by the Chinese during the Korean War to re-engineer the beliefs of their American prisoners of war - what came to be known as 'brainwashing'.
I must believe this; else I wouldn't do it!
Of course, there are other ways to induce and shape beliefs - fear and reward, for example - but the Chinese captors during the Korean conflict worked on their prisoners more subtly (at least some of the time) by taking advantage of cognitive dissonance effects.
Some US captives were offered extra rice or candies for writing anti-American, pro-Chinese essays. Some of the soldiers who went along with this subsequently converted to communism.
I would posit that this worked via the exact same mechanism as the one-dollar 'liars' coming to believe that they hadn't, in fact, told a lie. No one wants to believe they sold out for a bowl of rice or some candy. The Chinese captors seemed to instinctively know how cognitive dissonance works and how to use it to mould beliefs.
This is not to say that everybody will experience cognitive dissonance in every circumstance, but it does seem to happen more than we realize. It's as though we can brainwash ourselves sometimes.
Emotional reasoning is the cognitive distortion that happens when we try to justify our emotional reactions using a neat narrative that seems to fit our value system or ideology. This justification mechanism behaves rather like a press secretary or publicist, working hard to justify what the emotional mind is driving us to do. Justification replaces a search for truth.
When we feel we want to act violently towards some group or person we may seek to justify the violence (even killing) by demonizing and dehumanizing them: "They are scum/fascists/commies/cockroaches/vermin/rats!"
Rationalization and justification are always the first steps towards violence.
And, what's worse, it seems our brains reward us - rather like a drug addict getting their fix - when we rationalize away any information we don't want to hear. We feel really good when we have successfully convinced ourselves that "it ain't so!"
Dishonest politicians? Never!
Emotional factors influence how we vote for our politicians much more than our careful and logical appraisal of their policies, according to Drew Westen PhD, a professor of psychiatry and psychology at Emory University in Atlanta and author of The Political Brain: The Role of Emotion in Deciding the Fate of the Nation (Westen, 2008). Maybe this comes as no surprise.
But what about when we learn that our favoured politician may be dishonest? Do we take the trouble to really find out what they are supposed to have done, and so possibly have to change our opinions (and our vote), or do we experience that nasty cognitive dissonance and seek to keep our minds comfortable at the possible cost of truth?
Closing our minds down
Westen and colleagues conducted a study using functional magnetic resonance imaging (fMRI) to scan the brains of staunch Democrats and staunch Republicans in the USA. The scans showed that the emotional areas of participants' brains lit up when they read articles suggesting their favoured politician was dishonest. So far, nothing too surprising. But get this!
There was a decrease in activity in the parts of the brain that deal with reasoning when they read this damning information.
So part of reducing the discomfort of cognitive dissonance may be to think less. This makes sense. The flying-saucer cultists, when confronted with their unfulfilled doomsday prophecy, didn't think their way out of their dilemma; they felt their way out of it - they based their decisions on emotion.
When we make up our minds, sometimes we close them down.
The researchers in Westen's study found that all the participants appeared to find ways of ignoring negative information about their favoured politician, thus allowing them to hold onto their previous beliefs. And once the emotional response had overcome the reasoning - once a comfortable conclusion had been reached - the brain's reward system was stimulated, just as drugs stimulate it in addicts.
Nobody said that a sincere search for truth would be comfortable!
Get real: Do you suffer from cognitive dissonance?
"But I'm not like that!" I hear you cry. Well, maybe not, but remember that cognitive dissonance is unconscious. You don't consciously notice that there is a discrepancy between your beliefs and your behaviour. It all happens outside your awareness. But the discomfort still drives you inexorably to seek relief. That's how it works.
We often assume from our incessant information gathering and analysis that human beings are truth-seeking creatures, but much evidence indicates that maintaining our emotional stability is far more important to us than sharpening up our perceptions of reality.
Just because it's true doesn't mean we believe it! Especially, it seems, when it comes to our romantic partners.
You're much better than my ex
We like to think that our lives are improving and we are making good decisions. After all, nobody wants to think "You are worse than my last boyfriend." It's more comfortable to justify our choices with 'reasons'.
If we have chosen to be with someone new, then it would make sense to feel that this person has positive qualities that were lacking in previous partners - after all, we are with them, aren't we? We want to be happy with our decision.
Research shows, not surprisingly, that we tend to romanticize our current partners at the expense of our exes. A team of researchers, led by psychologist Glenn Geher of the State University of New York at New Paltz, found that most people rated their current partners as 'much better' than their exes, regardless of how the partners might actually stack up against one another (Geher et al., 2005). This, of course, was especially true of people who were happy in their new relationship. They just had to believe this new person was actually superior.
It sometimes seems as though we are swimming in an ocean of denial.
Cognitive dissonance all around... help!
If we experience cognitive dissonance over some issue, we seek to escape it in myriad ways. We believe things (or not) because to think otherwise may be too emotionally destabilizing. It's easier to make up reasons in your head - why that report really can wait until tomorrow, why it may be better to make a fresh start in the morning, why you should continue dating that lunatic - than to admit you might have been lazy or misguided.
The conscious mind is employed by the unconscious mind to justify our behaviour so that our self-concepts don't have to change too radically. No one likes to be wrong.
But learning to love being wrong may be the first painful step towards real self-knowledge.
How many people justify smoking, or never visiting an elderly relative, or any number of good things left undone or bad things committed? Having a rationale makes what you want to do seem like the right thing to do - even if it really isn't.
Many of the worst sins ever committed had plenty of rationalized 'reasons' in support of them.
Minds must be made comfortable
So what are we to make of all this? How are we to uncover truth when it seems that people who genuinely believe they are being honest are really deceiving themselves? When one part of the mind doesn't know what the other is doing?
To make things worse, the 'self-esteem industry' may have actually encouraged our already strong natural propensity to seek relief from cognitive dissonance through rationalization.
If the main imperative is to feel good about yourself at all costs (because you are 'worth it'), any behaviour that indicates your character really could do with a bit more work would be likely to rouse intolerable tension - and you would be that much keener to justify it to yourself rather than deal with it.
It's easier to assume someone else is bad or responsible than to take it on ourselves.
I really think we need to be braver. We need to find the will to face the sometimes uncomfortable truths about ourselves and our behaviour.
It's not just about thinking well of yourself.
Interestingly, self-justifications don't always have to put us in a positive light - just a consistent one. People with low self-esteem can be uncomfortable with evidence that puts them in a better light, and may cling to low self-evaluations by explaining away their successes.
If you suffer from low self-esteem it can be more comfortable to continue feeling you are worthless than to have to change your self-concept.
But let's go a little deeper.
'Spirituality', or self-indulgence?
Some people may lack the self-knowledge or objectivity to know when they are being cowardly, lazy, cruel, or attention-seeking at the expense of others. But these self-same people may believe that they are 'working on themselves' by burning incense, using the word 'spiritual' a lot, or doing all manner of things they have been conditioned to believe have to do with self-improvement.
Being an official man or woman 'of the cloth' is certainly not synonymous with being 'a good person'. A superficial spirituality is readily constructed from the outward forms of special modes of dress, distinctive jargon, and ritual behaviours. It makes an excellent - and even outwardly convincing - relief valve and camouflage for any cognitive dissonance arising from a mismatch between behaviours and beliefs. In this sense an entire identity can become a form of rationalizing away bad behavior.
Of course sometimes it's what we've invested time or money or emotion in that makes us want to keep our heads firmly buried in the sand.
The cost of commitment
Cognitive dissonance is essentially a matter of commitment to the choices one has made and an ongoing need to satisfactorily justify that commitment, even in the face of convincing but conflicting evidence.
This is why it can take a long time to leave a cult or an abusive relationship - or even to stop smoking. Life's commitments, whether to a job, a social cause, or a romantic partner, require heavy emotional investment, and so carry significant emotional risks. If you have put years of effort and heartache into a relationship it might feel too much to 'cut your losses' and get out, even if it really is the best thing to do.
In a way, it makes sense that our brains should be hardwired for monitoring and justifying our choices and actions. We do this to avoid too much truth breaking in on us at once and overwhelming us.
This is why we throw good money after bad when a financial decision seems to be backfiring, clutch at the straws of a fading relationship, or 'send more troops' into ill-advised military adventures.
So where does this leave us?
Growing up and making cognitive dissonance work for you
I guess we can't really develop unless we start to get a grip and have some personal honesty about what really motivates us. This is part of genuine maturity.
If I know I am being lazy, and can admit it to myself, that at least is a first step towards correcting it. If, however, I tell myself it's more sensible to wait until later to vacuum, then I can go around with a comfortable self-concept of 'being sensible' while my filthy carpets and laziness remain unchanged.
On the other hand, it's worth considering that sometimes when we behave in ways that contradict our value system it might be because the value system itself is too narrow and rigid.
Cognitive dissonance can actually help me mature, if I can bring myself first to notice it (make it conscious), and second to be open to the message it brings, however uncomfortable it may be.
As dissonance increases, provided I do not get carried away with self-justification, I can get a clearer and clearer sense of what has changed, and what I need to do about it.
And then I can remember what Darwin had to say about who will survive...
References
- Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal and Social Psychology, 58(2), 203-210. http://dx.doi.org/10.1037/h0041593
- Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. presidential election. Journal of Cognitive Neuroscience, 18(11), 1947-1958.
- Geher, G., Bloodworth, R., Mason, J., Stoaks, C., Downey, H. J., Renstrom, K. L., & Romero, J. F. (2005). Motivational underpinnings of romantic partner perceptions: Psychological and physiological evidence. Journal of Social and Personal Relationships, 22(2), 255-281.
- Westen, D. (2008). The political brain: The role of emotion in deciding the fate of the nation. New York: PublicAffairs.