People don’t know when they’re lying to themselves

By Ed Yong
March 07, 2011

“I am on a drug. It’s called Charlie Sheen. It’s not available because if you try it, you will die. Your face will melt off and your children will weep over your exploded body.” – Charlie Sheen

“We put our fingers in the eyes of those who doubt that Libya is ruled by anyone other than its people.” – Muammar Gaddafi

You don’t have to look far for instances of people lying to themselves. Whether it’s a drug-addled actor or an almost-toppled dictator, some people seem to have an endless capacity for rationalising what they did, no matter how questionable. We might imagine that these people really know that they’re deceiving themselves, and that their words are mere bravado. But Zoe Chance from Harvard Business School thinks otherwise.

Using experiments where people could cheat on a test, Chance has found that cheaters not only deceive themselves, but are largely oblivious to their own lies. Their ruse is so potent that they’ll continue to overestimate their abilities in the future, even if they suffer for it. Cheaters continue to prosper in their own heads, even if they fail in reality.

Chance asked 76 students to take a maths test, half of whom could see an answer key at the bottom of their sheets. Afterwards, they had to predict their scores on a second, longer test. Even though they knew that they wouldn’t be able to see the answers this time round, they imagined higher scores for themselves (81%) if they had the answers on the first test than if they hadn’t (72%). They might have deliberately cheated, or they might have told themselves that they were only looking to “check” the answers they knew all along. Either way, they had fooled themselves into thinking that their strong performance reflected their own intellect, rather than the presence of the answers.

And they were wrong – when Chance asked her recruits to actually take the hypothetical second test, neither group outperformed the other. Those who had used the answers the first time round were labouring under an inflated view of their abilities.

Chance also found that the students weren’t aware that they were deceiving themselves. She asked 36 fresh recruits to run through the same hypothetical scenario in their heads. Those who imagined having the answers predicted that they’d get a higher score on the first test, but not that they’d go on to expect a better score on the second. They knew that they would cheat the test, but not that they would cheat themselves.

Some people are more prone to this than others. Before the second test, Chance gave the students a questionnaire designed to measure their capacity for deceiving themselves. The “high self-deceivers” not only predicted that they would get better scores in the second test, but they were especially prone to “taking credit for their answers-aided performance”.

These experiments are part of a rich vein of psychological studies, which show just how easy it is for people to lie to themselves. In a previous (and smaller) study, Chance herself asked 23 men to choose between two fake sports magazines, one with broader coverage and one with more features. She found that the volunteers would pick whichever one was accompanied by a special swimsuit cover, but they cited the coverage or features as the reason for their choice (Chance even titled her paper “I read Playboy for the articles”).

In 2004, Michael Norton (who worked with Chance on the latest study) showed that people can explain away biases in recruitment choices just as easily. He asked male volunteers to pick male or female candidates for the position of construction company manager. For some of the recruiters, the male candidate had more experience but poorer education and for others, he had better education but less experience. In both cases, the recruits preferred the male applicant, and they cited whichever area he was strongest in as the deciding factor. Norton found the same trends in racial biases in college admissions.

In these cases, it’s debatable whether the volunteers were actually lying to themselves, or merely justifying their choices to the researchers. But Chance addressed that problem in her latest study by putting money on the line. In a variant of the same experiment, she told a new batch of recruits that they could earn up to $20 depending on their score on the second test and how accurately they predicted that score. Despite the potential reward, the group that saw the answers were no better at predicting their scores. And as a result, they earned less money. Even when there was an actual reward at stake, they failed to correct for their self-deception.

Things get even worse when people are actually rewarded for cheating. In a final experiment, Chance gave some of the students a certificate of recognition, in honour of their above-average scores. If students saw the answers on the first test and got the certificate, they predicted that they would get even higher scores on the second. Those who didn’t see the answers the first time round were unmoved by the extra recognition.

This final result could not be more important. Cheaters convince themselves that they succeed because of their own skill, and if other people agree, their capacity for conning themselves increases. Chance puts it mildly: “The fact that social recognition, which so often accompanies self-deception in the real world, enhances self-deception has troubling implications.”

This tells us a little about the mindset of people who fake their research, who build careers on plagiarised work or who wave around spurious credentials. There’s a tendency to think that these people know full well what they’re doing and go through life with a sort of Machiavellian glee. But the outlook from Chance’s study is subtler.

She showed that even though people know that they occasionally behave dishonestly, they don’t know that they can convincingly lie to themselves to gloss over these misdeeds. Their scam is so convincing that they don’t know that they’re doing it. As she writes, “Our findings show that people not only fail to judge themselves harshly for unethical behaviour, but can even use the positive results of such behaviour to see themselves as better than ever.”
