Integrity. It’s a quality every man worth his salt aspires to. It encompasses many of the best and most admirable traits in a man: honesty, uprightness, trustworthiness, fairness, loyalty, and the courage to keep one’s word and one’s promises, regardless of the consequences. The word integrity derives from the Latin for “wholeness” and it denotes a man who has successfully integrated all good virtues – who not only talks the talk, but walks the walk.
It’s not too difficult to discuss this quality in a general way and offer advice on maintaining one’s integrity of the “just do it” variety. But a quick glance at the never-ending news headlines trumpeting the latest scandal and tale of corruption shows that that’s not always the most effective approach. While the foundation of integrity is having a firm moral code of right and wrong, it can also be enormously helpful, even crucial, to understand the psychological and environmental factors that can tempt us to stray from that code. What’s at the root of our decision to sometimes compromise our principles? What kinds of things lead us to be less honest and what kinds of things help us to be more upright? What are some practical ways we can check our temptations to be immoral or unethical? How can we strengthen not only our own integrity, but the integrity of society as well?
In this four-part series on integrity, we will use the research of Dan Ariely, professor of psychology and behavioral economics, and others in order to answer these vital questions.
Why Do We Compromise Our Integrity?
Every day we are faced with little decisions that reflect on our integrity. What’s okay to call a business expense or put on the company charge card? Is it really so bad to stretch the truth a little on your resume in order to land your dream job? Is it wrong to do a little casual flirting when your girlfriend isn’t around? If you’ve missed a lot of class, can you tell your professor a family member died? Is it bad to call in sick to work (or to the social/family function you’re dreading) when you’re hungover? Is it okay to pirate movies or use ad block when surfing the web?
For a long time it was thought that people made such decisions by employing a rational cost/benefit analysis. When tempted to engage in an unethical behavior, they would weigh the chances of getting caught and the resulting punishment against the possible reward, and then act accordingly.
However, experiments by Dr. Ariely and others have shown that far from being a deliberate, rational choice, dishonesty often results from psychological and environmental factors that people typically aren’t even aware of.
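The rational cost/benefit model described above can be sketched as a simple expected-value calculation. This is a hypothetical illustration of that naive model (the function name and all numbers are mine, not Ariely's), useful mainly for seeing what the model predicts and where those predictions fail:

```python
def expected_value_of_cheating(reward, p_caught, penalty):
    """Naive 'rational' model: cheat only if the expected payoff is positive.

    reward   -- money gained by cheating
    p_caught -- probability of being caught (0 to 1)
    penalty  -- cost incurred if caught
    (All names and figures here are illustrative, not from the study.)
    """
    return (1 - p_caught) * reward - p_caught * penalty

# Under this model, removing any chance of being caught (p_caught = 0)
# or raising the reward should always make cheating more attractive:
print(expected_value_of_cheating(5.00, 0.0, 50.00))   # 5.0
print(expected_value_of_cheating(10.00, 0.0, 50.00))  # 10.0
```

As the experiments below show, actual behavior contradicts both predictions: eliminating the chance of being caught did not increase cheating, and raising the reward to its maximum actually decreased it.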
Ariely discovered this truth by constructing an experiment where participants (consisting of college students) sat in a classroom-like setting and were given 20 mathematical matrices to solve. They were tasked with solving as many matrices as they could within 5 minutes and given 50 cents for each one they got correct. Once the 5 minutes were up, the participants would take their worksheets to the experimenter, who counted up the correct answers and paid out the appropriate amount of money. In this control condition, participants correctly solved an average of 4 matrices.
Ariely then introduced a condition that allowed for cheating. Once the participants were finished, they checked their own answers, shredded their worksheets at the back of the room, and self-reported how many matrices they had correctly solved to the experimenter at the front, who then paid them accordingly. Once the possibility of cheating was introduced, participants claimed to solve 6 matrices on average – two more than the control group. Ariely found that given the chance, lots of people cheated – but just by a little bit.
To test the idea that people were making a cost/benefit analysis when deciding whether or not to cheat, Ariely introduced a new condition that made it clear there was no chance of being caught: after checking their own answers and shredding their worksheets, participants retrieved their payout not from the experimenter, but by grabbing it out of a communal bowl of cash with no one watching. Yet contrary to expectations, removing the possibility of being caught did not increase the rate of cheating at all. So then Ariely tried upping the amount of money participants could earn for each correctly solved matrix; if cheating was indeed a rational choice based on financial incentive, then the rate of cheating should have risen along with the reward. But boosting the possible payout had no such effect. In fact, when the reward was at its highest – $10 for each correct answer the participant had only to claim he had gotten – cheating went down. Why? “It was harder for them to cheat and still feel good about their own sense of integrity,” Ariely explains. “At $10 a matrix, we’re not talking about cheating on the level of, say, taking a pencil from the office. It’s more akin to taking several boxes of pens, a stapler, and a ream of printer paper, which is much more difficult to ignore or rationalize.”
This, Ariely had discovered, went to the root of people’s true motivations for cheating. Decisions to be dishonest aren’t made solely on the basis of risk vs. reward; they’re also greatly influenced by the degree to which they’ll affect our ability to still see ourselves in a positive light. Ariely explains these two opposing drives:
“On one hand, we want to view ourselves as honest, honorable people. We want to be able to look at ourselves in the mirror and feel good about ourselves (psychologists call this ego motivation). On the other hand, we want to benefit from cheating and get as much money as possible (this is the standard financial motivation). Clearly these two motivations are in conflict. How can we secure the benefits of cheating and at the same time still view ourselves as honest, wonderful people?
This is where our amazing cognitive flexibility comes into play. Thanks to this human skill, as long as we cheat by only a little bit, we can benefit from cheating, and still view ourselves as marvelous human beings. This balancing act is the process of rationalization, and it is the basis of what we’ll call the ‘fudge factor theory.’”
The “fudge factor theory” explains how we decide where to draw the line between “okay” and “not okay,” between decisions that make us feel guilty and those we find a way to confidently justify. The more we’re able to rationalize our decisions as morally acceptable, the wider this fudge factor margin becomes. And most of us are highly adept at it: Everyone else is doing it. This just levels the playing field. They’re such a huge company that this won’t affect them at all. They don’t pay me enough anyway. He owes me this. She cheated on me once too. If I don’t, my future will be ruined.
Where you draw the line and how wide you allow your fudge factor margin to become is influenced by a variety of external and internal conditions, the most important one being this: simply taking a first, however small, dishonest step. Other conditions can increase or decrease your likelihood of taking that initial step, and we’ll discuss them in the subsequent parts of this series. But since whether or not you make that first dishonest decision very often constitutes the crux of the matter, let us begin there.