Why do we overestimate our ethical behavior? Stages and dynamics of our decision-making process
Behavioral ethics series #4: creating successful companies attuned to the values and challenges of the 21st century
In the previous text of this series, I described how our limited rationality and cognitive biases lead us to overestimate our ethical behavior.
In this text, I am going to address the second reason why we usually fall short of our ethical standards.
In addition to our distorted perception of reality, the inflated perception of our ethical behavior also results from the stages and dynamics of our decision-making process.
Let’s start with the first point. Our decisions can be divided into three moments: planning (pre-decision), action (the actual moment of the decision), and evaluation (post-decision).[i]
In each situation, there is usually tension between doing what is best for us (what “I want to do”) and doing what we believe is the right behavior for a harmonious coexistence with the world (what “I ought to do”).
What “I want to do” corresponds to the impulsive decision that is most pleasurable or personally advantageous in that situation. What “I ought to do,” in turn, corresponds to the more reflective action based on the principles and values we intend to uphold for a cooperative life in society.
A simple traffic situation demonstrates how this process unfolds. Imagine you need to make a left turn on a three-lane avenue during your commute to work on a day when you are particularly pressed for time.
Only one lane allows the left turn, while the others are used by vehicles continuing straight ahead. However, there is a long line of cars in the left-turn lane. What “I ought to do” is obviously go to the end of the line just like all the other vehicles.
However, to avoid spending time queueing like everyone else, it may be tempting to drive ahead in the center lane and make the turn at the very last moment, cutting the line by jumping in front of a vehicle in the left lane. This often represents what “I want to do.”
How do you think you would act in such a situation?
Pre-decision: The planning
Before making decisions, almost everybody plans to act ethically. This is the moment in which the “rider” is in charge, and we have a broader, long-term, and deontological (principle-based) view of our actions: “I should behave ethically, and that is how I am going to conduct myself!”
In the traffic situation of the previous section, we all believe (or like to believe) that we would respect the norms by queueing just like the other drivers. The problem, though, is that research shows we have a strong tendency to mispredict our behavior in social situations.[ii]
One paper, for example, compared people’s forecasts of their behavior during negotiations with their actual performance when they were indeed bargaining. The researchers concluded that there was a complete disconnect between negotiators’ forecasts and their actual behaviors. In their words, “Negotiators may think they will fight fire with fire, think they will be lions that roar, but in the end, they are merely mice that whimper.”[iii]
Action: The moment of decision
When faced with the situation itself, a conflict arises between what we “ought to do” and what “we want to do.” The “want self,” which represents the visceral reaction of our “elephant,” often takes over and dominates our thoughts at this point.
As a result, the principle-based, long-term view we displayed at the outset usually gives way to a narrow, short-term view focused on immediate gain or pleasure.
The “heat of the moment” can also increase the likelihood of deviating from our values and principles by preventing us from perceiving the wider consequences of our actions. Implicitly, it works as if we tell ourselves: “I do not realize the ethical implications of this decision, so I’ll do what’s best for me.”
Moving back to the traffic example, this impulse could lead us to avoid the line on the left and follow the center lane until jumping in front of a queueing vehicle at the moment of the turn.
In cases of corruption and organizational fraud, something similar may occur. The vast majority of people firmly believe they would always act in line with what they “ought to do.”
That is, they believe they would act according to their morals, ideals, and values. In many cases, though, individuals end up behaving differently when the moment actually arrives, subdued by their immediate gain or pleasure.
Post-decision: The evaluation
As we distance ourselves from our thoughtless decision, the ethical implications of our actions start to come to mind with increasing intensity. It is at this moment that an uncomfortable feeling often arises from the gap between the ethical image we have of ourselves and our actual conduct.
This state of psychological stress resulting from the internal contradiction between our values and our actual behavior is called “cognitive dissonance.” Psychologist Leon Festinger coined the term after conducting several experiments in the 1950s.[iv] In a classic study, he found that participants who received $1 to perform an extremely boring exercise rated the activity as more enjoyable than those who received $20 to do the same task.
People who received $1 did not want to admit that they had taken part in such a tedious exercise for so little. As a result, to preserve internal psychological consistency, they convinced themselves that the task had been enjoyable. His experiments, therefore, showed how the quest to eliminate cognitive dissonance can lead us to deceive ourselves without even realizing it.
In the business world, cognitive dissonance can be represented by the gap between a manager’s genuine desire to behave honestly and his or her unethical (or even illegal) actions in the exercise of his or her professional position.
Cognitive dissonance causes irritation and stress and drains our energy. To cope with it, we go through a process called “moral disengagement,” by which we deactivate our moral self-regulatory processes and end up persuading ourselves that our questionable conduct was acceptable.[v]
At this point, the so-called “rationalizations” come into play. These are the justifications for dubious behaviors we tell ourselves through the use of seemingly rational arguments to avoid being seen as the bad guys of our own narrative.
Rationalizations, which are one of the main mechanisms of moral disengagement, can take many forms. For instance, we may portray an unethical behavior as serving a greater purpose, misrepresent its harmful consequences, or even dehumanize or blame the victims of our unethical actions.[vi]
Our mind has an incredible ability to rationalize improper behaviors. The greater the cognitive dissonance, the greater the motivation to rationalize our conduct to make it more tolerable.
In the simple traffic example discussed in this chapter, we could justify our unethical deed based on rationalizations such as: “it was only today that I acted like this,” “I was late,” “everybody does it,” or “this did not hurt anyone.”
In addition to the rationalizations, we have developed a series of memory biases designed to protect our self-esteem and minimize our cognitive dissonance.[vii] Our minds are so malleable and selective that we are even able to unconsciously distort the memories of our decisions to feel better.
There is evidence, for example, that we remember in more detail the situations in which we behaved ethically than those in which we acted in a questionable way, a process called “motivated forgetting.”[viii]
One of the main works in this area was published in 2016 with the appropriate title “Memories of Unethical Actions become Obfuscated over Time.”[ix] Based on nine different experiments with more than 2,100 participants, the researchers concluded that, after engaging in dishonest behavior, individuals’ memories of their actions become increasingly obscured due to the psychological distress and discomfort caused by such misdeeds.
In one experiment, for example, volunteers participated in 10 rounds of a coin-toss task in which they could lie to increase their pay. Two weeks later, they were asked to remember details of the coin-toss task and of another event that occurred on the same day (e.g., their dinner that night). Researchers observed that the greater the amount of cheating on the coin-toss task, the lower the quality of participants’ memories about the experiment (memories concerning the dinner on the same day, though, were not affected).
The conclusion was the same for several variants of the experiment performed with different groups. The authors named this process “unethical amnesia,” a situation in which memories of our past unethical behavior become less clear, less detailed, and less vivid over time than memories of ethical actions (or of actions unrelated to ethics). For them, unethical amnesia is the main reason why people are more likely to repeat dishonest acts over time.[x]
The results of this and similar papers show that we continually act as “revisionist historians” of our own acts in order to preserve our positive self-image.
In other words, our plastic and selective mind often allows us to have the best of both worlds: We manage to maximize our personal gains (in the traffic example, making the turn without queueing) while keeping our positive self-image intact (“I should have behaved ethically. Well … on second thought, that’s how I indeed acted!”).
In the jargon of psychologists, this is the outcome of the so-called “self-concept maintenance theory.”[xi] This approach argues that we are often divided between two competing motivations: gaining from cheating versus maintaining our positive self-concept as honest persons.
As a result, we tend to solve this motivational dilemma by finding a balance: we behave dishonestly enough to profit from the situation but honestly enough to maintain our positive self-view as decent individuals.
As Harvard Business School Professor Eugene Soltes observed after interacting with senior executives convicted of creating Ponzi schemes: “None of the former executives I spoke with saw himself as a fraud… the person they saw in the mirror was successful, entrepreneurial, and ambitious. They didn’t see themselves as the kind of people who created fraudulent enterprises. Most victims of deception are easy to spot… but the individuals perpetrating these schemes were also victims of a sort — victims of their own self-deception.”[xii]
Many recent experiments corroborate this idea. They show that the vast majority of people inflate their performance to increase their pay. This dishonesty, though, goes only up to a certain point, beyond which it would be difficult to maintain a positive self-concept (in general, studies show that the average magnitude of dishonesty ranges from 10% to 20% of legitimate claims).[xiii]
In short, we try to look fair and honest not only in front of others but also to ourselves.[xiv] This makes us more likely to perform small unethical deeds of easy rationalization, as they allow us to earn undue benefits without harming the positive image we have of ourselves.[xv]
People who say they can sleep with a “clean conscience” after dishonest acts, therefore, may well be telling the truth, although that does not mean they behaved honestly. The figure below illustrates our decision-making process. It shows how we can emerge from a situation in which we acted reprehensibly still convinced that “ethics indeed is what matters.”
Stages of our decision-making process.
Source: Adapted from Bazerman, M., Tenbrunsel, A. 2011. Blind Spots. Princeton University Press.
In addition to the temporal inconsistencies that often prevent us from being as ethical as we desire to be, another issue makes it difficult for us to always exhibit proper conduct: Our ethical behavior tends to be dynamic and volatile rather than being always just “good” or “bad.”
According to psychologists, we have a kind of mental scoreboard that counts our good and bad deeds for the sake of achieving a “moral balance.”[xvi] The figure below illustrates this concept.
Moral balance: Our mental scoreboard that computes right and wrong behaviors.
Moral balance is divided into two aspects: moral compensation and moral licensing.
When we act below our ethical standards, we tend to feel bad and try to compensate for our bad behavior with good deeds. As a result, we actively look for an opportunity to do something good in order to regain the balance of our mental scoreboard. This is called moral compensation or moral cleansing.[xvii]
The need to compensate for our bad behaviors may occur in a curious way: One study, for example, divided participants into two groups. In one, participants were asked to recall in detail an ethical action from their past and to describe any related feelings or emotions they experienced. The other group, in turn, was asked to recall an unethical act.
Both groups then engaged in a word completion task in which they converted word fragments into meaningful words. One of them, for instance, was W_ _H. In this case, the full word could be “wish” or “wash.” Participants who recalled an unethical deed generated many more cleansing-related words (in this case, “wash”) than those who recalled an ethical deed.
At the end of the experiment, participants were also offered a choice between an antiseptic wipe and a pencil as a free gift. Those who recalled unethical acts were twice as likely to pick the antiseptic wipe (67%) as those who recalled ethical deeds (33%).[xviii]
The concept of moral compensation applies not only to individuals but also to organizations. It is common, for example, to see companies involved in serious environmental or human rights issues investing more in sustainability and corporate responsibility initiatives in an attempt to cleanse their sins.[xix] Scientific evidence supports this claim.[xx]
A study of 3,000 US companies over 15 years, for instance, concluded that firms tend to engage in corporate social responsibility (CSR) with the purpose of offsetting corporate social irresponsibility such as tax problems, controversial investments, or environmental fines.[xxi] Specifically, the researchers observed that, when companies do more “harm,” they also do more “good.”
This result was particularly strong in industries in which social irresponsibility tends to be subject to greater public scrutiny, such as the chemical, pharmaceutical, and automobile industries. According to the authors, companies tend to use CSR strategically as a marketing tool to compensate for the negative effects of irresponsible conduct.
Another paper in this field analyzed 4,500 companies over 19 years. It also concluded that firms’ CSR initiatives seem to trail their social irresponsibility. The authors also found that CSR has largely worked as a way for companies to reduce the costs to society stemming from irresponsible practices without having to pay for them in full.[xxii]
There is also the other side of the coin. When we do something good, we may feel a surplus on our mental scoreboard. This may give us a feeling that we are entitled not to live up to our own ethical standards. This is called “moral licensing.”[xxiii] Moral licensing is dangerous because it suggests that it is precisely when we are feeling good about our conduct that we are in greater danger of behaving unethically.
Research found, for instance, that people act less altruistically after purchasing green products than after buying conventional products.[xxiv] Even more worryingly, a second related experiment showed that “green” consumers became more likely to cheat and steal than buyers of conventional products. According to the authors, purchasing green products provides a sort of moral credential that can produce the counterintuitive effect of fostering selfish and unethical behaviors.[xxv]
The scientific evidence on moral licensing makes it clear that good behavior today is no guarantee of good behavior tomorrow. We should therefore understand how our past behavior may influence our future decisions, lest we succumb to the tendency to treat ethics as a zero-sum game.
To sum up:
§ Our decisions can be divided into three moments: planning, action, and evaluation. In each situation, there is frequently tension between doing what is best for us and doing what we believe is the right behavior.
§ Before making decisions, everybody plans to act ethically. In this moment, we have a broader, long-term, and principle-based view of our actions. When faced with the situation itself, a conflict often arises between doing what is best for us and what is the right thing to do. As a result, our principle-based view frequently gives way to a narrow view focused on our immediate gain or pleasure.
§ As we distance ourselves from a bad ethical decision, we are likely to experience an uncomfortable feeling resulting from the internal contradiction between our values and our actual behavior called “cognitive dissonance”. To cope with it, we often resort to the so-called “rationalizations”: justifications for dubious behaviors we tell ourselves to avoid being seen as the bad guys of our own narrative.
§ Our mind has an incredible ability to rationalize improper behaviors. It is fair to say that we act as “revisionist historians” of our own acts in order to preserve our positive self-image.
§ In addition to our temporal inconsistencies, we have a kind of mental scoreboard that counts our good and bad deeds for the sake of achieving a “moral balance”. Moral balance is divided into two aspects: moral compensation and moral licensing.
§ When we act below our ethical standards, we tend to feel bad and try to compensate for our bad behavior with good deeds. As a result, we actively look for an opportunity to do something good in order to regain the balance of our mental scoreboard. This is called moral compensation or moral cleansing.
§ Now, the other side. When we do something good, we may feel a surplus on our mental scoreboard. This may give us a feeling that we are entitled not to live up to our own ethical standards. This is called “moral licensing.”
Prof. Dr. Alexandre Di Miceli is a professional speaker, business thinker and founder of Virtuous Company, a top management consultancy that provides cutting edge knowledge on corporate governance, ethical culture, leadership, diversity, and company purpose.
He is the author of “The Virtuous Barrel: How to Transform Corporate Scandals into Good Businesses” as well as of the best-selling books on corporate governance and business ethics in Brazil, including “Corporate Governance in Brazil and in the World”, “Behavioral Business Ethics: Solutions for Management in the 21st Century”, and “Corporate Governance: The Essentials for Leaders”.
He thanks Prof. Dr. Angela Donaggio for her valuable comments and suggestions.
[i] These steps are described by Bazerman and Tenbrunsel (2011). Source: Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots: Why we fail to do what’s right and what to do about it. Princeton University Press.
[iii] Diekmann et al. (2003: 673). Source: Diekmann, K. A., Tenbrunsel, A. E., & Galinsky, A. D. (2003). From self-prediction to self-defeat: Behavioral forecasting, self-fulfilling prophecies, and the effect of competitive expectations. Journal of Personality and Social Psychology, 85(4), 672.
[iv] Festinger and Carlsmith (1959). Source: Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. The Journal of Abnormal and Social Psychology, 58(2), 203–210.
[v] For more on moral disengagement, see Welsh et al. (2014), Detert et al. (2008), and Bandura (1990). Sources: Welsh, D. T., Ordoñez, L. D., Snyder, D. G., & Christian, M. S. (2014). The slippery slope: How small ethical transgressions pave the way for larger future transgressions. Journal of Applied Psychology, 100(1), 114; Detert, J. R., Treviño, L. K., & Sweitzer, V. L. (2008). Moral disengagement in ethical decision making: a study of antecedents and outcomes. Journal of Applied Psychology, 93(2), 374.
[vi] Bandura (1990: 27); Bandura, A. (1990). Selective activation and disengagement of moral control. Journal of Social Issues, 46(1), 27–46.
[vii] A full list of these biases is available at https://en.wikipedia.org/wiki/List_of_memory_biases.
[viii] Kouchaki and Gino (2016), Anderson and Hanslmayr (2014) and Shu et al. (2011). Sources: Kouchaki, M., & Gino, F. (2016). Memories of unethical actions become obfuscated over time. Proceedings of the National Academy of Sciences, 113(22), 6166–6171; Anderson, M. C., & Hanslmayr, S. (2014). Neural mechanisms of motivated forgetting. Trends in cognitive sciences, 18(6), 279–292; Shu, L. L., Gino, F., & Bazerman, M. H. (2011). Dishonest deed, clear conscience: When cheating leads to moral disengagement and motivated forgetting. Personality and Social Psychology Bulletin, 37(3), 330–349.
[ix] Kouchaki and Gino (2016: 6166). Source: Kouchaki, M., & Gino, F. (2016). Memories of unethical actions become obfuscated over time. Proceedings of the National Academy of Sciences, 113(22), 6166–6171.
[x] One of the experiments in this research asked participants to remember unethical actions they had committed or that others had committed. People remembered their own unethical acts with less clarity but continued to remember the dishonesty of others with high clarity.
[xi] Mazar et al. (2008). Source: Mazar, N., Amir, O., & Ariely, D. (2008). The dishonesty of honest people: A theory of self-concept maintenance. Journal of marketing research, 45(6), 633–644.
[xii] Soltes (2016: 257). Source: Soltes, E. (2016). Why they do it: inside the mind of the white-collar criminal. PublicAffairs.
[xiii] Ayal and Gino (2011) and Gino et al. (2009). Sources: Ayal, S., & Gino, F. (2011). Honest rationales for dishonest behavior. The social psychology of morality: Exploring the causes of good and evil. Washington, DC: American Psychological Association, 149–66; Gino, F., Ayal, S., & Ariely, D. (2009). Contagion and differentiation in unethical behavior the effect of one bad apple on the barrel. Psychological science, 20(3), 393–398.
[xiv] Touré-Tillery and Fishbach (2012) provide interesting evidence on this. Source: Touré-Tillery, M., & Fishbach, A. (2012). The end justifies the means, but only in the middle. Journal of Experimental Psychology: General, 141(3), 570.
[xv] Still according to this theory, the greater the ambiguity of a situation, the greater the propensity to rationalize our actions in order to maximize our personal benefit.
[xvi] Nisan (1990) developed the first conceptual model on “moral balance”. Source: Nisan, M. (1990). Moral balance: A model of how people arrive at moral decisions. The moral domain, 283–314.
[xvii] Tetlock et al. (2000). Source: Tetlock, P. E., Kristel, O. V., Elson, S. B., Green, M. C., & Lerner, J. S. (2000). The psychology of the unthinkable: taboo trade-offs, forbidden base rates, and heretical counterfactuals. Journal of personality and social psychology, 78(5), 853.
[xviii] In a variant, participants were told to hand-copy a short story written in the first person. The story described either an ethical deed (helping a co-worker) or an unethical act (sabotaging a co-worker). Afterwards, they were asked to rate the desirability of various products on a 1–7 scale. Researchers observed that those who copied the unethical story rated cleansing products (such as soaps and disinfectants) as more desirable, relative to other products (such as juices and CD cases), than those who copied the ethical story.
[xix] In Enron’s notorious case, Kenneth Lay, its Chairman of the Board, used to spend most of his time on charity events. Oftentimes, the greatest offender can be the greatest philanthropist!
[xx] Kang et al. (2016), Kotchen and Moon (2012), and Heal (2005). Sources: Kang, C., Germann, F., & Grewal, R. (2016). Washing away your sins? Corporate social responsibility, corporate social irresponsibility, and firm performance. Journal of Marketing, 80(2), 59–79; Kotchen, M., & Moon, J. J. (2012). Corporate Social Responsibility for Irresponsibility. The BE Journal of Economic Analysis & Policy, 12(1), 1–23; Heal, G. (2005). Corporate social responsibility: An economic and financial framework. The Geneva Papers on Risk and Insurance Issues and Practice, 30(3), 387–409.
[xxi] Kotchen and Moon (2012). Source: Kotchen, M., & Moon, J. J. (2012). Corporate Social Responsibility for Irresponsibility. The BE Journal of Economic Analysis & Policy, 12(1), 1–23.
[xxii] Kang et al. (2016). Source: Kang, C., Germann, F., & Grewal, R. (2016). Washing away your sins? Corporate social responsibility, corporate social irresponsibility, and firm performance. Journal of Marketing, 80(2), 59–79.
[xxiii] Merritt et al. (2010), Sachdeva et al. (2009), and Zhong et al. (2009). Sources: Merritt, A. C., Effron, D. A., & Monin, B. (2010). Moral self‐licensing: When being good frees us to be bad. Social and personality psychology compass, 4(5), 344–357; Sachdeva, S., Iliev, R., & Medin, D. L. (2009). Sinning saints and saintly sinners the paradox of moral self-regulation. Psychological science, 20(4), 523–528; Zhong, C. B., Liljenquist, K. A., & Cain, D. M. (2009). Moral self-regulation. Psychological perspectives on ethical behavior and decision making, 75–89.
[xxiv] Mazar and Zhong (2010). Curiously, the authors observed that people acted more altruistically after being exposed to green products than after being exposed to conventional products (probably because of the activation of social responsibility and ethical conduct norms associated with these products). So, mere exposure to green products and the actual purchase of such products seem to lead to significantly different behavioral consequences. Source: Mazar, N., & Zhong, C. B. (2010). Do green products make us better people? Psychological Science.
[xxv] Mazar and Zhong (2010, 494).