Why do we overestimate our ethical behavior? Limited rationality and cognitive biases

Behavioral ethics series #3: creating successful companies attuned to the values and challenges of the 21st century

Alexandre Di Miceli
17 min read · Jun 25, 2020


In the previous text of this series, I described the first key finding of behavioral ethics: we overestimate our ethical behavior.

In this text, I am going to address the first reason why we often do not realize the gap between how ethical we would like to be and how ethical we actually have been.

When we become aware of a new case of corporate malfeasance, there is a strong tendency to assume that the individuals involved carefully calculated the benefits and costs of their decisions before acting. In other words, we tend to assume that these wrongdoings are the work of malicious people lacking character who deliberately opted for dishonesty.

Our tendency to think that dishonest people always act in a cold and rational way, however, is not supported by experiments in the field of behavioral ethics. On the contrary. As we shall see, research shows that most wrongdoing is committed by ordinary individuals who, instead of being guided by deliberative reasoning, simply get carried away by circumstances without properly considering the consequences of their actions.

This is also the conclusion of Eugene Soltes in his book Why They Do It. After interviewing dozens of convicted senior managers, he concluded that “Many [executives] were not mindfully weighing the expected benefits against the expected costs … I didn’t see this. Instead, I found that they expended surprisingly little effort deliberating the consequences of their actions. They seem to have reached their decisions to commit crimes with little thought or reflection. In many cases, it was difficult to say that they had ever really ‘decided’ to commit a crime at all.” [i]

This is also the finding of Joris Luyendijk, a trained anthropologist who interviewed around 200 London-based bankers in the aftermath of the 2008 global financial crisis. He concluded that “Yes, there is a lot of greed in the City, as there is elsewhere. But if you blame all the scandals on individuals you imply that the system itself is fine and all we need to do is to smoke out the crooks … I am convinced that were we to pack off all the employees in the City to a desert island and replace them with a quarter of a million people, we would see in no time the same kind of abuse and dysfunctionality. The problem is the system.”[ii]

To accept the argument that most unethical acts are committed by people who did not deeply ponder the pros and cons of their actions from a utilitarian perspective, it is first necessary to understand how people actually make decisions.

Until the end of the twentieth century, the prevailing idea was that our judgments are the outcome of an elaborate deliberative system. According to this view, we would be fully rational most of the time, except in moments when we were under the strong influence of emotions.[iii] This view, however, has proven wrong and outdated.

The modern view of our decision-making process, based on a large body of research in applied psychology starting in the 1970s and now widely accepted in the scientific community, has demystified the idea of full rationality.

The research has shown that our reasoning is rather limited and that our decisions are strongly influenced by our (often distorted) perception of reality. As a result, our decisions are, to a large extent, predictable but not rational.

Systems 1 and 2

According to Daniel Kahneman, who in 2002 became the first psychologist to win the Nobel Prize in Economics,[iv] our mind can be understood through a metaphor that divides it into two components, the so-called Systems 1 and 2.

System 1 operates quickly, automatically, and intuitively, virtually without requiring any effort or energy expenditure. If you hear the sentence “coffee with …,” for example, you will think of “milk” immediately, before the sentence even ends. System 1 makes the vast majority of our day-to-day decisions, such as completing that sentence or driving a car on an empty street. Most of the time, therefore, we simply operate on “autopilot.”

System 2, in turn, is much slower, logical, and deliberative, requiring more concentration and substantial energy expenditure from our body. If I show you the calculation “24 x 17,” you will have to focus in order to find the result. During this time, research shows, your pupils will dilate, your heart rate will accelerate, and your blood pressure will increase, among other physiological changes. System 2, therefore, makes the decisions that require more mental effort, such as solving this calculation or parking the car in a tight spot.
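For the record, the step-by-step work System 2 has to perform here can be written out explicitly (the decomposition below is just one possible route to the answer):

$$24 \times 17 = 24 \times (10 + 7) = (24 \times 10) + (24 \times 7) = 240 + 168 = 408$$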

The rider and the elephant

A similar and more playful way to understand how we process information is via the metaphor of the “rider and the elephant.” According to psychologist Jonathan Haidt, who conceived this idea, our mind is divided into two cognitive processes: intuition and reasoning.[v]

The “rider” represents our processes based on reason, i.e., our explicit, conscious, and deliberative reasoning. Because it depends on language, the rider is much more recent in evolutionary terms, being located in the prefrontal cortex region of our brain.

The “elephant,” in turn, represents our intuition. It is our gut feeling that reacts almost instantaneously to anything we are exposed to, such as ideas, people, or situations, and it may or may not include emotions.[vi] The elephant is independent of language and much older in evolutionary terms, being located in the limbic system of our brain.

The figure below illustrates the rider and the elephant metaphor.

Source: Adapted from Haidt, J. (2006). The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom. Basic Books.

According to Haidt, several experiments show that the elephant dominates the rider, determines our habits, and makes the vast majority of our decisions. This conclusion endorses eighteenth-century philosopher David Hume’s view that “reason is the slave of the passions.”[vii]

We have a strong tendency, therefore, to immediately judge whatever is presented to us based on our intuition. Frequently, it is only after this judgment that we seek arguments and explanations to justify the decisions we have already made. Thus, even when we convince ourselves that we are applying reflective reasoning to find the truth, we are often just searching for evidence to support our immediate intuitive judgment.

From an evolutionary perspective, this is the main role of the rider: to act as a sort of “internal lawyer” or “public relations officer” specialized in creating explanations for the decisions we make viscerally.

Metaphors of our decision-making process and ethical behavior

There is a direct relationship between these metaphors for the inner workings of our mind and our ethical behavior. Contrary to the traditional view, which holds that people who act dishonestly always do so after rationally weighing the pros and cons of their decisions, the reality is that many unethical behaviors are the outcome of automatic, thoughtless decisions.

It is always important to highlight that ethics is, to a large extent, the result of awareness, reflection, and consideration of the impact on other people. Thus, the more we get carried away by the impulses of the “elephant” (or “System 1”), the less likely we are to analyze a decision from an ethical perspective and make it in accordance with our values. The more impulsive our behavior, the less reflective it is and, consequently, the worse our ethical conduct will tend to be.

This does not mean, however, that our visceral reactions should be neglected and that we should only seek to act “rationally.” Intuition plays a key role in the quality of all our decisions, including from the ethical standpoint. In many situations, for example, we feel there is something wrong about going ahead with a particular course of action, even when it is legal and apparently defensible from a technical perspective.[viii] Neuroscientist Antonio Damasio, one of the main authorities on the relation between brain and body, states: “Feelings do not only reveal the dark side of reason: they help us make decisions that are good for us.”[ix]

Every good decision, including those with ethical implications, should be based on the agreement of our Systems 1 and 2 or, alternatively, of the rider and the elephant. It is essential, therefore, to bring the two modes of thinking, i.e., reason and intuition, into dialogue and reach a consensus before deciding.[x]

Heuristics and cognitive biases

Because the rider (or System 2) requires a lot of energy from our bodies, we rely as much as possible on the elephant (or System 1). To make quick and almost effortless decisions, our intuitive system uses so-called “heuristics.” These are “rules of thumb” or “cognitive shortcuts” that we have unconsciously developed to make quick decisions in the complex environment in which we live.

Heuristics are useful on most occasions. However, because they are affected by our emotional motivations and our limited ability to process information, these rules of thumb can lead us to make systematically wrong judgments in predictable ways.[xi]

Although useful for saving time and mental energy, heuristics can generate so-called “cognitive biases.” Biases are predictable and systematic errors arising from our personal beliefs and preferences. They result from our distorted framing of reality and can lead us to make irrational decisions.
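A toy programming analogy may help here (this is an illustration of the logic, not an example from the behavioral ethics literature): a “round to the nearest ten” shortcut for mental multiplication is fast and usually close, but when it errs, it errs deterministically, always producing the same error for the same inputs. That is what “systematically wrong in predictable ways” means.

```python
def exact(a: int, b: int) -> int:
    """The 'System 2' route: slow, effortful, but correct."""
    return a * b

def nearest_ten(x: int) -> int:
    """Round a number to the nearest multiple of ten."""
    return round(x / 10) * 10

def rule_of_thumb(a: int, b: int) -> int:
    """A 'System 1' analogy: round both factors to the nearest ten
    and multiply. Fast and usually close, but its errors are not
    random noise: the same inputs always produce the same error."""
    return nearest_ten(a) * nearest_ten(b)

for a, b in [(24, 17), (26, 13), (44, 37)]:
    e, h = exact(a, b), rule_of_thumb(a, b)
    print(f"{a} x {b}: exact={e}, shortcut={h}, error={h - e}")
```

Like the shortcut above, our mental heuristics trade accuracy for speed, and the resulting mistakes follow a pattern rather than scattering randomly.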

We are subject to numerous biases, at both the individual and the collective level. Over the past decades, more than 170 cognitive biases that may impair our judgment have been cataloged.[xii]

Cognitive biases can lead us to make wrong decisions not only from a technical point of view but also from the ethical standpoint. The following are examples of some of the most common cognitive biases, including their ethical implications.

Self-serving bias

We have a strong propensity to analyze a situation, and to remember it, from a perspective that unduly favors us. The self-serving bias protects and enhances our self-esteem: it causes our minds to unconsciously process and absorb information that is advantageous to us and, at the same time, to ignore or even erase from memory information that is not.

One consequence of this bias is that our judgments tend to be inevitably affected by our self-interest and personal preferences, particularly when the decision at stake is important and there is high uncertainty about the facts. Another outcome of the self-serving bias is that we tend to ascribe successes to our talent and efforts and impute failures to the circumstances or other people.

This bias thus inflates our perception of our own competence and makes us less receptive to criticism.[xiii] There is also a serious ethical implication at stake, because our tendency to judge unfairly who deserves credit or blame for a decision may inadvertently harm the lives of others.

Egocentric bias

We tend to have an inflated perception of our personal contribution to any collective endeavor. Consider, for example, the following question: what percentage of your work team’s good ideas did you generate yourself? Research conducted after the completion of projects shows that, when members are individually asked about their personal contributions, the sum of the percentages in a group can easily exceed 100%.[xiv]

The same applies to personal life. In a classic study, married couples were asked to separately assess their individual roles in several household chores, such as preparing breakfast, cleaning, and shopping. In this case, the self-allocated responsibilities summed to around 150%.[xv]

Thus, because we focus too much on our own efforts and too little on those of others, our perception of our personal contribution to group efforts is often unrealistic. The egocentric bias can even lead us to mentally reconstruct the past in order to inflate the role we played in a certain circumstance.[xvi] This bias, therefore, also has a relevant ethical implication, because it can lead us to unfairly allocate credit for collective initiatives.
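To make the arithmetic of this bias concrete, here is a minimal sketch; the names and percentages below are invented for illustration and are not data from the studies cited above:

```python
# Hypothetical answers to "what share of the team's good ideas
# came from you?" -- invented for illustration, not real data.
self_reported_share = {
    "Ana": 40,    # percent
    "Bruno": 35,
    "Carla": 30,
    "Davi": 25,
}

total = sum(self_reported_share.values())
print(f"Sum of self-reported contributions: {total}%")
# Prints 130% -- logically impossible, since the shares of a single
# pool of ideas can add up to at most 100%.
```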

Confirmation bias

We tend to assign more importance to information that confirms our beliefs or initial opinion on a certain subject. Research shows, for instance, that we spend twice as much time actively seeking information that supports our point of view as we spend looking for facts that contradict it.[xvii]

Peter Drucker, a leading management guru of the twentieth century, noticed this bias after decades as a consultant for large organizations. In his words, “People simply do what everyone is far too prone to do anyhow: look for the facts that fit the conclusion they have already reached. And no one has ever failed to find the facts he is looking for.”[xviii]

Individuals with better education or higher intellectual capacity are not immune to confirmation bias. On the contrary. There is evidence that people with a high IQ (intelligence quotient) are more likely to create justifications that confirm their initial opinions, thus reinforcing the problem.

Confirmation bias shows, therefore, that we are very good at challenging other people’s points of view but not our own. This has a serious ethical implication: by being prone to unfairly criticize other people’s potentially valuable ideas simply because they run contrary to our beliefs, we may inadvertently end up demotivating or harming them.

Optimism or overconfidence bias

Optimism and overconfidence are two of the most documented biases in the literature. They are interrelated and are present, to a greater or lesser extent, in almost everyone. A curious survey conducted with 1,000 respondents in the United States demonstrates the extent of this bias. In this poll, 87% of survey takers claimed to believe they would go to heaven, while only 79% said they believed that Mother Teresa of Calcutta would also make it.[xix]

A high degree of optimism and overconfidence can lead us to overestimate the prospects of our initiatives and underestimate their risks. These biases generate a kind of “illusion of control” over the outcomes of our activities, even when we have no control over the various factors affecting them.

The overconfidence bias also leads us to overestimate the power of our own good intentions. In an experiment on sexual harassment, for instance, a group of women was asked how they would respond to inappropriate job interview questions posed by a male interviewer (such as whether they had a boyfriend). Although 70% predicted that they would refuse to answer the sexist questions or terminate the interview, none of them did so when actually placed in a realistic job interview.[xx]

In-group favoritism bias

Our primitive “elephant” always prefers what is perceived as familiar and secure. As a result, we have a strong tendency to support and favor opinions from people who belong to our group over views from those perceived as outside our circle. It is important to note that the idea of “group” goes beyond our relatives or work team, encompassing those who share our religion, race, or gender.

Research has confirmed our preference for people with whom we share any kind of similarity. One study, for example, revealed that individuals cooperated more in a prisoner’s dilemma game with people they believed shared their birthday.[xxi] Another paper found that we are even disproportionately likely to marry others whose first or last names resemble our own.[xxii]

Our preference for what is familiar manifests itself even in our consumption and career decisions. In the first case, surveys show that people whose names start with the letter C (like Carol) are more likely to opt for Coca-Cola, while those whose names start with the letter P (like Peter) tend to choose Pepsi-Cola.[xxiii] Research conducted in the United States also shows that people whose names begin with the letter D have a greater likelihood of becoming dentists, while those with names starting with G have a greater chance of moving to states beginning with the same letter, such as Georgia.[xxiv]

The main ethical implication of this implicit favoritism toward in-group members is that, by giving preference to those perceived as belonging to our group, we can inadvertently discriminate against and harm people from social out-groups, especially those belonging to minorities.[xxv] In fact, the problems of prejudice and racism may be related more to an unconscious affection for the people belonging to our group than to a conscious negative attitude toward people outside our circle.

To sum up:

§ Research on behavioral ethics shows that most wrongdoing is committed by ordinary individuals who, instead of being guided by deliberative reasoning, simply get carried away by circumstances without properly considering the consequences of their actions.

§ To make sense of this finding, it is necessary to understand how we actually make decisions. Instead of being fully rational, our reasoning is rather limited, and our decisions are strongly influenced by our often-distorted perception of reality.

§ Two popular metaphors help us understand the way our mind works: Systems 1 and 2 by Daniel Kahneman and Amos Tversky, and the Rider and the Elephant by Jonathan Haidt.

§ Good ethical decisions should be based on the agreement of our Systems 1 and 2 or, alternatively, of the rider and the elephant. It is critical to bring the two modes of thinking (reason and intuition) into dialogue and reach a consensus before deciding.

§ To make quick decisions in the complex environment in which we live, we resort to “heuristics” or “cognitive shortcuts”. Although often useful, these rules of thumb can generate so-called “cognitive biases” that lead us to make systematically wrong judgments, including from the ethical standpoint, in predictable ways.

Behavioral ethics series:

# 1: What is behavioral ethics and why this subject is important for all business leaders.

# 2: The first key finding of behavioral ethics: We overestimate our ethical behavior.

Prof. Dr. Alexandre Di Miceli is a professional speaker, business thinker, and founder of Virtuous Company, a top management consultancy that provides cutting-edge knowledge on corporate governance, ethical culture, leadership, diversity, and company purpose.

He is the author of “The Virtuous Barrel: How to Transform Corporate Scandals into Good Businesses” as well as of the best-selling books on corporate governance and business ethics in Brazil, including “Corporate Governance in Brazil and in the World”, “Behavioral Business Ethics: Solutions for Management in the 21st Century”, and “Corporate Governance: The Essentials for Leaders”.

He thanks Prof. Dr. Angela Donaggio for her valuable comments and suggestions.

[i] Soltes (2016: 6). Soltes, E. (2016). Why They Do It: Inside the Mind of the White-Collar Criminal. PublicAffairs.

[ii] Luyendijk (2015: 253–254). Luyendijk, J. (2015). Swimming with Sharks: My Journey into the World of the Bankers. Guardian Faber Publishing.

[iii] The concept of “rationality” means that we are able to process all the information available to solve a problem (even if very complex) in an objective, complete, fact-based and unbiased way — without the influence of the environment or other people. It also means that at the end of this process, we will always be able to consistently choose the option that maximizes our personal utility. The concept of emotions, in turn, refers to our primary feelings: anger, fear, disgust, joy, sadness, contempt, surprise and disappointment.

[iv] Kahneman carried out his pioneering studies on behavioral economics during the 1970s with his colleague Amos Tversky. With Tversky’s death in 1996, Kahneman ended up receiving the Nobel Prize in Economics alone. Among his seminal works are Tversky and Kahneman (1973) and Kahneman and Tversky (1979). In 2011, Kahneman summarized his forty years of research in the international bestseller “Thinking, Fast and Slow”. Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–291.

[v] Haidt (2006) and Haidt (2012). Haidt, J. (2006). The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom. Basic Books. Haidt, J. (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Vintage.

[vi] It is here that the intuitionist approach of psychologists such as Haidt differs from the classic paradigm of reason vs. emotion. For them, although all emotions express themselves automatically, there are many intuitive and automatic reactions in our mind that do not manifest themselves as emotions.

[vii] Full quote: “Reason is and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them”. David Hume (1711–1776).

[viii] In this sense, the modern view in medicine is that the concept of an “imperial brain” commanding our whole body is wrong. In fact, our body is also filled with neurons (there are about 100 million in the intestines alone), and there is constant bidirectional communication between brain and body, including for decision-making. Source: Mente Cérebro magazine, 286: 21–22. “How the brain helps heal the body”.

[ix] Interview in Mente Cérebro magazine, n. 286: 47. “Feelings are made of emotions”.

[x] The need to unite the cognitive and emotional aspects of our decisions is not new to Chinese philosophy. The word Xin, for example, simultaneously serves to denote the concepts of heart and mind. Source: http://languagelog.ldc.upenn.edu/nll/?p=14807

[xi] The busier and more hurried we are, and the more experience we have in a certain activity, the greater the likelihood that we will rely on the intuitive System 1 for our decisions.

[xii] The link http://en.wikipedia.org/wiki/List_of_cognitive_biases provides a list of more than 170 cognitive biases.

[xiii] Curiously, our judgment tends to reverse when we evaluate others: in this case, we tend to ascribe other people’s success to the circumstances and their failures to their own personal deficiencies.

[xiv] Caruso et al. (2006). Caruso, E., Epley, N., & Bazerman, M. H. (2006). The costs and benefits of undoing egocentric responsibility assessments in groups. Journal of Personality and Social Psychology, 91(5), 857.

[xv] Ross and Sicoly (1979). Ross, M., & Sicoly, F. (1979). Egocentric biases in availability and attribution. Journal of Personality and Social Psychology, 37(3), 322–336.

[xvi] Although bearing some resemblance to self-serving bias, egocentric bias is different because people also tend to consider themselves overly responsible for the negative outcomes of the group to which they belong. This bias shows how we attach greater weight to what we do in relation to other people belonging to our group under all circumstances, including the negative ones.

[xvii] Hart et al. (2009). Some papers refer to confirmation bias by the term congeniality bias. Hart, W., Albarracín, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: a meta-analysis of selective exposure to information. Psychological Bulletin, 135(4), 555–588.

[xviii] Source: Drucker, P. (2008). The Essential Drucker: The Best of Sixty Years of Peter Drucker’s Essential Writings on Management. HarperBusiness. Chapter 17.

[xix] 1997 US News and World Report study. Available at http://www.nytimes.com/2013/01/26/your-money/tips-for-making-decisions-and-sticking-to-them.html. This survey is also cited by Gino (2013) in her book “Sidetracked: Why Our Decisions Get Derailed and How We Can Stick to the Plan”.

[xx] Woodzicka and LaFrance (2001). Woodzicka, J. A., & LaFrance, M. (2001). Real versus imagined gender harassment. Journal of Social Issues, 57(1), 15–30.

[xxi] Miller et al. (1998). Miller, D. T., Downs, J. S., & Prentice, D. A. (1998). Minimal conditions for the creation of a unit relationship: The social bond between birthdaymates. European Journal of Social Psychology, 28(3), 475–481.

[xxii] Jones et al. (2004). Haidt (2006) describes other interesting examples related to this issue in his book. Jones, J. T., Pelham, B. W., Carvallo, M., & Mirenberg, M. C. (2004). How do I love thee? Let me count the Js: implicit egotism and interpersonal attraction. Journal of Personality and Social Psychology, 87(5), 665. Haidt, J. (2006). The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom. Basic Books.

[xxiii] Brendl et al. (2005). Brendl, C. M., Chattopadhyay, A., Pelham, B. W., & Carvallo, M. (2005). Name letter branding: Valence transfers when product specific needs are active. Journal of Consumer Research, 32(3), 405–415.

[xxiv] Pelham et al. (2002). Pelham, B. W., Mirenberg, M. C., & Jones, J. T. (2002). Why Susie sells seashells by the seashore: implicit egotism and major life decisions. Journal of Personality and Social Psychology, 82(4), 469.

[xxv] The so-called Implicit Association Test, developed by the non-profit organization Project Implicit®, shows how most of us are full of prejudices, especially regarding people who are not part of our social group. More information at https://implicit.harvard.edu/implicit/
