Why are most of the wrong things done by good people? The pressures of our daily work life
Behavioral ethics series #5: creating successful companies attuned to the values and challenges of the 21st century
In the previous texts of this series, I described how our cognitive biases and the stages and dynamics of our decision-making process often lead us to overestimate our ethical behavior.
In this text, I’ll start discussing the second major conclusion of behavioral ethics: that most of the wrong things are done by ordinary people. That is, by individuals with no malicious intent nor any relevant personality disorder such as psychopathic traits.
It is difficult for us to accept the argument that most of us could end up behaving unethically given the right circumstances. Most of us have been educated to think that bad things are done by “bad people” while “good people” do virtuous things; few of us have been taught that people of “good character” can also do bad things.
Good people can do bad things because they gradually become “ethically blind,” in the sense that they fail to fully understand the consequences of their behaviors at the time of their actions. As a result, they simply come to think that what they are doing is right or at least justifiable.
According to professors Guido Palazzo, Franciska Krings, and Ulrich Hoffrage, who first coined this term, ethical blindness can be defined as the temporary inability of a decision maker to see the ethical dimension of a decision at stake. According to them, this phenomenon can be understood along three aspects.[i]
The first is that people deviate from the values and principles they have tried to live up to in the past. The second is that ethical blindness is a temporary and context-bound state. This means that, depending on the circumstances, individuals are not able to use their moral reasoning when deciding. However, when the situation changes, they are likely to practice their original values and principles once again. The third is that ethical blindness is unconscious, in the sense that individuals are either unaware that they are deviating from their values or are unable to access those values when deciding.
People who become ethically blind only gain a clearer picture of the implications of their actions afterward and would not want to repeat them if they had had another perspective on reality at the time.[ii] It is no coincidence that many people involved in corporate scandals are shocked by their own behavior when the context changes.[iii]
Ethical blindness is the consequence of a process called “ethical fading.” This process represents the gradual loss of ethical discomfort by individuals with good values. In the words of Ann Tenbrunsel and David Messick, who pioneered this concept, ethical fading is the process “by which the moral colors of an ethical decision fade into bleached hues that are void of moral implications.”[iv]
Ethical fading generally develops as follows: We initially feel that something is wrong with certain practices we see or are compelled to do. In this situation, we feel a strong tension between the implications of these behaviors and our personal values.
Over time, however, we tend to become less and less sensitive to these practices (our stress becomes less intense and moral concerns begin to disappear). The process continues, in small steps, until we completely lose sight of the ethical dimension of our actions or omissions.
After a certain point, we stop questioning ourselves. Transgressions that initially generated internal dilemmas come to be seen as normal or even defensible. At this moment, we have become ethically blind.[v]
According to robust scientific evidence on behavioral ethics, there are three main layers of pressures in our social context that can lead us to become ethically blind:
§ The immediate context (our daily work life);
§ The organizational context (the culture and practices of the organization we work for); and,
§ The institutional context (the environment in which our organization is embedded).
In this article, I am going to focus on the first layer. In the next ones, I will write about the other two.
The first tier of pressures is the immediate context in which we are inserted. Three main aspects related to the circumstances of our daily lives can affect our conduct:
§ The pressure of authority;
§ Peer pressure; and,
§ The self-imposed pressure due to our professional role.
The pressure of authority
The pressure coming from hierarchical superiors is probably the most powerful force that can harm our ethical judgment. From childhood, we are indoctrinated to obey and please figures representing authority such as parents and teachers. Years later, this obedience is reinforced in organizations. This tendency to obey authority figures comes not only from fear but also from the will to please and to show that we are competent.
According to social learning theory, we discover how to behave in a certain situation by noticing others’ behaviors and attitudes and their outcomes.[vi] The individuals being observed are called models. We are particularly likely to follow influential models who symbolize the norms of conduct of the groups to which we belong.
In companies, senior leaders such as C-level executives and board members embody the behavior to be noticed and imitated, as they represent the organization’s standards: what is in fact important and accepted in this environment.
Numerous experiments demonstrate how pressure from authority figures may lead us to follow orders in a thoughtless way, often against our principles. In two different studies, for example, employees ended up discriminating against black people and foreigners in a personnel-selection process after being told by superiors to behave this way.[vii]
In another study, twenty-two night nurses received a call from a fake physician who described himself as “Dr. Smith.”[viii] He asked them to check whether they had a drug named “astroten.” When the nurses checked the stock, they could see that the maximum dosage was 10mg.
After reporting to the “doctor” that the drug was available, they were told to administer 20mg of the drug to a patient (“Dr. Smith” justified the request by saying he was in a hurry and would sign the authorization form when he came to see the patient later on). The medication was not real, though the nurses thought it was.
Incredibly, 21 out of the 22 nurses tested (95%) were easily influenced into executing the order. Curiously, when other nurses from the same hospital were asked to discuss what they would do in a similar situation, all but one said they would not comply with the order under any circumstances.
The most famous series of experiments on the power of authority were conducted by psychologist Stanley Milgram in the 1960s.[ix] Milgram’s work produced dramatic results by showing that about two-thirds of ordinary people would apply a lethal shock of 450 volts to another individual just because a scientist asked them to do so.
Milgram’s studies were replicated with more than 1,000 people in various contexts and countries, yielding qualitatively similar results. Men and women exhibited statistically similar rates of obedience, with women usually showing higher levels of stress during the experiment.
After interviewing the participants, Milgram concluded that we have a nature intrinsically linked to loyalty that leads us to obey authority in most circumstances. For him, we go through a so-called “agentic shift” in which we forgo our rationality and values to satisfy the expectations of authorities. In these circumstances, we suspend our autonomy and simply come to see ourselves as instruments of a greater authority.
Milgram experiments: Dramatic evidence of the power of authority over our ethical judgment.
The overall result of the many experiments on the influence of authority is that most people would go against their own values and principles if they were driven by someone seen as a legitimate leader to behave in a questionable way.[x] The vast majority of people, therefore, tend to obey authorities under certain circumstances, including in the performance of shameful acts.
Authority’s power to melt away people’s morality tends to be even more relevant in highly hierarchical structures, which are typical of the corporate world. Take, for example, the case of Petrobras, the Brazilian oil company involved in a huge corruption scandal.
One of the key players in the wrongdoings at Petrobras was Paulo Roberto Costa, its senior executive in charge of supply. After being promoted to the C-suite, Mr. Costa replaced all managers immediately reporting to him. In the first meeting with his team, he reminded everyone that “the music has changed, and whoever does not dance according to the music will be out.” According to a former manager who reported to him, the message was clear: “Whoever did not obey would not have a position in his department.”[xi]
A second example comes from Banco Panamericano, a Brazilian bank listed on the stock exchange that announced a massive fraud of around $2.5 billion (half of its total assets) at the end of 2010. In an interview with a local newspaper, the former chairman of its board of directors, Luiz Sandoval, was asked how he learned about the fraud. He said that the CEO and the CFO (later convicted as the main architects of the malfeasance) had told him at a meeting that the bank had suffered massive losses from “errors of accounting parameterizations.”
Because the chairman did not know what this alleged practice was, he called the bank’s leading accountant. Shortly after arriving, the accountant started telling him about the fraud. Sandoval then asked, “Do you know this is illegal? It is illicit?” To which the accountant replied: “Yes, sir. But I did it because I got orders from the CFO.”[xii]
These examples corroborate the conclusions of sociologist Robert Jackall in his superb book Moral Mazes after interviewing dozens of managers at large US corporations. He observed that one of the golden rules to survive in this environment is that subordinates must never contradict their boss’s judgment in public. Violation of this admonition “is thought to constitute a kind of death wish in business, and one who does so should practice what one executive calls ‘flexibility drills,’ an exercise ‘where you put your head between your legs and kiss your ass goodbye.’”[xiii]
At the end of his analysis, Jackall summarized the fundamental rules of corporate life in five warnings, all related to the dangers of authority pressure: “(1) You never go around your boss; (2) You tell your boss what he wants to hear, even when he claims he wants dissenting views; (3) If your boss wants something dropped, you drop it; (4) You are sensitive to your boss’s wishes, so you anticipate what he wants; (5) Your job is not to report something your boss does not want reported, but rather to cover it up. You do what your job requires, and you keep your mouth shut.”[xiv]
Peer pressure
Peer pressure from co-workers or other team members is the second element of the immediate context that may increase the risk of ethical blindness. Throughout our evolution, we developed an innate tendency to go along with the group and often to behave as part of a herd.[xv]
This claim comes from the “multilevel selection theory,” which argues that our genes came down to us not only because some individuals outcompeted their neighbors but also because some groups outcompeted neighboring groups.[xvi]
Contemporary evidence of this idea in the business world comes from a statement of a treasury director at a global financial institution, who said that “Bankers work in teams, and the ethics is: you are with us or you are against us … This is about tribal bonding, about belonging and sticking to your mates.”[xvii]
This propensity to display team spirit is positive as it fosters social cohesion and the adherence to the formal and informal norms of the organization. At the same time, though, group pressure can lead us to switch off our moral judgments and end up behaving in ways we would normally not agree with.[xviii]
This argument was corroborated by in-depth research on employees working for corrupt organizations, which concluded that “team spirit” and “security” were the two main values they shared with peers.[xix] As the authors point out, “Employees need to stick together, which increases group coercion. Through this coercion, employees face higher pressure to follow group norms.”
In another related survey, this time focusing on the finance industry, the author argues that “to an important degree, the financial world is not populated by people willfully doing evil but by conformists who have simply stopped asking questions about right and wrong. Things are going rather well for them and, in their bubble, they are exposed to likeminded people.”[xx]
Unethical behaviors, therefore, often result from our readiness to follow conventions for the sake of avoiding problems, even at the expense of our own values.
The classic experiments carried out by Solomon Asch in the 1950s demonstrated the huge power of peers over our judgment.[xxi] In these studies, a volunteer was asked to answer a simple test of visual accuracy, as illustrated in the figure below. Volunteers were required to look at the left line and indicate which of the three lines on the right was equal to it in length (as you can easily see, the right answer in this example is letter C).
Asch Experiments: evidence of the power of peers over our judgment.
To assess the extent to which participants were influenced by the group’s opinion, each volunteer was surrounded by several other “volunteers” (actually, all actors). In some cases, the confederates intentionally gave clearly wrong answers. In the figure above, for example, sometimes all erroneously stated that the correct answer would be letter A.
Asch conducted the experiment with hundreds of people. He found that about 75% of individuals gave at least one wrong answer over the 12 rounds, while more than half of the volunteers gave incorrect answers in more than 50% of the rounds. Impressively, 5% gave wrong answers in every round of the experiment.
The researcher interviewed the participants afterward, asking why they had given clearly wrong answers. Two main reasons emerged. Some claimed to have gone along with the group because they no longer believed in their own perception (“if everyone sees it, then it must be true”).
Others, in turn, said they were aware they were giving the wrong answer. However, they preferred erring along with the others and fitting in to giving the right answers and standing apart from the group. Regardless of the quality of their responses, all participants also reported feeling strong pressure to decide in line with their peers.[xxii]
Our propensity toward conformity tends to be even stronger when we are surrounded by work colleagues or friends, as well as when we are asked to decide on more subjective and ambiguous issues (as is the case with many business decisions with ethical implications).
In the corporate world, where people relate to each other on a daily basis and issues are much more complex than in the Asch experiment, the pressure for everyone to show cohesion with their peers is obviously much greater.[xxiii]
In a survey with over 2,000 corporate executives in the United States, for example, nearly half reported working for organizations where they regularly feel the need to conform, while more than half said that people in their organizations do not question the status quo.[xxiv]
Self-imposed pressure due to our professional role
The third factor of the immediate context that can undermine our ethical judgments is the least observed one: the pressure we impose on ourselves due to the expectations of our position.
When we adopt different ethical conduct depending on the professional roles we undertake, we engage in what is known as “role morality.”[xxv]
This mindset is potentially dangerous because it may lead to a feeling that our professional position grants us a kind of “special permission” to behave unethically, even if such behavior would be reprehensible in other circumstances. As a result, we may end up applying ethical standards in our professional lives that diverge from those applied in our personal lives.
In the business world, role morality may lead executives to act in a way they would consider unethical if acting solely for their personal benefit. However, because they are acting on behalf of their organizations, they come to see the same behaviors as acceptable. According to research, people working in large corporations tend to have an even greater propensity to differentiate their personal values from those prevailing in their workplace.[xxvi]
Robert Jackall also observed this phenomenon in his book about life in big corporations. He concluded that “bureaucratic work causes people to bracket, while at work, the moralities that they might hold outside the workplace or that they might adhere to privately and to follow instead the prevailing morality of their particular organizational situation.”[xxvii]
He also quotes a former vice president of a large company who declared that “What is right in the corporation is not what is right in a man’s home or in his church. What is right in the corporation is what the guy above you wants from you. That’s what morality is in the corporation.”[xxviii]
One of the most impressive experiments on how people tend to drastically change their behavior due to the social role assigned to them was carried out by psychologist Philip Zimbardo in the early 1970s.[xxix] In his classic study, volunteers began to display cruel behaviors toward other participants just because they were required to play the role of prison guards in a role-playing exercise simulating prison life.[xxx]
The Stanford Prison experiment: A dramatic example of how our social role can influence our behavior
More recently, another experiment published in the prestigious journal Nature corroborates the power of role morality by showing how much the professional self-image can influence individual behavior.[xxxi]
In this research, carried out with 128 employees of a large international financial institution with an average of around 12 years of experience in the industry, executives acted much more dishonestly merely because their professional image as “bankers” was reinforced before the study.
For half of the executives (the so-called “control group”), researchers asked general questions about their personal lives, such as how many children they had and what were their main hobbies. The purpose of these questions was to highlight their personal identity as ordinary citizens prior to the experiment.
For the other half (the “experimental group”), researchers asked questions related to their professional lives, such as how much time they worked every day, what were their positions, and their main daily activities at the bank. In this case, the goal was to emphasize their professional identities as “bankers.”
Participants were then taken to an isolated room where they were asked to flip a coin 10 times and anonymously report the results on a computer. They knew that they would receive $20 for each tail reported.
Statistically, the mean number of tails reported for each group should fluctuate around 50%. This is exactly what happened with the “control group.” People who were reminded of their personal identity reported on average 51.6% of successful coin flips, a figure statistically similar to the expected 50% of a binomial distribution. Thus, bank employees were honest under normal circumstances.
The most interesting thing occurred with the individuals reminded of their professional identity before the test. In the “experimental group,” the percentage of successful coin flips rose to around 60%, far above what chance alone would produce. Thus, individuals began to behave dishonestly simply because they were reminded right before the test that they were executives from the financial industry.
In the experimental group, researchers also noted that 8% of participants claimed to have obtained 10 “tails” in a row in order to pocket the maximum $200 in earnings. This virtually impossible statistical outcome was not reported by anyone in the control group. After having their professional identities highlighted right before the test, therefore, some executives set aside any ethical concerns and began to behave like true homo economicus.[xxxii]
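To see why these outcomes are statistically implausible, the figures cited above can be checked with a short calculation and simulation. This is a minimal sketch, not the authors’ analysis: the group size of 64 per condition is inferred from the 128 participants split in half, and the function name is illustrative.

```python
import random

random.seed(42)

# Probability that one honest participant flips 10 tails in a row with a fair coin.
p_ten_tails = 0.5 ** 10  # = 1/1024, roughly 0.1% per person

def honest_group_rate(n_participants=64, n_flips=10, trials=20_000):
    """Monte Carlo estimate of how often a fully honest group's overall
    tails rate reaches the ~60% reported by the experimental group."""
    total_flips = n_participants * n_flips
    hits = 0
    for _ in range(trials):
        tails = sum(random.getrandbits(1) for _ in range(total_flips))
        if tails / total_flips >= 0.60:
            hits += 1
    return hits / trials

print(f"P(10 tails in a row, fair coin): {p_ten_tails:.4%}")  # ~0.0977%
print(f"P(honest group reports >= 60% tails): {honest_group_rate()}")
```

With 640 fair flips per group, a 60% tails rate sits about five standard deviations above the expected 50%, so the simulation essentially never reaches it; likewise, with only a ~0.1% chance per person, seeing 8% of a 64-person group report ten tails in a row is wildly inconsistent with honest reporting.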
According to the authors, the overall conclusion of their research is that “the prevailing business culture in the banking industry favors dishonest behavior and thus has contributed to the loss of the industry’s reputation.” On the other hand, the positive news is that the researchers also acknowledged that “differently from the perception of the public opinion, we observe that bank executives tend to behave honestly in a control condition.”
This important experiment demonstrates how the position we hold can change our self-perception and, consequently, influence our behavior. In a book on white-collar crime, for example, many of the executives involved in illicit acts saw themselves primarily as “problem-solvers.” As a result, they simply believed that their role was to solve any kind of problem with ingenuity, even circumventing the rules if necessary.
As Robert Jones, a former executive convicted for financial fraud, pointed out: “You have a career of being able to find solutions, being successful… I worked in the industry for 20 years, and we always figured stuff out. We always did well… you have to understand, it comes from a career of trying to find solutions to things. We never grew up with someone saying you can’t do it and therefore you’re not going to hit a number.”[xxxiii]
To sum up:
§ The second major conclusion of behavioral ethics is that most of the wrong things are done by ordinary people. Good people can do bad things because they gradually become “ethically blind,” in the sense that they fail to fully understand the consequences of their behaviors at the time of their actions.
§ Ethical blindness results from a combination of three main layers of pressures of our social context: the immediate context (our daily work life); the organizational context (the culture and practices of the organization we work for); and, the institutional context (the environment in which our organization is embedded).
§ In our daily working life, three main aspects can harm our ethical judgments: the pressure of authority; peer pressure; and, the self-imposed pressure due to our professional role.
§ The pressure coming from hierarchical superiors is probably the most powerful force that can affect our ethical judgment. Numerous experiments demonstrate how pressure from authority figures may lead us to follow orders in a thoughtless way, often against our principles.
§ Peer pressure from co-workers or other team members may also increase the risk of ethical blindness. Throughout our evolution, we developed an innate tendency to go along with the group and often to behave as part of a herd. Group pressure can lead us to switch off our moral judgments and end up behaving in ways we would normally not agree with.
§ The third factor is the pressure we impose on ourselves due to the expectations of our position. When we adopt a different ethical conduct depending on the professional roles we undertake, we engage in a concept called “role morality.” In the business world, role morality may lead people to act in a way they would consider unethical if acting solely for their personal benefit. However, because they are acting on behalf of their organizations, they come to see the same behaviors as acceptable. The position we hold, therefore, can change our self-perception and, consequently, influence our behavior.
Prof. Dr. Alexandre Di Miceli is a professional speaker, business thinker and founder of Virtuous Company, a top management consultancy that provides cutting edge knowledge on corporate governance, ethical culture, leadership, diversity, and company purpose.
He is the author of “The Virtuous Barrel: How to Transform Corporate Scandals into Good Businesses” as well as of the best-selling books on corporate governance and business ethics in Brazil, including “Corporate Governance in Brazil and in the World”, “Behavioral Business Ethics: Solutions for Management in the 21st Century”, and “Corporate Governance: The Essentials for Leaders”.
He thanks Prof. Dr. Angela Donaggio for her valuable comments and suggestions.
[i] Palazzo, G., Krings, F., & Hoffrage, U. (2012). Ethical blindness. Journal of Business Ethics, 109(3), 323–338.
[ii] Authors like Bazerman and Tenbrunsel (2011) employ the term “bounded ethicality” instead of the concept of ethical blindness. According to them, bounded ethicality represents the systematic and predictable ways in which people make decisions without realizing the implications of their behavior. Chugh et al. (2005) is probably the first paper to have coined the term “bounded ethicality.” Sources: Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots: Why we fail to do what’s right and what to do about it. Princeton University Press. Chugh, D., Bazerman, M. H., & Banaji, M. R. (2005). Bounded ethicality as a psychological barrier to recognizing conflicts of interest. Conflicts of interest: Challenges and solutions in business, law, medicine, and public policy, 74–95.
[iii] This was also one of the conclusions of Soltes (2016: 304): “Many of the executives I spoke with did not perceive the harm associated with their choices at the time they were making them. It’s only with the benefit of hindsight that the detrimental effects of their conduct became clear to them.” Source: Soltes, E. (2016). Why they do it: inside the mind of the white-collar criminal. PublicAffairs.
[iv] Source: Tenbrunsel, A. E., & Messick, D. M. (2004). Ethical fading: The role of self-deception in unethical behavior. Social Justice Research, 17(2), 223–236.
[v] This argument is based on a rationale carried out by Bazerman and Tenbrunsel (2011: 69–71). Source: Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots: Why we fail to do what’s right and what to do about it. Princeton University Press.
[vi] Social learning theory was developed by Albert Bandura in the 1960s. It argues that learning is a cognitive process that occurs within a social context and through direct observation and instruction, not necessarily requiring reinforcements and rewards. In his classic studies, Bandura showed that children exposed to an aggressive model reproduced considerably more aggressive behaviors toward a Bobo doll than children not exposed to the aggressive example. For more information, see Bandura (1969). Source: Bandura, A. (1969). Social-learning theory of identificatory processes. Handbook of socialization theory and research, 213.
[vii] Brief et al. (1995) and Petersen and Dietz (2000). Sources: Brief, A. P., Buttram, R. T., Elliott, J. D., Reizenstein, R. M., & McCline, R. L. (1995). Releasing the beast: A study of compliance with orders to use race as a selection criterion. Journal of Social Issues, 51(3), 177–193. Petersen, L. E., & Dietz, J. (2000). Social discrimination in a personnel selection context: The effects of an authority’s instruction to discriminate and followers’ authoritarianism. Journal of Applied Social Psychology, 30(1), 206–220.
[viii] Hofling et al. (1966). A summary is available at McLeod (2008). Sources: Hofling, C. K., Brotzman, E., Dalrymple, S., Graves, N., & Pierce, C. M. (1966). An experimental study in nurse-physician relationships. The Journal of nervous and mental disease, 143(2), 171–180. McLeod, S. A. (2008). Hofling hospital experiment. Retrieved from www.simplypsychology.org/hofling-obedience.html
[ix] Sources: Milgram, S. (1963). Behavioral Study of obedience. The Journal of abnormal and social psychology, 67(4), 371. Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human relations, 18(1), 57–76; Milgram, S., & Gudehus, C. (1978). Obedience to authority.
[x] Milgram’s experiments have recently been replicated outside the academic world to check whether people’s behavior had changed substantially fifty years after the original experiment. The results were not encouraging. In a 2009 replication carried out by the BBC, nine of the twelve participants (75%) went on until the very end, applying the lethal shock to the volunteer (the link https://vimeo.com/89396290 shows the video of this experiment). In another 2009 replication by France Télévisions, as part of a program called Le Jeu de la Mort (“The Game of Death”), over 90% of the participants applied the supposedly lethal shock to other people, including a Jewish participant whose grandparents had been killed in a concentration camp during World War II. (The video of this program is available at https://youtu.be/6gsKGyMZ_Q4)
[xi] Paduan (2016, p. 203). Source: Paduan, Roberta. (2016). Petrobras: Uma História de Orgulho e Vergonha. Editora Objetiva.
[xii] Estado de São Paulo. 12/25/2010. O vice-presidente de finanças do banco mandou o contador maquiar o balanço. Available at http://economia.estadao.com.br/noticias/geral,o-vice-presidente-de-financas-do-banco-mandou-o-contador-maquiar-o-balanco,658184
[xiii] Jackall (1988: 20). Source: Jackall, Robert. (1988). Moral Mazes: The World of Corporate Managers. New York: Oxford University Press.
[xiv] Ibid: 110.
[xv] Acting in conformity with the group was a winning strategy in our early days, when humans lived in small numbers and survived as hunter-gatherers. The video available at https://youtu.be/BgRoiTWkBHU humorously illustrates our strong tendency to replicate the behavior of those around us.
[xvi] Kluver et al. (2014: 153, 154). As pointed out by the authors “the key implication of this work for researchers in behavioral ethics is that humans may have inherited a suite of biobehavioral adaptations and tendencies conducive to acting in the interest of groups, above and beyond what might be expected…”. Source: Kluver, J., Frazier, R., & Haidt, J. (2014). Behavioral Ethics for Homo economicus, Homo heuristicus, and Homo duplex. Organizational behavior and human decision processes, 123(2), 150–158.
[xvii] Luyendijk (2015: 236, 238). Source: Luyendijk, J. (2015). Swimming with Sharks: My Journey into the World of the Bankers (Vol. 4). Guardian Faber Publishing.
[xviii] Based on fMRI scans, Cikara et al. (2014) concluded that the part of our brain that makes moral judgements is less active when we’re in a group. Source: Cikara, M., Jenkins, A. C., Dufour, N., & Saxe, R. (2014). Reduced self-referential neural response during intergroup competition predicts competitor harm. NeuroImage, 96, 36–43.
[xix] Campbell and Göritz (2014: 305). Source: Campbell, J. L., & Göritz, A. S. (2014). Culture corrupts! A qualitative study of organizational culture in corrupt organizations. Journal of business ethics, 120(3), 291–311.
[xx] Luyendijk (2015: 220).
[xxi] Sources: Asch, S. E. (1955). Opinions and social pressure. Readings about the social animal, 193, 17–26.
Asch, S. E. (1956). Studies of independence and conformity: I. A minority of one against a unanimous majority. Psychological monographs: General and applied, 70(9), 1.
[xxii] According to Asch (1956), the factors that increased the percentage of people’s conformity with the group were: the size of the group (the more people, the greater the tendency towards conformity), the difficulty of the task (the more complex the task, the greater the conformity), and the status of the other people of the group (the higher the perceived status, the greater the conformity). On the other hand, the factors that reduced conformity were the following: the lack of unanimity (a single dissident greatly reduced the degree of conformity with the group), and the possibility of giving anonymous responses.
[xxiii] This applies not only to group decisions, but also to the way of dressing, talking, behaving, etc.
[xxiv] Harvard Business Review. 10/24/2016. Let Your Workers Rebel. By Francesca Gino. Available at https://hbr.org/cover-story/2016/10/let-your-workers-rebel
[xxv] For more on role morality, see Andre (1991), Werhane and Freeman (1999), and Gibson (2003). Sources: Andre, J. (1991). Role morality as a complex instance of ordinary morality. American Philosophical Quarterly, 28(1), 73–80. Werhane, P. H., & Freeman, R. E. (1999). Business ethics: the state of the art. International Journal of Management Reviews, 1(1), 1–16. Gibson, K. (2003). Contrasting role morality and professional morality: Implications for practice. Journal of Applied Philosophy, 20(1), 17–29.
[xxvi] For a compilation of this evidence, see Jackall (1988).
[xxvii] Jackall (1988: 6).
[xxviii] Ibid: 6.
[xxix] Haney et al. (1972), Haney et al. (1973), and Zimbardo (2007). Sources: Haney, C., Banks, C., & Zimbardo, P. (1972). Interpersonal dynamics in a simulated prison (No. ONR-TR-Z-09). Stanford University Department of Psychology. Haney, C., Banks, W. C., & Zimbardo, P. G. (1973). A study of prisoners and guards in a simulated prison. Naval Research Review, 30, 4–17. Zimbardo, P. G. (2007). Lucifer Effect. Blackwell Publishing Ltd.
[xxx] The original videos of Zimbardo’s experiment are available at https://youtu.be/sYtX2sEaeFE, https://youtu.be/uTdttd7XTfQ, and https://youtu.be/fQnOkmvigi0. The seminal article describing the experiment was written by Haney et al. (1973). A recent summary can be found at McLeod (2016). Source: McLeod, S. A. (2016). Zimbardo — Stanford Prison Experiment. Available at www.simplypsychology.org/zimbardo.html
[xxxi] Cohn et al. (2014). Source: Cohn, A., Fehr, E., & Maréchal, M. A. (2014). Business culture and dishonesty in the banking industry. Nature, 516(7529), 86–89.
[xxxii] The concept of homo economicus was originally coined by critics of John Stuart Mill’s 1836 work on political economy. It assumes that human beings always: make perfectly rational decisions; think exclusively in maximizing their own personal economic gains; and, are interested in breaking the rules if the applicable penalty multiplied by the probability of being caught is lower than the expected benefit of a dishonest act. Homo economicus is a standard assumption of most neoclassical economic models.
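The rule-breaking criterion in this note reduces to a single comparison: cheat whenever the expected penalty (penalty times the probability of being caught) is lower than the expected benefit. A minimal sketch of that decision rule; the function name and the dollar figures are purely illustrative:

```python
def would_cheat(benefit: float, penalty: float, p_caught: float) -> bool:
    """Homo economicus breaks a rule whenever the expected penalty
    (penalty weighted by detection probability) is below the benefit."""
    return p_caught * penalty < benefit

# Example: a $200 gain versus a $1,000 fine with a 10% chance of detection.
# Expected penalty = 0.10 * 1,000 = $100 < $200, so the model predicts cheating.
print(would_cheat(benefit=200, penalty=1_000, p_caught=0.10))  # → True
```

Under this caricature of decision-making, raising either the penalty or the detection probability until their product exceeds the benefit is the only lever that deters misconduct; behavioral ethics, as the article argues, shows real behavior is far less calculated.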
[xxxiii] Soltes (2016: 189). Source: Soltes, E. (2016). Why they do it: inside the mind of the white-collar criminal. PublicAffairs.