Moral blindness

Moral blindness is defined as the temporary inability to see the unethical side of a certain context or situation. It is often caused by external factors that prevent an individual from recognising the immoral aspect of their behaviour in that particular situation.[1]

While the concept of moral blindness (and more broadly, that of immorality) has its roots in ancient philosophy,[2][3] the idea became prominent after the events of World War II, particularly the Holocaust.[4] This prompted further research by psychologists and produced some surprising findings (notably by Milgram and Zimbardo) on human behaviour in the context of obedience and authority.[1]

Over the years, moral blindness has been identified as a concern in wide-ranging areas such as business organisations and legal systems.[5][6] It has also spurred research on related concepts such as moral blind spots, ethical fading and ethical erosion.

Overview

Moral blindness is a phenomenon in which people with sufficient moral reasoning abilities are temporarily unable to perceive the ethical dimension of a situation, causing them to behave in ways counter to their actual moral values. This behaviour can be due to situational or other factors. The idea of moral blindness usually requires the following: people deviate from their intrinsic moral beliefs, and this deviation is temporary and unconscious, i.e. people are unaware of their unethical behaviour at the time.[1][7]

Interest in the idea of moral blindness increased after Hannah Arendt's Eichmann in Jerusalem: A Report on the Banality of Evil,[4] which focused on Adolf Eichmann, a German-Austrian Nazi official who was responsible for the deportation of Jews to extermination camps and thus played a major role in the Holocaust.[8]

The ideas of moral blindness and the "banality of evil" also influenced the field of psychology and led to some notable studies in the 1960s and 1970s, such as the obedience studies by Stanley Milgram and the Stanford Prison Experiment by Philip Zimbardo. These studies examined the impact of authority on obedience and individual behaviour.[1]

Subsequent research has examined moral blindness in contexts beyond war crimes and genocide, extending the idea to areas as diverse as organisational behaviour and mental health.

Origins and early theories

Roots in philosophy

The origins of moral blindness lie in philosophy and can be traced to ancient Greek philosophers: Socrates, who spoke of moral intellectualism; Plato, who spoke of emotions clouding moral judgements; and Aristotle, who first used the term "ethics" for the field of moral philosophy.[2] Early spiritual leaders such as Buddha and Confucius also spoke about moral behaviour in their discourses, although these were more prescriptive in nature.[3] Modern contributions to the study of moral judgement came from Western philosophers such as Descartes, Locke, Hume and Kant in the 17th and 18th centuries,[9][10][11] and from more contemporary philosophers such as G. E. Moore, who in his book Principia Ethica discusses the "indefinability of good".[12]

The idea of normative ethics

Much of the early thought on ethics and morality was normative in nature: axioms for how an individual was supposed to act in a given situation. Within this area of normative ethics, two opposing views of ethical evaluation developed: deontology, where the morality of an action depends on the appropriateness of the action itself with respect to rules, and consequentialism, where it depends on the results of that action. These views are often reflected in responses to the famous trolley problem, studied extensively by Greene.[13]

In psychology

Moral blindness has been studied jointly across philosophy and psychology, with empirical studies of morality going back to the 1890s. The focus on a normative approach to moral behaviour led research to concentrate on the cognitive and developmental context. Piaget put forth his prominent theory of cognitive development in 1936, which Kohlberg built upon in 1958 to propose his levels of moral development.[14] Later, in 1982, James Rest published his influential Four Component Model of Morality (FCM), which identified four distinct components from which immoral behaviour could arise: moral sensitivity, moral judgment, moral motivation, and moral implementation.[13] The model was meant to convey the complexity behind moral behaviour: competence in one component did not imply competence in another, with the result that immoral behaviour could be a consequence of failure at any of the four.[15]

This cognitive focus was found to be at odds with some observed behaviour, and the field of behavioural ethics eventually emerged to study how people actually behave when faced with moral dilemmas.

Major theoretical and experimental research in psychology

A major driver of modern research on moral blindness is purported to be post-World War II sentiment towards figures such as Adolf Eichmann, who was responsible for genocide under the Nazi regime during the Holocaust. At his capture and subsequent trial in 1961, many observers commented on his ordinary nature and appearance, which seemed at odds with his 'evil' behaviour. Hannah Arendt, who was covering the trial for The New Yorker, coined the term the "banality of evil" in reference to Eichmann: during the trial, Eichmann showed no remorse, nor did he accept responsibility; he claimed to have done what he was told to do. This is believed to have influenced researchers such as Milgram to study individual behaviour in response to obedience to authority.[1][16][17]

In his obedience studies of 1961–62, Milgram had subjects administer what they believed were electric shocks to a confederate. These studies were designed to answer questions such as: "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?"[18] To most people's surprise, 65% of the subjects in the original study went on to administer the maximum shock of 450 volts.[19]

Later, in 1971, Zimbardo's infamous Stanford Prison Experiment showed how "good people behave in pathological ways that are alien to their nature".[1] Male undergraduate students at Stanford were assigned to be guards or prisoners in a simulated prison setting. The experiment was designed to see how far subjects would go in internalising their roles and obeying external orders; it later raised ethical concerns about the nature of the study itself.[20]

Following these findings, researchers began to study moral agency, its exercise, and the drivers of moral blindness. In his research, Bandura argued that moral disengagement could arise from various forces (individual, situational or institutional) and that mechanisms such as diffusion of responsibility and a disconnected division of tasks could lead to immoral behaviour.[21][1][22]

More recent research has led to the development of the concept of "bounded ethicality": the idea that people can be unintentionally unethical, both in their own behaviour and in judging the behaviour of others, something they may realise only on further reflection.[23][24] Studies of individual unethicality have also examined the role of social norms and how we view others' unethical behaviour.[25][26]

Moral blindness has been studied and applied in a range of domains beyond war crimes, politics and administration. A major area of application has been management and organisational behaviour, with research covering topics such as corporate transgressions, business ethics and moral disengagement at work.[27][5] Law and justice is another area where moral blindness, especially among lawyers, is seen as a concern.[28][6] Some research has also characterised psychopathy as a specific kind of moral blindness, although the findings are not conclusive.[29]

The field has also been expanded to study broader ideas such as moral blind spots (overestimating ability to act ethically),[30] ethical erosion (gradual decline of ethics over time),[26] and ethical fading (when ethical concerns around a situation 'fade' during decision making).[31]

References

  1. Palazzo, Guido; Krings, Franciska; Hoffrage, Ulrich (2012-09-01). "Ethical Blindness". Journal of Business Ethics. 109 (3): 323–338. doi:10.1007/s10551-011-1130-4. ISSN 1573-0697.
  2. Oberhelman, David D. (June 2001). "Stanford Encyclopedia of Philosophy" (review). Reference Reviews. 15 (6): 9. doi:10.1108/rr.2001.15.6.9.311. ISSN 0950-4125.
  3. Tucker, John A. (2015-02-03), "Japanese Neo-Confucian Philosophy", The Oxford Handbook of Japanese Philosophy, Oxford University Press, pp. 272–290, ISBN 978-0-19-994572-6, retrieved 2020-11-30
  4. Burin, Frederic S.; Arendt, Hannah (March 1964). "Eichmann in Jerusalem: A Report on the Banality of Evil". Political Science Quarterly. 79 (1): 122. doi:10.2307/2146583. ISSN 0032-3195.
  5. Barsky, Adam (2011-06-16). "Investigating the Effects of Moral Disengagement and Participation on Unethical Work Behavior". Journal of Business Ethics. 104 (1): 59. doi:10.1007/s10551-011-0889-7. ISSN 1573-0697.
  6. Eldred, Tigran (2012-09-28). "Prescriptions for Ethical Blindness: Improving Advocacy for Indigent Defendants in Criminal Cases". Rochester, NY.
  7. de Klerk, J. J. (2017-04-01). "Nobody is as Blind as Those Who Cannot Bear to See: Psychoanalytic Perspectives on the Management of Emotions and Moral Blindness". Journal of Business Ethics. 141 (4): 745–761. doi:10.1007/s10551-016-3114-x. ISSN 1573-0697.
  8. "Becoming Eichmann: rethinking the life, crimes, and trial of a "desk murderer"". Choice Reviews Online. 44 (02): 44–1163-44-1163. 2006-10-01. doi:10.5860/choice.44-1163. ISSN 0009-4978.
  9. Cohon, Rachel (2018), Zalta, Edward N. (ed.), "Hume's Moral Philosophy", The Stanford Encyclopedia of Philosophy (Fall 2018 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-11-29
  10. García Moriyon (2011). "Moral Blindness". doi:10.13140/2.1.1717.0885.
  11. Hare, John (2019), Zalta, Edward N. (ed.), "Religion and Morality", The Stanford Encyclopedia of Philosophy (Fall 2019 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-11-29
  12. Cooper, Barton C. (1959-01-01). "The Alleged Indefinability of Good". The Journal of Philosophy. doi:10.2307/2022719. Retrieved 2020-11-29.
  13. Bazerman, Max H.; Gino, Francesca (December 2012). "Behavioral Ethics: Toward a Deeper Understanding of Moral Judgment and Dishonesty". Annual Review of Law and Social Science. 8 (1): 85–104. doi:10.1146/annurev-lawsocsci-102811-173815. ISSN 1550-3585.
  14. Hallpike, C. R. (Christopher Robert) (2004). The evolution of moral understanding. Prometheus Research Group. Alton: Prometheus Research Group. ISBN 0-9542168-4-9. OCLC 56463709.
  15. You, Di; Bebeau, Muriel J. (2013-11-01). "The independence of James Rest's components of morality: evidence from a professional ethics curriculum study". Ethics and Education. 8 (3): 202–216. doi:10.1080/17449642.2013.846059. ISSN 1744-9642.
  16. "Eichmann Trial". encyclopedia.ushmm.org. Retrieved 2020-11-30.
  17. Russell, Nestar John Charles (2011). "Milgram's obedience to authority experiments: Origins and early evolution". British Journal of Social Psychology. 50 (1): 140–162. doi:10.1348/014466610X492205. ISSN 2044-8309.
  18. Schulweis, Harold M. (2009). Conscience : the duty to obey and the duty to disobey. Woodstock, Vt.: Jewish Lights Pub. ISBN 978-1-58023-419-1. OCLC 731340449.
  19. Blass, Thomas (March 1991). "Understanding behavior in the Milgram obedience experiment: The role of personality, situations, and their interactions". Journal of Personality and Social Psychology. 60 (3): 398–413. doi:10.1037/0022-3514.60.3.398. ISSN 1939-1315.
  20. Bartels, Jared (2019-11-02). "Revisiting the Stanford prison experiment, again: Examining demand characteristics in the guard orientation". The Journal of Social Psychology. 159 (6): 780–790. doi:10.1080/00224545.2019.1596058. ISSN 0022-4545. PMID 30961456.
  21. Bandura, Albert (1999-08-01). "Moral Disengagement in the Perpetration of Inhumanities". Personality and Social Psychology Review. 3 (3): 193–209. doi:10.1207/s15327957pspr0303_3. ISSN 1088-8683.
  22. Bandura, Albert (2002-06-01). "Selective Moral Disengagement in the Exercise of Moral Agency". Journal of Moral Education. 31 (2): 101–119. doi:10.1080/0305724022014322. ISSN 0305-7240.
  23. Gino, Francesca (2015-06-01). "Understanding ordinary unethical behavior: why people who value morality act immorally". Current Opinion in Behavioral Sciences. Social behavior. 3: 107–111. doi:10.1016/j.cobeha.2015.03.001. ISSN 2352-1546.
  24. Chugh, Dolly; Bazerman, Max H.; Banaji, Mahzarin R. (2005-04-18), "Bounded Ethicality as a Psychological Barrier to Recognizing Conflicts of Interest", Conflicts of Interest, Cambridge University Press, pp. 74–95, ISBN 978-0-521-84439-0, retrieved 2020-11-30
  25. Gino, Francesca; Ayal, Shahar; Ariely, Dan (2009-03-01). "Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel". Psychological Science. doi:10.1111/j.1467-9280.2009.02306.x. ISSN 1467-9280.
  26. Gino, Francesca; Moore, Don A.; Bazerman, Max H. (2008). "See No Evil: When We Overlook Other People's Unethical Behavior". SSRN Electronic Journal. doi:10.2139/ssrn.1079969. ISSN 1556-5068.
  27. Bandura, Albert; Caprara, Gian-Vittorio; Zsolnai, Laszlo (2016-07-24). "Corporate Transgressions through Moral Disengagement". Journal of Human Values. doi:10.1177/097168580000600106.
  28. Hall, Katherine (2010), Why good intentions are often not enough: The potential for ethical blindness in legal decision-making, Routledge, ISBN 978-0-415-54653-9, retrieved 2020-11-30
  29. Larsen, Rasmus Rosenberg (2020-09-01). "Psychopathy as moral blindness: a qualifying exploration of the blindness-analogy in psychopathy theory and research". Philosophical Explorations. 23 (3): 214–233. doi:10.1080/13869795.2020.1799662. ISSN 1386-9795.
  30. Bazerman, Max H.; Tenbrunsel, Ann E. (2011-12-31). Blind Spots. Princeton: Princeton University Press. ISBN 978-1-4008-3799-1.
  31. Tenbrunsel, Ann E.; Messick, David M. (June 2004). "Ethical Fading: The Role of Self-Deception in Unethical Behavior". Social Justice Research. 17 (2): 223–236. doi:10.1023/B:SORE.0000027411.35832.53. ISSN 0885-7466.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.