
Moral reasoning performance determines epistemic peerdom

Published online by Cambridge University Press:  11 September 2019

William H. B. McAuliffe
Affiliation:
Department of Psychology, University of Miami, Coral Gables, FL 33146. w.mcauliffe@umiami.edu http://williamhbmcauliffe.com/
Michael E. McCullough
Affiliation:
Department of Psychology, University of Miami, Coral Gables, FL 33146. mikem@miami.edu http://local.psy.miami.edu/faculty/mmccullough/

Abstract

We offer a friendly criticism of May's fantastic book on moral reasoning: It is overly charitable to the argument that moral disagreement undermines moral knowledge. To highlight the role that reasoning quality plays in moral judgments, we review literature that he did not mention showing that individual differences in intelligence and cognitive reflection explain much of moral disagreement. The burden is on skeptics of moral knowledge to show that moral disagreement arises from non-rational origins.

Type: Open Peer Commentary
Copyright: © Cambridge University Press 2019

In chapter 5 of Regard for Reason in the Moral Mind, Joshua May (2018) concedes that moral disagreement among “epistemic peers” – people who have equally good access to the truth of a matter – can undermine claims to moral knowledge. However, he also rightly points out that moral disagreement often arises from poor thinking, such as motivated reasoning, and from disagreement about the non-moral premises undergirding moral conclusions, such as whether same-sex marriage undermines social stability (p. 120). May concludes that, in practice, it is difficult to determine whether a disagreeing party is an epistemic peer, especially if he or she is from a different cultural milieu. We agree with this conclusion, but contend that researchers have identified factors that, in aggregate, show that much moral disagreement does not occur among epistemic peers. Here, we review this literature because May did not, and because it highlights the importance of rational factors – namely, differences in cognitive ability, education, and tendency toward cognitive reflection – in explaining the quality of moral judgments.

Consider the simple case of competently judging whether a moral violation has taken place. The judge must assess (a) whether there was an actual or potential patient of harm, (b) whether there was an agent who intended that harm, and (c) whether the harm was a means to selfish ends (Sousa & Piazza 2014). Achieving these tasks requires the judge to experience empathy for the putative victim, to deploy theory of mind regarding the agent's intent, to apply accurate background beliefs about the act's typical consequences, and to impartially consider whether the act would be acceptable regardless of the identities of the agent and the patient (Gibbs 2013). The judge must then check whether her initial impression coheres with her other moral beliefs and whether there are relevant mitigating circumstances (Holyoak & Powell 2016). All the while, the judge must ensure that self-interest or a desire to pander to a certain audience does not corrupt any of these processes (Krebs & Denton 2005). Each of these tasks considerably increases in difficulty if the violation in question is not common in the judge's everyday life (Davidson et al. 1983) or concerns several stakeholders from diverse walks of life (Gibbs 2013). The judge must perform optimally in all of these tasks to be an epistemic peer of another person who performed optimally.

All components of moral judgment require the application of sophisticated cognitive and socioemotional capacities that differ in strength across people and do not fully develop until at least adolescence. It is no surprise, then, that differences in intellectual achievement are strong predictors of differences of moral opinion. For example, intelligence at age 10 predicts anti-traditional beliefs (e.g., endorsement of gender equality in the workplace, opposition to retributive justice, and rejection of racism) at age 30, even after controlling for educational achievement (Deary et al. 2008). Also, meta-analyses indicate that illiberal attitudes are positively associated with about a dozen different measures of cognitive rigidity (Jost 2017). And to round the bases on May's aforementioned example, intelligence is positively associated with support for same-sex marriage (Perales 2018).

Additionally, young people who are still at low levels of cognitive development tend to make category mistakes in moral reasoning. For example, still-developing minds tend to confuse morality with power dynamics, self-interest, peer approval, and the status quo (Gibbs 2013; Piaget 1932). Moreover, intelligence is strongly associated with successfully distinguishing between moral violations (i.e., actions that intrinsically have detrimental consequences for others) and convention violations (i.e., actions that disrupt social order within a given culture, but would not be harmful in other contexts; Aharoni et al. 2012; Royzman et al. 2014b). A failure to make this distinction is partly responsible for why less reflective people tend to treat violations of the “binding foundations” of morality – authority, tradition, and purity – as intrinsically wrong (Landy 2016). Contra May, then, one need not grant that people who moralize different sets of values are epistemic peers (p. 123). Someone who does not know which distinctions really count when making moral judgments is not an epistemic peer of someone who does.

Of course, intelligence is not everything: One must also be motivated to apply it when making a judgment. Seminarians, for example, typically recognize morally mature arguments, but some of them choose to relinquish reason in favor of obedience to God (Lawrence 1987). Failure to think through the details of a dilemma can lead people to dogmatically champion one moral consideration to the neglect of legitimate alternatives. For example, reflective thinkers regard either the deontological or the utilitarian resolution of a moral dilemma as morally permissible (Royzman et al. 2015b). Less reflective people prefer a particular resolution, suggesting that they do not acknowledge that there really is a dilemma. Although taking a hard line on an issue sometimes reflects principled belief, in dilemmatic contexts strong opinions more often reflect an unthinking adherence to a rule, an inflexibility that most children eventually learn is inadequate for dealing with the complexities of life (Lourenço 2003).

Perhaps the most pernicious misapplication of intelligence to the moral domain is the motivated rationalization of views one wants to maintain (Stanovich et al. 2013). For example, intelligence is positively associated with prejudice against conservative targets such as corporations, Christians, and the military (Brandt & Crawford 2016). People rely on stereotypes less when individuating information is available (Jussim 2017), but they must be willing to seek out such information in the first place. Hence, the Big Five trait that is most negatively related to generalized prejudice is agreeableness, which reflects lenience in judging and a desire to get along with others, not openness to experience, the trait most linked to intelligence (Crawford & Brandt 2019). This example reinforces the point that because making good moral judgments depends on so many distinct capacities, suboptimal performance on any one task can compromise one's claim to epistemic peerdom.

Moral reasoning is also not a mere academic ability, as its importance is evident when examining moral heroes and moral transgressors. In their landmark comparison of rescuers of Jews during the Holocaust to bystanders, Oliner and Oliner (1988) found that rescuers were more likely to have been raised to adopt a universal care ethic and less likely to accept an ethic of obedience. Walker et al. (2010) found that moral reasoning ability was the distinguishing feature of a subset of people who had won lifetime achievement awards for their prosocial contributions to society. Among ordinary persons, scores on moral reasoning tests are positively associated with volunteering and going into a helping profession (Comunian & Gielen 1995; Rest et al. 1999). Similarly, cognitive ability is positively associated with charitable giving (Bekkers & Wiepking 2011). Conversely, moral reasoning scores relate negatively to selfish, manipulative tendencies (Marshall et al. 2017). Criminal offenders have lower scores on moral reasoning tests than do non-offenders (Stams et al. 2006), a group difference that is likely mediated by deficits in cognitive empathy and general intelligence (O'Kane et al. 1996; Van Langen et al. 2014). Among criminal offenders, lower moral reasoning scores predict increased recidivism (Van Vugt et al. 2011), and psychopathic traits predict deficits in detecting social contract violations (Ermer & Kiehl 2010). Both lines of evidence suggest that recalcitrant offenders have difficulty obtaining and applying moral knowledge.

The role of reason in promoting prosocial behavior and inhibiting antisocial behavior is even evident in the historical record: As societies became more cosmopolitan over time, the justifications governments gave for helping the needy became more distinctively moral (McCullough, forthcoming). The earliest justifications for regard for the poor were mostly self-serving inasmuch as they secured reputational benefits for rulers, enabling them to consolidate their power in the face of competing interests, establish peaceable kingdoms, and lubricate trade relations with other societies and ethnic groups. Later justifications were based on prudential arguments about the collateral effects of poverty on the prevalence of disease, crime, vice, and social unrest. It was only during the Enlightenment era that arguments about helping the poor and preventing poverty became distinctly moral in character, invoking distributive justice, the equal dignity of all persons, and the maximization of utility at the societal level. Thought experiments involving veils of ignorance, original positions, and children drowning in shallow ponds would not come until the latter half of the twentieth century. The spread of literacy, along with reductions in the price of books and increases in the speed with which information could travel, also encouraged distinctively moral reasoning by providing people with humanizing portraits of poor and distant victims. Similar advances in moral reasoning, education, and literacy also help to explain the decline of violence between states over the past 500 years (Pinker 2011a). The qualitative changes in moral justifications for helping others and against harming others across generations are remarkably similar to qualitative changes in moral reasoning within the lifetime of a single person (Gibbs 2013).

A by-product of moral progress is that the standards for becoming an epistemic peer in the moral domain have risen now that access to information is more available than ever (McCullough, forthcoming; Pinker 2018). Newspapers, radio, television, and the Internet make it easier to learn about other people's plights, ideally enabling people to come to better agreement about when societies have moral obligations to combat injustice and improve the lot of those in dire need. And now that there is considerable historical precedent for offering impartial reasons for one's point of view, it is harder to get away with a patently self-interested moral compass (Shermer 2015).

May concludes that we need only be skeptical of people's ability to obtain moral knowledge about particularly controversial issues, where reasonable people disagree because the relevant empirical premises are uncertain and the temptation toward motivated reasoning is strong (p. 128). We agree that intellectual humility is an antidote to counterproductive polarization, but we counsel against relying on the proportion of people who hold a certain point of view to determine which moral conclusions are beyond our ken. For history reveals not only that the moral compass of the masses has improved over time, but also that there have always been individuals who were centuries ahead of their time in their moral outlook. For example, long before sizable abolitionist movements took hold in the United States, there were those who cogently argued that there are no differences between blacks and whites that entitle whites to subordinate blacks (Lepore 2018). Otherwise reasonable people – including some framers of the U.S. Constitution who were ahead of their time in other ways – disagreed, but their counterarguments were self-interested and based on false claims, such as that blacks wanted to be ruled or that they did not possess rational capacities. Others, such as Benjamin Franklin, were resistant at first but changed their minds after reflecting on abolitionist arguments and taking the time to observe black communities in a disinterested manner. What this example shows is that simply counting the number of learned people who hold a certain moral point of view is not an infallible means of detecting whether that view is reasonable, positive correlations between cognitive ability and moral positions notwithstanding. In all cases, one must examine the reasoning and evidence that each side has brought to bear to determine who is an epistemic peer of whom. Only those who are not committed to taking the time to think about and research an issue need withhold judgment.

References

Aharoni, E., Sinnott-Armstrong, W. & Kiehl, K. A. (2012) Can psychopathic offenders discern moral wrongs? A new look at the moral/conventional distinction. Journal of Abnormal Psychology 121(2):484–97.
Bekkers, R. & Wiepking, P. (2011) Who gives? A literature review of predictors of charitable giving, part one: Religion, education, age and socialisation. Voluntary Sector Review 2(3):337–65.
Brandt, M. J. & Crawford, J. T. (2016) Answering unresolved questions about the relationship between cognitive ability and prejudice. Social Psychological and Personality Science 7(8):884–92.
Comunian, A. L. & Gielen, U. P. (1995) Moral reasoning and prosocial action in Italian culture. The Journal of Social Psychology 135(6):699–706.
Crawford, J. & Brandt, M. J. (2019) Who is prejudiced, and toward whom? Big Five traits and inclusive generalized prejudice. Available at: https://doi.org/10.31234/osf.io/6vqwk.
Davidson, P., Turiel, E. & Black, A. (1983) The effect of stimulus familiarity on the use of criteria and justifications in children's social reasoning. British Journal of Developmental Psychology 1(1):49–65.
Deary, I. J., Batty, G. D. & Gale, C. R. (2008) Bright children become enlightened adults. Psychological Science 19(1):1–6.
Ermer, E. & Kiehl, K. A. (2010) Psychopaths are impaired in social exchange and precautionary reasoning. Psychological Science 21(10):1399–1405.
Gibbs, J. C. (2013) Moral development and reality: Beyond the theories of Kohlberg, Hoffman, and Haidt. Oxford University Press.
Holyoak, K. J. & Powell, D. (2016) Deontological coherence: A framework for commonsense moral reasoning. Psychological Bulletin 142(11):1179–1203.
Jost, J. T. (2017) Ideological asymmetries and the essence of political psychology. Political Psychology 38(2):167–208.
Jussim, L. (2017) Précis of Social Perception and Social Reality: Why Accuracy Dominates Bias and Self-Fulfilling Prophecy. Behavioral and Brain Sciences 1–65.
Krebs, D. L. & Denton, K. (2005) Toward a more pragmatic approach to morality: A critical evaluation of Kohlberg's model. Psychological Review 112(3):629–49.
Landy, J. F. (2016) Representations of moral violations: Category members and associated features. Judgment and Decision Making 11(5):496–508.
Lawrence, J. A. (1987) Verbal processing of the Defining Issues Test by principled and non-principled moral reasoners. Journal of Moral Education 16(2):117–30.
Lepore, J. (2018) These truths: A history of the United States. Norton.
Lourenço, O. (2003) Making sense of Turiel's dispute with Kohlberg: The case of the child's moral competence. New Ideas in Psychology 21(1):43–68.
Marshall, J., Watts, A. L., Frankel, E. L. & Lilienfeld, S. O. (2017) An examination of psychopathy's relationship with two indices of moral judgment. Personality and Individual Differences 113:240–45.
May, J. (2018) Regard for reason in the moral mind. Oxford University Press.
McCullough, M. E. (forthcoming) Why we give a damn. Basic Books.
O'Kane, A., Fawcett, D. & Blackburn, R. (1996) Psychopathy and moral reasoning: Comparison of two classifications. Personality and Individual Differences 20:505–14.
Oliner, S. & Oliner, P. (1988) The altruistic personality: Rescuers of Jews in Nazi Europe. Free Press.
Perales, F. (2018) The cognitive roots of prejudice towards same-sex couples: An analysis of an Australian national sample. Intelligence 68:117–27.
Piaget, J. (1932/1965) The moral judgement of the child, trans. M. Gabain. Free Press/Harcourt. (Original work published in 1932.)
Pinker, S. (2011a) The better angels of our nature: The decline of violence in history and its causes. Penguin.
Pinker, S. (2018) Enlightenment now: The case for reason, science, humanism, and progress. Viking.
Rest, J. R., Narvaez, D., Bebeau, M. & Thoma, S. (1999) Postconventional moral thinking: A neo-Kohlbergian approach. Erlbaum.
Royzman, E. B., Landy, J. F. & Goodwin, G. P. (2014b) Are good reasoners more incest-friendly? Trait cognitive reflection predicts selective moralization in a sample of American adults. Judgment and Decision Making 9(3):176–90.
Royzman, E. B., Landy, J. F. & Leeman, R. F. (2015b) Are thoughtful people more utilitarian? CRT as a unique predictor of moral minimalism in the dilemmatic context. Cognitive Science 39(2):325–52.
Shermer, M. (2015) The moral arc: How science and reason lead humanity toward truth, justice, and freedom. Macmillan.
Sousa, P. & Piazza, J. (2014) Harmful transgressions qua moral transgressions: A deflationary view. Thinking and Reasoning 20(1):99–128.
Stams, G. J., Brugman, D., Deković, M., Van Rosmalen, L., Van Der Laan, P. & Gibbs, J. C. (2006) The moral judgment of juvenile delinquents: A meta-analysis. Journal of Abnormal Child Psychology 34(5):692–708.
Stanovich, K. E., West, R. F. & Toplak, M. E. (2013) Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science 22(4):259–64.
Van Langen, M. A., Wissink, I. B., Van Vugt, E. S., Van der Stouwe, T. & Stams, G. J. J. M. (2014) The relation between empathy and offending: A meta-analysis. Aggression and Violent Behavior 19(2):179–89.
Van Vugt, E., Gibbs, J., Stams, G. J., Bijleveld, C., Hendriks, J. & van der Laan, P. (2011) Moral development and recidivism: A meta-analysis. International Journal of Offender Therapy and Comparative Criminology 55(8):1234–50.
Walker, L. J., Frimer, J. A. & Dunlop, W. L. (2010) Varieties of moral personality: Beyond the banality of heroism. Journal of Personality 78(3):907–42.