As promised last time, we shall explore which types of moral transgressions you consider to be important. Moral Foundations Theory (MFT) is an attempt to devise a psychological model which offers a cross-cultural, explanatory paradigm for moral reasoning (Graham et al. 2012).
The fundamental premises of MFT are as follows:
(1) Moral reasoning is an intuitive process and strategic reasoning is secondary.
(2) Morality is not limited to harm or fairness.
(3) Morality binds and blinds (that is to say, group morality disinclines individuals to seek answers outside that context and bonds members together).
It should be noted that an ‘intuitive response’ in this context does not imply an innate biological process which is ‘hardwired’. Rather, it describes a preparedness that is “organised in advance of experience” (Haidt and Joseph, 2009, p. 382). That is to say, moral development is a combination of biological predisposition and exposure to specific environmental stimuli, which together foster an intuitive response to a moral transgression. For example, a predisposition for empathetic feeling which is nurtured through parental influences may produce a specific morality regarding violations of care.
Moral Foundations Theory
The following are the model’s moral foundations:
- Care/Harm
- Fairness/Cheating
- Loyalty/Betrayal
- Authority/Subversion
- Sanctity/Degradation
Here is a brief description of each foundation (Haidt, Graham, and Joseph, 2009) and examples of the types of actions someone who endorses the foundation would find to be morally transgressive:
The care/harm and fairness/cheating domains are considered individualising foundations, in that the rights of the individual are the moral imperative for these foundations.
Care/Harm: Basic concern for the suffering of humans/animals in both an emotional and physical context.
Transgression examples: Animal testing, imposing the death penalty, neglect of vulnerable members of society.
Fairness/Cheating: Concern for abstract notions of justice and inequality, alongside explicit/implicit acts of unfair treatment.
Transgression examples: Job promotion based on nepotism, exploiting tax laws for personal gain.
The loyalty/betrayal, authority/subversion and sanctity/degradation domains are considered binding foundations, in that they support moral acts that help to maintain group bonds (Haidt, 2012).
Loyalty/Betrayal: Focus on concerns related to obligations of group membership, such as self-sacrifice, loyalty and betrayal.
Transgression examples: Pursuit of individual gain by betraying one’s family, burning of the national flag.
Authority/Subversion: Concerns relating to maintaining social order and the obligations of hierarchical relationships.
Transgression examples: Rejecting a dress code for a special occasion, interrupting a university ceremony to make a protest.
Sanctity/Degradation: Focus on concerns relating to physical and spiritual contagion through acts which transgress perceived norms.
Transgression examples: Destroying a holy text (Bible, Qur’an), sexual acts outside of perceived norms, choosing to have an abortion.
This foundation list is not exhaustive; a proposed Liberty/Oppression foundation focuses on attempts at domination and resistance to such transgressions. To date, however, it has not been extensively researched.
Your Moral Foundations
Before we go any further, it would be good to explore which foundations you endorse. This link http://www.yourmorals.org/ will take you to the MFT questionnaire (approx. 15-20 minutes). I hate to thrust homework upon you at this early stage, but understanding your own foundational endorsement will substantially increase your engagement with the content. So best to do it now…yep, right now! I’ll be right here when you’re ready to come back.
The Liberal/Conservative Divide
Research on MFT shows that liberals and conservatives have distinctive and different foundation endorsement patterns. Graham et al. (2009) found that liberals are predominantly concerned with care/harm and fairness/cheating, whilst conservatives show a more equal endorsement of all five domains. The most distinct difference between liberals and conservatives is moral endorsement of the sanctity/degradation foundation. Graham et al. found liberals felt sanctity issues had the least relevance to their morality (in direct opposition to conservative endorsement). Moreover, liberals required the lowest financial inducement to violate a sanctity-based moral issue. However, the same could not be said for conservatives regarding foundations highly endorsed by liberals (e.g. care/harm or fairness/cheating). Thus, a clear distinction emerges between liberals and conservatives on valuing binding foundations, with the latter group more readily able to endorse these values.
The socially liberal also find less moral value in the loyalty/betrayal foundation than the conservatively inclined. This is not to say that liberals are not also ‘groupish’ in behaviour, but they reject nationalism (and related symbols/behaviours) in favour of universalism (Gray, 1995). Now, depending on your perspective, this may be evidence that liberals care about the ‘important things’ or that conservatives possess a more ‘rounded’ moral worldview. However, what of topics which liberals and conservatives both highly endorse (e.g. the fairness/cheating foundation)? Surely this joint endorsement means a moral context exists in which agreement will flourish? Well, not so fast! It could be that this joint endorsement is actually for different aspects of fairness. For example, social liberals may be more concerned with equality for marginalised groups and with wealthy individuals or organisations contributing ‘fairly’ in taxes. In contrast, conservatives may frame fairness in terms of proportionality (i.e. people are rewarded in proportion to what they contribute) and resist any policy which offers the potential for ‘free-loaders’ (Haidt and Kesebir, 2010). Taken collectively, these endorsement differences, be they large (sanctity endorsement) or nuanced (fairness interpretation), produce distinctive differences in moral outlook.
Morality in Action
So how do these distinctive moral differences actually manifest themselves in behaviour and attitudinal responses? Distinctive patterns of foundation endorsement have been found in life-narrative paradigms (McAdams et al. 2008): liberals described ‘life-lessons’ regarding openness and empathy, whilst conservatives identified self-discipline lessons learned from authority figures. MFT can be used to predict culture-war attitudes among differing political orientations (Koleva et al. 2012). Kertzer et al. (2014) found MFT to be strongly associated with foreign policy attitudes: specifically, cooperative internationalism is linked to the ‘individualising’ foundations whilst, in contrast, the ‘binding’ foundations are associated with the pursuit of militant internationalism. Clark et al. (2017) found that the individualising foundations (care/harm, fairness/cheating) were the significant predictive domains for implicit co-operation with others in a financial context. Recent research has also found that charitable prosocial behaviour for specific causes may be predicted by understanding foundation endorsement (English and James, 2018).
What does this all mean?
These foundation differences between liberals and conservatives make for pessimistic reading for those wishing to engage in productive moral discourse. The reality may well be that, for some issues, a consensus between differing groups resides in the realm of the unattainable. It could be reasonably argued that, in a professional political context, a moral consensus is neither attainable nor desirable. However, conflict within an interpersonal relationship (family, friend, partner, etc.) because of a differing moral outlook can be challenging (especially in a referendum context!). If you do find yourself in such a situation, it is worth considering whether you are both actually talking about the same thing. As we have seen, debating a fairness-related moral topic does not ensure you are both discussing the same thing. Both liberals and conservatives value fairness highly and consider themselves to be fair (after all, people rarely see themselves as a ‘bad person’). Consequently, any accusation of unfairness will evoke a dismissive rebuttal as, from their perspective, they are being fair; just a different version of fair to you. Also, if moral reasoning is initiated by intuition, changing someone’s moral perspective simply by presenting facts will most likely not succeed. Intellectually ‘over-riding’ an intuitive feeling can require a Herculean effort (especially after years of social group/media reinforcement that the moral position is correct). A contrasting moral position just won’t ‘feel right’ and, thus, resistance to change will endure. Of course, I’m not suggesting that nobody ever changes their mind about anything moral, rather that such change is not always related to the validity of the opposing arguments.
If you found the last paragraph’s depiction of interpersonal discourse on morality somewhat depressing, I share your perspective. However, in an attempt to remain positive, we shall embark on an exploration of how one could engage with a conflicting moral perspective next time. Also, this entry has not uttered a dissenting word against either the concepts of Moral Foundations Theory or the explanatory paradigm it offers. Next time we shall look at alternative explanations for how moral reasoning occurs, to widen the discussion.
Clark, C. B., Swails, J. A., Pontinen, H. M., Bowerman, S. E., Kriz, K. A., and Hendricks, P. S. (2017) A behavioural economic assessment of individualising versus binding moral foundations. Personality and Individual Differences. 112: 49-54.
English, A., and James, T. (2018) Morality in Action? Testing the Predictive Validity of Moral Foundations Theory on Prosocial Behaviour. Manuscript in preparation.
Graham, J., Haidt, J., Koleva, S., Motyl, M., Iyer, R., Wojcik, S. P., and Ditto, P. H. (2012) Moral Foundations Theory: The pragmatic validity of moral pluralism. Advances in Experimental Social Psychology. 12: 1-64
Graham, J., Haidt, J., and Nosek, B. A. (2009) Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology. 96 (5): 1029-1046.
Gray, J. (1995) Liberalism (second edition). Minneapolis: University of Minnesota Press.
Gray, K., Waytz, A., and Young, L. (2012) The Moral Dyad: A fundamental template unifying moral judgement. Psychological Inquiry. 23: 206-215.
Haidt, J. (2012) The Righteous Mind: Why good people are divided by politics and religion. London: Penguin Books.
Haidt, J., Graham, J., and Joseph, C. (2009) Above and below left-right: Ideological narratives and moral foundations. Psychological Inquiry. 20: 110-119.
Haidt, J., and Kesebir, S. (2010) ‘Morality’ in Handbook of Social Psychology, ed. S. T. Fiske, D. Gilbert, and G. Lindzey (5th Edition): 797-832, Hoboken, NJ: Wiley.
Kertzer, J. D., Powers, K. E., Rathbun, B. C., and Iyer, R. (2014) Moral support: How moral values shape foreign policy. The Journal of Politics. 76 (3): 825-840.
Koleva, S. P., Graham, J., Iyer, R., Ditto, P. H., and Haidt, J. (2012) Tracing the threads: How five moral concerns (especially Purity) help explain culture war attitudes. Journal of Research in Personality. 46 (2): 184-194.
Schnall, S., Haidt, J., Clore, G. L., and Jordan, A. H. (2008) Disgust as embodied moral judgement. Personality and Social Psychology Bulletin. 34: 1096-1109.