Morality, like art, means drawing a line someplace. — Oscar Wilde
Are you a moral person? If you’re like most people (which most people are), you tend to overestimate how ethical you are compared to others.
Maybe you’re moral in one situation, but a little immoral in another. Maybe you act ethically around one set of people, but a little unethically with a different set.
Yet how do you justify this inconsistency in the face of what morality is supposed to entail: "moral positions [should be] equally valid everywhere and are as objectively true as 2 + 2 = 4" (Mueller & Skitka, 2018, p. 711)? If morality really is that objective, why are we okay with some people committing moral transgressions yet appalled by others committing the same act?
Welcome to the wild and wacky world of human morality. For today, let’s take a spin through two of moral psychology’s cutting-edge findings.
LIARS AND ZEALOTS
If you’ve ever badmouthed an opposing party’s politician for an immoral act, you’ve likely ignored (or justified) a very similar immoral act by a politician you support.
Part of the reason we're able to "get away" with this inconsistency lies in the distinction between moral processes and moral outcomes. For example, let's say I believe Confederate statues are morally wrong. However, to achieve their removal, I support other people illegally tearing them down.
In this example, although the process of achieving the outcome is “immoral,” the final outcome itself is one that supports my morality. Therefore, I’m able to justify the process (or means) that gets me to my desired outcome (or end).
As another example, let's say I was morally opposed to Colin Kaepernick's protests during the National Anthem. When he was later "immorally" pushed out of the NFL, I would be able to justify it because the end result aligned with my moral beliefs.
In one study testing these ideas, researchers gave participants a monologue in support of federally funding Planned Parenthood. After reading the monologue (which the participants believed had been played over public radio), participants learned that the facts presented in the monologue were either (a) real/legitimate or (b) fabricated by the monologue’s author.
Afterward, the researchers had participants evaluate the monologue’s author and his act.
Only those who morally supported Planned Parenthood (vs. those who supported it but had no moral stake in it) said it was okay that the author had lied. That is, when the outcome is in line with our morals, we're very good at justifying whatever process achieved it.
BEING MORAL IS DIFFICULT
Although most people believe that our morals should be universal (i.e., we should judge the process by the same standard we judge the outcome), we hold other beliefs that influence when we're more or less likely to act honestly.
And one of those beliefs is simply whether acting morally is easy or hard.
I've talked about lay theories on this site before, but essentially, they're people's everyday beliefs about how the world works. And the more strongly a person endorses one, the more that belief influences his or her behavior.
As one example, researchers brought participants into the lab and measured how strongly people believed “being honest is difficult” (vs. “being honest is easy”). And they found that the more people believed the former statement, the more likely they were to cheat at a game to win money.
In another study, researchers actually manipulated people’s lay theory on this topic by giving them (what appeared to be) a newspaper article, either in support of one lay theory or the other. Afterward, they found the same effects: those who were led to believe honesty was difficult (vs. easy) were more likely to lie in order to get a greater financial reward.
MORAL SUBJECTIVISM
So, although we'd like to believe our morals are among our strongest and most consistent beliefs, if you pay attention to how you apply them, you may be surprised to find they're not as stable as you thought.
Believes He’s Moral (Most of the Time),
jdt
Psych·o·philosophy to Ponder: What are times that you've witnessed double standards within your own morality? That is, what are some instances from your own life where "the means were justified by the end" (i.e., you supported an immoral process to achieve a moral end)? Although we generally have no issue with this "hypocrisy," there is a line we won't cross. For example, when people commit heinous or violent acts to achieve a moral outcome, we no longer support the process used to achieve it. Unfortunately, our lack of support here doesn't come from the harm it causes others; instead, it seems to stem from the possibility that we might be viewed as hypocrites (or at least negatively) for supporting a particularly unjust process.
Lee, J. J., Ong, M., Parmar, B., & Amit, E. (2018). Lay theories of effortful honesty: Does the honesty–effort association justify making a dishonest decision? Journal of Applied Psychology.
Mueller, A. B., & Skitka, L. J. (2018). Liars, damned liars, and zealots: The effect of moral mandates on transgressive advocacy acceptance. Social Psychological and Personality Science, 9(6), 711–718.