The Fault in Our Moral Intuitions – Part II

The apparent golden rule when we make moral decisions is to do unto others as you would have them do unto you. Yet why are moral decisions so heavily debated? Are we motivated by guilt or by fairness?

According to researchers at Radboud University, people may rely on principles of both guilt and fairness, and their moral compass can shift when they are exposed to different circumstances. This discovery challenges a central premise in economics, psychology, and neuroscience: that people are motivated by a single moral principle that remains constant over time.

Jeroen van Baar, a postdoctoral research associate in the department of cognitive, linguistic, and psychological sciences at Brown University, says, “Our study demonstrates that with moral behaviour, people may not in fact always stick to the golden rule. While most people tend to exhibit some concern for others, others may demonstrate what we have called ‘moral opportunism,’ where they still want to look moral but want to maximize their own benefit.”

Luke J. Chang, assistant professor of psychological and brain sciences and director of the Computational Social Affective Neuroscience Laboratory at Dartmouth, says, “In everyday life, we may not notice that our morals are context-dependent since our contexts tend to stay the same from day to day. However, under new circumstances, we may find that the moral rules we thought we’d always follow are actually quite malleable. This has tremendous ramifications if one considers how our moral behaviour could change under new contexts, such as during war.”

Researchers designed a modified trust game called the Hidden Multiplier Trust Game, which allowed them to classify decisions as a function of an individual’s moral strategy. The team could determine which type of moral strategy a participant was using: inequity aversion, where people reciprocate because they want fairness; guilt aversion, where people reciprocate because they want to avoid feeling guilty; greed; or moral opportunism, where people switch between inequity aversion and guilt aversion depending on what serves their interests best. The researchers also developed a computational moral-strategy model that could be used to examine the brain-activity patterns associated with each strategy.
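To make the distinction between these strategies concrete, here is a toy sketch of a hidden-multiplier trust game in Python. It is purely illustrative: the payoff rules, the `trustee_return` function, and all numbers are assumptions for demonstration, not the study’s actual task. The setup assumes the investor sends their whole stake, believes the multiplier is some fixed value, and expects a fair share back, while the trustee sees the true (hidden) multiplier.

```python
def trustee_return(strategy, invested, multiplier, expected_return):
    """Amount a trustee sends back under each moral strategy (toy model).

    invested:        what the investor sent (assumed to be their whole stake)
    multiplier:      the true multiplier, visible only to the trustee
    expected_return: what the investor expects back, based on the
                     multiplier they *believe* is in play
    """
    pot = invested * multiplier  # the trustee sees the true pot size

    if strategy == "greed":
        # Keep everything.
        return 0.0
    if strategy == "inequity_aversion":
        # Split the pot so both players end up with equal payoffs.
        return pot / 2.0
    if strategy == "guilt_aversion":
        # Return what the investor expects, to avoid feeling guilty
        # (capped at the pot, since you cannot return more than you have).
        return min(expected_return, pot)
    if strategy == "moral_opportunism":
        # Follow whichever "moral-looking" rule is cheaper for the trustee.
        fair = pot / 2.0
        guilt_free = min(expected_return, pot)
        return min(fair, guilt_free)
    raise ValueError(f"unknown strategy: {strategy}")


# Example: investor sends 10 believing the multiplier is 4, so they
# expect half of 40, i.e. 20, back. The true multiplier is hidden.
for true_multiplier in (2, 6):
    for s in ("inequity_aversion", "guilt_aversion", "moral_opportunism"):
        print(true_multiplier, s, trustee_return(s, 10, true_multiplier, 20))
```

Note how the opportunist switches rules with the context: when the hidden multiplier is high (pot of 60), guilt aversion is cheaper, so they return 20; when it is low (pot of 20), an equal split is cheaper, so they return 10, while looking “moral” in both cases.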

Their findings show that the brain exhibits distinct activity patterns that differentiate a person using inequity aversion from one using guilt aversion, even when the two strategies yield the same behaviour. The researchers also observed that the brain patterns of morally opportunistic participants switched between the two moral strategies across different contexts. The researchers conclude that “people may use different moral principles to make their decisions, and that some people are much more flexible and will apply different principles depending on the situation. This may explain why people that we like and respect occasionally do things that we find morally objectionable.”
