Making moral decisions is a complex process. We have to think about the consequences of our actions for ourselves (will I go to jail?), others (will this person suffer as a consequence of my decision?), and society at large (does society benefit from my choice?).
Depending on the situation, it involves brain regions linked to decision making, empathy, Theory of Mind (the ability to think about the mental states of others), memory, agency – or a combination of these. This has led some to argue that morality is everywhere and – maybe – nowhere in our brain.
There’s no single region in the brain responsible for all moral decision making. Nor are there brain regions devoted only to this process. But neuroscience research shows that specific brain regions are often involved when we’re faced with a moral dilemma.
Here’s a dilemma
A nice example of a moral dilemma is the well-known “trolley problem”.
A person is confronted with a hypothetical life-or-death decision where a train is about to run over five people on a track. The person can turn a switch that will divert the train from the main track. This will save the five people on the track, but it will kill the person on the other track.
What would you do? Would you turn the switch and save five people but be responsible for the death of one person? Or, would you do nothing? Typically, people choose to turn the switch because sacrificing one life to save five others is the most rational decision.
But emotions also play an important role in moral decision making and this is demonstrated by a small variation in the trolley problem, the so-called “footbridge dilemma”. In this “emotional” version of the dilemma, a person has to push a stranger from a bridge and onto the track to stop the train and save the life of the five people on the track.
The outcome is the same (one person is sacrificed to save five others) but the results typically show that, in this situation, people are much less willing to intervene. Imagine the person on the bridge is someone you love, for instance. In that case, hardly anyone would be willing to sacrifice one life to save five others.
This shows that emotions, distance and agency play important parts in moral decision making. Think of it this way: it’s easier to kill a person you hate from a distance with a gun than to kill a person you love with your bare hands.
The neuroscience of morality
One of the first neuroimaging studies investigating these moral dilemmas showed that in more rational, impersonal situations (such as the trolley problem), brain regions involved in abstract reasoning (such as the dorsolateral prefrontal cortex) became more active, while in more emotional, personal situations (such as the footbridge dilemma), brain regions involved in emotional processing (such as the ventromedial prefrontal cortex) were more active.
But the problem with the trolley paradigm and similar moral dilemmas is that they’re hypothetical, artificial and unusual. In real life, moral decisions often have to be made quickly and implicitly. And these processes typically involve different brain regions than those involved in complex decision making.
To investigate moral situations in which people actually have to harm others, my research group recently conducted an fMRI experiment in which participants had to give electric shocks to others. Our results show greater activation in both the left and right lateral orbitofrontal cortex when people were harming others. This part of the brain is involved in feelings of displeasure.
Interestingly, in another fMRI experiment we showed that these same regions become active when we kill an innocent person. But when we kill a soldier who attacks us, these regions don’t become active.
These results show that, depending on the situation, we can “switch off” brain regions that typically prevent us from harming others if we feel the situation justifies violence (when we have to defend our own life, for instance).
Meting out justice
Making moral decisions about the actions of others – so-called third-party punishment – is also relevant for the legal system. The relevant questions here are: how severe is the harm caused, and was it done intentionally?
If a person drives his car off the road but nobody is harmed, then typically no punishment is given. But when someone is accidentally killed in the process, this can lead to an involuntary manslaughter charge; depending on the circumstances, the punishment may be mild or severe. When a driver intentionally kills another person with their car, however, the charge becomes murder and the punishment is much more severe.
Previous fMRI research has shown that when we have to decide if a person is responsible for his or her actions, the dorsolateral prefrontal cortex is involved.
Now, new research from the same authors, published today in the journal Neuron, shows that when you disrupt this region using a non-invasive brain stimulation technique called transcranial magnetic stimulation (TMS), people give less severe punishments to the perpetrator.
A detailed analysis shows that the disruption to the dorsolateral prefrontal cortex caused people in the experiment to base their punishment decisions more on the consequences of the crime than on the perpetrator’s intentions. The findings suggest this part of the brain plays a critical role in balancing information about intent and harm to enable appropriate punishment decisions.
The authors of the paper say this brain region has undergone significant expansion in humans, compared to other apes. They suggest this is one of the reasons why human society has evolved such a complex system of norm enforcement.
These new results provide important insights into how specific parts of our brain play a critical role in deciding the fate of others.
This article originally appeared on The Conversation Africa.