What drives moral decision-making?


Steven Schultz

Princeton, N.J. -- Here's a dilemma -- hypothetical, but, in today's world, not entirely unimaginable:

Would you push someone into an oncoming train if you knew that doing so would save the lives of five others farther down the track? What if you could save the five people by flipping a switch that sends the train onto a spur where it will kill one person?

Researchers at the Center for the Study of Brain, Mind and Behavior used functional magnetic resonance imaging (fMRI) to study brain activity that underlies moral decision-making. Standing next to the fMRI scanner, which was installed in Green Hall last year, are, from left: Jonathan Cohen, who directs the center; Joshua Greene, a graduate student in philosophy; and Leigh Nystrom, a staff scientist in psychology.

Both scenarios offer the opportunity to save five lives at the expense of one. But many people find a compelling difference between the cases, and judge that it would be appropriate to flip a switch but not to push someone.

In a study that combines philosophy and neuroscience, Princeton researchers have begun to uncover the processes of brain and mind that explain the difference between the cases. They have used functional magnetic resonance imaging (fMRI) to analyze brain activity in people who were asked to ponder a range of moral dilemmas.

The results, published in the Sept. 14 issue of Science, suggest that, while people regularly reach the same conclusions when faced with uncomfortable moral choices, their answers often do not grow out of the reasoned application of general moral principles. Instead, they draw on emotional reactions, particularly for certain kinds of moral dilemmas.

The results also show how tools of neuroscience are beginning to reveal the biological underpinnings of the subtlest elements of human behavior, said Joshua Greene, a graduate student in philosophy who conducted the study in collaboration with scientists in the psychology department and the Center for the Study of Brain, Mind and Behavior.

"We think of moral judgments as so ethereal," said Greene. "Now we're in a position to start looking at brain anatomy and understanding how neural mechanisms produce patterns in our behavior."

The study focused on a classic set of problems that have fascinated moral philosophers for years because of the difficulty in identifying moral principles that agree with the way people react.

One dilemma, known as the trolley problem, involves the runaway train with the spur. In the other dilemma, sometimes called the footbridge problem, two bystanders are on a bridge above the tracks and the only way to save the five people is for one bystander to push the other in front of the train.

The people in the study followed the common pattern -- agreeing that it is permissible to flip a switch, but not to push a person off the bridge. This distinction has puzzled philosophers who have not been able to find a hard and fast rule to explain why one seems right and the other wrong. For each potential principle, there seems to be another scenario that undermines it.

One reason for the difficulty, said Greene, appears to be that the two problems engage different psychological processes -- some more emotional, some less so -- that rely on different areas of the brain.

"They're very similar problems -- they seem like they are off the same page -- but we appear to approach them in very different ways," said Greene.

Greene emphasized that the researchers were not trying to answer questions about what is right or wrong. Instead, given that people follow a pattern of behavior, the study seeks to describe how that behavior arises. In turn, a better understanding of how moral judgments are made may change our attitudes toward those judgments, Greene said.

Emotional response

The researchers conducted the study with two groups of nine people, who each answered a battery of 60 questions while undergoing fMRI scanning. The researchers divided the questions into personal and impersonal categories based on the general notion that the difference between the trolley and footbridge problems may have to do with the degree of personal involvement, and ultimately the level of emotional response.

Examples of impersonal moral dilemmas included a case of keeping money from a lost wallet and a case of voting for a policy expected to cause more deaths than its alternatives. The researchers also included non-moral questions, such as the best way to arrange a travel schedule given certain constraints and which of two coupons to use at a store.

The scanning consistently showed a greater level of activation in emotion-related brain areas during the personal moral questions than during the impersonal moral or non-moral questions. At the same time, areas associated with working memory, which has been linked to ordinary manipulation of information, were considerably less active during the personal moral questions than during the others.

The researchers also measured how long it took subjects to respond to the questions. In the few cases in which people said it was appropriate to take action in the personal moral questions -- such as pushing a person off the footbridge -- they tended to take longer to reach their decisions. These delays suggest that those people were working to overcome a primary emotional response, the researchers said.

Taken together, the imaging and response time results strongly suggest that emotional responses influenced moral decision-making and were not just a coincidental effect, the researchers concluded.

Professor of psychology John Darley, a co-author of the paper, said the result fits into a growing area of moral psychology that contends moral decision-making is not the strictly reasoned process it has long been believed to be. "Moral issues do not come to you with a sign saying 'I'm a moral issue; treat me in a special way,'" Darley said. Instead, they engage a range of mental processes.

New set of tools

Other co-authors on the paper are Brian Sommerville, a former research assistant now at Columbia University Medical School; Leigh Nystrom, a research scientist in psychology; and Jonathan Cohen, a professor of psychology at Princeton.

Cohen also is director of the University's newly established Center for the Study of Brain, Mind and Behavior, which houses the fMRI scanner used in the study and which seeks to combine the methods of cognitive psychology with neuroscience.

"Measuring people's behavior has served psychology well for many years and will continue to do so, but now that approach is augmented by a whole new set of tools," said Cohen.

Brain imaging allows scientists to build a catalog of brain areas and their functions, which can then be cross-referenced with behaviors that employ the same processes, Cohen said. Eventually, this combination of behavioral analysis and biological neuroscience could inform questions in fields from philosophy to economics, he said.

The current study, he said, "is a really nice example of how cognitive neuroscience -- and neuroimaging in particular -- provides an interface between the sciences and the humanities."
