October 04, 2011
Utilitarianism Embraced By Psychopaths, Rational Empaths?

If your empathy and desire to minimize suffering bring you to the same conclusions as a psychopath's, should you be alarmed? Or should psychopaths feel vindicated?

NEW YORK – September 30, 2011 – A study conducted by Daniel Bartels (Columbia Business School, Marketing) and David Pizarro (Cornell University, Psychology) found that people who endorse actions consistent with an ethic of utilitarianism—the view that the morally right thing to do is whatever produces the best overall consequences—tend to possess psychopathic and Machiavellian personality traits.

In the study, Bartels and Pizarro gave participants a set of moral dilemmas widely used by behavioral scientists who study morality, like the following: "A runaway trolley is about to run over and kill five people, and you are standing on a footbridge next to a large stranger; your body is too light to stop the train, but if you push the stranger onto the tracks, killing him, you will save the five people. Would you push the man?" Participants also completed a set of three personality scales: one for assessing psychopathic traits in a non-clinical sample, one that assessed Machiavellian traits, and one that assessed whether participants believed that life was meaningful. Bartels and Pizarro found a strong link between utilitarian responses to these dilemmas (e.g., approving the killing of an innocent person to save the others) and personality styles that were psychopathic, Machiavellian or tended to view life as meaningless.

Anyone familiar with how a Machiavellian differs from a psychopath?

Do rational empaths end up reaching the same conclusions as psychopaths but due to different emotional motivations?

These results (which recently appeared in the journal Cognition) raise questions for psychological theories of moral judgment that equate utilitarian responses with optimal morality, and treat non-utilitarian responses as moral "mistakes". The issue, for these theories, is that these results would lead to the counterintuitive conclusion that those who are "optimal" moral decision makers (i.e., who are likely to favor utilitarian solutions) are also those who possess a set of traits that many would consider prototypically immoral (e.g., the emotional callousness and manipulative nature of psychopathy and Machiavellianism).

While some might be tempted to conclude that these findings undermine utilitarianism as an ethical theory, Prof. Bartels explained that he and his co-author have a different interpretation: "Although the study does not resolve the ethical debate, it points to a flaw in the widely-adopted use of sacrificial dilemmas to identify optimal moral judgment. These methods fail to distinguish between people who endorse utilitarian moral choices because of underlying emotional deficits (like those captured by our measures of psychopathy and Machiavellianism) and those who endorse them out of genuine concern for the welfare of others." In short, if scientists' methods cannot identify a difference between the morality of a utilitarian philosopher who sacrifices her own interest for the sake of others, and a manipulative con artist who cares little about the feelings and welfare of anyone but himself, then perhaps better methods are needed.

This raises an important question: Are economists Machiavellians hiding as rational empaths? Or psychopaths who like to manipulate the minds of undergrads?

Randall Parker, 2011 October 04 06:09 AM  Brain Economics


Comments
Rob said at October 4, 2011 7:52 AM:

The difficulty with Utilitarianism is in the assessment of "good". If your goal is "the greatest good for the greatest number", then you have to have a universally agreed on way to judge "good", which we do not. In fact, it's amazing how often doing something for someone that you think is for their own good will be interpreted by that person as harm rather than good.

Dave said at October 4, 2011 8:09 AM:

http://en.wikipedia.org/wiki/Psychopathy

Psychopathy (/saɪˈkɒpəθi/[1][2]) is a mental disorder characterized primarily by a lack of empathy and remorse, shallow emotions, egocentricity, and deceptiveness.

http://en.wikipedia.org/wiki/Machiavellian#In_psychology

Machiavellianism is also a term that some social and personality psychologists use to describe a person's tendency to deceive and manipulate other people for their personal gain. In the 1960s, Richard Christie and Florence L. Geis developed a test for measuring a person's level of Machiavellianism. This eventually became the MACH-IV test, a twenty-statement personality survey that is now the standard self-assessment tool of Machiavellianism. Machiavellianism is one of the three personality traits referred to as the dark triad, along with narcissism and psychopathy. Some psychologists consider Machiavellianism to be essentially a subclinical form of psychopathy,[5] although recent research suggests that while Machiavellianism and psychopathy overlap, they are distinct personality constructs.[6]

Lono said at October 4, 2011 10:34 AM:

A very topical subject for me as I have just begun to read some of the essays published online by the UK philosopher (and utilitarianism champion) David Pearce.

As a self-diagnosed atypical Fanatic Narcissist (and an atypical atypical one - at that) I find a lot of his work compelling. And to be a strict Utilitarian I would think you would have to believe that you somehow have more insight or intelligence or "clarity of vision" than the majority of people milling about out there - so, really, there has to be a bit of Machiavellianism in such people.

I do - however - believe that augmenting such psychological testing with additional testing for empathetic ability should rather easily separate the wheat from the chaff - so to speak.

Just because the Philosopher King and the Despot share some superficial traits does not discount the usefulness of the former to foment, direct, and inspire genuine social and technological progress within their sphere of influence.

Phillep Harding said at October 4, 2011 11:02 AM:

"This raises an important question: Are economists Machiavellians hiding as rational empaths? Or psychopaths who like to manipulate the minds of undergrads?"

Economics once used accounting to keep track of the effectiveness of various hypotheses, and thus offended those who were inclined toward a "so much the worse for the facts" attitude. The sort who cannot stand to be proven wrong.

So, I would add at least one more possibility: that some economists are twits who like to build structures of "pure" thought. Structures that are not falsifiable.

spindizzy said at October 4, 2011 11:06 AM:

> the morality of a utilitarian philosopher who sacrifices her own interest for the sake of others

I wince whenever I read these mealy-mouthed politically-correct uses of the female pronoun, most of all when employed in such a gender-stereotypical way.

In any case, utilitarianism forbids its practitioners from sacrificing their own interest just as much as it forbids them from prioritising it.

Although it has its problems, I like the core ideas of utilitarianism. At least it is not logically contradictory like most moral "theories"...

I guess that's why people hate it so much.

Michael B. said at October 4, 2011 3:47 PM:

"This raises an important question: Are economists Machiavellians hiding as rational empaths? Or psychopaths who like to manipulate the minds of undergrads?"

Hahahahahahah. As someone who majored in economics in college, I found this hilarious.

bbartlog said at October 4, 2011 7:19 PM:

Utilitarianism is an attempt to avoid difficult choices by pretending that all human experiences are somehow commensurable. Even if you could make an argument for that (I find it laughable), it would still fail as a practical philosophy due to the difficulty of measurement.

Fat Man said at October 4, 2011 9:23 PM:

"people who endorse actions consistent with an ethic of utilitarianism—the view that what is the morally right thing to do is whatever produces the best overall consequences—tend to possess psychopathic and Machiavellian personality traits."

More garbage psychology. First, the ability to say anything from this type of set-up is extremely limited. Providing the most typical test subjects (undergraduates trying to make a couple of bucks) with this type of problem is just asking to get less than serious answers. Second, what people say and what they actually do are often two completely different things. Unconstrained BS is just that. A truly Machiavellian subject would, of course, dissemble in order to gain power over the researcher. I wonder how the researchers adjust for that.

BTW, the PI is a professor of marketing. It is not like he is a real climate scientist.

Ignoto said at October 6, 2011 3:03 AM:

"In short, if scientists' methods cannot identify a difference between the morality of a utilitarian philosopher who sacrifices her own interest for the sake of others, and a manipulative con artist who cares little about the feelings and welfare of anyone but himself"

In most scientific endeavours, where the science cannot measure a difference, there is no scientific difference.

Brett Bellmore said at October 6, 2011 3:58 AM:

"Although it has its problems, I like the core ideas of utilitarianism. At least it is not logically contradictory like most moral "theories"..."

No, it would have to be considerably more like a theory which could actually be applied in the real world in order to suffer such problems. It avoids such conflicts not by being better than other moral theories, but by being less feasible.

1. No way of objectively measuring "utility". (Which is presumed for convenience to be a simple number, rather than complex.)
2. No basis for determining how utility is added across different individuals. (A scalar? A vector sum with each individual their own dimension?)
3. Even if you satisfied 1 and 2, no actual way of collecting the data. (You're not omniscient, but the theory demands you behave as though you were.)
4. Even if 1-3 were satisfied, the correct course of action would be incomputable in any feasible amount of time.

Essentially, the attraction of utilitarianism is that, since nobody can ACTUALLY apply the theory, you're free to imagine that it works better than moral theories people can really try to put into practice. But that's all you're doing, imagining. The theory boils down to, "Imagine a perfect world. Do whatever leads to it. Pretend you're doing math in between."

And how can that lead to contradictions?

Russ said at October 6, 2011 7:49 AM:

Are the five people idiots? Is the one large guy somebody special? Are we assuming that all people are equally valuable?
These questions (and the theories built upon them) never seem to accept the value of context in decision-making.

spindizzy said at October 6, 2011 10:43 AM:

> 1. No way of objectively measuring "utility". (Which is presumed for convenience to be a simple number, rather than complex.)

Different forms of utilitarianism can have different utility functions. Some forms are more easily quantifiable than others.

For example, one form of utilitarianism might equate economic productivity of a group with moral virtue.

Of course, this might not accord with your instincts about morality, but the point of moral theories is to inform your decisions, not confirm your prejudices (a point lost on many popular philosophers).

> 2. No basis for determining how utility is added across different individuals. (A scalar? A vector sum with each individual their own dimension?)

Since ranking is required, scalar quantities are obviously implied. The scale need not be linear.

> 3. Even if you satisfied 1 and 2, no actual way of collecting the data. (You're not omniscient, but the theory demands you behave as though you were.)

It depends on the utility function, as in point 1.

> 4. Even if 1-3 were satisfied, the correct course of action would be incomputable in any feasible amount of time.

Lots of problems are too hard for optimal solutions, which is why we use heuristics.

My point is: utilitarianism is to morality as science is to religion... imperfect, but preferable to mysticism and hand-waving.
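[Editor's note: a minimal Python sketch of the aggregation point spindizzy makes above. Every detail here is hypothetical and invented purely for illustration: the two candidate actions, the per-person utility numbers, and the square-root transform are not from the study or the thread. It simply shows that ranking outcomes only requires collapsing per-person utilities into one scalar, and that the collapse need not be linear.]

    import math

    def aggregate(per_person_utilities, transform=math.sqrt):
        # Collapse per-person utilities into one scalar via a (possibly
        # nonlinear) transform before summing. The choice of transform is
        # exactly the kind of assumption Brett's points 1 and 2 object to.
        return sum(transform(u) for u in per_person_utilities)

    # Hypothetical per-person utilities for two candidate actions.
    outcomes = {
        "do_nothing": [0.0, 0.0, 0.0, 0.0, 0.0],  # five people harmed
        "intervene":  [9.0, 9.0, 9.0, 9.0, 0.0],  # four spared, one harmed
    }

    for name, utils in outcomes.items():
        linear = aggregate(utils, transform=lambda u: u)   # linear scale
        concave = aggregate(utils, transform=math.sqrt)    # diminishing returns
        print(f"{name}: linear={linear:.1f}, concave={concave:.1f}")

    best = max(outcomes, key=lambda k: aggregate(outcomes[k]))
    print("preferred action under this toy utility function:", best)

[Swapping in a different transform, or a different utility function entirely, can change which action ranks highest; that choice is where the commenters actually disagree, and the code cannot settle it.]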

solaris said at October 6, 2011 1:18 PM:

In the split second you have to make a decision, you're not going to have time to engage in any complicated moral calculus. So what I would do, and what everybody else would do, is to yell "Look out!".

They load the dice with their "large man" whom we are encouraged to sacrifice. It could just as easily be a woman in that position, but that would not result in the same responses.

The "rational empath" might well conclude that the best thing to do is to throw his own body on the track and save all the other people. Again, they did not want to ask their respondents to have to make THAT decision, even though it would bring out the true distinction between the rational empath and the psychopath.

solaris said at October 6, 2011 1:27 PM:

>"utilitarianism is to morality as science is to religion... imperfect, but preferable to mysticism and hand-waving."

Science is to religion as a cabbage is to a brick. They are two different things with two different uses. The same is true for utilitarianism and morality. None are "preferable" to the others in principle, though one may be preferable to the others under certain circumstances.

You are engaging in scientism, something which Hayek spent several books criticizing.

spindizzy said at October 7, 2011 1:27 PM:

> They are two different things with two different uses.

I agree, but in the context of my statement it is obvious that I am referring to religion within the narrow domain where it competes with scientific theory.

> The same is true for utilitarianism and morality.

And what is the use for morality?

> You are engaging in scientism, something which Hayek spent several books criticizing.

Not at all, and please do not attempt to straw-man my arguments. To clarify, I am agnostic about the existence of moral facts. However, I cannot see the value in a moral theory which exists as a retrospective justification of what its practitioners intended to do all along.

Avenist said at October 7, 2011 3:24 PM:

My thinking is that morality is an adaptation of the survival instinct that benefits the culture in which it developed. The fact that morality originated to benefit an in-group makes it possible for it to be co-opted and used against an out-group. It's been a very successful technique for governments, and I consider it the basis for the greatest evil of all: collective evil.

As for the Cornell study, I think psychopathy is a part of the social brain and psychopaths are just high-functioning variants. Their version of the Trolley Problem may have selected for the psychopaths.

And I think it's wrong to throw the switch or to pitch the fat man.


http://www.nytimes.com/2008/01/13/magazine/13Psychology-t.html?pagewanted=print

The gap between people’s convictions and their justifications is also on display in the favorite new sandbox for moral psychologists, a thought experiment devised by the philosophers Philippa Foot and Judith Jarvis Thomson called the Trolley Problem. On your morning walk, you see a trolley car hurtling down the track, the conductor slumped over the controls. In the path of the trolley are five men working on the track, oblivious to the danger. You are standing at a fork in the track and can pull a lever that will divert the trolley onto a spur, saving the five men. Unfortunately, the trolley would then run over a single worker who is laboring on the spur. Is it permissible to throw the switch, killing one man to save five? Almost everyone says “yes.”

Consider now a different scene. You are on a bridge overlooking the tracks and have spotted the runaway trolley bearing down on the five workers. Now the only way to stop the trolley is to throw a heavy object in its path. And the only heavy object within reach is a fat man standing next to you. Should you throw the man off the bridge? Both dilemmas present you with the option of sacrificing one life to save five, and so, by the utilitarian standard of what would result in the greatest good for the greatest number, the two dilemmas are morally equivalent. But most people don’t see it that way: though they would pull the switch in the first dilemma, they would not heave the fat man in the second. When pressed for a reason, they can’t come up with anything coherent, though moral philosophers haven’t had an easy time coming up with a relevant difference, either.

When psychologists say “most people” they usually mean “most of the two dozen sophomores who filled out a questionnaire for beer money.” But in this case it means most of the 200,000 people from a hundred countries who shared their intuitions on a Web-based experiment conducted by the psychologists Fiery Cushman and Liane Young and the biologist Marc Hauser. A difference between the acceptability of switch-pulling and man-heaving, and an inability to justify the choice, was found in respondents from Europe, Asia and North and South America; among men and women, blacks and whites, teenagers and octogenarians, Hindus, Muslims, Buddhists, Christians, Jews and atheists; people with elementary-school educations and people with Ph.D.’s.

Joshua Greene, a philosopher and cognitive neuroscientist, suggests that evolution equipped people with a revulsion to manhandling an innocent person. This instinct, he suggests, tends to overwhelm any utilitarian calculus that would tot up the lives saved and lost. The impulse against roughing up a fellow human would explain other examples in which people abjure killing one to save many, like euthanizing a hospital patient to harvest his organs and save five dying patients in need of transplants, or throwing someone out of a crowded lifeboat to keep it afloat.

By itself this would be no more than a plausible story, but Greene teamed up with the cognitive neuroscientist Jonathan Cohen and several Princeton colleagues to peer into people’s brains using functional M.R.I. They sought to find signs of a conflict between brain areas associated with emotion (the ones that recoil from harming someone) and areas dedicated to rational analysis (the ones that calculate lives lost and saved).

When people pondered the dilemmas that required killing someone with their bare hands, several networks in their brains lighted up. One, which included the medial (inward-facing) parts of the frontal lobes, has been implicated in emotions about other people. A second, the dorsolateral (upper and outer-facing) surface of the frontal lobes, has been implicated in ongoing mental computation (including nonmoral reasoning, like deciding whether to get somewhere by plane or train). And a third region, the anterior cingulate cortex (an evolutionarily ancient strip lying at the base of the inner surface of each cerebral hemisphere), registers a conflict between an urge coming from one part of the brain and an advisory coming from another.

But when the people were pondering a hands-off dilemma, like switching the trolley onto the spur with the single worker, the brain reacted differently: only the area involved in rational calculation stood out. Other studies have shown that neurological patients who have blunted emotions because of damage to the frontal lobes become utilitarians: they think it makes perfect sense to throw the fat man off the bridge. Together, the findings corroborate Greene's theory that our nonutilitarian intuitions come from the victory of an emotional impulse over a cost-benefit analysis.

Steven Pinker


spindizzy said at October 8, 2011 9:31 AM:

> And I think it's wrong to throw the switch or to pitch the fat man.

I am interested in your perspective.

How many lives would need to be saved to justify the sacrifice of one individual? If not five, how about twenty, or one hundred, or one thousand, or one million?

And how much involvement could you accept in his sacrifice? How about if the fat man was going to fall of his own accord unless you grabbed his hand, or unless you shouted a warning?

Would you sacrifice your own life to save others? Would you be willing to persuade someone else to sacrifice their life to save others? Would you consider someone culpable if they did not make such a sacrifice?

solaris said at October 9, 2011 3:03 AM:

>"And what is the use for morality?"

Other than keeping you alive?


>"To clarify, I am agnostic about the existence of moral facts."

What about the moral fact which is you? Are you agnostic about that?


>"I cannot see the value in a moral theory which exists as a retrospective justification of what its practitioners intended to do all along."

And you have the audacity to complain about what you think are other people's straw-man arguments ...

Avenist said at October 10, 2011 12:46 PM:

"I am interested in your perspective.'

In both actions you murder an innocent person. If you want to sacrifice yourself, fine. If you want to sacrifice someone else, I wouldn't consider you human. Other people are not your property, nor the property of the collective.

"How many lives would need to be saved to justify the sacrifice of one individual? If not five, how about twenty, or one hundred, or one thousand, or one million?"

I'll play your silly game. What if the person, or one of the people you would sacrifice, turns out to be a young Norman Borlaug, who would grow up and save perhaps a billion lives?

"And how much involvement could you accept in his sacrifice? How about if the fat man was going to fall of his own accord unless you grabbed his hand, or unless you shouted a warning?"

Not sure I understand your point but I would not want to be involved in taking his life.

"Would you sacrifice your own life to save others? Would you be willing to persuade someone else to sacrifice their life to save others? Would you consider someone culpable if they did not make such a sacrifice?"

No, no, and no.

Are you an organ donor volunteer? Do you contribute generously to third world disaster relief funds? Governments of the world killed about 400 million people during the 20th century; are you actively against government? Do you drive when drunk? Tired? Distracted?

spindizzy said at October 11, 2011 3:06 PM:

> Other people are not your property, nor the property of the collective.

This is exactly the sort of issue which a moral theory is supposed to determine. You cannot hide your conclusion in your premises.

Also, please avoid personal insults and emotive language. It does not strengthen your argument. I assure you that my questions are a sincere attempt to understand your point of view.

> What if the person, or one of the people you would sacrifice, turns out to be a young Norman Borlaug, who would grow up and save perhaps a billion lives?

But this hypothetical savior is more likely to be among the million lives I saved than the single one you saved.

> Not sure I understand your point but I would not want to be involved in taking his life.

In these scenarios, you are involved whether you like it or not... the only question is whether you opt for one death or many.

If I understand your view correctly, you feel that there is a moral distinction between a doctor who refuses the antidote to a patient who has been poisoned and a doctor who administers poison to a healthy patient. On the other hand, I consider them essentially the same. Is this an accurate summary?

> No, no, and no.

You consider self-sacrifice for the good of others to be immoral? Please clarify since that is an unusual position.

> Are you an organ donor volunteer?

No, but primarily from laziness. I suspect it would be moral to donate.

> Do you contribute generously to third world disaster relief funds?

No, from skepticism and also laziness.

I assume your point is that my failure to act makes me, by my view, a murderer. I accept that conclusion. Like most people who are not totally powerless, I am probably responsible for the deaths of others. This knowledge does not especially bother me.

> Governments of the world killed about 400 million people during the 20th century, are you actively against government?

Your point is tendentious.

Randall Parker said at October 12, 2011 9:57 PM:

Avenist,

Regards Norman Borlaug and food supply: More people are hungry today than were alive in 1800. Technology has not reduced the total amount of world hunger.

Avenist said at October 16, 2011 4:58 PM:

Randall Parker,

Hungry and starving are two different things.

Randall Parker said at October 17, 2011 6:36 PM:

Avenist,

Are you saying more people starved to death in 1800? Not sure if that is true. Back then hunger more likely translated into death from disease long before it caused death from starvation.

One thing that has changed: Living standards in Africa have gone down as compared to 200 years ago. Drugs and vaccines reduce death from disease. So people end up living at lower living standards. Death from disease used to boost living standards of the survivors by reducing the competition for food.
