Joshua Greene: Moral Tribes

Human morality

The Tragedy of Commonsense Morality is real.

Nearly all of us are collectivists to some extent.

At least in the developed world, tribes generally refrain from imposing their most arbitrary rules on one another. 

Understanding morality requires two things:

  • We must understand the structure of modern moral problems.
  • We must understand the structure of our moral brains and how different kinds of thinking are suited to solve different kinds of problems.

The two kinds of moral problems:

  • Me versus Us: selfishness versus concern for others.
  • Us versus Them: our interests and values versus theirs. This is the Tragedy of Commonsense Morality.

Automatic settings are effective but inflexible. Manual mode is flexible but inefficient.

A more apt name for utilitarianism is deep pragmatism. Deep pragmatism is about making principled compromises.

Moral Problems

The Tragedy of the Commons

In 1968, Garrett Hardin published a classic paper entitled “The Tragedy of the Commons”. It illustrates the problem of cooperation: getting collective interests to triumph over individual interests.
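To make the tension concrete, here is a minimal sketch (all numbers are illustrative assumptions, not taken from Hardin's paper): each extra animal benefits its owner in full, while the crowding cost of overgrazing is shared by everyone using the common pasture.

```python
# Illustrative Tragedy of the Commons: the gain from each extra animal is
# private, but the overgrazing it causes is shared by all herders.
# All numbers below are assumptions chosen for illustration.

def value_per_animal(total_animals):
    """Grass per animal falls as the common pasture gets more crowded."""
    return max(0.0, 12 - 0.5 * total_animals)

def herder_payoff(my_animals, total_animals):
    return my_animals * value_per_animal(total_animals)

# Ten herders grazing one animal each: every herder earns 7.
print(herder_payoff(1, 10))   # 7.0

# If I alone add a second animal, I nearly double my payoff...
print(herder_payoff(2, 11))   # 13.0

# ...but if all ten herders follow the same individually rational logic,
# the pasture is overgrazed and everyone ends up worse off than before.
print(herder_payoff(2, 20))   # 4.0
```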

Morality evolved as a solution to the problem of cooperation, as a way of averting the Tragedy of the Commons.

Our brains did evolve for cooperation within the group, but not between groups.

Two moral tragedies are:

  • The original tragedy: The Tragedy of the Commons.
  • The modern tragedy: The Tragedy of Commonsense Morality.  

Morality evolved to avert the Tragedy of the Commons, but it did not evolve to avert the Tragedy of Commonsense Morality.

The idea of metamorality is not entirely new.

Moral Machinery

Morality is a collection of devices, a suite of psychological capacities and dispositions that together promote and stabilize cooperative behavior.

The Prisoner’s Dilemma, like the Tragedy of the Commons, involves a tension between individual interest and collective interest.

We humans have feelings that do the thinking for us.

Our capacity for forgiveness, which tempers our negative reactive emotions, has deep biological roots, following the logic of reciprocity in an uncertain world.

Our basic decency extends beyond nonaggression to positive acts of kindness.

According to Robin Dunbar, humans spend about 65 percent of their conversation time talking about the good and bad deeds of other humans – that is, gossiping.

At the age of six months, long before they can walk or talk, human infants are making value judgments about actions and agents.

A common problem is that all cooperative groups must protect themselves from exploitation.

We favor people closer to us. Call this tendency tribalism, which is sometimes known as parochial altruism.

In the modern world, one of the most salient delimiters of Us versus Them is race.

Numerous lab experiments confirm that people are, indeed, pro-social punishers. Ernst Fehr and Simon Gächter ran experiments using the “Public Goods Game”. People are given money and may put some of it into a common pool; the money in the pool is multiplied and then shared equally. The group does best if everyone contributes everything, but each individual does best by keeping their money and free riding on the contributions of others. When people start to free ride, cooperation collapses. If there is a chance to punish free riders, people will punish them, even at a cost to themselves.
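A minimal sketch of the payoff structure (the group size, endowment, and multiplier are assumptions for illustration, not the parameters Fehr and Gächter actually used):

```python
# Illustrative Public Goods Game: each player gets an endowment, contributes
# some of it to a common pool, and the pool is multiplied and split equally.
# Parameters are assumptions chosen for illustration only.

ENDOWMENT = 20
MULTIPLIER = 1.6   # pool is multiplied by this before being shared


def payoffs(contributions):
    """Each player's earnings: what they kept plus an equal share of the pool."""
    pool = sum(contributions) * MULTIPLIER
    share = pool / len(contributions)
    return [ENDOWMENT - c + share for c in contributions]


# The group does best when everyone contributes everything...
print(payoffs([20, 20, 20, 20]))   # [32.0, 32.0, 32.0, 32.0]

# ...but a free rider who contributes nothing earns the most individually,
# while dragging the cooperators' earnings down.
print(payoffs([20, 20, 20, 0]))    # [24.0, 24.0, 24.0, 44.0]

# In the punishment condition, players can pay a small fee to fine free
# riders, which is what keeps contributions from collapsing.
```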

For each cooperative strategy, our moral brains have a corresponding set of emotional dispositions that execute the strategy.

  • Concern for others. Corresponding to this strategy, humans have empathy. The author calls this minimal decency.
  • Direct reciprocity. Corresponding to this strategy, humans have anger and disgust; the positive incentive is gratitude.
  • Commitments to threats and promises. We are often vengeful. But we are also honorable and loyal.
  • Reputation. We are judgmental. And we can get visibly embarrassed.
  • Assortment. We are tribalistic.
  • Indirect reciprocity. We are social punishers.

In cooperation experiments, the faster people decided, the more they cooperated. Built into our moral brains are automated psychological programs that enable and facilitate cooperation.

Strife on the New Pastures

Humans nearly always put Us ahead of Them. Beyond tribalism, groups have genuine differences in values, disagreements concerning the proper terms of cooperation.

Joseph Henrich and colleagues ran the Ultimatum Game across many societies. Two people are given some money. One person proposes how to split it. If the other accepts, the money is divided as proposed. If he or she refuses, neither gets anything.

In the Dictator Game, the person with the money has full control: they simply offer whatever they want, and the other person cannot refuse.
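The difference between the two games comes down to their payoff rules. A minimal sketch (the stake and the example offers are illustrative assumptions):

```python
# Illustrative payoff rules for the Ultimatum and Dictator games.
# The stake and the example offers are assumptions for illustration.

STAKE = 100


def ultimatum(offer, responder_accepts):
    """Proposer keeps STAKE - offer if accepted; a rejection wipes out both payoffs."""
    if responder_accepts:
        return STAKE - offer, offer
    return 0, 0


def dictator(offer):
    """The recipient has no say: the dictator keeps whatever is not offered."""
    return STAKE - offer, offer


print(ultimatum(40, True))   # (60, 40)
print(ultimatum(5, False))   # (0, 0)  rejecting a stingy offer costs both players
print(dictator(0))           # (100, 0) nothing stops the dictator from keeping it all
```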

In Western nations, the average Ultimatum Game offer is about 44 percent. In the Public Goods Game, average contributions are between 40 and 60 percent, with people typically contributing either nothing or everything. In the Dictator Game, offers are typically 50 percent or nothing, consistent with behavior in Public Goods Games.

Religious moral values and local moral values are intimately related.

People have biased fairness. Our tribal allegiances can make us disagree about the facts.

Six psychological tendencies that exacerbate intertribal conflict:

  • Human tribes are tribalistic, favoring Us over Them.
  • Tribes have genuine disagreements about how societies should be organized, emphasizing, to different extents, the rights of individuals versus the greater good of the group.
  • Tribes have distinctive moral commitments.
  • Tribes, like individuals within them, are prone to biased fairness.
  • Tribal beliefs are easily biased.
  • The way we process information about social events can cause us to underestimate the harm we cause others.

Morality – Fast and Slow

Trolleyology

Jeremy Bentham and John Stuart Mill developed utilitarianism. It is a great idea with an awful name. It says that we should do whatever will produce the best overall consequences for all concerned.

The author liked utilitarianism because it gave him a value premise that itself allowed for the balancing of values.

The most common complaint about utilitarianism is that it undervalues people’s rights, that it allows us to do things to people that are fundamentally wrong, independent of the consequences.

When, and why, do the rights of the individual take precedence over the greater good?

The Trolley Problem. The Switch Problem. The Footbridge Problem. They are variations of the same issue. It was Kant versus Mill, all in one neat little puzzle.

The dual-process theory of moral judgment is about two moralities, one abstract and one sympathetic. When we give controlled responses to moral questions, we often need to override the emotional impulse.

Removing time pressure and encouraging deliberation increases utilitarian judgment.

We have dual-process moral brains, with separate automatic and controlled responses to moral questions. In an ideal world, moral intuition is all you need, but in the real world there are benefits to having a dual-process brain.

Efficiency, Flexibility, and the Dual-Process Brain

The human brain is like a dual-mode camera with both automatic settings and a manual mode. Emotions are automatic processes. They are devices for achieving behavioral efficiency. Emotions have action tendencies.

We need our emotional automatic settings, as well as our manual mode, and we need them for different things.

Common Currency

A Splendid Idea

Metamorality is a high-level moral system that adjudicates among competing tribal moralities, just as a tribe’s morality adjudicates among competing individuals.

Collectivism is simply unjust, punishing the best people while rewarding the worst.

The idea that we should do whatever works best sounds a lot like pragmatism in the colloquial sense.

Utilitarianism is about core values. It’s about taking pragmatism all the way down to the level of first principles.

One’s happiness is the overall quality of one’s experience, and to value happiness is to value everything that improves the quality of experience for oneself and for others – and especially for others whose lives leave much room for improvement.

Happiness is the common currency of human values.

The second defining feature of utilitarianism, in addition to its focus on experience, is that it’s impartial. Everyone’s happiness counts the same.

According to utilitarianism, what ultimately matters is the quality of our experience.

The Tragedy of the Commons is a tragedy of selfishness, but the Tragedy of Commonsense Morality is a tragedy of moral inflexibility.

In Search of Common Currency

Many contemporary moral thinkers believe that morality is, at its core, about rights.

The three approaches are: the religious model, the mathematical model and the scientific model.

The author claims that utilitarianism becomes attractive once our moral thinking has been objectively improved by a scientific understanding of morality.

Common Currency Found

What we have is only a default commitment to maximizing happiness, reflected in the qualifier “if all else is equal”. If we drop the “if all else is equal”, we get utilitarianism.

Our moral brains evolved to help us spread our genes, not to maximize our collective happiness.

If the author is right, utilitarianism is the native philosophy of the human manual mode, and all of the objections to utilitarianism are ultimately driven by automatic settings.

Solving a behavioral problem is about realizing a goal state. The manual mode’s job is to realize goal states, to produce desired consequences.

A problem solver begins with an idea (a representation) of how the world could be and then operates (behaves) on the world so as to make the world that way.

The human manual mode, housed in the prefrontal cortex (PFC), is a general-purpose problem solver, an optimizer of consequences. It is, by nature, a cost-benefit reasoning system that aims for optimal consequences. Optimal for whom? What counts as optimal for a given person?

The ideal of impartiality. Only creatures with a manual mode can grasp the ideal of impartiality.

Utilitarianism can be summarized in three words: Maximize happiness impartially.

Moral Convictions

Alarming Acts

Two general strategies: accommodation and reform.

The world’s moral tribes have different versions of common sense – hence the Tragedy of Commonsense Morality.

Our automatic settings, our moral intuitions, can fail us in two ways: they can be oversensitive or they can be undersensitive.

It seems that one important psychological difference between the footbridge and switch cases has to do with the “personalness” of the harm and, more specifically, with personal force – pushing versus hitting a switch. It is also about harm caused as a means to an end versus harm caused as a side effect.

The means/side-effect distinction has a long history in philosophy, going back at least as far as St. Thomas Aquinas, who framed the “Doctrine of Double Effect”, which is essentially a doctrine of side effects. Kant says that the moral law requires us to treat people “always as an end and never as a means only”.

The modular myopia hypothesis synthesizes the dual-process theory of moral judgment with a theory about how our minds represent actions. It explains why we care less about harms caused as side effects.

The idea that we can be emotionally blind but not cognitively blind should sound familiar.

Author’s hypothesis is that the myopic module is an action-plan inspecting system.

The Doctrine of Doing and Allowing says that harms caused by actions, by things that we actively do, are worse than harms of omission.

In evaluating people, it makes sense to take the action/omission distinction, the means/side-effect distinction, and the personal-force/no-personal-force distinction seriously.

Justice and Fairness

What should we be aiming for?

It is surprisingly hard to justify treating the nearby drowning child and the faraway starving child differently. But physical distance is an important factor.

Utilitarianism is firm but reasonable in practice, accommodating our needs and limitations.

It is not reasonable to expect actual humans to put aside nearly everything they love for the sake of the greater good.

For utilitarians, punishment is just a necessary evil. We are natural punishers because punishment serves a social function. In some cases, our punishment judgments are clearly irrational.

People confuse utility with wealth, and this makes maximizing utility seem less attractive, and perhaps unjust. Utility is closely related to stuff, but it is not itself stuff.

Moral Solutions

Deep pragmatism

We all believe that what we want is for the best. We need a shared moral standard, what the author calls a metamorality.

The essence of deep pragmatism is to seek common ground not where we think it ought to be, but where it actually is.

Properly understood utilitarianism is deep pragmatism.

Rationalization is the great enemy of moral progress, and thus of deep pragmatism.

If the author is right, rights and duties are the manual mode’s attempt to translate elusive feelings into more object-like things that it can understand and manipulate.

Some moral judgments really are common sense. Common doesn’t mean universal. It means common enough for practical, political purposes.

Morality is not what generations of philosophers and theologians have thought it to be. Morality is not a set of freestanding abstract truths that we can somehow access with our limited human minds.

We can describe our tribal automatic settings (Aristotle), and we can attempt to prove that they are correct (Kant). Instead of trusting our tribal moral sensibilities, or rationalizing them, we can seek agreement in shared values, using a system of common currency.

Jonathan Haidt: The Righteous Mind. In the book, Haidt argues that we could get along better if we were less self-righteous. We’re very good at seeing through our opponents’ moral rationalizations, but we need to get better at seeing through our own.

Haidt identifies six moral foundations, which can be labeled in positive or negative terms: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, sanctity/degradation and liberty/oppression.

Beyond Point-and-Shoot Morality: Six Rules for Modern Herders

When it is a Me-versus-Us problem, think fast. When it is an Us-versus-Them problem, think slow.

Six rules for modern herders:

  • In the face of moral controversy, consult, but do not trust, your moral instincts.
  • Rights are not for making arguments; they’re for ending arguments.
  • Focus on the facts, and make others do the same.
  • Beware of biased fairness.
  • Use common currency.
  • Give.
