Among law students, it’s apparently taboo to say you want to be a judge. Why this is, I don’t know; the nature of a taboo is that you don’t ask why it’s taboo. But it seems that the very desire to be a judge is at odds with a judge’s temperament: cool, disinterested, objective. If you want to be a judge, you shouldn’t be a judge.
This dilemma isn’t limited to law schools. Like ambitious law students, we all have faith in our moral appraisals; we believe we can step outside our ideologies to make tough, objective judgments based solely on The Facts. Secretly, we all believe we would make pretty good judges. And we are all wrong.
Recent work on moral psychology shows how far we are from the coldly rational observers prized by antiquity and the Enlightenment. Our moral and political minds are less like judges and more like lawyers: instead of weighing the possible answers to a moral problem, we lead with our conclusions and then marshal arguments to support them.
Renowned social psychologist Jonathan Haidt ’85 has done much to demonstrate this aspect of our nature. By posing moral dilemmas — whether incest is wrong, for example — to experimental subjects and timing their responses, Haidt showed that our moral judgments occur far too quickly to be reasoned and deliberative. Instead, moral judgments are delivered via affective flashes: the instantaneous, emotional feeling that an act is wrong or right. By this account, making a moral judgment is more like tasting something than solving a math problem — an analogy first made in David Hume’s “A Treatise of Human Nature.”
When asked why they thought a given act was right or wrong, though, Haidt’s subjects didn’t respond, “It just felt that way.” They defended their judgment with reasons — but again, those judgments occurred too quickly for such reasoning to have come first. Instead, as Haidt puts it, his subjects made their judgments first and then engaged in often-tortured “strategic reasoning” to justify them.
Haidt’s findings explain why moral arguments are so frustrating: our moral perceptions don’t rest on arguments, but give rise to them. When faced with an ethical dilemma, we think like lawyers, starting with the answer and then working backwards to find the best arguments for it.
Reflecting on my own moral judgments, I find Haidt’s account deeply plausible. And I believe that just such a process was at work in liberals’ collective response to Rachel Dolezal, the NAACP activist who identifies and presents herself as black despite having been born to a white family. Many people I knew dismissed Dolezal as a hack but proudly supported Caitlyn Jenner. My own reaction was similar. And yet, it’s difficult to articulate the putative difference between being transgender and being transracial: both race and gender, we are told, are social constructs that can either privilege or oppress. So it seems to me that our collective judgment of Dolezal resulted not from a principled objection, but from a shared intuition that what she did was wrong. Now she’s a laughingstock and Caitlyn Jenner is a hero — but I’m not quite sure why.
The dangers of intuitive moral judgment are compounded by another human tendency: We internalize information far more readily when we agree with it than when we don’t. This means that liberals and conservatives often move further apart in their attitudes when exposed to the same facts — the implication being that, thanks to human self-deception, the same evidence can be used to support opposite positions. Like lawyers, we look only for information that supports our own argument.
None of this is particularly surprising. But combined with our intuitive moral judgments, our preference for comforting information can trap us in an ethical and political echo chamber of snap judgments and one-sided arguments.
The upshot of all this? Outrage is easy. Our minds are Manichaean, starting with conclusions that feel deeply correct and then accumulating evidence until it seems that only a villain could disagree. But righteous conviction does not lead to clear moral thinking. Rather, as the philosopher Jacques Derrida suggested, a truly moral mind is an anxious one: doubtful of impulses and gut feelings, painfully aware of the possibility of error.
And yet we can’t discard our impulses entirely. At the very least, intuitions can provide a valuable starting point in our moral thinking. And doesn’t the very notion of morality stem from the universal gut feeling, common to all people, that some things are right and others wrong?
This is where Haidt’s analysis falters. He believes that, because moral perception is intuitive, moral reasoning is almost impossible. But our own experiences tell us this isn’t true. We’ve all had moments when we realized that one of our deeply held beliefs was not worth holding onto. The unsettling question is: What about all the others?
David Whipple is a senior in Pierson College and a former Weekend editor. Contact him at firstname.lastname@example.org .