Thinking, Fast and Slow. (And 2 worries.) Series 4.2.

Some of my books have well-worn pages; others are falling apart from use. The books I use heavily are usually those I think worth giving to others. And so I do. I browse used-book shops and thrift stores to stock my library with multiple copies. This way, I often have a book on hand to share. If I were to hand you a book from my shelf right now, I’d reach for Daniel Kahneman’s Thinking, Fast and Slow.

Kahneman explains not only how and why we’re prone to errors (and biases) in our judgments but also describes in detail the errors we’re most likely to make, such as jumping to conclusions. Let’s take a quick look at this potential cause of error. (I emphasize potential since many of our errors arise from processes that usually work quite well for us. If they didn’t, we wouldn’t be here to tell the tale!)

As Kahneman notes, our tendency to arrive at conclusions with little supporting evidence has physiological underpinnings that allow us to make accurate judgments with a minimal expenditure of energy. The catch is that this mode of judging is most reliable in familiar environments where the risks associated with the odd error are low. I’ll use my own examples here. Think of the social faux pas. Since Anna uses some four-letter words, I assume it’s okay to drop the F-bomb. I do so and offend Anna. The error causes some mild discomfort for Anna and a little embarrassment for me. But we’re still friends. Since Anna is my friend, I already had a good notion she’d forgive this faux pas, as she’s forgiven others I’ve made.

In unfamiliar environments where the stakes are high, jumping to conclusions can be disastrous. A con man knows this full well and exploits it to his advantage. He plays on his good looks (or nice voice) and a good story to gain your trust and empty your savings. Why do we fall for the con man’s scam? He seems nice, you think, and nice people wouldn’t take advantage of me. And so when the “nice” con man tells you a plausible story to boot, you’re sold! You’ve given him money. We’re mistaken to believe that we’re not susceptible (not I!) to the con man’s wiles, that only the weak and foolish are vulnerable. As the Travis Tritt song goes, we think of ourselves as ten feet tall and bulletproof. Intellectually, at least. But this overconfidence is, as Kahneman drives home, also a source of error.

The upshot is that our cognitive errors (and biases) aren’t features of untrained, ignorant, or morally reprehensible people, but rather occur as a result of regularities in the human cognitive/perceptual system — a system that most often works incredibly well!

The most highly skilled and conscientious among us might be better equipped to recognize and avoid certain errors, but none of us escapes making them. Fortunately, as noted, these inevitable glitches usually carry few serious consequences. But sometimes our errors in judgment can be very costly — for ourselves as well as others; e.g. the Titanic. The problem, as Kahneman explains, is that it can be very hard — even impossible — to catch ourselves in the errors that lead to poor judgments. But this is not an entirely hopeless situation. With the aid of his book and some dedicated practice, we might correct ourselves before we do some damage. However, particularly for high-stakes decisions, relying solely on one’s own judgment is an unwarranted gamble. The most reliable way to avoid mistakes is to seek an outside view from someone who is also trained to recognize systematic errors. As Kahneman notes, what we humans are good at is noticing when others are "walking into a cognitive minefield." And not only are we good at detecting others’ impending mistakes, we’re also motivated to notice. It’s fun to point out others’ gaffes! Hence Kahneman orients Thinking, Fast and Slow to "water-cooler" gossip, since "it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than recognize our own" (pp. 3-4, 417-481).

Daniel Kahneman’s overarching goal in Thinking, Fast and Slow is not only to provide people the tools to identify cognitive/perceptual errors (and biases), but also to give each of us a language with which to use those tools to offer “fair and sophisticated criticism” to others. And it’s this last point that warrants some criticism of its own. I’ll explain.

As much as I recommend everyone read Thinking, Fast and Slow, I’m left with two worries.

1) We’re not always willing to take correction from others, especially from those we don’t like. “Fair and sophisticated criticism” — however well-intentioned, polite, clearly stated, and accurate — uttered from the lips of someone we find morally reprehensible or intellectually inferior (or both) is often very hard to accept. In the worst-case scenario, we might not only reject the criticism but also pursue our (possibly) erroneous reasoning with all the more vigour — the so-called backfire effect. And if our mistakes are particularly costly, and embarrassing, we’ll often dislike our critic all the more. In this light, the current political polarization can’t be very healthy for decision-making on either side of the divide.

2) In even the best of relationships, “fair and sophisticated” criticism can be taken as a slight and permanently damage a friendship. There is always some risk of hard feelings when we criticize another, and we tend to do a calculation before we open our mouths. Yet no matter how careful our wording, we can nevertheless be misunderstood. And there are other hazards. We make errors in judgment about when and how we should offer a criticism. To avoid this difficulty, we sometimes ask people to vet the criticism we’re about to offer another friend or colleague. But in so doing we’re relying on their own better judgments — which, as we’ve seen, aren’t always so reliable. And so on. The upshot is that we’re rather stuck in a best-we-can-do world where social risks weigh in our decisions as much as more concrete risks, e.g. how to safely cross a river. And in both cases, we are forever building bridges.

A careful study of Kahneman’s book may go some way toward removing barriers to listening to others’ criticisms, e.g. by providing good reasons to be a little less sure of one’s own judgments and to value an outside view. And if these reasons are taken both to heart and to mind, we might foster a habit of humility. Humility being the stance that, well, I might be wrong. Or not quite as right as I think I am. But humility will take us only so far. Strong feelings — particularly repulsion or dislike, fear and shame — are difficult to surmount. As David Hume says, “These sentiments are not to be controlled or altered by any philosophical theory or speculation whatever.” Says he, “Nature will always maintain her rights, and prevail in the end over any abstract reasoning whatsoever.”

David Hume, An Enquiry Concerning Human Understanding, ed. Eric Steinberg (Indianapolis: Hackett Publishing Company, 1993), p. 68.

Recommended reading as a complement to Thinking, Fast and Slow is Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion.

A study of Haidt’s moral theory alongside Kahneman’s work on our cognitive/perceptual errors and biases might well make the barriers to both offering and receiving criticism a little more surmountable.

 

Addendum: A message from NASA’s Chief of Safety and Mission Assurance (July 2014) with some helpful advice on recognizing groupthink: "Watch Out for Groupthink."


