Thinking Fast and Slow by Daniel Kahneman — Some Insights

I read parts of this book on the plane today and found it a useful guide to the many ways in which humans make bad decisions.

Kahneman discusses the process of decision-making as the product of two Systems at work. System 1 relies on intuitive thinking and is more prone to errors, irrationality and laziness. System 2 is slow thinking that is deliberate and controlled.

To be more specific, Kahneman explains: “System 1 detects simple relations (“they are all alike”…) and excels at integrating information about one thing, but it does not deal with multiple distinct topics at once, nor is it adept at using purely statistical information.”

In contrast, one of the main functions of System 2 “is to monitor and control thoughts and actions ‘suggested’ by System 1, allowing some to be expressed directly in behavior and suppressing or modifying others.” Self-criticism is another of System 2’s chief functions.

The point is that we have a tendency as humans to make irrational choices and judgments by lazily relying on System 1. High intelligence does not make us immune to biases. He defines intelligence as not only the ability to reason but also the ability to “find relevant material in memory and to deploy attention when needed.” Memory function is part of System 1, while slow and deliberate fact-checking is part of System 2.

Now, it appears that neither System is inferior to the other. In fact, given the right conditions, the intuitions of System 1 can be just as strong and accurate; for example, he mentions that being in a good mood improved intuitive performance, with the effect of doubling accuracy. This doesn’t mean we can never trust the intuitions of System 1, but we should be critical. Intuition can be trusted when the answer comes from a predictable environment and one has had the opportunity to learn its regularities through prolonged practice.

On the other hand, too much exertion of System 2, for example burdening it with exhaustive, detailed fact-checking, may cause “ego depletion” and in turn lead to irrational choices.

The key to disciplined reasoning is to:

“1) anchor our judgment of the probability of an outcome on a plausible base rate

2) question the diagnosticity of our evidence.”

Whenever possible, in hiring and similar decisions, it is often better to use simple mathematical formulas and statistical evidence than to rely on unaided human judgment. As a social scientist trained to be skeptical of nearly everything I read or hear, I agree with these suggestions for more disciplined reasoning.
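Kahneman’s concrete suggestion for hiring is to rate a handful of traits independently and then combine the scores with a simple formula rather than a global intuitive impression. A minimal sketch, where the traits and candidates are hypothetical:

```python
# Rate each trait independently on a 1-5 scale, then sum with equal
# weights -- a deliberately simple formula, instead of an overall
# intuitive judgment of the candidate.

TRAITS = ["conscientiousness", "technical_skill", "communication"]

def score(ratings):
    """Equal-weight sum of the independent trait ratings."""
    return sum(ratings[t] for t in TRAITS)

candidates = {
    "A": {"conscientiousness": 4, "technical_skill": 5, "communication": 3},
    "B": {"conscientiousness": 3, "technical_skill": 4, "communication": 4},
}
best = max(candidates, key=lambda name: score(candidates[name]))
print(best)  # -> A (score 12 vs. 11)
```

The design choice here is the point: by scoring traits one at a time, the formula blocks System 1 from letting a single vivid impression color every judgment.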
