Thinking too fast, and too slow

October 2018

The behavioural and neurological sciences show us that storytelling is deeply embedded in human nature. Narratives are, at times, more important to us than being factually correct. This is great for authors and screenwriters, but extremely dangerous when we are investing money or making government policy.

Plato (the classical Greek philosopher), Gene Roddenberry (the creator of Star Trek) and Kahneman and Tversky (the pioneers of behavioural economics) all use the analogy of two competing systems to describe how the mind works: one fast and based on intuition, the other slow and founded on logic. Paraphrasing Kahneman’s famous book “Thinking, Fast and Slow”, I will illustrate how our own biases get us into trouble time and time again.

Thinking too fast…

Most behavioural experiments measure mistakes we make when we are thinking too fast. The experiments are constructed in such a way that we apply our intuition to problems that require logic. Most of us will fall into this carefully designed trap no matter how intelligent we are.

Think of a savvy salesperson who offers us a special deal – the deal of a lifetime – available today only. This person is trying to get us to think too fast, but de-biasing ourselves here is relatively straightforward: we just need to take a step back, count to ten and activate the logical part of our brain.

Thinking too slow…

The real problems occur when we are thinking too slow: when we let our biases use our own logic against us. This mistake is much more difficult to detect and protect ourselves from, because it is deeply rooted in human nature. We hate ambiguity: we want simple relationships between cause and effect, we want to be in control of our investment portfolios, and we want to believe that our bosses, political leaders and central bankers are in control.

Often, we are faced with situations that academics would categorise as fundamental uncertainty. In other words, we do not know exactly how the world works, and there are no simple relationships or clear answers available. These situations make us inherently uncomfortable, so our storytelling skills come in handy and we create a narrative that fits. When real data is insufficient, we extrapolate from what we know, and the narrative becomes more of a belief than an accurate mental model of reality.

A belief shared by a small group is harmless to the rest of us, but narratives that go viral take on a life of their own. In rare cases, a viral narrative is promoted from belief into science, at which point it becomes self-reinforcing. But what if our beloved narrative/belief/science, designed primarily to help us sleep well at night, turns out to be an erroneous depiction of reality?

Beliefs are sticky

As individuals, we crave being part of a group, which produces interesting group dynamics: we tend to self-censor, we are prone to groupthink and we obey authority. These dynamics make it very difficult to challenge the current paradigm. And when a belief is upgraded to ‘science’, the narrative is legitimised by the authority of scientists.

The brightest and most egocentric among us are the most prone to the trap of thinking too slowly. Imagine how difficult it would be for a renowned scientist who has spent the bulk of their career on a specific theory to acknowledge to themselves that the theory does not hold and that all their work has been in vain. Keep in mind that the easiest person to lie to is ourselves, and the smarter we are, the more convincing the lies. The economist Paul Samuelson, inspired by Max Planck, captured this well: “Science makes progress funeral by funeral”.

Update our beliefs or change the world!

When our beliefs are not supported by observations in the real world, there are two ways to deal with this. We can either adapt our beliefs to fit reality better or we can try to reshape the world so that it fits those beliefs. The latter approach is often attempted by politicians and religious leaders, but most of the time their attempts are futile. The communist experiment in the former Soviet Union is only one of many historical cases of trying to change reality to fit a narrative.

A more promising approach is to challenge and update our own narratives. If a mental model does not fit reality, we have to accept that the model is wrong. Abandoning a comforting narrative can be daunting, but if we do not see reality for what it is, we may end up making decisions that can really hurt us.

De-biasing ourselves leads to better decisions

True courage means having the guts to challenge our personal narratives. Kahneman argues that this is very difficult to do as individuals, but as a group or institution we can put in place processes, structures and tools that help us de-bias ourselves before making important decisions. To do that we need to be humble, strive for diversity of thinking and avoid becoming wedded to our own narratives. This sounds simple, but it requires a lot of practice.

Avoiding thinking too slowly is instrumental when investing other people’s money, running a central bank, designing public policy and managing our relationships with other people. As Churchill said: “However beautiful the strategy, you should occasionally look at the results”.

My advice to you is: be brave and choose to see the world for what it is, not for what you want it to be.