• Categories: Role of the board, Book review
  • Author: John Page
  • Published: Oct 25, 2021

In the last issue we reviewed Noise by Daniel Kahneman and colleagues. Given the many flaws in human judgement, they concluded that a constantly open mind is a fundamental precondition for making good decisions. Adam Grant’s book, Think Again, is a logical extension of that premise.

Indeed, the back cover carries a glowing endorsement from Kahneman. There is also a wonderful description of an insightful lunch between the two authors in which Kahneman notes how delighted he is when he is wrong about something because, in general, “I am now less wrong than I was before” and that “my attachment to ideas is provisional, there's no unconditional love for them”.

In a world where people increasingly occupy entrenched positions, Grant urges us to let go of knowledge and opinions that no longer serve us well: to anchor ourselves in values and stay flexible on information, opinions and received wisdom. As an example, he cites the well-known fable of the frog in hot water. In fact, a frog will jump out when the water gets too warm. It is not the frog that fails to re-evaluate; it is us. Once we hear such a story and accept it as true, we rarely bother to question it.

Throughout the book Grant refers to being in ‘scientist mode’, meaning that ideas are not allowed to settle into ideologies. We don’t start with answers or solutions but lead with questions and puzzles. Others have suggested the same approach. Charlie Munger believes you cannot hold a position until you have sought out and understood the opposite perspective. Einstein allegedly said that, given an hour to save the world, he would spend the first 55 minutes defining the problem. Scientist mode means more than simply reacting with an open mind; it means being actively open-minded, searching for reasons why you might be wrong and revising a position in the light of subsequent learning.

Kahneman noted that the only thing you can tell about someone who presents a compelling and passionate argument is that they have formed a confident story in their own mind. Grant gives us the wonderful term ‘Mount Stupid’: the familiar blip on the graph where a little knowledge meets an expansive propensity to offer opinions. The research on this phenomenon by David Dunning and Justin Kruger originally featured in the Ig Nobel awards but went on to be rather useful. They found that people who scored lowest on tests of logical reasoning, grammar and sense of humour had the most inflated opinion of their skills. A later study[1] found that managers across 20 countries rated their own competence anywhere from slightly above (Japan) to woefully above (Mexico) their actual skill levels.

Grant suggests that scientific thinking favours humility over pride, doubt over certainty, curiosity over closure. The opposite approach loops through conviction, a search for information that confirms belief (confirmation bias), validation of that conviction, and pride in one’s certainty.

If we engage with people and seek to shift their positions, we won't have much luck changing their minds if we refuse to change ours. Grant suggests we can demonstrate openness by acknowledging where we agree with our critics and even what we've learned from them. Then, when we ask what views they might be willing to revise, we're not hypocrites.

Skilled negotiators focus on finding common ground, asking questions, and avoiding defend/attack spirals. Grant relates an illuminating and salutary story about a public exchange between a champion debater and a computer. The machine had studied a vast number of human-authored articles and was cogent and highly informed; it was even skilled enough to make a joke. The machine was good but lost because it could not deploy empathy to agree with elements of its opponent’s position. Apparently, among the 400 million articles it absorbed, empathy was too thinly in evidence to learn from: a rather sobering conclusion.

We often talk about the need for boards to be learning entities, and this book has many lessons on a learning approach. In learning cultures, the norm is for people to know what they don’t know, doubt their existing practices, stay curious and look for new routines to try out. Grant suggests that organisations with learning cultures innovate more and make fewer mistakes.

There are a number of observations on decision making. A bad decision process is based on shallow thinking. A good process is grounded in deep thinking and rethinking, enabling people to form and express independent opinions. Research shows that when we have to explain the procedures behind our decisions in real time, we think more critically and process the possibilities more thoroughly.

Even if the outcome of a decision is positive, it doesn't necessarily qualify as a success. If the process was shallow, you were lucky. If the decision process was deep, you can count it as an improvement; you've discovered a better practice. If the outcome is negative but you evaluated the process thoroughly, you have run a smart experiment.

The book concludes with a useful ‘actions for impact’ section: thirty points to help sharpen your rethinking skills. A couple of favourites: ask how people originally formed an opinion, and remember that less is often more when presenting an argument.

This quote probably sums up the book’s central thesis well: ‘It takes a confident humility to admit that we are a work in progress. It shows that we care more about improving ourselves than proving ourselves.’

[1] World Management Survey: Bloom and Van Reenen (2007); Maloney (2017b).