• Categories: Strategy and Planning, Thinking and learning
  • Published: Mar 28, 2023

In the final instalment of this series on mental models and decision making, we discuss the use of models.

Without a disciplined approach to thinking and a structure for decision-making, cherry-picking models may not serve your purpose.

There was—and possibly still is—a fashionable fondness for Sun Tzu’s Art of War. Its many aphorisms make clear that success lies in careful thought and reasoned strategy rather than bravado and pointless conflict: ‘the skilful leader subdues the enemy's troops without any fighting’.

In more recent times, Charlie Munger [1] echoed the same sentiment: ‘It is remarkable how much long-term advantage people like us have gotten by trying to be consistently not stupid, instead of trying to be very intelligent’.

Even Munger, famous for his extensive reading and rational approach, admits there are limits to his models. As Benjamin Graham [2] noted, ‘you can get in way more trouble with a good idea than a bad idea, because you forget that the good idea has limits’.

Disciplined thought is not about being a genius. It is about fully understanding an issue and the consequences of a proposed action. Most importantly, it is about not getting caught in the hamster wheel of cleaning up the consequences of your poor decisions.

How not to think

The contemporary enemy of thought is multitasking, and its friend, the connected world. A study by researchers [3] at Stanford concluded that the more people multitask, the worse they become, not just at other mental tasks but at multitasking itself. Multitasking is detrimental to thinking: it impairs the mind’s ability to organise and store information. Thinking requires periods of concentration on one thing and sufficient focus to develop ideas around it. It is not aided by rapid swapping between email, WhatsApp, Facebook, Instagram and YouTube.

Daniel Kahneman [4] talks about system 1 and system 2 thinking—fast and slow. System 1 is useful: red means stop, fire is dangerous, and so on. It has helped us survive. But it is also lazy thinking, and the system 1 bucket collects bias, prejudice, assumption, generalisation, impulse and guesswork. These shortcuts demand little effort, and humans tend to be lazy thinkers. System 2, or rational thought, requires conscious effort and time. System 2 is what a boardroom requires.

Our love affair with PowerPoint and bullet points is not helpful to thinking. In a list, we note the first and last items, and those in the middle tend to blur. The discipline of ‘less is more’ in board papers is often sadly lacking. Being succinct, offering analysis and adding meaning takes time; it is much easier to throw everything at the page and let the board do the work. Jeff Bezos famously banned PowerPoint and demanded that ideas be presented in short memos, which he admits should take days to write:

The reason writing a ‘good’ four-page memo is harder than ‘writing’ a 20-page PowerPoint is because the narrative structure of a good memo forces better thought and better understanding of what’s more important than what. [5]

Kahneman quipped that the only thing you can deduce from an idea passionately presented is that the speaker has convinced himself of its veracity. In his excellent book Think Again, [6] Adam Grant offers us the Dunning-Kruger effect. Research [7] showed that those with the least competence tend to overestimate their skill the most; when confidence is graphed against competence, the peak of that divergence has been dubbed ‘Mount Stupid’.

This can be very limiting in the boardroom. If the most senior or loudest person speaks first and lays down a strong opinion, framing bias takes hold. That opinion may be one possible approach, or it could be truly wide of the mark, yet subsequent discussion tends to stay within the parameters laid out by the dominant individual.

A better approach

There cannot be only one way to think. Everyone is different and each issue is contextual. But there are some useful guidelines.

First, it is essential to take on other perspectives. ‘How’s the water?’ has no meaning to a fish. To quote Charlie Munger again, you cannot responsibly hold a position on something until you have considered and fully understood the opposite view.

Adam Grant notes that scientific thinking favours humility over pride, doubt over certainty, curiosity over closure. In short, we need to take our ego out of the loop; there is a danger of defending our ideas rather than upgrading them. Grant also recalls a wonderful lunch conversation with Kahneman, who notes how delighted he is when he is wrong about something because, in general, ‘I am now less wrong than I was before’, and that ‘my attachment to ideas is provisional, there's no unconditional love for them’. This is the idea that any opinion is a ‘beta position’, open to upgrade as new information becomes available.

For significant discussions, we suggest the rugby ball analogy is helpful: start at one end and add ideas until a logical middle point is reached. In this phase there are no bad ideas, no right or wrong, better or worse. That comes as the discussion moves to its conclusion at the other end of the ball. Through this phase a good question is always: what would need to be true for this assertion or proposition to be valid?


Using models

The map is not the territory

A map, like a model, is not reality. Any experienced hiker knows that a topographic map, as detailed as it might be, is just a guide. Soil conditions, river heights, weather and the density of the bush are the real world on the ground. The London Underground map is wonderful for travellers but pointless for train drivers. Maps are necessary but flawed.

So many models

There are a vast number of mental models across general thinking, science, military strategy, systems, economics and so on. We have only scratched the surface in these articles. The key is to find the handful that are most useful to you. As Peter Bevelin [8] notes:

I don’t need hundreds of concepts, methods or tricks in my head—there are a few basic, time-filtered fundamental ones that are good enough.

In the Farnam Street series of books on mental models, the authors intend to cover ‘the 80-90 mental models you need to get started’ [9]. That is a solid chunk of learning for anyone. Charlie Munger agrees that those 80-90 cover what you need ninety percent of the time, but ‘of course only a mere handful really carry very heavy freight’. [10]

Not the right model

Within that smaller group, choosing the right model for the right situation is a skill that must be learned. If a particular approach doesn’t produce the right outcome, the problem may be the choice of model rather than a lack of application. Roger Martin [11] comments:

It has become clear to me over the years that in nearly every case, the poor results weren’t down to their not working diligently enough in pursuit of their goals; it was because the model that guided their actions wasn’t up to the task.

Martin also contends that there are no completely correct or completely wrong answers, just ones that are better or worse.

Models need to be updated when evidence indicates any flaws. Failure to do so places us back in the world of the sun orbiting the Earth.

Getting better at using models

As noted, the choice of model is key. Record which model you used, how it was applied in each situation and the results achieved. Don’t be hasty to discard a model after a single use; other factors may be in play, notably a range of biases.

Equally, you need to understand why something worked in each context. Over time this should allow both better use of a specific model and better choice among options.

Notes


  1. Charles Munger, vice chairman of Berkshire Hathaway Inc.
  2. Benjamin Graham, British-born American economist, professor and investor.
  3. Ophir, E., Nass, C., Wagner, A. D. Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences, August 2009.
  4. See Kahneman, D. Thinking, Fast and Slow. Allen Lane, 2011.
  5. Jeff Bezos, interviewed by CNBC, April 23, 2018.
  6. Grant, A. Think Again. WH Allen, 2021. See the review in Good Governance 77.
  7. Kruger, J., Dunning, D. Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology, 1999.
  8. Peter Bevelin, Swedish investor and author.
  9. The Great Mental Models, Vol. 1. Latticework Publishing, 2019.
  10. Ibid.
  11. Roger Martin, American strategist and author.