A 2000 review of 136 studies further confirmed the finding, with the blunt conclusion that simple models beat humans:
Any satisfaction that you felt with your quality of judgement was an illusion: the illusion of validity. [2]
In the climate of populist politics, experts have become whipping boys. We should listen to experts grounded in sound science; experts proffering mere opinion deserve rather more scepticism. In his scathing 2005 book Expert Political Judgment, [3] psychologist Philip Tetlock examined the predictions of nearly 300 prominent journalists, academics and high-level advisers. His book became famous for its punchline, ‘the average expert was as accurate as a dart-throwing chimpanzee’.
Warren Buffett weighed in on the same matter at the 2022 Berkshire Hathaway annual meeting, saying that he would bet on monkeys throwing darts to outperform financial advisers. [4]
Daniel Kahneman warns that the only thing you know about a passionate story well told is that the teller of the story has fully convinced himself of it—nothing more. Tetlock further cautions that ‘pundits blessed with clear theories about how the world works were the most confident and the least accurate’.
Boardrooms are prone to all sorts of bias, not least various forms of groupthink—some stemming from simple laziness, others from variations of status and peer pressure:
I've always been worried that when my team gets together, we end up confident and unified—and firmly committed to the course of action that we choose. I guess there is something in our internal processes that isn't going that well. [5]
In decision making, one useful trick is to adopt a discipline that criminal investigators understand well: witnesses must not meet. Wilfully or not, their interaction will at best dilute the evidence and, at worst, produce a cooked-up, shared story:
The second opinion is not independent if the person giving it knows what the first opinion was. And the third one, even less so; there can be a bias cascade. [6]
Before a major decision, creating independent and fully isolated teams with the same information will produce varied analyses. Considering and aggregating multiple judgements will reduce noise in the decision system and, on average, result in better decisions.
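A minimal simulation makes the statistical point concrete. The numbers below are illustrative assumptions rather than data from any cited study: each isolated team produces an unbiased but noisy estimate of the same quantity, and averaging independent estimates shrinks the typical error roughly in proportion to one over the square root of the number of teams.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 100.0   # the quantity every team is trying to estimate (assumed)
NOISE_SD = 20.0      # assumed spread of a single team's judgement
TRIALS = 10_000      # repetitions used to measure the typical error

def judgement() -> float:
    """One isolated team's unbiased but noisy estimate."""
    return random.gauss(TRUE_VALUE, NOISE_SD)

for team_count in (1, 4, 16):
    # Average the independent judgements, then measure how far the
    # aggregate lands from the truth, repeated over many trials.
    errors = [
        abs(statistics.fmean(judgement() for _ in range(team_count)) - TRUE_VALUE)
        for _ in range(TRIALS)
    ]
    print(f"{team_count:>2} independent judgement(s): "
          f"mean absolute error = {statistics.fmean(errors):.1f}")
```

The caveat matters: the arithmetic only works while the judgements remain independent. Once the teams compare notes, their errors become correlated and the benefit of aggregation largely evaporates, which is precisely why the witnesses must not meet.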
This series of articles is about bringing discipline to decision making and ensuring a quality process. And a reminder from the first article that:
While analysis is important in decision making, the process used matters more than analysis by a factor of six. [7]
Kahneman reminds us of the lazy mind and its default to System 1 thinking, [8] redolent of bias and assumption. Purposeful thinking takes effort. Good decisions take time but pay dividends for years to come, while bad ones become black holes for time and resource. One useful observation is that good decision makers appear to have time on hand, while busy people are often simply running faster to deal with the consequences of their earlier poor decisions.
We continue with some thinking models. When to use them depends, of course, on context—but the more approaches one has available, the more likely it is that a solid process will result.
Occam’s razor
While not the first or last to write about the concept, it is medieval logician William of Ockham whose name has stuck to the idea. Essentially it suggests that we should prefer the simplest and least elaborate explanation. We are all prone to making things overly complex. The availability of endless data has only exacerbated that tendency. Clinicians use this approach in diagnosis: it is possible that your headache is a precursor of a brain tumour but there are many more likely and benign reasons.
In expressing ideas, the analogous phrase is, ‘I only made this letter longer because I had not the leisure to make it shorter’. [9] Being succinct requires effort. The lazy approach is to throw everything at a page without pausing to analyse or reflect. We have commented on this often in relation to board material.
Complexity can be viewed mathematically. If—for a solution to be valid—one explanation requires only two variables to be true while the other requires ten interacting variables, then the former has a far higher chance of being correct.
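A hedged worked example makes the arithmetic visible. The 90 per cent figure below is an assumption chosen purely for illustration; it stands for the probability that any single required condition holds, and the conditions are assumed to be independent:

```latex
% If an explanation rests on n independent conditions, each of which
% holds with probability p, the whole explanation is correct with probability
\[ P(\text{explanation correct}) = p^{n} \]
% With an assumed p = 0.9 for every condition:
\[ 0.9^{2} = 0.81 \qquad \text{versus} \qquad 0.9^{10} \approx 0.35 \]
```

On those assumptions the two-condition explanation is more than twice as likely to be right, and the gap widens as conditions multiply or as each becomes less certain.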
Not every complex problem requires a complex solution. One apocryphal example is the story that, while the Americans struggled to develop a functional writing device for use in space, the Russians came up with a simple solution: a pencil. Sadly, the story is not true, [10] but it is indicative of our tendency to assume that difficult problems require difficult solutions.
Occam’s Razor is not foolproof—sometimes things are not that simple. But if you are choosing between predictive models of equal efficacy, then the simpler one is more likely the model to pursue.
Hanlon’s razor
Like Occam’s Razor, this idea has been with us for some time, but it is Robert J Hanlon’s [11] name that has stuck to it. It suggests that we should not assume malice as the cause of something that is more likely the result of simple neglect, or stupidity if you are feeling less charitable. We tend to perceive intent where none exists, when often something was just a simple mistake. Science fiction author Robert Heinlein similarly noted, ‘you have attributed conditions to villainy that simply result from stupidity’. [12] In his correspondence with King George VI, Churchill sniffed of de Gaulle that ‘his insolence…may be founded on stupidity rather than malice’.
It is human nature to assume that someone else’s behaviour is in some way linked to our own actions, when often there is simply no connection. Hanlon’s Razor suggests we pause and step back from the hasty conclusion. This is especially useful in negotiations when understanding the drivers of another’s actions is crucial.
Hanlon’s Razor can help us understand related biases. Confirmation bias means we seek out information that adds weight to existing beliefs—even if that evidence is sketchy. When dealing with people or organisations we already dislike, we are all too quick to look for ill intent in their dealings with us. It may well be simple neglect or incompetence—irritating, yes, but not malicious.
As with Occam’s Razor, the idea has limitations. While genuine malicious intent may be far less common than other explanations, that does not mean we should assume all is rosy—occasionally malice does exist.
Second-order thinking
‘Second-order’ thinking simply means thinking beyond the immediate consequences of your actions. This is neither easy nor common. American investor Howard Marks noted: [13]
The difference in workload between first-level and second-level thinking is clearly massive, and the number of people capable of the latter is tiny compared to the number capable of the former.
The world is full of examples of failure to think beyond the first level. Cane toads were introduced to Australia to deal with a pest problem; the fact that they are poisonous and have no natural predators was overlooked. China’s Mao-era Four Pests campaign targeted grain-eating sparrows; in their absence, locusts—combined with poor weather—contributed to a famine that killed millions. Gorse may have made good hedging in the English countryside, but its spread in temperate Aotearoa New Zealand now extends over 700,000 hectares. [14]
Could these outcomes have been predicted? Absolutely.
Second-order thinking is not a predictive tool, as you can only use information already at hand. Marks provides a useful list of questions:
- What is the range of likely future outcomes?
- Which outcome do I think will occur?
- What’s the probability I’m right?
- What does the consensus think?
- How does my expectation differ from the consensus?
Marks rather unkindly goes on to observe that this kind of thinking is common among academics and professional investors, some (but not many) business leaders—and seemingly completely absent from the minds of politicians.
Second-order thinking involves taking a longer and broader view. What will this look like in ten years? What is the impact on our employees, suppliers, competitors? Perhaps nothing, but you still need to ask the question.
There are, of course, practical limitations; you cannot think down through every layer, trying to foresee all the consequences of your actions. To do so would invite analysis paralysis. We should, however, ask rather more often: and then what?
_________________________________________________________________________________
Notes
- https://en.wikipedia.org/wiki/Paul_E._Meehl
- Kahneman D, Sibony O, Sunstein C. Noise. William Collins. 2021.
- Tetlock P. Expert Political Judgment. Princeton University Press. 2005.
- CNBC video of annual meeting highlights
- Ibid.
- Kahneman D, Sibony O, Sunstein C. Noise. William Collins. 2021.
- Lovallo D, Sibony O. ‘The case for behavioral strategy’. McKinsey Quarterly. 2010.
- https://thedecisionlab.com/reference-guide/philosophy/system-1-and-system-2-thinking
- Pascal B. Provincial Letters. 1656–7.
- https://www.reuters.com/article/factcheck-nasa-pens-idUSL1N2MQ1RR
- Apparently, he submitted it to Murphy’s Law Book Two (1980).
- Heinlein R. ‘Logic of Empire’. Astounding Science-Fiction. Vol. 27, March 1941.
- Marks H. The Most Important Thing. Columbia Business School. 2011.
- https://en.wikipedia.org/wiki/Gorse_in_New_Zealand