The Guide to Decision Making by Helga Drummond
IN THE medieval church of St Cross near Winchester in the UK, sunlight shines through the window onto the cross on only two days of the year: May 3, the Feast of the Invention of the Cross, and September 14, Holy Cross Day. The medieval craftsmen knew precisely where the sun would fall on those two days.
The subtitle of this book on making decisions is Getting it more right than wrong. Why can’t we make our decisions as accurate as those medieval craftsmen made the window’s positioning?
The explanation is, simply, that decision making is not that simple. Decision-sciences literature is dominated by causal models assuming that outcomes are a function of a small set of variables which, if properly addressed through a set of tools, should give us the correct answer. If only…
Professor Helga Drummond, who teaches at Liverpool University, has put together a compelling survey of why we make decisions that turn out so wrong. Awareness of these traps alone will raise our chances of making better decisions more often.
There are two types of mistakes – those made because of the absence of clear information, and systemic errors of judgement. We can do nothing if the information is not available, but systemic errors are avoidable and Drummond explains why they happen. Her central notion is that “rationality is not so much a guide as an achievement forged from a myriad of non-rational influences”.
What would make Porsche’s chief executive believe that his company could take over Volkswagen, a company 82 times its size? What made NASA shift its approach to rocket launches from refusing to launch unless safety could be proven, to launching unless it could be proven unsafe?
Calamity loves the over-confident, explains Professor Drummond, and this applies as much to institutions that have enjoyed a run of success as to executives who have high self-esteem.
The only reality a person can know is the reality they know. One cannot know what one does not know, and this existential truth is a tautology. Even if we try to know more, there are seven traps we can easily fall into. These include seeing only what we want to see and missing important differences between situations.
That we see what we want to see is indicated by our innate tendency to notice information that supports what we already believe, and to dismiss or be critical of contradictory information. The executives of the destroyed Barings bank might have been saved had they taken note of the warnings that respected banks like Fleming and Morgan Stanley were giving their customers about using Barings as a counterparty.
Barings executives assumed the market couldn’t accept that a bank as small as theirs could perform so well, rather than considering that there could be truth in the market’s concerns.
Firefighters are more likely to be killed when they are experienced than when they are new to the job. The reason cited is their belief that they have seen it all before, and that the current situation is the same as the others.
The frighteningly high failure rate of mergers may well be the business manifestation of the same problem, causing executives to overlook nuanced differences between previously successful integrations and the challenges posed by the current one.
Will having more information assist in avoiding error? “Emphatically, more detailed information is not the answer,” Professor Drummond asserts. One need only consider how more elaborate risk registers and more elaborate scorecards lead decision-makers to spend their time scrutinising the same narrow band of information.
Scepticism is far more likely to narrow the gap between representation and reality than more information about the same issues could. Drummond even suggests holding meetings at which risk registers, balanced scorecards and other decision-support tools are banned.
While making decisions in groups rather than alone should add greater clarity, the dynamics of groups can cause exactly the opposite to happen. Studies indicate that most of the work done by groups is carried out by only a third of the participants – and then there are the additional challenges of deference to high-status individuals. Groups also fall prey to groupthink, the main consequence of which is a loss of analytical rigour and sloppiness in decision-making.
With the gloomy outlook Drummond paints of our ability to make good decisions, what is to be done? Perhaps the best place to start is by reading this book, which will most certainly heighten awareness of the problem – and that awareness, Drummond firmly believes, is a key to the solution.
The chapter on “predictable surprises” does just that directly. Consider this example, mentioned in passing: if you knew no better, would you choose a restaurant that was almost empty or almost full? The obvious answer is the better-patronised one, but this popularity could as easily be a function of the first diners making a random choice which others simply followed.
Reading this book should be compulsory for all of us, and perhaps we should read it annually, lest we forget.
Readability: Light ---+- Serious
Insights: High -+--- Low
Practical: High +---- Low
*Ian Mann of Gateways consults internationally on leadership and strategy.