
Mental Shortcuts: 5 Ways Heuristics Can Lead to Poor Decisions

How many days would you save by taking the Gotthard Tunnel through the Swiss Alps? At first glance, mental shortcuts, such as the answer you just gave in your head, seem to work like any other shortcut. The Gotthard Tunnel cuts through the Alps and gets us from Milan to Zurich faster. Which is great, as long as we’re happy to trade scenic, meandering mountain roads for the pleasure of staring into a tunnel for 57 kilometres. Similarly, mental shortcuts are all about trade-offs. In this post, we’re going to look at how our mind cuts corners, the benefits of doing so, and five ways heuristics can lead to poor decision-making when we misapply them.

What are Mental Shortcuts?

Mental shortcuts, also known as heuristic decision-making, are what our mind uses when we need to answer a difficult question or solve a complex problem quickly. Heuristics allow us to make judgements and come to decisions in a time-saving and efficient manner. The term heuristic is closely related to the Greek exclamation Eureka: ‘I have found it!’ Though what we find may very well lead to bad decisions. Here’s the psychology behind mental shortcuts.

System 1 and System 2

The human capacity for decision-making is limited by all sorts of factors including the availability of information, the time we have to make a judgement, cognitive ability and biases. Sometimes it seems like our minds can’t keep up. We help ourselves with mental models such as the OODA Loop to make decisions in disorienting situations. In reality, our brain seems to be a bit of a know-it-all.

The “normal state of your mind”, according to famed psychologist Daniel Kahneman, “is that you have intuitive feelings and opinions about almost everything that comes your way”. The reason for this, as Kahneman describes in his bestseller Thinking, Fast and Slow, lies in the two systems operating in our minds. System 1 is the intuitive, fast, instinctive and emotional one. System 2 is much more deliberate and logical, yet slow.

When faced with a complex problem, System 1 simply substitutes the difficult question with an easier one. Consider Kahneman’s example in which the Target Question “How happy are you with your life these days?” becomes the Heuristic Question “What is my mood right now?” The first query — which would probably require writing a whole essay about your definition of ‘happiness’ and factors impacting it over a period of time — is collapsed into a much more accessible matter of intuitive judgement.

Types of Heuristics

Kahneman distinguishes between different types of heuristics. I’ve tried to simplify the main types as follows:

  • Availability: Making decisions based on what’s readily available in our mind
  • Representativeness: Making judgements by assessing how the situation at hand compares to a familiar mental prototype
  • Affect: Making decisions based on how we feel in the moment
  • Anchoring & Adjustment: Making judgements based on the first piece of information we receive

To be sure, mental shortcuts are generally a positive habit of our minds. We’re faced with so many impressions and micro-decisions on a daily basis. We can’t always stop and send System 2 on a painstaking quest to determine which coffee place to pick or which ice cream flavour goes best with our cappuccino. Heuristics work just fine. Until they don’t.

Mental Shortcuts and Misapplied Heuristics

As useful as decision heuristics can be, they’re still imperfect rules of thumb that can be misapplied. This opens the door to intuitive traps or cognitive biases, hinders analytic thinking and ultimately leads to poor decisions. The concept of Misapplied Heuristics was coined by analytics expert Randy Pherson. Naturally, he looks at mental shortcuts from the perspective of a System 2-focussed analyst. According to Pherson, misapplied heuristics can still “lead to a correct decision based on a non-rigorous thought process”. But only if we’re lucky.

In his Handbook of Analytic Tools & Techniques, he identifies several potential thinking errors. Here are five of the most common Misapplied Heuristics to look out for. We start with the mental shotgun and make our way to premature closure.

1. Mental Shotgun

Somebody sees lights flashing in the sky. They’ve never seen it before. They don’t understand what it is. They say: A UFO! The ‘U’ stands for unidentified. So they say: “I don’t know what it is. It must be aliens from outer space visiting from another planet.”

Well, if you don’t know what it is, that’s where your conversation should stop. You don’t then say: “It must be anything.”

Neil deGrasse Tyson

Our mind can’t help but continuously make assessments. It computes all the time, often more than needed, as we search for quick answers. To illustrate this, Kahneman invokes the image of a shotgun, which can be fired quickly but lacks precision. Imagine numerous birdshot pellets spreading all over the target and beyond. The mental shotgun is a mechanism of the fast and intuitive System 1, but it’s triggered when System 2 is presented with a specific question to answer. The lack of “precision and control” (Pherson) is where this mental shortcut can go wrong.

Tyson narrates the person’s spontaneous thoughts on the “lights flashing in the sky” almost in a stream-of-consciousness manner. Their judgement is riddled with quick, intuitive assessments and full of free associations with familiar concepts such as aliens. It’s a good example of a mental shotgun: a spontaneous extrapolation, within seconds, from lights in the sky to the presence of a spacefaring civilisation.

The opposite of a mental shotgun could be conceptualised as a mental precision rifle. It fires a single bullet with maximum accuracy after careful deliberation of environmental factors. This process would be more akin to thinking with System 2. System 2 is also what I think Tyson invokes in his example when he calls for a slow and careful evaluation of the aerial phenomenon.

But our mind just likes to shoot first and ask questions later. The misapplication happens when the quick and easy answer to an obviously complex problem is not caught in time and is used to make far-reaching decisions. To be fair, there’s probably an equally fast heuristic at play to acknowledge that an alien space invasion is not imminent.


2. Availability Heuristic

[Australia] has more things that will kill you than anywhere else. Of the world’s ten most poisonous snakes, all are Australian. […] If you are not stung or pronged to death in some unexpected manner, you may be fatally chomped by sharks or crocodiles, or carried helplessly out to sea by irresistible currents, or left to stagger to an unhappy death in the baking outback. It’s a tough place.

Bill Bryson, In a Sunburned Country

Would I be wrong to assume you’re familiar with the legendary deadliness of Australia? Bill Bryson’s travel writing probably contributed a fair bit to that image. Imagine you’ve just finished reading Bryson. Now somebody asks you whether Australia is a safe country to travel to. Your mind will probably use the availability heuristic to give an ad-hoc answer in the negative. If something comes to mind quickly, it must be more relevant, or so we seem to reason.

As touched on above, the availability heuristic causes us to judge “the frequency of an event or category based on the ease with which instances come to mind” (Pherson). As a quick mental shortcut, we do well to be suspicious of fauna and be careful wandering around Australia. It’s a good idea to keep in mind that plenty of spiders are venomous and sharks pose a danger at the beaches. But in reality shark attacks, for example, are rarer than our minds would lead us to believe.

The misapplication seems to happen when we fail to acknowledge that the actual question is more complex and difficult to answer. Once System 2 takes over, though, we can use Bayesian thinking to gather more information and update our decisions accordingly.
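To make that Bayesian step concrete, here’s a minimal sketch in Python. The hypothesis, base rate and likelihoods are made-up illustrative numbers of mine, not figures from Pherson or Kahneman:

```python
# Bayesian updating in one step (illustrative numbers only).
# Hypothesis H: "a beach swim in Australia ends in a shark encounter".

prior = 1e-6                 # assumed base rate: attacks are extremely rare
p_story_if_attack = 0.9      # assumed: vivid stories almost always follow an attack
p_story_if_no_attack = 0.3   # assumed: vivid stories circulate regardless

def bayes_update(prior, likelihood, alt_likelihood):
    """Return P(H | evidence) via Bayes' rule."""
    p_evidence = likelihood * prior + alt_likelihood * (1 - prior)
    return likelihood * prior / p_evidence

posterior = bayes_update(prior, p_story_if_attack, p_story_if_no_attack)
print(f"P(shark encounter | scary story) = {posterior:.8f}")
# The posterior barely moves: one vivid story is weak evidence, which is
# exactly what the availability heuristic makes us forget.
```

The point isn’t the exact numbers but the habit: state a base rate first, then let new evidence shift it proportionally instead of replacing it wholesale.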

3. Anchoring Effect

$119,900,000

The clue to the price tag is in the painting itself. $119.9 million is what the version of Norwegian painter Edvard Munch’s famous The Scream pictured below sold for in 2012. $18.25 is the price of a ticket to the Munch Museum in Tøyen, Oslo, Norway. You can afford that. Instead of buying, you could go see one of the other original versions of the same painting over six million times. What a bargain!

[Image: The Scream by Edvard Munch]
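For the curious, the back-of-the-envelope arithmetic behind that bargain: $119,900,000 ÷ $18.25 ≈ 6,569,863 museum visits, comfortably over six million.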

If you’re now rushing to book flights and museum tickets to Oslo, chances are you’ve misapplied the anchoring heuristic by accepting “the given value of something unknown as a starting point”. Imagine I had quoted the ticket price of $2 for a different art museum first. The Munch Museum would’ve seemed rather expensive in comparison.

Anchoring is a classic negotiation tactic. It works with numbers, but you can also anchor emotions. How would you feel as a receptionist if a guest told you they were about to ruin your day? You may be inclined to think of the worst that could’ve happened. Until it turns out all the guest wants is an upgrade. What a relief! This is an example from master negotiator Chris Voss who teaches all about anchoring in his book Never Split the Difference. It shows how quickly our tendency to take mental shortcuts can be taken advantage of.

We seem to be particularly susceptible to misapplying this mental shortcut when we don’t know much about something. Sure, it gives our minds something to hold on to when we’re lost. However, before making a decision, it’s a good idea to pause and reflect on who provided us with the anchor and with what intention. Speaking of pause and reflection.

4. Groupthink

Whenever you find yourself on the side of the majority, it is time to reform (or pause and reflect).

Mark Twain

Mark Twain’s aphorism probably sounds counterintuitive. Group verdicts have major advantages. Majority decisions are the bedrock of liberal democracy. Collaborative sensemaking benefits from the wisdom of many minds and can help us overcome the biases of the individual. However, sometimes we go with the majority opinion not because we think it’s the right one. We choose to agree out of a mere desire for consensus.

Granted, acknowledging and caring about what other people think is generally considered a good sign we’re not psychopathic.[1] But simply going with what the group deems best can become disadvantageous when groupthink sets in. Identified by Pherson as a misapplied heuristic, the term was coined by psychologist Irving L. Janis in a 1971 article:

I use the term groupthink as a quick and easy way to refer to the mode of thinking that persons engage in when concurrence-seeking becomes so dominant in a cohesive ingroup that it tends to override realistic appraisal of alternative courses of action.[2]

Janis further explains: the better the group gets along, the greater the danger of a lack of independent critical thinking. This can lead to self-censorship and to the ingroup making irrational decisions against an outgroup. As I’ve discussed in the Tenth Man Rule, a form of institutionalised devil’s advocacy can break through self-censorship and false consensus. It’s an adversarial approach and probably not for everyone; we don’t want to end up with a dictatorship of a loud minority either. A bit of pause and reflection seems like a good idea either way.

5. Premature Closure

Therefore test, who wants to bind himself forever,
Whether heart will find right heart.
Euphoria is short, remorse is long.

Friedrich Schiller, Song of the Bell

This excerpt from Schiller’s famous poem beautifully sums up the problem with Premature Closure. In Germany, the lines are often intentionally misquoted when it comes to relationships: “Therefore test, who wants to bind himself forever, whether he cannot find someone better.” Alluding to System 1 and System 2, Pherson warns that we fall into Premature Closure when we stop our efforts “when a seemingly satisfactory answer is found before sufficient information is collected and proper analysis can be performed”. So the question is: When do we know enough to put the lid on a decision?

Sacrificing rigour and patience for something that appears satisfactory on the surface could mean we’re missing out on something even better. Again, it would be ludicrous to prepare a cost-benefit analysis for every minuscule life decision. But settling for a heuristic becomes more and more costly the higher the stakes are.

On the one hand, stopping the search for an adequate answer at the very first sign of success doesn’t seem like a good idea. Social psychologist Jonathan Haidt has pointed out the negative effects of motivated reasoning which causes us to stop the search for evidence as soon as our initial intuitive judgement is confirmed.

On the other hand, it’s also not very efficient to keep searching for eternity. This is particularly true since decisions are mere hypotheses about what will happen once we implement them. In a sense, we’re trying to predict Black Swans, that is, the unpredictable. As soon as we start to think deliberately and logically about it, the lack of a stopping rule for gathering more information and generating hypotheses becomes apparent.
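One formal answer to the missing stopping rule, not from Pherson’s handbook but from classic optimal-stopping theory, is the secretary problem’s 37% rule: look at roughly the first n/e options without committing, then take the first one that beats everything seen so far. A minimal sketch with hypothetical candidate scores:

```python
import random

def search_with_stopping_rule(scores):
    """Secretary-problem heuristic: skip the first ~37% (n/e) of options,
    then pick the first one better than everything seen so far."""
    n = len(scores)
    cutoff = round(n / 2.718281828)       # n / e, the classic threshold
    benchmark = max(scores[:cutoff], default=float("-inf"))
    for score in scores[cutoff:]:
        if score > benchmark:
            return score
    return scores[-1]                     # no better option appeared: settle

options = random.sample(range(100), 20)   # 20 hypothetical candidate scores
print("picked:", search_with_stopping_rule(options), "| true best:", max(options))
```

It won’t guarantee the best outcome, but it bounds both failure modes above: stopping at the first satisfying answer and searching forever.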

Whether premature closure counts as a misapplied heuristic seems to depend on the decision at hand. My five principles for decision-making might bring some clarity, though, I’m afraid, there’s no ultimately satisfactory solution to this. Except for following Schiller and not trading euphoria for remorse. When in doubt, play the long game and choose the long term over the short term.

Closing Thoughts

Our minds are inclined to cut corners. Did you fall for my anchoring? Or did you know that the Gotthard Tunnel would only save you about 30 minutes?

Maybe I also spoke too soon when I compared heuristics to real-life shortcuts. Unlike choosing a tunnel to save some time, we can’t help but cut corners mentally. We often don’t seem to realise when we’re using a heuristic, and we fail to notice when it didn’t take us where we hoped it would. In other words, mental shortcuts are far less reliable. Especially when it comes to decisions with a long-term impact, it seems like a good idea to guard against misapplied heuristics.

Apart from knowing about the nature of mental shortcuts, their limitations and trade-offs, we can use analytic techniques to keep ourselves from jumping to conclusions and making bad decisions. Am I being deceived? Try the structured analytic technique of deception detection. What are the risks of starting something new? A SWOT Analysis might work. Is the submarine in the front yard the real deal? A simple five-step satellite image analysis can help. Plus, it can be quite entertaining to observe how our mind jumps to ridiculous conclusions.