The problem with traps is that they tend to pop up unexpectedly. The convenient thing about our intuition is its quick reaction time. What sounds like a perfect match can turn into a serious barrier to critical thinking when our intuition itself becomes the trap. Let’s first take a closer look at what intuitive traps are before diving into the five most common barriers to critical thinking.
What Are Intuitive Traps?
The term intuitive trap was coined by Randy Pherson, a former intelligence analyst turned entrepreneur. In a 2017 paper on Cognitive Biases and Intuitive Traps Most Often Encountered by Analysts, Pherson and co-author Mary C. Boardman define intuitive traps as “manifestations of cognitive biases”. In other words, they’re “shortcuts or mental mistakes” people make in the process of thinking critically.
Their ranking of intuitive traps resulted from polling nine experienced U.S. intelligence analysts with a background in national security. While the sample size doesn’t seem overwhelming, bear in mind that these are professionals who think for a living. The result should give us a good indication of which traps we may want to look out for in the future and how we can spot them.
5 Most Common Barriers to Critical Thinking
Here are the five most common intuitive traps from Pherson and Boardman’s paper.
1. Projecting Past Experiences
History doesn’t repeat itself, but it often rhymes.
– Mark Twain (allegedly)
To start us off, a majority of analysts considered the projection of past experiences to be the most common intuitive trap. As the term suggests, we tend to put an unreasonable amount of weight on our personal experiences. As a result, we may instinctively assume that the same familiar dynamic we experienced earlier is at play when facing a similar scenario.
Imagine you regularly sit on selection panels for hiring new staff. A candidate shows up to the job interview with mismatched socks. Now, you remember the last time an applicant with complete disregard for pedal symmetry was hired: she turned out to be rather disorganised. Ha! You know this one: mismatched socks mean chaos. You’re instantly highly suspicious of today’s non-conforming candidate – and might not even notice it.
I suspect what makes this trap so common is that there’s a sizable grain of truth in it. Predictive performance assessments tend to be built on the axiom that past behaviour and performance predict future performance. In a job interview, we base our judgement on the skills and abilities the applicant demonstrated in previous jobs, expecting that he or she will fare similarly in our organisation. Standardised performance prediction testing takes this idea one step further: by employing lab-tested statistical models, it shows how an applicant compares to his or her peers based on more universal indicators. Mind you, these methods are still limited and fallible.
Fooled By Intuition
Our intuition seems to function in a very similar way. Data from our past experiences are readily available in our memory. But is there really a correlation between sock selection and orderliness? It’s impossible to say with a sample size of n=1. A single observation, or even a couple of them, doesn’t make a valid indicator, let alone a method on which we should base important decisions. If we still went ahead and based our hiring decision on it, it would be a case of unreflective inductive reasoning at best. At worst, we’d be looking at a manifestation of misguided intuition. However we put it, Admiral Ackbar would be up in arms.
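To make the sample-size point concrete, here’s a quick simulation (purely illustrative, not from Pherson’s paper): if sock choice and orderliness are generated completely independently, a tiny sample will still "confirm" the mismatched-socks-mean-chaos pattern by sheer coincidence much of the time.

```python
import random

def spurious_match_rate(sample_size, trials=10_000, seed=42):
    """Fraction of trials in which every mismatched-sock candidate
    in a random sample also happens to be disorganised - even though
    the two traits are generated independently (pure chance)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Each candidate: (wears mismatched socks?, is disorganised?)
        # Both traits are independent coin flips.
        candidates = [
            (rng.random() < 0.5, rng.random() < 0.5)
            for _ in range(sample_size)
        ]
        mismatched = [disorg for mism, disorg in candidates if mism]
        # The "pattern" holds if all mismatched-sock candidates
        # observed in this sample turned out disorganised.
        if mismatched and all(mismatched):
            hits += 1
    return hits / trials

# With a single observation, the coincidental "pattern" appears in
# roughly a quarter of all trials; with 50 candidates, almost never.
print(spurious_match_rate(1))   # roughly 0.25
print(spurious_match_rate(50))  # close to 0
```

In other words, the same intuition that feels compelling after one memorable candidate evaporates as soon as the sample grows.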
In fact, we may be surprised to hear that some of our colleagues are in favour of our unorthodox applicant. How come? A 2014 Harvard study found evidence for what its authors dubbed the Red Sneakers Effect: nonconformity is often seen as a sign of competence and higher status – at least as long as the observer considers the deviation from the norm to be intentional. But I digress. Perhaps socks just aren’t a good performance predictor, and we shouldn’t expect our past experiences to repeat themselves only because they sound, look, or feel similar.
2. Presuming Patterns
And God said, Let there be lights in the firmament of the heaven to divide the day from the night; and let them be for signs, and for seasons, and for days, and years.
– Genesis 1:14
As we know, humans are pattern-seeking animals. Sometimes that means we fabricate them (the patterns) in order to make sense of the world. So it’s probably not surprising that Pherson and Boardman identified the presumption of patterns as the second most common intuitive trap. In reference to Daniel Kahneman’s Thinking, Fast and Slow, they define this pitfall as “believing that actions are the result of centralized planning or direction and finding patterns where they do not exist”.
Having said that, “it’s amazing what doesn’t exist in the real world,” as philosopher Alan Watts beautifully put it. I suppose we’ve all looked up at the stars trying to spot star constellations. In Australia, we even put one on our flags: the Southern Cross. However, as Watts reminds us, “they’re groups of stars in the mind’s eye” only. Nobody made a blueprint and put them there to look like a cross mainly visible from the Southern Hemisphere. I suppose the only reason it works is that we all roughly share the same perspective from planet Earth and can therefore agree on the presumed pattern.
But imagine what happens if we start seeing patterns in people’s behaviours, thinking there’s some grand plan behind them. We’re quickly in conspiracy theory territory. Sometimes there’s no grand design behind any of it as things can grow organically. It’s only our confirmation bias that keeps us trapped in this pitfall. Sometimes conspiracies prove to be true, though, and there actually was a pattern. You probably guessed it, it’s hard to tell based on intuition alone.
Finding patterns is bread and butter for an analyst, as well as for our everyday sensemaking. But instead of jumping to conclusions, we should see patterns as mere springboards, that is, as assumptions or hypotheses to be questioned, falsified, or verified.
3. Expecting Marginal Change
Nothing is so painful to the human mind as a great and sudden change.
– Mary Shelley, Frankenstein
If we find ourselves in this common intuitive trap, we have falsely assumed that change is always slow or incremental and limited our thinking accordingly. We were unprepared for the possibility of swift, radical transformation.
A great example is the Black Swan: an extremely rare event with a severe impact that nobody thought of or deemed possible. For instance, if you were a Brazilian football fan in 2014, you were probably unprepared for the emotional fallout of being beaten 7-1 by Germany. In a World Cup semi-final. On your own turf. Sure, not reaching the final after a close game and a lucky German goal was an option. But losing by a tear-inducing landslide of epic proportions? Probably not. Of course, radical change doesn’t have to be negative per se. Overnight success or fame can be equally unexpected and consequential. All of a sudden, you’re in high demand and completely overwhelmed.
At any rate, events of radical transformation tend to be more memorable than the gradual change we were expecting anyway. That’s why Mary Shelley’s observation seems to be such an accurate explanation for this intuitive trap. Who can blame our mind for favouring the more probable – and seemingly pain-avoiding – possibilities? Any sudden change can catch us off guard and throw us into chaos. So it might be a good idea to spend at least some resources on preparing for Black Swan-like events.
4. Favouring First-Hand Information
We know […] from simple empirical evidence in the history of science that the lowest form of evidence that exists in this world is eye-witness testimony.
– Dr. Neil deGrasse Tyson, Astrophysicist
When we favour first-hand information, we tend to put more weight on direct observations from anyone who can claim “I saw/heard…” – and that, of course, includes ourselves. Second-hand information, by contrast, comes from a source that did not observe an event directly but potentially heard about it from a first-hand source, though it can also include anything from rumours to third-party investigation reports, or even meta-analyses.
After a car accident, we’re probably inclined to trust our own eyes – or any eyewitness for that matter – more than what we hear from some talking head on the news. Where did the cars come from? How fast were they going? Was the lady in the red shirt driving, or was it her husband? Was it really a red shirt?
That said, it’s probably fair to assume most of us don’t observe the traffic all day long in the hope of seeing some action first-hand. Even if we were to witness an accident unfold before our very eyes, we’re likely to miss crucial details. That’s because we’d function as mere human recording devices at the mercy of all our biases. We tend to see only what we aim at, to such a degree that we’d likely even miss the guy in the gorilla costume.
While first-hand information is a valuable piece in the puzzle, we may gain more reliable data on a car crash from a slow and painstaking analysis of skid marks, car telemetry, or camera footage. It’s entirely possible that an analyst looking at the data on the other end of the planet 30 years later comes to more accurate conclusions than our own eyes, ears, and brains. At the end of the day, it’s about the level of impact we allow each source to have before we’ve vetted it for reliability and triangulated the information with all available data.
So, Pluto enthusiast and cameo collector Neil deGrasse Tyson is right, of course. Putting first-hand information first is an intuitive trap worth avoiding. Believe me, I’ve seen it myself.
5. The Halo Effect
Never meet your heroes.
– Proverb
The Halo Effect occurs when we let our emotions cloud our judgement: our positive (or negative) opinion of one of a person’s characteristics leads us to judge them positively (or negatively) in a different area, too. In literary terms, we mistake people for flat characters who revolve around a single trait when in reality they’re quite round; that is, complex and complicated.
Perhaps our personal hero is a witty and relatable talk show host. Yet, dining in a restaurant with his family, our role model may turn out not-so-witty and not-so-relatable when he tells us that he’d prefer to have dinner without us. On the other end of the spectrum, we may be inclined to reject an opinion held by a politician we despise, even if it’s actually one that would benefit the community.
It doesn’t really matter whether we’re on the receiving end of the Halo Effect or seemingly benefit from it: a teacher giving out bad marks because of cringeworthy handwriting, or a boss deciding on a promotion to a supervisor position based on likeability. The effect is equally bad. It distorts reality. It also doesn’t change the fact that we may actually be a genius in chemistry, or that we’ll probably fail as a likeable yet unqualified supervisor.
I guess what I’m trying to say is this: As long as you watch out for this intuitive trap, do meet your heroes.
Bonus Pitfall: The Fallacy-fallacy
Just because you’re paranoid doesn’t mean they aren’t after you.
– Joseph Heller, Catch-22
While this trap is not part of the Pherson paper, I think it fits nicely into the idea of intuitive traps. The fallacy-fallacy is the assumption that, because an argument contains a fallacy, its conclusion must be false. Indeed, with everything we now know, it’s tempting to dismiss a decision because someone stepped into a trap while making it.
It’s possible to fall into all intuitive traps at once. We may have presumed a pattern of incremental change which we learned first-hand from our personal science hero who had similar experiences in the past. The good news is that this doesn’t mean our conclusion is wrong. The bad news is that this doesn’t mean our conclusion is right, either. Just as crossing a freeway with our eyes closed and surviving doesn’t make it a good idea. The question is how likely we’d be to get away with it a second or a third time.
Overcoming Barriers to Critical Thinking
So how can we solve this trap conundrum? First of all, we’ve already made a good effort to understand what common intuitive traps look like and where they lurk in our intuitive thinking. This will help us notice when we take a mental shortcut without any pressure to do so. Instead, we can pause and question our assumptions.
Secondly, we should consider using analytical techniques to guide and scaffold our thinking. That’s often easier than it may sound since analysis doesn’t have to be abstract and conceptual. For instance, a simple set of criteria can go a long way to examine a phenomenon from different angles in order to figure out if the submarine in the front yard is actually a fake.
Third, even the most sophisticated analytical technique is limited by our own intellect. As Pherson and Boardman point out, we’re more inclined to see bias in others than in ourselves. So the obvious solution is to team up with other people as a means of checks and balances.
Intuitive decision-making makes great sense in situations where we have to think fast and act quickly. Unfortunately, thinking fast also invites cognitive bias. In the worst case, we rush to a judgement without even realising that we’ve fallen into a trap. In the best case, however, we can harness analytical techniques and the intuition of a whole team to avoid intuitive traps and overcome the barriers to critical thinking.