
50 Most Interesting Ideas and Concepts (100th Newsletter Special)

The 3 Ideas in 2 Minutes newsletter, featuring timeless knowledge and wisdom about critical thinking, philosophy, decision-making and more, was born two years ago. As a 100th newsletter special, here are my top 50 interesting ideas and concepts from ninety-nine 3 Ideas in 2 Minutes newsletters. Thank you to all subscribers and to all the legends who have supported my work with donations, a paid subscription or as Patrons. Without you, this wouldn’t be possible.

1. Motivated Reasoning

Social psychologist Jonathan Haidt on how we tend to reason:

Reasoning is very heavily motivated. We’re not very good at objective careful balanced reasoning. When we evaluate a proposition, anything: that tea is good for you, that Obama was born in Hawaii or Indonesia, wherever. Any proposition you evaluate, we don’t say: ‘What’s the evidence on one side, what’s the evidence on the other? Which one wins?’ We don’t do that. Our brains are not set up to do that.

We start with a feeling, we want to believe X or we want to doubt X. We ask: ‘Can I believe it? I want to believe it’. And then we send our reasoning off on a search to find evidence. If we find one piece of evidence we can stop.

If someone holds us accountable and says, ‘Why do you think that?’ you pull out the piece of evidence and say: ‘Here, this is why.’

Jonathan Haidt, Two incompatible sacred values in American universities

From: 3 Ideas in 2 Minutes on the Art of Reasoning

2. Chewbacca Defense

The Chewbacca Defense is a legal strategy that aims to confuse by deploying an elaborate nonsense argument enriched with needless repetitions, logical fallacies and irrelevant conclusions. It originated from a South Park episode mocking the closing argument of the O.J. Simpson trial:

Ladies and gentlemen of this supposed jury, I have one final thing I want you to consider. Ladies and gentlemen, this is Chewbacca. Chewbacca is a Wookiee from the planet Kashyyyk. But Chewbacca lives on the planet Endor. Now think about it; that does not make sense!

Why would a Wookiee, an 8-foot-tall Wookiee, want to live on Endor, with a bunch of 2-foot-tall Ewoks? That does not make sense! But more important, you have to ask yourself: What does this have to do with this case? Nothing. Ladies and gentlemen, it has nothing to do with this case! It does not make sense!

Look at me. I’m a lawyer defending a major record company, and I’m talkin’ about Chewbacca! Does that make sense? Ladies and gentlemen, I am not making any sense! None of this makes sense! And so you have to remember, when you’re in that jury room deliberatin’ and conjugatin’ the Emancipation Proclamation, does it make sense? No! Ladies and gentlemen of this supposed jury, it does not make sense! If Chewbacca lives on Endor, you must acquit! The defense rests.

Johnnie Cochran, South Park 2×14 Chef Aid

Note that even if your own argument is intelligent and well-reasoned, you may still be wrongly accused of having mounted a Chewbacca Defense. Hence the term Chewbacca Dilemma.

From: 3 Ideas in 2 Minutes on Sowing Confusion

3. The Linda Problem

Consider the following scenario proposed by psychologists Daniel Kahneman and Amos Tversky in the 1980s:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is more probable?

  1. Linda is a bank teller.
  2. Linda is a bank teller and is active in the feminist movement.

Kahneman & Tversky, Judgments of and by Representativeness

If you chose no. 2, you’ve fallen for the conjunction fallacy. The conjunction rule states that the probability of a conjunction can never be higher than the probability of either of its conjuncts.

In other words, both events combined (Linda being a bank teller and active in the feminist movement) cannot be more likely than a single event on its own. No matter how plausible Linda’s ideological leanings may sound.
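The conjunction rule is easy to verify numerically. Here is a minimal sketch using made-up illustrative probabilities (not figures from the original study):

```python
# Hypothetical, illustrative probabilities for Linda's case.
p_teller = 0.05                  # P(A): Linda is a bank teller
p_feminist_given_teller = 0.8    # P(B|A): active feminist, given she is a teller

# P(A and B) = P(A) * P(B|A). Since P(B|A) <= 1, the conjunction
# can never be more probable than P(A) alone.
p_both = p_teller * p_feminist_given_teller

print(round(p_both, 2))          # → 0.04
assert p_both <= p_teller
```

However plausible the feminist detail sounds, multiplying by a probability of at most 1 can only shrink the number.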

From: 3 Ideas in 2 Minutes on Probabilistic Thinking

4. The Dead Cat Manoeuvre

British politician Boris Johnson relates a story of Australian political strategist Lynton Crosby:

Let us suppose you are losing an argument. The facts are overwhelmingly against you, and the more people focus on the reality the worse it is for you and your case. Your best bet in these circumstances is to perform a manoeuvre that a great campaigner describes as “throwing a dead cat on the table, mate”.

That is because here is one thing that is absolutely certain about throwing a dead cat on the dining room table — and I don’t mean that people will be outraged, alarmed, disgusted. That is true, but irrelevant. The key point, says my Australian friend, is that everyone will shout, ‘Jeez, mate, there’s a dead cat on the table!’ In other words, they will be talking about the dead cat — the thing you want them to talk about — and they will not be talking about the issue that has been causing you so much grief.

Boris Johnson

From: 3 Ideas In 2 Minutes on Not Being Misled

5. Muphry’s Law

You know Murphy’s Law. Have you heard of Muphry’s Law? Here’s Australian author John Bangsund explaining the pitfalls of editing and critique:

(a) If you write anything criticizing editing or proofreading, there will be a fault of some kind in what you have written;

(b) if an author thanks you in a book for your editing or proofreading, there will be mistakes in the book;

(c) the stronger the sentiment expressed in (a) and (b), the greater the fault;

(d) any book devoted to editing or style will be internally inconsistent.

John Bangsund, Muphry’s Law

I’d like to take this opportunity to express my immense gratitude to the love of my life for proofreading all newsletters.

From: 3 Ideas in 2 Minutes on the Pitfalls of Perfect Planning

6. The Streetlight Effect

The Streetlight Effect is an observer bias whose name can be traced back to a common joke:

A policeman sees a drunk man searching for something under a streetlight and asks what the drunk has lost. He says he lost his keys and they both look under the streetlight together. After a few minutes the policeman asks if he is sure he lost them here, and the drunk replies, no, and that he lost them in the park. The policeman asks why he is searching here, and the drunk replies, “this is where the light is”.

David H. Freedman, Wrong: Why Experts Keep Failing Us

However, the Streetlight Effect does not only apply to the drunk. Political scientist Robert Jervis explains why:

Just like the drunk who looked for his keys not where he dropped them, but under the lamppost where the light was better, people often seek inadequate information that is readily available, use misleading measures because they are simple, and employ methods of calculation whose main virtue is ease.

Robert Jervis, The Drunkard’s Search

From: 3 Ideas in 2 Minutes on Knowledge About Knowledge

7. Permit A38

Acquiring a Permit A38 is one of the tasks cartoon characters Asterix and Obelix have to complete in the animated film Twelve Tasks of Asterix. It’s only a formality, really. They must get the permit in the Place That Sends You Mad, a large building housing incompetent and useless bureaucrats. The two are almost driven insane by unhelpful staff who send them from one office to another on their pointless quest to get Permit A38.

Asterix and Obelix only manage to complete the task by requesting a made-up Permit A39. This plunges the place into chaos as the staff try to figure out what the form is. Eventually, the bureaucrat in chief hands Asterix Permit A38, just to get rid of them. It’s worth remembering this the next time you have to visit your local administration.

From: 3 Ideas in 2 Minutes on Bureaucratic Insanity

8. Mental Shotgun

The mental shotgun illustrates our human tendency to assess the world around us continuously. We compute all the time, effortlessly and intuitively, though often more than needed. Unfortunately, this quick way of thinking lacks precision — like a shotgun.

The term was coined by psychologist Daniel Kahneman. He popularised the distinction between our fast and instinctive System 1 of thinking and the slow and logical System 2, which sets off the mental shotgun:

An intention of System 2 to answer a specific question or evaluate a particular attribute of the situation automatically triggers other computations, including basic assessments.

Daniel Kahneman, Thinking, Fast and Slow

From: 3 Ideas in 2 Minutes on Good Decision-Making

9. Decision Fatigue

Sushi or sandwiches? Coffee or juice? Eat-in or takeaway? The more decisions we make throughout the day, the lower our mental energy level drops. We get tired of picking and choosing as it depletes our self-regulatory resources. The quality of our decisions deteriorates. Decision fatigue sets in.

Naturally, our mind tries to save energy. Either impulsively by taking mental shortcuts (“Sushi. Whatever…”) or by shutting down entirely and doing nothing (“No lunch for me, thanks.”) — with potential long-term consequences.

Decision fatigue is also the reason why certain businesspeople wear the same outfit every day. It seems that reducing the number of trivial choices we have to make each day can be quite liberating.

From: 3 Ideas in 2 Minutes on Good Decision-Making

10. The Closing Time Effect

Do people really get prettier the later the night gets, as a German saying goes? Psychologists have been onto this since the late 1970s. Here are the findings of a 2010 paper from Australian researchers Carly Johnco et al.:

87 patrons in an Australian pub rated the attractiveness of opposite sex and same sex participants at three times over the course of a night in a repeated measures design. As the night progressed, Blood Alcohol Concentration (BAC) as measured with a breathalyzer increased, as did ratings of opposite sex attractiveness. Same sex attractiveness did not change. The increase in opposite sex attractiveness ratings was only partially due to BAC. Because participants with partners showed the same closing time effect as single participants, reactance theory [the feeling that someone’s taking away your choices as the night progresses], the usual explanation for the closing time effect, is not an adequate explanation. Mere exposure and a scarcity effect are better explanations.

Johnco et al., They Do Get Prettier at Closing Time

It turns out that the Closing Time Effect is real. Even when sober.

From: 3 Ideas in 2 Minutes on the Progress of Time

11. Zeigarnik Effect

Imagine you’re a waiter in Berlin, Germany in the 1920s. Then imagine you’re trying to remember the orders of your customers. Now imagine you’re being observed by a Soviet psychologist named Bluma Zeigarnik who notices that you can better recall those orders which are still being prepared.

Congrats, you have just inspired her to run a series of experiments that establish the Zeigarnik Effect, the notion of interrupted activities being more easily recalled than finished ones.

Even if you haven’t heard of Frau Zeigarnik, you’ve probably seen the effect in action. Whenever a TV show leaves you in anticipation of what comes next, they’re probably banking on the Zeigarnik Effect and your need for closure. We should note, though, that the Zeigarnik Effect has not always been successfully replicated. To what extent it’s a real phenomenon is yet to be fully determined…

From: 3 Ideas in 2 Minutes on Unfinished Business

12. The Peter Principle

The Peter Principle is a semi-satirical explanation for incompetence in the workplace. It was formulated in 1969 by Laurence J. Peter and Raymond Hull who looked into the nature of hierarchical structures. Evaluating hundreds of case studies the authors concluded:

In a Hierarchy Every Employee Tends to Rise to His Level of Incompetence.

In other words, the last promotion someone receives is always to a position of ineptitude. The lesser-known Peter’s Corollary takes this idea to its logical conclusion:

In time, every post tends to be occupied by an employee who is incompetent to carry out his duties.

If you find yourself wondering who does all the work then, Peter has the answer:

Work is accomplished by those employees who have not yet reached their level of incompetence.

I’ve written an in-depth essay about the Peter Principle in which I addressed the only solution to the conundrum: creative incompetence.

From: 3 Ideas in 2 Minutes on Staggering Incompetence

13. Riker’s Razor

You’re incapable of that level of incompetence, Mr La Forge!

Says Commander Will Riker to his engineer extraordinaire Geordi in Star Trek: The Next Generation episode 4×08 Future Imperfect. The officer has been behaving stupidly for far too long. Shortly after, Riker unmasks him and the whole crew as part of an enemy simulation designed to extract information from the deceived. It was all a charade.

With this anecdote in mind, I propose a new principle, Riker’s Razor:

If someone’s incompetence is too staggering to be true, they’re most likely faking it and you should find out why.

From: 3 Ideas in 2 Minutes on Staggering Incompetence

14. The Buttered Cat Paradox

The Buttered Cat is a faux paradox based on two well-known principles:

  1. Cats always land on their feet.
  2. Toast always lands on the buttered side.

According to its creator, John Frazee, the real question here is — of course — what would happen if you strapped buttered toast to a cat and threw it off the table?

Frazee concocted the thought experiment in 1993 and won the grand prize in a magazine’s I Have a Theory competition. He suggested the following outcome:

The two will hover, spinning, inches above the ground. With a giant buttered cat array, a high-speed monorail could easily link New York with Chicago.

Source: OMNI Magazine

15. Taleb’s Turkey

Essayist and former risk analyst Nassim Nicholas Taleb popularized the idea of Black Swans, improbable high-impact events that are only obvious in hindsight. To illustrate his point he relates the story of a Thanksgiving turkey:

Consider a turkey that is fed every day. Every single feeding will firm up the bird’s belief that it is the general rule of life to be fed every day by friendly members of the human race ‘looking out for its best interests,’ as a politician would say. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.

Nassim Nicholas Taleb, Black Swan: The Impact of the Highly Improbable

As Taleb notes, though, the event was only unexpected from the turkey’s perspective. The butcher knew it all along.

From: 3 Ideas in 2 Minutes on Unknown Unknowns

16. The Tenth Man Rule

You’ve heard of devil’s advocacy, but have you heard about its cinematic version called the Tenth Man Rule? It’s an institutionalised way of arguing against a prevailing opinion or orthodoxy. It was introduced in the zombie blockbuster World War Z (2013):

If nine of us who get the same information arrived at the same conclusion, it’s the duty of the tenth man to disagree. No matter how improbable it may seem. The tenth man has to start thinking about the assumption that the other nine are wrong.

Mossad Chief Jurgen Warmbrunn, World War Z

If this idea intrigues you, check out my popular long-form essay on the Tenth Man Rule.

From: 3 Ideas in 2 Minutes on Group Dynamics

17. Knoll’s Law of Media Accuracy

Imagine you witness a car accident first-hand. A day later you read about it in the newspaper. But what you read has little to do with your personal experience from the day before. Here’s journalist Erwin Knoll reminding us about the imperfection of news stories:

Everything you read in the newspapers is absolutely true except for the rare story of which you happen to have firsthand knowledge.

Erwin Knoll, quoted in The New York Times

Put into practice, Knoll’s Law of Media Accuracy serves as a reminder that journalists and editors are prone to biases and mistakes just like anyone else. The key is not to overgeneralise, but to extrapolate from the stories we’ve experienced ourselves: if a journalist gets the events you know firsthand right, that’s a reasonable indicator of their overall reliability.

From: 3 Ideas in 2 Minutes on the Media Getting Things Wrong

18. Reward Prediction Error

Imagine you anticipate being rewarded for something you do, but the reward you actually receive is much less. You’ve committed a Reward Prediction Error. Here’s neuroscientist Andrew Huberman on how to use this knowledge to your advantage when it comes to reacting to negative comments:

Understand “reward prediction error” & you will never reply to a negative comment again. Negative comments open a dopamine anticipation loop (in the commenter). Respond and the circuit closes; they get rewarded. Don’t respond & their dopamine will eventually drop below baseline.

Andrew D. Huberman
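In computational terms, a reward prediction error is simply the gap between the reward received and the reward predicted. A minimal sketch of this standard definition (the function name and numbers are illustrative, not from the newsletter) shows why an ignored comment registers as a loss:

```python
def reward_prediction_error(predicted: float, actual: float) -> float:
    """Positive when outcomes beat expectations, negative when they
    fall short (dopamine dropping below baseline)."""
    return actual - predicted

# The commenter anticipates a reply (predicted reward) but gets none:
print(reward_prediction_error(predicted=10.0, actual=0.0))  # → -10.0
```

A response, by contrast, makes the actual reward match or exceed the prediction, closing the loop in the commenter’s favour.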

From: 3 Ideas in 2 Minutes on Overcoming Negativity

19. The Cobra Effect

Unfortunately, the mere intent to make the world a better place sometimes has the opposite effect. The Cobra Effect is a classic illustration of such unintended negative consequences:

During the British rule of India, when the population of venomous cobras rose to worrying levels in Delhi, authorities offered a reward for dead cobras. People tracked the snakes down, killed them and turned them in. It worked — until it didn’t.

Eventually, some inventive locals began to breed cobras so they could make a profit. This of course led the British government to end the program. Problem solved — only it wasn’t.

The cobras had suddenly become useless to the breeders. So they set them free, once again causing a cobra plague in Delhi. It’s even said it was worse than before the government intervention.

Horst Siebert, a German economist, related a version of the above anecdote, coining the term Cobra Effect. The road to hell can truly be paved with good intentions.

From: 3 Ideas in 2 Minutes on Unintended Consequences

20. The Ship of Theseus

The Ship of Theseus is a famous thought experiment about identity and identity change. First mentioned by the ancient Greek philosopher Plutarch, it goes like this:

The ship wherein Theseus and the youth of Athens returned had thirty oars, and was preserved by the Athenians down even to the time of Demetrius Phalereus, for they took away the old planks as they decayed, putting in new and stronger timber in their place, insomuch that this ship became a standing example among the philosophers, for the logical question of things that grow; one side holding that the ship remained the same, and the other contending that it was not the same.

Plutarch, Theseus

Put simply, if all the planks of Theseus’ ship are replaced, is it still the king’s ship?

From: 3 Ideas in 2 Minutes on the Question of Who We Are

21. The Worf Effect

The Worf Effect is a trope named after one of the strongest and toughest characters in the science fiction series Star Trek: The Next Generation: the Klingon Worf. It’s used in storytelling to show how powerful an antagonist is.

If the villain fights Worf (or any established strong character) and wins, we immediately know what kind of evil we’re dealing with. The character’s dangerousness is made clear. The conflict is established. The plot can take its course.

Careful though, if it’s overused it can damage the tough character’s reputation. How strong can Worf really be if he gets beaten up every other episode…?

From: 3 Ideas in 2 Minutes on Power Dynamics

22. Mind Palace Technique

The Mind Palace is where Benedict Cumberbatch’s eccentric detective Sherlock Holmes retreats to when he tries to solve a case. It’s a modern spin on the character’s unique ability to remember details and notice patterns. Dr Watson explains:

It’s a memory technique, a sort of mental map. You plot a map with a location — it doesn’t have to be a real place — and then you deposit memories there. Theoretically, you can never forget anything. All you have to do is find your way back to it.

Dr Watson, Sherlock: The Hounds of Baskerville

I’ve written in detail about the real-life applications of this method in Sherlock’s Mind Palace: How to Memorise Information Like Sherlock Holmes.

From: 3 Ideas in 2 Minutes on Memorization Techniques

23. Curse of Knowledge

Has this ever happened to you? You’ve learned a new skill such as playing chess. It feels like you were the last person on earth to do so. You’ve gotten pretty good at it. But now you can’t even remember what it was like to be completely oblivious about the Game of Kings. Perhaps, when talking to other people, you even treat them as if they knew all about the board game — while they just smile and nod.

You’re under the Curse of Knowledge, a cognitive bias that makes it difficult for us to walk the proverbial mile in someone else’s metaphorical sneakers. We simply assume everyone knows what we know. Being cursed with expertise is particularly tricky for teachers: we want them to be both highly knowledgeable and skilled, but also able to cater to students at a myriad of different levels.

From: 3 Ideas in 2 Minutes on the Curse of Knowledge

24. Shaggy Dog Story

A Shaggy Dog Story is a form of storytelling that will blow your mind — albeit in an unexpected way. Here’s the one that gave the genre its name:

A boy owned a dog that was uncommonly shaggy. Many people remarked upon its considerable shagginess. When the boy learned that there are contests for shaggy dogs, he entered his dog. The dog won first prize for shagginess in both the local and the regional competitions. The boy entered the dog in ever-larger contests, until finally he entered it in the world championship for shaggy dogs. When the judges had inspected all of the competing dogs, they remarked about the boy’s dog: “He’s not that shaggy.”

Ted Cohen

If you find this short anecdote rather long-winded, irrelevant and pointless, you’ve recognised the essence of a Shaggy Dog Story.

From: 3 Ideas in 2 Minutes on Compelling Narratives

25. Scope Neglect

Scope Neglect is a cognitive bias that leads people to disregard the extent of a problem. In an experiment, psychologists asked people how much they would give to save birds affected by an oil spill. Whether 2,000, 20,000 or 200,000 birds were in need of saving made little difference to how much people were willing to donate.

Why? Psychologist Daniel Kahneman suggested that humans can’t cope with large numbers and tend to revert back to a single simple image:

The story […] probably evokes for many readers a mental representation of a prototypical incident, perhaps an image of an exhausted bird, its feathers soaked in black oil, unable to escape.

Daniel Kahneman

As a species that evolved from tribal life, we don’t seem to cope well with huge numbers.

From: 3 Ideas in 2 Minutes on Putting Things in Perspective

26. Chatham House Rule

At Chatham House, a British think tank, there’s one rule and one rule only. The Chatham House Rule:

When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed.

Chatham House

The rule was devised in 1927. It’s intended to facilitate an open dialogue within the walls of the policy institute. Free of groupthink and without the risk of punishment for wrongthink.

From: 3 Ideas in 2 Minutes on Codes of Confidentiality

27. The Trap of Marginal Thinking

Remember the movie rental service Blockbuster? Their business model relied on customers renting movies. But also on customers returning the films in time so that other customers could rent them again. Since people didn’t like returning the DVDs, Blockbuster ended up massively increasing their late fees.

In the context of Blockbuster’s decision, Clayton Christensen, author of How Will You Measure Your Life?, explains this Trap of Marginal Thinking:

Set against this backdrop, a little upstart called Netflix emerged in the 1990s with a novel idea: rather than make people go to the video store, why don’t we mail DVDs to them? Netflix’s business model made profit in just the opposite way to Blockbuster’s. Netflix customers paid a monthly fee — and the company made money when customers didn’t watch the DVDs that they had ordered. As long as the DVDs sat unwatched at customers’ homes, Netflix did not have to pay return postage — or send out the next batch of movies that the customer had already paid the monthly fee to get.

Clayton Christensen

Settling for small changes (as opposed to innovating a whole business model) can be fateful. Blockbuster filed for bankruptcy in 2010.

From: 3 Ideas in 2 Minutes on Innovative Thinking

28. Crime Pattern Theory

Why do people commit crimes in certain areas? Crime Pattern Theory suggests that some criminals pick targets based on opportunities they find during everyday activities:

As offenders move through routine activities of home, school, work, entertainment (shopping and recreation) they develop knowledge of the paths to their routine activities as well as areas around routine activities (personal awareness spaces). Different offenders may have different awareness spaces which may overlap. Generally, motivated offenders will discover potentially good target areas which offer a good choice of targets and low risk within their awareness space, although some will seek out uncharted areas.

Leakha Henry & Brett Bryan, Paper on Visualising Motor-Vehicle Theft

In practice, someone might choose to break into a fancy car parked in a dark corner next to a public park. Because they pass it regularly on the way to their piano lessons.

From: 3 Ideas in 2 Minutes on the Power of Habits

29. Unread Guilt Factor

It can be difficult to build a habit of reading newsletters, especially when there are so many of them. The Unread Guilt Factor is one of the reasons why we cancel our subscriptions: not because we don’t enjoy the content, but because we cannot keep up with it.

The term was coined by Denise Law, then lead development manager at The Economist. Offering audio versions of articles seems to have helped the paper to keep readers happy. It’s great to have the option to listen to the content. Guilt-free while going about our day. For example on the way to piano lessons.

From: 3 Ideas in 2 Minutes on the Power of Habits

30. Backwards Law

In life, things tend to turn out differently than we think. Philosopher Alan Watts has described this phenomenon as the Backwards Law:

I have always been fascinated by the law of reversed effort. Sometimes I call it the ‘backwards law.’ When you try to stay on the surface of the water, you sink; but when you try to sink, you float. When you hold your breath, you lose it — which immediately calls to mind an ancient and much neglected saying, ‘Whosoever would save his soul shall lose it.’

Alan Watts, The Wisdom of Insecurity

From: 3 Ideas in 2 Minutes on the Struggle of Life

31. Narrative Fallacy

The Narrative Fallacy is one of the challenges that come with our tendency to think in stories. This misconception causes us to see narratives where there are none.

When we learn about a series of events or facts we tend to sequence them together. We fill in the gaps of time and place, flesh out the characters and think about what might happen and why. In short, we turn random facts into a story.

The dog sleeps under the combine harvester.

The farmer turns the keys.

…are two completely unrelated statements. But our mental cinema carefully crafts a needlessly terrible story out of them. With imaginary causes and effects. So relax, the farmer is alive and well.

From: 3 Ideas in 2 Minutes on Persuasive Storytelling

32. The Diderot Effect

Denis Diderot was full of excitement when he was gifted a new red dressing gown. It was magnificent. Fashionable. Expensive. Only now, the old possessions of the 18th-century philosopher seemed dated and inadequate in comparison. Diderot’s solution was to spend money on replacing his cheap-looking old stuff to match his fancy new gown.

Called the Diderot Effect, this social phenomenon states that new possessions can lead us to buy more things. Even though these reactive purchases are usually completely unnecessary. The effect goes back to Diderot’s essay Regrets on Parting with My Old Dressing Gown in which he notes:

I was the absolute master of my old robe. I have become the slave of the new one.

From: 3 Ideas in 2 Minutes on Understanding Our Desires

33. Premortem Analysis

Premortem Analysis is a structured analytic technique designed to identify mistakes before they’re made. It’s closely related to a post-mortem, the medical examination of a body to determine the cause of death.

A Premortem is best done in a group, shortly before a decision is implemented or a project is finalised. It hinges on a crucial reframing of the situation at hand:

Imagine yourselves a few weeks or months in the future. Our decision has turned out to be spectacularly mistaken, our project has failed. What went wrong?

The group then spends time brainstorming potential sources of failure. After collating and discussing the results, they decide what actions to take. Intrigued? Read my article on Premortem Analysis: How to Anticipate Failure.

From: 3 Ideas in 2 Minutes on the Benefit of Hindsight

34. Cunningham’s Law

You’d think the best way to get answers in life is to ask the right questions. According to Cunningham’s Law, though, you’d be better off with a different strategy. At least on the internet:

The best way to get the right answer on the internet is not to ask a question; it’s to post the wrong answer.

The law is named after Ward Cunningham, the programmer who developed the first wiki software. Fittingly, Wikipedia seems to be built on the idea that people love to find mistakes and correct them. The rest is just a happy coincidence.

From: 3 Ideas in 2 Minutes on Unexpected Contradictions

35. Russell Conjugation

Russell Conjugation, aka Emotive Conjugation, can be used to manipulate how people think of us and others — or to entertain. Named after British philosopher Bertrand Russell, it’s derived from the idea of conjugating irregular English verbs. Here’s how it works.

When describing an event or person, think of a neutral verb or adjective. Then pick synonyms depending on whether you want those involved to be seen in a positive or negative light.

I explained it, you schooled him, and she pontificated over it.

I am passionate. You are angry. He is unhinged.

Needless to say, we tend to use more charitable expressions when describing our own behaviour. Look for Russell Conjugation in news reports and opinion pieces. Perhaps the person who was purportedly “eviscerated” verbally was merely being informed of a fact she didn’t know and happily conceded a point.

From: 3 Ideas in 2 Minutes on Language and Our Thinking

36. Meyer’s Law

We all know those emails we get on a Friday afternoon. Chris Meyer, a writer and analyst with a weakness for self-referential humour, has coined a relevant law:

Any email received on a Friday afternoon, shortly before close of business is bad news. Either the sender is terrified of the response, wants to ruin your weekend, or both.

Chris Meyer

From: 3 Ideas in 2 Minutes on the Art of Writing Emails

37. Abilene Paradox

Imagine yourself in Texas, USA. You’re bored out of your skull. And so is the rest of your family. To cut through the awkward silence, one family member suggests a trip to the small town of Abilene for dinner. Everyone seems to agree, so you go and pretend to enjoy the awful family trip. It’s only when you’re back home that everyone realises: Nobody wanted to go on that trip in the first place.

This phenomenon is called the Abilene Paradox: Sometimes, a group acts against its members' preferences even though everyone privately agrees the decision is wrong. The will of the group is merely assumed. Nobody wants to upset the others by refusing to go along with a decision.

The term was coined by management expert Jerry B. Harvey who related the above scenario in his 1974 article The Abilene Paradox: The Management of Agreement.

From: 3 Ideas in 2 Minutes on Being Too Agreeable

38. Villain of the Week

It’s hard to deny that there’s a high demand for malevolent characters and creatures when it comes to entertainment. Especially when you’re running a weekly TV show.

The Villain of the Week (aka Monster of the Week or Alien of the Week) delivers just that: a new antagonist every episode. The storytelling trope is popular with weekly TV shows. For the most part, the TV series of the 80s and 90s, such as The A-Team, didn’t have an ongoing plotline. Instead, our heroes had to start from square one and fight a different antagonist every week.

It’s almost as if evil is much more replaceable than good.

From: 3 Ideas in 2 Minutes on Thinking About Malevolence

39. Benford’s Law of Controversy

Benford’s Law of Controversy is an adage coined by astrophysicist and writer Gregory Benford. In his 1980 novel Timescape, he wrote:

Passion is inversely proportional to the amount of real information available.

The more data we have available, the less passionate we tend to be about something, as it leaves less room for personal opinions and interpretations. The less data we have, on the other hand, the more our feelings, emotions and passions come into play.

From: 3 Ideas in 2 Minutes on Finding Your Passion

40. Moral Dumbfounding

Would you sell your soul for $2? Imagine — as part of an experiment — you’re being asked to sign a contract to sell your soul after your death. You earn $2 on the spot and can tear up the signed contract and keep the pieces. But is this moral? And what do you base your judgement on? Intuition, emotions or reasoning?

This and other “moral intuition” scenarios were part of a psychology experiment to test how we come to moral judgements. Many participants had a gut feeling about the ethics of similar situations. But they were unable to provide reasons to support their judgement. Hence the term Moral Dumbfounding:

Moral dumbfounding occurs when people stubbornly maintain a moral judgement, even though they can provide no reason to support their judgements.

Jonathan Haidt et al.

The origins of the term can be traced back to the 2000 paper Moral Dumbfounding: When Intuition Finds No Reason by psychologists Jonathan Haidt, Fredrik Björklund and Scott Murphy.

From: 3 Ideas in 2 Minutes on Moral Dilemmas

41. Apophenia

Apophenia is a term for the human tendency to see patterns where none exist. The data we look at may be random. But we still try to find connections to make sense of them. Alan Watts has talked about this phenomenon from a philosophical perspective:

Now it’s amazing what doesn’t exist in the real world. For example, in the real world there aren’t any things, nor are there any events. That doesn’t mean to say that the real world is a perfectly featureless blank.

It means that it is a marvelous system of wiggles, in which we describe things and events in the same way as we would project images on a Rorschach plot. Or pick out particular groups of stars in the sky and call them constellations as if they were separate groups of stars.

Well, they’re groups of stars in the mind’s eye, in our system of concepts. They are not out there as constellations already grouped in the sky.

Alan Watts

The term was coined by German psychiatrist Klaus Conrad while studying the early stages of schizophrenia.

From: 3 Ideas in 2 Minutes on Thinking Critically About Data

42. Loki’s Wager

Loki (the god from the original Norse mythology, not the former Thanos collaborator) gave his name to this verbal fallacy, Loki’s Wager. His reputation was that of a cunning trickster who loved to play pranks on friends and foes alike. Here’s one you haven’t seen in a Marvel movie:

In a bet with the dwarf Brokkr, Loki wagered his head and lost. Yet he kept it nonetheless. Don’t get me wrong, the god was happy to oblige and have his head severed from the rest of his body. But he insisted that, in doing so, Brokkr must not take any part of his neck. And where exactly does the neck end and the head begin? With this linguistic trick, Loki kept his head, as the matter was debated indefinitely.

For us mortals, the implications of Loki’s Wager are threefold:

  1. It’s easy to agree to a deal. But when it comes to implementing it, the devil is in the detail.
  2. Beware of linguistic stalling tactics used to postpone a decision or action. Meaning, don’t get lost in semantics.
  3. Pay attention to people who claim something cannot be defined. It could be a ploy to shield an idea from criticism.

Anyhow, Loki moves in mysterious ways, which is why we cannot be sure the Norse god doesn’t actually live on as Tom Hiddleston.

From: 3 Ideas in 2 Minutes on Staying Sceptical

43. Buridan’s Ass

Buridan’s Ass is a satirical spin on a philosophical paradox about free will. It’s attributed to Jean Buridan, a 14th-century French philosopher who is quoted as saying:

Should two courses be judged equal, then the will cannot break the deadlock, all it can do is to suspend judgement until the circumstances change, and the right course of action is clear.

The idea was later simplified into the picture of an ass (donkey) that’s equally hungry and thirsty. It’s placed halfway between a bucket of water and a stack of hay. What’s the ass going to do? Well, according to Buridan, it will die, as it is unable to choose between food and water.

From: 3 Ideas in 2 Minutes on Making Impossible Decisions

44. Noble Cause Corruption

Corrupt people abuse the power they’ve been given for private gain. Noble Cause Corruption, however, is a form of corruption that comes in the guise of virtue. When we are convinced of the nobility of our goals, we may think the ends justify the means. That makes it particularly sinister: it’s easier to justify and legitimise our immoral actions if they come from a good place in our hearts.

Noble Cause Corruption has its origins in the ethics of law enforcement, where it’s a major consideration. Back in 1983, it was still known as The Dirty Harry Problem. Criminal justice scholar Carl Klockars noted how the fictional detective Harry Callahan (Clint Eastwood) tortures and murders his way towards justice, much like Liam Neeson in Taken 25 years later.

From: 3 Ideas in 2 Minutes on Seeing the Good in People

45. Gurwinder’s Theory of Bespoke Bullshit

With so much happening in the world, we can’t opine on everything. Needless to say, we do it anyway. Writer Gurwinder Bhogal calls this the Theory of Bespoke Bullshit.

Many don’t have an opinion until they’re asked for it, at which point they cobble together a viewpoint from whim & half-remembered hearsay, before deciding that this 2-minute-old makeshift opinion will be their new hill to die on.

Gurwinder Bhogal

The concept is reminiscent of Motivated Reasoning, the idea that we rarely form an opinion based on careful analysis. Instead, we start with a gut feeling and then go on a search to find justification for it.

From: 3 Ideas in 2 Minutes on Forming Your Opinions

46. The Bed of Aristotle

The Bed of Aristotle is a nonexistent philosophical concept that I asked OpenAI to define anyway.

The Bed of Aristotle was a legendary bed designed by the ancient Greek philosopher Aristotle. The bed was said to be constructed of bronze and iron and was flanked by two bronze statues of lions. The bed had a secret compartment where Aristotle could store his scrolls and documents. According to legend, the bed’s frame was adorned with intricate carvings depicting the constellations of the night sky.

The Bed of Aristotle was not only a symbol of the philosopher’s wealth and power, but it also serves as an example of his innovative approach to problem-solving and critical thinking. Aristotle used the bed as a thinking tool, and the secret compartment was likely a place where he kept his notes and records. By studying the stars and constellations, Aristotle was able to develop his theories of philosophy, physics, and astronomy.

The Bed of Aristotle is a reminder that critical thinking and problem-solving require careful thought and analysis and that we can often find creative solutions to difficult problems by looking to the heavens.

Keen on more pointless ideas? Check out my essay about 10 AI-Generated Ideas That Fly in the Face of Critical Thinking.

From: 3 Ideas in 2 Minutes on Critical Thinking in the Age of AI

47. Social-Circle Heuristic

Should you learn to play chess or take up a new instrument? When we use the Social-Circle Heuristic, we don’t embark on a long-winded process of soul-searching and reasoning to answer such questions. Instead, we infer the better of two options by considering our current social circle.

This involves taking into account the beliefs, attitudes and opinions of the people who are close to us. We rely on their collective wisdom, so to speak. The heuristic is especially useful when we’re unfamiliar with the problem at hand, helping us reach a good decision quickly.

Let’s say most of the people we hang out with are musicians. That’ll end our search quickly as we’ll probably decide to go learn an instrument. Reminiscent of the theory of mimetic desire, the quality of our choices seems to depend heavily on other people.

From: 3 Ideas in 2 Minutes on Thinking More Efficiently

48. Bulverism

Arguments should be evaluated based on their merits. Unfortunately, that’s not how we usually approach them. Instead, we assume that whoever uttered them is wrong and then go on a quest to find out why. British writer C.S. Lewis noted this fallacy in the 1940s and named it Bulverism.

You must show that a man is wrong before you start explaining why he is wrong. The modern method is to assume without discussion that he is wrong and then distract his attention from this (the only real issue) by busily explaining how he became so silly. In the course of the last fifteen years I have found this vice so common that I have had to invent a name for it. I call it “Bulverism”.

C.S. Lewis, Bulverism

The fallacy is named after Ezekiel Bulver, an imaginary inventor whom Lewis created, complete with backstory.

Bulverists tend to attack speakers and their motives instead of the substance of their claims. This is why Bulverism is considered an ad hominem, one of the lowest forms of dissent according to Graham’s Hierarchy of Disagreement.

From: 3 Ideas in 2 Minutes on the Dynamics of Disagreement

49. Scarcity Effect

Don’t be fooled by the Scarcity Effect: if something appears rare, we’re more likely to desire it. Why? Because we perceive products in limited supply as more valuable.

The psychological phenomenon is often used in advertising and marketing to create a sense of urgency and encourage people to take action before the opportunity is gone.

For example, when I published this idea in the original newsletter, I offered a limited-time-only yearly subscription for $40 instead of $50. Unfortunately, this opportunity is now gone. Would you subscribe anyway?

From: 3 Ideas in 2 Minutes on Making Difficult Decisions

50. Wittgenstein’s Ruler

Wittgenstein’s Ruler is a philosophical concept coined by author Nassim Nicholas Taleb, who named it after philosopher Ludwig Wittgenstein:

Unless the source of a statement has extremely high qualifications, the statement will be more revealing of the author than the information intended by him. This applies to matters of judgment. According to Wittgenstein’s ruler: Unless you have confidence in the ruler’s reliability, if you use a ruler to measure a table you may also be using the table to measure the ruler.

Nassim Nicholas Taleb, Fooled by Randomness

So rather than measuring the length of objects, Wittgenstein’s Ruler challenges our assumptions about people and their judgements. Check out my latest essay on Wittgenstein’s Ruler to learn more about this fascinating concept.

From: 3 Ideas in 2 Minutes on the Power of Reliability

If you enjoy these kinds of interesting ideas and concepts, like and share this post. Become a subscriber to get new ideas weekly and support my work. I’ll donate 5% of my revenue to Save the Elephants. 🐘

On to the next 100 newsletters!