
The Law of Unintended Consequences: What Could Possibly Go Wrong?

A common definition of insanity is to do the same thing over and over again and expect different results. But who says you cannot turn mad after the first attempt? When the Law of Unintended Consequences strikes, insanity is often not far away. Our decisions and actions have the pesky habit of turning out differently than we thought. There’s no shortage of prominent examples of interventions gone sideways. At varying levels of madness, they teach us valuable lessons about decision-making and life in general.

What Is the Law of Unintended Consequences?

The Law of Unintended Consequences states that any intervention is bound to have multiple effects, which almost always leads to results that weren’t part of a decision-maker’s plan. The law serves as a warning against overconfidence and the illusion of control.

The adage can trace its roots back to philosophical giants such as Adam Smith. But it was not until 1936 that American sociologist Robert K. Merton popularised the underlying concept in his essay The Unanticipated Consequences of Purposive Social Action. Analysing the mechanisms for the first time, he examined people’s decisions to enact social change and why the consequences so often turned out to be unexpected.

More recently, the concept has made its way into colloquial language as an idiomatic expression. At its heart, we might say, the Law of Unintended Consequences is about agency and the lack of it. While Merton spoke of consequences that were “unanticipated”, the term “unintended” has since become the convention. Today, it essentially means “unforeseen side effects”. Let’s dive into three prominent examples at increasing levels of failure.

1. The Diderot Effect

The Diderot Effect is a prime example of a goal well achieved, albeit with unexpected drawbacks. The effect is named after French philosopher Denis Diderot, who was gifted a fancy new red dressing gown. Nobody could say that the gift giver missed the mark. Denis was delighted with his new possession.

So delighted that all of a sudden, his other possessions looked cheap and dated in comparison. In Regrets on Parting with My Old Dressing Gown, the 18th-century philosopher documented the side effects of the gift. How it led him to replace his entire wardrobe and furniture with new expensive stuff so it would match his fashionable new robe. Sad to say, Denis ended up unhappy and in debt.

We don’t have to search long for similar contemporary examples. In 2016, a British government agency wanted to let the public choose the name of their newest research vessel. The campaign was successful. The public made its choice. But Boaty McBoatface was not the kind of name the agency had anticipated when it launched the poll. The drawbacks were unexpected but arguably minor. The vessel ended up with the more dignified name RRS Sir David Attenborough. But at least the ship’s leading research submarine now bears the name Boaty McBoatface.

2. The Vodka Effect

If gifts and happy online polls can have these kinds of unforeseen side effects, imagine what bans can do. Let’s consider the Vodka Effect. In 2015, I visited a friend in Minsk, Belarus. The city was plastered with billboard ads for a local vodka. Or so it seemed. I had tasted the iconic drink the night before. The black, blue and silver label of the crystal clear liquor was very recognisable. But the billboards didn’t promote vodka at all.

A while back, the government had outlawed ads for alcoholic drinks. Being prohibited from promoting their alcohol, the company started selling the water they used for vodka production, too. The labels for the water bottles looked suspiciously like the ones used for their vodka. What I was looking at was an advertising campaign for their crystal clear “spring water”.

Technically, the goal of the ban was achieved. But advertisers found a way to break the rules by following them. Ironically, the intention of the policy decision was to keep people from drinking too much alcohol. The result was vodka being marketed under the guise of water. The blunt weapon of the ban proved ineffective. At least until the government closed the loophole. Not all unintended consequences can be fixed, though. The Peltzman Effect shows how mandated safety measures can have lasting detrimental effects. Then there is the even more permanent Streisand Effect.

3. The Streisand Effect

The Streisand Effect is one of the most famous examples of an intervention leading to wholly perverse results. In 2003, a U.S. photographer took an aerial photo of a Los Angeles beachfront mansion as part of a project documenting the California coastline. However, the mansion in question belonged to none other than famed actress and singer Barbra Streisand, who went to court over the alleged invasion of her privacy.

Aerial photo of Streisand’s beachfront mansion. Copyright (C) 2002 Kenneth & Gabrielle Adelman, California Coastal Records Project, www.californiacoastline.org

Streisand ended up losing the lawsuit. The resulting publicity put a spotlight on her house, and myriad people downloaded the photo. Before the lawsuit, nobody paid much attention to the California Coastal Records Project’s photo collection. Today, the infamous photo plasters the internet. Streisand not only failed to achieve her goal. Her actions made her situation infinitely worse.

The strange outcomes of the Streisand Effect are probably only surpassed by the Cobra Effect. The phenomenon is named after a failed attempt to get a cobra plague under control by offering a bounty for dead snakes. You can either guess what the result was or read my essay about it. In any case, when the Law of Unintended Consequences strikes in the form of the Streisand Effect, the opposite of the original goal is achieved. The devastating consequences cannot be fixed.

Overturning the Law of Unintended Consequences

Clearly, the Law of Unintended Consequences can be relentless. Reason enough to look into its underlying causes as well as ways to avert impending disaster.

Causes of Unintended Consequences

In his 1936 essay, Merton determined five primary sources of unforeseen side effects. Here are the three that stand out for me:


  • Ignorance: The most obvious explanation for the Law of Unintended Consequences is this: We don’t know enough. The world is a system too complex for us to comprehend, let alone predict. Nassim Taleb illustrates this beautifully with his Black Swan Theory. Remember Diderot? The potential consequences of accepting a gift simply didn’t occur to him.
  • Analytic errors: In many ways, decision-making is an exercise in predictive analytics. In our efforts to transcend our ignorance, we inevitably face limitations or make mistakes. Especially when it comes to predictions of human behaviour. Solutions based on what worked in the past may not work again. We’re quite an adaptable species, and our reactions are not always calculable. The policy-makers in Minsk? They may have done their analytic due diligence but failed to predict the eventual outcome.
  • Immediate interests: Humans often act based on short-term interests, neglecting the long-term consequences. Not because they don’t matter. But because a pressing immediate problem outweighs the potential challenges down the road. Remember Barbra Streisand? Her decision seemed shortsighted albeit principle-driven. The unforeseen consequences echo to this day.

Now, with the causes in mind, it’s finally time to consider how we can overturn the Law of Unintended Consequences.

How to Solve the Law of Unintended Consequences

Essentially, we have to address ignorance, analytic errors and short-term thinking. It’s evident that the one or two desired results we have in mind are not the only ones we’ll be confronted with. Here are six techniques to better anticipate the unexpected ones:

  1. Chesterton’s Fence is a basic rule when talking about change and reform. It urges us to understand the purpose behind an old rule or institution before abolishing or reforming it.
  2. The Fisher Protocol is a radical proposal for averting nuclear war by confronting its consequences up front. Devised by negotiation expert Roger Fisher, it would force the U.S. President to kill one of his aides with a butcher knife before he could get his hands on the launch codes.
  3. Wrong decisions and analytical errors often arise because we fail to take all relevant information into account. Methods such as the Tenth Man Rule harness dissent to discover sides of an issue that would otherwise be hidden.
  4. Premortem Analysis and Red Team Analysis are similar yet more elaborate analytical techniques. The former attempts to catch wrong decisions before they’re implemented. The latter enables us to walk a mile in our adversary’s shoes; to analyse how they think and act.
  5. In my essay about making recommendations, I showed how small actions could give us a preview of the consequences of our actions. Surface an idea, run a small trial, or otherwise dip your toe in the water to get an idea of potential side effects you didn’t even think of.
  6. Finally, the conceptualisation of decisions as reversible or irreversible can be of help. Think of it as an essential exercise in second-order thinking. If a decision can be easily reversed, we can afford to make it quickly. If it’s more akin to a one-way door, we should take the time for more careful analysis.

All methods are designed to reduce our unknown unknowns, improve analytic rigour and reconcile short-term and long-term thinking. And if you’re still not sure, remember that doing nothing is also an option.

Beyond the Law of Unintended Consequences

Elephant in the Room

The purpose of the above methods is straightforward. Turn unanticipated side effects into anticipated ones. But even when successful, the challenges don’t end there. Because beyond the Law of Unintended Consequences lie those side effects that were unintended but indeed anticipated. In other words, decision-makers were aware of the consequences of a judgment. Yet they chose to act anyway.

We can think of this conundrum as the unintended consequences of knowing the unintended consequences. Being aware of them doesn’t necessarily mean we can avert them. As management thinker Peter F. Drucker famously noted, even “the best strategic decision is only an approximation – and a risk”. There is no “perfect strategic decision”. All we can do is pick our poison, the best bad solution so to speak. Only this time we cannot claim ignorance as an excuse.

A Philosophical Approach

This would be a rather gloomy note to end the essay on. Luckily, there’s one last missing piece in the puzzle. Unintended consequences aren’t exclusively negative. Believe it or not, even the most disastrous decisions and actions in history can have positive side effects. Take the establishment of a demilitarized zone as a result of the Korean War, for example. It led to a revived ecosystem. The same can be said about sunken ships from World War II. Today, they’ve turned into a paradise for divers.

It all depends on how far we zoom out. How long our time horizon of cause and effect is. The anecdote of the Chinese Farmer beautifully illustrates this eternal chain reaction of unforeseen side effects. It’s a Zen story famously told by English philosopher Alan Watts:

Once upon a time there was a Chinese farmer whose horse ran away. All the neighbours came around that evening and said, “That’s too bad.” And the farmer said, “Maybe.” The next day the horse came back and brought seven wild horses with it. And all the neighbours came around and said, “That’s great, isn’t it?” And the farmer said, “Maybe.”

The next day his son, who was attempting to tame one of these horses, was thrown while riding it and broke his leg. And all the neighbours came around in the evening and said, “Well, that’s too bad, isn’t it?” And the farmer said, “Maybe.”

The next day the conscription officers came around looking for people for the army. They rejected his son because he had a broken leg. And all the neighbours came around that evening and said, “Well, isn’t that wonderful?” And the farmer said, “Maybe.”

In the grand scheme of things, virtually everything will eventually go wrong. Or sort itself out. It just depends on your time frame. It’s our choice whether we take this as an excuse for nihilism. Or find a sense of humility in it. Either nothing we do matters. Or everything does.

Closing Thoughts

As it turns out, there are a number of things that can possibly go wrong. Madness may be just one poor decision away. Investor and philosopher Naval Ravikant once defined wisdom as the ability to know the long-term consequences of our actions. That sounds like an unachievably high bar. Unless we acknowledge that we can never fully understand the extent of cause and effect. So perhaps true wisdom is to be aware of the limits of our control. And to have the patience to bear the inevitable madness until a new opportunity arises.