In his case against Sigmund Freud, Karl Popper argued that it is easy to obtain confirmations for nearly every theory, if confirmations are what you're looking for. This tendency has since been popularized as confirmation bias.
While confirmation bias seeks to reinforce pre-existing beliefs, falsifiability seeks evidence that disproves one's beliefs.
Karl Popper, an ambassador of critical thinking for the better part of the 20th century, didn’t just upset philosophers and scientists with his skepticism – he challenged our entire approach to reasoning. Standing on the shoulders of people like Kant, Hume, Wittgenstein, and Einstein, Popper shattered the glass walls of dogma with his concept of falsifiability.
Theories can’t be proven true.
The theories we attempt to prove are influenced by an inconceivably large number of factors, too complicated for our minds to comprehend.
Theories can only be tested in an attempt to falsify them. And if a theory fails, great – we’re a step closer to truth.
Closer.
But nothing leads to complete certainty.
In pursuit of a Popper perspective
It’s no secret that ‘BEYOND CAPITAL’ is highly influenced by Popper and other figures such as Nassim Taleb and Charlie Munger, who, in turn, have been influenced by Popper’s thinking.
A running theme in this newsletter has been, and will continue to be, building a structure of thinking that dismantles the need for certainty while embracing the inevitability of uncertainty. Cultivating the ability to be comfortable, and even to thrive, amid uncertain circumstances is, I believe, a worthy pursuit.
Confirmation bias
Imagine your brain as a thoughtful librarian, organizing a variety of ideas. Confirmation bias is the librarian's sneaky habit of constantly stacking new books that match those already on the shelves while stashing opposing books in dusty corners. It feels good to have a structure of consensus on the shelves; no conflicts, no dissent, no dissonance. It’s a type of comfort, and we’re unconsciously being pulled toward it.
But in the pursuit of what's right or true, comfort is a poor incentive to go by.
And just as confirmation bias shields us from discomfort, so too does the feeling of control, especially when it comes to the future.
Before we circle back to Popper, let me introduce a fellow skeptic and his praise of scrutiny and critical thinking: the mathematician Henri Poincaré.
Poincaré prohibits predictions
Picture yourself with a compass in hand. It's a reliable tool, but its readings are confined to whole degrees, with no decimals. Your goal is to reach a location 1 km away. You know the direction from your starting point, but you have to draw the line on the map before you walk, and you can't keep checking along the way. My guess is you'll end up pretty close to the exact location. But what if the same rules apply and the location is now 1,000 km away? My guess is you'll end up quite far from it.
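The compass scenario can be put in numbers. A sketch, assuming the only error is the rounding of the bearing to whole degrees (at most half a degree off) and a perfectly straight walk:

```python
import math

def lateral_error(distance_km, bearing_error_deg=0.5):
    """Worst-case sideways drift from a bearing rounded to whole degrees."""
    return distance_km * math.sin(math.radians(bearing_error_deg))

print(f"After 1 km:    {lateral_error(1) * 1000:.1f} m off target")
print(f"After 1000 km: {lateral_error(1000):.1f} km off target")
```

The same half-degree imprecision that costs you about nine meters over one kilometer costs you almost nine kilometers over a thousand: the error isn't forgiven by distance, it's magnified by it.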
This is a simplification of Poincaré's argument about unpredictability: there are hard limits on how accurately we can predict the behavior of complex systems, which are a common feature of many aspects of life. Because such systems are sensitive to tiny changes in their starting conditions, even precise mathematical equations produce flawed predictions.
As mathematicians know, starting with a small error in the equation can have an enormous impact on the outcome.
This insight, later central to chaos theory, has shaped the debate about whether we can predict at all.
So why, then, do economists continue to build an overwhelming number of models to predict the future economy when we know that capturing all the data on human behavior and affairs is impossible? And if we don't capture all the data to start with, isn't the prediction, by Poincaré's argument, doomed to fail?
Just as the compounding of a small amount of money leads to pleasant surprises, the compounding of small errors leads to unpleasant surprises.
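The symmetry between the two surprises is literal: it's the same arithmetic. A toy illustration, with hypothetical numbers:

```python
def compound(rate, steps):
    """Growth (or error) factor after repeated compounding at a fixed rate."""
    return (1 + rate) ** steps

# The pleasant surprise: money compounding at 1% per period.
print(f"$100 at 1% per period, 100 periods: ${100 * compound(0.01, 100):.2f}")

# The unpleasant surprise: a 1% modeling error, compounded 100 times.
print(f"A 1% error, compounded 100 times: off by a factor of {compound(0.01, 100):.2f}")
```

One percent feels negligible in a single step; after a hundred steps it has nearly tripled, whether it's your savings or your mistake.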
The problem with induction
We’re wired to ask, “How can I know this is true?” when we should be asking, “How can I know this is false?”
We simply can’t draw general conclusions from particular instances.
This is the problem with induction, and it’s precisely what Popper was opposing.
No matter how many red cars you observe, you can never state as a fact that all cars are red. We simply don’t have the power, the reach, or the time to check the entire universe for other cars. But as soon as we observe one green car, we CAN state as a fact that not all cars are red.
It’s a complete shift in our thinking.
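The asymmetry in the car example can be expressed as a tiny check. A sketch (the function name and data are illustrative, not from any particular library):

```python
def falsified(claim, observations):
    """Return the first counterexample to a universal claim, or None if none seen yet."""
    for obs in observations:
        if not claim(obs):
            return obs
    return None  # consistent so far -- but never *proven* for unseen cases

def all_red(color):
    return color == "red"

cars_seen = ["red", "red", "red", "green", "red"]
print(falsified(all_red, cars_seen))  # prints "green"
```

Note the asymmetry in the return values: a counterexample is definitive, while `None` only means "not falsified yet". No finite list of observations could ever upgrade that into proof.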
Inversion revisited
This is a great time to revisit the concept of inversion, which I wrote about in the essay “Execution Matters”.
"Always invert" is a mental model and problem-solving technique popularized by Charlie Munger. When faced with a problem, people often focus on finding the right steps to achieve their desires. They look for solutions that move them forward.
Munger's advice is to turn the problem on its head. Instead of asking how to achieve success, ask what could lead to failure. By identifying and then avoiding the causes of failure, you indirectly increase the likelihood of success.
To end
Descartes held that we must arrive at complete certainty. No, says Popper. Holding a position of certainty is not only unachievable but also risky: it leads to a closed mind, unable to revise old beliefs in the face of new information.
“Every false belief we discover is actually good," says Popper, "because that gets us closer to believing only true things.”
Others call it an appreciation for failing.