Last week I wrote about decision-making, and I promised to dig into a few mental models. So in this article I’m sharing some of my favorites.
If you’ve been around here a while, you know that one of the big influences in my life—after Beyoncé and Taylor Swift—is the person I share it with. We trade obsessions, and if he now sings Anti-Hero without a hitch, I owe him my growing interest in Warren Buffett and that whole lovely world of finance.
Fun fact: the first time we went to the Berkshire Hathaway annual meeting in Omaha, Nebraska (Buffett opens the AGM to all shareholders; it takes place in a giant arena and it’s pretty wild), I was wearing multi-colored floral pants and pink seashell earrings in rooms full of men in suits. I signed all my thank-you emails “the one in the flower pants”—efficient!
Berkshire Hathaway – 2023 Shareholder Meeting
The Circle of Competence
“I want to think about things where I have an advantage over other people. I don’t want to play a game where people have an advantage over me. I don’t play in a game where other people are wise and I am stupid. I look for a game where I am wise and they are stupid. And believe me it works better. God bless our stupid competitors. They make us rich.”
— Charlie Munger
The circle of competence is easy to grasp: figure out what you're genuinely good at, then do the work to map exactly where those competencies end. The hard part is defining that boundary honestly, because the moment you step outside it, you risk losing your competitive advantage (even if your ego tells you otherwise).
Most people worry their circle of competence isn’t broad enough, when the real issue is recognizing when you’re approaching its edge. Overconfidence can be fatal to your advantage.
“I’m no genius. I’m smart in spots—but I stay around those spots.”
— Tom Watson Sr., Founder of IBM
What's interesting about the circle of competence is that when you use it with humility and discipline, which means learning to say "I don't know" and turning down missions outside that circle, you can reach success more reliably than by chasing whatever's trendy or copying your neighbor's way of working.
Don’t confuse what you truly know with what you think you know. Don’t let overconfidence push you beyond your circle of competence.
(In green: what you know / in white: what you think you know.
Beautiful diagram by yours truly—adapted from a Farnam Street illustration.)
Confirmation Bias
Staying with humility and clear-sightedness—both when evaluating your own skills and when analyzing a situation—let’s talk about confirmation bias.
Confirmation bias is our tendency to see only the slice of facts that fits our theory and validates our arguments. It’s the bias at work when, over Christmas dinner, my father and I use the same pieces of evidence to serve two opposing arguments—and we each feel those proofs obviously support our point. Confirmation bias often shows up where the stakes are emotionally or ideologically high.
You have to be extremely vigilant with this bias, because losing objectivity in your analysis can lead to serious errors of judgment. In business, it can cause strategic mistakes with painful consequences.
(Left: Facts – Middle: what you see – Right: what confirms your beliefs)
“The confirmation bias is so fundamental to your development and your reality that you might not even realize it is happening. We look for evidence that supports our beliefs and opinions about the world but excludes those that run contrary to our own… In an attempt to simplify the world and make it conform to our expectations, we have been blessed with the gift of cognitive biases.”
— Sia Mohajer, The Little Book of Stupidity
Confirmation bias is our ego nudging us to massage the facts so we don't have to admit we're wrong. That's a blunt simplification, but very often it's exactly what happens.
Example: You support hypothesis A. Someone presents arguments proving hypothesis B is correct. You’ll likely find a way to interpret those arguments so they still serve hypothesis A—because it’s yours.
The difficulty with this bias is that if you don’t know it exists, you can’t counter it. And when you explain it to someone for the first time, their initial reaction is generally to think they’re not subject to it.
In a Stanford study, half the participants supported capital punishment and the other half opposed it. Both groups read detailed summaries of the same two (fictional) studies. Half of the participants were told one study supported the deterrent effect of capital punishment and the other refuted it; the other half were told the opposite. Either way, the majority stuck to their original position—highlighting the data that backed it and rejecting the rest.
Confirmation bias clouds our judgment. It gives us a skewed (surprise) view of information—even when we’re just dealing with numbers. Understanding this can transform our vision of the world—or rather, our perspective on it. Lewis Carroll said “we are what we believe we are,” but it also seems the world is what we believe it is.
You might say this isn’t a model but a bias—fair. This bias stems from our need to build mental models that simplify our complex experience of the daily information flood, so we can interact and decide more easily. The very model that’s supposed to help us can end up hurting us.
The information we take in is filtered through our existing beliefs, which makes belief-consistent evidence more memorable. As a result, we tend to notice more of what reinforces our worldview: confirming data gets serious weight, while disconfirming data gets treated with skepticism. Our assimilation of information as a whole is deeply biased.
That's why it's crucial to properly inform yourself about positions opposed to your own when you're forming an opinion. And for that, I have the perfect model.
Inversion
The concept is simple: aim to avoid being foolish rather than to be brilliant.
It may not feel natural, but the idea is to think backward to reach your goals. Inversion helps you better understand a problem, situation, or argument. By forcing yourself to do the work required to have an opinion, you’re compelled to consider different perspectives.
Example: you want to encourage creativity in your teams.
Instead of asking, “How do we encourage creativity?” ask, “What would I do if I wanted to eliminate creativity completely?” In that list of creativity-killers, you’ll probably spot a few that already exist in your team—so you can start by removing those.
Inversion is an excellent way to find solutions when you feel stuck in a dead end.
I hope this tour gave you a clearer sense of how studying mental models can sharpen your perception of the world—and improve your decisions.