The enemy is us: Mitigating confirmation bias
Back in March, as physical distancing practices were being implemented globally, I was bemused by a pair of provocatively titled articles, published within a day of one another, that took contrasting angles. The Globe and Mail ran: “Not okay, boomer: Tensions mount between generations as some seniors resist social distancing.” The next day, The Wall Street Journal published: “A generational war is brewing over coronavirus: Scientists say lack of alarm among young people could hinder the fight against the virus and endanger elders.”
Despite divergent conclusions, these articles weren’t “fake news”: each of them cited sources and offered evidence—albeit mostly anecdotal—that supported their respective cases. But as I read them, I couldn’t help but think of a theory popularized by sports and pop culture writer/podcaster Bill Simmons about the 1997 summer blockbuster movie, Face/Off: that they came up with the title of the film first and then wrote the script.
The COVID-19 outbreak has ushered in tremendous uncertainty, fear, and suffering, along with a heavy emotional burden. Furthermore, we face a deluge of information through various forms of traditional and social media. The combination of these factors has produced an environment full of dissonance, ripe for deeply ingrained cognitive biases to skew our judgment.
This piece will explore one of these cognitive biases, specifically, the one I immediately thought of when I saw those two articles: confirmation bias. We’ll unpack the evolutionary roots of this shortcoming and offer some ideas we think are helpful for mitigating its risks. By no means have we solved this problem at Mawer. Investing is, at the best of times, an inherently complex endeavour aimed at making decisions under uncertainty, and therefore fertile ground for cognitive biases such as confirmation bias. But over the years, we have deliberately organized ourselves and adopted frameworks in an effort to dampen its effects.
We have met the enemy, and he is us
Why are humans prone to confirmation bias, and how does it manifest?
For this, we need to understand the history of our brains. To oversimplify, the human brain developed over hundreds of millions of years of evolution, and many of its core components long predate our species. The oldest inherited parts of our brain—often referred to as the reptilian or primitive brain—are responsible for many of the unconscious, instinctual activities that ensure our survival and for making unconscious, snap judgments about what we perceive in the physical world.
For example, Mark Bowden, an expert on body language, explains that when you initially meet another person, before either of you has uttered a word, your primitive brain instantly classifies them into one of four categories: (a) friend; (b) enemy; (c) potential sexual partner; or (d) indifference. And, it turns out, these perceptual beliefs are quite rigid given the asymmetry of payoffs. In the wild, type I errors (false positives) are less costly than type II errors (false negatives). Put differently, judging somebody who looks threatening enough to be an enemy, even if they’re actually harmless, won’t endanger your life; making the opposite mistake might. Over millions of years, the primitive brain has been conditioned to accept our perceptual beliefs without questioning them.
By contrast, the modern human brain is relatively new at ~200,000 years old. Abstract beliefs—those formed not through direct experience but that are instead related through language—are distinctly human and a function of the neocortex. Yet, despite the neocortex’s capacity for much of what makes the human experience so rich—imagination, consciousness, and complex learning—the process of forming abstract beliefs wasn’t reinvented. It simply borrowed from the primitive brain’s established practice of unquestioningly accepting its perceptual beliefs.
In her book, Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts, former professional poker player Annie Duke summarizes this critical flaw in our abstract belief formation:
This is how we think we form abstract beliefs:
- We hear something;
- We think about it and vet it, determining whether it is true or false; only after that
- We form our belief.
It turns out, though, that we actually form abstract beliefs this way:
- We hear something;
- We believe it to be true;
- Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
Only after taking her suggestion to Google “common misconceptions” did I learn that coffee doesn’t actually come from beans, bulls aren’t enraged by the colour red, and ostriches don’t bury their heads in the sand when they’re scared.
This idea that we are hardwired to believe what we hear and not naturally prone to skepticism is made worse by the fact that once a belief has formed, it takes considerable effort to displace it. The human ego has a distinct preference to be right, and when we encounter information that contradicts an abstract belief, it is much easier to disregard that new information than to admit we may be wrong. Rather, we tend to engage in motivated reasoning or to seek out information that confirms our existing beliefs.
Fast-forward to the current environment. Motivated reasoning and the pursuit of evidence that affirms our existing beliefs are at full throttle. I see it everywhere. I see it in debates over how best to reopen the economy, and in predictions for which letter of the alphabet will best represent the shape of the economic recovery. Quite frankly, the internet makes it worse; algorithms in our social media feeds only serve to magnify our predispositions. And, as Annie Duke points out, higher IQ doesn’t help: it may simply mean that you have the capacity to unconsciously perform the mental gymnastics necessary to manipulate the data or narrative to suit your pre-existing beliefs.
(Heck, I’m guilty too: at no point in the above paragraph did I point out examples (they exist!) of genuine and balanced critical thinking.)
During the discovery process ahead of a criminal trial, prosecutors are not allowed to withhold any exculpatory evidence that could be favourable to the defendant and that might impair the prosecution’s case; this evidence must be disclosed to the defendant. In the U.S., these are called Brady materials, and any Brady violations—intentional or accidental—can result in an overturned verdict.
Of course, prosecutors should want to turn over this evidence. The prosecutor’s job is to seek justice and truth, not to win convictions. But prosecutors are human beings, prone to cognitive shortcomings. There may be social or political pressures to win. Prosecutors are therefore susceptible to prioritizing evidence that supports a guilty verdict, sometimes unethically but also simply through unconscious bias.
The mental model of avoiding Brady violations has obvious benefits when it comes to researching companies or putting together investment portfolios. The following sections outline ways in which we’ve purpose-built and organized our team to guard against confirmation bias as best we can.
A variety of perspectives
It seems self-evident that a team-based approach should help mitigate an individual’s propensity for motivated reasoning. While true, this depends on the team: a team composed of individuals who all think the same way can have the opposite effect and exacerbate the problem.
This should be baked into the hiring process. Recognizing that humans have a bias toward affinity, an explicit component of the evaluation criteria should be whether a candidate’s background and skillset are different from, and additive to, the existing team’s. Investment teams composed of individuals who all grew up in the same country, went to the same school, majored in the same discipline, or read the same morning paper run the risk of having too narrow a perspective. An engineer thinks differently than a history major; both approaches are valuable.
(Interviews, by the way, are another area where confirmatory thinking can run rampant. There’s a reason it’s so important not to make typos on your resumé, as it may cause your interviewer to use the hour to look for further evidence that you’re careless and don’t pay attention to detail…if you get an interview at all!)
To further harness these perspectives, we’ve deliberately chosen to organize ourselves as generalists within asset classes, and to refrain from intense specialization. While there are certainly merits to a more focused approach, we feel these are outweighed by the risk that the team could be tempted to defer to the “expert,” reducing the checks and balances on that individual’s opinions. The reason I’ve always thought that an ostrich hides its head in the sand when scared is likely because somebody I trusted as more knowledgeable than me once said so, and I never questioned it—thanks a lot, Mom and Dad!
Finally, we encourage independent thinking. One of our most important portfolio construction tools is our Matrix process, a framework that involves scoring the relative attractiveness of new and existing holdings across the various elements of our investment philosophy (strength of business model, strength of management team, and ability to purchase at a discount to intrinsic value). The Matrix ultimately helps us determine what weight to ascribe to a given security. Perhaps more importantly, the Matrix is a wonderful tool in focusing debate. To ensure this debate reflects the diversity of opinions across the team as much as possible, each member of an asset class team is tasked with preparing their own Matrix scores independently. This reduces the risk that differences go unnoticed or that Matrix scores and opinions across the team unconsciously converge.
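As an illustration only, the value of scoring independently can be sketched in a few lines of Python. The pillar names follow the philosophy described above, but the scores, the 0–10 scale, and the dispersion threshold are entirely hypothetical and are not how our Matrix actually works; the point is simply that collecting scores before discussion makes disagreement visible rather than letting it converge away.

```python
import statistics

# Hypothetical independent Matrix-style scores (0-10) from four team members,
# collected before any group discussion, across three illustrative pillars.
scores = {
    "business_model": [7, 8, 7, 4],
    "management":     [6, 6, 7, 6],
    "valuation":      [5, 5, 4, 5],
}

DEBATE_THRESHOLD = 1.0  # invented cutoff: a wide spread signals genuine disagreement

# Pillars where independent opinions diverge get surfaced for explicit debate,
# instead of being quietly averaged into a consensus number.
flagged = [
    pillar for pillar, marks in scores.items()
    if statistics.stdev(marks) > DEBATE_THRESHOLD
]
```

With the invented numbers above, only `business_model` is flagged: three scorers see a strong business while one dissents sharply, which is exactly the difference of opinion worth debating out loud.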
Good ideas borne from a diversity of perspectives are wasted if they go unvoiced or unheard.
A culture aligned with the mission
The Renaissance ushered in incredible advances in literature, art, architecture, science, and technology. One of the major contributing drivers of discovery was the adoption of an approach to reasoning, based on the works of Francis Bacon, that eventually developed into what we know today as the scientific method. In contrast to the mindset that prevailed during the Middle Ages, the principles of the scientific method promote curiosity, skepticism of commonly held beliefs, and demand that beliefs (hypotheses) be tested rigorously. The goal is to get as close as possible to the truth. As the scientific method mindset expanded, so did the pace of advancements.
The benefits of an organizational culture that explicitly values curiosity and candour in pursuit of the best ideas are certainly applicable to investing, even though investing is very much in the realm of pseudoscience.
In such a culture, opposing viewpoints are less likely to be interpreted as an attack; rather, they are welcomed because they help us collectively get to the best possible answer given the information available. Language can help. To borrow from the scientific method, observations or statements of fact are more effective than statements of opinion. “Your presentation sucked and everybody was bored” is likely to make me defensive, whereas “I observed that 4 of the 7 people in the room were on Instagram during your presentation” is more likely to prompt self-reflection about what I could have done better. Some of our analysts go as far as colour-coding their research reports to explicitly differentiate between opinions and facts.
Structures that promote alignment support such a culture. At Mawer, our employee-ownership structure goes a long way to ensuring alignment between our interests, the work we do, and the interests of our clients. As an owner, I have a vested interest in ensuring the best possible decisions are made and am therefore motivated to be open to new ideas that challenge my own, since others around the table are similarly inclined. A broad ownership structure isn’t the only way to accomplish this, but it’s effective.
And appreciation plays an underrated role in cementing our values. Last year, we reduced our position in one of the world’s largest paints and coatings companies. The decision was not without internal debate. One of our longest-tenured investors had been more supportive of maintaining our holding due to the company’s strong industry position and associated scale advantages. By contrast, our most recent hire was worried about declining growth and cost-cutting efforts, which might not bode well for long-term equity holders.
Ultimately, it was the lead manager’s decision to trim and eventually exit the position. But the nature of the internal debate was highlighted on the trade memo shared firm-wide, effectively celebrating a healthy disagreement and reinforcing a lack of hierarchy. We have found that such seemingly inconsequential signals of appreciation have a compounding effect in maintaining a culture of candidness within our team.
Process, process, process
A corollary of confirmation bias is that human beings aren’t wired to think probabilistically. Nobel laureate Daniel Kahneman’s Thinking, Fast and Slow is a must-read in demonstrating the systematic ways humans fail to calibrate the odds appropriately.
The scientific method depends on the ability to test hypotheses through experiments. One of the reasons that investing is a pseudoscience is that “experiments” in investing cannot be done in a reproducible manner. As an investor, it would be useful to know the odds associated with various revenue growth projections for a particular company, but it’s impossible to run the experiment more than once in real time, and therefore impossible to know whether the odds I came up with at any given time were the right ones. There are no parallel universes.
Still, the pursuit of probabilistic thinking is worthwhile, even if the inputs may be prone to bias, and we have implemented several tools to help. All of our discounted cash flow (DCF) models are stochastic and use Monte Carlo simulation to perform sensitivity analysis on the results. The output of these models isn’t a price target, nor a set of ternary bull/base/bear case scenarios, but rather a probability distribution of potential outcomes. These results are useful in that they help us understand the odds that we may be able to purchase a given investment at a discount to its intrinsic value. But the process of building the DCF—of constantly being exposed to the impact that individual inputs have on shaping distributions—embeds a discipline that permeates everything else we do, including more qualitative assessments of competitive advantages or management teams.
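To make the idea concrete, here is a minimal sketch of a stochastic DCF in Python. This is not our actual model: the drivers, distributions, and every figure below are invented for illustration, and a real model would simulate many more variables with carefully researched inputs. The point is the shape of the output, a distribution of intrinsic values rather than a single price target.

```python
import random

random.seed(42)
N_SIMS = 20_000
BASE_REVENUE = 1_000.0   # hypothetical current revenue (millions)
YEARS = 10
TERMINAL_GROWTH = 0.02   # assumed perpetual growth after year 10

def simulate_intrinsic_value():
    # Each driver is drawn from a distribution rather than fixed as a point estimate.
    growth = random.gauss(0.05, 0.03)      # annual revenue growth (invented)
    margin = random.gauss(0.15, 0.04)      # free-cash-flow margin (invented)
    discount = random.uniform(0.07, 0.11)  # discount rate (invented)

    value = 0.0
    for t in range(1, YEARS + 1):
        cash_flow = BASE_REVENUE * (1 + growth) ** t * margin
        value += cash_flow / (1 + discount) ** t
    # Terminal value: growing perpetuity on the final year's cash flow.
    final_cf = BASE_REVENUE * (1 + growth) ** YEARS * margin
    value += (final_cf * (1 + TERMINAL_GROWTH)
              / (discount - TERMINAL_GROWTH)
              / (1 + discount) ** YEARS)
    return value

values = sorted(simulate_intrinsic_value() for _ in range(N_SIMS))
p10, p50, p90 = (values[int(N_SIMS * q)] for q in (0.10, 0.50, 0.90))

MARKET_CAP = 1_200.0  # hypothetical market price (millions)
# The decision-relevant output: the odds the stock trades below intrinsic value.
prob_discount = sum(v > MARKET_CAP for v in values) / N_SIMS
```

Running thousands of scenarios like this yields percentile bands (p10/p50/p90) instead of one number, and a direct estimate of the probability that the current price represents a discount to intrinsic value.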
Bayesian frameworks are also useful for weighing new information appropriately, rather than intuitively. “When was the last time XYZ happened?” is a great question for handicapping signal: it helps calibrate the base rates that feed into Bayes’ theorem.
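A minimal sketch of the update itself, with invented numbers, might look like this. The hypothesis and all three probabilities below are hypothetical, chosen only to show how a prior and two likelihoods combine.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | E) via Bayes' theorem.

    The denominator is the overall probability of observing the evidence,
    which is exactly what base-rate questions help calibrate.
    """
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Hypothetical example: H = "the turnaround is real."
posterior = bayes_update(
    prior=0.30,                   # invented: say turnarounds succeed 30% of the time
    p_evidence_given_h=0.60,      # margin expansion is likely if the turnaround is real
    p_evidence_given_not_h=0.10,  # and historically rare otherwise
)
# posterior comes out to 0.72: moderately weak prior, sharply updated by
# evidence that is six times likelier under the hypothesis than without it.
```

The discipline is in the inputs: forcing ourselves to state a base rate and two likelihoods makes it harder to let a compelling narrative do the updating for us.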
We recently invested in a UK-based distributor of various industrial and electronic components. Management claims to be in the midst of a turnaround under a new CEO. The narrative is compelling—management certainly talks the talk—but there is also decent statistical evidence supporting the claim. The company has expanded margins in three of the four years since the new CEO joined, yet over the longer sweep of the company’s history, margins expanded year-over-year only about 20% of the time. The probability that the recent string of results is purely chance is therefore low, shifting the odds toward the new CEO’s efforts gaining real traction.
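Under a simple (and admittedly strong) assumption that each year is an independent draw at the historical base rate, the chance arithmetic can be sketched as follows. The 20% base rate and the three-of-four record mirror the example above, but the calculation itself is our illustration, not something the company publishes.

```python
from math import comb

p = 0.20   # base rate: margins expanded in roughly 20% of historical years
n, k = 4, 3

# Probability of at least k margin-expansion years out of n, if each year
# were an independent coin flip at the historical base rate (binomial tail).
p_at_least_k = sum(
    comb(n, j) * p**j * (1 - p)**(n - j)
    for j in range(k, n + 1)
)
# p_at_least_k works out to 0.0272: under 3%, so pure chance is an
# unlikely explanation for the recent string of results.
```

The independence assumption is generous to the analysis (margin years are surely correlated in practice), but even as a rough cut it helps replace “it feels like a real turnaround” with an explicit prior for debate.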
Finally, adhering to process forces us to slow down and engage our System 2—conscious/purposeful—thinking. Before introducing any stock to a portfolio, a company needs to go through our “bathroom list” process. Much in the same way that public restrooms often have a chart on the back of the door cataloguing when the floors were last mopped, the sinks cleaned, and paper towel replaced, we have a series of steps in our research process that are required to be completed before we’ll contemplate an investment (e.g., interview management, perform forensic accounting analysis, build the DCF, write the research report). These bathroom list exercises likewise must be repeated on a regular basis to continue holding the position in the portfolio. The bathroom list ensures that we don’t take mental shortcuts, we do our homework and—in conjunction with the Matrix review process described above—we incorporate relevant information and opinions from the rest of the team before committing our clients’ capital.
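The gating logic of such a checklist is simple enough to sketch. The step names below paraphrase the examples given above; the actual list, naming, and tracking are of course more involved than this toy version.

```python
# Hypothetical "bathroom list": research steps that must be complete
# before a stock can even be contemplated for a portfolio.
REQUIRED_STEPS = [
    "interview_management",
    "forensic_accounting_review",
    "build_dcf",
    "write_research_report",
]

def ready_to_invest(completed_steps):
    """A stock is eligible only once every required step is complete.

    There is deliberately no shortcut: a compelling story about the
    company cannot substitute for an unfinished step.
    """
    return all(step in completed_steps for step in REQUIRED_STEPS)

ready_to_invest({"interview_management", "build_dcf"})  # False: homework unfinished
ready_to_invest(set(REQUIRED_STEPS))                    # True: all steps done
```

The value of encoding the gate this way, whether in software or on a literal chart on the door, is that skipping a step becomes a visible, deliberate act rather than an unconscious shortcut.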
Mitigating, not eliminating
To reaffirm: confirmation bias is not a problem specific to investing. It’s a universal flaw in human decision-making.
We’ve suffered our share of cognitive errors due to confirmatory thinking that have resulted in capital impairment. For example, our initial thesis on CRH Medical rested on a heuristic: that a management team employing a strategy of buying asset-light, stable cash flows with high switching costs at reasonable multiples creates wealth for shareholders. We had seen this strategy applied successfully in other investments, and CRH’s acquisitions of U.S. anaesthesia practices tied to gastrointestinal clinics seemed to fit these criteria. Ultimately, we lost money on CRH: we focused too much on why CRH’s strategy lined up with this model instead of acknowledging that the cash flows in this particular business were less stable.
We know we can be our own worst enemies. And because of that, we know we need help. This has impacted how we hire, how we’ve chosen to structure our team, how we interact with one another, as well as the frameworks and tools we’ve built into our investment process.
Real Brady violations can result in innocent people on death row. While certainly less consequential in our line of work, the mental model is relevant when it comes to judgments that impact our clients’ capital.