The most important skill in life is good decision making. And yet, we’re never taught how to make good decisions. Instead, we stumble from one mistake to the next, hopefully making our way toward “wisdom”.
Even worse, other people exploit our biases for profit. Marketers use behavioral psychology to design ads and sites that keep us clicking and buying: think flash sales, impulse buys, and celebrity endorsements. From this we conclude that human biases make us weak. Optimal behavior is to suppress emotion and instead make calculated, rational decisions, like a computer.
But what if we thought about cognitive biases differently? What if we treated them as the boon of billions of years of evolution? They allow us to detect patterns from few instances, to imagine things we’ve never seen, and to function robustly in a changing environment. Can we design decision-making processes that capture the benefits of human bias without the drawbacks? Can we harness the non-rational to become even more rational?
I believe we can. We can become “post-rational”, leveraging rather than suppressing our biases. Below, I provide strategies that transform our cognitive weaknesses into strengths. I focus on the investing domain given my experience as a hedge fund manager, but I strongly believe the thinking transfers to other fields that require balancing rationality and intuition (e.g. the intelligence industry, medical diagnosis, hiring, entrepreneurship).
We are wired for storytelling. We create stories to make sense of anything and everything: the past, other people, ourselves. We even create stories to explain the inexplicable, like daily stock market movements. Stories provide comfort by making things appear less random.
But stories can get us into investing trouble. Serious allocators, professionals responsible for investing the aggregated billions of retirement accounts, have told me that they won’t invest in overweight CEOs because it indicates a lack of discipline. Others have told me the opposite: they invest in overweight CEOs because it indicates extreme focus. The truth is probably that weight has little correlation to performance.
What’s happening here is quite common: we generalize from very few data points and create narratives to rationalize what we see. We’re so good at storytelling that we can come up with “explanations” for anything we encounter.
How can we make better use of our storytelling genius? Perhaps, rather than create stories to “explain” what we see, we can leverage stories to identify new risks.
For example, suppose that we’re evaluating whether to invest in a company. To identify potential issues, it is common to ask “How could this investment go wrong?” This question relies on our theorizing or forecasting skill to elicit risks.
What if, instead, we imagine that we’re living in a world in which the investment has already gone awry, and we write a story explaining what happened? Research shows that this technique of prospectively explaining what went wrong (a “pre-mortem”) yields richer explanations than asking how something might go wrong. It engages our storytelling skill, asking us to construct narratives, which comes naturally to us.
Try it out with a decision you’re currently considering: tell the story of how it all went wrong. You may uncover important new risks and develop measures to counter those risks.
We are overconfident. 90% of people consider themselves above average drivers. And 75% of fund managers consider their performance above average. We believe we’re more skilled than we actually are, and that we know more than we actually do.
This is a problem in investing. Overconfidence leads us to quantify the uncertainty of our forecasts inaccurately. We end up taking bets that are too large relative to the uncertainty involved.
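To make the cost of overconfidence concrete, here is a toy Kelly-style bet-sizing sketch (my illustration, not the author’s method; the return and volatility figures are invented). Under the rule f = μ/σ², understating volatility by half quadruples the recommended position size:

```python
# Toy Kelly-style sizing: fraction of capital f = expected return / variance.
# All numbers below are hypothetical, chosen only to show sensitivity to sigma.

def kelly_fraction(mu: float, sigma: float) -> float:
    """Position size as a fraction of capital for a continuous-return bet."""
    return mu / sigma ** 2

mu = 0.05                # assumed annual excess return
true_sigma = 0.20        # the investment's actual volatility
perceived_sigma = 0.10   # an overconfident analyst's estimate

f_calibrated = kelly_fraction(mu, true_sigma)
f_overconfident = kelly_fraction(mu, perceived_sigma)

# Halving the perceived volatility quadruples the bet relative to uncertainty.
print(f_overconfident / f_calibrated)
```

The point is not the specific formula but the sensitivity: small errors in how we quantify uncertainty translate into large errors in position size.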
Even worse, investment analysts make forecasts that take quarters or years to resolve. So we only learn much later whether we were right or wrong. Compare that to poker players or meteorologists who get feedback on their forecasts right away.
Perhaps we can create feedback environments to improve our forecasting. Research has shown that calibration training is effective at curbing overconfidence. One method involves quantifying our uncertainty around a large number of trivia questions, e.g. how certain are you that Newton wrote the Principia after the year 1650? After a few dozen questions, you’ll begin to get a much better sense of the real magnitude of your uncertainty. To mitigate overconfidence, we constantly need to show ourselves how little we know.
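One way to build such a feedback loop is to keep a log of confidence-tagged answers and score it periodically. A minimal sketch (the scoring scheme and sample log are my own, not from the article): each entry pairs a stated confidence with whether the answer turned out to be correct, and the report shows the actual hit rate at each confidence level.

```python
# Score a calibration log: each entry is (stated_confidence, was_correct).
def calibration_report(log):
    buckets = {}
    for confidence, correct in log:
        buckets.setdefault(confidence, []).append(correct)
    # Actual hit rate at each stated confidence level.
    return {c: sum(v) / len(v) for c, v in sorted(buckets.items())}

# Hypothetical log of trivia answers tagged with stated confidence.
log = [(0.9, True), (0.9, False), (0.9, True), (0.9, False),
       (0.7, True), (0.7, True), (0.7, False)]

report = calibration_report(log)
# A hit rate of 0.5 on answers tagged "90% sure" signals overconfidence.
print(report)
```

A well-calibrated forecaster’s hit rates match their stated confidence; a persistent gap is exactly the feedback signal that quarterly investment forecasts fail to provide.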
Anchoring To The Opposite
Suppose I ask you to bid on a bottle of wine. But first, write down the last 2 digits of your social security number. Would those 2 digits influence your bid? Experiments show that they do; the table below shows students’ bids on items (rows) grouped by the last 2 digits of their social security numbers (columns).
The students professed that the digits had no impact on their bids. But the table reveals a large impact: students with the largest digits bid about 3x more for items than students with the smallest digits! The takeaway is that our decision making is subconsciously influenced by what we are exposed to beforehand.
This anchoring effect can harm investing. For example, we may be overly influenced by the price we paid for a stock and hold on to it to avoid realizing losses. Or we may be anchored to recent levels of volatility when determining leverage levels, piling more risk into investments when volatility has been low.
If we are unduly influenced by the stimuli preceding a decision, then it makes a lot of sense to control and pre-select the best stimuli. And what are the best stimuli? Well, what’s hard in decision making isn’t so much selecting wisely from a set of choices, but rather imagining all the choices we actually have. The simple protocol of considering the opposite helps widen the set of alternatives under consideration and avoid narrow framing.
I believe that the best stimuli ahead of a decision are ones that make it easy to consider the opposite. So, for example, we might designate an ‘opposite thinker’ for important meetings. Or we might install physical cues (like the one below) in meeting rooms that remind us to think the opposite.
Our cognitive biases get a lot of negative press, but they are survival tools. And although we now spend our days at computers in offices, that doesn’t mean our survival tools are bad or irrelevant. Instead, we can creatively redesign our environment to harness these tools. Doing so allows us to become post-rational, reclaiming these biases for our own gain.