Know yourself, conquer the markets: How knowing these 3 behavioral biases can lead to investment success

Quantitative Investments

When you have a packed room full of people all shouting at you, and there’s nothing you can do to help them, it’s intimidating. This was a normal weekend evening for my colleague when he was in university. He worked in a student bar in the 90s, where the Student Union had invested in a new payment card system. It was a fabulous idea at the time: one card, no cash needed, just load money onto your card and then pay for everything. The new card was rolled out with much fanfare, all students received one and were encouraged to use it… and they did. There was just one snag.

It didn’t work. Even when it did, the tills remained locked while it took minutes to process each transaction. But what does this have to do with behavioral biases?

Rather than taking a simple approach and asking for cash at the bar, the Union decided to invest in several more card machines. Unfortunately, this resulted in even greater system overload, longer transaction times, and even more shouting!

This is an example of a behavioral bias called the “sunk cost fallacy”, sometimes referred to as a Texas Hedge: when in trouble, double. Instead of recognizing a bad decision and cutting your losses, you invest more, hoping to improve things. When this happens, the cause is likely a natural behavioral bias: our rational brain says one thing, our emotional brain says the opposite, and in times of stress, overcoming emotional triggers is nigh on impossible.

For investors, behavioral biases are generally harmful to investment decision making. The only way that biases can be eliminated, in our view, is by handing over investment decisions to automated computer models, which are by design unemotional. In particular, investors should use systematic models to take asset allocation decisions.

In this article, we look at three behavioral biases that often prove costly to investors and the systematic models that can be used to avoid these biases to generate consistent long-term returns.

1.    Information overload

We all deal with information overload on a daily basis. When you open up your work email, the deluge of news and information feels overwhelming. On the other hand, we crave information for decision-making and our fear of missing out comes into play.

Unfortunately, we humans are not designed to handle large amounts of data, yet the world seems designed to overload us. People simply don’t write PowerPoint presentations with only three takeaway points, and it’s rare for writers to organize their arguments into a short list (like this one, thank you). We’re designed to take in just a few data points at a time, process them in our heads, and then make a decision. When we are faced with a large number of variables, it can lead to a feeling of overconfidence, a trait that is detrimental to investment performance.

Let’s have a closer look at how the overconfidence factor works out in scientific studies and then see how computers begin to outperform humans when the number of variables increases.

In a study of venture capitalists’ decision making, Zacharakis and Shepherd found that 96% of venture capitalists were overconfident and this overconfidence had negative results on decision-making accuracy. According to the study, one of the factors leading to overconfidence was the amount of information the venture capitalists used. “Even if more information is available, people usually don't analyze all of it (even though they believe they do). Thus, more information creates greater confidence, but it also leads to lower decision accuracy.”

Computers do not share our tendency towards overconfidence or our inability to process large data sets. For us humans, it can be argued that a route to better decision making is concentrating on a few key pieces of data, but the same does not hold for computers. While human predictive accuracy does not improve with the amount of data available, a computer’s predictive power does improve the more data it has. For example, a systematic computer model can help to understand the sensitivity of capital markets to key economic variables.

The key to overcoming this behavioral bias is to take a systematic approach. By systematic we mean a structured approach to analyzing data, similar to how pilots go through a pre-flight checklist. However, economic data sets have so much information that a human “tick-the-box model” simply doesn’t work. Computers, on the other hand, are by definition systematic and capable of handling substantial amounts of data in a structured manner.
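To make the checklist idea concrete, here is a minimal sketch of what a systematic “tick-the-box” rule set might look like. The indicator names, readings, and thresholds are all hypothetical illustrations, not the authors’ actual model; the point is only that the same fixed rules are applied to every input, every time, with no room for overconfidence.

```python
# Illustrative sketch of a systematic "checklist" over economic indicators.
# All names, readings, and thresholds below are hypothetical.
indicators = {
    "pmi": 52.1,               # purchasing managers' index reading
    "cpi_yoy": 1.8,            # year-over-year inflation, percent
    "yield_curve_slope": 0.4,  # 10y minus 2y yield, percentage points
}

# Each rule maps a reading to +1 (supportive of risk assets) or -1.
rules = {
    "pmi": lambda x: 1 if x > 50 else -1,
    "cpi_yoy": lambda x: 1 if x < 2.0 else -1,
    "yield_curve_slope": lambda x: 1 if x > 0 else -1,
}

# The aggregate score is computed the same way regardless of how many
# variables are added; scaling the checklist up costs the computer nothing.
score = sum(rules[name](value) for name, value in indicators.items())
print(score)  # positive score = risk-on tilt
```

A human working through three such rules is easy to imagine; the advantage of the computer is that the same discipline holds at three hundred variables.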

Information is key in all decision making. However, in complex decisions involving large data sets, the human mind’s accuracy does not improve with more information; it merely becomes more overconfident. This behavioral bias can be overcome either by focusing on a few key pieces of information or by using AI and computer programs, which apply a systematic approach to large data sets, unhindered by human biases.

2.    Temporary paralysis

Fight or flight: these reflexes have kept us humans alive throughout our evolution. We have an inbuilt animal instinct that commands us either to kill what’s in front of us or to run like Usain Bolt to put distance between ourselves and a threat. But is the fight-or-flight reflex missing something, a third inbuilt “F”? Should it not be fight, flight, or freeze? Many animals, including humans, do freeze up in stressful situations. Is this an inability to handle the stress, or is it perhaps a defense mechanism in our DNA?

This is an area of study probably best left to scientists and academics. Nevertheless, one thing we can all agree on regarding investing is that many investors do “freeze” in times of major selloffs. They find themselves incapable of carrying out actions that would be easy in a normal, non-stressful situation, such as “I think this stock would be a bargain if it dropped 20%, so I will place a buy order.” Instead, investors are often paralyzed by a sudden turn of events and unable to act. This temporary paralysis, while it may have helped us humans remain unseen by predators thousands of years ago, is a hindrance to good decision making when it comes to investing.

History has shown us that temporary paralysis under significant market stress is a bias that is extremely difficult to overcome, even for experienced investors. However, it is specifically in times of large drawdowns when bargains are on offer. Being in a state of mind where you’re unable to act, even though the evidence tells you to, can be costly to a portfolio.

Active investing often involves the age-old investing axiom of buying when others are fearful and selling when they are euphoric. By employing an active and automated model, the behavioral bias of temporary paralysis can be overcome. A computer model does not feel fear; it operates on data alone. This lack of fear results in an active approach that can operate with high conviction: when the model gets a signal, it doesn’t hesitate; it acts immediately. Active investing is about beating the benchmark, and it is worth remembering that if you’re always neutral, you’ll never outperform the benchmark.

By programming active investment strategies into a computer model, investors can override the human temporary paralysis bias and open up their portfolio to profiting from market drawdowns.
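The “bargain if it dropped 20%” decision mentioned above can be pre-committed in code, so the rule fires even when a human would freeze. The sketch below is a toy illustration with a hypothetical threshold, not any particular firm’s strategy.

```python
# Toy sketch of a pre-committed drawdown rule (hypothetical threshold).
# The decision is made in advance, in calm conditions, so there is no
# freezing when the selloff actually arrives.
def drawdown(prices):
    """Worst peak-to-trough decline over the price series, as a fraction."""
    peak = prices[0]
    worst = 0.0
    for p in prices:
        peak = max(peak, p)
        worst = min(worst, (p - peak) / peak)
    return worst

def signal(prices, threshold=-0.20):
    # The model feels no fear: when the rule fires, it acts immediately.
    return "buy" if drawdown(prices) <= threshold else "hold"

print(signal([100, 95, 80, 78]))  # drawdown of -22% -> "buy"
```

The human and the model see the same data; the difference is that the model cannot talk itself out of its own rule mid-crash.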

3.    Anchoring

Remember what it was like to travel? To go on holiday abroad? You know, that thing we all did before the coronavirus lockdowns. So, let’s imagine a world that is free of travel restrictions and you are once again able to go on holiday without fear of contracting coronavirus.

Planning and going on holiday is all about experiencing positive emotions. One of the great joys of traveling is the prior planning: going on Google and looking for a place in the sun to spend your summer holiday, somewhere warm and close to the beach, where you can lie under the rays reading an engrossing book or just listen to the waves crashing on the shore. You come across some enticing pictures of Chania in Greece and you’re immediately sold on the idea, but being the diligent sort you want to do a bit more research, and of course you have to convince the family that it’s a good choice.

Well, you’re sure to find plenty of good reasons to make the trip. Your other half likes good food; no problem, Trip Advisor will recommend a plethora of wonderful restaurants to satisfy the palate. The kids demand a trip to a waterpark with big slides; again, no problem. You convince yourself that prices will be cheap this year and your stay will support the local economy: you’ll be getting a good deal and doing a good deed. There’s no end to the excellent reasons you can find to book your holiday. This is an example of anchoring at work: finding a piece of information and then making any subsequent data support the initial finding.

Anchoring works so well because, as much as we’d like to think we are rational thinkers, we humans are not: the stronger the emotion, the more it overrides rational thinking. Advertising companies make great use of anchoring, by triggering an emotion and then anchoring that emotion to the product they are selling.

Anchoring is also evident in our beliefs. For example, in politics, people’s opinions on current issues tend to be formed based on their existing beliefs. We see this in action today, driven by the internet and social media, which feed you information based on your previous interests. The problem with anchoring is that it creates information bubbles, where only a certain type of filtered information is received. This can cause relevant information to be missed.

Anchoring is extremely difficult for us humans to overcome. While media and advertising use anchoring to deliver information that you are likely to act upon, in investing, algorithms can be used to eradicate anchoring, both by recognizing biased information and by drawing on a broad data set. Because they operate unemotionally, rational decision making becomes possible.
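One simple way to picture the difference: a human anchored on the first number they saw effectively overweights it, while an algorithm can pool every source in a broad data set with equal weight. The numbers and the 80% anchor weight below are purely illustrative assumptions, not measured values.

```python
# Stylized contrast between an anchored estimate and a pooled one.
# All numbers and weights are hypothetical illustrations.
estimates = [3.0, 1.2, 1.0, 0.9, 1.1]  # first value is the "anchor"

def anchored(values, anchor_weight=0.8):
    # Stylized human: most of the weight goes to the first number seen,
    # and later data barely moves the conclusion.
    rest = sum(values[1:]) / len(values[1:])
    return anchor_weight * values[0] + (1 - anchor_weight) * rest

def unanchored(values):
    # Algorithm: every observation in the data set counts equally.
    return sum(values) / len(values)

print(round(anchored(estimates), 2))    # stays pulled toward the anchor
print(round(unanchored(estimates), 2))  # simple pooled estimate
```

With these illustrative inputs, the anchored estimate sits far above the pooled one, even though four of the five sources disagree with the anchor.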

Never meddle with the model

Remember the student bar with its non-functioning card system? Well, the bar continued to lose money and at the next election, the board was voted out. One of the first decisions of the new board was to throw out the card system and go back to using cash and cards. While the former board was encumbered by their previous decisions, the new members were free to make decisions without the baggage of previous mistakes.

“When the facts change, I change my mind. What do you do, Sir?” is a quote often attributed to the economist John Maynard Keynes (although the evidence that he actually said it is inconclusive). As we have seen, because of our cognitive biases, most humans will not change their minds when the facts change; they will simply make the facts fit their existing opinions.

It is only a systematic, active, and unemotional computer model that can exclude cognitive biases from the decision-making process. Humans, of course, build these algorithms, so there are times when the market is going haywire that my colleagues and I find ourselves disagreeing with the model and wondering if we should “meddle with the model”.

This happened as recently as 2019 when our model was indicating to go long on equities, yet many market participants considered equities too expensive. Now, looking back, 2019 was a good year to be long equities, despite the steep valuations. It is because we know these algorithms are specifically designed to eliminate doubt and knee-jerk reactions brought on by cognitive biases that we never “meddle with the model”.

