Behavioural anomalies

In his seminal account of the origins and findings of behavioural economics, Thinking, Fast and Slow, Daniel Kahneman tells the story of his visit to a professional investor, who had just bought tens of millions of dollars of stock in Ford Motor Company.

“When I asked how he made that decision, he replied that he had recently attended an automobile show and had been impressed. ‘Boy, do they know how to make a car!’ was his explanation”.

Kahneman points out that the investor preferred to trust his raw emotion rather than confront what would actually determine whether he would make a profit: whether the stock was underpriced.

“The question that the executive faced (should I invest in Ford stock?) was difficult, but the answer to an easier and related question (do I like Ford cars?) came readily to his mind and determined his choice.”

Prospect theory

Prospect theory stems from psychological studies that attempt to offer a more realistic account of human behaviour than expected utility theory. It studies how real people respond to potential gains and losses in a laboratory setting (“prospect” is another word for “gamble”). In a seminal 1979 paper, Daniel Kahneman and Amos Tversky tested for two famous effects:

  • The reflection effect – people are risk averse over gains, but risk-loving over losses (see the sketch after this list).
  • The certainty effect (also known as the “Allais paradox”) – people underweight outcomes that are merely probable compared with those that are certain.
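
To make the reflection effect concrete, here is a minimal sketch in Python. It uses the value function from Tversky and Kahneman’s later (1992) formulation, with their estimated parameters (alpha = 0.88 for curvature, lambda = 2.25 for loss aversion), and it ignores probability weighting for simplicity – an illustration rather than a full model:

```python
# Prospect theory value function (Tversky & Kahneman's 1992 parameter estimates).
# Concave over gains, convex and steeper over losses.

def value(x, alpha=0.88, loss_aversion=2.25):
    if x >= 0:
        return x ** alpha
    return -loss_aversion * ((-x) ** alpha)

# Gains: a sure 50 versus a 50-50 gamble on 100 or nothing.
print(value(50), 0.5 * value(100))    # ~31.3 > ~28.8: the sure gain wins (risk averse)

# Losses: a sure -50 versus a 50-50 gamble on -100 or nothing.
print(value(-50), 0.5 * value(-100))  # ~-70.4 < ~-64.7: the gamble wins (risk-loving)
```

The same shape of the value function delivers both choices: people take the certain gain, but gamble to avoid the certain loss.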

This launched a field in which economists conducted experimental research and established a number of interesting findings. We can split behavioural anomalies into two types: biases, which are systematic tendencies toward suboptimal decision making; and heuristics, which are rules of thumb that can backfire. Let’s look at some of the more famous ones:

Biases

Studies suggest that people have a tendency to overestimate the likelihood of favourable outcomes – an optimism bias. We tend to think that bad things only happen to other people, which might be comforting, but can lead to problems, as this IMF paper points out:

Concerns that foreign investors may be subject to herd behavior, and suffer from excessive optimism, have grown stronger; and even when flows are fundamentally sound, it is recognized that they may contribute to collateral damage, including bubbles and asset booms and busts.


Ostry et al. (2010)

The UK government are so concerned with “excessive optimism” that they released guidance on how to mitigate it.

We also have a tendency to overestimate our own abilities and ideas – overconfidence.

A majority of people think they are better than average drivers, and when students finish an exam they tend to believe that they did better than they actually did.

We tend to convince ourselves that an event was more predictable than it actually was – hindsight bias. This might be because we have an innate bias for storytelling, and therefore we instinctively attempt to attribute meaning to events even if they’re random. For example, imagine we have 10,000 fund managers who make their investment decisions based on the toss of a coin. For any given year, each has a 50–50 chance of making a profit. After four years there will be around 625 that have made a profit every year. Randomly. And yet these 625 will probably be lauded as heroes. People will clamour for their thoughts. One of them may be “fund manager of the year”. But only around 312 will win again next year. There’s a 50% chance the “fund manager of the year” loses out. Will people say, “that’s regression to the mean”, or “he got cocky”?
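
If you want to check this arithmetic, here is a quick simulation sketch (the seed is arbitrary; the expected counts are 10,000/16 = 625, and then half of that):

```python
# Simulate 10,000 fund managers whose annual profit is a coin toss.
import random

random.seed(1)
managers = 10_000

# How many are profitable in each of the first four years?
streaks = sum(all(random.random() < 0.5 for _ in range(4)) for _ in range(managers))
print("Profitable four years running:", streaks)   # expect around 625

# Of those, how many are profitable again in year five?
winners = sum(random.random() < 0.5 for _ in range(streaks))
print("Still profitable in year five:", winners)   # expect around half of them
```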

In hindsight, people consistently exaggerate what could have been anticipated in foresight. They not only tend to view what has happened as having been inevitable but also to view it as having appeared “relatively inevitable” before it happened. People believe that others should have been able to anticipate events much better than was actually the case. They even misremember their own predictions so as to exaggerate in hindsight what they knew in foresight.

This survivorship bias is incredibly important. If traders who lose money leave the market, then it is no surprise that the ones who remain have a track record of success. But this success is as much a warning sign as a signal of competence. Successful traders will be systematically overrepresented among current traders.

Confirmation bias occurs when you search for or interpret new information in a way that supports existing beliefs, as opposed to trying to challenge them. In a famous study, students were asked to give their opinions on the death penalty. They were then given two pieces of evidence, one suggesting that the death penalty was an effective deterrent, and one suggesting that it wasn’t. You might hope that people would carefully weigh up the evidence and update their beliefs accordingly. In actual fact, the students already in favour of the death penalty found the study supporting their view to be the most convincing, and those against thought that the study confirming their prior beliefs was more compelling. This effect can occur through channels such as:

  • Overemphasizing supporting evidence.
  • Underemphasizing conflicting evidence.

Confirmation bias can also lead to what is known as belief perseverance, which occurs when we stick to a belief despite conflicting evidence. It suggests that if people interpret new information through a mistaken starting assumption, then giving people more information may not lead to better decisions. As Tolstoy said,

“The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.”

Tolstoy

We tend to overestimate how much control we have over events (“the illusion of control”), and thus attribute outcomes to individuals rather than situations – attribution bias. How often have you been in a busy restaurant, faced with a discourteous waiter, and attributed their behaviour to their personality rather than to the stressful circumstances in which they work?

A nice example of attribution bias is provided in this article from The Economist, asking “what if executive memos were clear and honest?”:

We had a dreadful 2020. To be fair, nobody could have reasonably expected the executive team to predict a global pandemic which resulted in widespread economic shutdowns. But by the same token, if managers aren’t at least partly responsible during the bad times, they shouldn’t take full credit for the good times. Most executives are riding on the backs of central bankers who have slashed the cost of capital and on technology pioneers who have made it easier to transact and communicate… So, given that my fellow executives took bonuses in the boom years, we are slashing their salaries by half.

We tend to place a higher valuation on an asset purely by owning it. In one study, students were randomly given tickets to watch a basketball match, and then asked what value they placed on them. Given that the allocation was random, there should not be a significant difference between the two groups. But those who had been given tickets valued them at roughly 14 times the price offered by those who didn’t receive them. If we attach higher values to things once we own them, this has an implication for how markets operate. Economists assume that traders place a value on a particular asset: if the price goes below it they’ll want to buy more, and if the price goes above it they’ll be happy to sell. But as The Economist says, “professional market traders are often reluctant to sell investments they already hold, even though they could trade them for assets they would prefer to invest in if starting from scratch.”

This is also known as the endowment effect.

Heuristics

The representativeness heuristic involves judgements based on stereotypes, overlooking the fact that just because something fits a stereotype does not necessarily make it more likely. For example, if you encountered a Chinese professor and had to guess whether he was a professor of Chinese literature or of psychology, what would you say? The typical answer is the former, because you would expect most professors of Chinese literature to be Chinese.

But this ignores the base rates. There are far more professors of psychology than of Chinese literature, and so there will be more Chinese professors of psychology than of Chinese literature.
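
A back-of-the-envelope calculation shows why the base rate dominates. The numbers below are purely illustrative assumptions invented for this example, but the conclusion holds for any plausible values:

```python
# Hypothetical counts: many psychology professors, few Chinese literature professors.
psych_professors = 10_000
chinese_lit_professors = 200

# Hypothetical shares who are Chinese: low among psychologists, high in Chinese literature.
share_chinese_psych = 0.05
share_chinese_lit = 0.90

print("Chinese psychology professors:", psych_professors * share_chinese_psych)      # 500.0
print("Chinese literature professors:", chinese_lit_professors * share_chinese_lit)  # 180.0
```

Even with a stereotype-friendly 90% share, the sheer number of psychology professors means a randomly encountered Chinese professor is more likely to teach psychology.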

Poker player Annie Duke uses the representativeness heuristic to her advantage. She often encounters opponents who make assumptions about her ability because she is a woman. She categorized her male opponents into three groups, based on how they treated her: the flirting chauvinist (who she’d be nice to, and distract); the disrespectful chauvinist (who underestimated her, so she’d be able to bluff); and the angry chauvinist (who would do anything to avoid being beaten by a woman, so her response was to be patient and wait for them to become reckless).

The availability heuristic is judgement based on the ease with which instances come to mind. Perhaps the most famous example is the widespread fear of flying, despite the fact that on many measures (such as per journey made, per distance, or per time spent travelling) cars are more dangerous. If something is particularly (i) salient (e.g. we overestimate the divorce rate in Hollywood because the examples attract more attention); (ii) dramatic (e.g. strong visual imagery); (iii) personal (we place additional weight on our own experience relative to reading about things that happen to other people); or (iv) recent, these events are more “available” and thus we tend to overemphasize their likelihood of occurring.

Budding pop stars underestimate the difficulty of becoming famous because they only see the successes. This is another example of selection bias (or survivorship bias).

Anchoring is viewing things in relation to an irrelevant comparison point. In a deeply concerning study, experienced judges were asked to roll dice before making a decision about a custodial sentence for a woman who had been caught shoplifting. The dice were loaded to deliver either a 3 or a 9. Even though they were dealing with the exact same case, the judges who rolled a 9 gave an average sentence of 8 months, whereas those who rolled a 3 chose 5 months. Anchors can be an important marketing tool, because how something is framed can generate points of comparison that affect people’s judgement. The classic example was a 1992 study at Stanford that looked at the impact of a new $429 bread maker. Although it had lots of additional functionality it was very expensive, and it didn’t sell many units. However, the company noticed that sales of the standard $279 model almost doubled. Suddenly it didn’t seem as expensive any more. As Amos Tversky said, “we choose between descriptions of options, rather than between the options themselves”.

One of my favourite uses of behavioural economics is to reflect on the design of a menu when I am eating in a restaurant. This analysis by William Poundstone is truly fascinating. We should be very careful about believing too much of the highly disputed social priming literature, but framing effects are fun to think about. Apparently the second cheapest bottle of wine on a menu is actually good value for money, and here is an explanation of the decoy effect (note that this is different to the wine list example I use in class).

The affect heuristic is an over-reliance on our immediate emotional reactions, such as intuition, gut feeling or instinct. It occurs when people let their likes and dislikes determine their beliefs about the world, and can apply to situations where people rely too much on emotion, possibly because of how a problem is phrased. For example, a study split people into two groups and told them about a potential new drug. Group 1 were told that there was a 7% mortality rate. Group 2 were told that there was a 93% survival rate. Note that these are exactly the same thing. But group 2 were more likely to recommend the treatment.

One study showed that people believed that “a disease that kills 1,286 people out of every 10,000” was more dangerous than “a disease that kills 24.14% of the population”, even though the first rate is only 12.86% – barely half the second. We have a tendency to place too much weight on situations that are vivid and imaginable, and too little on ones that are abstract.

That’s a long list of biases and anomalies, and it may strike you as a bit of a ragbag. And to some extent it is – these are what I consider to be the most important ones, but they are simply psychological phenomena that economists have sought to incorporate. A 2010 McKinsey study found that when organizations reduced the effect of biases in their decision-making processes, returns rose by up to seven percentage points. So there are clearly important gains from minimizing the negative impact of behavioural anomalies.

Here’s a nice poster of cognitive biases.

If you are familiar with the Amanda Knox case you should find this essay fascinating, in which she identifies a wide range of behavioural biases that played a role in her wrongful conviction and ongoing reputational damage: “A Surprising Gift from my Wrongful Conviction”. I was flattered that she liked my tweet endorsing it.

If you aren’t familiar with her case, I highly recommend the Meredith Kercher episode of the ‘Women and Crime’ podcast.

  • Lambert, Craig, “The Marketplace of Perceptions”, Harvard Magazine, March–April 2006 – A summary of chief insights from behavioural economics and neuroeconomics
  • Poundstone, William (2011) “Prospect Theory” (Chapter 16) and “Ultimatum Game” (Chapter 18), from Priceless: The Hidden Psychology of Value, Oneworld – Good introductions to key concepts
  • Tabarrok, Alex, “A Phool and His Money”, a review of Phishing for Phools: The Economics of Manipulation and Deception, by George A. Akerlof and Robert Shiller, Princeton University Press – A defence of standard economic theory against behavioural claims
  • Poundstone, William (2010) Priceless: The Hidden Psychology of Value, Oneworld
  • Kahneman, Daniel (2011) Thinking, Fast and Slow, Farrar, Straus and Giroux