The Behavioural Insights Team (BIT), formerly part of the UK’s Cabinet Office under David Cameron’s premiership, has released a study investigating the relationship between online deposit limits and the amounts customers actually deposit. Despite headlines such as “Research suggests gambling spend reduced with deposit limit options removed”, the actual results were not so clear cut.
The BIT examined whether the deposits customers actually made were influenced by the default options in the limit-setting box for online gambling. Customers were randomly divided into three groups. The control group, for whom nothing changed, was presented with a drop-down list with a maximum of £100,000 or no limit. The second group was presented with a drop-down list with a maximum of £250, although they could manually enter a higher amount if they wished, again up to £100,000 or no limit. The third group was presented with an empty box in which they had to enter an amount manually, again with a maximum of £100,000. Forty-five thousand people were invited to join the trial, but only 1,731 (~4%) agreed to do so.
Whilst the average deposit limit for the control group was approximately double that of the other two groups, the actual amounts deposited were £446, £426 and £361 respectively for the three groups. Although the average amount deposited by the third group was 18% lower than the control’s, the difference was not statistically significant, probably because of the small sample sizes.
The BIT were attempting to observe a phenomenon called “anchoring”. Those familiar with the work of the two Israeli psychologists Amos Tversky and Daniel Kahneman will recognise the term, which refers to humans’ tendency to be biased by the first piece of information they see. When making plans or estimates, we judge later information against that original piece, “the anchor”, instead of assessing it objectively. This can skew our judgment and prevent us from revising our plans or predictions as much as we perhaps should.
Tversky and Kahneman, two heroes of mine, changed the way we think about how humans make decisions. Today, they are known as behavioural economists, a term that did not exist in the 1960s. Having successfully united the world of psychology and economics, in 2002, Kahneman was awarded the Nobel Prize in Economic Sciences. He would have shared the prize with Tversky, had Tversky lived; Nobel prizes are not awarded posthumously.
Kahneman was a Jewish refugee who, with his mother and sister, fled war-torn Europe. Tversky was the son of Eastern European immigrants who had escaped the pogroms. Both found it difficult to settle on a career and came upon psychology almost by accident. Kahneman, having delayed national service until he completed his degree, was drafted on graduation, at the age of 21, as head of psychology for the Israeli military.
One of his first tasks was to refine the recruitment process for officers. Realising that people’s judgment of their fellow man was deeply flawed, he removed any subjective assessments, much to the chagrin of the recruiters. He also understood, based on the data, that there were no real differences in the personality requirements for officers in the various branches of the military: the qualities that made a good air force officer were the same as those that made a good infantry officer. The tests he designed were so successful that they are still used today and have been copied by other military forces around the world.
Tversky and Kahneman met in the mid-1960s, and it is remarkable that they were able to work together. Both were incredibly smart and somewhat eccentric. Tversky was an optimist, full of self-confidence; Kahneman an extreme pessimist, consumed by self-doubt. Tversky later said that Kahneman’s continual doubt led them to probe their subject more deeply and ensure the papers they published were impervious to challenge.
Prior to their work, economists and psychologists believed that humans make rational decisions: that we weigh up our options and choose the one that provides the most utility. At the time, most of the relevant psychological and economic theory was built on this idea.
Their initial work investigated “framing”: how, contrary to the belief at the time, people do not make purely rational decisions, but base them on their understanding of fairness, past events and aversion to loss. For example, they found that people’s decisions can be swayed by how a situation is “framed”. When randomly selected groups of people were asked to hypothetically choose a procedure to treat a disease, most preferred a procedure that saved 80 percent of patients to one that killed 20 percent, even though the two describe exactly the same outcome.
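The equivalence of the two frames is simple arithmetic. A minimal sketch, assuming an illustrative population of 1,000 patients:

```python
patients = 1000  # assumed population size, purely for illustration

# Frame A: the procedure "saves 80 percent of patients"
saved = int(0.80 * patients)

# Frame B: the procedure "kills 20 percent of patients"
killed = int(0.20 * patients)

# The two frames describe the identical outcome:
# the same people live and the same people die either way.
assert saved + killed == patients
print(saved, killed)
```

Identical facts, different wording; yet the “saves” frame reliably attracts more support.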
By necessity, having little research money, Tversky and Kahneman changed the nature of psychological research, taking it out of the sterile laboratory environment and into the real world. They recruited university students, people in shopping malls, anywhere they could find them, and asked them simple questions, using nothing more than a clipboard and pen. But they spent hours formulating the questions.
They showed that most people misunderstand probability, but that they do so in a predictable way. A famous question they asked was about Linda. They gave some information about her: 31 years old, single, outspoken, very bright, college graduate, deeply concerned about discrimination and social justice, etc. They then asked which of the following was more likely:
- (a) Linda is a bank teller, or
- (b) Linda is a bank teller and active in the feminist movement.
Most people selected “b” as more likely than “a”. Yet probability theory tells us that the probability of two or more events occurring together can never exceed the probability of any one of those events occurring alone; our internal biases tell us otherwise.
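The conjunction rule behind the Linda question can be demonstrated with made-up numbers: whatever probability we assign to “bank teller” alone, the joint event can never exceed it. A minimal sketch, where both probabilities are assumptions chosen purely for illustration:

```python
# Assumed, purely illustrative probabilities.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.90  # P(active feminist | bank teller)

# P(A and B) = P(A) * P(B | A), and since P(B | A) <= 1,
# the conjunction can never be more probable than A alone.
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(p_teller, p_teller_and_feminist)
assert p_teller_and_feminist <= p_teller
```

However strongly Linda’s description suggests feminism, it can only scale the teller probability down, never up.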
In a series of experiments, they went on to show that humans feel losses more keenly than equivalent gains. For example, the majority of subjects preferred a guaranteed $1,000 to a 50% chance of winning $2,500 or nothing. That would, on the face of it, be a defensibly risk-averse position, but when asked whether they would prefer a certain loss of $1,000 or a 50% chance of losing $2,500 or losing nothing, they chose the riskier option. People’s choices are not symmetrical; they are biased, but biased in a predictable way. Or, as Dan Ariely put it in the title of his 2008 book, we are “Predictably Irrational”.
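Working through the expected values makes the asymmetry clear. A minimal sketch of the four gambles described above:

```python
def expected_value(outcomes):
    """Expected value of a gamble given (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

sure_gain  = expected_value([(1.0, 1000)])             # +1000
risky_gain = expected_value([(0.5, 2500), (0.5, 0)])   # +1250
sure_loss  = expected_value([(1.0, -1000)])            # -1000
risky_loss = expected_value([(0.5, -2500), (0.5, 0)])  # -1250

# On expected value alone, the gamble beats the sure thing on the
# gain side, and the sure thing beats the gamble on the loss side --
# yet most subjects took the sure gain and the risky loss.
print(sure_gain, risky_gain, sure_loss, risky_loss)
```

A consistently expected-value-maximising subject would take the gamble in both cases; a consistently risk-averse one would take neither. Choosing the sure gain but the risky loss matches neither pattern, which is exactly Tversky and Kahneman’s point.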
And this brings me back to the BIT study. It is admirable that research is being commissioned that looks at humans as we actually are and how we might react to something, rather than taking the simplistic view that we will always act rationally. Politicians, regulators and those in the media who believe that simplistic solutions will solve problems would do well to read some of Tversky and Kahneman’s research papers and the work that followed.