Nudging Better Decisions - An Experiment

 

If you want to promote ethical decisions, should you teach your employees about biases?

We explored this question with a large pharmaceutical company. Specifically, we asked whether a series of 'bias cards' outlining common cognitive errors would influence employees' ethical decisions, preferences and behaviour. The ultimate goal was to run a short, accessible experiment that would inform the launch of the organisation's Ethics and Risk strategy the following year.


The Experiment

We randomised 216 employees into two groups, each receiving a different introduction to a 'decision-making task':

  1. The Bias Card Group - ‘The Treatment’ - received three bias cards with examples of common cognitive errors.

  2. The Policy Group - ‘The Control’ - received a few paragraphs from the company’s ethics and decision-making policy document.

 

The impact on decision-making:

#1 The Ethical Scenario: After reading through one of our two introductions, we posed employees an ethical scenario with no right answer: A colleague, Marina, was considering a high-end cinema venue for a medicine product launch. It was heavily discounted behind the scenes, but Marina was worried about the public perception of high spending. Should she go ahead with the high-end venue, or find somewhere lower-key for her launch?

Should Marina use a high-end venue for the product launch?

85% of our Treatment Group thought Marina should find a new venue, compared to 76% of our Control Group. The difference is significant at p < 0.1.
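For readers who want to sanity-check a figure like this, a standard two-proportion z-test reproduces a p-value in this range. A minimal sketch in Python, assuming the 216 participants were split evenly across the two groups (108 each; the exact split is not stated in this write-up):

```python
import math

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 85% vs 76%, assuming 108 participants per group
z, p = two_proportion_z_test(0.85, 108, 0.76, 108)
print(round(z, 2), round(p, 3))
```

Under these assumptions the test gives z of roughly 1.7 and a p-value just under 0.1, consistent with the threshold reported above.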
 

#2 The Economic Decision: To assess present bias and risk aversion, we asked participants to imagine they were considering an insurance payout with a time-delay option: what is the minimum amount of money they would accept in 30 years as an alternative to £300 cash today?

How much money would you accept in 30 years as an alternative to £300 today?

The Treatment Group would accept a minimum of £19,322, representing an annual rate of return of 14.9%. This compared to £8,640 and 11.9% for the Control Group. The difference is significant at p < 0.1.
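The implied annual rate of return here is the compound rate that would grow £300 into the stated minimum over 30 years. A quick check of the figures above:

```python
def implied_annual_return(present_value, future_value, years):
    """Annual compound rate that grows present_value into future_value."""
    return (future_value / present_value) ** (1 / years) - 1

# Minimum acceptable amounts in 30 years, versus £300 cash today
treatment = implied_annual_return(300, 19_322, 30)
control = implied_annual_return(300, 8_640, 30)
print(f"{treatment:.1%}, {control:.1%}")  # prints 14.9%, 11.9%
```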
 

The impact on attitudes towards decision-making:

After the decision-making tasks, we asked participants a short series of questions about their attitudes towards decision-making. The groups' responses differed significantly on two of these questions:

How do you feel about decision-making?

The above percentages represent deviation from the mean score, measured on a 6-point Likert scale from 'strongly agree' to 'strongly disagree'. The differences are significant at p < 0.1.
 

The impact on the willingness to learn more about good decision-making:

Finally, we invited our participants to explore some additional learning content about good decision-making on the company intranet pages, measuring click-through rates as an indication of interest: 65% of employees from our Bias Card Group went on to explore additional content, compared to 61% of employees from our Policy Group, a statistically insignificant difference.


Discussion & Implications

It was fascinating to find that being exposed to cognitive errors through bias cards made our Treatment Group feel less confident about their decisions. There are a few ways to interpret the subsequent impact on the ethical and economic decision-making tasks, but the most coherent is that the reduction in confidence primed risk aversion. Our Treatment Group were more likely to take the safer route on the product launch (book a lower-end venue) and demanded a much higher rate of return on the £300.

While this was a light study, it’s also a great example of how a simple treatment can impact the decisions people make in your organisation. We encourage more firms to follow this methodology and test how different ‘cultural frames’ can make your organisation more or less fair, ethical and sustainable.


Together, the principles of behavioural insight and experimentation can help people make better decisions in your organisation.

Let’s put them to work.

 
MoreThanNow