We know that policy professionals are biased. Everyone is, after all.
Sheheryar Banuri, Stefan Dercon and Varun Gauri have conducted a study on the biases of policy professionals at the World Bank and DFID. They review three biases:
- Confirmation bias: that is, that policymakers will be more likely to accept evidence that confirms their preconceived beliefs.
- Sunk cost bias: that is, that policymakers will be more likely to keep a project open once a certain cost threshold has been met, even in light of evidence showing that the project won’t work.
- Framing and risk bias: that is, that policymakers are more likely to accept risks framed in a positive way (rather than a negative one), and that they will take greater risks when their institutions are on the firing line than when they themselves are.
They found that all three biases are present among World Bank and DFID staff. This is not surprising.
Despite the non-political charter of many bureaucracies, such as the World Bank and DFID, and despite the fact that public institutions are designed to pursue, in Weber’s terms, instrumental/purposive rationality rather than value/belief rationality, significant biases in decision-making are evident.
When the authors considered the impact that deliberation may have on these biases, they found that only confirmation bias was “corrected”: individuals were more accurate in their assessment of evidence after a brief period of deliberation with their peers.
This has a strong and practical implication for think tanks. Publishing and disseminating results (even through policy briefs or other more modern methods) is not enough. Communication of evidence must prioritise opportunities for meaningful engagement: a discussion of the evidence is necessary.
The other two biases showed little sign of budging as a result of deliberation. Sunk cost bias may be institutional, or a career bias in which pushing ahead is seen as a desirable attribute. In this case, evidence should show that funds could be better used elsewhere to deliver the desired objectives, rather than simply showing that the objectives cannot be achieved.
Framing of risk, on the other hand, is highly dependent on preferences. Deliberation may therefore have the opposite effect: risk aversion may be intensified.
A further implication of this study is that policy agencies (and, why not, think tanks) ought to carry out assessments of the cognitive biases of their staff. Hiding these biases (or pretending they do not exist) will not make them go away; it will simply fail to deliver the best possible use of evidence.