How to promote evidence-informed policy? Join the discussion

[This article has been prepared as a report on a session at the 2019 OTT Conference held on 5-7th February in Geneva. It provides feedback to the Hewlett Foundation on a consultation that it hosted at Bellagio on its evidence-informed policy strategy. The session was convened by: Josephine Tsui (ODI), Andrea Ordóñez (Southern Voice), Kerry Albright (UNICEF) and Jeff Knezovich (WHO)]

Broadly speaking, there was a very positive reaction to the four ideas produced post-Bellagio. This article outlines the main contributions made to each of the recommendations put forward by Hewlett to promote evidence-informed policymaking (EIPM). You can read all about them here:

Idea for Action One: Conceptualising the Field

The field is relatively fragmented, with individuals and organisations working in separate spheres (and sectors) and not connecting with each other, often not even using the same concepts or labels.

  • The think tank representatives in the room by and large strongly supported both the rationale behind conceptualising the field and the need to do so; this was the most clearly endorsed recommendation. However, there was some pushback on the need for a more holistic vision of EIPM: given that EIPM is recognised as context- and resource-specific, why not let a thousand flowers bloom?
  • It was suggested that, as part of this, it would be useful to have a one-stop shop for evidence reading lists and repositories, such as Results for All’s monthly email reading list, EBPDN, etc. EIPM literature is currently fragmented across different libraries, making it hard to build a community of practice or to debate EIPM concepts in one place.
  • There was much discussion around how the concept of ‘evidence’ is defined, and the need to diversify the definition and include more voices: some felt, for example, that public opinion should be recognised as a form of evidence alongside formally-generated evidence. In a similar vein, too much evidence is currently cited as ‘gold standard’ without including cost-effectiveness or contextually-relevant information.
  • It is important, too, to define the values that underpin EIPM: it is more than just the evidence; it is ‘good’ policies informed by evidence that matter.
  • The concept of what counts pragmatically as ‘good enough’ evidence in difficult circumstances also resonated. Some felt that different sorts of evidence are required at different points in the policy process, and that for this reason the term ‘knowledge-informed policy’ may resonate better than ‘evidence-informed policy’. It is also important to note that the term ‘evidence’ is not neutral and there is no such thing as ‘the Evidence’. An interesting point was raised about whether any definition of ‘evidence’ is culturally grounded, meaning a universally-agreed definition may be impossible.

Idea for Action Two: Strengthening Messaging and Stimulating Public Engagement

Would it be possible to make evidence-informed policy a voter demand? Would this not generate a demand for evidence on the part of politicians? How, then, to engage the public to adopt the EIPM agenda?

  • There was considerable support for the idea of recognising and articulating the core values behind EIPM and developing a common messaging framework, while acknowledging that EIPM is not a pithy concept.
  • However, there was some concern that the idea for action underplays the difficulty of getting people to think of themselves as EIPM actors: people rarely identify as EIPM specialists, so more work will be needed to encourage them, and the general public, to consider themselves part of this field.
  • Unlike other consultations, the group in Geneva did not express concern at the idea of evidence campaigns, perhaps seeing this as a core part of their public engagement role as think tanks.  
  • An interesting comment asked whether Hewlett really wanted to strengthen the field of EIPM (perhaps seen as slightly more academic) or instead to build an evidence (social) movement based on citizen engagement. Both were seen as valuable.

Idea for Action Three: Institutionalising Evidence Use through Support and Accountability

Could monitoring, acting as watchdogs and holding policymakers to account on their use of evidence be the best way forward? Or should the focus be on support?

  • There was an identified need to think more about ‘people-based’ interventions as well as ‘infrastructure’: for example, how to support relationships, networking and confidence-building, alongside accountability mechanisms.
  • Officials need the capacity to understand and analyse evidence, and an evidence culture needs to be nurtured, before they can be held accountable for its use. Embedded research projects in which policymakers act as the principal investigators could also be considered.
  • There was strong support for a funding mechanism to document and validate success stories and case studies in EIPM, expanding upon the TTI work.

Idea for Action Four: Inspiring Global Commitments to Systematic Use of Evidence

Just as governments make transparency or openness commitments, could they make commitments about the use of evidence in policymaking?

  • Whilst there was support for building upon a global Open Government Partnership-type model, there is also a need to consider the role that regional and country-level networks can play, as well as accountability at different levels within countries.
  • There is a need to acknowledge the unevenness of data availability and collection to inform EIPM within countries: there are huge information asymmetries across districts, so it is important not to think only at the national/macro level when contemplating EIPM investments.

Other Ideas

Ideas that may be missing from the original strategy developed by Hewlett and its partners and which could add value include:

  • The need for a greater focus on politics/politicisation of evidence (already acknowledged) including a more explicit acknowledgement of values, morals and the power of storytelling.
  • Mechanisms/frameworks for adapting impact evaluations to different contexts and the scaling up challenge.
  • The need for a more nuanced understanding of the fact that in some areas, such as youth empowerment, collating and identifying quality evidence to inform policymaking is not so easy: how to measure the unmeasurable?
  • The fact that there was no specific recommendation around generating better-quality evidence was noted: is this perceived as the role of research funding generally? If so, how do people who use the evidence get a better say in what evidence is generated?
  • Hewlett could help articulate the need for long-term investment in universities and funding of PhDs in the global South if it really wants to build a culture of evidence use and scientific decision-making. This long-term institutional strengthening of universities now often falls between the cracks of funding programmes because of the political imperative for many funders to show quick results.
  • Are some key actors missing from this analysis? The role of the media is largely absent, as is the potential to scale up valuable but poorly-funded homegrown fact-checking initiatives.
  • Whilst the value of a funders’ circle in making large-scale investments in EIPM more coherent was recognised, some felt it was important for funders like Hewlett, which are committed to promoting better use of evidence, to advocate for more flexible funding instruments and financing models. There is already an increasing trend towards larger-budget programmes, which may undermine the innovativeness, flexibility and speed of small-scale seed funding and make it harder for think tanks and partners in the global South to apply for such grants. Furthermore, it may be wiser to hedge bets and scatter investments until we identify ‘what works’, rather than have fewer, larger multi-donor programmes.
  • There is an implicit assumption in the document that EIPM is always good; in fact, some of the most striking examples of evidence influencing policy demonstrate how effectively bad evidence or science can shape the decision-making process.
  • The need to consult beyond communities of like-minded people and advocates of EIPM was also echoed: there is a need for public engagement that consults the general public and sceptics to get different perspectives on EIPM, as well as for drawing more on insights from the behavioural science, framing and psychology literature to understand why evidence-thinking often fails to resonate with key parts of the population.
  • There is a need to think further about how to subvert the accountability system for evidence so that it becomes citizen- or user-led rather than donor-led: for example, a real-time ‘TripAdvisor’-style tool to rate the quality of delivery of development services.

Additional Offer

On Think Tanks is encouraging and supporting the establishment of light-touch communities of practice after the conference, so as to steer the conversation forward until the next meeting in 2020. Enrique Mendizabal offered to use this mechanism, if useful, to help facilitate and broker ongoing engagement with Southern think tanks and to solicit further feedback on Hewlett’s proposed next steps. So if you would like to join this discussion, add your voices below!