Evidence-use diagnostic tools: choosing and using them

31 January 2024

One of the strongest points of consensus in the evidence-use field is that context really does matter. We also have a pretty good understanding of what factors affect evidence use. What is harder to understand is how these factors play out in specific contexts.

That’s where evidence diagnostic tools can help. 

Diagnostic tools are used at the design stage of a new initiative aimed at strengthening evidence use. Using one helps to ensure that your efforts are aligned with existing government-led structures and initiatives, rather than replacing or undermining them.

They help to identify windows of opportunity for improved evidence use; to build understanding of existing structures, capacities and working cultures in organisations where change is targeted; and to co-define priorities. 

In this way, they build on the evidence about evidence use, which emphasises the need to understand and respond to contextual political factors in programming, and to accompany change rather than impose it.

Here, I look at six diagnostic tools, identify some key principles they all share, and offer some advice on choosing and using a diagnostic tool.

Six diagnostic tools

Some published diagnostic tools for evidence are outlined below.

1. Guidelines and good practices for evidence-informed policymaking in a government department was developed through a partnership between the Department of Environmental Affairs in South Africa and the Overseas Development Institute. It is focused on the organisational level and offers guiding principles for improving evidence use.

2. Context Matters Framework (Weyrauch et al, 2016) is a participatory, problem-driven framework to co-identify windows of opportunity for improved evidence use with policymakers or other evidence users. The Framework was developed using interviews with more than 50 policymakers and practitioners in the Global South; it considers both the macro and political economy contexts and has an in-depth organisational element.

3. Political Economy + (Shaxson et al, 2021) was developed to assist the Strengthening the Use of Evidence for Development Impact (SEDI) programme in identifying priority sectors for evidence institutionalisation work in Ghana, Uganda and Pakistan (where it was used in nine different sectors). It covers both the macro and organisational levels (the latter approached in two stages).

4. Ecosystemic framework for analysing evidence-informed policy systems for agricultural transformation (Thoto et al, 2023) is a sector-focused approach that was initially used in agriculture in Benin and can be adapted to other sectors. At the time of writing, a pilot was underway in Côte d’Ivoire and further pilots are planned for Togo, Niger, Senegal and Burkina Faso from 2024. It was developed by Actions pour l’Environnement et le Développement Durable (ACED) and the Food and Agriculture Organisation of the United Nations (FAO).

5. Rapid Evidence-Support System Assessments (RESSAs) were developed by the Global Commission on Evidence (2022). RESSAs aim to identify strengths, opportunities for scale-up and the gaps that need to be addressed in each country’s national evidence support system. They cover both the macro and organisational levels.

6. Supporting the routine use of evidence during the policy-making process: a WHO checklist (2023). This is targeted at government agencies, knowledge intermediaries and researchers interested in institutionalising evidence use. While health is the initial focus, it can be applied beyond this context and is currently being piloted and adapted in diverse contexts including Brazil, Trinidad and Tobago, Ethiopia and Thailand.

Common principles across these tools  

In October, we discussed a range of these tools at an event on education policy labs as a mechanism for strengthening evidence use, which was hosted by the Jacobs Foundation and the UK Foreign, Commonwealth and Development Office. 

In our report on evidence use in education for the Jacobs Foundation, we found that while the contextual factors affecting evidence use appear to fall into the same broad categories as in other sectors, evidence diagnostic tools don’t yet appear to be commonly used in education.

One thing that struck us was how many of the tools cover the same things.

1. They pay attention to the broader political economy as well as to knowledge systems and organisational contexts

All of these will have a strong bearing on whether evidence use happens or not – and it’s not enough to look at just one layer, because they all interact with each other.

For example, an organisational assessment alone will give you a great picture of the ministry you want to work with to strengthen evidence use, but it might miss the critical, broader political dynamics that will have a bearing on your work.

Many sectors, including education, already use sophisticated sectoral context analysis approaches. Evidence diagnostics build on such analyses by overlaying an evidence use lens—as demonstrated by the SEDI programme, which brought together experts in political economy, knowledge systems and evidence use to collaborate with sector experts to conduct evidence diagnostics across nine different sectors in three countries.

2. They aim to understand the users before proposing activities

All the tools invest a lot of time and effort in understanding the ways of working of evidence users within government agencies: their working cultures, processes, standards and structures. 

This takes time, and it’s complex. For instance, different parts of a government ministry may have different perspectives on evidence priorities, and varying levels of collaboration and communication with each other.

3. They help you make decisions

These diagnostic frameworks are not primarily research tools; they’re meant to inform decisions and help you to prioritise the actions that need to be taken to improve evidence use.

Evidence diagnostics are used to make decisions such as the following:

  • Where in government should evidence-use activities be targeted? For instance, where should a policy lab sit? Which civil servants should be trained? Where should a new procedure on evidence be introduced?
  • What kinds of evidence use should be targeted in a programme? For example – instrumental, embedded or transparent? 
  • How can the new initiative contribute to strengthening existing structures and systems that are already in place in government to gather and use evidence? 
  • Who are the key stakeholders that the initiative needs to engage with? What are the existing spaces for engagement that the new initiative can get involved in? 
  • What should the timing and sequencing of activities be to respond to windows of opportunity? This might involve, for instance, awareness of planning and budgeting or electoral cycles.

Using a diagnostic tool

If you’re planning on using a diagnostic tool, then you’ve made a great decision!

Using an evidence diagnostic can really help to ground your initiative in a way that responds to context. The learning from the sector has repeatedly pointed out how important this is.

Here are some tips on using diagnostic tools:

1. Don’t create a new one 

You can adapt from one or more existing tools, as they mostly cover the same factors in different ways. Each has distinctive merits, which might be useful in different circumstances.

For instance, the WHO’s tool has a useful model of the stages of institutionalisation, which can help act as a progress marker. 

The Overseas Development Institute’s tool is more organisation-focused, whereas the Global Commission’s tool is more systems-focused.

Both the SEDI Political Economy + tool and the Context Matters Framework aim to cover the same three lenses – political economy analysis, knowledge systems and organisational context – but with different levels of emphasis and different methods. For example, the Context Matters Framework puts more emphasis on the organisational level, whereas Political Economy + puts more emphasis on a broader systems level.

2. Build on existing learning about these tools

Most of the diagnostic tools listed above have published learning and reflections about their use. There’s been a lot of frank reflection on how to use evidence diagnostics, considering content, scope and process. 

Most of these tools have been through various pilots and iterations, so there’s plenty of insight that might be helpful for others who want to use them.

Over the past year, I’ve been part of an informal community of practice convened by WHO EVIPNet, dedicated to debating the institutionalisation of evidence-informed decision making following the launch of Supporting the routine use of evidence during the policy-making process: a WHO checklist. Alongside several partners around the world, we’ve been sharing experiences in developing and piloting these tools. And in 2024, we are planning a webinar series in which we look forward to a broader discussion about institutionalising evidence use. Follow us on social media and subscribe to our newsletter for more details.