I work at the London School of Hygiene and Tropical Medicine as a monitoring and evaluation specialist for the Sanitation and Hygiene Applied Research for Equity (SHARE) research consortium. Outcome mapping (OM) is an iterative approach to project planning, monitoring and evaluation which seeks to identify, engage and work with the key stakeholders who are likely to help foster transformative change. It’s especially relevant for policy influencing, and SHARE has been using it for research uptake over the past three years.
I recently published a report documenting what SHARE has learnt from using outcome mapping. SHARE built upon ODI’s RAPID Outcome Mapping Approach to plan, implement, monitor and evaluate research into use strategies to reach policymakers and practitioners in East and Southern Africa.
SHARE has five implementing partners – four national research institutions and one NGO – who are delivering sanitation and hygiene research projects in Kenya, Malawi, Tanzania and Zambia. Outcome mapping has helped us reach new stakeholders, engage local and national policymakers, leverage new donor funding, and form new partnerships with NGOs and research institutions. While there was an initial learning curve in integrating and applying outcome mapping, our partner research institutions ultimately found OM useful and valuable for their research uptake work.
“It’s a bit of an unusual activity given where we are coming from as a research organisation but I think it’s one that is well worth doing. I think our final outcome mapping document is quite simple and it’s something that we need to share with other research institutions.” – Dr Roma Chilengi, Centre for Infectious Disease Research Zambia
This article highlights five key recommendations from the report – these are likely to be relevant for other research consortia, funders and think tanks interested in better planning and monitoring research uptake. The recommendations are most useful to consider during programme design and implementation, but can be applied at any point in the programme cycle.
1. Invest in staff skills
We found that implementing partners who know their context well are best placed to develop realistic outcome mapping documents and research into use plans. However, specific skills around communications and stakeholder engagement are also needed. Each implementing partner identified or recruited in-country staff with the skills to establish rapport with stakeholders and to identify opportunities for policy influencing.
For example, our partners in Malawi and Tanzania invested in full-time Research into Use Coordinators to lead on this important area. Other partners identified individuals with pre-existing skills, helped build their capacity and ensured they had time to work on research into use. We recommend mapping existing skills and, where there are gaps, investing in capacity building or recruiting staff with the right skills.
2. Resource monitoring and evaluation
Another key area to invest in was monitoring and evaluation – the programme employed me as a full-time M&E specialist trained in outcome mapping. A centralised full-time role was important during the design process to support partners and to provide training. During implementation, M&E was essential to provide ongoing partner support, to manage and consolidate monitoring data, to facilitate learning workshops and to report progress to programme management and our donor. We recommend building in sufficient budget for M&E staff in order to successfully introduce and implement the outcome mapping approach.
3. Consider context
The programme team had initially assumed that implementing partners would have similar progress markers within their outcome mapping plans. However, we discovered that it was more appropriate for each partner to tailor their OM plan to their individual context.
While all our projects were in East and Southern Africa, we found that contexts differed particularly in terms of government stakeholders, who operate differently according to levels of decentralisation. In some contexts the entry point to government is at district level, whereas elsewhere direct interaction with national ministries is more common. We recommend that multi-country programmes consider and acknowledge contextual differences and ensure there is flexibility to tailor progress markers to each setting.
4. Adapt your terminology
Readers who have already worked with outcome mapping will know that it has its own terminology and language. The programme team had not initially factored in that ‘outcome’ has a specific meaning for health researchers, referring to pre-defined, robust, quantifiable and objectively verifiable measures of whether a health impact is attributable to a specific intervention. This is very different from the definitions used in outcome mapping, where outcomes generally refer to observable changes in behaviour, attitudes, relationships or practices.
It was important to clarify that OM focuses on contribution to change rather than direct attribution, and to discuss the differences between OM and results-based management or SMART indicators. We recommend understanding current practices and language in order to translate OM principles into appropriate terminology.
5. Use mixed methods
We linked outcome mapping plans to a programme logframe indicator in order to be able to aggregate and quantify our progress. This provided a simple means of donor accountability but did not answer complex questions about change. Rich qualitative data was therefore still essential for us. We recommend combining OM with quantitative methods as a way to satisfy accountability requirements – while also creating space and flexibility for learning from qualitative data.
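To make that aggregation concrete, here is a minimal, hypothetical sketch of how qualitative progress-marker observations could be tallied into a single quantitative figure for a logframe. This is not SHARE’s actual tooling (the partner names, data and field names are purely illustrative), and it assumes progress markers are graded on OM’s standard ‘expect / like / love to see’ scale.

```python
from collections import Counter

# Hypothetical monitoring journal: one record per observed change in a
# boundary partner, graded against OM's "expect / like / love to see" scale.
# All names and data are illustrative, not SHARE's real monitoring data.
observations = [
    {"partner": "National ministry", "level": "expect", "achieved": True},
    {"partner": "National ministry", "level": "like", "achieved": False},
    {"partner": "District office", "level": "expect", "achieved": True},
    {"partner": "District office", "level": "love", "achieved": True},
]

achieved = [o for o in observations if o["achieved"]]
by_level = Counter(o["level"] for o in achieved)

# The single quantitative figure that could feed a logframe indicator...
print(f"Progress markers achieved: {len(achieved)} of {len(observations)}")

# ...plus a breakdown by depth of change for reporting.
for level in ("expect", "like", "love"):
    print(f"  '{level} to see' markers achieved: {by_level[level]}")
```

A count like this answers the donor’s ‘how much’ question; the narrative recorded alongside each observation is still needed to answer ‘how’ and ‘why’ the change happened, which is where the qualitative data earns its keep.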
If you’d like to learn more about our experience using outcome mapping, including some of the exciting research uptake outcomes that came out of it, you can read the full report on our website.