Public engagement: How to build a global evidence base?

17 January 2022

Done well, public engagement can empower the public and help researchers, scientists and policymakers to identify the most relevant questions and make the best decisions.

Done poorly, it risks marginalising the perspectives most needed to build effective science and policy interventions. Or engagement becomes tokenism, undermining rather than strengthening the trustworthiness of science and policy.

In 2020, the Wellcome Trust commissioned OTT Consulting to explore approaches to enable public engagement communities to more effectively gather and share evidence on public engagement practice. 

Wellcome’s starting point was that public engagement practice is poorly evidenced. And where evidence does exist, it’s often not accessible. This is backed up by our initial survey findings, which suggest that public engagement communities tend to operate in silos, forming ‘islands of activity’ that rarely exchange ideas and practice. All of this makes it difficult to embed and scale good practice to create meaningful impact.

And so, if a funder – or a group of funders – wanted to invest in a global public engagement evidence initiative, what would be the best approach?

To answer this question, we carried out a desk-based literature review, held workshops in India, Peru and the UK, and conducted a series of interviews with international organisations.

In this post, we present some of our findings and three possible structures that funders could pursue in developing a new public engagement evidence initiative. We hope this will help to start a conversation among funders who have an interest or stake in strengthening public engagement practice globally.

What we learned from our research

Below are five considerations that emerged from our research as essential for a public engagement evidence initiative to be effective.

What about a What Works model?

Wellcome asked us to look specifically at the UK Government’s ‘What Works Centres’ as a potential model for a global public engagement evidence initiative (the What Works Centres were set up in 2013 with the aim of improving the way that government and other publicly funded organisations create, share and use evidence in decision making). 

While everyone we spoke to agreed that there is a need to generate evidence on public engagement practice, there was a strong sense that evidence alone isn’t enough. One participant in our London workshop said, we need to ‘put the question mark back into “what works”.’

A broad view of what constitutes evidence

One of the biggest critiques of the What Works model was its narrow view of evidence. The stakeholders we consulted strongly suggested that evidence on public engagement practice must be construed in the broadest sense, encompassing a range of different approaches and methods. Otherwise, there is a risk of ‘ivory tower approaches’ that exclude experiential and practice-based evidence.

This is particularly important given that public engagement, and evidence related to it, tend to operate in a context characterised by stark power asymmetries between publics and power holders. This is exacerbated by research and policy cultures that struggle to articulate, understand or embed public engagement approaches within expert-dominated disciplines.

Complexity and systems thinking 

Meaningful public engagement interventions entail a shift in these power asymmetries. This makes it difficult to evaluate such interventions unless we take a contextual approach that reflects contestation and power dynamics within political systems. 

It follows that to gather useful evidence on public engagement, we need to understand the complex socio-political systems within which engagement operates. Complexity and systems thinking, therefore, would be a helpful starting point for an evidence initiative. 

Using a systems lens, a public engagement evidence initiative could have a dual focus: first, generating and sharing evidence on the conditions necessary for effective public engagement; and second, generating evidence on the changes needed in the underlying complex systems (such as policy, research or democracy) to enable more effective public engagement.

Action learning 

Achieving innovation and change within complex systems requires adaptive strategies and learning to inform adaptation. Our interviews suggest that it would be a mistake to separate evidence gathering from practice and innovation.

Rather than just focusing on evidence generation and sharing, a more effective approach for a public engagement evidence initiative would be an action learning cycle. 

The action learning cycle approach recognises that the practice of – and innovation in – public engagement is itself a useful contributor to evidence gathering. Thus, the types of activities would expand from evidence generation to piloting, innovation, strategic communication of evidence and capacity building of public engagement practitioners.

Field building 

In recent decades, funders have launched several field building initiatives, so we now have a much better idea of what it takes to build a field.

Given that the current field of evidence generation and exchange on public engagement practice is so underdeveloped, any new public engagement evidence initiative should involve an element of field building.

Three possible structures for a new evidence initiative 

We looked at a number of existing evidence initiatives and identified three possible approaches that a funder, or group of funders, could pursue for public engagement:

  1. Commission an external institution to create a new centre
  2. Incubate a new centre within the funding body
  3. Seed-fund a group of institutions to form an international partnership

Examples that we found of funders commissioning new external centres include The Healthcare Improvement Studies Institute, funded by the Health Foundation, and the Alliance for Health Policy and Systems Research, funded by the World Health Organization. It takes time for such a centre to establish itself, and it is relatively resource-intensive. However, it does enable a gradual approach to building relationships with actors already engaged in the field.

Examples of funders who have incubated a new initiative within their own institution include the Wellcome Trust’s independent research initiative Understanding Patient Data and the Nuffield Council on Bioethics, established by the Nuffield Foundation. The benefit of this approach is that it enables quick early growth at relatively low cost. It also positions the initiative to influence practice within the funding organisation. On the flip side, the initiative may struggle to be seen as independent or as globally relevant.

An example of an international partnership was the Research and Policy in Development programme, hosted at the Overseas Development Institute (ODI). Simon Maxwell (Executive Director of ODI at the time) calls this the ‘airline alliance’ model, whereby a group of connected centres, each shaped and informed by its own context and knowledge of local issues, work together while remaining financially independent and sustainable.

We recommend that a network of funders resource the scoping work needed to identify the most feasible of these three options.

Read the full report.