Posts tagged ‘evidence’

The onthinktanks interview: Dr Asep Suryahadi

In this interview, SMERU's Director, Dr Asep Suryahadi, describes his motivations for joining the think tank, the centre's history, and its current and future challenges. Pak Asep explains that, as a think tank in Indonesia, SMERU must balance a number of sometimes competing expectations from multiple stakeholders.


The onthinktanks interview: Jorge Vargas Cullell

Ileana Avalos interviews Jorge Vargas Cullell, Adjunct Director of the State of the Nation Program (Programa Estado de la Nacion), which is an information platform on development for the Costa Rican population. He discusses the role that this program plays among political actors in Costa Rica.


The density model: an alternative to trying to control the uncontrollable

Policy spaces with greater information density are more likely to be well informed. How, then, can we promote greater availability of more and better information and knowledge about key public policy issues? This blog offers some ideas.


The politics of the evidence based policy mantra

Andries Du Toit's paper on the politics of research is one of the best studies on the links between research and policy that I have ever read. It is also one of the few coming from a developing country and written from that perspective -and in English, which will help in getting some of the points it makes across.


Huw Davies: “When contextualised, research has the power to animate, inform, empower or infuriate”

Huw Davies talked to the LSE Impact of Social Sciences blog about how to treat academic research in order to give it the best foundations before it enters the policymaking process.

He said:

Research Does Not Speak For Itself: research needs to be actively translated in communication; it needs to be set in context, and it needs to be brought to life. By itself, ‘research’ is just inanimate data: in conversation and contextualised it has the power to animate, inform, empower or infuriate. It has the latent capacity to become knowledge, or even evidence.

Which is why I think that when donors ask for indicators of research outcomes they are in fact asking for indicators of the efforts to communicate (share, explain, disseminate, popularise, test, etc) research processes and outputs.

Research Does Not Stand Alone: of course any research must be seen in the context of other research, gradually building up a picture of our social world. Individual studies have far less value than careful synthesis and review. But more than this, research needs to be interpreted in the context of local systems, cultures and resources; and explored with an understanding of political sensitivities, expediencies and implementation challenges.

Which means that policy research programmes should never start with the assumption that nothing can be communicated yet (until we have done the research). Their research is not isolated from other research being done alongside it or before it.

Research Has To Be Integrated: research ways of knowing have to be integrated with other forms of knowing: knowing that comes from a complex and sophisticated conceptual understanding of the world (including ideological preferences), and knowing that comes from deep experience, including tacit ways of knowing or feeling.

We tend to be carried away by the idea that there is one type of evidence: research based evidence. But this is not true. Evidence can be arrived at by different means. Knowledge is even more complex. ‘Whose evidence?’ is as important a question as any.

Using Research is Often Not An Event: use of research is often better seen as a dynamic and iterative process, usually faltering, but occasionally dramatic; most often seen better in retrospect than in prospect. Research-based ideas can slowly seep into policy discourse in a slow and percolative way, gradually changing the sense of what is important or what is possible in policy debates.

Hence the ill-conceived idea of commissioning case studies of specific influencing events, or the stories of change that think tanks tend to use (under pressure from donors). If the process can be easily described within the confines of a brief story of change then it is likely to have been one of those exceptions to the rule -and more often than not the consequence of a consultancy.

It’s Not Just Learning – Unlearning Matters Too: letting go of previously cherished notions, conceptual models or so-called ‘facts about the world’ can be as important as the acquisition of new understandings. But this is far from simple: the new does not necessarily displace the old. Sometimes uncomfortable accommodations or amalgamations are made.

But to do this think tanks need to create sufficient space to test and fail (or make mistakes, at least). The pressure to maximise impact (to ensure that all research is policy relevant and useful -and usable) works against this. Think tanks all over the world offer safe spaces from which to launch new innovative ideas. But this requires a mindset that rewards ideas above all.

Knowledge is Often Co-Produced: rather than seeing research as the preserve of technical experts, new policy-relevant knowledge often comes from collaborative processes that break down the distinction between roles – where technical expertise around data meets other forms of knowing rooted in experience or a sense of the possible. Shared journeys can produce shared understandings.

In a project on trade and poverty in Latin America we commissioned a journalist to study the same issues we were studying but from a different point of view. We never really properly integrated these but it would be a good idea to build coalitions that seek to study the same issues from different disciplines and perspectives. Why not build coalitions between academia, think tanks, NGOs, and the media? A soon to be launched DFID Zambia project will be funding several think tanks to study similar issues and then debate their ideas in public -this has never been seen before!

Knowledge Creation is Deeply Social: the creation of knowledge from various ingredients (including, but by no means limited to, research) is therefore a deeply social and contextual process – happening through interaction and dialogue. It reflects a persuasive process triggered as much in the gut as in the brain.

Dialogue is underrepresented in this sector. We talk about engagement and ‘two-way communications’ a lot but not really about research and influence as a dialogue process. The marketplace metaphor has taken over every aspect of the work, including the language we use: demand, supply, etc. We should be talking, as Daniel Ricci says, about a big conversation. Think tanks’ role there is to improve the terms of the dialogue. Jeffrey Puryear wrote that Chilean think tanks’ greatest contribution was the way they helped politicians to learn how to talk to each other and work together.

Not Products But Process: from all of this it makes more sense to think of ongoing processes of knowing than of the creation and sharing of knowledge products, and so…

And so think tanks’ contribution is ongoing. A research output or a policy is a step along the way (a not always linear way). Various outputs may contribute to change over time, or prevent it, depending on how they come together. More importantly, though, we must not forget that all stakeholders involved in the research and policy process have their own history and agency. They are not static players.

It’s Not All About Decisions But More Often About Framings: because research often has the most profound impacts not when it directly underpins specific decisions (instrumentalist action) but instead when it causes shifts in the language, concepts, conceptual models or frameworks that are used to define the contours of the policy landscape. Research can be at its most powerful when it shakes prior certainties, questions core assumptions or even re-shapes cherished values.

It is not about evidence, it is about arguments!

So, when we focus on research as providing evidence for policy decisions we both overplay its short-term role as technical arbiter and undersell its longer-term transformative power.

I could not agree with him more.

And just so you don’t think Huw Davies is a member of the anti-comms brigade, here is LSE’s bio on him:

Huw Davies is Co-Head of School and Professor of Health Care Policy & Management at The School of Management, the University of St Andrews, and he was formerly Director of Knowledge Mobilisation for the UK NIHR ‘Service Delivery and Organisation’ national R&D Programme (2008-10). His research interests are in service delivery, encompassing: evidence-informed policy and practice; performance measurement and management; accountability, governance and trust. Huw has published widely in each of these areas, including the highly acclaimed Using Evidence: How Research Can Inform Public Services (Policy Press, 2007).

Influencing policy and practice: the long and complex road

I think it is well known by all by now that influencing policy and practice cannot be assumed to be a linear and predictable affair. Initiatives that plan for influence within a few months or a year (or even longer) are, it has to be said, stretching the truth a bit too far.

Here is an interesting example of how change happens –but not necessarily in a way that can be planned, logframed, and measured.

Allison Aubrey and Eliza Barclay wrote a short article for NPR’s food blog on the history of changes in public and private policies towards the use of trans fats in food (in the United States). They report on a study that shows that:

the amount of trans-fatty acids in some Americans decreased significantly — 58 percent among white adults between 2000 and 2009. Researchers at the U.S. Centers for Disease Control and Prevention, who published their findings in the Journal of the American Medical Association, say that is “substantial progress.”

The history of trans fats is interesting because it points at a number of elements of the evidence informed or based policy debate. First of all, trans fats were initially promoted as an evidence-based alternative to lard, in the 1980s. So we should be conscious of the fact that there are other views and these can also be backed up by evidence and science.

Science caught up with the new invention and in the 1990s some health activists started to use new findings to denounce it. However, it took until 2006 for the government to require manufacturers to change their behaviour, and even then only in terms of labelling the use of trans fats -not prohibiting it. This came later. Change happens slowly in part because science does not speak with one voice. When advocates refer to science they may try to convince us that all scientists agree, but this is hardly ever the case. And people do not always keep up with all the latest developments in science. Does anyone remember if broccoli causes cancer -or is it in fact good for you?

Another important point to note is that the story is not over. There are policies and new practices on the issue, but these have not eliminated trans fats. They are still used.

So three decades of campaigning, research, policy change and new practices have brought forth a number of changes. And a number of players have been involved: scientists (inventing, promoting, and rejecting trans fats), advocates, policymakers, corporations, restaurants, super markets, nutritionists and other opinion makers and educators, the general public, etc.

Understanding the demand for World Bank research within the Bank

A year or so ago, DFID asked us (in ODI) to assess whether its research and evaluations were being used by DFID staff. Harry Jones and I developed an approach that focused not on the studies themselves (the supply) but on the way that DFID staff made choices and the roles or functions that evidence played in those. Evidence, we assumed, could come from different sources, one of which could be DFID research and evaluations. We did not, however, want to bias our study by focusing on them. And we felt that to say anything useful about the use that these had within DFID we had to do it in a wider context of all other inputs to decision making.

A recent study from the World Bank takes a different approach. Martin Ravallion’s study, Knowledgeable bankers? The demand for research in World Bank operations, focuses on the demand for World Bank research. It does, however, share some of our conclusions:

The methods used affect demand:

Today’s research priorities may well be poorly matched with the issues faced by practitioners in these sectors. For example, the current emphasis on randomized trials in development economics has arguably distorted knowledge even further away from the hard infrastructure sectors where these tools have less applicability (Ravallion, 2009). Making the supply of research more relevant to the needs of development practitioners would undoubtedly help.

Absorptive capacity is crucial:

The differences across units in the demand for the Bank’s research are correlated with the incidence of PhDs and economists, suggesting that internal research capacity in operational units helps create absorptive capacity for knowledge in those units.

Researchers need to make an effort, too, however:

The slope of the relationship between perceived value and familiarity with research is positive but significantly less than unity, suggesting frictions in how the incentive for learning translates into knowledge. The responsiveness of researchers and the timeliness and accessibility of their outputs are clearly important to how much learning incentives lead to useful knowledge.

This study also finds two models that explain how research affects decisions:

In the first, they have a demand for knowledge that does not stem from its direct bearing on their work. Much development research is a public good. Practitioners might read research findings to better understand the world in which they work, even when that understanding is essentially irrelevant to the specifics of that work.

Alternatively, in the second model, research has a direct value in the work of practitioners—such as by informing project choices at the entry stage and assessing impacts later on—and research findings are sufficiently relevant and accessible to assure that practitioners become well informed.

We found a few more options, depending on the type of decisions that staff had to make:

  • In some cases, evidence generation was incorporated into the policy cycle
  • In others, evidence was used to make small incremental changes and corrections to ongoing policies and programmes
  • In other cases, evidence had to catch up to events
  • And in others, more often than not, it was used to make sense of political demands

We concluded that DFID was better at using the evidence than at learning from it. These are two different things. It was also:

much better at using research and evaluation findings during or as part of a project cycle than in more complex and emergent decision making processes.

In other words, it was better at working with a consultant than with an academic (or so I liked to think about it) -and this resonates with Ravallion’s finding of the importance of staff capacity.

This, in turn, points towards a possible mismatch between the ideals and realities of lesson-learning in DFID. For example, research is largely done outside the organisation by increasingly larger consortia with clear incentives to communicate to audiences other than DFID. The incorporation of their findings into DFID policymaking processes then depends on these programmes’ communications capacities, intermediaries (both technology based and knowledge brokers), and DFID staff themselves -who, according to the study, are under increasing time pressures that reduce incentives towards engaging with research and evaluation processes and the analysis of their evidence and findings.

To us this meant that DFID was pushing research away from itself making it a foreign concept. Its efforts to bring it back by hiring Knowledge Brokers (PhDs) to mediate did not seem to fit with what emerged as the more effective model:

The system emerging from this is one where intermediaries between research and evaluation and policy and practice play a significant role.

In other words, learning in DFID (of the kind that promotes the incorporation of analysis into decision making and the development of a learning organisation) is more akin to a system with fewer intermediaries and more direct relations between users and producers of knowledge.

I liked this conclusion because it fits nicely with a belief that we need to pay more attention to people in this business of international development.

Latin American think tanks: using knowledge to inform public policy

A few months ago I reported on an event I helped organise in Lima on think tanks. It was hosted by CIES and co-organised by GDNet, the Think Tank Initiative, CIPPEC, Grupo FARO and the EBPDN. Micaela Pesantes has shared the event’s report with me (and I share it with you: in English, this time).


Looking for evidence in political debates: the case of GMOs in Zambia

I have mentioned the work by Emma Broadbent a few times in this blog. She is conducting a series of case studies for the Evidence Based Policy in Development Network that explore the contribution of research based evidence to the development of political debates in Africa. Zambia Analysis has published a synthesis of the case she wrote on the GMO debate in Zambia. (Zambia Analysis, by the way, is an interesting initiative. It is an effort to improve the policy debate in Zambia and a perfect opportunity to create a new demand for think tanks’ research -I hope donors are paying attention.)

The watershed moment for Zambia’s position on GMOs came in 2002 when, faced with food shortages and widespread starvation, the government chose to reject 35,000 tonnes of food aid from the US because it included GMO maize. The move was sharply criticised by the WFP, FAO and USAID on the grounds that it endangered the lives of starving people and was based on a lack of evidence.

But was it? In fact, the account of the GMO debate in Zambia shows that evidence was used by both sides (for and against).

The ban was not made without advice and deliberation. The decision to go beyond banning the aid shipment and ban all GMO imports came after intense debate and serious attempts to weigh up existing knowledge. After a number of research institutes advised the government against accepting GM maize, a team of Zambian scientists and civil society representatives was sent on a US-funded international study tour and concluded that GMOs could be a health hazard.

The full article is on page 26.

Not evidence but arguments: translating evidence into policy in Ecuador

Orazio Bellettini and Andrea Ordonez, from Grupo FARO, have published a paper on translating evidence into policy in Ecuador, drawing from two policy debates: Fighting Political Clientelism in Social Programs; and the Yasuni ITT Initiative Proposal.

The Yasuni initiative provides an excellent illustration of the relationship between science and policy influence that is often overlooked in the evidence based policy discourse. The assumption is that there is a direct relationship between evidence and policy. But this overlooks the act of translating evidence into policy options and policy recommendations.

This translation is not straightforward. Evidence does not include ‘what to do’. Evidence, or what we call evidence, is about ‘what is happening’, ‘what is working’, ‘what is not working’, ‘what is the probability that something will work’, etc. In the case of the Yasuni, scientists offered evidence along these lines:

“Our first conclusion is that Yasuní National Park protects a region of extraordinary value in terms of its biodiversity, cultural heritage, and largely intact wilderness. This region — the Napo Moist Forests of the Western Amazon — has levels of diversity of many taxonomic groups that are locally and globally outstanding. For example, with an estimated 2,274 tree and shrub species, Yasuní protects a large stretch of the world’s most diverse tree community. In fact, there are almost as many tree and shrub species in just one hectare of Yasuní’s forests as in the entire United States and Canada combined. Yasuní has 567 bird species recorded — 44% of the total found in the Amazon Basin — making it among the world’s most diverse avian sites. Harboring approximately 80 bat species, Yasuní appears to be in the world’s top five sites for bat diversity. With 105 amphibian and 83 reptile species documented, Yasuní National Park appears to have the highest herpetofauna diversity in all of South America. Yasuní also has 64 species of social bees, the highest diversity for that group for any single site on the globe. Overall, Yasuní has more than 100,000 species of insects per hectare, and 6 trillion individuals per hectare. That is the highest known biodiversity in the world.”(Scientists Concerned for the Yasuni, 2004)

This is evidence of the rich biodiversity of the Yasuni. But there is no immediate policy action that can be inferred from this. A policymaker could decide to protect it or to build a road right through it. The decision is not evidence based (although evidence of the rich biodiversity of the area can certainly influence or inform it) but value based. What do the policymakers value more, and why?

Are they willing to put a price on nature? If they are, then they will be quite happy to get rid of the forest if they can identify some clear monetary gains. What if they are not willing to monetise nature?

This is a point I tried to make twice last month: first at the Royal Society to a group of representatives of national scientific societies in Africa and then to the social sciences sector cadre of the Inter-American Development Bank. The point is not that science has no place in policymaking but that we must accept that values play a role too; and a very significant one.

How can think tanks deal with this? Four initial ideas.

Work with others: As the Ecuador case suggests, think tanks must work with other organisations which may be more comfortable with the language of values: political parties, religious groups, NGOs, etc.

Include more views and perspectives: Think tanks should stop talking about multidisciplinary work and make it a reality. One of my favourite things about the RAPID programme was that we were a fairly diverse bunch, each with his or her own views and values (at some point the research team was made up of an economist (Peruvian), a veterinarian (British), an engineer (British Asian), an astrophysicist (British), a mathematician and philosopher (British), and a political scientist (Malawian)). Not just that, but we all had different class, ethnic, political, and religious backgrounds. (I am not sure we ever took advantage of it, though.) A friend recently mentioned that Danish think tanks are apparently dominated by men 10 to 1. I’ve never been one for quotas or affirmative action but it does occur to me that if anything can be done to encourage a more balanced set of skills, backgrounds and views, then maybe it should.

Appeal to values: Think tanks must make an effort to avoid over-stretching their use of science and instead explicitly appeal to values and other sources of power in building their policy arguments. What is wrong with arguing for justice alongside effectiveness? Or standing firm on certain values? The Jesuit Centre for Theological Reflection (JCTR) combines its research with the teachings of the Church in its messages. The Occupy the City of London movement has resorted to asking: what would Jesus do (about the levels of inequality that the financial sector is fuelling)? I am not religious but I can see their point. The understanding of justice that the Catholic Church was built on is not unique to it -it is a universal value.

Build arguments: Communication of findings is not enough when we are trying to change policy. Bigger ideas and arguments are necessary.
