Should think tanks write policy briefs? What an RCT can tell us

[Editor’s note: Jeff Knezovich is the Policy Influence and Research Uptake Manager for the Future Health Systems research consortium, and a frequent contributor to On Think Tanks. This is the first in a series of posts examining the role of policy briefs in communicating research. For a more up to date critique on the final paper see: “A policy brief is a piece of paper. It doesn’t DO anything on its own”]

There is no doubt that policy briefs are an important tool for research communications, especially for policy-oriented think tanks. A recent study from CIPPEC found that 80% of think tanks surveyed across Latin America, Africa and Asia produced some form of policy brief.

‘Yeah, but do policy briefs actually DO anything? Do they actually have impact?’

As someone with ‘policy influence’ in my job title, and as someone who trains researchers and other research uptake professionals on the development of policy briefs, it’s a question I get a lot.

In fact, I remember in January 2011 sitting in Delhi at the headquarters of the International Initiative for Impact Evaluation (3ie), and being posed that exact question by their Senior Advocacy and Policy Officer, Christelle Chapoy. The answer is, of course, both incredibly simple and incredibly complicated.

The simple answer

A policy brief is a piece of paper. It doesn’t DO anything, and is therefore unlikely to have impact on its own.

This is something I try to remind people of all the time when discussing policy influence and research uptake. In fact, most communication outputs by themselves probably aren’t very impactful. It is a shame, then, that these outputs tend to be listed as deliverables in contracts with funders and thus tend to become viewed as an ‘end’ rather than a ‘means to an end’.

Here’s where it gets a bit more difficult. Policy influence requires a concerted effort to engage target audiences at different times with different information and messages – which is why putting together a clear strategy is such an important first step. The type of engagement necessary will depend on the context within which one is working and might depend on a number of factors, including the stage of the policy cycle, how controversial a topic is, the capacity of policy makers to understand evidence and research, the strength of the evidence, etc. (see the RAPID framework for a good list of things to consider).

Getting something on the policy agenda, for instance, requires different information (and a different strategy) than improving the implementation of an established policy. This, in turn, requires a variety of targeted communication products.

Probably even more important is some sort of human interaction – never underestimate the power of a cheeky pint (or more culturally appropriate informal meeting, whether it be tea or coffee, or even a breakfast) with your key audience. In that context, passing them a policy brief might be an easy way for them to recall parts of your discussion, or act as a way to establish credibility in your discussions.

The (even more) complicated answer

OK, so we know that it probably takes more than a piece of paper to influence policy. That said, can some policy briefs be more effective than others? Absolutely.

At a seminar earlier this week at IDS entitled, ‘Research, Knowledge, Action – What can a policy brief really achieve?’, researchers from IDS and 3ie (Penelope Beynon, IDS; Marie Gaarder, formerly of 3ie; and Edoardo Masset, IDS) presented some initial findings from a research study looking into the utility of policy briefs as a tool for research uptake. And I was glad to see an attempt to answer the question posed over a year ago.

Using a randomised controlled trial, or RCT (which presented its own series of challenges), they tested what sort of actions a relevant audience might take after reading a brief. They were particularly interested in digging deeper into a finding from Jones and Walsh (2008) that nearly 80% of researchers and policy makers in the science, technology and innovation field valued hearing the researcher’s opinion in addition to the basic facts. In other words, they were looking at the role of personal authority and credibility.

After inviting participation from some 70,000 individuals through IDS and 3ie contact lists, 807 participants self-selected to take part in the study (which says a lot right there!). Participants were generally well educated and interested in the field of agriculture and development, and they self-reported their level of influence over policy – some indicated they had more and some said they had less.

They divided the sample into four groups. The control group was given an IDS policy brief only vaguely related to the topic of the study. The first experimental group was given a two-page brief that summarised findings from a systematic review on agriculture and nutrition, looking particularly at bio-fortification and home-based farming. The second experimental group received the same brief with a two-page opinion piece by Lawrence Haddad, Director of IDS, attached. The third experimental group received the brief and the same opinion piece, but this time attributed to an unnamed IDS research fellow.

I’ll leave it to the researchers to publish a more thorough analysis of their findings, but a few points from their study I found particularly interesting:

  • Confirmation bias applies, even to policy briefs: I’ve written before about the role of confirmation bias in policy making. It plays out in a lot of ways, one of which is that evidence challenging an existing belief tends to reaffirm that belief rather than change it. In the study, they found that participants who didn’t have any particular opinion on bio-fortification changed their opinion in line with the evidence presented in the brief. But those who had already formed an opinion didn’t budge – even, or perhaps especially, when the evidence ran contrary to their beliefs.
  • The ‘Lawrence effect’ is insignificant: Sorry, Lawrence! We may love you here at IDS, but adding your opinion to the policy brief had only a minor and statistically insignificant effect on whether or not people changed their opinion on the topic or passed the brief on. I should add that using the opinion of an unnamed research fellow didn’t have an effect either, so Lawrence, you may be off the hook! Don’t stop blogging just yet. And on that note…
  • ‘I’d rather change policy than write a blog!’: When it came to what sort of follow-up actions resulted from reading the brief, simple actions like passing it along or telling someone about the findings were most likely. The least likely action cited by respondents was to write a blog about it… even less likely than actually changing a policy. There are any number of explanations for this, including the study sample and their self-perceived responsibilities, but it certainly confirms that convincing researchers and policy makers of the value of blogging is an uphill battle.
  • Getting the wrong message: Over 50% of study participants shared the findings of the policy brief. Unfortunately, when queried about the key message of the brief in a qualitative follow-up interview, many weren’t able to accurately recall the message. Some interviewees even recalled key messages contradictory to those intended. Whoops! This may have been because of a poorly crafted brief, or, more likely, it may have to do with the fact that it was summarising a systematic review – a research approach that tends to end in more questions than answers, and therefore rather unclear messages. Maybe they could have applied Enrique’s taxi driver test to refine their messages. Or, as Arianna Huffington recently pointed out, perhaps we shouldn’t fetishise the notion of ‘going viral’. It also reminds me of previous research by HP Labs that noted how quickly the quality and accuracy of messages decreased as they spread through communities – though I hadn’t expected the decay to set in as early as the first share.
  • Women less prone to share information than men!: It’s the attention-grabbing headline that Marie proposed, if rather glibly, the media might make of the study. But the gender divide was found to be consistent throughout the study, even controlling for a variety of factors. So the question remains: what is it about policy briefs that makes them seemingly less relevant for/to women?

It’s clear that randomised controlled trials can tell us something about the utility and use of policy briefs. But I would also argue that RCTs are not the only arbiter of good knowledge. As such, in the next post I will reflect more on my past experience working with and developing policy briefs for a range of audiences, to see what experience can tell us about how to make policy briefs more effective.

7 Comments
  1. Jeff,

    Great post – I too found the research on policy briefs fascinating, but ultimately I am not sure how it has really helped us here in the IDS central comms team. It was interesting that the control brief used in the research was a policy brief we produced for the MDG summit in the autumn of 2010, based on research by Martin Greeley. This particular brief has rather passed into IDS legend as the one that ended up being reading material for Andrew Mitchell on the plane to NY for the UN Summit. Did it have any impact on policy – who knows – but it did get to the right person at the right time, despite competition from a torrent of think tank and NGO briefs and reports, which is something. Why did it succeed in this way? I am pretty confident it was great timing, the right framing and, of course, the reputation of IDS and our excellent relationship with the SOS’s team.

    But for me the really important function of policy briefs is that their production is a process that allows individuals and institutions to develop positioning and policy. This process mobilises internal support and enthusiasm for the vision of change or messages, which in turn can lead to the mobilisation of resources. Campaigning NGOs have long known this, and they have large policy teams whose sole purpose is to make sense of the research (when it is a research-based initiative) and work with the advocacy people (who may or may not be in the same team) to develop clear, targeted policy asks. In pure research organisations we have a gap between research and communications that is not plugged by this special breed of policy wonks. At IDS we are gradually moving towards a situation where comms staff no longer just edit researchers’ policy briefs but write them from scratch and then seek feedback from the researchers. We are just beginning to be mandated to fulfil this policy wonk function. There are great challenges and great risks in this for sure, but the results are encouraging. Even where researchers are more than capable of drafting their own briefs and have an excellent understanding of their policy audiences, empowering communications or research uptake teams to determine timings, framing and messaging can only increase the chances of more briefs ending up in the hands of more policy makers at the right time and in the right place.
    James Georgalakis (Communications Manager, IDS)

    March 30, 2012
    • Thanks for this James

      This is an interesting contribution. It reminds me of the role for comms teams that I described here:

      I was going to add that the PB is important in as much as it helps to create a brand and a standard product for the organisation: “This is IDS’s policy brief”. Something like the Chatham House Rule that everyone knows about.

      As you say, the PB is not as important as the process of writing the PB. And the PB in a way is something one leaves behind after a meeting (or sends in advance). It is what gets forwarded, etc. But does it influence on its own? Certainly not.

      The real issue is: what do the users need? Maybe in the UK, where PBs are common and well known, it is appropriate for think tanks to produce them. But if policymakers demand other types of communication outputs then these ought to be produced, too. The media would much rather read a press release; academics will prefer a paper (or the abstract); NGOs maybe a story or a summary; etc.

      In developing countries the right communication output will certainly depend on these demands.

      A question I have (will be good to see the study) is whether this was an issue that could be RCT-ed in the first place. Who are these groups that were used in the study? Is it possible to control for politics? (It may be if the issue is not of political interest but then that may affect the reaction to the PB …). Is it even worth it?

      March 30, 2012
      • Louise Daniel #

        I feel there is a fundamental difference between the IDS policy brief by Martin Greeley (and other similar institutional policy briefs discussed by James above) used for the control group and the 3ie/IDS policy brief, ‘Evidence Matters’, which is based on a systematic review (a synthesis of research from all over the place, not any one organisation, and thus not a positioning document). And as was pointed out at the seminar on Monday, and as we found when writing the brief, it is very difficult to draw out policy lessons from a systematic review, because all the studies (for the most part themselves randomised controlled trials) and the SR itself are so very circumspect about formulating policy. Looking at the wider context, as a traditional policy brief is meant to do and does in its stride, goes against the grain for systematic reviews.

        March 30, 2012
  2. Thanks for such valuable information and analysis of what PBs can or cannot achieve. There is another aspect of PBs that should not be underestimated: when think tanks decide to have PBs as a regular communications tool, this becomes a very concrete opportunity for capacity development. In the case of CIPPEC (a think tank based in Argentina), we have so far produced more than 100 PBs with a very diverse degree of quality, relevance, impact, etc. However, there is a clear trend of organizational improvement in terms of selecting policy issues on which to write, producing clear and succinct policy analysis, and developing politically and fiscally viable recommendations, among others. Though it’s true that donors want to see external impact of these pieces, I insist on looking at both internal and external policy influencing objectives. Internal change, i.e. improving your policy influencing capacity, is key to enhancing external policy influence impact. PBs can become an excellent and concrete tool to develop and strengthen multiple capacities and skills related to policy influence.

    April 3, 2012
    • Hi Jeff – I’m glad the seminar gave you food for thought. We are aiming to circulate a summary of results to research participants, and publish on the 3ie and IDS websites towards the end of April – so watch these spaces!

      Some interesting issues were raised at the seminar and also in the comments here. In particular, I agree that when used as part of an influencing strategy, most organisations would never just send a policy brief on its own as the lone soldier in an influencing battle; instead, a policy brief would be (as Enrique says) “something one leaves behind after a meeting (or sends in advance)”.

      But even if we don’t intend that a policy brief will have influence on its own (and don’t rely on this approach to influence our key audiences) we nonetheless put our briefs out on their own and into the public domain on our websites where they can be (and are) read by ANYONE. Our study found that even on its own a policy brief can have some influence on the beliefs held by these wider readers (particularly those who had no prior position on an issue) and that different types of briefs have different effects on their follow up actions – which I think has interesting implications for research communications even if those readers are on the periphery of our key audiences.

      Of course, there are nuances to all of the study findings – for example, it is not so much that we didn’t find ANY Authority effect (here, our Authority was Lawrence Haddad), more that the effect is more evident in the types of action people carry out rather than changes in their beliefs. I hope you will read the full report in due course to find out the rest.

      April 4, 2012

Trackbacks & Pingbacks

  1. Communications handbook for research programmes « on think tanks
  2. “A policy brief is a piece of paper. It doesn’t DO anything on its own” « on think tanks
