{"id":1880,"date":"2012-03-30T13:06:31","date_gmt":"2012-03-30T18:06:31","guid":{"rendered":"https:\/\/onthinktanks.org\/articles\/\/"},"modified":"2016-01-25T13:08:42","modified_gmt":"2016-01-25T18:08:42","slug":"should-think-tanks-write-policy-briefs-what-an-rct-can-tell-us","status":"publish","type":"post","link":"https:\/\/onthinktanks.org\/articles\/should-think-tanks-write-policy-briefs-what-an-rct-can-tell-us\/","title":{"rendered":"Should think tanks write policy briefs? What an RCT can tell us"},"content":{"rendered":"

There is no doubt that policy briefs are an important tool for research communications, especially for policy-oriented think tanks. A\u00a0recent study from CIPPEC<\/a>\u00a0found that 80% of think tanks surveyed across Latin America, Africa and Asia produced some form of policy brief.<\/p>\n

\u2018Yeah, but do policy briefs actually DO anything? Do they actually have impact?\u2019<\/p>\n

As someone with \u2018policy influence\u2019 in my job title, and as someone\u00a0who trains researchers and other research uptake professionals on the development of policy briefs<\/a>, it\u2019s a question I get a lot.<\/p>\n

In fact, I remember in January 2011 sitting in Delhi at the headquarters of the\u00a0International Initiative for Impact Evaluation (3ie)<\/a>, and being posed that exact question by their Senior Advocacy and Policy Officer, Christelle Chapoy. The answer is, of course, both incredibly simple and incredibly complicated.<\/p>\n

The simple answer<\/h2>\n

A policy brief is a piece of paper. It doesn\u2019t DO anything, and is therefore unlikely to have impact on its own.<\/p>\n

This is something I try to remind people of all the time when discussing policy influence and research uptake. In fact, most communication outputs\u00a0by themselves<\/em>\u00a0probably aren\u2019t very impactful. It is a shame, then, that these outputs tend to be listed as deliverables in contracts with funders and thus tend to become viewed as an \u2018end\u2019 rather than a \u2018means to an end\u2019.<\/p>\n

Here\u2019s where it gets a bit more difficult.\u00a0Policy influence requires a concerted effort to engage target audiences at different times with different information and messages<\/a>\u00a0\u2013 which is why putting together a clear strategy is such an important first step. The type of engagement needed will depend on the context within which one is working, including factors such as the stage of the policy cycle, how controversial the topic is, the\u00a0capacity of the policy makers<\/a>\u00a0to understand evidence and research, the strength of the evidence, etc. (see the\u00a0RAPID framework<\/a>\u00a0for a good list of things to consider).<\/p>\n

Getting something on the policy agenda, for instance, requires different information (and a different strategy) than improving the implementation of an established policy. This, in turn, requires a variety of targeted communication products.<\/p>\n

Probably even more important is some sort of human interaction \u2013 never underestimate the power of a cheeky pint (or a more culturally appropriate informal meeting, be it tea, coffee, or even breakfast) with your key audience. In that context, passing over a policy brief can help them recall parts of your discussion, or help establish your credibility.<\/p>\n

The (even more) complicated answer<\/h2>\n

OK, so we know that it probably takes more than a piece of paper to influence policy. That said, can some policy briefs be more effective than others? Absolutely.<\/p>\n

At a seminar earlier this week at IDS entitled, \u2018Research, Knowledge, Action – What can a policy brief really achieve?\u2019, researchers from IDS and 3ie (Penelope Beynon, IDS<\/a>;\u00a0Marie Gaarder, formerly of 3ie<\/a>; and\u00a0Edoardo Masset, IDS<\/a>) presented some initial findings from a research study looking into the utility of policy briefs as a tool for research uptake. And I was glad to see an attempt to answer the question posed over a year ago.<\/p>\n

Using a randomised controlled trial, or RCT (which presented its own series of challenges), they tested what sort of actions a relevant audience might take. They were particularly interested in digging deeper into a finding from\u00a0Jones and Walsh (2008)<\/a>\u00a0that nearly 80% of researchers and policy makers in the science, technology and innovation field valued the researcher\u2019s opinion in addition to the basic facts. In other words, they were looking at the role of personal authority and credibility.<\/p>\n

After inviting participation from some 70,000 individuals through IDS and 3ie contact lists, 807 participants self-selected to take part in the study (which says a lot right there!). Participants were generally well educated and interested in the field of agriculture and development, and they self-identified their level of influence over policy \u2013 some indicated they had more, others less.<\/p>\n

They divided the sample into four groups. The control group was given an IDS policy brief only vaguely related to the topic of the study. The first experimental group was given a two-page brief summarising findings from a systematic review on agriculture and nutrition, particularly bio-fortification and home-based farming. The second group received the same brief with a two-page opinion piece by\u00a0Lawrence Haddad, Director of IDS<\/a>, attached. The third group received the brief and the same opinion piece, but this time attributed to an unnamed IDS research fellow.<\/p>\n

I\u2019ll leave it to the researchers to publish a more thorough analysis of their findings, but a few points from their study I found particularly interesting:<\/p>\n