Helping think tanks work together: 10 recommendations for policy research consortia

16 March 2012

It has now become obvious that funding for research in developing countries is increasingly provided through global or regional policy research programmes. These programmes have some of the following characteristics:

  • They involve more than one policy research centre, or think tank;
  • These centres are based in more than one country – often in more than one region;
  • They involve a leading centre among them (one that holds the contract, manages the programme, and is ultimately responsible) but others share the lead on various aspects of the programme (for instance, research quality, communications, monitoring and evaluation);
  • Research and communications are coordinated across the centres involved; and
  • The programme often constitutes a small project for the centres involved.

This approach is inspired by a number of assumptions and good intentions but, I am afraid, there is very little evidence to support them.

  • It assumes that there is research capacity or that it is possible to do research while building capacity to do it. This is possible, sure, but in many cases the capacity of the centres and individual researchers is so low that they would be better off involved in a capacity building programme and not one that has high quality research and policy change as its objectives. Also, differences in capacities across the participating centres do not help cohesion, and in fact damage the relationships between them. The better developed ones feel they are being slowed down by the others.
  • It also assumes that collaboration leads to better research outputs. This is true when researchers are close to each other (less than 10 meters, it seems) or when they know each other quite well. It also helps if they have similar skills and experience.
  • There is also the hope that these multi-centre and multi-country programmes will encourage south-south learning. This assumes that researchers (and policymakers) from developing countries like to learn from each other. Sure, there is nothing wrong with finding out about others, but developing country researchers (and policymakers) are more likely to want to learn from their peers in developed countries, where resources are more readily available and professional skills and organisational capabilities are more developed. And we must also remember that these centres have mandates to work in their own countries – they may be in developing countries but that does not mean they are part of the international development industry.
  • It also hopes that this collaboration will lead to collective action among the centres. Donors dream that these southern-based organisations will organise and launch global campaigns that can be presented at global events. The truth is that all policy research centres, in the north, the south, the west or the east, are more interested in their own politics than in anyone else’s. Researchers, particularly those with an interest in policy, are obviously too busy participating in the political process. And this is good. This is what we want. Getting them to focus on others (even global policy spaces) is a distraction.

But what worries me most is that these approaches to funding research impose complicated governance structures while limiting the resources available to manage them.

  • Funding for management is limited for most of the programmes I have worked with or know about. This includes limited funds for frequent meetings between the organisations involved; for a professional and well-paid management team (including managers in each of the centres – particularly when each is charged with several projects); for management systems and processes that support planning, implementation, and monitoring; and for the costs of training and mentoring management staff.
  • Funding for communications is also increasingly limited. Donors are not too sure what to do about this anymore. They want more branded materials (which do not sit well with the strategies of the centres themselves) but are not too keen on supporting the more general communication efforts of the centres. This is a shame because developing a communication strategy for a single project when the organisation does not have a strategy of its own is not a good idea.
  • Limited funding for communications is linked to limited funding for internal communications and networking. This is also a shame because if the funder wants good research and collective action across several countries in different regions of the world then more (much more) needs to be invested in this.

These things make no sense. If managing consortia and delivering outputs together (while at the same time developing the capacities of several of the researchers and centres) is so demanding, why are more funds not allocated to it?

Here are some recommendations for these programmes:

  1. (If the countries have been pre-defined) produce a baseline that focuses on the role of research in policy for the issue or sector that the consortium is working on: this will help develop the most appropriate strategies in the proposals. Why not do something like what AusAID has done in Indonesia? (If the countries have not been pre-defined) then make this the first task for the winning consortium: which may mean that the final composition of the consortium should be left open.
  2. (If the countries have been pre-defined) make sure that there is absorptive capacity: or make provisions to do so (see ‘start with the building blocks’ below).
  3. The Terms of Reference should focus on objectives but not on how to achieve them: the how should be proposed by the bidding consortia. By focusing on objectives the funder will benefit from more options and multiple approaches. It will be able to choose from the best ideas and even ask its preferred bidder to incorporate those ideas, or members of other bids. More ideas are usually good.
  4. The objective should be, where possible, to influence the public policy debate: policy change will come later, and when it comes by means of public debate this change will be more sustainable and will, in the process, strengthen the politics (and practice) of the countries where it happens. The idea that policy change is always a good thing is dangerous. When policy change happens as a consequence of opaque lobbying and undue pressure from internal or external players, the policymaking institutions that we seek to develop are being undermined. If, on the other hand, change is encouraged via the strengthening of democratic institutions (political parties, bureaucracies, parliaments, the media, civil society, etc.) then we can be sure that more and more evidence will make it into the policymaking process in the future.
  5. Start with the building blocks: if the consortium is new, the funder should first fund efforts to 1) develop the consortium (and this is very difficult) and 2) ensure that capacity is (at least relatively) homogeneous across the centres. Allow 2–5 years of this before the consortium is asked to move on with research and policy influence. If possible, the leading centre (the one that holds the contract) should not be charged with developing the capacity of the others: why not include an organisation that has that expertise and role? There are plenty of lessons being learned on this; let’s put them into practice.
  6. Spend more on management: the rules for admin costs for NGOs should not apply here. These are not service delivery initiatives where the funds are intended to reach the poor. Funds for research never reach the poor. They are intended to pay for activities carried out by the research centres and so there should not be a rule related to how much can be spent on management. But, beware, management is not the same as overhead. There should be a competent manager in each of the partner organisations in charge of running the projects.
  7. Spend more on communications: again, the rules for marketing of aid should not apply. If the funder wants research to make a contribution then it has to spend on communications. The rule DFID had of 10% (or 30%; other funders have similar rules) needs to be reviewed to encourage the consortia to spend whatever is necessary – in the past I have argued that larger research programmes may need to spend less than smaller ones because the sheer volume of research may be sufficient to have impact. The same percentage for all makes no sense. But the most important thing is to have the right people on the job: dedicated and competent communication managers and officers in each centre.
  8. Where necessary recruit internationally – but always recruit competitively: all posts should be filled by the right people (including researchers). Before a consortium starts working, its team members’ competencies ought to be reviewed. All team members should have the same minimum competencies. If they do not, then an effort to find new team members needs to be made by developing job descriptions and launching an open and international recruitment process. If no successful candidates are found in a country then the consortium can choose to focus on developing the capacity of the weaker partners or simply recruit foreign candidates (this, by the way, can be a great way of inserting new skills into the organisations – and nationality ought to have nothing to do with being good at management, communications or research; otherwise multinationals who move staff across the world would be incapable of making a profit). Also, recruiting internationally does not mean paying ridiculous fees. There are many young yet experienced managers, researchers, and communicators very keen to work abroad. A good salary will do. And, by expanding the pool of potential recruits, the costs will inevitably go down (today, many research centres in the poorest countries ask for astronomical fees because they know there are few alternatives to them).
  9. Research must be seen as one more component: and it may not be the most important one. If the objective is to influence policy and practice then this may be possible by promoting debate, translating existing research, etc. These may even be more cost-effective than the usual ‘do lots of research, then disseminate with a few months to go on the contract, then evaluate’ approach we are more familiar with. Other pure research or mostly research programmes should also be funded.
  10. Evaluate components accordingly: one big evaluation focusing on outcomes is not very useful. Start with the inputs (did they have the right people and resources?) and the activities (is the research of high quality, is the communications strategy appropriate, is the online strategy good, etc.?) first. If they are not sound, then what is the point of evaluating their impact? In these circumstances evaluating impact tells us nothing of the effect the work had.