
Posts tagged ‘capacity building’

Learning about efforts to support better links between Politics and Ideas

Vanesa Weyrauch has published a paper on the lessons she and her colleagues have learned from years of working to strengthen the capacity of researchers and policymakers to connect the communities of politics and ideas in Latin America.


Supporting think tanks series: Lessons from TTI’s Policy Engagement and Communications program in Francophone West Africa

This post, drawing on TTI’s experience in Francophone West Africa, adds interesting views to the Supporting Think Tanks series. It presents important lessons on capacity development for Policy Engagement and Communications.


A new Topic Guide on Politics and Ideas

The Politics & Ideas think net has produced a Topic Guide that covers some of the most important issues on the subject: where ideas come from, funding, the politics of evidence, etc. This is an invitation to add your views and resources: make it your Guide.


Think Tank Initiative 2012 exchange: on building research capacity for ‘young’ think tanks

On Monday, Hans Gutbrod from the Think Tank Initiative shared his views on the coming together of think tanks from across the developing world. Over the next few weeks we will be sharing a number of videos of sessions from the TTI Exchange held in Cape Town earlier this year. I was not able to attend all of them, so I am glad they are now available to see what went on. From the email sent by Julie Lafrance from TTI to the participants:

In an effort to continue the interaction and learning from the TTI Exchange, we will make 2 panels / workshops available each week for the next six weeks for you to view, or share with colleagues, business associates or friends that were not able to join us for the Exchange.  For ease of viewing, each 2 hour session has been split into 15 minute clips for each presenter and the discussions have been segmented into two parts.

The first session is on how to develop the research capacity of young/new think tanks:

Panel A: Building capacity for quality research – challenges and opportunities for “young” think tanks can be found here. The videos are below:

Ajaya Dixit: Institute for Social and Environmental Transition – Nepal (ISET-N)

Watch Ajaya’s talk (not really about building capacity, but an interesting discussion on the limitations of external ‘experts’)

Werner Hermani: Fundación ARU

Watch Werner’s talk

Eberechukwu Uneze: Center for the Study of the Economies of Africa

Watch Ebere’s talk

Ibrahima Hathié: Initiative prospective agricole et rurale (IPAR)

Watch Ibrahima’s talk

And the discussions from the floor: the first part and second part

The peculiar use of training activities as vehicles for policy research uptake in Serbia

[Editor’s note: This blog is part of an ongoing study on communicating complex ideas. This post has been written by Goran Buldioski, Director of the Think Tank Fund, and Sonja Stojanovic, Director of the Belgrade Centre for Security Policy. Their first post can be found here: Civilian control of the state security sector (with special focus on military)]

Capacity building can be both an opportunity for building a network and a vehicle for validating research results. Can it also be designed to support long-term influence? In this blog we share some of the preliminary findings of our exploration into how the Belgrade Centre for Security Policy (BCSP) used trainings to do just that.

Capacity building is the very first thing to come to mind when someone from a think tank mentions training activities. Many think tanks in Central and Eastern Europe, as well as around the world, have prospered thanks to their excellent training programmes. Cohorts of researchers, civil servants, decision makers, journalists, NGO professionals and many others have honed their analytical skills, sharpened their understanding of policy processes or improved their policy expertise on a given subject thanks to various programmes organised by think tanks.

The second thing to come to mind is the building of networks of contacts. For any given think tank, the trainees become an ever-increasing network of contacts: entry points to various public and private institutions, avenues to increase its publicity, potential partners and allies, future consumers of the analysis offered by the think tank, and a budding constituency as a whole. BCSP has turned these contacts into a powerful tool for communication: its mailing list has grown through its training activities, leading BCSP to expand its distribution channels.

However, using training activities as a key vehicle for research uptake is not as common as might be expected. To be fair, many think tanks expose their trainees to the analysis they have produced in the past, and use their reports and projects as case studies throughout the training to explain an idea or illustrate a point. Yet most of these activities are aimed at capacity building and are not consciously designed and structured as a means for research uptake.

The practice of the Belgrade Centre for Security Policy of using training courses as a central tool for communicating policy research is therefore worth noting. BCSP consciously designed a series of training courses addressing the democratic control of the armed forces as the best vehicle to secure the uptake of its research findings by the military elite. The seminars came in different formats: from half-day awareness-raising discussions at the military barracks to a year-long accredited MA course in International Security organised in partnership with the Faculty of Political Science. Some seminars were organised only for military officials and civilians employed in the Ministry of Defence, although the majority targeted a more diverse group composed of young politicians and representatives of civil society, the media and different government agencies.

What made BCSP take such a step?

Context

Between 2001 and 2008, when civil and democratic control of the military was a ‘hot topic’, the military elite and mid-ranking officials nurtured a deep mistrust of BCSP (at that time known as the Centre for Civil-Military Relations). This mistrust had a lot in common with that of military professionals all over the world: a) an overarching mistrust towards any outsiders to the military, and b) specific skepticism about how an external civilian analytical centre could analyse the armed forces better than its in-house research and strategic institutes.

A factor specific to Serbia was the suspicion of foreign espionage masked under the work of civil society, due to the recent conflict with major international actors. Moreover, because of the BCSP founders’ vocal criticism of the military’s involvement in politics and the Yugoslav conflicts, some officers distrusted the capacity of BCSP to provide “objective and constructive knowledge without an activists’ agenda”.

Specific demand

Immediately after the democratic transition, the political elites, made up of the former opposition to Milosevic’s regime, opened a window for BCSP’s engagement, as they were more receptive to non-military advice on the modern military arrangement within a democracy – the type of society that Serbia was trying to become. While military officials were skeptical of the changes, there was still a tendency among some high-ranking and the majority of mid-level military officers to learn about different realities elsewhere in the world and to understand the implications of any incoming reform. Hence a minority was genuinely interested in the reforms, while the rest engaged in order to improve their public image and demonstrate to the political elite that they were not opposing the upcoming reforms. Later in the process, the military’s human resources policy was changed to encourage career officers to seek additional education as a requirement for promotion (this is when BCSP initiated a formal academic programme in partnership with the Faculty of Political Science at Belgrade University).

This target group, having been groomed by the most rigorous education system in the former Yugoslavia, was more open to ‘being educated’ than to being told what to do by external independent analyses aimed at influencing their decisions. Given that in Serbia the spoken word takes precedence over written communication, and that as much importance is given to the messenger as to the message, these people would rather attend a training course than read a book or an analytical report. The centre’s founders were relatively well known to the military officials, albeit not necessarily liked or trusted at the very beginning. Finally, while actively engaged in training delivery and convening many meetings with various stakeholders, the centre never strayed from its key function, i.e. to generate new knowledge and provide timely analysis of current events in this sector in Serbia. All these factors secured the attendance of the officials at the scheduled courses.

What did BCSP do exactly?

BCSP designed these training courses with two goals in mind: a) to train and share knowledge on the subject matter, and b) to systematically present its in-depth research findings, interwoven into the training sessions. Once these two goals were agreed, the key challenge was the design of the courses. The centre opted for an interactive design with many original simulation exercises and role-playing sessions in addition to drier lectures.

The background documents for the simulation exercises and the role-play sessions all contained references to BCSP’s analysis. Observers of the field may object: ‘Nothing new – many organisations use their own research to support their training activities’. BCSP differed in three ways. First, they complemented the existing (and already published) work with new unpublished data or analysis. Second, the training design allowed space for testing the key messages and pitching the data to the relevant policy makers – all in the safe setting of a training course rather than at a public event. Sometimes even meetings held under the Chatham House Rule face the same problem: policy makers are not willing to discuss matters in front of their political opponents, or the discussion becomes a showdown of conflicting arguments (this is very pertinent to Serbia, where open discussion and constructive criticism in public debate are yet to take root).

The flexible design also accommodated discussions among the participants if debate over some substantive point overpowered the educational element. Third, the training courses were also used to further develop the research by getting first-hand access to data that is otherwise very difficult to obtain, e.g. data on the values of the military. With this at hand, BCSP could validate its research findings from different sources and then formulate realistic recommendations.

In sum, the training course became a vehicle for presenting research findings and hosting mini-seminars in a manner conducive to promoting debate over some of the key issues.

Lessons learned and their application – or what can we learn from this practice?

  1. Different mindset/expectations. Decision makers and other policy stakeholders approach a training course with a different mindset than a presentation of a research study or a seminar to discuss a policy issue. By design, the training course centres on the individual participants’ priorities and needs. Naturally, such an approach lowers their defensive guard. In the case of the military, this does not happen at once. At the seminars with security professionals, participants were initially reluctant to speak for fear of being reported by their peers. Another inhibition stems from being perceived as critical in front of a higher-ranking officer. Therefore, BCSP’s facilitators guided senior officers to open up and speak, or to confront other government institutions (e.g. independent oversight bodies), as a way of gaining their trust and that of others. It is also important to demonstrate valuable knowledge and ‘show your good intentions for your country’. Also, in a learning environment there is no emphasis on making decisions – choosing one policy alternative at the expense of another. The learners consider these secondary interests throughout the training, so the pressure is quite low. Add good educational design to this and BCSP was able to establish an open learning environment.
  2. Trainings lend themselves to the presentation of policy analysis in covert or overt ways. With the target group ‘softened’ by the learning environment, the training design should seamlessly interweave the educational components with the data and analysis to be presented within the sessions. To be clear, the training event is not used as a cover for a policy analysis presentation. That would be manipulation. Instead, it is a skilful use of ‘fresh’ analysis as part of the learning process, consciously designing space for positive externalities to emerge. This is easier said than done – designing interactive sessions is hard in its own right. Adding the think tank’s analysis to the background documentation, case studies, role plays and simulation exercises while keeping it separate from daily reality (the simulation should closely reflect reality without analysing it outright) is an art in itself. BCSP has managed to design a successful approach within its policy context. It has also put an emphasis on non-formal and interactive learning methods – an approach that does not come naturally to think tanks, given that most are more prone to academic-style teaching. While the very design of these courses may not be transferable to other topics and/or realities, the awareness of this possibility and the use of interactive non-formal learning methods as a key vehicle for communicating research findings are.
  3. Not a panacea. This works, and should be tried, only when more conventional methods such as one-to-one or group presentations, seminars and workshops fail to secure research uptake. Should the policy makers be cooperative and responsive to the conventional tools, a training course would be a nice additional layer, but may not play the vital role it did in the case of BCSP. Yet one should consider trainings as a tool for research uptake when changing the organisational culture is part of the policy objectives. In other words, you can ‘sell’ a message to a relevant stakeholder during a one-on-one meeting, but you are unlikely to influence values and organisational culture without broader interaction. The freedom to debate and disagree provides an important opportunity to change people’s mindsets. BCSP has designed and carried out many of its training activities with these mid- and long-term policy goals in mind: changing the system as well as the way people discuss military issues.
For a longer debate on capacity building and the use of workshops, see: Developing research communication capacity: lessons from recent experiences

INASP’s reflections on lessons from recent research communication capacity building experiences

[Editor’s note: Dr Alexander Ademokun is the Acting Head of Programme for Evidence-informed Policy Making (EIPM) at INASP. This post is in response to Research communications support: why do donors, think tanks and consultants keep making the same mistakes?, Capacity development: it is time to think about new ways of generating and sharing knowledge, and Developing research communication capacity: lessons from recent experiences]

This is the third in a series of posts in response to a paper that Enrique and Martine produced after evaluating a research communication capacity building project implemented by INASP and ODI. There have been some very thoughtful discussions about this paper on the evidence-based policy in development (ebpdn) discussion forum and a couple of interesting blog posts from Caroline Cassidy from the RAPID team at ODI and Vanesa Weyrauch from CIPPEC.

The report makes some key arguments and recommendations based on the assessment of this initiative. During the discussions on ebpdn, one of the points that came out was that these lessons, while focused on a research communication capacity building project, are relevant to capacity building initiatives more broadly. Some issues from the discussion that I think are worth exploring further in the context of a wider capacity development conversation are:

  • The need to understand the internal systems of the organisation you are working with: In the case of this project it was about internal communication systems, but in other contexts it may be about the organisational culture or finding out how your project fits into a wider organisational strategy. This takes time and is labour intensive, but it is worth doing from the start. It also ensures that even if you are building capacity at the individual level, it fits into a wider institutional plan and the added capacity is more likely to be used. Understanding where your project fits within a wider plan also reduces the tendency to respond to every call irrespective of the ability or capacity to deliver.

For some service providers and intermediary organisations this may mean doing less but better, and also being able to say no to offers of new projects. There is an ongoing conversation within the EIPM team at INASP about how we can balance the need to understand more deeply the organisations and contexts we work in whilst still finding the time to do all the activities we would like to.

  • Work locally and build on what already exists: This message came through very strongly from Enrique’s and Martine’s report. It raises some issues that we need to engage with. For instance, working locally presumes that the capacity and infrastructure exist to deliver the goals of the project. If they do, that’s great. If they don’t, have you got the time and expertise to truly build the necessary capacity at the target institution? You may even find after engaging with the organisation that you are not best placed to deliver what’s needed – will you say so?

The tendency to plug a capacity gap with a highly visible workshop is strong, but we know that to build long-term solutions you will have to engage more. This involves asking whose goals you are working towards and how flexible these goals are – is the capacity that is being developed a goal in itself or a means to an end? The report talks about developing communication strategies with no money for implementation. This led to a lack of interest and a lack of ownership. The model of building on what already exists is illustrated by the example that Enrique gave on the ebpdn, where he has decided to take a step back and work with organisations to build organisational communications strategies before building a strategy for a particular external project. We can all learn from this approach, but it requires commitment from the grantees, intermediaries and donors to recognise that we are not just thinking about our specific project but, again, about the sustainability of the capacity we are trying to develop.

  • Interests, incentives and commitment: We need to take the time to understand why participants in an initiative are there. The report mentions some participants who took part because they felt it was important to donors. For some of us trainers or service providers, an opportunity to try something new or work with a particular organisation may be our incentive. We need to be clear about what we are each trying to achieve by being in the room before we even get started.

This is also linked to commitment. It is easier to commit if we know what we are committing to: what does the end of the project look like and what happens at the end? Does the end mean funds stop coming in, mentoring support stops, or does it simply mean a date three years down the line? Working with all involved – funders, grantees and intermediaries – to clearly define the end of a project (and what it means beyond financial support) is important at the start of the project.

  • Use the right people and understand the context: The report highlighted the value of using local or regional facilitators who may have a better understanding of the context. Over the last few years, INASP has used the training-of-trainers approach to build a cohort with both the capacity and the remit to deliver capacity building initiatives locally. A report from a recent workshop in Asia for trainers of policy makers can be found here. The years doing this work have taught us that just because you are a subject expert does not mean you are a good trainer. Spending the time to find trainers, or to build their capacity to train, is just as important as developing or delivering content. Likewise, getting the trainers to understand the content and context before jumping in to deliver an ‘interactive’ workshop is important. There is only so much small group work, drawing and flipcharts can do if your participants think you don’t understand their realities. This need to understand how to train is an often undervalued aspect of capacity building.

Linked to this is the understanding that workshops are not the magic bullet they are often thought to be. At INASP we use workshops as part of a package of engagement activities. Sometimes this may mean taking the same group of participants through a series of workshops instead of trying to deliver everything in five days. Other options include adding a mentoring process before and after a workshop, or mixing workshops with other learning models, be they online courses, country visits or peer exchanges.

We know most of this but don’t always do it; yet we respond when the same issues are raised. This tells me we want to do better. Using the opportunities and networks we have to share our learning and constructively challenge our approaches is a good thing, and I hope we carry on doing more of it.

Capacity development: it is time to think about new ways of generating and sharing knowledge

[Editor’s note: Vanesa Weyrauch is the Principal Researcher of the Influence, Monitoring and Evaluation Programme at CIPPEC, which she created and has led since 2006. With the support of GDNet, she founded Directores Ejecutivos de America Latina (DEAL), a network of leading think tanks in Latin America. This post is a response to Developing research communication capacity: lessons from recent experiences and can be read alongside Caroline Cassidy‘s own post]

I am not an expert on capacity development per se, but over the last few years at CIPPEC I have practised a combination of activities in Latin America that have helped me reflect on and learn about what seems most promising in terms of helping others improve the way they try to influence policy through research. Much of this work has been performed under EBPDN LA with ODI, and most of it under the programme “Spaces for engagement” with GDNet, a strategic partner for us in this field.

This is why Enrique’s and Martine’s review findings from a recent communications project that RAPID and INASP worked on for IDRC caught my attention. Just in time, I thought. After a couple of years of trying out several ways of improving what we know about policy influence (combining formal training workshops with online courses, technical assistance to think tanks, the design of an innovative M&E methodology focused on policy influence, etc.), we have decided at CIPPEC to develop a capacity building strategy for 2013-2018 that allows us to be more strategic in the way we use our limited resources to help policy research organisations and their leaders enhance policy impact.

Firstly, I believe that some responses to the questions posed by Enrique in his previous post, and some of his recommendations, may vary according to the type of individual or organisation taking part in the initiative (they tend to be very heterogeneous, which on the one hand enriches the exercise but on the other makes it extremely hard to please all participants). At CIPPEC we have had experience training networks on these issues, and even though members might share beliefs, values, research projects, etc., each network member had very different capacities and interests in research communications, as revealed in a pre-diagnosis. So how do we deal with this when resources are scarce? Ideally we would have all the time and resources to work both in groups and individually to support real change processes with viable plans and enable cross-learning, but this is rarely the case. We therefore face the challenge of making the most of what is available; smart decisions that draw on the evidence shared by Enrique and on our own experience are then crucial.

Another key and related decision is whether those offering the support aim to train individuals and/or organisations. The strategies for each differ significantly, and it is extremely difficult to make these decisions at the very beginning of a capacity building project, especially when a diverse group will take part in it.

Finally, another tricky but very profound question is: how do we monitor and evaluate these efforts? How do we know if and how we have contributed to developing this sort of capacity? I agree that stand-alone workshops are not the most desirable strategy, but I have heard of and seen people and organisations make big changes after attending one where excellent trainers were able to raise their awareness of these issues and pose the right questions at the individual and organisational levels. Thus, what are we aiming at, and how will we know if we have done well?

An excellent paper that has significantly influenced how I think about all these issues, and how we plan to further develop our capacity to build capacity at CIPPEC, is “Learning purposefully in capacity development. Why, what and when to measure?” by Peter Taylor and Alfredo Ortiz. We need to develop new thinking about these issues, and this paper triggers that type of thinking for all of us: donors, “trainers” and “trainees”. As we titled one of our handbooks, I believe we are all a little bit learners, teachers and practitioners. That is why ways to generate and share knowledge are increasingly horizontal! For us, online courses have enlarged the opportunity to make this happen, as the knowledge is shared and discussed among peers and colleagues. What the participants of our courses ask, the reflections they make and the real-life examples they share have all largely enhanced the knowledge we share in the next edition of the same course.

Finally, I am more and more convinced of the value of constant cross-fertilisation between theory and practice (which is what we have tried to do in our work all these years), and this will significantly affect what we consider valuable knowledge and how we share it. Sir Ken Robinson has very effectively conveyed the importance of rethinking education: there is a real and increasing call for creativity in the way we treat knowledge. For this, group learning is key: collaboration is the strategy for defining crucial questions and co-building the answers. Spaces – like this blog – where we can share what we know and don’t know about the topics we are passionate about are a promising sign of how the capacity of all – teachers, learners and practitioners (roles that all individuals move between) – can be further developed.

[Editor’s note: If you’d like to join the conversation with a post of your own, please send us an email or tweet. See the original post for inspiration: Developing research communication capacity: lessons from recent experiences]

Research communications support: why do donors, think tanks and consultants keep making the same mistakes?

[Editor’s note: Caroline Cassidy is the Research and Policy in Development (RAPID) Programme’s Communication Officer. This post is a response to: Developing research communication capacity: lessons from recent experiences and can be read alongside Vanesa Weyrauch‘s own response – coming up this Wednesday]

Building capacity to develop research communications skills and competencies for policy influence is not a new thing. There are a multitude of players involved in the process who have been working in this area for years. And evaluating that capacity development is not really a new thing either. So why then should I be writing this blog if what I am about to say is nothing new? Because, despite clear recommendations for better support, time and time again donors, think tanks and consultants come up against the same challenges: leaving research communication to the end of the project, then getting caught up in a cycle of workshops and interventions that are unlikely to have the desired impact, delivered when researchers or teams are already looking to their next area of work.

I arrive at this type of capacity development from ODI’s Research and Policy in Development (RAPID) programme, where I have been working with the team to build on ODI’s years of work helping researchers and organisations, in a variety of contexts, develop the capacity to have an impact in the policy realm. Enrique and Martine’s evaluation findings from a recent communications project that RAPID and INASP worked on for IDRC last year identify some very interesting (though sadly not all new) issues that frequently surface when we do this type of work: contextual concerns – in a short space of time, can a consultant really get to the crux of the project without a strong working knowledge of the context?; support often comes at the end of a project, so it feels ‘tagged on’ as an extra dimension rather than an integral one; and ensuring you have the right people in a team involved in the first place – those who can benefit the most from the support.

One recommendation from Enrique and Martine that I don’t think we at ODI have seen before is assessing demand and talking directly to the grantees who need support before a contract is even signed, and then deciding whether this capacity support should be provided and to whom. This is also related to another lesson from the report, on researcher incentives and pressures beyond communications, and the fact that many do not believe it is their role to engage at all – that it is someone else’s job. Therefore, assessing the demand and finding the right people within the organisation to work with as early as possible is absolutely critical (and then re-evaluating this throughout the duration of the support, as circumstances alter). And if it looks as if the support is not going to have the necessary impact, consultants and think tanks should have the ability to just say no from the outset.

Yet, despite these and other well-established, clear and very sensible principles, there seem to be a few key confounding factors that often impede their implementation:

The first is funding. Although there is a growing consensus on the importance of communicating research, funding for communication has undoubtedly suffered at the hands of the economic downturn and the growing ‘value for money’ agenda. It is not always seen as a major priority in the research cycle and is often too closely, and even wrongly, associated with branding and marketing rather than policy influence. Moreover, even in the communication arena, donors often favour interventions that lead directly to visible outputs like the workshop.

Secondly, as Enrique and Martine emphasise, there is often poor planning: donors and organisations realise quite late in a project and budget cycle that the teams need extra support in this area, but with not much time and little funding, a ‘quick’ workshop is often seen as an immediate ‘magic wand’. As a blog by my colleague Ajoy Datta highlights, workshops do give a good introduction to the topic and some initial support, but are unlikely to make a real impact once the participants have left the building.

I also think that there is still a misconception, at some levels, that researchers and teams shouldn’t be thinking about the communication of their work until later in the process, or indeed towards the end. However, whoever leads on communications needs to engage with stakeholders as early as possible to ensure relationships are cemented and that, ideally, decision-makers have buy-in.

And finally, even if they could do all of the above, donors frequently do not have sufficiently flexible mechanisms and incentives to support a more appropriate response, as discussed in a recent ODI background note: Promoting evidence-based decision-making in development agencies.

So faced with all this doom and gloom, what can be done? While workshops can still be useful, at RAPID we are now trying to incorporate them, where possible, as part of a wider and longer involvement in a project – ideally one where we are involved from the beginning. For example, we are currently working on a two-year project with the International Initiative for Impact Evaluation (3ie) on communications support to their grantees and knowledge management development (at an organisational level), and another three-year project on monitoring and evaluating grant policy influence. The latter is in consortium with three other regional organisations: CEPA (Sri Lanka), CIPPEC (Argentina) and CommsConsult (Zimbabwe). It is an exciting, though we recognise rare, opportunity to work at different organisational levels to do some thinking, develop tools, and carry out research and capacity work in a ‘quality learning laboratory’. Support will be provided by locally based teams working in context, prioritising face-to-face engagement (which does include workshops!) but also using online engagement where necessary. All of this will hopefully help to ensure better impact, longevity and buy-in through stronger, more collaborative relationships between researchers and policy-makers and, from our side, better contextual knowledge.

And for other projects, where we are working with smaller organisations and donor budgets, we are trying to ensure that there is additional support around the workshops through mentoring, field trips and local partners, and we will certainly take on board the recommendations put forward by Enrique and Martine. Sharing evaluation findings in early discussions with donors can also make a big difference: an organisation I am working with decided to implement more face-to-face support because the donor read and assimilated the recommendations from another project’s evaluation report.

Communications capacity development is a constant learning process and there is no best-case, winning magic formula. But nor should there be, because good support is so dependent on the organisation, project, participants and context that ‘shoehorning’ in a ready-made approach or template is not going to work. This report contains some useful principles to guide new forms of support and to encourage donors, think tanks and consultants alike not to fall into the same traps of short-term support that frequently delivers only mediocre results. And above all, interventions are far more likely to become embedded in the life of a project (and hopefully beyond) if they are part of the project from the beginning and not left as an afterthought.

[Editor’s note: Vanesa Weyrauch’s response will come out on Wednesday but if you’d like to join the conversation with a post of your own, please send us an email or tweet. See the original post for inspiration: Developing research communication capacity: lessons from recent experiences]

Developing research communication capacity: lessons from recent experiences

[Editor’s note: This is the first of four blogs on the subject; the others are Research communications support: why do donors, think tanks and consultants keep making the same mistakes?, Capacity development: it is time to think about new ways of generating and sharing knowledge, and INASP’s reflections on lessons from recent research communication capacity building experiences. Join the debate.]

Donors spend millions every year trying to build the capacity of researchers to communicate their work more effectively. Unfortunately, most of it goes on one-off workshops and attempts to get them to do things they are clearly uninterested in. Sometimes it feels that lessons are hardly ever learned. But sometimes opportunities come about that let us reflect and learn.

Last year, ODI and INASP asked Martine Zeuthen and me to review their efforts to build the capacity of a series of IDRC-funded research programmes in Africa. We assessed each one separately and then brought both reviews together in the synthesis below. We found, among other things, that a lot more time needs to be dedicated to planning the interventions.

I am now trying the recommendations listed below in a project I am working on this year with four Latin American think tanks. I’ll report back on how it goes.

Lessons (more detail on these lessons, the recommendations below and the approaches themselves is provided in the document on Scribd or GoogleDocs):

  • The best laid plans… In both cases, as well as in other cases consulted for the purpose of this review, the interventions did not go as planned.
  • An expression of interest does not always imply commitment: Although the grantees had expressed their interest in being involved in the projects, several were not engaged in learning and did not change their approach to research communication as a consequence. In one case, a grantee expressed that their involvement was based on the impression that the project appeared to be important for IDRC and ODI. In other words, their participation was driven more by an interest in being part of such an initiative to satisfy donor demands than by the initiative itself.
  • Researchers have other interests and pressures besides communications: Researchers are often more interested in researching than communicating. Additionally, while an individual project may be a priority for the donor or for the lead partner, it is unlikely to be so for individual organisations or researchers. As a consequence, any activities that are not seen to directly support their core business are unlikely to be given the priority they demand to be effective.
  • Face-to-face is better than virtual, but the web is a good alternative
  • If it is not done at the beginning, then it is probably too late: In all cases the project came about as a final activity for the grantees, added to the project with only months to go. Furthermore, while the support provided was intended to lead to a communication strategy, there were no additional funds to implement such a strategy. As a consequence, researchers had few incentives to engage more than necessary.
  • The right people matter: The ambition was for the people receiving the support to then go on and train or mentor other members of their networks or organisations. Unfortunately, those who participated were not always the right people for this objective. Senior researchers, network coordinators and even communicators may be excellent candidates to make use of any skills learned, but that does not necessarily make them the most appropriate ‘trainers of trainers’.
  • Local or regional facilitators and mentors: INASP’s approach involved using regionally based facilitators and mentors. This had a particularly positive effect on the project. The partners learned from the mentors and enjoyed discussing the specific challenges that they were facing with regional professionals. Conversely, ODI was able to connect with the grantees it was supporting only after visiting their offices, and concerns about the consultants’ lack of familiarity with their context were raised.
  • No one is starting from scratch: All the grantees, to different degrees, have some sort of research communication capacity. In some cases, their personal and professional networks ensure greater levels of impact than any formal research communication strategy could ever promise. Furthermore, many communication tactics and channels that are common in developed countries such as the United Kingdom, and that ODI and INASP are more familiar with, may not be appropriate for the grantees’ contexts.

Recommendations:

  • Start early – right from the beginning: Developing the capacity to communicate should not come as an afterthought. Funders must plan this right from the start, and service providers like ODI and INASP should be careful about being involved if this is not the case.
  • Confirm demand before starting: Even before signing a contract, the service providers should contact the grantees and effectively treat them as clients, inquiring as to their interests, concerns and commitment to the initiative. The service providers must be very clear regarding the time and resources that they will have to allocate to the process. They must also discuss, at length, who the most appropriate people to be directly involved are and what their responsibilities will be.
  • More than a needs assessment: really understand the organisation and its context: The service provider should start by either spending time with the organisation or hosting the relevant people. Above all, the service providers need to understand the culture of the organisations and the policy contexts they seek to affect. This is not something easily achieved through a remote diagnostic.
  • Consider who is the most appropriate source of expertise: It may be that the organisations conducting the assessment are not necessarily the most appropriate when it comes to delivering the support. Would they limit their recommendation to the services they can offer?
  • Build on strengths: The service providers should seek to either improve what they already do or introduce new channels or tactics that build on those that they are comfortable with. This is likely to make a bigger impact than if the consultants bring along an entirely foreign and all-encompassing new approach.
  • Focus on the organisation rather than on single projects: Support should be aimed at strengthening the organisation’s capacity and not just a single project’s visibility. This is likely to attract the support of senior managers, which is crucial for any change to take hold within the organisation. The project itself can be used as a pilot to test the new tactics or channels proposed.
  • Earmark funds to implement whatever strategy they develop: It is unlikely that the organisations will dedicate the necessary time to develop a strategy or plan unless they know that there will be funds available to implement it. Just as the service providers are not helping for free, it is unlikely that these researchers will be able to dedicate the necessary time to the initiative unless their time is covered. The service provider should therefore make sure that there are sufficient funds for this purpose. On the other hand, if the organisation has the funds but is not willing to allocate them to this purpose this should be seen as a sign that there is little buy-in from the leadership.
  • Maximise peer-to-peer exposure: Depending on the kind of skills being shared and the individuals involved, the donors and service providers should attempt to ensure that people with the right experience deliver the support. Researchers, for example, are more likely to respond to other researchers; communication officers to communication officers; and managers to managers. This means that the service providers may have to look beyond their own organisations for the right expertise; in such cases, they can act as facilitators and help the organisations find the most appropriate people for their needs.

Read the full report on Scribd or GoogleDocs.


Capacity building: straight from the lion’s mouth

A few years ago, the BBC showed a series of programmes on ‘the toughest places to be…’. One of these shows was about the toughest place to be a midwife, and it featured a midwife from the Midlands visiting Liberia to work with peers there. The British and the Liberian midwives’ working conditions could not have been more different, yet it did not take long for both to find themselves at ease with each other, reassured by the fact that their jobs and roles in society were fundamentally the same.

The British midwife spent the first few days observing her peers at work, asking questions and stopping to reflect on what she saw. Much of what she saw she found shocking and at times difficult to digest but, if I remember correctly, she held off any judgement and just paid attention.

Towards the end of the show she took charge. Having gained the trust of her Liberian peers, she decided to show them her way of delivering babies (a way that involved a lot less crying and a lot less blood). The reaction of the Liberian midwives was immediate. They got it. They understood why she was doing things differently and what the effect of the differences was. She was not ‘the northerner’ who came to tell them how to do things, but a peer: someone who had the same skills as they had but who, because she had the luck of working in the UK, had access to more information and support.

This was peer to peer learning. No intermediaries were necessary.

As a Latin American working in the United Kingdom, I have always tried to involve Latin Americans in development projects in Africa or Asia. Often, the response I get from donors and clients has been the same: but what do they know about Africa? Worse: what do they know about ‘development’? (Note that ‘development’ here refers to the discipline of development studies.) I think I have only been successful twice, and in both cases the researchers involved were able to deliver better services at a much cheaper rate than any ‘development expert’ from the UK that had been considered.

The aid industry (donors, NGOs, development think tanks, consultants) often presents itself as a broker, but I like to see it (us) more as a buffer or barrier to direct peer-to-peer learning.

On my way to Cairo on Sunday I read a brilliant article in the FT Weekend Magazine on British support to the Afghan Army. Andy McNab wrote about the way British soldiers were shadowing Afghan soldiers and offering advice and support. No workshops, toolkits or consultancies. There wasn’t an intermediary in sight. Only soldiers talking to soldiers.

The contexts in which the British and the Afghans learn, train and work are clearly different. Their cultures are different. Their relationships with other members of society are different. Even their languages are not the same. But as professional soldiers they share a lot more, and this is sufficient to learn from each other. The model has worked so well that even the Americans are now ready to replicate it (they are 18 months behind the British on this one).

One of the best quotes from the article is from Brigadier Patrick Saunders:

“Sheren Shah (Commander of the Afghan National Army) is our boss. It is as simple as that. We are not here to produce British soldiers. We are not here to replicate the British Army. We are preparing the ANA to function without us.”

The point I am trying to make is that when it comes to supporting think tanks in developing countries, maybe the best way forward is to get people with experience in the things that the think tanks want to learn about to spend some time working there. It sounds expensive, but the fact is that there are plenty of young yet experienced researchers, communicators and managers who are actively looking for opportunities to work in developing countries. They may not know much about the local contexts, but they are sufficiently smart to understand them and adapt their approaches to match. By spending time with their hosts they will be able to learn about the nuances of the organisations, advise on new approaches and methods, introduce new attitudes (think tank researchers often tell me that what they need is someone to encourage them to ‘think less like academics’), and even provide links to global networks.

Researcher to researcher; communicator to communicator; manager to manager. Forget intermediaries (unless, of course, it’s to build the capacity of intermediaries).
