
Posts tagged ‘ODI’

Latin American networking for think tanks: a decade in meetings

Latin American think tanks have been meeting each other and sharing lessons for a very long time. In the last decade, a number of efforts have slowly helped build a community of practice that is, now, coming of age. The future of think tank collaboration in the region looks bright.

Read more

WonkComms: the future of think tank communications

What is the future of think tank communications? IPPR, ODI, the Social Market Foundation and the Economist come together at an event in London to try to address this. The report of the event is in itself a perfect example of what can be done with very little effort, but careful planning.

Read more

Corporate websites: do we need them?

Earlier in 2012, Nick Scott wrote about the decline of the corporate website, but then went ahead and redesigned ODI's site. IDS also redesigned and relaunched their website. And so did IIED. (On Think Tanks did, too.) So, are they in or are they out?

Read more

The Power of Film: Turning your talking head video into a story

I want to explore how to take your next steps into the realm of online film to develop something more elaborate: to produce an online ‘story’. Getting this right is almost as important as who you pick to do your filming and is not always an easy process.

Read more

Who should be the next director of the Overseas Development Institute?

This short post is in part an attempt to offer such a (wish) list of the kind of key skills and competencies that I think a think tank director should bring to the position. And of course, this post is not just for or about ODI but is intended to inform other think tanks which may be going through a similar process. I am just using ODI as an example.

Read more

ODI’s award-winning online strategy explained

The ODI digital strategy, first outlined in a series of blogs for onthinktanks.org, was awarded Online Strategy of the Year 2012 at the prestigious Digital Communications Awards, held in Berlin on Friday. ODI beat off competition from multinational corporations and specialist digital agencies to claim this major award. This post is based on the speech given to the jury and explains very succinctly what the strategy is and where/why it has worked.

Read more

INASP’s reflections on lessons from recent research communication capacity building experiences

[Editor’s note: Dr Alexander Ademokun is the Acting Head of Programme for Evidence-informed Policy Making (EIPM) at INASP. This post is in response to Research communications support: why do donors, think tanks and consultants keep making the same mistakes?; Capacity development: it is time to think about new ways of generating and sharing knowledge; and Developing research communication capacity: lessons from recent experiences]

This is the third in a series of posts in response to a paper that Enrique and Martine produced after evaluating a capacity building for research communication project implemented by INASP and ODI. There have been some very thoughtful discussions about this paper on the evidence-based policy in development (ebpdn) discussion forum and a couple of interesting blog posts from Caroline Cassidy from the RAPID team at ODI and Vanesa Weyrauch from CIPPEC.

The report makes some key arguments and recommendations based on the assessment of this initiative. During the discussions on ebpdn, one of the points that came out was that these lessons, while focused on a capacity building for research communication project, are relevant to capacity building initiatives more broadly. Some issues from the discussion that I think are worth exploring further in the context of a wider capacity development conversation are:

  • The need to understand the internal systems of the organisation you are working with: In the case of this project it is about internal communication systems, but in other contexts it may be about the organisational culture or finding out how your project fits into a wider organisational strategy. This takes time and is labour intensive, but it is worth doing from the start. It also ensures that even if you are building capacity at the individual level, it fits into a wider institutional plan and the added capacity is more likely to be used. Understanding where your project fits within a wider plan also reduces the tendency to respond to every call irrespective of the ability or capacity to deliver.

For some service providers and intermediary organisations this may mean doing less but better, and also being able to say no to offers of new projects. There is an ongoing conversation within the EIPM team at INASP about how we can balance the need to understand more deeply the organisations and contexts we work in whilst still finding the time to do all the activities we would like to.

  • Work locally and build on what already exists: This message came through very strongly from Enrique’s and Martine’s report. It raises some issues that we need to engage with. For instance, working locally presumes that the capacity and infrastructure exist to deliver the goals of the project. If they do, that’s great. If they don’t, have you got the time and expertise to truly build the necessary capacity at the target institution? You may even find, after engaging with the organisation, that you are not best placed to deliver what’s needed. Will you say so?

The tendency to plug a capacity gap with a highly visible workshop is strong, but we know that to build long-term solutions you will have to engage more. This involves asking whose goals you are working towards and how flexible those goals are: is the capacity being developed a goal in itself or a means to an end? The report talks about developing communication strategies with no money for implementation, which led to a lack of interest and a lack of ownership. The model of building on what already exists is illustrated by the example that Enrique gave on the ebpdn, where he has decided to take a step back and work with organisations to build organisational communications strategies before building a strategy for a particular external project. We can all learn from this approach, but it requires grantees, intermediaries and donors to recognise that we are not just thinking about our specific projects but, again, about the sustainability of the capacity we are trying to develop.

  • Interests, incentives and commitment: We need to take the time to understand why participants in an initiative are there. The report mentions some participants who took part because they felt it was important to donors. For some of us trainers or service providers, an opportunity to try something new or to work with a particular organisation may be our incentive. We need to be clear about what we are each trying to achieve by being in the room before we even get started.

This is also linked to commitment. It is easier to commit if we know what we are committing to: what does the end of the project look like, and what happens at the end? Does the end mean that funds stop coming in and mentoring support stops, or does it simply mean a date three years down the line? Working with all involved (funders, grantees and intermediaries) to clearly define the end of a project, and what it means beyond financial support, is important at the start of the project.

  • Use the right people and understand the context: The report highlighted the value of using local or regional facilitators who may have a better understanding of the context. Over the last few years, INASP has used the training-of-trainers approach to build a cohort with both the capacity and the remit to deliver capacity building initiatives locally. A report from a recent workshop in Asia for trainers of policy makers can be found here. The years doing this work have taught us that just because you are a subject expert does not mean you are a good trainer. Spending the time to find good trainers, or to build trainers’ capacity to train, is just as important as developing or delivering content. Likewise, getting the trainers to understand the content and context before jumping in to deliver an ‘interactive’ workshop is important. There is only so much that small group work, drawing and flipcharts can do if your participants think you don’t understand their realities. This need to understand how to train is an often undervalued aspect of capacity building.

Linked to this is the understanding that workshops are not the magic bullet they are often thought to be. At INASP we use workshops as part of a package of activities to engage. Sometimes this may mean taking the same group of participants through a series of workshops instead of trying to deliver everything in five days. Other options include adding a mentoring process before and after a workshop, or mixing workshops with other learning models, be they online, country visits or peer exchanges.

We know most of this but don’t always do it; yet we respond when the same issues are raised. This tells me we want to do better. Using the opportunities and networks we have to share our learning and constructively challenge our approaches is a good thing, and I hope we carry on doing more of it.

Capacity development: it is time to think about new ways of generating and sharing knowledge

[Editor’s note: Vanesa Weyrauch is the Principal Researcher of the Influence, Monitoring and Evaluation Programme at CIPPEC, which she created and has led since 2006. She founded Directores Ejecutivos de America Latina (DEAL), a network of leading think tanks in Latin America, with the support of GDNet. This post is a response to Developing research communication capacity: lessons from recent experiences and can be read alongside Caroline Cassidy‘s own post]

I am not an expert on capacity development per se, but over the last few years at CIPPEC I have been a practitioner of a combination of activities in Latin America that have helped me reflect on, and learn about, what seems most promising in terms of helping others improve the way they try to influence policy through research. Much of this work has been performed under EBPDN LA with ODI, and most of it under the program “Spaces for engagement” with GDNet, a strategic partner for us in this field.

This is why Enrique’s and Martine’s findings from their review of a recent communications project that RAPID and INASP worked on for IDRC caught my attention. Just in time, I thought. After a couple of years of trying out several ways of improving what we know about policy influence (combining formal training workshops with online courses, technical assistance to think tanks, the design of an innovative M&E methodology focused on policy influence, etc.), we have decided at CIPPEC to develop a capacity building strategy for 2013-2018 that allows us to be more strategic in the way we use our limited resources to assist policy research organisations and their leaders to enhance policy impact.

Firstly, I believe that some responses to the questions posed by Enrique in his previous post, and some of his recommendations, may vary according to the type of individual or organisation taking part in the initiative (they tend to be very heterogeneous, which on the one hand enriches the exercise but on the other makes it extremely hard to please all participants). At CIPPEC we have had experience in training networks on these issues, and even though network members might share beliefs, values, research projects, etc., each had very different capacities and interests in research communications, as revealed in a pre-diagnosis. So how do we deal with this when resources are scarce? Ideally we would have all the time and resources to work both in groups and individually, to support real change processes with viable plans, and to enable cross-learning, but this is rarely the case. We therefore face the challenge of making the most of what is available; smart decisions that use the evidence shared by Enrique and our own experience are then crucial.

Another key and related decision is whether those offering the support aim to train individuals and/or organisations. Strategies to do so will differ significantly, and it is extremely difficult to make these decisions at the very beginning of a capacity building project, especially when a diverse group will take part in it.

Finally, another tricky but very profound question is: how do we monitor and evaluate these efforts? How do we know if and how we have contributed to developing this sort of capacity? I agree that stand-alone workshops are not the most desirable strategy, but I have heard of and seen people and organisations making a big change after attending one where excellent trainers were able to raise their awareness of these issues and pose the right questions at the individual and organisational levels. Thus, what are we aiming at, and how will we know if we have done well?

An excellent paper that has significantly influenced how I think about all these issues, and how we plan to further develop our capacity to build capacity at CIPPEC, is “Learning purposefully in capacity development. Why, what and when to measure?” by Peter Taylor and Alfredo Ortiz. We need to develop new thinking about these issues, and this paper triggers that type of thinking for all of us: donors, “trainers” and “trainees”. As we titled one of our handbooks, I believe we are all a little bit learners, teachers and practitioners. That is why ways to generate and share knowledge are increasingly horizontal! For us, online courses have enlarged the opportunity to make this happen, as knowledge is shared and discussed between peers and colleagues. What the participants of our courses ask, the reflections they make and the real-life examples they share have all largely enhanced the knowledge we share in the next edition of the same course.

Finally, I am more and more convinced of the value of constant cross-fertilisation between theory and practice (which is what we have tried to do in our work all these years), and this will significantly affect what we consider valuable knowledge and how we share it. Sir Ken Robinson has very effectively conveyed the importance of rethinking education: there is a real and increasing call for creativity in the way we treat knowledge. For this, group learning is key, and collaboration is the strategy for defining crucial questions and co-building the answers. Spaces, like this blog, where we can share what we know and don’t know about the topics we are passionate about are a promising sign of how the capacity of all of us, as teachers, learners and practitioners (roles that all individuals move between), can be further developed.

[Editor’s note: If you’d like to join the conversation with a post of your own, please send us an email or tweet. See the original post for inspiration: Developing research communication capacity: lessons from recent experiences]

Research communications support: why do donors, think tanks and consultants keep making the same mistakes?

[Editor’s note: Caroline Cassidy is the Research and Policy in Development (RAPID) Programme’s Communication Officer. This post is a response to Developing research communication capacity: lessons from recent experiences and can be read alongside Vanesa Weyrauch‘s own response, coming up this Wednesday]

Building capacity to develop research communications skills and competencies for policy influence is not a new thing. There are a multitude of players involved in the process who have been working in this area for years. And evaluating that capacity development is not really a new thing either. So why then should I be writing this blog if what I am about to say is nothing new? Because, despite clear recommendations for better support, donors, think tanks and consultants keep coming up against the same challenges time and time again: leaving research communication to the end of a project, then getting caught up in a cycle of workshops and interventions that are unlikely to have the desired impact, delivered when researchers or teams are already looking to their next area of work.

I arrive at this type of capacity development from ODI’s Research and Policy in Development (RAPID) programme, where I have been working with the team to build on ODI’s years of work helping to develop the capacity of researchers and organisations, in a variety of contexts, to have impact in the policy realm. Enrique and Martine’s evaluation findings from a recent communications project that RAPID and INASP worked on for IDRC last year identify some very interesting (though sadly not all new) issues that frequently surface when we do this type of work: contextual concerns (in a short space of time, can a consultant really get to the crux of the project without having a strong working knowledge of the context themselves?); support that often comes at the end of a project, so that it feels ‘tagged on’ as an extra dimension rather than an integral one; and ensuring you have the right people in a team involved in the first place, who can benefit the most from the support.

One recommendation from Enrique and Martine that I don’t think we at ODI have seen before is assessing demand and talking directly to the grantees who need support before a contract is even signed, and then deciding whether this capacity support should be provided and to whom. This is also related to another lesson from the report, on researcher incentives and pressures beyond communications, and the fact that many researchers do not believe it is their role to engage at all: that it is someone else’s job. Therefore, assessing the demand and finding the right people within the organisation to work with as early as possible is absolutely critical (and then re-evaluating this throughout the duration of the support, as circumstances alter). And if it looks as if the support is not going to have the necessary impact, consultants and think tanks should have the ability to just say no from the outset.

Yet, despite these and other well-established, clear and very sensible principles, there seem to be a few key confounding factors that often impede their implementation:

The first is funding: although there is a growing consensus on the importance of communicating research, funding for communication has undoubtedly suffered at the hands of the economic downturn and the growing ‘value for money’ agenda. It is not always seen as a major priority in the research cycle, and it is often too closely, and even wrongly, associated with branding and marketing rather than policy influence. Moreover, even in the communication arena, donors often favour interventions that lead directly to visible outputs, like the workshop.

Secondly, as Enrique and Martine emphasise, there is often poor planning: donors and organisations realise quite late into a project and budget cycle that the teams need extra support in this area, but with little time and little funding, a ‘quick’ workshop is often seen as an immediate ‘magic wand’. As a blog by my colleague Ajoy Datta highlights, workshops do give a good introduction to the topic and some initial support, but are unlikely to make a real impact once the participants have left the building.

I also think there is still the misconception, at some levels, that researchers and teams shouldn’t be thinking about the communication of their work until later in the process, or indeed towards the end. However, whoever leads on communications needs to engage with stakeholders as early as possible to ensure that relationships are cemented and, ideally, that decision-makers have buy-in.

And finally, even if they could do all of the above, donors frequently do not have sufficiently flexible mechanisms and incentives to support a more appropriate response, as discussed in a recent ODI background note: Promoting evidence-based decision-making in development agencies.

So, faced with all this doom and gloom, what can be done? While workshops can still be useful, in RAPID we are now trying to incorporate them, where possible, as part of a wider and longer involvement in a project, ideally one where we are involved from the beginning. For example, we are currently working on a two-year project with the International Initiative for Impact Evaluation (3ie) on communications support to their grantees and knowledge management development (at an organisational level), and on another, three-year, project on monitoring and evaluating grant policy influence. The latter is in consortium with three other regional organisations: CEPA (Sri Lanka), CIPPEC (Argentina) and CommsConsult (Zimbabwe). It is an exciting, though we recognise rare, opportunity to work at different organisational levels to do some thinking and to develop tools, research and capacity work in a ‘quality learning laboratory’. Support will be provided by locally based teams working in context, prioritising face-to-face engagement (which does include workshops!) but also using online engagement where necessary. All of this will hopefully help to ensure better impact, longevity and buy-in through stronger, more collaborative relationships between researchers and policy-makers and, from our side, better contextual knowledge.

And for other projects, where we are working with smaller organisations and donor budgets, we are trying to ensure that there is additional support around the workshops through mentoring, field trips and local partners, and we will certainly take on board the recommendations put forward by Enrique and Martine. Sharing evaluation findings in early discussions with donors can also make a big difference: an organisation I am working with decided to implement more face-to-face support because the donor read and assimilated the recommendations from another project’s evaluation report.

Communications capacity development is a constant learning process and there is no best-case, winning magic formula. But nor should there be, because good support is so dependent on the organisation, project, participants and context, and just ‘shoehorning’ in a ready-made approach or template is not going to work. This report contains some useful principles to guide new forms of support and to encourage donors, think tanks and consultants alike not to fall into the same traps of short-term support that frequently delivers only mediocre results. And above all, interventions are far more likely to become embedded into the life of a project (and hopefully beyond) if they are part of the project from the beginning and not left as an afterthought.

[Editor’s note: Vanesa Weyrauch’s response will come out on Wednesday but if you’d like to join the conversation with a post of your own, please send us an email or tweet. See the original post for inspiration: Developing research communication capacity: lessons from recent experiences]

Developing research communication capacity: lessons from recent experiences

[Editor’s note: This is the first of four blogs on the subject. The others are Research communications support: why do donors, think tanks and consultants keep making the same mistakes?; Capacity development: it is time to think about new ways of generating and sharing knowledge; and INASP’s reflections on lessons from recent research communication capacity building experiences. Join the debate.]

Donors spend millions every year trying to build the capacity of researchers to communicate their work more effectively. Unfortunately, most of it goes on one-off workshops and attempts to get them to do things they are clearly uninterested in. Sometimes it feels that lessons are hardly ever learned. But sometimes opportunities come about that let us reflect and learn.

Last year, ODI and INASP asked Martine Zeuthen and me to review their efforts to build the capacity of a series of IDRC-funded research programmes in Africa. We assessed each one separately and then brought both reviews together in the synthesis below. We found, among other things, that a lot more time needs to be dedicated to planning the interventions.

I am now trying the recommendations listed below in a project I am working on this year with four Latin American think tanks. I’ll report back on how it goes.

Lessons (more detail on these lessons, the recommendations below, and the approaches themselves, is provided in the document on Scribd or Google Docs):

  • The best-laid plans… In both cases, as well as in other cases consulted for the purpose of this review, the interventions did not go as planned.
  • An expression of interest does not always imply commitment: Although the grantees had expressed their interest in being involved in the projects, several were not engaged in learning and did not change their approach to research communication as a consequence. In one case, a grantee expressed that their involvement was based on the impression that the project appeared to be important for IDRC and ODI. In other words, their participation was driven more by an interest in being part of such an initiative to satisfy donor demands than by the initiative itself.
  • Researchers have other interests and pressures besides communications: Most researchers are more interested in researching than in communicating. Additionally, while an individual project may be a priority for the donor or for the lead partner, it is unlikely to be so for individual organisations or researchers. As a consequence, any activities that are not seen to directly support their core business are unlikely to be given the priority they require to be effective.
  • Face-to-face is better than virtual, but the web is a good alternative
  • If it is not done at the beginning, then it is probably too late: In all cases the support came about as a final activity for the grantees, added to the project with only months to go. Furthermore, while the support provided was intended to lead to a communication strategy, there were no additional funds to implement such a strategy. As a consequence, researchers had few incentives to engage more than necessary.
  • The right people matter: The ambition was for the people receiving the support to then go on and train or mentor other members of their networks or organisations. Unfortunately, those who participated were not always the right people for this objective. Senior researchers, network coordinators and even communicators may be excellent candidates to make use of any skills learned, but that does not necessarily make them the most appropriate ‘trainers of trainers’.
  • Local or regional facilitators and mentors: INASP’s approach involved using regionally based facilitators and mentors. This had a particularly positive effect on the project. The partners learned from the mentors and enjoyed discussing the specific challenges that they were facing with regional professionals. Conversely, ODI was able to connect with the grantees it was supporting only after visiting their offices, and concerns about the consultants’ lack of familiarity with their context were raised.
  • No one is starting from scratch: All the grantees, to different degrees, have some sort of research communication capacity. In some cases, their personal and professional networks ensure greater levels of impact than any formal research communication strategy could ever promise. Furthermore, many communication tactics and channels that are common in developed countries such as the United Kingdom, and that ODI and INASP are more familiar with, may not be appropriate for the grantees’ contexts.

Recommendations:

  • Start early, right from the beginning: Developing the capacity to communicate should not come as an afterthought. Funders must plan for this right from the start, and service providers like ODI and INASP should be careful about getting involved if this is not the case.
  • Confirm demand before starting: Even before signing a contract, the service providers should contact the grantees and effectively treat them as clients, inquiring about their interests, concerns and commitment to the initiative. The service providers must be very clear regarding the time and resources that they will have to allocate to the process. They must also discuss, at length, who the most appropriate people to be directly involved are and what their responsibilities will be.
  • More than a needs assessment: really understand the organisation and its context: The service provider should start by either spending time with the organisation or hosting the relevant people. Above all, the service providers need to understand the culture of the organisations and the policy contexts they seek to affect. This is not something easily achieved through a remote diagnostic.
  • Consider who is the most appropriate source of expertise: It may be that the organisations conducting the assessment are not necessarily the most appropriate when it comes to delivering the support. Would they limit their recommendation to the services they can offer?
  • Build on strengths: The service providers should seek either to improve what the organisations already do or to introduce new channels or tactics that build on those they are comfortable with. This is likely to make a bigger impact than if the consultants bring along an entirely foreign and all-encompassing new approach.
  • Focus on the organisation rather than on single projects: Support should be aimed at strengthening the organisation’s capacity and not just a single project’s visibility. This is more likely to attract the support of senior managers, which is crucial for any change to take hold within the organisation. The project itself can be used as a pilot to test the new tactics or channels proposed.
  • Earmark funds to implement whatever strategy they develop: It is unlikely that the organisations will dedicate the necessary time to develop a strategy or plan unless they know that there will be funds available to implement it. Just as the service providers are not helping for free, it is unlikely that the researchers will be able to dedicate the necessary time to the initiative unless their time is covered. The service provider should therefore make sure that there are sufficient funds for this purpose. On the other hand, if the organisation has the funds but is not willing to allocate them to this purpose, this should be seen as a sign that there is little buy-in from the leadership.
  • Maximise peer-to-peer exposure: Depending on the kind of skills being shared and the individuals involved, the donors and service providers should attempt to ensure that people with the right experience deliver the support. Researchers, for example, are more likely to respond to other researchers; communication officers to communication officers; and managers to managers. This means that the service providers may have to look beyond their own organisations for the right expertise; in such cases, they may act as facilitators and help the organisations find the most appropriate people for their needs.

Read the full report on Scribd.

