
Posts tagged ‘knowledge’

Is knowledge meant to solve technical problems or change the world?

In this second post on the think tanks' Summit, Peter da Costa reflects on Prof. Achille Mbembe's presentation at a recent African think tanks summit and poses an important question: can knowledge ever help change Africa unless it is critically grounded in reality? Otherwise, does it risk being nothing more than a provider of narrow solutions to even narrower expert-defined problems?

Read more

Taking think tank communications to the next level: Becoming fit for purpose (Part 2)

The current set of posts is designed to help organisations pause and reflect on their communications offer, to understand if and in what ways their communication products may need to change. The posts suggest a three-pronged approach: a knowledge audit, a market analysis and an audience assessment. This post focuses on the first two elements: conducting a knowledge audit and a market analysis.

Read more

The onthinktanks interview: Dr Asep Suryahadi

In this interview, SMERU's Director, Dr Asep Suryahadi, describes his motivations for joining the think tank, the centre's history, and its current and future challenges. Pak Asep explains that as a think tank in Indonesia SMERU must balance a number of sometimes competing expectations from multiple stakeholders.

Read more

From non-renewable resources to unlimited knowledge

Think tanks are too often focused on public policy: education, health, macroeconomics, etc. Few take notice of key sectors of the economy. There is a need for more think tanks to focus on sectors or industries of great importance to developing countries and on their natural resources: moving from extractive economies to knowledge economies.

Read more

Paradox of Hoaxes: How Errors Persist, Even When Corrected

Here is an interesting challenge that think tanks face on a regular basis, a challenge often created by other think tanks and linked to the fact that think tanks CAN get it wrong. In Paradox of Hoaxes: How Errors Persist, Even When Corrected, Samuel Arbesman argues that:

Despite our unprecedented ability to rapidly learn new things and crowdfix mistakes, Knowledge and its sinister twin Error continue to propagate in complex and intriguing ways.

Even after an error has been corrected (false information has been updated, a flawed theory has been refuted, or a lie has been caught and shamed) it has a high chance of making a comeback. Like one of those joke candles of our youth.

What caught my attention in this article is Arbesman’s excitement. The world of think tanks is full of talk about evidence-based policy and grand programmes based on single studies, a few months’ worth of research, and one or two pilots at most. Donors often fund think tanks in developing countries in ways that avoid overlap: one focusing on health, another on growth, another on education, etc. But as Arbesman points out:

It would be so convenient and predictable if all knowledge stood the test of time. But if that were the measure of being a scientist, then no one would be a scientist. No one would explore or write or even be willing to read about our latest (even if recapitulated or inaccurate) findings. Of course, we still have to be scrupulous; but the good news is that while knowledge is fickle and changing, the way it changes does obey some rules and regularities. There is a method to the madness.

So we should all keep in mind what a former professor of mine said after lecturing his classes on a certain scientific topic on a Tuesday. On Wednesday, he read a paper that was published and that invalidated the lecture. On Thursday, he went into class and told his students, “Remember what I told you on Tuesday? It’s wrong. And if that worries you, you need to get out of science.”

Here, then, is a role for think tanks. As they pursue certain policy decisions they must also ensure that their ideas are sound (technically, politically, ethically, etc.). The often heard comment that ‘all the research has been done’ should be grounds for concern about the intellectual robustness of the organisation. Especially in the field of social sciences, where things can never be 100% certain.

K* (and * stands for what exactly?)

Knowledge management sounds as if we are controlling knowledge. Knowledge facilitator sounds as if we are not getting involved. Knowledge translator sounds as if we are just using Google Translate. Knowledge transfer? Intermediary? Etc. All that and more is the subject of the K* event being organised in Canada this week.

I’ve heard about this for some time now but am still not sure what it is supposed to be about, although everyone seems to be in on it. (Even Appleton Estate rum.)

Dr. Alex Bielak is the main proponent of this idea/event and has shared some ideas on the GDNet blog. In a post titled What is KStar Initiative and why do we need it? he says:

er… he doesn’t really define it. Instead:

What was important to us was “getting on with it”, and not letting the terminology – important as it might be – get in the way

Ultimately I don’t think we should be spending a lot of time debating what we call specific elements

I am being unfair. He has a video in which he tries to describe what K* is. K* is an attempt to stop the expansion of meaningless but interrelated terms used to describe similar activities and roles. Instead of having lots of different groups, let’s have one, in other words. I agree with this. Jargon can be addictive. But it feels a bit contradictory that, to get rid of jargon, the proponents of K* have created more jargon.

I do not disagree with either of these statements. It feels, however, that dedicating a whole conference to the concept of K* is ironic, to say the least. It also feels a bit odd that one of the conference’s objectives is to help practitioners demonstrate their impact. So is it not yet clear that they are important?

But back to the concept. Alex Bielak does offer some guidance in the form of a framework (diagram) that points at what he means by K*. There is more in the Green Paper but I warn you that it is full of jargon (and, granted, lots of interesting literature). Let us see:

  • Push and pull: The framework assumes that policy pulls and research pushes. Sure, this happens sometimes, but it seems to forget that the policymaking machine is full of researchers, policy analysts, data crunchers, etc. They push knowledge as much as (if not more than) researchers in academia, civil society or the private sector can.
  • K* also assumes that there is a separation between producers, intermediaries, and users. As mentioned in the point above, this is not always the case. In fact, it is rarely the case. Professionalising K* therefore seems rather odd. It would be like professionalising research instead of professionalising economics, law, physics, geography, etc.
  • Does Policy Pull refer to policymakers asking for evidence to make decisions, or to policymakers asking for evidence to support decisions already made? If the latter, then maybe it should be Policy Push instead.
  • The emphasis on policy pull (see the video) is telling of the people involved in this sector. They tend to see the world in a very organised way. They come from the civil service in developed countries, or from the health sector where the idea of evidence use is already well ingrained in its DNA, and, most importantly, they are not (or tend not to be) content experts or influential.
  • Throughout the literature on K* and the video one gets the very clear sense that there is an assumption that knowledge moves in the direction of policy. This linear view of the world contradicts what K* is supposed to be advocating for. But this is the problem with attempting to model complexity: inevitably we have to simplify it.
  • I do not quite get the difference between translation, adaptation, transfer, exchange, brokering and mobilisation. The K* community may not be too keen on definitions, but these words mean different things and they need to be explained. For example, a broker is a person who functions as an intermediary between two or more parties in negotiating agreements, bargains, or the like; while a translator is a person who translates, and to translate is to turn from one language to another, to change form or condition, to explain in terms that can be understood, to move from one place to another, etc.
  • Somehow media communications (the media being a key source of information for policymakers) are left out of the K* box, and far away from policy. But the media does all the things that the * includes (it translates, it adapts knowledge, it transfers it from one space to another, it exchanges it in private and in public, it brokers access to information on behalf of the public, it mobilises knowledge, etc.). If there are any K* professionals, they are journalists.
  • Big-C and little-c communications: Again, another distinction that sounds nice but is difficult to support. When an organisation communicates a brand or communicates to the general public it does more than just push a logo. Advertising is not about the logo but about what the logo represents. Successful corporate communications are able to pass on layers upon layers of content and context with a logo, an image, a sound, etc. Influence, particularly the influence of research, is closely linked to the perceived credibility of the organisations or individuals trying to do the influencing. Corporate communications (Big-C) are therefore critical and impossible to separate from little-c communications.

There is another worry I have. This focus on K* distracts us from the fact that this is already happening all around us. There are several institutions (and specific organisations) that fulfil all these * functions on a daily basis and by design. What we should be doing is focusing on them and strengthening their capacities rather than trying to relabel them or individuals within them.

Think tanks (if they do their job properly) act between academia and policy (and between others too). The media acts between the public and the public interest. The civil service acts between politicians and the public (including NGOs, researchers, etc.). Political parties aggregate evidence, values, interests, and other forces; they then act between politics, policy, and other actors. Etc. These institutions, whether we like them or not, are impossible to replace, unless we do away with our political systems (and in that case new institutions would be necessary).

My opinion is that if donors want to make a real difference they ought to fund these institutions and not attempt to create new ones. Fund the media (and journalism schools); political parties (and political science and public policy faculties while you are at it); civil service reform (and the necessary professional cadres: economists, sociologists, managers, etc.); and professional associations and chambers of commerce (the unsung heroes of intermediaries: this is where research, policy, and practice come together).

Above all, focus on people. When a competent medical doctor from Malawi meets a competent medical doctor from Canada and they talk about what each knows, there is no need for intermediaries. A competent engineer from Germany will have no problem sharing his or her knowledge with a competent engineer from Zambia. And a competent economist from the United States will have no problem reading a paper by a competent Vietnamese economist. The same is true within a country: a good economics professor will have no trouble talking to a good economics journalist, and he or she will find it easy to have a conversation with an economist in the treasury.

This is what professions do: they use a common language to ensure that their members can talk to each other regardless of where they are. When the right people talk to each other they need no toolkits and no K* practitioners.

Don’t fund websites that republish what others have worked hard to produce (this is probably illegal, unless of course they are not getting paid to do it); don’t waste money on short-term workshops that train people to use quick-fix tools or make them aware of new frameworks; don’t get too excited by new fads and all-encompassing ideas (when have they ever worked?).

I won’t be able to follow the K* conference but will have a look at what is published after it is done. I hope to learn more about:

  • What * is and is not (so far it seems like it could be everything: is anyone not an intermediary between at least two other people?)
  • Why this is so important that it merits a global conference
  • What roles political parties, the media (and particularly journalists), the civil service, the private sector, think tanks, academia, etc. play in all this

Any contributions are welcome.

Huw Davies: “When contextualised, research has the power to animate, inform, empower or infuriate”

Huw Davies talked to the LSE Impact of Social Sciences blog about how to treat academic research in order to give it the best foundations before it enters the policymaking process.

He said:

Research Does Not Speak For Itself: research needs to be actively translated in communication; it needs to be set in context, and it needs to be brought to life. By itself, ‘research’ is just inanimate data: in conversation and contextualised it has the power to animate, inform, empower or infuriate. It has the latent capacity to become knowledge, or even evidence.

Which is why I think that when donors ask for indicators of research outcomes they are in fact asking for indicators of the efforts to communicate (share, explain, disseminate, popularise, test, etc.) research processes and outputs.

Research Does Not Stand Alone: of course any research must be seen in the context of other research, gradually building up a picture of our social world. Individual studies have far less value than careful synthesis and review. But more than this, research needs to be interpreted in the context of local systems, cultures and resources; and explored with an understanding of political sensitivities, expediencies and implementation challenges.

Which means that policy research programmes should never start with the assumption that nothing can be communicated yet (until we have done the research). Their research is not isolated from other research being done alongside it or done before.

Research Has To Be Integrated: research ways of knowing have to be integrated with other forms of knowing: knowing that comes from a complex and sophisticated conceptual understanding of the world (including ideological preferences), and knowing that comes from deep experience, including tacit ways of knowing or feeling.

We tend to get carried away by the idea that there is one type of evidence: research-based evidence. But this is not true. Evidence can be arrived at by different means. Knowledge is even more complex. ‘Whose evidence?’ is as important a question as any.

Using Research is Often Not An Event: use of research is often better seen as a dynamic and iterative process, usually faltering, but occasionally dramatic; most often seen better in retrospect than in prospect. Research-based ideas can slowly seep into policy discourse in a slow and percolative way, gradually changing the sense of what is important or what is possible in policy debates.

Hence the ill-conceived idea of commissioning case studies of specific influencing events, or the stories of change that think tanks tend to use (under pressure from donors). If the process can be easily described within the confines of a brief story of change then it is likely to have been one of those exceptions to the rule, and more often than not the consequence of a consultancy.

It’s Not Just Learning – Unlearning Matters Too: letting go of previously cherished notions, conceptual models or so-called ‘facts about the world’ can be as important as the acquisition of new understandings. But this is far from simple: the new does not necessarily displace the old. Sometimes uncomfortable accommodations or amalgamations are made.

But to do this, think tanks need to create sufficient space to test and fail (or make mistakes, at least). The pressure to maximise impact (to ensure that all research is policy relevant, useful and usable) works against this. Think tanks all over the world offer safe spaces from which to launch innovative new ideas. But this requires a mindset that rewards ideas above all.

Knowledge is Often Co-Produced: rather than seeing research as the preserve of technical experts, new policy-relevant knowledge often comes from collaborative processes that break down the distinction between roles – where technical expertise around data meets other forms of knowing rooted in experience or a sense of the possible. Shared journeys can produce shared understandings.

In a project on trade and poverty in Latin America we commissioned a journalist to study the same issues we were studying but from a different point of view. We never properly integrated these efforts, but it would be a good idea to build coalitions that seek to study the same issues from different disciplines and perspectives. Why not build coalitions between academia, think tanks, NGOs, and the media? A soon-to-be-launched DFID Zambia project will be funding several think tanks to study similar issues and then debate their ideas in public: this has never been seen before!

Knowledge Creation is Deeply Social: the creation of knowledge from various ingredients (including, but by no means limited to, research) is therefore a deeply social and contextual process – happening through interaction and dialogue. It reflects a persuasive process triggered as much in the gut as in the brain.

Dialogue is underrepresented in this sector. We talk about engagement and ‘two-way communications’ a lot, but not really about research and influence as a dialogue process. The marketplace metaphor has taken over every aspect of the work, including the language we use: demand, supply, etc. We should be talking, as Daniel Ricci says, about a big conversation. Think tanks’ role there is to improve the terms of the dialogue. Jeffrey Puryear wrote that Chilean think tanks’ greatest contribution was the way they helped politicians learn how to talk to each other and work together.

Not Products But Process: from all of this it then makes more sense to think of ongoing processes of knowing than the creation and sharing of knowledge products, and so…

And so think tanks’ contribution is ongoing. A research output or a policy is a step along the way (a not always linear way). Various outputs may contribute to change over time, or prevent it, depending on how they come together. More importantly, though, we must not forget that all stakeholders involved in the research and policy process have their own history and agency. They are not static players.

It’s Not All About Decisions But More Often About Framings: because research often has the most profound impacts not when it directly underpins specific decisions (instrumentalist action) but instead when it causes shifts in the language, concepts, conceptual models or frameworks that are used to define the contours of the policy landscape. Research can be at its most powerful when it shakes prior certainties, questions core assumptions or even re-shapes cherished values.

It is not about evidence, it is about arguments!

So, when we focus on research as proving evidence for policy decisions we both overplay its short-term role as technical arbiter and undersell its longer-term transformative power.

I could not agree with him more.

And just so you don’t think Huw Davies is a member of the anti-comms brigade, here is LSE’s bio on him:

Huw Davies is Co-Head of School and Professor of Health Care Policy & Management at The School of Management, the University of St Andrews, and he was formerly Director of Knowledge Mobilisation for the UK NIHR ‘Service Delivery and Organisation’ national R&D Programme (2008-10). His research interests are in service delivery, encompassing: evidence-informed policy and practice; performance measurement and management; accountability, governance and trust. Huw has published widely in each of these areas, including the highly acclaimed Using Evidence: How Research Can Inform Public Services (Policy Press, 2007).

Revitalising Indonesia’s Knowledge Sector for Development Policy: the diagnostics

A few months ago I wrote about this interesting programme by AusAID in Indonesia. Here is a site with all the diagnostic studies they undertook. The programme is a good idea, but I’d like to focus your attention on the range of studies undertaken. It would be fantastic if all donors did this BEFORE funding any policy research programme (large or small) in a country. These studies should inform all future policy research work in Indonesia; they should act as a sort of baseline (with more specific ones required for policy research programmes in particular sectors). From the programme page:

The proposed ‘Revitalising Indonesia’s Knowledge Sector for Development Policy‘ program aims to enable Indonesian policy-makers to make contestable, evidence-informed decisions on how best to spend national budgetary resources in ways that help the poor. It will support the domestic supply of knowledge products to inform policy, as well as the ability of decision makers to use those products to inform their policy choices.

  • Design document
  • Diagnostics documents
  • More information
  • Draft program design document and diagnostics for Revitalising Indonesia’s Knowledge Sector for Development Policy program

Latin American think tanks: using knowledge to inform public policy

A few months ago I reported on an event I helped organise in Lima on think tanks. It was hosted by CIES and co-organised by GDNet, the Think Tank Initiative, CIPPEC, Grupo FARO and the EBPDN. Micaela Pesantes has shared the event’s report with me (and I share it with you: in English, this time).


Event: increasing social and environmental impact through knowledge transfer

Here is an interesting event organised by Knowledge London on 22 November 2011.

Feeling under pressure to demonstrate the impact and relevance of your research?

er …yes

Looking for interesting impact models?

er … yes

Knowledge exchange initiatives can help to deliver sustainable social and environmental benefits. These in turn can create savings in public services and enhance social cohesion, economic inclusion and the stewardship of our natural resources. Universities can play a unique role in information services, learning and new skills acquisition that can stimulate community development and prevent increased marginalisation and deprivation.

This session will explore through real life examples and some notable success stories how an investment in community and public engagement can pay dividends for both HEIs and beneficiaries such as hard-to-reach groups, charities and local authorities.

This session will help anyone involved in developing, writing or presenting impact case studies for their institutions.

Confirmed speakers:

Dr Sonia Vougioukalou, Research Associate, Medical School, Division of Health and Social Care, King’s College London

Sally Cray, Senior Consultant within the Department of Leadership and Management Development, Canterbury Christ Church University

Dr John Tweddle and Lucy Carter, The Natural History Museum

Ceri Davies, Development Manager, University of Brighton

Hilary Jackson, Public Engagement Coordinator, University College London

Prof. Annmarie Ruston, Canterbury Christ Church University

Karen Cleaver, Head of Department, Family Care & Mental Health, University of Greenwich
