
Posts tagged ‘values’

Advice to Think Tank Startup: do not do it alone

Hans Gutbrod outlines some ideas that should be considered when thinking of setting up a new think tank. He argues that planning and learning from others are critical for success. And do not forget that management will matter as much as the quality of your research.

Read more

Reflections on Bringing Think Tanks Together: a Community of Practice?

[Editor's note: This is the first post by Hans Gutbrod, Director of the Think Tank Initiative. I'd like to welcome him to onthinktanks.org and look forward to his ideas and reflections on think tanks. Over the next few days and weeks we'll also be sharing some of the videos from the sessions organised at the TTI Exchange mentioned in this blog post.]

Bringing together more than 100 think tank executive and research directors from more than 25 countries, the Think Tank Initiative (TTI) Exchange in Cape Town in June 2012 was probably one of the largest gatherings of think tanks from the global South held to date. Some of the substance of the event has already been covered, in various ways, by lively Twitter commentary. Short videos from the sessions, highlighting particular topics, will be made available in the coming days and weeks.

So what continues to stand out from this event in hindsight? Here are some personal reflections, not intended as a conclusive summary, but as points of discussion.

Stability Matters

For think tanks to make a difference, they have to attract exceptional staff. Thomas Carothers, from the Carnegie Endowment for International Peace, has made this point. It is obvious, but not trivial: think tank research needs to be both substantive and hedged, smart and safe, and, on some occasions, cautious and bold. As think tanks trade on their authority, they are particularly vulnerable when making claims. Raymond Struyk highlights this risk on the first page of his classic book on managing think tanks, describing this unsettling scenario:

A report on a high-visibility and urgent problem is sent to the Ministry of Finance with significant flaws in the statistical analysis. These flaws are discovered by an analyst from another organization after the report has been widely distributed. The think tank loses significant credibility with the government and other clients.

Not many people are good at responding thoughtfully and quickly, while getting everything right. Peer review processes can prevent errors, but given the pressure, any first draft has to be solid, and review steps need to be executed with great care.

That means not only do think tank leaders need to be remarkable (and there were a lot of exceptional people at the TTI Exchange), but they also need a remarkable team behind them. The success of a think tank depends on building a deeper team, on recruiting exceptional staff, and getting exceptional people to work together. Core funding is thus critical, as it allows think tanks to attract and retain exceptional staff. Some innovative funding arrangements notwithstanding, core funding remains a key feature for the type of local research that can improve lives, because it makes high-performing think tank teams possible.

Connecting to Conversations

Even with great teams, unnoticed loneliness at the top can be a severe challenge: since the leaders of think tanks rarely engage peer-to-peer on management questions, their loneliness is often profound. With a few exceptions, leaders mostly figure things out by themselves, maybe with a board or their personal circle. This loneliness sets them apart from people in other professions such as doctors, accountants, lawyers, or even managers in the private or public sector, who regularly discuss and enhance professional practice.

Yet the loneliness can go unnoticed, since the leader of any think tank will engage extensively with fellow researchers and policymakers, clients, maybe diplomats, scholars from abroad, journalists, interns and students. After six years at CRRC, my email address book had accumulated more than 4400 contacts. Yet except for two extended conversations that Goran Buldioski from the Think Tank Fund made possible, not once – not a single time – in those six years did I sit down with someone else who was running a research organization to exchange experiences on how we do things. And to me, the remarkable thing was that I didn't even realize, until the TTI Exchange, that I had not discussed how to run a research organization with anyone other than my colleagues. Bringing think tank leaders together helps to overcome this loneliness, and gives them an opportunity to begin conversations and connect.

Communities of Practice

Following from this, I think it’s fair to say that there is no fully established community of practice – defined by Wenger as a “group of people who share a concern, a set of problems, or a passion about a topic, and who deepen their knowledge and expertise in this area by interacting on an ongoing basis” – for running think tanks, especially not in the South. There are some personal links, and there are well-established practices of research in relevant academic disciplines, but these are different from the practices of generating policy research with the aim of improving people’s lives – perhaps as different as physics is from engineering.

If policy research in the South is seeking to have a greater impact, it's probably worth cultivating habits of sharing more, deepening knowledge and expertise, and interacting on an ongoing basis. These are the kind of habits that are well established in other professions. Patent lawyers, insurance actuaries, heart surgeons and project managers alike – they all set up trade publications and ways of exchanging information, and they get together regularly to discuss how to do things. The formalization is not all happy: there is tomfoolery in most jamborees, but if you only pick up a handful of better practices, tell a few peers what has worked for you, and identify the colleagues you will call for advice when things get sticky in the office, you serve yourself and the profession, as well as the people your profession serves.

In the case of successful policy research in the South, strengthening a nascent community of practice could probably go a long way toward making better policies stick. On our end, at TTI, we are certainly thinking about how to cultivate such a community of practice. We have quite a few ideas on how to do this, but welcome ideas and input from others.

This is also why I’m contributing these reflections here in this blog. Onthinktanks.org serves as an excellent aggregator for many of the issues that are worth debating in the community. To broaden the debate, we will be making videos from the TTI exchange available in the coming weeks, and hope they find a good audience. Check our website or follow us on Twitter.

In addition to what has already been published on this blog, do you have any other thoughts on what we can do to cultivate a community of practice for policy research and think tank management?

Huw Davies: “When contextualised, research has the power to animate, inform, empower or infuriate”

Huw Davies talked to the LSE Impact of Social Sciences blog on how to treat academic research in order to give it the best foundations before it enters the policymaking process.

He said:

Research Does Not Speak For Itself: research needs to be actively translated in communication; it needs to be set in context, and it needs to be brought to life. By itself, ‘research’ is just inanimate data: in conversation and contextualised it has the power to animate, inform, empower or infuriate. It has the latent capacity to become knowledge, or even evidence.

Which is why I think that when donors ask for indicators of research outcomes they are in fact asking for indicators of the efforts to communicate (share, explain, disseminate, popularise, test, etc) research processes and outputs.

Research Does Not Stand Alone: of course any research must be seen in the context of other research, gradually building up a picture of our social world. Individual studies have far less value than careful synthesis and review. But more than this, research needs to be interpreted in the context of local systems, cultures and resources; and explored with an understanding of political sensitivities, expediencies and implementation challenges.

Which means that policy research programmes should never start with the assumption that nothing can be communicated yet (until we have done the research). Their research is not isolated from other research being done alongside it or before it.

Research Has To Be Integrated: research ways of knowing have to be integrated with other forms of knowing: knowing that comes from a complex and sophisticated conceptual understanding of the world (including ideological preferences), and knowing that comes from deep experience, including tacit ways of knowing or feeling.

We tend to be carried away by the idea that there is one type of evidence: research-based evidence. But this is not true. Evidence can be arrived at by different means. Knowledge is even more complex. 'Whose evidence?' is as important a question as any.

Using Research is Often Not An Event: use of research is often better seen as a dynamic and iterative process, usually faltering, but occasionally dramatic; most often seen better in retrospect than in prospect. Research-based ideas can slowly seep into policy discourse in a slow and percolative way, gradually changing the sense of what is important or what is possible in policy debates.

Hence the ill-conceived idea of commissioning case studies of specific influencing events, or the stories of change that think tanks tend to use (under pressure from donors). If the process can be easily described within the confines of a brief story of change then it is likely to have been one of those exceptions to the rule -and more often than not the consequence of a consultancy.

It’s Not Just Learning – Unlearning Matters Too: letting go of previously cherished notions, conceptual models or so-called ‘facts about the world’ can be as important as the acquisition of new understandings. But this is far from simple: the new does not necessarily displace the old. Sometimes uncomfortable accommodations or amalgamations are made.

But to do this think tanks need to create sufficient space to test and fail (or make mistakes, at least). The pressure to maximise impact (to ensure that all research is policy relevant and useful -and usable) works against this. Think tanks all over the world offer safe spaces from which to launch new innovative ideas. But this requires a mindset that rewards ideas above all.

Knowledge is Often Co-Produced: rather than seeing research as the preserve of technical experts, new policy-relevant knowledge often comes from collaborative processes that break down the distinction between roles – where technical expertise around data meets other forms of knowing rooted in experience or a sense of the possible. Shared journeys can produce shared understandings.

In a project on trade and poverty in Latin America we commissioned a journalist to study the same issues we were studying but from a different point of view. We never really properly integrated these, but it would be a good idea to build coalitions that seek to study the same issues from different disciplines and perspectives. Why not build coalitions between academia, think tanks, NGOs, and the media? A soon-to-be-launched DFID Zambia project will be funding several think tanks to study similar issues and then debate their ideas in public – this has never been seen before!

Knowledge Creation is Deeply Social: the creation of knowledge from various ingredients (including, but by no means limited to, research) is therefore a deeply social and contextual process – happening through interaction and dialogue. It reflects a persuasive process triggered as much in the gut as in the brain.

Dialogue is underrepresented in this sector. We talk about engagement and 'two-way communications' a lot but not really about research and influence as a dialogue process. The marketplace metaphor has taken over every aspect of the work, including the language we use: demand, supply, etc. We should be talking, as Daniel Ricci says, about a big conversation. Think tanks' role there is to improve the terms of the dialogue. Jeffrey Puryear wrote that Chilean think tanks' greatest contribution was the way they helped politicians to learn how to talk to each other and work together.

Not Products But Process: from all of this it then makes more sense to think of ongoing processes of knowing than the creation and sharing of knowledge products, and so…

And so think tanks' contribution is ongoing. A research output or a policy is a step along the way (a not always linear way). Various outputs may contribute to change over time, or prevent it, depending on how they come together. More importantly, though, we must not forget that all stakeholders involved in the research and policy process have their own history and agency. They are not static players.

It’s Not All About Decisions But More Often About Framings: because research often has the most profound impacts not when it directly underpins specific decisions (instrumentalist action) but instead when it causes shifts in the language, concepts, conceptual models or frameworks that are used to define the contours of the policy landscape. Research can be at its most powerful when it shakes prior certainties, questions core assumptions or even re-shapes cherished values.

It is not about evidence, it is about arguments!

So, when we focus on research as providing evidence for policy decisions we both overplay its short-term role as technical arbiter and undersell its longer-term transformative power.

I could not agree with him more.

And just so you don't think Huw Davies is a member of the anti-comms brigade, here is LSE's bio on him:

Huw Davies is Co-Head of School and Professor of Health Care Policy & Management at The School of Management, the University of St Andrews, and he was formerly Director of Knowledge Mobilisation for the UK NIHR ‘Service Delivery and Organisation’ national R&D Programme (2008-10). His research interests are in service delivery, encompassing: evidence-informed policy and practice; performance measurement and management; accountability, governance and trust. Huw has published widely in each of these areas, including the highly acclaimed Using Evidence: How Research Can Inform Public Services (Policy Press, 2007).

Is religion a ‘no no’ for think tanks?

The Jesuit Centre for Theological Reflection (JCTR) is a very interesting think tank. It does not just talk about evidence but also about faith. Its mission statement is:

To foster from a faith-inspired perspective a critical understanding of current issues. Guided by the Church’s Social Teaching that emphasises dignity in community, our mission is to generate activities for the promotion of the fullness of human life through research, education, advocacy and consultation. Cooperating widely with other groups, our Jesuit sponsorship directs us to a special concern for the poor and assures an international linkage to our efforts. We aim to promote an inculturated (sic) faith, gender equality and empowerment of local communities in the work of justice and peace and the integrity of creation.

Maybe they have a point. Research from North America shows that atheists are distrusted as much as rapists:

The study, conducted among 350 American adults and 420 Canadian college students, asked participants to decide whether a fictional driver who damaged a parked car and left the scene, and then found a wallet and took the money, was more likely to be a teacher, an atheist teacher, or a rapist teacher.

The participants, who were from religious and nonreligious backgrounds, most often chose the atheist teacher.

This moral distrust of non-believers is relevant in very religious societies – and much of the developing world qualifies as such. By bringing in religion (with its explicit references to values), organisations like JCTR are able to lend a certain degree of credibility to the, let's face it, sometimes godless work of the researcher – who will not believe anything until it can be measured.

Not evidence but arguments: translating evidence into policy in Ecuador

Orazio Bellettini and Andrea Ordonez, from Grupo FARO, have published a paper on translating evidence into policy in Ecuador, drawing from two policy debates: Fighting Political Clientelism in Social Programs, and the Yasuni ITT Initiative Proposal.

The Yasuni initiative provides an excellent illustration of the relationship between science and policy influence that is often overlooked in the evidence-based policy discourse. The assumption is that there is a direct relationship between evidence and policy. But this overlooks the act of translating evidence into policy options and policy recommendations.

This translation is not straightforward. Evidence does not include 'what to do'. Evidence, or what we call evidence, is about 'what is happening', 'what is working', 'what is not working', 'what is the probability that something will work', etc. In the case of the Yasuni, scientists offered evidence along these lines:

“Our first conclusion is that Yasuní National Park protects a region of extraordinary value in terms of its biodiversity, cultural heritage, and largely intact wilderness. This region — the Napo Moist Forests of the Western Amazon — has levels of diversity of many taxonomic groups that are locally and globally outstanding. For example, with an estimated 2,274 tree and shrub species, Yasuní protects a large stretch of the world’s most diverse tree community. In fact, there are almost as many tree and shrub species in just one hectare of Yasuní’s forests as in the entire United States and Canada combined. Yasuní has 567 bird species recorded — 44% of the total found in the Amazon Basin — making it among the world’s most diverse avian sites. Harboring approximately 80 bat species, Yasuní appears to be in the world’s top five sites for bat diversity. With 105 amphibian and 83 reptile species documented, Yasuní National Park appears to have the highest herpetofauna diversity in all of South America. Yasuní also has 64 species of social bees, the highest diversity for that group for any single site on the globe. Overall, Yasuní has more than 100,000 species of insects per hectare, and 6 trillion individuals per hectare. That is the highest known biodiversity in the world.”(Scientists Concerned for the Yasuni, 2004)

This is evidence of the rich biodiversity of the Yasuni. But there is no immediate policy action that can be inferred from this. A policymaker could decide to protect it or to build a road right through it. The decision is not evidence based (although evidence of the rich biodiversity of the area can certainly influence or inform it) but value based. What do the policymakers value more, and why?

Are they willing to put a price on nature? If they are, then they will be quite happy to get rid of the forest if they can identify some clear monetary gains. What if they are not willing to monetize nature?

This is a point I tried to make twice last month: first at the Royal Society to a group of representatives of national scientific societies in Africa and then to the social sciences sector cadre of the Inter-American Development Bank. The point is not that science has no place in policymaking but that we must accept that values play a role too; and a very significant one.

How can think tanks deal with this? Four initial ideas.

Work with others: As the Ecuador case suggests, think tanks must work with other organisations which may be more comfortable with the language of values: political parties, religious groups, NGOs, etc.

Include more views and perspectives: Think tanks should stop talking about multidisciplinary work and make it a reality. One of my favourite things about the RAPID programme was that we were a fairly diverse bunch, each with his or her own views and values (at some point the research team was made up of an economist (Peruvian), a veterinarian (British), an engineer (British Asian), an astrophysicist (British), a mathematician and philosopher (British), and a political scientist (Malawian)). Not just that, but we all had different class, ethnic, political, and religious backgrounds. (I am not sure we ever took advantage of it, though.) A friend recently mentioned that Danish think tanks are apparently dominated by men 10 to 1. I've never been one for quotas or affirmative action but it does occur to me that if anything can be done to encourage a more balanced set of skills, backgrounds and views, then maybe it should.

Appeal to values: Think tanks must make an effort to avoid over-stretching their use of science and instead explicitly appeal to values and other sources of power in building their policy arguments. What is wrong with arguing for justice alongside effectiveness? Or standing firm on certain values? The Jesuit Centre for Theological Reflection (JCTR) combines its research with the teachings of the Church in its messages. (I am not religious but I can see their point.) The Occupy the City of London movement has resorted to asking: what would Jesus do (about the levels of inequality that the financial sector is fuelling)? The understanding of justice that the Catholic Church was built on is not unique to it – it is a universal value.

Build arguments: Communication of findings is not enough when we are trying to change policy. Bigger ideas and arguments are necessary.

Politics not metrics can tell us what works

An interesting article by William Schambra, director of the Bradley Center for Philanthropy and Civic Renewal at the Hudson Institute, for Tactical Philanthropy challenges the role of metrics in assessing what works and the effectiveness of interventions. It is particularly relevant for the current debate (is there any?) on the value of impact evaluations and randomised controlled trials in policymaking.

The lessons of Bradley’s involvement in welfare reform were the reverse of what might have been expected. Metrics, the heart of social scientific calibration, have long been understood to be the key to successful policy reform. They are supposed to lift policy discussion out of the bitterly contested realm of political values and local, subjective viewpoints, and put it on the serene plateau of indisputable, objective, universal facts.

No such thing had happened in Wisconsin. Metrics were subsumed into the local political debate rather than the other way around. And a vigorous, face-to-face, fiercely partisan contest about the meaning of “what works” held Bradley accountable to its own community for concrete results, in a way that abstract measurement never could.

Unhappily, many foundations today believe that “effectiveness” requires detachment from immediate, hands-on engagement in the civic life of their own local communities, and tie their grantmaking instead to ever more elaborate, arcane, abstract theories and models. They’ll end up with numbers aplenty. But they still won’t be able to answer the question, “what works?”

Can think tanks make a difference? Only if they are capable of logical leaps of the mind

CIGI celebrated its 10th anniversary with an event – not intended to showcase its successes or talk about the business of global governance – but to reflect on the role it plays in Canada and, more generally, in the world. They invited a bunch of people to their offices (more on these – they are stunning) in Waterloo, Canada, for a day of discussion and fun (more on this, too).

First of all, the CIGI campus is enviable. CIGI's offices have been built on top of (and around) an old distillery (the empty barrels welcome you as you enter the building – it is just stunning). The design, according to the architects, reinterprets the traditional colleges of Oxford and Cambridge, with their centuries-old landscaped courtyards, in a contemporary glass, brick and stone building that recalls the industrial heritage buildings that formerly occupied the site.

At the heart of the campus is a courtyard: CIGI to one side, the Balsillie School of International Affairs (BSIA) to another, and a new teaching wing to a third. Across the road is the Perimeter Institute's new wing, the Stephen Hawking Centre. Balsillie, by the way, is Jim Balsillie, one half of Research in Motion (of BlackBerry fame). He is the engine behind CIGI. The Perimeter Institute for Theoretical Physics is funded by his RIM partner Mike Lazaridis.

What a brilliant combination, if you ask me. Industry and academia working together.

The event started with a brilliant keynote speech by Roger Martin, Dean of the Rotman School of Management. Roger talked about what he called the paradox of think tanks: the reason why think tanks are ever more important is that the world is changing into a place where think tanks are becoming less relevant.

The world, he argued, is heading in a 'scientific direction' that prioritises (or gives sole right to) inductive or deductive logic in policymaking. However, he argued, new ideas never come from this type of logic (Aristotle warned against applying it to a world where things can often be other than they are – here, he said, we need rhetoric). Instead, new ideas come from abductive logic or "a logical leap of the mind".

The world we live in today, however, seems to want proof for everything. New ideas are not challenged on the ideas themselves but on their evidence. I remember the fanfare around Dambisa Moyo's book a few years ago: all the criticisms (or most) were about her evidence (or lack of it). But few engaged with her proposition: that African countries are addicted to aid and that something must be done about it.

Think tanks, in his view, must make sure that they are places that not only allow and accept abductive logic but in fact make it central to their work. Abductive logic has a huge effect on how think tanks work. It is not just about coming up with new ideas but about what one does with them. And what one does with them is embark on dialogue. One cannot prove or disprove these new ideas by looking back at data – that assumes that what happened in the past will continue to happen in the future (so what would be new about that?) – but only by bringing these ideas to life through dialogue and action.

This was a great start to the day and a very interesting conversation ensued. Do think tanks (and researchers) often hide behind 'evidence' and 'science' instead of saying what they actually believe in? (Yes.) Is science being overstretched? (The scientific method can, sometimes, tell us what has happened and why, but cannot – and is not supposed to – tell us what to do.) Can think tanks escape ideology? (No, but they should be able to shift as the world changes.) What should we do? (We should strive to make things rightER.)

Then there was a panel on policy innovation in the age of social media. Much of what was discussed resonates with the work that Nick Scott has been doing at ODI. My view is that we are still limiting the discussion by our own limited understanding of ‘social media’. Often the presentations and questions went down the path of commenting on the relevance and usefulness of Twitter, Facebook, etc. The digital world is much broader and these are JUST tools. What matters is how we navigate through it -and not whether we can try to manage it (we cannot).

Chad Gaffield had some interesting points when he identified three conceptual changes in this new world: public discussion in real time, not defined by geopolitical borders, and creating new divides.

Alexandra Samuel, Director of the Social + Interactive Media Centre at Emily Carr University, provided a gem of an idea: social media has been most successful not when it set out to change a policy but when it developed to circumvent policy. Of course! Creative Commons is not an attempt to influence copyright but rather to do without it. Political, economic, art and other blogging is not a way into a mainstream media column; it is saying, 'I do not need you'. Policy change is a thing of the past.

Alexandra wrote a post right after: 6 questions about the impact of social media on think tanks.

The next session was, unfortunately, the session I was involved in, and I did not take notes. The question to the panel was whether governments care more about politics than policies. I think we all agreed that they do – although I argued that this is not necessarily a bad thing (and I for one do not want my government thinking that it can govern over us as if we were automatons). The thing is that evidence cannot be the only thing that matters in policymaking. (Roger Martin said it better.)

To bring politics and policy (ideology and science) closer together we need to stop separating them – as if it were possible to leave our values at the door – and recognise that no single player can do this. We need to invest in ALL institutions of democracy (political parties, the media, universities, the private sector, the state, etc.) and in the people who lead and staff them. Such is the complexity of finding the right balance between ideology and evidence that this cannot be planned in advance. We must learn to rely on smart and committed people, and on lots and lots of opportunities for learning.

The last panel of the day (Policy influence – who has it and how to get it) was won by Tiffany Jenkins of the Institute of Ideas (of Battle of Ideas fame), who said that their objective was NOT influence. A proper post on her presentation will come soon, I promise, but in the meantime: she argued that we live in depoliticised times where big ideas are off the agenda, where policymakers just ask for 'technocratic' solutions to very specific problems (I think the many think tanks having to answer to ToRs that start with "Demonstrate that" will be able to relate), and where influence, as a consequence, is easy. Of course it is: if the policymakers ask for the answer and you give it, how difficult can it be?

In the absence of political authority, she said, science has been brought in and overstretched (same argument as Roger Martin’s). It is now used as a weapon and a substitute for political debate.

Patricio Meller, from CIEPLAN, talked about think tanks in Latin America. He presented an argument to use in leveraging more domestic funds for think tanks in the region. The experience of think tanks in Chile is worth looking into. Jeffrey Puryear's book, Thinking Politics, is a must-read.

It is worth mentioning that other leading minds in the world of think tanks were present: Lawrence McDonald, Vice President of Communications and Policy Outreach at the Center for Global Development, being one. Lawrence is a rare thinktanker: he does not just 'do it', he also thinks about what he does and why: Learning While Doing: A 12-Step Program for Policy Change.

And so the day went. We then had cocktails and dinner on the CIGI campus. Steve Patterson provided some really good comedy during the dinner.

There was an interesting Twitter discussion going on during the event, in case you'd like to follow it.

Transparency should replace (the striving for) impartiality in policy research

By Goran Buldioski, Program Director of the Open Society Institute's Think Tank Fund. His post addresses the persistent issue of how to ensure or review think tanks' independence.

We have all heard so many times that policy research is not value free. Some critics go one step further by claiming that impartial analysis is more a far-fetched ideal than an attainable goal in the everyday work of a researcher. In the other camp, more 'scientifically' oriented researchers claim that it is only about the scrutiny and the quality of the process: once certain standards are complied with, the research will certainly result in an objective account of the problem and the alternative solutions. Given that think tanks (and NGOs) have taken on roles that historically have been part of the state, it will be necessary for their code of conduct to be aligned with the one we expect from the state. The more think tankers boast of their own impact, the greater the need for their accountability.

The accountability of policy research is thus an aspect that has raised many debates hitherto. Not surprisingly, many of these debates have focused on the way that the research has been carried out. The aspect of who has carried out the research (who, not only with regard to competencies, but also in terms of values and personal/organizational history) has not been neglected, but it has somehow been treated artificially (including in one of my own texts, cited below).

In the spring of 2009 I published an article in the International Journal of Not-for-Profit Law in which I advocated for think tanks in Central and Eastern Europe to devise and adopt codes of conduct:

Think tanks do not act alone in the policy environment. Neither are they obliged to be neutral or free of ideology. Many in the region are staunch advocates of certain doctrines and concepts about the development of their own societies. The only position a think tank should avoid is becoming the advocate of a certain client, because that loss of independence undermines the impact of a think tank’s research. It is essential for think tanks to be explicit and transparent about the ethical values underlying their research work and advocacy. At present, think tanks enjoy a reputation as neutral transmitters of scientific ideas and policy analysis. This independence is their key feature well positioning think tanks to promote good communication between state and society. Likewise, the media is also keen on using think tank experts who they expect are serving the public interest.

The lack of a “framework of values” and rules for conduct for think tanks—among the most resolute proponents of government transparency and accountability in CEE—could soon have negative consequences. In spheres of policy where governments are hostile to such organizations, think tanks have to guard against attacks on independent policy research. Defining a proper code of ethics and code of conduct is a way to do that. Think tanks in CEE can only benefit from proposals in this article by being resolute in formulating these essential and overdue codes.

In that text, my framework of analysis included three different pillars: the ethics of policy analysts, the codes of ethics for public service in the transitional democracies of CEE, and the NGO codes of ethics in CEE. If one looks at the full text, it is clear that I covered mostly the objectivity (impartiality) of policy research, complemented by some organizational safeguards. No surprise, then, that the text is full of values that we should all strive for and calls for more to be developed within think tanks.

This time around, while I stand behind my writing and would still argue for introducing such codes as part of the institutional framework of each and every think tank, I would like to call attention to the second aspect – transparency (which could, but does not necessarily need to, deal with values). The Economist's Special Report on the News, published on July 7th, although focusing on the media and not on think tanks, helped me consolidate my thoughts on this issue. In this report, Nick Newman, former future media controller for journalism at the BBC, claims that transparency is the new objectivity in journalism. This catchy line resonated directly with my recent reflections, inspired by three real-life situations that involved think tanks (in CEE, but also globally).

Story 1: Over a period of time, a think tank shifts its ideological stance from being a proponent of liberal (social and economic) ideas to being a zealot for a patriotic-cum-constructive-nationalist agenda.

How transparency kicks in here: I see a need for the think tank in question to put a timeline of its products and a short history/story of its development online. It should mark the change, even if it does not offer a full-fledged rationale behind it. Since analysis is not free from ideology, it is best to let the readers utilize the analysis and recommendations and decide for themselves if the think tank’s ideological change matters to them at all.

Story 2: A few years ago, a gifted and up-and-coming scholar received a slew of scholarships from a donor to attain a number of educational degrees. In the meantime that person became the director of a prominent think tank. Both the individual and, partially, the think tank in question are harsh critics of the donor – the former patron – in their current political commentaries.

How this relates to transparency: Not everyone knows that the director has received scholarships in the past. Without entering into any need for justification, the think tank director should simply put his/her CV online and make this transparent. Such a move may even lend the criticism greater value (since the person does not shy away from criticizing the former patron). More importantly, it would allow the stakeholders of the think tank and the public to have a broader picture of the history and context. Nobody needs to make value judgments, only be transparent. (I treat this as if it were a case of conflict of interest.)

Story 3: Many think tanks in Central and Eastern Europe operate through two parallel legal entities: a not-for-profit organization and a for-profit consultancy. I see nothing wrong with this arrangement, especially in light of complicated and divergent donor practices that favour one or the other legal form.

[Note: Often the crucial difference is that the consulting arm will work for a particular client producing (at least to some extent) private analytical products (not available to the public, or only available through the client which uses them for its own advocacy, lobbying or other purposes).]

The public (not-for-profit) think tank produces analysis that is publicly available (public good) usually paid for by a donor or from membership fees and other sources of income.

Why transparency is crucial in this case: There is a web of intertwined aspects here. First, the public has to be aware of the duality of the brand, and of who the clients and donors funding the organization are. Second, the donors need to know that there is no double dipping (often the two entities are staffed by the same people sharing the overall work and costs). Third, the clients have the right to ensure that what they pay for as a 'private good' has not been turned 'public' at the other end of the organization. Finally, if the think tank engages in political consulting, there should be a clear bottom line about who could appear as a client and who could not (since some clients would simply jeopardize the entire concept of analysis for the public good). In my understanding, this bottom line is context dependent and changes from one place to another depending on different factors (the level of political culture, the maturity of the consulting market, and others).

In conclusion, think tanks should do their best in ensuring that the data and facts they use come from trusted sources and that their analysis is as objective as possible. However, they should not forget to be transparent about who they are and where they come from. Even if at first glance this information might seem 'damaging', it is always better for think tanks to disclose it (as it probably is for everyone else in the policy/political arena). After all, it is better for think tanks to make the facts about themselves public than to have someone else, usually with ill intentions, spread rumours and gossip about the same matters.

An underappreciated benefit of experiments: convincing politicians when their pet projects don't work

David McKenzie's post on Development Impact, An underappreciated benefit of experiments: convincing politicians when their pet projects don't work, is worth reading.

It argues that impact evaluations can help to stop projects that do not work. However, this raises a question that is often not answered: can we test everything?

Are there any ‘experiments’ that we should simply avoid? Think tanks can make use of new methods and approaches to strengthen the evidence base in their arguments but they should be careful not to remove all signs of values and principles from them.

In any case, the reason this post caught my attention was that impact evaluations (if we can do them on the cheap) could encourage policymakers and politicians to experiment without having to stick by their ideas (even if these end up being outrageous). If they were able to explain the concept of pilots to the public then they may be able to promote an acceptable culture of innovation and reduce the fear of losing face that many politicians and bureaucrats have.

When evidence will not make a difference: motivated reasoning

Chris Mooney's article (The Science of Why We Don't Believe Science) on motivated reasoning must strike a chord with the onthinktanks audience.

Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Mooney presents an interesting idea: that when we say we are reasoning, in reality we are rationalising a decision already made.

Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

His analysis leads to the conclusion that values condition how one looks at and uses research – and facts. And that:

…paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.
