Looking at Brookings to argue that we can learn from think tank budgets, and that think tanks should be transparent about their funding.
Should we worry about US think tanks opening offices in developing countries or emerging economies? While the model could present unfair competition to smaller domestic think tanks it can also have positive effects by encouraging new domestic philanthropy and developing research quality.
Nick Scott’s and James Georgalaki’s comments to my post on how to organise and present research are worth sharing beyond the comments section of my post so let me copy-paste a few of their arguments here:
Nick (ODI’s online communications manager):
“Websites for think tanks [are] the place where all other communications activities come together. Most know what they are looking for when they arrive on a site, and are there to find it. That is why you need to be able to organise information by a number of competing taxonomies to give them the greatest chance of finding it; all the while trying to make those taxonomies user-focused and to minimise confusion between them for the user.
“The trick is to achieve a balance of all the ‘types’ of site you have [to] offer all of them to those who want to find something particular, and find ways to highlight flagship reports, news, information about the organisation and any other taxonomies to users too.
“you need to be able to offer all those things all the way through a site, because the vast majority of your users won’t arrive on a home page, they’ll arrive on a page two or three levels down… [because] one of the most effective ways of reaching people … working out how … you’re going to get them to see the information in the course of their travels around the internet in the first place…
“What is your search engine optimisation strategy to get your information top on Google? How are you ensuring that your research findings are linked to from all the top online sources for each specific sector you work on? How do you get an email to a key player, and more importantly get them to read it?
“It is quite a challenge and I’m not convinced that any of the organisations [in the blog] have made much progress in making the online space work for them as a proactive route for influence, rather than a reactive one that allows their information to be found and used when needed.
“In response to your question on how to present clear stories online, there are numerous ways to do it, but without some manual synthesis and a clear and focused subject it is difficult. Blogs can be great at this, as can events, podcasts, presentations. I don’t think it would be easy to achieve a clear story by just listing a set of resources, even though that is much easier to do. The most relevant attempt at this for ODI is our ODI on… pages, which we create at times of international events or the like, and where the summary should provide a synthesis of some of the key areas highlighted within the list of documents.”
James (IDS’s communications manager):
“I think you get an even starker demonstration of [the difficulty of presenting research] if you look at organisations’ Annual Reports. Here you will see some present a detailed description of themselves and organise the publication around the structure of their organisation, whilst others use it to report on key achievements and present their brand or vision. Certainly here at IDS we did the former but are attempting this year to move to the latter.
“The ‘all that we do’ approach also recognises that our websites – or at least the home pages – have a broader set of audiences. At IDS our website is a marketing tool which promotes our courses and other services, such as Knowledge Services, as well as providing a platform for discourse on our research.
“Our home page is more concerned with reducing bounce rates and supporting our SEO strategy than anything else.
“That is not to say that we do not also struggle with the externalising internal process issues on our website. There is more debate here for instance about the search by subject research categories than anything else.”
These are all very important points that illustrate the complexity of their job. Some things come to mind (which are not just relevant for websites -as James suggests above):
- ODI, IDS and CGD have much broader publics -they are after all dealing with global issues and are attempting to reach publics around the world. ODI, IDS and other think tanks in developing countries have a contracting business model (they ‘sell’ a range of goods and services) and so must attempt to present them.
- Think tanks targeting a particular public -say the British, Ecuadorian, Indian, etc.- and more so those with a specific sector focus do not need to market themselves so widely and their front pages can therefore focus their efforts on a particular report, event or message.
- Something similar could be said about funding types: core funding may reduce the need to market one’s services; project funding makes it so much more important.
I’ve been trying to look for a page to illustrate this (I am sure I have seen one but cannot find it now): another way of presenting information would be to outline our publics more explicitly. That is, have versions of the site or report for policymakers, researchers, NGOs, activists, the general public. Brookings allows something like this by encouraging the user to create a portfolio.
What do you think?
This is a very interesting paper by Cheng Li on the trends of think tanks in China -with particular emphasis on the roles being played by influential returnees such as Justin Lin.
The paper also highlights the new roles played by policymakers, entrepreneurs and intellectuals in these new organisations (such as the China Center for International Economic Exchanges, the Chinese Economists 50 Forum, and the China Center for Economic Research at Peking University):
The most notable is that three distinct groups of elites—current or retired government officials, business leaders, and public intellectuals—have become increasingly active in promoting their personal influence, institutional interests, and policy initiatives through these semi- governmental organizations. In present-day China, think tanks have become not only an important venue for retired government officials to pursue a new phase in their careers, but also a crucial institutional meeting ground where officials, entrepreneurs, and scholars can interact.
The paper addresses how the relationship between the state and think tanks is changing. Traditionally think tanks were given spaces (or acquired space) as a result of the patronage of powerful officials -or the interest of Premiers and Party Chairmen. Today, however, think tanks are fast becoming a place to reflect on a long public career or plan a future one -adopting the US style revolving door think tank model. The relation to power is therefore more plural, open and even critical (within acceptable boundaries).
But what is most interesting to me is the concerted effort of the Chinese government to promote the formation of think tanks in China -even creating super think tanks.
Warning: This is NOT a 1-pager. Jason Stahl spent a year studying the records of the Library of Congress so, if you have an hour (well, his talk is only 30 minutes long), and want to understand the sudden rise of conservative think tanks in the 1970s, then watch this. Really, watch it.
But if you have access to David Ricci’s The Transformation of American Politics: The New Washington and the Rise of Think Tanks, then read it. He traces the main social, economic and political trends that transformed American politics and their effect on the current ‘ideas marketplace’: the growth of expertise and professionalism, the dissonance of values and the collapse of traditional civil and political religion, the rise of marketing and its adoption by traditional and new think tanks and politicians, and the increasing disorder in political institutions -partly generated by the very advice marketed by think tanks:
Ricci argues that since the late 1960s Americans have lost sight of the familiar guidelines that used to help them assess issues and have become more hospitable to think tank research and advice. He examines the flood of policy-relevant information that has resulted from the growth of expertise and the advent of big government; the confusion over national goals that comes from the decline of the Protestant ethic and the empowerment of minorities; the growing influence of television and its focus on instant testimony from experts; political changes such as the decline of parties, the move to an “open” Congress, and the growth of an independent presidency; the pervasive power of modern marketing; and much more. According to Ricci, policy ideas generated by think-tank research and commentary are helpful in providing greater objectivity and political insight, not only because of their general reliability but also because in their ideological variety think tanks generate a substantial range of policy proposals, giving voice to a healthy factional pluralism and facilitating a constant testing of ideas. In today’s dissonant politics, Ricci concludes, think tanks contribute some order—and occasionally wisdom—in the ongoing battle in Washington over political ideas.
I’ll come back to Ricci soon. Now watch the video.
I’ll accept that James McGann’s effort to identify and rank all the think tanks in the world has some positive outcomes. First of all, it has people talking about think tanks -and some think tanks are even becoming aware that there is a debate out there about themselves. Second… no, that is it. [Also have a look at Goran Buldioski's blog on the same subject]
I am still of the opinion that going beyond the counting and study of individual think tanks (and their immediate systems) is useless and misleading. Here are five reasons why I do not support this ranking, and then a longer semi-rant at the document.
- Think tanks cannot be de-linked from their political, social and economic environment; since think tanks define themselves in relation to the other players in the system. Brookings cannot be described without references to US bipartisanship -when we say independent research in the US we mean independent of either party (as well as of other interests). But independent means something entirely different in China, India, Brazil, or Argentina. Global and regional rankings are therefore unhelpful when the focus of think tanks is local (not local as in of this town or neighbourhood but of their direct interactions).
- The list is too diverse to be relevant. The definition of ‘think tanks’ has improved since I last commented on it to include politics. But he has now included some organisations that cannot possibly be compared with the rest. Let’s put it this way: if I define a mobile phone as a device that allows me to make phone calls while on the move I could be tempted to include laptops (after all I can make Skype calls ‘on the move’) but I wouldn’t because it would be confusing and unhelpful. A mobile is one thing and a laptop is another. Each may do things the other can also do but that does not make them the same thing. Amnesty International, Human Rights Watch, Transparency International and the various foundations (funders rather than researchers) included… how useful is it to compare them with IPAR in Rwanda or GRADE in Peru?
- It is still based on perception rather than thoughtful analysis. Thoughtful analysis would have required the development of a database with answers to all the questions or criteria presented on page 56. These are good questions, but the nominators were not asked to provide answers to these, only to use them to think about their nominations. This means that it is all about presentation rather than content: still a popularity contest among people who clearly cannot know about every context and must therefore rely on what is accessible to them (this is obvious when one realises that most of the top non-US think tanks are either focusing on, or working under the banner of, international development, security and foreign affairs). The kind of analysis that I am attempting and that Goran Buldioski, for instance, is undertaking in Eastern Europe is absent.
- A ranking must have a clear definition of what the top spot implies: top 25 by revenue, by number of staff, by number of publications, by happiness of their staff, etc. It is the same as with sport: Usain Bolt is the fastest sprinter. The Ballon d’Or on the other hand is a perception based award given to the best football player according to the votes of coaches and captains of international teams, as well as journalists from around the world. So you either define why one wins or you define who votes; but you cannot keep both unclear or hidden.
- It is dangerous. It creates incentives towards investing in profile raising and visibility rather than focusing on research and research capacity. The director of a think tank that is not on the list emailed me, worried about their absence: what should we do? Given that they are one of the most influential think tanks in their country, undertake research of the highest quality and are running groundbreaking and innovative initiatives (copied all over the world), my answer is: nothing. And those who make it to the list because they are popular rather than good are incentivised against doing anything about it because they may believe that the list confers them credibility.
My recommendation (if some sort of ranking is what we want) then continues to be the promotion of national think tank awards like the one promoted by Prospect Magazine. It is a shame, really, because this project has the potential to collect fantastic data on think tanks; unfortunately, because of the focus on the ranking, a huge opportunity is being lost.
On the report itself, here are some preliminary comments after a single read (I promise to give it another go):
The first thing I notice is that at the top of the list are Brookings and Chatham House. I often go to their websites to find out a bit more about them and see that, yes, they have fantastic research and a wide range of products and are clearly at the top of their game. And when I can I go to Chatham House events. So far so good, I guess. But then, second and third are Amnesty International and Transparency International. I know these organisations well. They are quite active in my country (Peru) but they are international campaigning NGOs, not think tanks. Transparency International participates in electoral processes as an observer. Is this the role of a think tank? Amnesty International campaigns for human rights and against their violations. I don’t think that researchers lobbying for more funds and freedom for think tanks in many developing countries would like their governments to think that this would mean more space for TI and AI to operate there too. Apples and oranges?
Then I remember that the winner of Prospect Magazine’s 2010 Think Tanks Award was the Institute for Government; I check the top non-US think tanks but find that there are other UK think tanks in the list and the Institute for Government is nowhere to be found. In fact, it is not mentioned in the whole document. That is odd but, OK, not all rankings have to agree. What about Policy Exchange? Policy Exchange was set up by the supporters and members of the Conservative Party and was instrumental in the development of the ideas that shaped the arguments that won the 2010 election and that are guiding the new government’s policy agenda. There is a fantastic indirect account of this in Peter Snowdon’s book: Back from the Brink. No, the Policy Exchange is not listed either.
To make sure I am not missing anything I jump to the table for Europe (page 31) but no luck. They are not there. But the Overseas Development Institute is.
Now, as much as I like ODI, I am sure that it is not more influential than Policy Exchange. So, wait a minute, maybe this ranking is not about influence but about worth? About value? Reputation? Is it about finding the ones more capable of speaking truth to power? But why then have an index every year? What can change year on year to get a new one into the ranking? An annual index suggests that think tank quality can change in a short period of time and therefore that it is possible for an unknown organisation to make it to the top if they happen to do all the right things. Is it possible in this ranking? CGD did it, more or less, on the basis of a good combination of research and communications. But is it possible for think tanks in small countries focusing on local issues? And is it really a worthy end?
The more I see Chatham House and other security and international relations think tanks the more it feels as if the theme of this year’s ranking is foreign policy or international development -maybe that is what this year was about. Or maybe this is what the annual ranking should be about: focus on a single theme so that more and better analysis can be done for each think tank.
Nevermind, let’s get back to it. On to Latin America, which I know a bit. The list includes the Centro de Estudios Publicos (CEP) from Chile, the Centro de Implementacion de Politicas Publicas para la Equidad y el Crecimiento (CIPPEC) in Argentina, the Instituto Libertad y Democracia (ILD) in Peru (which by the way is on both 15 and 24), and CEPAL (the UN’s Economic Commission for Latin America and the Caribbean, or ECLAC in English). This is interesting. CEPAL is the only truly regional policy research centre in the list -but it is a UN body. CEP and CIPPEC are clearly focused on their own countries -and they are certainly influential there, but not in my country, Peru. And ILD was influential (granted, it has been one of the most influential organisations in the world, led by its director Hernando de Soto) but it has almost no public presence in Peru and cannot really be compared with other Peruvian and Latin American think tanks if one quickly browses through their work and publications. ILD is a fantastic analysis-based consultancy working across the developing world on the basis of some research done in the 1980s. If they make it to the top of the list it is far more interesting to find out why this is the case rather than their place in the ranking: is it because this is what policymakers value, or were the respondents from Africa or Asia, where they do most of their work?
In any case, policy in Peru is influenced by (among the think tanks) CIUP, GRADE (which is mentioned), IEP, and others that are not on the list. This is a perfect example of visibility: it is sometimes my impression that GRADE is quite successful in reaching audiences in DC and London and is therefore well known globally; while IEP and CIUP might be more focused on domestic policy debates and hence less well known beyond the country or region -or certain research communities. This probably reflects their origins, mandates and business models. So even within a country, comparison is difficult. Who is to say, though, whether one is better than the other based on their choice of audiences? [This section has been edited; see comments below.]
Back to Latin America (and for that matter, Europe). In Latin America there isn’t a regional government, so what is the point of a regional ranking? So what if the top think tank is Brazilian? Is it informing the Chilean government? Is it valuable for Colombia? Maybe in Europe ‘European think tanks’ make more sense, but then is this why domestically focused think tanks are not mentioned? Clearly, international reviewers would not know who the movers and shakers of Peruvian, British, or Spanish policies are. (Again, a point in favour of national awards.)
So maybe the regional focus has little to do with where the think tanks do their influencing and more with quite simply where they are based. But if this is the case then once again we’d be separating think tanks from their context -and this is not right.
And now on to Africa. This list looks a bit messy, to say the least. The first 7 are from South Africa (no surprises there). But number 8 is a regional research network made up of researchers based in think tanks across Africa -I’d like to call it a think tank but I am not sure how it compares with the others. And then it lists a few organisations which can hardly be called organisations at all and are only popular or known because they are among the only ones in their countries. Others are in the process of getting there; but are not there yet. A tiny bit of analysis would have provided sufficient information to disqualify them as worthy of any ranking; and to identify many others who may be more worthy of a mention.
Anyway, what is the point of saying that organisation xyz is among the top 25 in Africa? How does it compare with the Latin American ones, for instance?
What happened with the debate on think tanks in South Asia? I’ve been avidly following a great debate in Indian newspapers on think tanks that would suggest a fantastic opportunity for a study such as this one. And how useful is it to compare them with think tanks in East and Southeast Asia? In fact, how useful is it to compare think tanks in China or Vietnam with those in Japan, Indonesia and South Korea? Our overview study on think tanks and politics in the region showed foundational differences between them that merit more rather than less national focus.
The lack of analysis is telling of the limits of this type of research. A country or region focused study (rather than a ranking) would have been much richer and more useful.
The thematic rankings are also quite interesting. The fact still remains that one cannot separate theme from politics -and politics are always local.
I would have loved an explanation for Chatham House coming ahead of IDS in the ranking on International Development. Chatham House is by far a better think tank than ODI and IDS on foreign policy (and let’s face it, it is a fantastic think tank in general and its contribution to the international development debate is invaluable) but given that international development policy is still largely dominated by DFID, that DFID’s research programme is dominated by IDS and ODI (and not Chatham House), and that IDS alumni roam the corridors of DFID, I cannot understand the ranking. More explanation is needed, please.
Also, why is Fundacao Getulio Vargas included in this table? They are not focused on international development policy; their focus is simply on policy. The ‘international development’ prefix is added by ‘northern’ organisations to describe policies for or of developing countries. FGV deals with economic, business and legal research for the development of Brazil. How is this different from the research done by Brookings or IPPR for the development of the US and the UK respectively? (Patronising?)
Also, FGV is included at the foundation level, not at the level of its centres or programmes; however, the Pew Research Center rather than the Pew Charitable Trusts is included. Why? I would suggest that it has to do with the narrow and shallow focus on a global index instead of a desire to understand the richness of the histories of these organisations.
Then it gets confusing -think tanks are in more than one category but at totally different levels, and others which one would expect to find are gone. Yes, this is all possible, as most think tanks would be good at one thing and not at everything; but Chatham House, for example, is the top UK think tank in most lists yet behind the International Institute for Strategic Studies when it comes to its core area of expertise: foreign policy and security. This makes no sense.
The potentially most useful list (domestic economic policy) ends up being a US focused one. This further illustrates the limitations of a global ranking and its bias towards international development and foreign affairs think tanks that are more easily identifiable in the blogosphere or more popular communication channels than domestically focused ones.
Then the special categories: most innovative policy idea -a great category, but what have they been nominated for? What was the idea that got Brookings to the top? Again, another missed opportunity to provide intelligent insights into the rich and complex reality of think tanks. The same goes for the outstanding policy research programme category. Which programme got ODI the 15th place? ODI has quite a lot of programmes -and also projects that we call programmes because they are larger than the usual small projects we run. So which one was it? The Africa Power and Politics Programme? The Research and Policy in Development Programme? The Humanitarian Policy Group’s Integrated Programme? The Chronic Poverty Research Centre? It is important to know because some of these are delivered with other organisations, so ODI could not take all the credit.
I got a bit bored and jumped over some tables until I got to the best government affiliated think tank -WBI? Nice to know that the WB is considered a government. If the WB is a ‘government’, would the UN not be one too? (UNU-WIDER and CEPAL are in the other tables.) What about think tanks entirely (or almost entirely) funded by their governments or by international cooperation agencies?
And then, Party Affiliated think tanks -which is an important addition to any work on think tanks. This merits an entirely different post. What does affiliated mean? Does this include Conservative think tanks in the United States like Heritage or the Conservative Party’s Central Research Department? And wouldn’t CASS and VASS (the Vietnamese equivalent of CASS) be part of this category? After all, they are affiliated to the Communist Party and Chinese and Vietnamese line ministries have their own think tanks.
I don’t want this to be a totally anti-Go-to-Think-Tank-of-the-Year rant. As I said before, the ranking has created an opportunity for debate and discussion on think tanks and this is good. But this ought to lead to a proper discussion about think tanks, the roles they play and how they may be able to contribute to their contexts (local and/or global).
The list of questions and criteria on page 56 is the best part of the document and an important contribution to the think tanks debate. It provides a guideline of sorts to study think tanks in greater detail and to promote a more intelligent debate. Focusing on the list and the ranking, I think, robs us of James McGann’s and his team’s undeniable capacity to do this and leaves us with a bitchy Oscar nominations season for researchers.
I am not a fan of measuring the value of think tanks by looking at website hits and press mentions, but that does not mean that these do not help to tell part of the story. This press release by CEPR, based on a study by FAIR (Right Ebbs, Left Gains as Media ‘Experts’) of the cost effectiveness of the most widely cited US think tanks, provides an interesting take on this.
I guess that it makes sense as a comparator between similar organisations -at least between organisations that are playing under more or less the same rules; as is the case in the US think tank scene.
This does not mean that this type of analysis would work across borders -comparing, say, the US with the UK, or countries within Latin America or Africa.
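To make concrete what a cost-effectiveness comparison of this kind involves, here is a minimal sketch with invented figures (the think tank names and numbers are hypothetical, not taken from the FAIR study): the metric is simply media citations per million dollars of budget.

```python
# Hypothetical sketch of a citations-per-dollar comparison of the kind
# described in the FAIR/CEPR piece. All names and figures below are
# invented for illustration, not taken from the actual study.
think_tanks = {
    "Tank A": {"media_citations": 3000, "budget_usd_m": 60.0},
    "Tank B": {"media_citations": 1200, "budget_usd_m": 8.0},
}

def citations_per_million(tt: dict) -> float:
    """Media citations per million US dollars of annual budget."""
    return tt["media_citations"] / tt["budget_usd_m"]

# Rank by cost effectiveness: a small, visible think tank can easily
# outrank a much larger one on this metric.
ranked = sorted(
    think_tanks,
    key=lambda name: citations_per_million(think_tanks[name]),
    reverse=True,
)
for name in ranked:
    print(f"{name}: {citations_per_million(think_tanks[name]):.0f} citations per $1m")
```

Even this trivial calculation embeds choices -which outlets count as citations, whose budget figure is used- that only make sense among organisations playing by roughly the same rules, which is precisely the point made above.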
For instance, this morning Nick Scott, ODI’s online communications manager, sent us this Google Books Ngram comparing ODI with the Institute of Development Studies, the Center for Global Development and Brookings. It is not only an unfair comparison between the UK and the US, but also between the international development focus of the first three and the more general focus of the latter.
Peter Singer hits the nail on the head with an article on the ethics of think tanks and the threats that certain funding sources may create for think tanks’ independence.
On Saturday I blogged about the risks of foreign funding to think tanks in India.
Singer argues that:
Thinktankdom is a field that lacks any universal code of ethics, ombudsmen to hold people accountable, a professional association to regulate, etc. Even more, it is filled with people who are happy to speak about everything under the sun –that is, except their own field and the dirty little part that money plays in it.
And believes that:
thinktankers should not take on private consulting contracts with firms they might research and comment upon in their public work.
John Blundell, former chief of the IEA agrees with this assessment. And, I must say, that so do I.
However, this is not always possible. The reality of many think tanks in developing countries -and certainly in the poorest and most aid dependent countries- is one where the funders of research and influence are a few bilateral and multilateral donors, and more recently global foundations. Even in the developed world, international development think tanks (IDS, ODI, DIE, ECDPM, FRIDE, and other smaller not-for-profit and for-profit outfits that portray themselves as independent sources of expertise) are almost entirely dependent on bilateral donor funds -even though their research and influence is also focused on these donors.
In the aid sector, funding for think tanks tends to come in the form of consultancy contracts rather than research grants or core funding. This creates, according to Singer’s assessment, serious conflicts of interest and challenges the very essence of think tanks’ functions.
Singer’s recommendation, that think tanks and researchers should fully disclose the source of their funds -certainly more so when a particular study has been commissioned by a client- should be taken seriously in these contexts.
Transparency can only be a good thing.
This month, Prospect Magazine announced the winners of its Think Tank of the Year Awards. The Institute for Government won the top spot, with Policy Exchange claiming the prize for the best think tank publication of the year (“Making Housing Affordable,” by Alex Morton); the European Council on Foreign Relations taking the best Britain-based think tank dealing with non-British affairs award; and ResPublica named the “One to Watch.”
The panel included a senior adviser to David Cameron, a member of the House of Lords, a think tank veteran and experienced journalists. Their verdict reflects a particular kind of deliberation that clearly attempts to understand the complexity of the task of picking the ‘think tank’ of the year.
The judges described the Institute for Government as “indispensable,” praising its work on financial consolidation which helped improve the policymaking process leading up to the CSR. Andrew Adonis, the new head of the think tank, accepted the award but was at pains to point out that he deserved little of the credit.
They were also impressed by Alex Morton’s “fresh, thorough and ambitious set of proposals for radically overhauling housing and planning policy in this country.” Published in August, the Policy Exchange report has been widely discussed—and, said the judges, rightly so.
For the European Council on Foreign Relations, special credit was given to its power audits of EU-US and EU-UN relations and its work on international crisis management. And finally, the judges commended Phillip Blond’s achievement in creating ResPublica: a think tank with a distinctive agenda and set of values, which has also published a handful of deeply stimulating reports over the past 12 months.
Also this month, many of us received a few emails from James McGann urging us to respond to a survey to choose the top ‘go to’ think tanks all over the world. The survey is a massive list of think tanks (down from an even longer one) for the US, the UK, Europe, Latin America, Africa and Asia; as well as for various policy or topic areas.
A number of dimensions are explored: respondents are asked to assess the quality of each think tank’s research, its communication competencies, its degree of influence, etc.
But how can we compare think tanks in different countries? How can we judge a think tank in the US -endowed and free to speak its collective mind- to be better than one in Ecuador -competing for funds and mindful of what it says and when?
And how relevant is this comparison? Donors are not thinking: should I fund a think tank in the US or a think tank in Kenya? And a think tank in Kenya may look to Brookings for inspiration but cannot copy everything it does -nor should it compare itself with it. An index that compares a US-based and a Kenya-based think tank is really comparing the countries -and there are better indices for that.
The regional rankings do not make sense either: naturally, Brazilian and Argentinian think tanks dominate the list in Latin America -even when their focus is entirely domestic.
In the future, research funders should follow Prospect’s example and promote the setting up of these kinds of nationally focused and led awards. Otherwise we risk promoting a popularity contest -and the shallowness that comes with it.
This is the presentation I gave at a recent meeting of think tanks hosted by ODI in London. It draws from other posts in this blog but, I hope, provides a stronger argument. It also has a Prezi:
“I do a lot of work with policymakers, but how much effect am I having? It’s like they’re coming in and saying to you, ‘I’m going to drive my car off a cliff. Should I or should I not wear a seatbelt?’
And you say, ‘I don’t think you should drive your car off the cliff.’
And they say, ‘No, no, that bit’s already been decided—the question is whether to wear a seatbelt.’
And you say, ‘Well, you might as well wear a seatbelt.’ And then they say, ‘We’ve consulted with policy expert Rory Stewart and he says . . . .’ ”
The common definition, employed by experts in the field like Diane Stone, James McGann and others, describes them as a distinctive class of organisations -not-for-profit, and different and separate from universities, markets and the state- that seek to use research to influence policy. However, as I found in the study of think tanks in Latin America, Africa and Asia, such think tanks exist only in the imagination of those who have idealised the Brookings and Chatham Houses of this world; more often than not, we find ourselves dealing with the exceptions rather than the rule. This was the point of my presentation on think tanks at an ODI event in 2009: hybrids are the norm.
- It privileges U.S. and U.K. think tank traditions over all others;
- It leaves out many present-day examples that do not fit the definition: corporatist think tanks in Japan, public think tanks in Vietnam (RAND, by the way, is a federally funded organisation), university-based think tanks across Latin America, partisan think tanks in Chile, Uruguay, the U.K. and the U.S., etc.;
- It robs the concept of think tanks of historical depth forgetting that the first think tanks were offshoots of the very same institutions they are now supposed to be independent of; and
- Most significantly, it fails to recognise the importance of the concept itself: Medvetz argues that the use of the label is a strategic choice made by organisations within a complex system of actors and relations.
This last point is worth exploring further. The recent surge in funding for think tanks has been accompanied by a rise in the number of organisations positioning (or rebranding) themselves as think tanks.
Medvetz explains how this positioning as a think tank involves a necessary ‘complex performance of distancing and affinity’:
- On the one hand think tanks assert their independence by differentiating themselves from universities, advocacy groups, public bodies and lobbyists; but
- On the other hand, they pursue strategies or behaviours that mimic the values and practices of those very actors: appointing fellows, investing in communications departments, deploying an array of advocacy tactics, pilot projects and policy proposals, and actively seeking to influence and lobby policymakers.
The act of definition is then the art of forging the identity -independent or dependent- that best suits the organisation’s objectives; which, according to Medvetz’s analysis, is the accumulation of authority within the policy space. And in a multi-actor world, this is essentially a process that takes place in relation to others: we define Brookings in relation to the Heritage Foundation (U.S.), ODI in relation to IDS (U.K.), and CIUP in relation to GRADE (Peru).
But, and this is left out of his analysis, this definition also takes place over time and is likely to change to fit the changing context.
Another reason why the traditional definition of think tanks is flawed is that it does not offer us anything of practical value. What does one do with a definition that describes something as ‘something else’ or ‘not something else’? And what do we do when the one thing it says think tanks do is also what lots of other organisations do? How does a think tank use this definition to decide how to invest its resources, where to position itself, how to influence, etc.?
To address this I attempt to describe think tanks according to their functions as well as their position in the knowledge-policy space. According to recent work by ODI in Latin America -and drawing from the literature on think tanks- we could argue that think tanks can fulfil at least six roles (or services) in their political context:
- They can provide legitimacy to policies (whether it is ex-ante or ex-post);
- They can act as spaces for debate and deliberation -even as a sounding board for policymakers and opinion leaders. In some contexts they provide a safe house for intellectuals and their ideas;
- They can provide a financing channel for political parties and other policy interest groups;
- They attempt to influence the policy process;
- They are providers of cadres of experts and policymakers for political parties and governments; and
- They can (following Hugh Gusterson’s suggestion, which I have added here) perform an auditing function for other think tanks.
This approach to understanding think tanks opens the door to further analysis. The following framework (based on Stephen Yeo’s description of think tanks’ mode of work) might help.
First, think tanks may base their work on one or more business models, including:
- Independent research: this would be work done with core or flexible funding that allows the researchers the liberty to choose their research questions and methods. It may be long term and could focus on ‘big ideas’ with no direct policy relevance. On the other hand, it could focus on a key policy problem that requires a sustained investment in research and action.
- Consultancy: this would be work done through commissions with specific clients and addressing one or two key questions. Consultancies often respond to an existing agenda.
- Influencing/advocacy: this would be work done through communications, capacity development, networking, campaigns, lobbying, etc. It is likely to be based on evidence emerging from independent research or consultancies.
Second, think tanks may base their work or arguments on:
- Ideology, values or interests
- Applied, empirical or synthesis research
- Theoretical or academic research
As described in this table showing the mode of work and the basis of the messages, this kind of analysis is more likely to follow from a functional definition than from the traditional one.
Tom Medvetz provides an alternative (and complementary?) framework for analysis that focuses on the positioning of think tanks within the social space.
With this in mind it is possible to explore where and how the organisation might attempt to bring about change (internal and external).
To compete in one or another space, think tanks might have to trade off some competencies or skills. For example, to succeed among academics, think tanks might have to trade off their communication competencies (because of limited resources, as well as pressures from more academic staff to focus on academic publications rather than policy engagement). And communications competencies (broadly defined) are what think tanks may offer academic researchers as a contribution to a productive partnership.
But, of course, deciding this depends on where the think tank decides to position itself; and some think tanks may be closer to the media, politics or economic power.
These decisions, about the skills and competencies in which think tanks should invest, should be easier if the functional boundaries of the organisation are more clearly defined.
In the end (or, as at the start of the presentation) everyone has an opinion about what a think tank is.