
Posts tagged ‘James McGann’

European think tanks and the European Union

What is the relationship between European think tanks and the European Union, and how can European think tanks be defined and classified? A report by the Bureau of European Policy Advisers offers an overview of these issues.


This year, instead of ranking think tanks, let’s think about them more carefully

David Roodman and Julia Clark from the Center for Global Development have posted a very interesting reflection on the now unfortunately famous McGann think tank ranking. They offer an alternative to a global ranking exercise.


The Go to Think Tank Index: two critiques

This is somewhat old news, but I feel it is worth sharing reviews by other researchers of the think tank index produced by James McGann.

Jan Trevisan, at the International Centre for Climate Governance, has published an interesting critique. There is not much I disagree with in his assessment. He points out several mistakes in the analysis regarding think tanks in the sector he is most familiar with. And this is not surprising, because it is difficult for any one person (or team) to delve into the detail that is necessary to adequately assess all the think tanks in the world. It makes me think that we should attempt to think of think tanks not only at a country level (which is what I have argued before) but also by theme or issue.

My only disagreement, I guess, is that the ICCG does not consider government or party think tanks in their map. I think that this is a mistake, as in many developing countries (and indeed developed countries) it is not possible to find truly independent centres. Affiliation should not be confused with lack of autonomy.

A couple of years ago, Christian Seiler and Klaus Wohlrabe published their own, very well researched, critique of the 2009 think tank index. Besides identifying several inaccuracies, their critique focused on the methodological weaknesses of the ranking. I remember that at the time I offered a similar critique.

I think it is clear that the method is inaccurate and the output is therefore flawed. At least, however, it has got many think tanks and researchers thinking about it. And these critiques and analyses offer far more insight into the world of think tanks than the ranking ever will.

And the winner is: Brookings … but, once again, the loser: critical analysis

I have not been a fan of Jim McGann’s think tank index. There are no surprises there. I think that the energy that goes into this could be used more productively elsewhere; for instance, gathering more in-depth information on specific organisations, their strategies, their tactics, etc. The 2011 global think tank index presents some improvements on previous editions, but it is still far from being what it wants to be. McGann says that he is doing this in part as a response to being asked about think tanks around the world, but this is no due diligence. I learn so much more about think tanks from a conversation with a director like Orazio Bellettini or Simon Maxwell than from the entire report. It is still a popularity contest -and it is still not clear who votes.

The main text of the report suggests that policymakers should use this to guide their choice of think tanks when seeking advice, but this strikes me as rather naive: it assumes policymakers do not already have their own networks. It is also quite worrying, since most countries do not get any mentions (and many get only one or two), and this in effect narrows the options that policymakers would have if they chose to use this as a guide. And this would be a mistake, because there are many more excellent think tanks that do not even get a mention in the ranking.

There are some positives, though. I must accept that the inclusion of more categories describing specific achievements is a step in the right direction. But without further detail as to what the think tanks achieved in each, I am afraid that the rankings remain unhelpful. Also positive is the definition of think tanks used; it includes organisations affiliated to parties, governments, and others. (Unfortunately, the definition has not always been applied to the rankings.)

It seems that there was a clear effort to address some of the criticism that the ranking has received in the past. At least some of it is acknowledged in the text.

And, without a doubt, the index gets people talking about think tanks at least once a year. This is also quite good and gives us an opportunity to reflect on these organisations and the functions they fulfil.

Still, the index leaves me with a lot of questions:

  • Among the top think tanks in the world we still find organisations such as Amnesty International, Transparency International, and Human Rights Watch. I am still not sure these qualify as think tanks. AERC, also not a think tank, is listed, but not its Latin American equivalent, LACEA (respondents’ bias?). Is CODESRIA a think tank? IDRC, again, is not a think tank either.
  • TED, I am sure, is not. And while I am on the subject of TED, has anyone wondered why it is only mentioned in the technology category? Yes, that is how it started, but anyone who is a fan would know that technology plays only a part in what it does. Perception is a powerful thing.
  • I am still unsure about the global or regional lists. I am not sure what to do with a list that compares Chile’s Centro de Estudios Públicos with Brazil’s Fundação Getulio Vargas. Why is one better than the other?
  • Or with a single list that includes think tanks from Australia, Central Asia, South Asia, and East and Southeast Asia.
  • In general, the absence of Latin American think tanks from the global lists is suspicious (of the accuracy of the process, not of the researchers’ intentions).
  • There are no surprises in the Mexico, Canada, and the Caribbean list. How could any Caribbean think tank make it into it? Even here, what is the logic behind the inclusion of Mexico in this list and not in one with Central America? Mexico’s influence in Central America is certainly more relevant.
  • As a colleague, Norma Correa, has suggested, maybe a more accurate list would be one including Mexico, Brazil, and Argentina; they are similar in size, intellectual influence in the region, their academic communities are comparable, etc.
  • How do developing country think tanks, Brazil’s Fundação Getulio Vargas, the Bangladesh Institute of Development Studies, India’s Centre for Development Alternatives, and the Chinese Academy of Social Sciences, get into the international development category? Isn’t what they do the same thing that domestic think tanks in developed countries do?
  • Chatham House is an interesting second place in the Environment Think Tank category. It seems to me that its approach is from a foreign policy or security perspective, and so I am not sure how comparable it is with, say, the International Institute for Environment and Development in the UK.
  • Health: The Peterson Institute for International Economics is number 26 on the list. A quick search on its website shows that it does not do any health research. This is interesting in itself. It shows the power of perception when judging think tanks. It also shows that the team has not fact-checked the lists. And how does one compare a political think tank like Heritage, a corporately funded centre like the Kaiser Permanente Institute for Health Policy, university-based health research centres (there are a few), and international development health policy research centres such as CGD?
  • Also, by the way, India’s Institute for Economic Growth is mentioned twice in this list.
  • I don’t quite get the international economic policy think tank category. Is it about international economics or about influencing international bodies? It would be useful if each category were described in more detail and examples were provided of the reasons why the think tanks have been ranked in that order.
  • What is a transparency and good governance think tank? It certainly helps to explain the presence of human rights NGOs, but it seems to me that think tanks, by their nature, can support transparency and good governance even if they do not actively seek to change ‘transparency’ policies.

Now, the section related to the new categories is much more interesting and I had been looking forward to it. I have argued before that this is far more useful than regional lists. Let’s see:

  • Most innovative policy ideas: There is a long list… yes… but what was the most innovative policy idea? Why was it innovative? And how does one judge this? If I could have commented on only one issue, it would have been this. If all efforts could have been focused on something, it should have been to identify the policy ideas proposed by these think tanks and explain why they are innovative, their importance, and how or why they came up on top.
  • Google Ideas is the top new think tank. Is it a think tank? What is it? It has been mentioned in media articles and it has a Twitter account, but beyond that… what? ResPublica is new but not that new.
  • The category Think Tanks with Outstanding Policy-Oriented Public Policy Research Programs should win an award for the most confusing name. Is it policy-oriented OR public policy research? Or is it policy-oriented public policy research? I don’t quite get it. Still, their work must have been outstanding. But why? What are the criteria being used? And what are these programmes that are so outstanding that they deserve our attention?
  • I really liked the inclusion of the category on the use of the internet. And I was glad to see ODI mentioned: Nick Scott’s work has been excellent in this field. But I was not quite sure why ODI was not higher up than, say, the Council on Foreign Relations or the Fraser Institute. I know that the internet is far more than just the corporate website, but this is still a good indicator and ODI’s is certainly far more developed. (I think there is no risk that people might think I am biased in its favour; I have many times in the past been rather critical of it.) And TED, really? All the way down at number 23? They rule the web. Maybe this is because very few among the respondents considered TED to be a think tank, but it got just enough votes to make it onto the list.
  • Among the top four think tanks with the greatest impact are Transparency International, Human Rights Watch, and Amnesty International. Are these think tanks? But most importantly, what was their impact? I must assume that the list is global in the sense that it includes think tanks from all over the world, rather than that the impact was global in nature. But, again, what was the impact being rewarded? And how did they achieve it?
  • I was pleased to see the category of university affiliated think tanks. It recognises that these organisations exist and can be described as such. But, again, why are they the best? Is it because they found a good balance between teaching and research? Did they tackle a particularly important issue?
  • The definition is also important. Are they affiliated as IDS is affiliated to Sussex (IDS is an independent charity) or as IDEAS is affiliated to LSE or CIUP is affiliated to the Universidad del Pacifico (they are research centres of the universities)?
  • I would say the same about the government affiliated think tank category; it is good that it has been included. But is it worth putting governments and multilateral agencies in the same bag? And how is affiliation defined and applied? According to the definition in the document: ‘a part of the structure of government’. Is FLACSO Ecuador affiliated because it gets funds from the government, because it is part of the Ecuadorian education system, or because it was established through an agreement between the State and the international FLACSO system? Would that not make IDS or ODI, which get quite a lot of their funding from DFID, affiliated too? And RAND? (I must say that I am always surprised to find FLACSO Ecuador in the ranking. FLACSO Costa Rica or FLACSO Argentina are far more popular -at least for me… perception, perception, perception.)
  • What does it mean to be a party affiliated think tank? Affiliation can mean many things, but in this case it cannot mean the same as it does for the government affiliated category. The German model of political party foundations cannot be compared with the British model of party-friendly think tanks. Yes, DEMOS influenced the New Labour government, but they are pretty active in developing links with the new Coalition government, too. The same could be said about the Centre for Policy Studies: yes, it is close to the Tory Party but it is not of the party. The definition used by the index is: ‘formally affiliated with a political party’, but I don’t think the law would allow this in the UK. It certainly would not allow it in the United States. Maybe a definition that incorporates informal affiliations would have been more appropriate.

This is rather painful. I thought about not writing anything about the index this year. But having read it I felt that some of this had to be said, again. And I am sure I won’t be the only one.

The points I am making this time around are the same as in previous years:

  1. The contexts and organisations being compared are so different that any comparison is quite useless unless it is accompanied by detailed analysis.
  2. Absences (at least the ones I have noticed) are explained by the fact that this index relies on biased perceptions and little if any corroboration. As McGann himself accepts, they do not have a budget or the capacity to visit (not even virtually?) the think tanks. And I doubt that the respondents themselves have spent much time thinking about their nominations.
  3. Perception is important for think tanks, though; and this may be the problem. Without clear examples explaining why a think tank is perceived as good online, influential, or innovative, how can they learn and improve?

I insist: ditch global rankings and focus on national exercises with detailed analysis of the reasons why a think tank comes top of a category (e.g. what was the issue it worked on, the programme it developed, etc.?). This will make it easier to fact-check and to ensure that the definitions and criteria are actually being applied to the assessment. It will also get a conversation started at the level that matters and not just in DC or at the UN, where everyone loves a cocktail. We could then compare across countries and learn a great deal more.

Until next year, I am sure.

The mighty influence of think tanks (in the US)

A KCPP discussion on the might of think tanks and the influence they have in US politics and the world. James McGann provides an explanation of this index (you’ll be the judge of that -criteria, ‘experts’, ranking… lots to talk about, but let’s leave that for January).

The conversation focused on some of the following questions:

What is the ultimate goal of heavyweight think tanks? Do they just add to a cacophony of poisoned politics? Or can their researchers contribute ideas isolated from politicking on the Hill? What are the risks and benefits of relying on them? Should their influence be kept in check? And is there a think tank for every political stripe?

McGann suggests that think tanks help contribute to an informed debate on policy issues, provide a government in waiting (a revolving role -and an opportunity for policymakers to reflect on issues of policy), and offer outside independent analysis. This, he clarifies, is different from other countries, where the political culture awards a clearer role to the government for analysis. I agree.

I disagree, however, with his suggestion that independent analysis is unique to the United States. But I am not sure if that is what he meant to say.

What about the clear ideological bias of think tanks, asks the interviewer. What is their value if the ‘thinking’ is rigged from the beginning? McGann responds that the US is a hyper-pluralistic society with a range of institutions, and so even if clear political opinions and political philosophies can be assigned to specific think tanks, the overall democratic debate is not necessarily affected.

Callers to the programme do not seem to agree with think tanks: their ideological bickering is unrealistic and unhelpful.

Should think tanks be sued for the mistakes they make? No, says Mike Gonzalez, Vice President of Communications at The Heritage Foundation, who lists a long series of policies they came up with. But most importantly, he suggests that they have the right to make proposals and governments have the responsibility to decide whether or not to take them on.

But Faiz Shakir, Vice President of the Center for American Progress and Editor-in-Chief of ThinkProgress.org (a blog created by the Center for American Progress Action Fund), suggests that while this is true in theory, in practice it is possible for funding and influence to break down this separation of roles suggested by Gonzalez.

I suggest listening to the interview; it is very interesting. The question still remains: are they led by the search for ideas, or by ideology or interests (theirs or others’)? In other words, is the thinking rigged?

And another question: if think tanks are so good at talking about issues of public interest -sometimes even better than politicians and policymakers- are they letting us (the general public) in or keeping us out of the debate?

Evaluating a think tank from the AEA365 (but should we?)

Johanna Morariu, Senior Associate with Innovation Network (http://www.innonet.org/), describes an approach to evaluating think tanks in the American Evaluation Association’s AEA365 blog. She draws on studies focused on think tank evaluation to address 10 categories or assessment areas:

  • Organisation infrastructure, capacity and management
  • Strategy and direction -at both organisation and project levels
  • Organisation visibility and reputation
  • Effectiveness of communication and outreach strategy
  • Quality of research products
  • Participation in congressional testimony
  • Research relevance, quality, usefulness, and rigour
  • Uptake of research in media and policy
  • Research influences the work of other leading researchers
  • Research influences decision makers and/or policy

Again, as in most evaluations of the work of think tanks, this leaves out other contributions these organisations make:

  • What about their capacity to train and prepare new generations of policymakers? -track movement of staffers into policy or the private sector, for instance
  • And what about the power of think tanks to convene people and organisations from different sides of the argument -and help them find common ground?
  • Or the contribution that think tanks make to the education of elites -not to change their minds but to enlighten their own arguments?
  • Think tanks also trade on power, providing their supporters with access to key spaces. Would we rather not measure this? Let’s not pretend that think tanks have no political or economic allegiances (very few can claim that).

By the way, the studies she used are below, courtesy of Johanna:

  • Donald E. Abelson (2010). Is Anybody Listening? Assessing the Influence of Think Tanks. Chapter 1 in the edited volume, Think Tanks and Public Policies in Latin America.
  • Richard Bumgarner, Douglas Hattaway, Geoffery Lamb, James G. McGann, and Holly Wise (2006). Center for Global Development: Evaluation of Impact. Arabella Philanthropic Investment Advisors, LLC for the Bill & Melinda Gates Foundation, the William and Flora Hewlett Foundation, the John D. and Catherine T. MacArthur Foundation, and the Rockefeller Foundation.
  • Ingie Hovland (2007). Making a Difference: M&E of Policy Research. Working paper 281 for the Overseas Development Institute, London, UK.
  • James G. McGann (2006). Best Practices for Funding and Evaluating Think Tanks & Policy Research. McGann Associates for the William and Flora Hewlett Foundation.

But critical to this method was the development of a Theory of Change for the think tank. From it they were able to develop the right indicators and tools to assess the organisation’s performance. The Theory of Change in this case is not the typical one often described by many projects or programmes; in fact, it looks more like a strategy diagram: this is what we want to achieve, and this is how we’ll do it.

I have been asked before about evaluating think tanks. And approaches like Johanna Morariu’s are certainly useful. But I am not too sure if an evaluation is what we want for an organisation. I can see how we may evaluate a project (we can tell when the activities of the project are done -even if it is difficult to assess its effects in the short term). The same goes for a programme or policy. There is something that is done, then that something is done no more, and then we assess how (well) that something was done and if that something had the intended effect.

An organisation, however, does not operate in bursts of activity that then wind down to nothing. It does not start and then stop to be evaluated. Its parts may do, but not the organisation. Yes, there is ongoing evaluation, but this all points towards the final (ex-post) evaluation.

For an organisation, a strategic review that considers all of the think tank’s functions is far more relevant. A reflection process around annual staff or research retreats is far more useful.

The ‘Top Think Tanks’ in the World commentary from AFRICA IS A COUNTRY

An interesting review of James McGann’s think tank list from Africa is a Country: The ‘Top Think Tanks’ in the World.

The report also notes “… the rise of think tanks in Asia, Latin America, the Middle East and Africa.” Whether these think tanks offer fresh ideas or anything different from their counterparts in the West (given that they share the same funding sources and ideas) is not discussed in the report.


Goran’s recommendations on think tank rankings

Goran Buldioski offers another take on the rankings in his blog, Goran’s musings, along with some very interesting recommendations that I republish below:

As someone who works with think tanks, studies think tanks, and writes about think tanks, I see very little value in it. Therefore, it is high time to move to alternatives to this study:

  1. Best national think tanks (see the suggestion of Enrique Mendizabal), modelled on the UK ranking done by Prospect magazine. Note: thematic categories could also be established.
  2. Best policy study (for example, see the Policy Association for Open Society (PASOS) award for the best study penned by their members).
  3. Best advocacy campaign by a think tank, consisting of a series of policy products (from op-eds to books) and events (briefings, debates, seminars, conferences, training events, etc.).
  4. Best online presentation. [Or maybe best online communications strategy?]
  5. Best design and communication strategy

I would add categories related to:

  1. Best long-term policy research programme -one that has maintained and developed a reputation on a specific issue, with or without support
  2. Best prospective think tank -thinking about the future challenges of its country
  3. Best think tank to work for -to highlight the human capital development role that think tanks have
  4. Think tank to watch

I would also encourage categories related to the other members of the policy space:

  1. Best use of research based evidence by the media
  2. Best or most innovative funder
  3. Most evidence based political party manifesto

At this level, whatever the category, it would provide an opportunity to have a real public conversation about think tanks and their contexts (and histories). No need for a ranking: one winner and some honourable mentions would do. And if there are disagreements, these can be aired publicly and addressed rather quickly before the next award.

It would encourage communication between think tank directors and with their audiences (and possible members of the panel), serve as a platform from which to launch important new policy ideas and debates, contribute to the development of more informed cadres of journalists, policymakers and funders, etc.

Do have a look at Goran’s comments about the inconsistencies in the ranking -they are quite to the point.

Another year, another ranking of think tanks (and surprise surprise, Brookings is still the best)

I’ll accept that James McGann’s effort to identify and rank all the think tanks in the world has some positive outcomes. First of all, it has people talking about think tanks -and some think tanks are even becoming aware that there is a debate out there about themselves. Second… no, that is it. [Also have a look at Goran Buldioski’s blog on the same subject.]

I am still of the opinion that going beyond the counting and study of individual think tanks (and their immediate systems) is useless and misleading. Here are five reasons why I do not support this ranking, and then a longer semi-rant at the document.

  1. Think tanks cannot be de-linked from their political, social and economic environment, since think tanks define themselves in relation to the other players in the system. Brookings cannot be described without reference to US bipartisanship: when we say independent research in the US we mean independent of either party (as well as of other interests). But independent means something entirely different in China, India, Brazil, or Argentina. Global and regional rankings are therefore unhelpful when the focus of think tanks is local (not local as in of this town or neighbourhood, but local to their direct interactions).
  2. The list is too diverse to be relevant. The definition of ‘think tanks’ has improved since I last commented on it to include politics. But he has now included some organisations that cannot possibly be compared with the rest. Let’s put it this way: if I define a mobile phone as a device that allows me to make phone calls while on the move, I could be tempted to include laptops (after all, I can make Skype calls ‘on the move’), but I wouldn’t, because it would be confusing and unhelpful. A mobile is one thing and a laptop is another. Maybe they can do some things that the other can also do, but that does not make them the same thing. Amnesty International, Human Rights Watch, Transparency International and the various foundations (funders rather than researchers) included… how useful is it to compare them with IPAR in Rwanda or GRADE in Peru?
  3. It is still based on perception rather than thoughtful analysis. Thoughtful analysis would have required the development of a database with answers to all the questions or criteria presented on page 56. These are good questions, but the nominators were not asked to provide answers to them, only to use them to think about their nominations. This means that it is all about presentation rather than content: still a popularity contest among people who clearly cannot know about every context and must therefore rely on what is accessible to them (this is obvious when one realises that most of the top non-US think tanks are either focusing on, or working under the banner of, international development, security and foreign affairs). The kind of analysis that I am attempting, and that Goran Buldioski, for instance, is undertaking in Eastern Europe, is absent.
  4. A ranking must have a clear definition of what the top spot implies: top 25 by revenue, by number of staff, by number of publications, by happiness of their staff, etc. It is the same as with sport: Usain Bolt is the fastest sprinter. The Ballon d’Or, on the other hand, is a perception-based award given to the best football player according to the votes of coaches and captains of international teams, as well as journalists from around the world. So you either define why one wins or you define who votes; you cannot keep both unclear or hidden.
  5. It is dangerous. It creates incentives towards investing in profile raising and visibility rather than focusing on research and research capacity. The director of a think tank that is not on the list emailed me, worried about their absence: what should we do? Given that they are one of the most influential think tanks in their country, undertake research of the highest quality, and are running groundbreaking and innovative initiatives (copied all over the world), my answer is: nothing. And those who make it onto the list because they are popular rather than good are incentivised against doing anything about it, because they may believe that the list confers credibility on them.

My recommendation (if some sort of ranking is what we want) then continues to be the promotion of national think tank awards like the one promoted by Prospect Magazine. It is a shame, really, because this project has the potential to collect fantastic data on think tanks; unfortunately, because of the focus on the ranking, a huge opportunity is being lost.

On the report itself, here are some preliminary comments after a single read (I promise to give it another go):

The first thing I notice is that at the top of the list are Brookings and Chatham House. I often go to their websites to find out a bit more about them and see that, yes, they have fantastic research and a wide range of products and are clearly at the top of their game. And when I can I go to Chatham House events. So far so good, I guess. But then, second and third are Amnesty International and Transparency International. I know these organisations well. They are quite active in my country (Peru), but they are international campaigning NGOs, not think tanks. Transparency International participates in electoral processes as an observer. Is this the role of a think tank? Amnesty International campaigns for human rights and against their violations. I don’t think that researchers lobbying for more funds and freedom for think tanks in many developing countries would like their governments to think that this would mean more space for TI and AI to operate there too. Apples and oranges?

Then I remember that the winner of Prospect Magazine’s 2010 Think Tank of the Year Award was the Institute for Government; I check the top non-US think tanks and find that, while there are other UK think tanks in the list, the Institute for Government is nowhere to be found. In fact, it is not mentioned in the whole document. That is odd but, OK, not all rankings have to agree. What about Policy Exchange? Policy Exchange was set up by supporters and members of the Conservative Party and was instrumental in the development of the ideas that shaped the arguments that won the 2010 election and that are guiding the new government’s policy agenda. There is a fantastic indirect account of this in Peter Snowdon’s book, Back from the Brink. No, Policy Exchange is not listed either.

To make sure I am not missing anything I jump to the table for Europe (page 31) but no luck. They are not there. But the Overseas Development Institute is.

Now, as much as I like ODI, I am sure that it is not more influential than Policy Exchange. So, wait a minute, maybe this ranking is not about influence but about worth? About value? Reputation? Is it about finding the ones most capable of speaking truth to power? But why then have an index every year? What can change year on year to get a new one into the ranking? An annual index suggests that think tank quality can change in a short period of time and that it is therefore possible for an unknown organisation to make it to the top if it happens to do all the right things. Is it possible in this ranking? CGD did it, more or less, on the basis of a good combination of research and communications. But is it possible for think tanks in small countries focusing on local issues? And is it really a worthy end?

The more I see Chatham House and other security and international relations think tanks, the more it feels as if the theme of this year’s ranking is foreign policy or international development -maybe that is what this year was about. Or maybe this is what the annual ranking should be about: focusing on a single theme so that more and better analysis can be done for each think tank.

Never mind, let’s get back to it. On to Latin America, which I know a bit. The list includes the Centro de Estudios Públicos (CEP) from Chile, the Centro de Implementación de Políticas Públicas para la Equidad y el Crecimiento (CIPPEC) in Argentina, the Instituto Libertad y Democracia (ILD) in Peru (which, by the way, is at both 15 and 24), and CEPAL (the UN’s Economic Commission for Latin America and the Caribbean, or ECLAC in English). This is interesting. CEPAL is the only truly regional policy research centre in the list -but it is a UN body. CEP and CIPPEC are clearly focused on their own countries -and they are certainly influential there, but not in my country, Peru. And ILD was influential (granted, it has been one of the most influential organisations in the world, led by its director Hernando de Soto), but it has almost no public presence in Peru and cannot really be compared with other Peruvian and Latin American think tanks if one quickly browses through their work and publications. ILD is a fantastic analysis-based consultancy working across the developing world on the basis of some research done in the 1980s. If they make it to the top of the list, it is far more interesting to find out why this is the case than their place in the ranking: is it because this is what policymakers value, or were the respondents from Africa or Asia, where they do most of their work?

In any case, policy in Peru is influenced by (among the think tanks) CIUP, GRADE (which is mentioned), IEP, and others that are not on the list. This is a perfect example of visibility: it is sometimes my impression that GRADE is quite successful in reaching audiences in DC and London and is therefore well known globally, while IEP and CIUP might be more focused on domestic policy debates and hence less well known beyond the country or region -or certain research communities. This probably reflects their origins, mandates and business models. So even within a country, comparison is difficult. Who is to say, though, whether one is better than the other based on their choice of audiences? [This section has been edited; see comments below.]

Back to Latin America (and, for that matter, Europe). In Latin America there isn’t a regional government, so what is the point of a regional ranking? So what if the top think tank is Brazilian? Is it informing the Chilean government? Is it valuable for Colombia? Maybe in Europe ‘European think tanks’ make more sense, but then is this why domestically focused think tanks are not mentioned? Clearly, international reviewers would not know who the movers and shakers of Peruvian, British, or Spanish policy are. (Again, a point in favour of national awards.)

So maybe the regional focus has little to do with where the think tanks do their influencing and more with, quite simply, where they are based. But if this is the case, then once again we’d be separating think tanks from their context -and this is not right.

And now on to Africa. This list looks a bit messy, to say the least. The first seven are from South Africa (no surprises there). But number 8 is a regional research network made up of researchers based in think tanks across Africa -I’d like to call it a think tank but I am not sure how it compares with the others. And then the list includes a few organisations that can hardly be called organisations at all and are only popular or known because they are among the only ones in their countries. Others are in the process of getting there, but are not there yet. A tiny bit of analysis would have provided sufficient information to disqualify them from any ranking, and to identify many others who may be more worthy of a mention.

Anyway, what is the point of saying that organisation xyz is among the top 25 in Africa? How does it compare with the Latin American ones, for instance?

What happened to the debate on think tanks in South Asia? I’ve been avidly following a great debate in Indian newspapers on think tanks that would suggest a fantastic opportunity for a study such as this one. And how useful is it to compare them with think tanks in East and Southeast Asia? In fact, how useful is it to compare think tanks in China or Vietnam with those in Japan, Indonesia and South Korea? Our overview study on think tanks and politics in the region showed foundational differences between them that merit more, rather than less, national focus.

The lack of analysis is telling of the limits of this type of research. A country or region focused study (rather than ranking) would have been much richer and useful.

The thematic rankings are also quite interesting. The fact still remains that one cannot separate theme from politics -and politics are always local.

I would have loved an explanation for Chatham House coming ahead of IDS in the ranking on international development. Chatham House is by far a better think tank than ODI and IDS on foreign policy (and let’s face it, it is a fantastic think tank in general and its contribution to the international development debate is invaluable), but given that international development policy is still largely dominated by DFID, that DFID’s research programme is dominated by IDS and ODI (and not Chatham House), and that IDS alumni roam the corridors of DFID, I cannot understand the ranking. More explanation is needed, please.

Also, why is Fundação Getulio Vargas included in this table? It is not focused on international development policy; its focus is simply on policies. The international development prefix is added by ‘northern’ organisations to describe policies for or of developing countries. FGV deals with economic, business and legal research for the development of Brazil. How is this different from the research done by Brookings or IPPR for the development of the US and the UK respectively? (Patronising?)

Also, FGV is included at the foundation level, not at the level of its centres or programmes; however, the Pew Research Center rather than the Pew Charitable Trusts is included. Why? I would suggest that it has to do with the narrow and shallow focus on a global index instead of a desire to understand the richness of the histories of these organisations.

Then it gets confusing: think tanks appear in more than one category but at totally different levels, and others one would expect to find are gone. Yes, this is all possible, as most think tanks would be good at one thing and not at everything; but Chatham House, for example, is the top UK think tank in most lists yet behind the International Institute for Strategic Studies when it comes to its core area of expertise: foreign policy and security. This makes no sense.

The potentially most useful list (domestic economic policy) ends up being a US-focused one. This further illustrates the limitations of a global ranking and its bias towards international development and foreign affairs think tanks, which are more easily identifiable in the blogosphere or popular communication channels than domestically focused ones.

Then the special categories. Most innovative policy idea: great category, but what have they been nominated for? What was the idea that got Brookings to the top? Again, another missed opportunity to provide intelligent insights into the rich and complex reality of think tanks. The same goes for the outstanding policy research programme category. Which programme got ODI the 15th place? ODI has quite a lot of programmes -and also projects that we call programmes because they are larger than the usual small projects we run. So which one was it? The Africa Power and Politics Programme? The Research and Policy in Development Programme? The Humanitarian Policy Group’s integrated programme? The Chronic Poverty Research Centre? It is important to know, because some of these are delivered with other organisations, so ODI could not take all the credit.

I got a bit bored and jumped over some tables until I got to the best government affiliated think tank: WBI? Nice to know that the World Bank is considered a government. If the WB is a ‘government’, would the UN not be one too? (UNU-WIDER and CEPAL are in the other tables.) What about think tanks entirely (or almost entirely) funded by their governments or by international cooperation?

And then, party affiliated think tanks -an important addition to any work on think tanks. This merits an entirely different post. What does affiliated mean? Does this include conservative think tanks in the United States like Heritage, or the Conservative Party’s Central Research Department? And wouldn’t CASS and VASS (the Vietnamese equivalent of CASS) be part of this category? After all, they are affiliated to the Communist Party, and Chinese and Vietnamese line ministries have their own think tanks.

I don’t want this to be a totally anti-Go-to-Think-Tank-of-the-Year rant. As I said before, the ranking has created an opportunity for debate and discussion on think tanks and this is good. But this ought to lead to a proper discussion about think tanks, the roles they play and how they may be able to contribute to their contexts (local and/or global).

The list of questions and criteria on page 56 is the best part of the document and an important contribution to the think tanks debate. It provides a guideline of sorts to study think tanks in greater detail and to promote a more intelligent debate. Focusing on the list and the ranking, I think, robs us of James McGann’s and his team’s undeniable capacity to do this and leaves us with a bitchy Oscar nominations season for researchers.

on rankings

This month, Prospect Magazine announced the winners of its Think Tank of the Year Award. The Institute for Government won the top spot, with Policy Exchange claiming the prize for the best think tank publication of the year (“Making Housing Affordable,” by Alex Morton), the European Council on Foreign Relations taking the award for the best Britain-based think tank dealing with non-British affairs, and ResPublica named the “One to Watch.”

The panel included a senior adviser to David Cameron, a member of the House of Lords, a think tank veteran, and experienced journalists. Their verdict reflects a particular kind of deliberation that clearly attempts to understand the complexity of the task of picking the ‘think tank’ of the year.

The judges described the Institute for Government as “indispensable,” praising its work on financial consolidation, which helped improve the policymaking process leading up to the CSR. Andrew Adonis, the new head of the think tank, accepted the award but was at pains to point out that he deserved little of the credit.

They were also impressed by Alex Morton’s “fresh, thorough and ambitious set of proposals for radically overhauling housing and planning policy in this country.” Published in August, the Policy Exchange report has been widely discussed—and, said the judges, rightly so.

For the European Council on Foreign Relations, special credit was given to its power audits of EU-US and EU-UN relations and its work on international crisis management. And finally, the judges commended Phillip Blond’s achievement in creating ResPublica: a think tank with a distinctive agenda and set of values, which has also published a handful of deeply stimulating reports over the past 12 months.

Also this month, many of us received a few emails from James McGann urging us to respond to a survey to choose the top ‘go to’ think tanks all over the world. The survey is a massive list of think tanks (down from an even longer one) for the US, the UK, Europe, Latin America, Africa and Asia, as well as for various policy or topic areas.

A number of dimensions are explored, and respondents are asked to assess the quality of the think tanks’ research, their communication competencies, their degree of influence, etc.

But how can we compare think tanks in different countries? How can we judge a think tank in the US -endowed and free to speak its collective mind- to be better than one in Ecuador -competing for funds and mindful of what it says and when?

And how relevant is this comparison? Donors are not thinking: should I fund a think tank in the US or a think tank in Kenya? And a think tank in Kenya may look at Brookings for inspiration but cannot copy everything it does -nor should it compare itself with it. An index that compares a US-based and a Kenya-based think tank is really comparing the countries -and there are better indices for that.

The regional rankings do not make sense either: naturally, Brazilian and Argentinian think tanks dominate the list in Latin America -even when their focus is entirely domestic.

In the future, research funders should follow Prospect’s example and promote the setting up of these kinds of nationally focused and led awards. Otherwise we risk promoting a popularity contest -and the shallowness that comes with it.
