
Another year, another ranking of think tanks (and surprise, surprise, Brookings is still the best)

I’ll accept that James McGann’s effort to identify and rank all the think tanks in the world has some positive outcomes. First of all, it has people talking about think tanks -and some think tanks are even becoming aware that there is a debate out there about them. Second… no, that is it. [Also have a look at Goran Buldioski’s blog on the same subject]

I am still of the opinion that going beyond the counting and study of individual think tanks (and their immediate systems) is useless and misleading. Here are five reasons why I do not support this ranking, and then a longer semi-rant about the document.

  1. Think tanks cannot be de-linked from their political, social and economic environment, since think tanks define themselves in relation to the other players in the system. Brookings cannot be described without reference to US bipartisanship -when we say independent research in the US we mean independent of either party (as well as of other interests). But independent means something entirely different in China, India, Brazil, or Argentina. Global and regional rankings are therefore unhelpful when the focus of think tanks is local (local not in the sense of a particular town or neighbourhood, but of their direct interactions).
  2. The list is too diverse to be relevant. The definition of ‘think tanks’ has improved since I last commented on it to include politics. But it now includes some organisations that cannot possibly be compared with the rest. Let’s put it this way: if I define a mobile phone as a device that allows me to make phone calls while on the move, I could be tempted to include laptops (after all, I can make Skype calls ‘on the move’) but I wouldn’t, because it would be confusing and unhelpful. A mobile is one thing and a laptop is another. Each may do things that the other can also do, but that does not make them the same thing. Amnesty International, Human Rights Watch, Transparency International and the various foundations (funders rather than researchers) are all included… how useful is it to compare them with IPAR in Rwanda or GRADE in Peru?
  3. It is still based on perception rather than thoughtful analysis. Thoughtful analysis would have required the development of a database with answers to all the questions or criteria presented on page 56. These are good questions, but the nominators were not asked to provide answers to them, only to use them to think about their nominations. This means that it is all about presentation rather than content: still a popularity contest among people who clearly cannot know about every context and must therefore rely on what is accessible to them (this is obvious when one realises that most of the top non-US think tanks are either focusing on, or working under the banner of, international development, security and foreign affairs). The kind of analysis that I am attempting, and that Goran Buldioski, for instance, is undertaking in Eastern Europe, is absent.
  4. A ranking must have a clear definition of what the top spot implies: top 25 by revenue, by number of staff, by number of publications, by happiness of their staff, etc. It is the same as with sport: Usain Bolt is the fastest sprinter. The Ballon d’Or, on the other hand, is a perception-based award given to the best football player according to the votes of coaches and captains of international teams, as well as journalists from around the world. So you either define why one wins or you define who votes; but you cannot keep both unclear or hidden.
  5. It is dangerous. It creates incentives to invest in profile raising and visibility rather than to focus on research and research capacity. The director of a think tank that is not on the list emailed me, worried about their absence, to ask: what should we do? Given that they are one of the most influential think tanks in their country, undertake research of the highest quality and are running groundbreaking and innovative initiatives (copied all over the world), my answer is: nothing. And those who make it onto the list because they are popular rather than good are incentivised against doing anything about it, because they may believe that the list confers credibility on them.

My recommendation (if some sort of ranking is what we want) therefore continues to be the promotion of national think tank awards like the one run by Prospect Magazine. It is a shame, really, because this project has the potential to collect fantastic data on think tanks; unfortunately, because of the focus on the ranking, a huge opportunity is being lost.

On the report itself, here are some preliminary comments after a single read (I promise to give it another go):

The first thing I notice is that at the top of the list are Brookings and Chatham House. I often go to their websites to find out a bit more about them and see that, yes, they have fantastic research and a wide range of products and are clearly at the top of their game. And when I can I go to Chatham House events. So far so good, I guess. But then, second and third are Amnesty International and Transparency International. I know these organisations well. They are quite active in my country (Peru) but they are international campaigning NGOs, not think tanks. Transparency International participates in electoral processes as an observer. Is this the role of a think tank? Amnesty International campaigns for human rights and against their violations. I don’t think that researchers lobbying for more funds and freedom for think tanks in many developing countries would like their governments to think that this would mean more space for TI and AI to operate there too. Apples and oranges?

Then I remember that the winner of Prospect Magazine’s 2010 Think Tanks Award was the Institute for Government; I check the top non-US think tanks and find other UK think tanks in the list, but the Institute for Government is nowhere to be found. In fact, it is not mentioned in the whole document. That is odd but, OK, not all rankings have to agree. What about Policy Exchange? Policy Exchange was set up by supporters and members of the Conservative Party and was instrumental in developing the ideas that shaped the arguments that won the 2010 election and that are guiding the new government’s policy agenda. There is a fantastic indirect account of this in Peter Snowdon’s book, Back from the Brink. No, Policy Exchange is not listed either.

To make sure I am not missing anything I jump to the table for Europe (page 31) but no luck. They are not there. But the Overseas Development Institute is.

Now, as much as I like ODI, I am sure that it is not more influential than Policy Exchange. So, wait a minute, maybe this ranking is not about influence but about worth? About value? Reputation? Is it about finding the ones most capable of speaking truth to power? But why then have an index every year? What can change year on year to get a new one into the ranking? An annual index suggests that think tank quality can change in a short period of time and that it is therefore possible for an unknown organisation to make it to the top if it happens to do all the right things. Is that possible in this ranking? CGD more or less did it, on the basis of a good combination of research and communications. But is it possible for think tanks in small countries focusing on local issues? And is it really a worthy end?

The more I see Chatham House and other security and international relations think tanks, the more it feels as if the theme of this year’s ranking is foreign policy or international development -maybe that is what this year was about. Or maybe this is what the annual ranking should be about: focus on a single theme so that more and better analysis can be done of each think tank.

Never mind, let’s get back to it. On to Latin America, which I know a bit. The list includes the Centro de Estudios Publicos (CEP) from Chile, the Centro de Implementacion de Politicas Publicas para la Equidad y el Crecimiento (CIPPEC) in Argentina, the Instituto Libertad y Democracia (ILD) in Peru (which, by the way, appears at both 15 and 24), and CEPAL (the UN’s Economic Commission for Latin America and the Caribbean, or ECLAC in English). This is interesting. CEPAL is the only truly regional policy research centre in the list -but it is a UN body. CEP and CIPPEC are clearly focused on their own countries -and they are certainly influential there, but not in my country, Peru. And ILD was influential (granted, it has been one of the most influential organisations in the world, led by its director Hernando de Soto) but it has almost no public presence in Peru and cannot really be compared with other Peruvian and Latin American think tanks if one quickly browses through its work and publications. ILD is a fantastic analysis-based consultancy working across the developing world on the basis of some research done in the 1980s. If it makes it to the top of the list, it is far more interesting to find out why this is the case than to note its place in the ranking: is it because this is what policymakers value, or were the respondents from Africa or Asia, where it does most of its work?

In any case, policy in Peru is influenced by (among think tanks) CIUP, GRADE (which is mentioned), IEP, and others that are not on the list. This is a perfect example of visibility: it is sometimes my impression that GRADE is quite successful in reaching audiences in DC and London and is therefore well known globally, while IEP and CIUP might be more focused on domestic policy debates and hence less well known beyond the country or region -or certain research communities. This probably reflects their origins, mandates and business models. So even within a country, comparison is difficult. Who is to say, though, whether one is better than the other based on their choice of audiences? [This section has been edited; see comments below.]

Back to Latin America (and for that matter, Europe). In Latin America there isn’t a regional government, so what is the point of a regional ranking? So what if the top think tank is Brazilian? Is it informing the Chilean government? Is it valuable for Colombia? Maybe in Europe ‘European think tanks’ make more sense, but then is this why domestically focused think tanks are not mentioned? Clearly, international reviewers would not know who the movers and shakers of Peruvian, British, or Spanish policy are. (Again, a point in favour of national awards.)

So maybe the regional focus has little to do with where the think tanks do their influencing and more with quite simply where they are based. But if this is the case then once again we’d be separating think tanks from their context -and this is not right.

And now on to Africa. This list looks a bit messy, to say the least. The first seven are from South Africa (no surprises there). But number 8 is a regional research network made up of researchers based in think tanks across Africa -I’d like to call it a think tank but I am not sure how it compares with the others. And then it lists a few organisations that can hardly be called organisations at all and are only popular or known because they are among the only ones in their countries. Others are in the process of getting there, but are not there yet. A tiny bit of analysis would have provided sufficient information to disqualify them from any ranking, and to identify many others who may be more worthy of a mention.

Anyway, what is the point of saying that organisation xyz is among the top 25 in Africa? How does it compare with the Latin American ones, for instance?

What happened to the debate on think tanks in South Asia? I’ve been avidly following a great debate in Indian newspapers on think tanks that would suggest a fantastic opportunity for a study such as this one. And how useful is it to compare them with think tanks in East and Southeast Asia? In fact, how useful is it to compare think tanks in China or Vietnam with those in Japan, Indonesia and South Korea? Our overview study on think tanks and politics in the region showed foundational differences between them that merit more rather than less national focus.

The lack of analysis is telling of the limits of this type of research. A country- or region-focused study (rather than a ranking) would have been much richer and more useful.

The thematic rankings are also quite interesting. The fact still remains that one cannot separate theme from politics -and politics are always local.

I would have loved an explanation for Chatham House coming ahead of IDS in the ranking on international development. Chatham House is by far a better think tank than ODI and IDS on foreign policy (and, let’s face it, it is a fantastic think tank in general and its contribution to the international development debate is invaluable), but given that international development policy is still largely dominated by DFID, that DFID’s research programme is dominated by IDS and ODI (and not Chatham House), and that IDS alumni roam the corridors of DFID, I cannot understand the ranking. More explanation is needed, please.

Also, why is Fundacao Getulio Vargas included in this table? It is not focused on international development policy; its focus is simply on policy. The ‘international development’ prefix is added by ‘northern’ organisations to describe policies for or of developing countries. FGV deals with economic, business and legal research for the development of Brazil. How is this different from the research done by Brookings or IPPR for the development of the US and the UK respectively? (Patronising?)

Also, FGV is included at the foundation level, not at the level of its centres or programmes; however, the Pew Research Center rather than the Pew Charitable Trusts is included. Why? I would suggest that it has to do with the narrow and shallow focus on a global index instead of a desire to understand the richness of the histories of these organisations.

Then it gets confusing -think tanks appear in more than one category but at totally different levels, and others that one would expect to find are gone. Yes, this is all possible, as most think tanks will be good at one thing and not at all of them; but Chatham House, for example, is the top UK think tank in most lists yet sits behind the International Institute for Strategic Studies when it comes to its core area of expertise: foreign policy and security. This makes no sense.

The potentially most useful list (domestic economic policy) ends up being a US-focused one. This further illustrates the limitations of a global ranking and its bias towards international development and foreign affairs think tanks, which are more easily identifiable in the blogosphere or in popular communication channels than domestically focused ones.

Then the special categories. Most innovative policy idea: a great category, but what have they been nominated for? What was the idea that got Brookings to the top? Again, another missed opportunity to provide intelligent insights into the rich and complex reality of think tanks. The same goes for the outstanding policy research programme category. Which programme got ODI 15th place? ODI has quite a lot of programmes -and also projects that we call programmes because they are larger than the usual small projects we run. So which one was it? The Africa Power and Politics Programme? The Research and Policy in Development Programme? The Humanitarian Policy Group’s Integrated Programme? The Chronic Poverty Research Centre? It is important to know, because some of these are delivered with other organisations, so ODI could not take all the credit.

I got a bit bored and jumped over some tables until I got to the best government-affiliated think tank -WBI? Nice to know that the World Bank is considered a government. If the WB is a ‘government’, would the UN not be one too? (UNU-WIDER and CEPAL are in the other tables.) What about think tanks entirely (or almost entirely) funded by their governments or by international cooperation agencies?

And then, party-affiliated think tanks -an important addition to any work on think tanks. This merits an entirely different post. What does affiliated mean? Does this include conservative think tanks in the United States like Heritage, or the Conservative Party’s Central Research Department? And wouldn’t CASS and VASS (the Vietnamese equivalent of CASS) be part of this category? After all, they are affiliated with the Communist Party, and Chinese and Vietnamese line ministries have their own think tanks.

I don’t want this to be a totally anti-Go-to-Think-Tank-of-the-Year rant. As I said before, the ranking has created an opportunity for debate and discussion on think tanks and this is good. But this ought to lead to a proper discussion about think tanks, the roles they play and how they may be able to contribute to their contexts (local and/or global).

The list of questions and criteria on page 56 is the best part of the document and an important contribution to the think tank debate. It provides a guideline of sorts for studying think tanks in greater detail and promoting a more intelligent debate. Focusing on the list and the ranking, I think, robs us of James McGann’s and his team’s undeniable capacity to do this and leaves us with a bitchy Oscar-nominations season for researchers.

Comments
  1. Enrique, I feel that we pay too much attention to this study. I just published a longer post on my blog: http://goranspolicy.com/mirror-mirror-wall-tank-world/ and felt a bit awkward. And while getting things off my chest and pointing out the inconsistencies in the rankings, I came to the key question: how much damage does this study do to the think tank sector?
    The surprising answer is: not that much. It is like a commercial for people outside the sector to hear the words ‘think tank’, and some think tanks (those that only care about the numbers they reach) publish it on their websites. The rest is just not that important. Donors will not give money to better-ranked think tanks, policymakers and their advisers will not turn to a think tank because it is in the top 5, and so on.

    However, the key question is whether there is a critical mass of concerned people in the sector to come up with more meaningful and useful awards, such as the one of Prospect Magazine.

    January 23, 2011
    • Thanks for the comment, Goran.

      I don’t think that donors will be paying attention only to this index -but I am sure that, given that most donors’ staff are not properly immersed in and living in a region or country, they will use this list as a proxy for future decisions.

      Interestingly, though, I am aware of discussions going on within think tanks about how to use this index as PR -and I was consulted, in the run-up to its publication, by a few think tanks worried about not being on the list. These organisations are making difficult choices about how to invest (sometimes) limited resources, or whether or not to invest their abundant resources in developing their capacities at all.

      Let’s follow it up on Wednesday.

      January 23, 2011
  2. martin benavides

    Enrique, I thought you did pay attention to that index. I remember a PPT of yours that mentioned the ODI ranking. At GRADE we had not paid attention to it until the news bounced into El Comercio. I agree with most of your ideas. What does strike me as a reflection of a total ignorance of GRADE’s contributions is your claim that we look more to DC than to domestic policy. I invite you to review the case studies on our impact on social policies (carried out by people external to us). You are not fair to our local influence in education, health, poverty, rural development, etc. Your claim surprises me and makes me wonder about your real knowledge of the dynamics between research and policy in Peru.

    February 2, 2011
    • Thank you, Martín. At some point we mentioned the ranking (when we first started looking at it) but since then I have used it to draw attention to its risks. On GRADE, I stand corrected. My suggestion was not intended to belittle GRADE’s importance in the national space, but to reinforce that GRADE has very good reach and presence in the international space. In fact, that is the impression I get when I notice GRADE’s presence in global research spaces –I run into GRADE researchers more than into researchers from other Peruvian think tanks; at least that is my perception. And it is that international presence that I was highlighting and that is relevant for this type of ranking.

      It is also my impression that GRADE, unlike CIUP and IEP, because of its business model, topics of interest, history, etc., tends to be less focused on the debate at the political level and more on the technical level. (Which does not imply the opposite for the others –lest they correct me too.)

      In any case, I am going to correct this post. For the record, it said this:

      This is a perfect example of visibility: GRADE’s main audience (and clientele) are the DC bureaucrats of the WB and the IDB while IEP and CIUP are much more focused on domestic policy and political debates. So even within a country, comparison is difficult. Who is to say who is better than the other based on their choice of audiences?

      In any case, this illustrates the need to pay much more attention to the details of each case, each country, each marketplace of ideas.

      February 2, 2011
  3. anonymous

    This ranking is indeed a disservice to the think tanks themselves, to governments and, most importantly, to the donors who fund think tanks, especially in developing countries. It provides a good resource for PR without serious grounds. It took me three months of working in a think tank in a developing country to realize that this organization, which is the least transparent and has the most inefficient management of all the types of organizations I’ve come across (it’s like a social club; things are done through friends here and there, including in the authorities), has been included in the world’s top 30 on Transparency and Good Governance. What a disgrace to the initiators of this ranking! Just stop doing it; it really is counterproductive and, as I said, a disservice!

    January 23, 2012

