I have not been a fan of Jim McGann’s think tank index. There are no surprises there. I think the energy that goes into this could be used more productively elsewhere; for instance, gathering more in-depth information on specific organisations, their strategies, their tactics, etc. The 2011 global think tank index presents some improvements on previous editions, but it is still far from being what it wants to be. McGann says that he is doing this in part as a response to being asked about think tanks around the world, but this is no substitute for due diligence. I learn far more about think tanks from a conversation with a director like Orazio Bellettini or Simon Maxwell than from the entire report. It is still a popularity contest, and it is still not clear who votes.
The main text of the report suggests that policymakers should use it to guide their choice of think tanks when seeking advice, but this strikes me as rather naive: it assumes policymakers do not already have their own networks. It is also quite worrying, since most countries do not get any mentions (and many get only one or two), which in effect narrows the options available to policymakers if they chose to use this as a guide. And that would be a mistake, because there are many more excellent think tanks that do not even get a mention in the ranking.
There are some positives, though. I must accept that the inclusion of more categories describing specific achievements is a step in the right direction. But without further detail on what the think tanks achieved in each category, I am afraid the rankings remain unhelpful. Also positive is the definition of think tanks used; it includes organisations affiliated to parties, governments, and others. (Unfortunately, the definition has not always been applied to the rankings.)
It seems that there was a clear effort to address some of the criticisms that the ranking has received in the past. At least some of them are acknowledged in the text.
And, without a doubt, the index gets people talking about think tanks at least once a year. This is also quite good and gives us an opportunity to reflect on these organisations and the functions they fulfil.
Still, the index leaves me with a lot of questions:
- Among the top think tanks in the world we still find organisations such as Amnesty International, Transparency International, and Human Rights Watch. I am still not sure these qualify as think tanks. AERC, also not a think tank, is listed, but not its Latin American equivalent, LACEA (respondent bias?). Is CODESRIA a think tank? And IDRC, again, is not a think tank either.
- TED, I am sure, is not. And while I am on the subject of TED, has anyone wondered why it is only mentioned in the technology category? Yes, that is how it started, but anyone who is a fan would know that technology plays only a part in what it does. Perception is a powerful thing.
- I am still unsure about the global or regional lists. I am not sure what to do with a list that compares Chile’s Centro de Estudios Públicos with Brazil’s Fundação Getulio Vargas. Why is one better than the other?
- Or with a single list that includes think tanks from Australia, Central Asia, South Asia, and East and Southeast Asia.
- In general, the absence of Latin American think tanks from the global lists makes me suspicious of the accuracy of the process (not of the researchers’ intentions).
- There are no surprises in the Mexico, Canada, and the Caribbean list. How could any Caribbean think tank make it onto it? Even here, what is the logic behind including Mexico in this list and not in one with Central America? Mexico’s influence in Central America is certainly more relevant.
- As a colleague, Norma Correa, has suggested, maybe a more accurate list would be one including Mexico, Brazil, and Argentina; they are similar in size and intellectual influence in the region, and their academic communities are comparable.
- How do developing country think tanks such as Brazil’s Fundação Getulio Vargas, the Bangladesh Institute of Development Studies, India’s Centre for Development Alternatives, and the Chinese Academy of Social Sciences end up in the international development category? Isn’t what they do the same thing that domestic think tanks in developed countries do?
- Chatham House is an interesting second place in the Environment Think Tank category. It seems to me that its approach comes from a foreign policy or security perspective, so I am not sure how comparable it is with, say, the International Institute for Environment and Development in the UK.
- Health: the Peterson Institute for International Economics is number 26 on the list. A quick search on its website shows that it does not do any health research. This is interesting in itself: it shows the power of perception when judging think tanks, and that the team has not fact-checked the lists. And how does one compare a political think tank like Heritage, a corporately funded centre like the Kaiser Permanente Institute for Health Policy, university-based health research centres (there are a few), and international development health policy research centres such as CGD?
- Also, by the way, India’s Institute of Economic Growth is mentioned twice in this list.
- I don’t quite get the international economic policy think tank category. Is it about international economics or about influencing international bodies? It would be useful if each category was described in more detail and examples were given of why the think tanks have been ranked in that order.
- What is a transparency and good governance think tank? It certainly helps to explain the presence of human rights NGOs, but it seems to me that think tanks, by their nature, can support transparency and good governance even if they do not actively seek to change ‘transparency’ policies.
Now, the section on the new categories is much more interesting and I had been looking forward to it. I have argued before that this is far more useful than regional lists. Let’s see:
- Most innovative policy ideas: there is a long list… yes… but what was the most innovative policy idea? Why was it innovative? And how does one judge this? If I could have commented on only one issue, it would have been this. If all efforts could have been focused on one thing, it should have been identifying the policy ideas proposed by these think tanks and explaining why they are innovative, why they matter, and how or why they came out on top.
- Google Ideas is the top new think tank. Is it a think tank? What is it? It has been mentioned in media articles and it has a Twitter account, but beyond that… what? ResPublica is new, but not that new.
- The category Think Tanks with Outstanding Policy-Oriented Public Policy Research Programs should win an award for the most confusing name. Is it policy-oriented OR public policy research? Or is it policy-oriented public policy research? I don’t quite get it. Still, their work must have been outstanding. But why? What are the criteria being used? And what are these programmes that are so outstanding that they deserve our attention?
- I really liked the inclusion of the category on the use of the internet. And I was glad to see ODI mentioned: Nick Scott’s work has been excellent in this field. But I was not quite sure why ODI was not higher up than, say, the Council on Foreign Relations or the Fraser Institute. I know that the internet is far more than just the corporate website, but this is still a good indicator and ODI’s is certainly far more developed. (I think there is no risk that people might think I am biased in its favour; I have often been rather critical of it in the past.) And TED, really? All the way down at number 23? They rule the web. Maybe this is because very few of the respondents considered TED to be a think tank, but it got just enough votes to make it onto the list.
- Among the top four think tanks with the greatest impact are Transparency International, Human Rights Watch, and Amnesty International. Are these think tanks? But most importantly, what was their impact? I must assume that the list is global in the sense that it includes think tanks from all over the world, rather than that the impact was global in nature. But, again, what was the impact being rewarded? And how did they achieve it?
- I was pleased to see the category of university affiliated think tanks. It recognises that these organisations exist and can be described as such. But, again, why are they the best? Is it because they found a good balance between teaching and research? Did they tackle a particularly important issue?
- The definition is also important. Are they affiliated as IDS is affiliated to Sussex (IDS is an independent charity), or as IDEAS is affiliated to LSE, or as CIUP is affiliated to the Universidad del Pacífico (they are research centres of their universities)?
- I would say the same about the government affiliated think tank category; it is good that it has been included. But is it worth putting governments and multilateral agencies in the same bag? And how is affiliation defined and applied? According to the definition in the document: a part of the structure of government. Is FLACSO Ecuador affiliated because it gets funds from the government, because it is part of the Ecuadorian education system, or because it was established through an agreement between the State and the international FLACSO system? Would that not make IDS or ODI, which get quite a lot of their funding from DFID, affiliated too? And RAND? (I must say that I am always surprised to find FLACSO Ecuador in the ranking. FLACSO Costa Rica or FLACSO Argentina are far more popular, at least for me… perception, perception, perception.)
- What does it mean to be a party affiliated think tank? Affiliation can mean many things, but in this case it cannot mean the same as it does for the government affiliated category. The German model of political party foundations cannot be compared with the British model of party-friendly think tanks. Yes, DEMOS influenced the New Labour government, but it is pretty active in developing links with the new Coalition government, too. The same could be said about the Centre for Policy Studies: yes, it is close to the Tory Party, but it is not of the party. The definition used by the index is: formally affiliated with a political party, but I don’t think the law would allow this in the UK. It certainly would not allow it in the United States. Maybe a definition that incorporates informal affiliations would have been more appropriate.
This is rather painful. I thought about not writing anything about the index this year. But having read it I felt that some of this had to be said, again. And I am sure I won’t be the only one.
The points I am making this time around are the same as in previous years:
- The contexts and organisations being compared are so different that any comparison is quite useless unless it is accompanied by detailed analysis.
- Absences (at least the ones I have noticed) are explained by the fact that this index relies on biased perceptions and little if any corroboration. As McGann himself accepts, they do not have a budget or the capacity to visit (not even virtually?) the think tanks. And I doubt that the respondents themselves have spent much time thinking about their nominations.
- Perception is important for think tanks, though; and this may be the problem. Without clear examples explaining why a think tank is perceived as good online, influential, or innovative, how can they learn and improve?
I insist: ditch global rankings and focus on national exercises with detailed analysis of the reasons why a think tank comes top of a category (e.g. what was the issue it worked on, the programme it developed, etc.?). This would make it easier to fact-check and to ensure that the definitions and criteria are actually being applied to the assessment. It would also get a conversation started at the level that matters, and not just in DC or the UN where everyone loves a cocktail. We could then compare across them and learn a great deal more.
Until next year, I am sure.