Are all think tank awards useless? In this article I argue that awards can be developed to celebrate good work, increase the visibility of think tanks in their societies, and contribute to the development of the think tank community as a whole. After reviewing three rankings: the UPenn Go To Think Tank ranking, the RePEc economics think tanks ranking, and the ICCG environment and energy think tank map and ranking, and comparing them with the Prospect Award, I argue that the Prospect Magazine Award offers a model to follow and adapt in different countries, and an alternative to global or regional de-contextualised rankings.
This is old news, but I feel it is worth sharing reviews by other researchers of the think tank index produced by James McGann.
Jan Trevisan, at the International Centre for Climate Governance, has published an interesting critique. There is not much I disagree with in his assessment. He points at several mistakes in the analysis regarding think tanks in the sector he is most familiar with. And this is not surprising, because it is difficult for any one person (or team) to delve into the detail that is necessary to adequately assess all the think tanks in the world. It makes me think that we should not only attempt to think of think tanks at a country level (which is what I have argued before) but also by theme or issue.
My only disagreement, I guess, is that the ICCG does not consider government or party think tanks in its map. I think that this is a mistake, as in many developing countries (and indeed developed countries) it is not possible to find truly independent centres. Affiliation should not be confused with lack of autonomy.
A couple of years ago, Christian Seiler and Klaus Wohlrabe published their own, very well researched, critique of the 2009 think tank index. Their critique, besides identifying several inaccuracies, focused on the methodological weakness of the ranking. I remember that at the time I offered a similar critique.
I think it is clear that the method is inaccurate and the output is therefore flawed. At least, however, it has got many think tanks and researchers thinking about it. And these critiques and analyses offer far more insights into the world of think tanks than the ranking will ever do.
I have not been a fan of Jim McGann’s think tank index. There are no surprises there. I think that the energy that goes into this could be used more productively elsewhere; for instance, gathering more in-depth information on specific organisations, their strategies, their tactics, etc. The 2011 global think tank index presents some improvements on previous editions, but it is still far from being what it wants to be. McGann says that he is doing this in part as a response to being asked about think tanks around the world, but this is not due diligence. I learn so much more about think tanks from a conversation with a director like Orazio Bellettini or Simon Maxwell than from the entire report. It is still a popularity contest -and it is still not clear who votes.
The main text of the report suggests that policymakers should use this to guide their choice of think tanks when seeking advice but this strikes me as rather naive: it assumes policymakers do not already have their own networks. It is also quite worrying since most countries do not get any mentions (and many get 1 or 2) and this in effect narrows the options that policymakers would have if they chose to use this as a guide. And this would be a mistake because there are many more excellent think tanks that do not even get a mention in the ranking.
There are some positives, though. I must accept that the inclusion of more categories describing specific achievements is a step in the right direction. But without further detail as to what the think tanks achieved in each, I am afraid that the rankings remain unhelpful. Also positive is the definition of think tanks used; it includes organisations affiliated to parties, governments, and others. (Unfortunately, the definition has not always been applied to the rankings.)
It seems that there was a clear effort to address some of the criticism that the ranking has received in the past. At least some are acknowledged in the text.
And, without a doubt, the index gets people talking about think tanks at least once a year. This is also quite good and gives us an opportunity to reflect on these organisations and the functions they fulfil.
Still, the index leaves me with a lot of questions:
- Among the top think tanks in the world we still find organisations such as Amnesty International, Transparency International, and Human Rights Watch. I am still not sure these qualify as think tanks. AERC, also not a think tank, is listed, but not its Latin American equivalent, LACEA (respondents’ bias?). Is CODESRIA a think tank? IDRC, again, is also not a think tank.
- TED, I am sure, is not. And while I am on the subject of TED, has anyone wondered why it is only mentioned in the technology category? Yes, that is how it started but anyone who is a fan would know that technology plays only a part in what it does. Perception is a powerful thing.
- I am still unsure about the global or regional lists. I am not sure what to do with a list that compares Chile’s Centro de Estudios Públicos with Brazil’s Fundação Getulio Vargas. Why is one better than the other?
- Or with a single list that includes think tanks from Australia, Central Asia, South Asia, and East and Southeast Asia.
- In general, the absence of Latin American think tanks from the global lists is suspicious (of the accuracy of the process, not of the researchers’ intentions).
- There are no surprises in the Mexico, Canada, and the Caribbean list. How could any Caribbean think tank make it into it? Even here, what is the logic behind the inclusion of Mexico in this list and not in one with Central America? Mexico’s influence in Central America is certainly more relevant.
- As a colleague, Norma Correa, has suggested, maybe a more accurate list would be one including Mexico, Brazil, and Argentina; they are similar in size, intellectual influence in the region, their academic communities are comparable, etc.
- How do developing country think tanks, Brazil’s Fundação Getulio Vargas, the Bangladesh Institute of Development Studies, India’s Centre for Development Alternatives, and the Chinese Academy of Social Sciences, get into the international development category? Isn’t what they do the same thing that domestic think tanks in developed countries do?
- Chatham House is an interesting 2nd place in the Environment Think Tank category. It seems to me that its approach is from a foreign policy or security perspective and so I am not sure how comparable it is with, say, the International Institute for Environment and Development in the UK.
- Health: The Peterson Institute for International Economics is number 26 on the list. A quick search on the website shows that it does not do any health research. This is interesting in itself. It shows the power of perception when judging think tanks. Also that the team has not fact checked the lists. And how does one compare a political think tank like Heritage, a corporately funded centre like the Kaiser Permanente Institute for Health Policy, university-based health research centres (there are a few), and international development health policy research centres such as CGD?
- Also, by the way, India’s Institute for Economic Growth is mentioned twice in this list.
- I don’t quite get the international economic policy think tank category. Is it about international economics, or about influencing international bodies? It would be useful if each category was described in more detail and examples were provided of the reasons why the think tanks have been ranked in that order.
- What is a transparency and good governance think tank? It certainly helps to explain the presence of human rights NGOs, but it seems to me that think tanks, by their nature, can support transparency and good governance even if they do not actively seek to change ‘transparency’ policies.
Now, the section related to the new categories is much more interesting and I had been looking forward to it. I have argued before that this is far more useful than regional lists. Let’s see:
- Most innovative policy ideas: There is a long list… yes… but, what was the most innovative policy idea? Why was it innovative? And how does one do this? If I could have commented only on one issue it would have been this. If all efforts could have been focused on something it should have been to identify the policy ideas proposed by these think tanks and explain why they are innovative, their importance, and how or why they came up on top.
- Google Ideas is the top new think tank. Is it a think tank? What is it? It has been mentioned in media articles and it has a twitter account but beyond that… what? ResPublica is new but not that new.
- The category Think Tanks with Outstanding Policy-Oriented Public Policy Research Programs should win an award for the most confusing name. Is it policy oriented OR public policy research? Or is it policy-oriented public policy research? I don’t quite get it. Still, their work must have been outstanding. But why? What are the criteria being used? And what are these programmes that are so outstanding that they deserve our attention?
- I really liked the inclusion of the category on the use of the internet. And I was glad to see ODI mentioned: Nick Scott’s work has been excellent in this field. But I was not quite sure why ODI was not higher up than, say, the Council on Foreign Relations or the Fraser Institute. I know that the internet is far more than just the corporate website, but this is still a good indicator and ODI’s is certainly way more developed. (I think there is no risk that people might think I am biased in its favour; I have many times in the past been rather critical of it.) And TED, really? All the way down at number 23? They rule the web. Maybe this is because very few among the respondents considered TED to be a think tank, but it got just enough votes to make it onto the list.
- Among the top 4 of the think tanks with the greatest impact are Transparency International, Human Rights Watch, and Amnesty International. Are these think tanks? But most importantly, what was their impact? I must assume that the list is global in the sense that it includes think tanks from all over the world rather than that the impact was global in nature. But, again, what was the impact being rewarded? And how did they achieve it?
- I was pleased to see the category of university affiliated think tanks. It recognises that these organisations exist and can be described as such. But, again, why are they the best? Is it because they found a good balance between teaching and research? Did they tackle a particularly important issue?
- The definition is also important. Are they affiliated as IDS is affiliated to Sussex (IDS is an independent charity) or as IDEAS is affiliated to LSE or CIUP is affiliated to the Universidad del Pacifico (they are research centres of the universities)?
- I would say the same about the government affiliated think tank category; it is good that it has been included. But is it worth putting governments and multilateral agencies in the same bag? And how is affiliation defined and applied? According to the definition in the document: a part of the structure of government. Is FLACSO Ecuador affiliated because it gets funds from the government, because it is part of the Ecuadorian education system, or because it was established through an agreement between the State and the international FLACSO system? Would that not make IDS or ODI, which get quite a lot of their funding from DFID, affiliated too? And RAND? (I must say that I am always surprised to find FLACSO Ecuador in the ranking. FLACSO Costa Rica or FLACSO Argentina are far more popular -at least for me… perception, perception, perception.)
- What does it mean to be a party affiliated think tank? Affiliation can mean many things, but in this case it cannot mean the same as it does for the government affiliated category. The German model of political party foundations cannot be compared with the British model of party-friendly think tanks. Yes, DEMOS influenced the New Labour government, but they are pretty active in developing links with the new Coalition government, too. The same could be said about the Centre for Policy Studies: yes, it is close to the Tory Party but it is not of the party. The definition used by the index is: ‘Formally affiliated with a political party’, but I don’t think the law would allow this in the UK. It certainly would not allow it in the United States. Maybe a definition that incorporates informal affiliations would have been more appropriate.
This is rather painful. I thought about not writing anything about the index this year. But having read it I felt that some of this had to be said, again. And I am sure I won’t be the only one.
The points I am making this time around are the same as in previous years:
- The contexts and organisations being compared are so different that any comparison is quite useless unless it is accompanied by detailed analysis.
- Absences (at least the ones I have noticed) are explained by the fact that this index relies on biased perceptions and little if any corroboration. As McGann himself accepts, they do not have a budget or the capacity to visit (not even virtually?) the think tanks. And I doubt that the respondents themselves have spent much time thinking about their nominations.
- Perception is important for think tanks, though; and this may be the problem. Without clear examples explaining why a think tank is perceived as good online, influential or innovative, how can they learn and improve?
I insist: ditch global rankings and focus on national exercises with detailed analysis of the reasons why a think tank comes top of a category (e.g. what was the issue it worked on, the programme it developed, etc.). This will make it easier to fact check and to ensure that the definitions and criteria are actually being applied to the assessment. It will also get a conversation started at the level that matters, and not just in DC or the UN where everyone loves a cocktail. We could then compare between national exercises and learn a great deal more.
Until next year, I am sure.
The report also notes “… the rise of think tanks in Asia, Latin America, the Middle East and Africa.” Whether these think tanks offer fresh ideas or anything different from their counterparts in the West–given that they share the same funding sources and ideas–is not discussed in the report.
Goran Buldioski offers another take on the rankings in his blog Goran’s musings and some very interesting recommendations that I republish below:
As someone who works with think tanks, studies think tanks, and writes about think tanks, I see very little value in it. Therefore, it is high time to move to alternatives to this study:
- Best national think tanks (see the suggestion of Enrique Mendizabal), modeled on the UK ranking done by Prospect magazine. Note: thematic categories could also be established.
- Best policy study (for example, see the Policy Association for Open Society (PASOS) award for the best study penned by their members).
- Best advocacy campaign by a think tank (consisting of a series of policy products (from op-eds to books) and events (briefings, debates, seminars, conferences, training events, etc.)).
- Best online presentation. [Or maybe best online communications strategy?]
- Best design and communication strategy
I would add categories related to:
- Best long term policy research programme -that has maintained and developed a reputation on a specific issue with or without support
- Best prospective think tank -thinking about the future challenges of its country
- Best think tank to work for -to highlight the human capital development role that think tanks have
- Think tank to watch
I would also encourage categories related to the other members of the policy space:
- Best use of research based evidence by the media
- Best or most innovative funder
- Most evidence based political party manifesto
At this level, any category would provide an opportunity to have a real public conversation about think tanks and their contexts (and histories). No need for a ranking: one winner and some honorary mentions would do. And if there are disagreements, then these can be aired publicly and addressed rather quickly before the next award.
It would encourage communication between think tank directors and with their audiences (and possible members of the panel), serve as a platform from which to launch important new policy ideas and debates, contribute to the development of more informed cadres of journalists, policymakers and funders, etc.
Do have a look at Goran’s comments about the inconsistencies in the ranking -they are quite to the point.
I’ll accept that James McGann’s effort to identify and rank all the think tanks in the world has some positive outcomes. First of all, it has people talking about think tanks -and some think tanks are even becoming aware that there is a debate out there about themselves. Second… no, that is it. [Also have a look at Goran Buldioski’s blog on the same subject]
I am still of the opinion that going beyond the counting and study of individual think tanks (and their immediate systems) is useless and misleading. Here are five reasons why I do not support this ranking, and then a longer semi-rant at the document.
- Think tanks cannot be de-linked from their political, social and economic environment, since think tanks define themselves in relation to the other players in the system. Brookings cannot be described without reference to US bipartisanship -when we say independent research in the US we mean independent of either party (as well as of other interests). But independent means something entirely different in China, India, Brazil, or Argentina. Global and regional rankings are therefore unhelpful when the focus of think tanks is local (not local as in of this town or neighbourhood, but of their direct interactions).
- The list is too diverse to be relevant. The definition of ‘think tanks’ has improved since I last commented on it to include politics. But it now includes some organisations that cannot possibly be compared with the rest. Let’s put it this way: if I defined a mobile phone as a device that allows me to make phone calls while on the move, I could be tempted to include laptops (after all, I can make Skype calls ‘on the move’) but I wouldn’t, because it would be confusing and unhelpful. A mobile is one thing and a laptop is another. Maybe each will do things that the other can also do, but that does not make them the same thing. Amnesty International, Human Rights Watch, Transparency International and the various foundations (funders rather than researchers) included… how useful is it to compare them with IPAR in Rwanda or GRADE in Peru?
- It is still based on perception rather than thoughtful analysis. Thoughtful analysis would have required the development of a database with answers to all the questions or criteria presented on page 56. These are good questions, but the nominators were not asked to provide answers to them, only to use them to think about their nominations. This means that it is all about presentation rather than content: still a popularity contest among people who clearly cannot know about every context and must therefore rely on what is accessible to them (this is obvious when one realises that most of the top non-US think tanks are either focusing on, or working under the banner of, international development, security and foreign affairs). The kind of analysis that I am attempting, and that Goran Buldioski, for instance, is undertaking in Eastern Europe, is absent.
- A ranking must have a clear definition of what the top spot implies: top 25 by revenue, by number of staff, by number of publications, by happiness of their staff, etc. It is the same as with sport: Usain Bolt is the fastest sprinter. The Ballon d’Or on the other hand is a perception based award given to the best football player according to the votes of coaches and captains of international teams, as well as journalists from around the world. So you either define why one wins or you define who votes; but you cannot keep both unclear or hidden.
- It is dangerous. It creates incentives towards investing in profile raising and visibility rather than focusing on research and research capacity. The director of a think tank that is not on the list emailed me, worried about their absence: what should we do? Given that they are one of the most influential think tanks in their country, undertake research of the highest quality, and are running groundbreaking and innovative initiatives (copied all over the world), my answer is: nothing. And those who make it to the list because they are popular rather than good are incentivised against doing anything about it, because they may believe that the list confers credibility on them.
My recommendation (if some sort of ranking is what we want) then continues to be the promotion of national think tank awards like the one promoted by Prospect Magazine. It is a shame, really, because this project has the potential to collect fantastic data on think tanks; unfortunately, because of the focus on the ranking, a huge opportunity is being lost.
On the report itself, here are some preliminary comments after a single read (I promise to give it another go):
The first thing I notice is that at the top of the list are Brookings and Chatham House. I often go to their websites to find out a bit more about them and see that, yes, they have fantastic research and a wide range of products and are clearly at the top of their game. And when I can, I go to Chatham House events. So far so good, I guess. But then, second and third are Amnesty International and Transparency International. I know these organisations well. They are quite active in my country (Peru) but they are international campaigning NGOs, not think tanks. Transparency International participates in electoral processes as an observer. Is this the role of a think tank? Amnesty International campaigns for human rights and against their violations. I don’t think that researchers lobbying for more funds and freedom for think tanks in many developing countries would like their governments to think that this would mean more space for TI and AI to operate there too. Apples and oranges?
Then I remember that the winner of Prospect Magazine’s 2010 Think Tanks Award was the Institute for Government; I check the top non-US think tanks and find that, while there are other UK think tanks in the list, the Institute for Government is nowhere to be found. In fact, it is not mentioned in the whole document. That is odd but, OK, not all rankings have to agree. What about Policy Exchange? Policy Exchange was set up by supporters and members of the Conservative Party and was instrumental in the development of the ideas that shaped the arguments that won the 2010 election and that are guiding the new government’s policy agenda. There is a fantastic indirect account of this in Peter Snowdon’s book, Back from the Brink. No, Policy Exchange is not listed either.
To make sure I am not missing anything I jump to the table for Europe (page 31) but no luck. They are not there. But the Overseas Development Institute is.
Now, as much as I like ODI, I am sure that it is not more influential than Policy Exchange. So, wait a minute, maybe this ranking is not about influence but about worth? About value? Reputation? Is it about finding the ones most capable of speaking truth to power? But why then have an index every year? What can change year on year to get a new one into the ranking? An annual index suggests that think tank quality can change in a short period of time, and that it is therefore possible for an unknown organisation to make it to the top if it happens to do all the right things. Is that possible in this ranking? CGD did it, more or less, on the basis of a good combination of research and communications. But is it possible for think tanks in small countries focusing on local issues? And is it really a worthy end?
The more I see Chatham House and other security and international relations think tanks, the more it feels as if the theme of this year’s ranking is foreign policy or international development -maybe that is what this year was about. Or maybe this is what the annual ranking should be about: focus on a single theme so that more and better analysis can be done for each think tank.
Never mind, let’s get back to it. On to Latin America, which I know a bit. The list includes the Centro de Estudios Públicos (CEP) from Chile, the Centro de Implementación de Políticas Públicas para la Equidad y el Crecimiento (CIPPEC) in Argentina, the Instituto Libertad y Democracia (ILD) in Peru (which, by the way, is on both 15 and 24), and CEPAL (the UN’s Economic Commission for Latin America and the Caribbean, or ECLAC in English). This is interesting. CEPAL is the only truly regional policy research centre on the list -but it is a UN body. CEP and CIPPEC are clearly focused on their own countries -and they are certainly influential there, but not in my country, Peru. And ILD was influential (granted, it has been one of the most influential organisations in the world, led by its director Hernando de Soto) but it has almost no public presence in Peru and cannot really be compared with other Peruvian and Latin American think tanks if one quickly browses through their work and publications. ILD is a fantastic analysis-based consultancy working across the developing world on the basis of some research done in the 1980s. If they make it to the top of the list, it is far more interesting to find out why this is the case than to know their place in the ranking: is it because this is what policymakers value, or were the respondents from Africa or Asia, where they do most of their work?
In any case, policy in Peru is influenced by (among the think tanks) CIUP, GRADE (which is mentioned), IEP, and others that are not on the list. This is a perfect example of visibility: it is my impression that GRADE is quite successful in reaching audiences in DC and London and is therefore well known globally, while IEP and CIUP might be more focused on domestic policy debates and hence less well known beyond the country or region -or certain research communities. This probably reflects their origins, mandates and business models. So even within a country, comparison is difficult. Who is to say, though, whether one is better than the other based on their choice of audiences? [This section has been edited; see comments below.]
Back to Latin America (and, for that matter, Europe). In Latin America there isn’t a regional government, so what is the point of a regional ranking? So what if the top think tank is Brazilian? Is it informing the Chilean government? Is it valuable for Colombia? Maybe in Europe ‘European think tanks’ make more sense, but then is this why domestically focused think tanks are not mentioned? Clearly, international reviewers would not know who the movers and shakers of Peruvian, British, or Spanish policies are. (Again, a point in favour of national awards.)
So maybe the regional focus has little to do with where the think tanks do their influencing and more with quite simply where they are based. But if this is the case then once again we’d be separating think tanks from their context -and this is not right.
And now on to Africa. This list looks a bit messy, to say the least. The first seven are from South Africa (no surprises there). But number 8 is a regional research network made up of researchers based in think tanks across Africa -I’d like to call it a think tank, but I am not sure how it compares with the others. And then it lists a few organisations that can hardly be called organisations at all and are only popular or known because they are among the only ones in their countries. Others are in the process of getting there, but are not there yet. A tiny bit of analysis would have provided sufficient information to disqualify them as worthy of any ranking, and to identify many others who may be more worthy of a mention.
Anyway, what is the point of saying that organisation xyz is among the top 25 in Africa? How does it compare with the Latin American ones, for instance?
What happened with the debate on think tanks in South Asia? I’ve been avidly following a great debate in Indian newspapers on think tanks that would suggest a fantastic opportunity for a study such as this one. And how useful is it to compare them with think tanks in East and Southeast Asia? In fact, how useful is it to compare think tanks in China or Vietnam with those in Japan, Indonesia and South Korea? Our overview study on think tanks and politics in the region showed foundational differences between them that merit more rather than less national focus.
The lack of analysis is telling of the limits of this type of research. A country or region focused study (rather than ranking) would have been much richer and useful.
The thematic rankings are also quite interesting. The fact still remains that one cannot separate theme from politics -and politics are always local.
I would have loved an explanation for Chatham House coming ahead of IDS in the ranking on international development. Chatham House is by far a better think tank than ODI and IDS on foreign policy (and, let’s face it, it is a fantastic think tank in general and its contribution to the international development debate is invaluable) but given that international development policy is still largely dominated by DFID, that DFID’s research programme is dominated by IDS and ODI (and not Chatham House), and that IDS alumni roam the corridors of DFID, I cannot understand the ranking. More explanation is needed, please.
Also, why is Fundação Getulio Vargas included in this table? It is not focused on international development policy; its focus is simply on policies. The international development prefix is added by ‘northern’ organisations to describe policies for or of developing countries. FGV deals with economic, business and legal research for the development of Brazil. How is this different from the research done by Brookings or IPPR for the development of the US and the UK, respectively? (Patronising?)
Also, FGV is included at the foundation level, not at the level of its centres or programmes; however, the Pew Research Center rather than the Pew Charitable Trusts is included. Why? I would suggest that it has to do with the narrow and shallow focus on a global index instead of a desire to understand the richness of the histories of these organisations.
Then it gets confusing -think tanks appear in more than one category but at totally different levels, and others which one would expect to find are gone. Yes, this is all possible, as most think tanks would be good at one thing and not at everything; but Chatham House, for example, is the top UK think tank in most lists yet behind the International Institute for Strategic Studies when it comes to its core area of expertise: foreign policy and security. This makes no sense.
The potentially most useful list (domestic economic policy) ends up being a US focused one. This further illustrates the limitations of a global ranking and its bias towards international development and foreign affairs think tanks that are more easily identifiable in the blogosphere or more popular communication channels than domestically focused ones.
Then the special categories. Most innovative policy idea: a great category, but what have they been nominated for? What was the idea that got Brookings to the top? Again, another missed opportunity to provide intelligent insights into the rich and complex reality of think tanks. The same goes for the outstanding policy research programme category. Which programme got ODI the 15th place? ODI has quite a lot of programmes -and also projects that we call programmes because they are larger than the usual small projects we run. So which one was it? The Africa Power and Politics Programme? The Research and Policy in Development Programme? The Humanitarian Policy Group’s Integrated Programme? The Chronic Poverty Research Centre? It is important to know, because some of these are delivered with other organisations, so ODI could not take all the credit.
I got a bit bored and jumped over some tables until I got to the best government affiliated think tank -WBI? Nice to know that the World Bank is considered a government. If the WB is a ‘government’, would the UN not be one too? (UNU-WIDER and CEPAL are in the other tables.) What about think tanks entirely (or almost entirely) funded by their governments or by international cooperation agencies?
And then, Party Affiliated think tanks -which is an important addition to any work on think tanks. This merits an entirely different post. What does affiliated mean? Does this include Conservative think tanks in the United States like Heritage or the Conservative Party’s Central Research Department? And wouldn’t CASS and VASS (the Vietnamese equivalent of CASS) be part of this category? After all, they are affiliated to the Communist Party and Chinese and Vietnamese line ministries have their own think tanks.
I don’t want this to be a totally anti-Go-to-Think-Tank-of-the-Year rant. As I said before, the ranking has created an opportunity for debate and discussion on think tanks and this is good. But this ought to lead to a proper discussion about think tanks, the roles they play and how they may be able to contribute to their contexts (local and/or global).
The list of questions and criteria on page 56 is the best part of the document and an important contribution to the think tanks debate. It provides a guideline of sorts to study think tanks in greater detail and to promote a more intelligent debate. Focusing on the list and the ranking, I think, robs us of James McGann’s and his team’s undeniable capacity to do this and leaves us with a bitchy Oscar nominations season for researchers.