
Posts tagged ‘James McGann’

Book Review: How Think Tanks Shape Social Development Policies, edited by James McGann, Jillian Rafferty, and Anna Viden

In this post, Marcos Gonzales Hernando provides a critical review of "How Think Tanks Shape Social Development Policies", edited by James McGann, Jillian Rafferty, and Anna Viden. He challenges the notion put forward by the editors that think tanks can, in their own right, become a powerful force for social development and the strengthening of civil society, but welcomes the comparative literature and cases offered by the volume.

Read more

Think tank rankings and awards: rigged, futile, or useful?

Are all think tank awards useless? In this article I argue that awards can be developed to celebrate good work, increase the visibility of think tanks in their societies, and contribute to the development of the think tank community as a whole. I review three rankings -the UPenn go to think tank ranking, the RePEc economics think tanks ranking, and the ICCG environment and energy think tank map and ranking- and compare them with the Prospect Award. I argue that the Prospect Magazine Award offers a model to follow and adapt in different countries, and an alternative to global or regional de-contextualised rankings.

Read more

European think tanks and the European Union

What is the relationship between European think tanks and the European Union, and how can European think tanks be defined and classified? A report by the Bureau of European Policy Advisers offers an overview of these issues.

Read more

This year, instead of ranking think tanks let's think about them more carefully

David Roodman and Julia Clark from the Center for Global Development have posted a very interesting reflection on the now unfortunately famous McGann think tank ranking. They offer an alternative to a global ranking exercise.

Read more

The Go to Think Tank Index: two critiques

This is a bit of old news, but I feel it is worth sharing reviews by other researchers of the think tank index produced by James McGann.

Jan Trevisan, at the International Centre for Climate Governance, has published an interesting critique. There is not much I disagree with in his assessment. He points at several mistakes in the analysis regarding think tanks in the sector he is most familiar with. This is not surprising: it is difficult for any one person (or team) to delve into the detail necessary to adequately assess all think tanks in the world. It makes me think that we should not only attempt to think of think tanks at a country level (which is what I have argued before) but also by theme or issue.

My only disagreement, I guess, is that the ICCG does not consider government or party think tanks in their map. I think this is a mistake, as in many developing countries (and indeed developed countries) it is not possible to find truly independent centres. Affiliation should not be confused with lack of autonomy.

A couple of years ago, Christian Seiler and Klaus Wohlrabe published their own, very well researched, critique of the 2009 think tank index. Their critique, besides identifying several inaccuracies, focused on the methodological weakness of the ranking. I remember that at the time I offered a similar critique.

I think it is clear that the method is inaccurate and the output is therefore flawed. At least, however, it has got many think tanks and researchers thinking about it. And these critiques and analyses offer far more insight into the world of think tanks than the ranking ever will.

And the winner is: Brookings … but, once again, the loser: critical analysis

I have not been a fan of Jim McGann’s think tank index. There are no surprises there. I think that the energy that goes into this could be used more productively elsewhere; for instance, gathering more in-depth information on specific organisations, their strategies, their tactics, etc. The 2011 global think tank index presents some improvements on previous editions, but it is still far from being what it wants to be. McGann says that he is doing this in part as a response to being asked about think tanks around the world, but this is no due diligence. I learn so much more about think tanks from a conversation with a director like Orazio Bellettini or Simon Maxwell than from the entire report. It is still a popularity contest -and it is still not clear who votes.

The main text of the report suggests that policymakers should use it to guide their choice of think tanks when seeking advice, but this strikes me as rather naive: it assumes policymakers do not already have their own networks. It is also quite worrying, since most countries do not get any mentions (and many get only one or two), which in effect narrows the options policymakers would have if they chose to use the index as a guide. And this would be a mistake, because there are many more excellent think tanks that do not even get a mention in the ranking.

There are some positives, though. I must accept that the inclusion of more categories describing specific achievements is a step in the right direction. But without further detail as to what the think tanks achieved in each, I am afraid the rankings remain unhelpful. Also positive is the definition of think tanks used; it includes organisations affiliated to parties, governments, and others. (Unfortunately, the definition has not always been applied to the rankings.)

It seems that there was a clear effort to address some of the criticism that the ranking has received in the past. At least some are acknowledged in the text.

And, without a doubt, the index gets people talking about think tanks at least once a year. This is also quite good and gives us an opportunity to reflect on these organisations and the functions they fulfil.

Still, the index leaves me with a lot of questions:

  • Among the top think tanks in the world we still find organisations such as Amnesty International, Transparency International, and Human Rights Watch. I am still not sure these qualify as think tanks. AERC, also not a think tank, is listed, but not its Latin American equivalent, LACEA (respondent bias?). Is CODESRIA a think tank? IDRC, again, is also not a think tank.
  • TED, I am sure, is not. And while I am on the subject of TED, has anyone wondered why it is only mentioned in the technology category? Yes, that is how it started but anyone who is a fan would know that technology plays only a part in what it does. Perception is a powerful thing.
  • I am still unsure about the global and regional lists. I am not sure what to do with a list that compares Chile’s Centro de Estudios Públicos with Brazil’s Fundação Getulio Vargas. Why is one better than the other?
  • Or with a single list that includes think tanks from Australia, Central Asia, South Asia, and East and Southeast Asia.
  • In general, the absence of Latin American think tanks from the global lists is suspicious (of the accuracy of the process, not of the researchers’ intentions).
  • There are no surprises in the Mexico, Canada, and the Caribbean list. How could any Caribbean think tank make it into it? Even here, what is the logic behind the inclusion of Mexico in this list and not in one with Central America? Mexico’s influence in Central America is certainly more relevant.
  • As a colleague, Norma Correa, has suggested, maybe a more accurate list would be one including Mexico, Brazil, and Argentina; they are similar in size, intellectual influence in the region, their academic communities are comparable, etc.
  • How do developing country think tanks, Brazil’s Fundação Getulio Vargas, the Bangladesh Institute of Development Studies, India’s Centre for Development Alternatives, and the China Academy of Social Sciences, get into the international development category? Isn’t what they do the same thing that domestic think tanks in developed countries do?
  • Chatham House is an interesting 2nd place in the Environment Think Tank category. It seems to me that its approach is from a foreign policy or security perspective and so I am not sure how comparable it is with, say, the International Institute for Environment and Development in the UK.
  • Health: The Peterson Institute for International Economics is number 26 on the list. A quick search on its website shows that it does not do any health research. This is interesting in itself: it shows the power of perception when judging think tanks, and also that the team has not fact-checked the lists. And how does one compare a political think tank like Heritage, a corporately funded centre like the Kaiser Permanente Institute for Health Policy, university-based health research centres (there are a few), and international development health policy research centres such as CGD?
  • Also, by the way, India’s Institute for Economic Growth is mentioned twice in this list.
  • I don’t quite get the international economic policy think tank category. Is it about international economics, or about influencing international bodies? It would be useful if each category were described in more detail and examples given of why the think tanks have been ranked in that order.
  • What is a transparency and good governance think tank? It certainly helps to explain the presence of human rights NGOs, but it seems to me that think tanks, by their nature, can support transparency and good governance even if they do not actively seek to change ‘transparency’ policies.

Now, the section on the new categories is much more interesting, and I had been looking forward to it. I have argued before that this is far more useful than regional lists. Let’s see:

  • Most innovative policy ideas: There is a long list… yes… but, what was the most innovative policy idea? Why was it innovative? And how does one judge this? If I could have commented on only one issue, it would have been this. If all efforts could have been focused on something, it should have been identifying the policy ideas proposed by these think tanks and explaining why they are innovative, their importance, and how or why they came out on top.
  • Google Ideas is the top new think tank. Is it a think tank? What is it? It has been mentioned in media articles and it has a Twitter account, but beyond that… what? ResPublica is new, but not that new.
  • The category Think Tanks with Outstanding Policy-Oriented Public Policy Research Programs should win an award for the most confusing name. Is it policy-oriented OR public policy research? Or is it policy-oriented public policy research? I don’t quite get it. Still, their work must have been outstanding. But why? What are the criteria being used? And what are these programmes that are so outstanding they deserve our attention?
  • I really liked the inclusion of the category on the use of the internet. And I was glad to see ODI mentioned: Nick Scott’s work has been excellent in this field. But I was not quite sure why ODI was not higher up than, say, the Council on Foreign Relations or the Fraser Institute. I know that the internet is far more than just the corporate website, but this is still a good indicator and ODI’s is certainly way more developed. (I think there is no risk that people might think I am biased in its favour; I have many times in the past been rather critical of it.) And TED, really? All the way down at number 23? They rule the web. Maybe this is because very few among the respondents considered TED to be a think tank, but it got just enough votes to make it onto the list.
  • Among the top 4 of the think tanks with the greatest impact are Transparency International, Human Rights Watch, and Amnesty International. Are these think tanks? But most importantly, what was their impact? I must assume that the list is global in the sense that it includes think tanks from all over the world rather than that the impact was global in nature. But, again, what was the impact being rewarded? And how did they achieve it?
  • I was pleased to see the category of university affiliated think tanks. It recognises that these organisations exist and can be described as such. But, again, why are they the best? Is it because they found a good balance between teaching and research? Did they tackle a particularly important issue?
  • The definition is also important. Are they affiliated as IDS is affiliated to Sussex (IDS is an independent charity), or as IDEAS is affiliated to LSE, or CIUP to the Universidad del Pacífico (they are research centres of the universities)?
  • I would say the same about the government affiliated think tank category; it is good that it has been included. But is it worth putting governments and multilateral agencies in the same bag? And how is affiliation defined and applied? According to the definition in the document: a part of the structure of government. Is FLACSO Ecuador affiliated because it gets funds from the government, because it is part of the Ecuadorian education system, or because it was established through an agreement between the State and the international FLACSO system? Would that not make IDS or ODI, which get quite a lot of their funding from DFID, affiliated too? And RAND? (I must say that I am always surprised to find FLACSO Ecuador in the ranking. FLACSO Costa Rica or FLACSO Argentina are far more popular -at least for me… perception, perception, perception.)
  • What does it mean to be a party affiliated think tank? Affiliation can mean many things, but in this case it cannot mean the same as it does for the government affiliated category. The German model of political party foundations cannot be compared with the British model of party-friendly think tanks. Yes, DEMOS influenced the New Labour government, but they are pretty active in developing links with the new Coalition government, too. The same could be said about the Centre for Policy Studies: yes, it is close to the Tory Party, but it is not of the party. The definition used by the index is: formally affiliated with a political party, but I don’t think the law would allow this in the UK. It certainly would not allow it in the United States. Maybe a definition that incorporates informal affiliations would have been more appropriate.

This is rather painful. I thought about not writing anything about the index this year. But having read it I felt that some of this had to be said, again. And I am sure I won’t be the only one.

The points I am making this time around are the same as in previous years:

  1. The contexts and organisations being compared are so different that any comparison is quite useless unless it is accompanied by detailed analysis.
  2. Absences (at least the ones I have noticed) are explained by the fact that this index relies on biased perceptions and little if any corroboration. As McGann himself accepts, they do not have a budget or the capacity to visit (not even virtually?) the think tanks. And I doubt that the respondents themselves have spent much time thinking about their nominations.
  3. Perception is important for think tanks, though; and this may be the problem. Without clear examples explaining why a think tank is perceived as good online, influential, or innovative, how can they learn and improve?

I insist: ditch global rankings and focus on national exercises with detailed analysis of the reasons why a think tank comes top of a category (e.g. what was the issue it worked on, the programme it developed, etc.?). This will make it easier to fact-check and to ensure that the definitions and criteria are actually being applied to the assessment. It will also get a conversation started at the level that matters, and not just in DC or at the UN, where everyone loves a cocktail. We could then compare across these exercises and learn a great deal more.

Until next year, I am sure.

The mighty influence of think tanks (in the US)

A KCPP discussion on the might of think tanks and the influence they have in US politics and the world. James McGann provides an explanation of this index (you’ll be the judge of that -criteria, ‘experts’, ranking… lots to talk about, but let’s leave that for January).

The conversation focused on some of the following questions:

What is the ultimate goal of heavy-weight think tanks? Do they just add to a cacophony of poisoned politics? Or can their researchers contribute ideas isolated from politicking on the Hill? What are the risks and benefits of relying on them? Should their influence be kept in check? And is there a think tank for every political stripe?

McGann suggests that think tanks can help contribute to an informed debate on policy issues; that they provide a government in waiting (a revolving role -and an opportunity for policymakers to reflect on issues of policy); and that they offer outside, independent analysis. This, he clarifies, is different from other countries, where the political culture awards a clearer role to the government for analysis. I agree.

I disagree, however, with his suggestion that independent analysis is unique to the United States. But I am not sure if that is what he meant to say.

What about the clear ideological bias of think tanks, asks the interviewer. What is their value if the ‘thinking’ is rigged from the beginning? McGann responds that the US is a hyper-pluralistic society with a range of institutions, so even if clear political opinions and philosophies can be assigned to specific think tanks, the overall democratic debate is not necessarily affected.

Callers to the programme do not seem to agree with think tanks: their ideological bickering is unrealistic and unhelpful.

Should think tanks be sued for the mistakes they make? No, says Mike Gonzalez, Vice President for Communications at The Heritage Foundation, who lists a long series of policies they came up with. But most importantly, he suggests that think tanks have the right to make proposals, and governments have the responsibility to decide whether or not to take them on.

But Faiz Shakir, Vice President of the Center for American Progress and Editor in Chief of Think, a blog created by the Center for American Progress Action Fund, suggests that while this is true in theory, in practice funding and influence can break down the separation of roles suggested by Gonzalez.

I suggest listening to the interview -it is very interesting. The question still remains: are they led by the search for ideas, or by ideology or interests (theirs or others’)? In other words, is the thinking rigged?

And another question: if think tanks are so good at talking about issues of public interest -sometimes even better than politicians and policymakers- are they letting us (the general public) in or keeping us out of the debate?

Evaluating a think tank from the AEA365 (but should we?)

Johanna Morariu, Senior Associate with Innovation Network, describes an approach to evaluating think tanks in the American Evaluation Association’s AEA365 blog. She draws on several studies focused on think tank evaluation to address 10 categories or assessment areas:

  • Organisation infrastructure, capacity and management
  • Strategy and direction -at both organisation and project levels
  • Organisation visibility and reputation
  • Effectiveness of communication and outreach strategy
  • Quality of research products
  • Participation in congressional testimony
  • Research relevance, quality, usefulness, and rigour
  • Uptake of research in media and policy
  • Research influences the work of other leading researchers
  • Research influences decision makers and/or policy

Again, as in most evaluations of the work of think tanks, this leaves out other contributions these organisations make:

  • What about their capacity to train and prepare new generations of policymakers? -track movement of staffers into policy or the private sector, for instance
  • And what about the power of think tanks to convene people and organisations from different sides of the argument -and help them find common ground?
  • Or the contribution that think tanks make to the education of elites -not to change their minds but to enlighten their own arguments?
  • Think tanks also trade on power, providing their supporters with access to key spaces. Would we rather not measure this? Let’s not pretend that think tanks have no political or economic allegiances (very few can claim that).

By the way, the studies she used are here, courtesy of Johanna:

  • Donald E. Abelson (2010). Is Anybody Listening? Assessing the Influence of Think Tanks. Chapter 1 in the edited volume, Think Tanks and Public Policies in Latin America.
  • Richard Bumgarner, Douglas Hattaway, Geoffrey Lamb, James G. McGann, and Holly Wise (2006). Center for Global Development: Evaluation of Impact. Arabella Philanthropic Investment Advisors, LLC for the Bill & Melinda Gates Foundation, the William and Flora Hewlett Foundation, the John D. and Catherine T. MacArthur Foundation, and the Rockefeller Foundation.
  • Ingie Hovland (2007). Making a Difference: M&E of Policy Research. Working paper 281 for the Overseas Development Institute, London, UK.
  • James G. McGann (2006). Best Practices for Funding and Evaluating Think Tanks & Policy Research. McGann Associates for the William and Flora Hewlett Foundation.

But critical to this method was the development of a Theory of Change for the think tank. From it they were able to develop the right indicators and tools to assess the organisation’s performance. The Theory of Change in this case is not the typical one described by many projects or programmes; in fact, it looks more like a strategy diagram: this is what we want to achieve and this is how we’ll do it.

I have been asked before about evaluating think tanks. And approaches like Johanna Morariu’s are certainly useful. But I am not too sure if an evaluation is what we want for an organisation. I can see how we may evaluate a project (we can tell when the activities of the project are done -even if it is difficult to assess its effects in the short term). The same goes for a programme or policy. There is something that is done, then that something is done no more, and then we assess how (well) that something was done and if that something had the intended effect.

An organisation, however, does not operate in bursts of activity that then wind down to nothing. It does not start and then stop to be evaluated. Its parts may, but not the organisation. Yes, there is ongoing evaluation, but this all points towards the final (ex-post) evaluation.

For an organisation, a strategic review that considers all of the think tank’s functions is far more relevant. A reflection process around annual staff or research retreats is far more useful.

The ‘Top Think Tanks’ in the World commentary from AFRICA IS A COUNTRY

An interesting review of James McGann’s think tank list from Africa is a Country: The ‘Top Think Tanks’ in the World.

The report also notes “… the rise of think tanks in Asia, Latin America, the Middle East and Africa.” Whether these think tanks offer fresh ideas or anything different from their counterparts in the West–given that they share the same funding sources and ideas–is not discussed in the report.

Goran’s recommendations on think tank rankings

Goran Buldioski offers another take on the rankings in his blog, Goran’s musings, along with some very interesting recommendations, which I republish below:

As someone who works with think tanks, studies think tanks, and writes about think tanks, I see very little value in it. Therefore, it is high time to move to alternatives to this study:

  1. Best national think tanks (see the suggestion of Enrique Mendizabal) modeled on the UK’s ranking done by Prospect magazine. Note: Thematic categories could be also established.
  2. Best policy study (for example, see the Policy Association for Open Society (PASOS) award for the best study penned by their members).
  3. Best advocacy campaign by a think tank, consisting of a series of policy products (from op-eds to books) and events (briefings, debates, seminars, conferences, training events, etc.).
  4. Best online presentation. [Or maybe best online communications strategy?]
  5. Best design and communication strategy

I would add categories related to:

  1. Best long term policy research programme -that has maintained and developed a reputation on a specific issue with or without support
  2. Best prospective think tank -thinking about the future challenges of its country
  3. Best think tank to work for -to highlight the human capital development role that think tanks have
  4. Think tank to watch

I would also encourage categories related to the other members of the policy space:

  1. Best use of research based evidence by the media
  2. Best or most innovative funder
  3. Most evidence based political party manifesto

At this level, whatever the category, it would provide an opportunity to have a real public conversation about think tanks and their contexts (and histories). No need for a ranking: one winner and some honourable mentions would do. And if there are disagreements, these can be aired publicly and addressed quickly before the next award.

It would encourage communication between think tank directors and with their audiences (and possible members of the panel), serve as a platform from which to launch important new policy ideas and debates, and contribute to the development of more informed cadres of journalists, policymakers, funders, etc.

Do have a look at Goran’s comments about the inconsistencies in the ranking -they are quite to the point.

