The flaws in the ranking

2 February 2019

I was going to let it go this year. But after a quick read I found so many mistakes that it became impossible not to write this. OTT and many others have been critical of the ranking in the past. I, for one, do not believe that ranking think tanks offers any value: not to think tanks and not to their missions. Our main critique has always been the same: it is meaningless to compare think tanks outside of their context and the circumstances of the policy processes they have been involved in.

If you are looking for alternatives, try the Prospect Magazine Awards and Transparify's rating of think tanks' transparency.

Why? Because they are transparent (you know exactly who is judging or measuring, and what they judge or measure), contextual (in the case of the Awards) and replicable (in the case of Transparify).

And, most importantly, they offer clear reasons why a think tank won or received one or five stars. And, crucially, they do not rank.

But we have other, operational concerns. Mainly, the data collection and analysis are flawed. The irony of the University of Pennsylvania ranking is that the University of Pennsylvania would not let anyone graduate with a dissertation containing so many mistakes, and the think tanks that celebrate the ranking would never publish anything like it.

The shame is that the effort that goes into it could be used for more positive objectives.

The (mixed-up) results

  • Is it a think tank?

SENACYT is not a think tank. It is the national science and technology council of Panama. Amnesty International and Human Rights Watch are not think tanks. The One Campaign is not a think tank either. Oxford Analytica is definitely not a think tank. Neither is KPMG. And what about the UNDP? Is it a think tank and, what is more, a government-affiliated think tank? The Red Anticorrupción Latinoamericana (REAL; real name: Red Anticorrupción de América Latina) is also not a think tank. It is a network of think tanks, and it is not based in Chile (although one of its members is).

Why is this important? McGann famously wrote that he'd know a think tank when he saw one. I agree with this. Think tanks can take many shapes and forms. But there are boundaries to the definition, and this report does not respect any of them. It treats consultancies, advocacy organisations, one-man (yes, man) shows, networks, foundations and government bodies as think tanks, and by doing so it undermines the label and the community it is supposedly trying to serve.

A think tank trying to make a case for its arguments could find itself dismissed as an advocacy campaign or a for-profit consultancy.

  • Are they breaking the law?

Think tanks in the US and the UK are almost all charities or not-for-profit organisations. They cannot be partisan. This means that they cannot be affiliated with a political party. They can be fined and even closed if they are, or if they appear to be.

This, however, does not stop the ranking from claiming that several US and UK think tanks are affiliated with parties. Two examples: Demos and the National Democratic Institute are both listed in the party-affiliated think tank category.

The ranking says: “Best Think Tanks with Political Party Affiliation: Think tanks that are formally affiliated with a political party and ideology. In the US, they are mostly categorized into Democrats, Republicans, and Independents. As aggregate data from 2014 shows, 39% identify as Independents, 32% as Democrats, and 23% as Republicans.” Note that it says formally affiliated with a political party, not informally or ideologically aligned.

But Demos describes itself as the “leading cross-party think-tank.”

And NDI says they are “a nonprofit, nonpartisan, nongovernmental organization that has supported democratic institutions and practices in every region of the world for more than three decades.”

So either they are lying or the report is mistaken.

  • Are they hiding who they really are?

One of the most worrying “results” is the inclusion of RUSI (the Royal United Services Institute) as a government-affiliated think tank. RUSI is not a government-affiliated think tank. It is independent, and it has been so since 1831. RUSI staff work across the world in some of the most dangerous environments, for instance on the STRIVE Horn of Africa project.

By claiming that RUSI is government affiliated, in this case affiliated with the British government, the ranking could be putting people's lives at risk. It is one thing to walk into Somalia as an independent researcher; it is a very different thing to do so as a representative of a government.

The Open Society Foundations face the same problem. Every year, the ranking labels them a think tank. The people who have had to move from Budapest to Berlin have faced accusations of bias, of hiding their true nature and of exerting undue influence on policy. Being labelled a think tank does not help.

  • How are they ranked?

Somehow, the top think tank in the world (Brookings) is only fourth among the independent think tanks.

Somehow, the Lowy Institute got into the list of science and technology think tanks, but its website lists no programmes or projects on the subject.

And is CIVITAS the 29th or the 32nd think tank in the social policy category?

  • Who is the partner?

The category for initiatives involving two or more think tanks lists only one think tank.

  • Why?

Why did Brookings win? What was it about the winning conference that got the “panel” excited? What aspect of the winning advocacy campaign made a difference? What was the incredible idea that changed a paradigm? We do not know.

Why is this important?

Every year think tanks face challenging situations. They have an impossible job. They must convince their audiences (often several) that their ideas are the most credible, relevant and useful. They have to raise funds against all odds. They must recruit the best minds with little funding to do so. They must keep up with new forms of communication and engagement. Some even have to deal with dangerous contexts.

Every year, they battle to explain what they are, what they do and why they matter. Sure, this ranking puts the label on the map. But it does so in a muddled, misleading and, often, risky manner.

In the developing world, where OTT does most of its work, think tanks need to make significant and meaningful investments in their work to improve the quality of their research, communications and management. The ranking offers the promise of a shortcut: easy money. Institutional strengthening is replaced by marketing and networking. Funders who support think tanks in developing countries should seriously worry about the outcomes of their investments and be wary of grantees that use this ranking as an indicator of their value and worth.

Year after year, the ranking’s author has refused to correct its obvious mistakes. These are no longer accidental. RUSI is not a government affiliated think tank. SENACYT is not a think tank. The author knows this and still refuses to make a correction. We worry about the negative impact this has.

The author's efforts could instead make a positive contribution to think tanks and the think tank community. We have launched an Open Think Tank Directory to promote research on think tanks and networking; imagine what could be done with the 8,000+ think tanks the report claims to cover. We try to document best practices and publish articles from thinktankers around the world; imagine if each winning entry were accompanied by a verifiable statement from the judges.

Unfortunately, the failure to address the ranking's conceptual and operational flaws turns this effort into a sad display with potentially dangerous consequences.