A monitoring and evaluation activity for all think tanks: ask what explains your reach

14 January 2013

Nick Scott has published an interesting post on the ODI blog: The 2012 ODI online awards – and some insights they offer into ‘success’. On Think Tanks was spared his critique of organisations and people who publish lists of their most popular posts of the year (probably because our own post offered some analysis of the list), but his detailed analysis of the various reports, posts, events, etc. still provides a challenge for us.

His blog post is also an interesting example of the kind of analysis that ranking efforts ought to consider. It’s useless to offer a simple list of ‘best online strategies’ without explaining what it is that makes each better or worse than the others.

Nick’s post focuses on ODI’s content, so it should not be taken as an assessment of the best out there. Rather, it is an example of an interesting monitoring (and learning) activity for any think tank: one that should be at the top of the to-do list of any communications lead. Or, if we follow CGD’s example, something researchers themselves should pay attention to.

A similar kind of analysis is offered by Kent Anderson in: Metrics and Meaning — Can We Find Relevance and Quality Without Measurements? In this post for The Scholarly Kitchen he muses that:

Thinking about the goals of altmetrics — identifying content that’s more relevant, more interesting, novel, or important, and doing so as quickly as possible after publication or, better yet, helping authors to find the best match for their works — made me wonder if we’re missing some obvious alternatives to metrics, ones based on words rather than numbers. Not alternative metrics (or altmetrics), but alternatives to metrics (which I’ll call alt2metrics).

He argues that:

  • There is an obsession with metrics that borders on fetish and can be rather unhelpful;
  • The people and organisations promoting metrics are often interested in selling them to us – in making us play ‘their’ game; and
  • Intangibles can matter a great deal more than the things we can measure.

He expands on this last point with the following idea:

Perhaps key to all this is the fact that metrics take time to assemble — they are delayed, and secondary to activity. Non-measurement-dependent signals are more important, anticipatory, and upstream from metrics. And they are what scientists rely on every day to guide them and their searches for information. It would be a shame to spend all our time on secondary, derivative measurements while primary, original signals of value are ignored or downplayed inappropriately.

His analysis is worth considering when reading through Nick’s own (and do remember that Nick has argued something quite similar).

For the list of top ODI publications, events, blogs, etc., it’s probably best to read the blog post itself. (Note that the most-read post was one published on this blog: A pragmatic guide to monitoring and evaluating research communications using digital tools; but then again, this is only among those he could measure!)

What I am interested in are the lessons that come out of the analysis:

The media is a key audience of a think tank’s research and so media uptake should be an indicator of readership:

This key report by Homi Kharas of Brookings and Andrew Rogerson of ODI on the aid industry made a large splash following its release in July. Mentions of it could be found in media outlets such as The Guardian, the Chicago Tribune, the BBC and the Sydney Morning Herald; in blogs such as A View from the Cave, Devex and the Huffington Post; and by development organisations including the World Bank, World Vision and Save the Children. Over the months since, it has registered 6,622 downloads – putting it well ahead of the nearest competitor in terms of downloads, The euro zone crisis and developing countries, which registered 3,880.
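
For those who want to replicate this kind of league table, here is a minimal sketch of the tallying involved, assuming a CSV export of download events; the file name and the ‘publication’ and ‘event’ column names are hypothetical placeholders, so adapt them to whatever your analytics tool actually produces.

```python
# A minimal sketch of the download-count comparison above: tally download
# events per publication from an analytics export and rank the results.
# The file name and column names are hypothetical placeholders.
import csv
from collections import Counter

downloads = Counter()
with open("analytics_export_2012.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["event"] == "download":
            downloads[row["publication"]] += 1

# Print the ten most-downloaded publications, most popular first.
for title, count in downloads.most_common(10):
    print(f"{count:>6}  {title}")
```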

Global audiences are increasingly relevant on certain occasions, and so their origin should feature prominently in the analysis:

Events were broadcast around the world through video streaming and reached viewers around the globe. There were registrations to watch from as far afield as Zimbabwe, Iraq, Pakistan, the Philippines and the three countries themselves. Overall, we logged registrations from 48 different countries in all five continents.

VIPs still matter:

The blog around the graphic received 2,017 views. The big story, however, is not how many times it was viewed but who it was viewed by. We happen to know that this info-graphic was shared among the negotiators on climate finance at the Doha summit.

Alternative metrics demand alternative analyses, too; things are not always what they appear to be:

The ODI Centre for Aid and Public Expenditure conference saw 113 tweets mentioning the event page. However, this again has a lesson, and it is one that will become increasingly important to take on board as AltMetrics become more important to judging academic success. The issue is what makes up a figure – the 113 tweets about the conference include a number of tweets by ODI researchers and staff promoting the conference. Given this, a more likely winner for the most tweeted piece of content would be Inconvenient truths about corruption and development, a November 2012 blog written by Marta Foresti, which was tweeted about 99 times.
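
The correction Nick describes can be built into a routine monitoring script. Below is a minimal sketch of discounting in-house promotion before comparing mention counts; the tweet data structure and the staff handles are illustrative assumptions, not ODI’s actual setup.

```python
# Sketch: discount an organisation's own promotional tweets before
# comparing mention counts across pieces of content.
# STAFF_HANDLES and the tweet records are hypothetical examples.
STAFF_HANDLES = {"thinktank_comms", "staff_researcher"}

def external_mentions(tweets: list[dict]) -> int:
    """Count tweets about a piece of content, excluding in-house accounts."""
    return sum(1 for tweet in tweets if tweet["author"] not in STAFF_HANDLES)

tweets = [
    {"author": "thinktank_comms", "text": "Join our conference today!"},
    {"author": "some_reader", "text": "Great discussion at this event."},
]
print(external_mentions(tweets))  # prints 1 – the in-house tweet is excluded
```

The point is not the code but the habit: any raw altmetric should be decomposed into who generated it before it is used to compare content.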

Regular updating has positive effects (is the opposite true as well?):

That is the fastest growing research site I’ve seen in the six years I’ve been at ODI. Perhaps that is because it has, in those months, published 207 posts – that is over a post a day. This includes posts from people at numerous organisations working on the subject. This regular updating has made it a vital source for information in the sector, a fact also evidenced by the 719 followers it has quickly accrued on Twitter, including leading figures in the post-2015 debates.

Third-party posting, and using other people’s networks, can be more effective than attempting to keep it all in-house:

This works for publishing in other spaces:

This points to another of the issues with top 10 lists, especially for ODI given our ‘being there’ digital communications strategy that encourages posting to large or sector-specific blogs wherever possible to reach a wider or more directly relevant audience.

And for asking relevant outsiders to write for the think tank:

Through strategic use of Twitter to engage people with an interest in the subject, the Secure Livelihoods Research Consortium team was able to interest Ben Goldacre, well-known for writing ‘Bad Science’, and the discussions kicked off from there.

Most people are looking for jobs (this also appears to be true of some of the most-read newspapers in many countries), so we should not be fooled by ‘visits’:

around 70 percent of visitors to the ODI site come to research or look for jobs. They’re not necessarily in an interacting mood.
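
In practice this means segmenting visits by intent before reporting them. Here is a minimal sketch, assuming pageview paths are available and that job adverts live under a hypothetical ‘/jobs’ prefix:

```python
# Sketch: separate research traffic from job-seeking traffic before
# reporting 'reach'. The "/jobs" path prefix is an assumed convention.
def research_visits(pageview_paths: list[str]) -> int:
    """Count pageviews that are plausibly research traffic, not job searches."""
    return sum(1 for path in pageview_paths if not path.startswith("/jobs"))

paths = ["/publications/aid-report", "/jobs/research-officer", "/blogs/post-2015"]
print(research_visits(paths))  # prints 2 – one of the three visits was a job search
```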

Action beyond reading is an important indicator of success:

And my final award of the year goes to the blog that managed to get most comments on the ODI site – a key indication that the blog has generated engagement and interest.

Don’t forget the old stuff; it is often work done many years ago that keeps getting the most downloads and the most interest:

For many years now, this 2003 online toolkit has been by far the most viewed piece of content. In 2012, the toolkit received 30,310 views and the downloadable version was accessed 8,410 times. Why is this still getting so much attention? One word: Google.

Beyond these specific lessons, this kind of analysis is important for any organisation. I would even argue that it is really all that is necessary. Asking questions about impact on policy simply goes too far: it escapes the legitimate role of most think tanks and demands assumptions that make any assertion highly unreliable – and useless. This kind of analysis, however, offers many useful ideas for immediate and future action.