Following this week’s post on how to analyse online data, we publish this more technical post by Nick Scott, which discusses altmetrics on the LSE Impact of Social Sciences blog. Altmetrics are a new form of measuring research impact, adding a wider set of metrics to traditional bibliometric rankings based on academic journal citation analysis. Aside from citations, altmetrics also count tweets, Facebook shares, and saves on Delicious and Mendeley.
However, despite my support for altmetrics, I’m a little concerned that the nature of the debate is not helpful. The messaging is too fixated on whether altmetrics are better than traditional forms of assessing impact, such as citation analysis. To me, this misses the point completely. We shouldn’t be arguing that altmetrics are a better way to measure reach; that makes it seem like we merely have an alternative. No, we should be arguing that altmetrics are the central way to measure the more varied forms of scholarly communication of the digital age.
While altmetrics are a very useful initiative for measuring impact beyond traditional scholarly outputs, they still have to show progress in certain areas. For instance, it is difficult to identify different types of output. Two main identifier schemes are in use, Digital Object Identifiers (DOIs) for outputs and ORCID iDs for researchers, but DOIs are rarely assigned to items other than academic journal articles. Using URLs is also complicated, because most items do not have just one web address, and there is no clear system for compiling sets of URLs across multiple sites. Counting hits also seems limited to journals.
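The URL problem described above can be sketched in code. One illustrative approach (the URLs and counting scheme here are hypothetical, not a real altmetrics service) is to canonicalise each URL variant of a page and sum per-URL counts under the canonical form:

```python
from collections import defaultdict
from urllib.parse import urlsplit

def canonicalise(url: str) -> str:
    """Reduce a URL to lowercase host + path so variants of the same
    page (http vs https, a www. prefix, trailing slashes) match."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return host + parts.path.rstrip("/")

def compile_counts(per_url_counts: dict) -> dict:
    """Group raw per-URL metric counts by canonical URL and sum them."""
    totals = defaultdict(int)
    for url, count in per_url_counts.items():
        totals[canonicalise(url)] += count
    return dict(totals)

# Two addresses for the same (hypothetical) output collapse into one total:
compile_counts({
    "http://odi.org/publications/paper-1": 10,
    "https://www.odi.org/publications/paper-1/": 5,
})
```

A real system would need far more than this sketch (redirects, shortened links, per-site aggregation), which is exactly why the lack of a shared convention for URL sets is a problem.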
Another problem arises from ‘dark social’: sharing in places that analytics can’t reach, like email, bookmarks and offline sharing. An M&E log, like the one used at ODI, where researchers can forward examples of their work being used, can help here. Integrating altmetrics with a media monitoring service can also track impact in the media. Campaigns that complement academic papers with outputs in other formats can be measured with altmetrics to get a full sense of their reach, and to compare the different types of communication. And finally, real impact is hard to measure with altmetrics, which capture indicators of success rather than impact itself.
Again, integrating an M&E log dataset of some sort, where people could add more qualitative information about success – or even just a commenting facility to detail how work is used, received and leads to change – would be a great addition to altmetrics tools. Researchers do often hear about the impact of their work, so tracking this somehow within altmetrics systems would be wonderful.
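A minimal sketch of such an M&E log might look like the following. The field names and example entries are illustrative assumptions, not ODI’s actual log format; the point is simply that each record pairs an output with a qualitative note on how it was used:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LogEntry:
    output: str          # title or identifier of the research output
    reported_by: str     # researcher forwarding the example
    description: str     # qualitative note on how the work was used
    logged_on: date = field(default_factory=date.today)

def uses_of(log: list, output: str) -> list:
    """Collect the qualitative notes recorded against one output."""
    return [entry.description for entry in log if entry.output == output]

# Hypothetical entries, as a researcher might forward them:
log = [
    LogEntry("Paper A", "N. Scott", "Cited in a policy briefing"),
    LogEntry("Paper B", "J. Doe", "Quoted in a newspaper article"),
]
uses_of(log, "Paper A")
```

Attaching records like these to an altmetrics profile would supply the qualitative layer that raw counts miss.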
Nick has published a guide on monitoring and evaluating research communications using digital tools, which lists an array of principles for measuring impact as well as a list of digital tools. He also explains ODI’s award-winning online strategy.