I recently had a paper accepted and therefore dutifully registered it on the University’s Symplectic database. This forms a repository for all the publications generated by the university, which is of course a vital tool for summarizing research output (for instance for the Research Excellence Framework). Authors are required to upload the accepted manuscript, but as many journals do not permit uploading a PDF of the published version, it is usually the Word file of the accepted version that is submitted (which makes it a great deal less valuable, as such versions lack the easy-to-read journal layout with embedded figures and miss corrections made at the proof stage). I noticed that Symplectic uses Scopus h-index calculations, which is a shame, as this Elsevier database does a very bad job at that (even worse than Web of Science; see this old post).
For the first time, I checked some of the ‘altmetrics’ for some of my papers. I have been sceptical about tallying the number of Twitter mentions and the like: if you are on a paper with lots of tweeting authors and/or in a field where people tend to be more active on Twitter (e.g. computational biology), there will be a bunch of those, but in general the numbers seem small and biased. The online interactivity surrounding publications is usually quite minimal; this must be in large part because scientists are too busy with their own research to write more than a lazy tweet about research by other people. I do appreciate that the PLoS journals, for instance, give readers the opportunity to comment on papers, but I hardly ever see people actually doing that. Anyway, the immediate reason for writing this brief blog post was that one of the altmetrics I noticed was a YouTube video. When clicking it, I saw to my great surprise five North American scientists, led by Dr. Laura Williams, holding a tele-conference journal club about one of my papers (briefly outlined in this blog post). A one-hour dissection of a paper by colleagues for the world to see: scary!
So far, I have only skimmed through it, and luckily it seemed mostly positive. It was quite interesting to see that some points did not come across as well as I had hoped, providing an opportunity to reflect on how to communicate findings better.
Although I really should have done some more reading on altmetrics for this post, its social-media component, albeit interesting, does not seem a very reliable indicator of research impact. There are other altmetrics that seem very promising, though. For instance, just the total number of paper views or downloads could be really useful: they are much greater in number and much less time-lagged than citations, making them perhaps a more accurate proxy for interest garnered. (Of course, citations reflect other scientists having actually built on the work, whereas some of the views/downloads will be from people who, after reading it, find that the paper has very little value, but still.) Our Symplectic database does list Mendeley and CiteULike reads, but does not go beyond this (small) portion of total reads. As some fields have many more workers than others, and of course as older papers have had more time to accumulate views, some corrections could be applied (the same goes for citation numbers, so this is not a unique criticism). PLoS not only gives the raw number of paper downloads, but also a little graph showing how a paper compares to others published by the same journal in the same field in the same year, which is neat. I will highlight the accepted manuscript in a future blog post as soon as the paper is published!