Lies, damned lies and …

SciRate stats.

A few days back Dave did some analysis on papers that were highly scited on SciRate in the past 12 months. He examined papers that had more than 10 scitations and tried to group them by region.

Papers with multiple co-authors were split between regions, and if an author had affiliations in different regions their share was split again. He found, somewhat interestingly, that the US beat out Europe and Canada for the number of highly scited papers. This is interesting mostly because the US spends comparatively less than Canada and Europe on quantum information theory research.

Somewhat foolishly, I decided it would be interesting to see what happens if we do the same analysis by institution instead of geographic region. Well, after an hour or so of downloading papers and checking affiliations, I cobbled together the calculation.

I decided to use basically the same scoring mechanism as Dave. Each paper with more than 10 scitations, i.e. 11 or more, was worth 1 point. If there were multiple co-authors, they each received an equal fraction of that point. Again, if an author had more than one affiliation, I split their allocation accordingly.

Oh, and I did the calculation taking into account papers from 365 days prior to the 3rd of November. Clearly, the choice of time-period over which this calculation is done makes a big difference.
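For anyone who wants to replicate the tally, here is a minimal sketch of that scoring rule in Python. To be clear, this isn't the script I actually used (there wasn't one, it was pen and paper), and the paper data and institution names below are made up purely for illustration.

    from collections import defaultdict

    # Made-up example data: each paper has a scite count and, for each
    # author, that author's list of affiliations.
    papers = [
        {
            "scites": 14,
            "authors": [
                {"affiliations": ["Bristol"]},
                {"affiliations": ["Bristol", "IQC"]},  # split across two institutions
            ],
        },
        {
            "scites": 9,  # 10 or fewer scites: ignored
            "authors": [{"affiliations": ["MIT"]}],
        },
    ]

    def institution_scores(papers, threshold=10):
        """Each paper with more than `threshold` scites is worth 1 point,
        shared equally among its authors; an author's share is then split
        equally among that author's affiliations."""
        scores = defaultdict(float)
        for paper in papers:
            if paper["scites"] <= threshold:
                continue
            per_author = 1.0 / len(paper["authors"])
            for author in paper["authors"]:
                per_affiliation = per_author / len(author["affiliations"])
                for inst in author["affiliations"]:
                    scores[inst] += per_affiliation
        return dict(scores)

    print(institution_scores(papers))
    # {'Bristol': 0.75, 'IQC': 0.25}

With the toy data above, the first paper is worth 1 point split as 0.5 per author, and the second author's half is split again between two affiliations, which is exactly the fractional allocation described.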

Now, before presenting a summary of these results I should point out that this was all done on the back of an envelope (actually, the reverse side of a printout of a paper) and isn't necessarily accurate. While I was happy to waste my time doing this once, it wasn't really worth checking the stats too thoroughly. Mostly, I was only interested in the broad trends that emerged, and I think I counted accurately enough to establish those. But, please, don't take any of this too seriously. I'm only publishing these stats as a discussion starter!!!

Of the 46 papers that I counted, 37 separate institutions were listed by authors as affiliations. Only 10 of those institutions received a score of 1 or more (remember, if there were multiple co-authors, the score an institution received for a paper would be fractional). The top 10 institutions were:

  1. University of Bristol (6.5)
  2. IQC (5.9)
  3. Caltech (4.6)
  4. MIT (4.35)
  5. National University of Singapore (2.16)
  6. Perimeter Institute (2.03)
  7. NEC (2)
  8. Cambridge (1.58)
  9. IBM (1.33)
  10. LANL (1)

I don’t know if these results surprise me too much. It does gel with my own feelings about the way in which the QI community has been performing in the last year or so. If one were to look at the number of papers accepted per institution at QIP in 2009 as an indicator then the results wouldn’t be too dissimilar.

Clearly, there's a lot of bias in these results. I think the top few institutions, Bristol in particular, may benefit a bit from having multiple SciRate users. I'd like to be able to dismiss this bias offhand, but it may be a very real effect. People naturally scite what they know about, so if there is a cluster of SciRate users geographically, or even in terms of subject matter, bias will exist. Indeed, this bias is part of the reason why people use SciRate: it seems to be a bias that they appreciate.

There isn't a lot more that can be said about the stats that I gathered. I can say that Bristol had more authors on more highly scited papers by a relatively clear margin. However, Bristol seemed to have fewer papers confined to a single institution than, say, Caltech or the IQC.

Mostly, I think that the data set examined was way too small. In order for any trends to be established, it would be best to do this over a period of several years. It would also be interesting to see the breakdown for the non-highly scited papers. I was really surprised that some institutions were completely omitted from the results; I suspect that these institutions are logging a lot of papers with more than 1 but fewer than 10 scites.

Now, if only I could get those few hours of my life I wasted doing this back somehow…


One thought on “Lies, damned lies and …”

  1. While the above list certainly picks out some excellent institutions, Scirate probably isn’t the best way to do this. While I love Scirate, it has a relatively small user base, and the way it has grown through word of mouth means that it puts emphasis on specific facets of QIP reflecting the interests of its readers. This would be fine if there were a large, fairly randomly distributed readership, but I really don’t think that is yet the case. So I’m not really sure whether useful statistics can be drawn from Scirate yet.

    ISI does occasional reviews of institutes in specific areas, ranking them by papers, citations and citations/paper. Unfortunately QIP hasn’t come up yet, but quantum crypto has, as have quantum dots.
