Universities already have their world rankings (and some REF areas) assessed with metrics. Individual researchers may well find they are being asked about their own metrics. This post lists four places to check your bibliometric profile and consider how well it reflects your work. It’s worth doing this, particularly to identify where metrics may NOT be capturing what you consider your successes. Simply claiming “metrics don’t work” or have limitations for your area is not as effective as demonstrating it!
There is no one definitive place to get an accurate count of your publications and citations: each source listed below indexes a limited amount of scholarly content and the figures will reflect that. This is particularly a problem for subject areas that are not well covered (typically Humanities/Social Sciences) and which do not publish primarily journal articles. Older papers (and older citations) can also be missing, although content coverage seems to be expanding on most services.
Swansea University has a subscription to Elsevier’s Scopus database which is the source used for university rankings and some REF UoAs. You will have a profile on Scopus if it has indexed at least one of your papers and it takes the affiliation of your most recent paper. If you have more than one profile, you need to correct this: there is a “Request Author Details Correction” on a profile page; it can take a few weeks for this to get processed.
On your author profile page you can see:
- Total number of papers indexed by Scopus (you may want to consider what’s been missed)
- Total number of citations to your papers on Scopus: sort your list of papers by “Cited By” to see your most highly cited (how many have not been cited at all?)
- Your Scopus h-index
- A graph showing citations over time: this will tail off as it takes time for citations to accrue
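The h-index above is simple to compute from a list of citation counts: you have index h if h of your papers have at least h citations each. A minimal sketch (the example citation counts are made up):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has enough citations for rank h
        else:
            break
    return h

# Five papers with (illustrative) citation counts
print(h_index([10, 8, 5, 4, 3]))  # -> 4: four papers have at least 4 citations
print(h_index([25, 8, 5, 3, 3]))  # -> 3: only three papers have at least 3
```

Note how one very highly cited paper barely moves the h-index, which is one reason it may not reflect what you consider your successes.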
SciVal (another Elsevier product) uses the same citation data as Scopus to give you further statistics. SciVal covers a limited date range (check the top of the page for the options), so you may see fewer papers/citations on SciVal than on Scopus.
SciVal can tell you:
- How many of your papers were in the top 10% most cited worldwide.
- How many of your papers were published in the top 10% of journals: you need to select a journal metric for this: “CiteScore” is Elsevier’s version of the Journal Impact Factor; “SNIP” attempts to normalise for your subject area.
- Your Field-Weighted Citation Impact (FWCI): this metric should not be used if you have fewer than 50 papers, and even above that it should be treated with caution.
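The Field-Weighted Citation Impact is, in essence, a ratio: the citations a paper actually received divided by the citations expected for comparable publications (same field, year and document type), so a value of 1.0 means "cited exactly as expected for the field". A toy illustration of the idea, using made-up expected values rather than real Scopus baselines:

```python
def fwci(papers):
    """Simplified Field-Weighted Citation Impact: the mean of per-paper
    ratios of actual citations to the citations expected for comparable
    papers. Expected values are illustrative, not real Scopus baselines."""
    ratios = [actual / expected for actual, expected in papers]
    return sum(ratios) / len(ratios)

# (actual citations, expected citations for that field/year/type) - made-up
my_papers = [(12, 6.0), (3, 4.0), (0, 5.0)]
print(round(fwci(my_papers), 2))  # 2.0, 0.75 and 0.0 average to 0.92
```

With only a handful of papers, one outlier ratio dominates the average, which is why the metric is unreliable below around 50 papers.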
With both Scopus and SciVal metrics, you may wish to compare yourself against colleagues in the same field: comparisons across subject areas will not work as citation and publication patterns differ.
Google Scholar does not publish a detailed list of what it indexes, which weakens the case for robust use of its metrics. However, it is much more comprehensive than Scopus or InCites in terms of content, particularly for Humanities and Social Science areas. Comparing your content here against Scopus can give an idea of how much is missed when Scopus metrics are used.
To see your metrics you will need to create a profile: there is a good guide on the ImpactStory blog. You can then see:
- Total citations
- Your Google Scholar h-index
- A graph of your citations over time
You can also use the "Follow" button to get alerted to new citations. Google Scholar gives a count of citations for each paper: click on the number to see what is being counted. This is likely to be (much) higher than on Scopus/Web of Science, partly because more book content is included, but also perhaps because of some less scholarly sources.
The Publish or Perish software can be used to perform further analysis on Google Scholar data and Anne-Wil Harzing’s site has much information on how it can be used.
We have recently blogged about using InCites, Clarivate Analytics' citation analysis tool. It can also give you an author overview, showing:
- Which journals gave you the most citations
- Areas of work which are most highly cited
InCites uses data from Web of Science, a different database with its own content coverage. Comparisons suggest its citation counts and coverage are roughly similar to, or slightly lower than, Scopus (and Scopus is expanding its content, particularly book data, more rapidly).
Altmetrics: do they tell a different story?
The site ImpactStory can be used to set up a quick profile and gather your altmetrics, as well as some citation metrics. This may provide additional information on your scholarly activities – how does this compare with your citation metrics?
If you do explore your personal metrics, please let us know! We have been doing some work with specific departments on how well different sources of metrics represent their outputs and all evidence is useful.