Bibliometrics, or research metrics, are quantitative measures designed to help evaluate the quality and impact of research outputs. These indicators may relate to the author, the journal or an individual paper. A large number of metrics are available, but all have limitations, so they should be used responsibly and weighed against the broader background of peer review.
For more information on the responsible use and the limitations of metrics, please refer to the University Statement on Responsible Use of Research Metrics. Further sector-wide guidance and good practice can be found in the CoARA Agreement on Reforming Research Assessment, DORA, the Leiden Manifesto, the Metric Tide report and the UK Reproducibility Network Position on Responsible Research Evaluation.
There are a number of online tutorials to help you understand metrics. MyRI offers a helpful tutorial on measuring research impact, and your Academic Librarian Team can also advise.
Leeds Beckett University has become a signatory of the San Francisco Declaration on Research Assessment (DORA). DORA promotes the responsible use of research metrics, encouraging the assessment of research on its own merits, rather than relying on metrics such as the Journal Impact Factor.
Find out more on the University's Research and Information Governance webpage - Responsible use of Research Metrics.
The most basic metric is the number of citations a particular work has received.
Be aware:
Journal rankings are a means of judging the relative importance of a journal within its field. They should not be used to evaluate the quality of the individual articles within the journal.
Journal rankings use different calculation methods, based on different quantitative combinations of the number of papers published and the number of citations received over a given period.
Criticisms of journal rankings include their sensitivity to a small number of highly cited papers, their variation across disciplines with different citation practices, and the fact that a journal-level score says little about the quality of any individual article.
There are a number of journal rankings available.
The Journal Impact Factor is calculated by taking the number of citations received in a given year by articles a journal published in the previous two years, divided by the total number of citable items it published in those two years. Because it is calculated over such a short window, impact factors can vary wildly from year to year, and the measure heavily favours STEM subjects, where currency of literature matters more.
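As a sketch of the calculation (the year and figures here are hypothetical):

$$\mathrm{JIF}_{2024} = \frac{\text{citations in 2024 to items published in 2022 and 2023}}{\text{citable items published in 2022 and 2023}}$$

For example, a journal that published 200 citable items across 2022 and 2023, and whose articles from those years received 500 citations during 2024, would have a 2024 impact factor of 500 / 200 = 2.5.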
To find out whether a journal has an impact factor, you can check the searchable list of titles. Journal Citation Reports (JCR) covers only the 12,000 or so journals indexed within Web of Science, spanning the sciences, technology and the social sciences. You can register to see some information about a journal, but as LBU does not have access to Journal Citation Reports you will not be able to look up the Journal Impact Factor there; you may, however, find the impact factor listed on the journal's homepage.
All of these can be found within Scopus. You can search Sources within Scopus to find which journals are included.
Calculated on a similar basis to the Journal Impact Factor in Journal Citation Reports, but over four years rather than the two used for JCR. This gives a longer citation window, which better suits disciplines where citations accumulate more slowly.
SCImago's ranking assigns a weight to each bibliographic citation based on the importance of the journal that issued it, so that citations from more important journals are worth more than those from less important ones (similar to Google's PageRank algorithm). The weightings are averaged over the past three years. It covers the 15,000 journals indexed within Scopus.
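As a toy illustration of that PageRank-style idea (this is not SCImago's actual algorithm, and the journal names, citation counts and damping factor below are all invented):

```python
# A toy, PageRank-style weighting of journals: a citation from a
# highly weighted journal transfers more weight than one from a
# lowly weighted journal.

citations = {
    # citing journal -> {cited journal: number of citations}
    "Journal A": {"Journal B": 3, "Journal C": 1},
    "Journal B": {"Journal A": 2},
    "Journal C": {"Journal A": 1, "Journal B": 1},
}

journals = list(citations)
weight = {j: 1 / len(journals) for j in journals}  # start out equal
damping = 0.85  # standard PageRank-style damping factor

for _ in range(50):  # iterate until the weights settle
    new_weight = {}
    for j in journals:
        incoming = 0.0
        for citing, cited in citations.items():
            if j in cited:
                # Each citing journal passes on its weight in
                # proportion to how often it cites journal j.
                incoming += weight[citing] * cited[j] / sum(cited.values())
        new_weight[j] = (1 - damping) / len(journals) + damping * incoming
    weight = new_weight

for j, w in sorted(weight.items(), key=lambda kv: -kv[1]):
    print(f"{j}: {w:.3f}")
```

Journals cited by "important" journals end up with higher weights, even when the raw citation counts are similar.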
SNIP (Source Normalized Impact per Paper) rankings attempt to remove the discipline-based bias of other journal impact measures: a citation is given greater weight in fields of study where citations are less common. The rankings are also available open access from the CWTS at the University of Leiden.
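A minimal sketch of the normalization idea (the figures are invented, and CWTS's full method is more involved than this):

```python
# Toy illustration of the SNIP idea: a journal's citations per paper
# are divided by its field's "citation potential", so journals in
# sparsely citing fields are not penalised.

def snip_like(citations_per_paper: float, field_citation_potential: float) -> float:
    return citations_per_paper / field_citation_potential

# A mathematics journal: few citations, but a sparsely citing field.
print(snip_like(2.0, field_citation_potential=1.0))  # -> 2.0

# A biomedical journal: three times the citations, but a field that
# cites three times as heavily, so the normalised impact is the same.
print(snip_like(6.0, field_citation_potential=3.0))  # -> 2.0
```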
Remember your Academic Librarian can also advise.
The h-index is a single number that takes into account both productivity (paper count) and the citations of an individual, research group or institution.
If an individual has an h-index of 7, this means that 7 of their papers have been cited at least 7 times each.
NB: As with other key metrics, the h-index can only be used effectively by comparing like with like, for example similar institutions, or individuals in a similar discipline and at a similar stage in their career. The calculation will also depend on the tool used: Scopus and Google Scholar both report h-index scores, but because of the difference in the size of the underlying databases the resulting values can vary greatly.
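As an illustration of how the calculation works (a minimal sketch; the citation counts below are invented, and the real tools compute this from their own databases):

```python
def h_index(citation_counts):
    """Return the largest h such that h papers have >= h citations each."""
    # Sort citation counts from highest to lowest.
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Nine papers with invented citation counts: seven of them have been
# cited at least seven times each, so the h-index is 7.
print(h_index([25, 19, 12, 10, 8, 7, 7, 3, 1]))  # -> 7
```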
Google Scholar uses a variant of the h-index (the h5-index, calculated over the previous five years) for its journal rankings.
Altmetrics measure 'mentions' of your research (anything with a DOI) across social media, blogs, news reports and websites. They can be a powerful tool for understanding how your research reaches beyond citations in academic sources.
Two tools you can use to check your alternative metrics are Plum Analytics (in Scopus) and Altmetric (in Symplectic).
Plum Analytics monitors five categories of engagement: Citations, Usage, Captures, Mentions and Social Media.
Altmetric uses a “donut” of different attention types. You can see your donuts on Symplectic or next to each record in the repository. Each coloured strand represents a different type of 'mention' and Altmetric calculates an overall score, giving more weight to news reports and adjusting for duplication. Clicking on the donut will take you to a detailed page where you can see each of these mentions and where they originate. This enables you to monitor how your research is disseminated.
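As a rough sketch of how a weighted attention score of this kind works (the weights and counts below are invented; Altmetric's actual weighting and de-duplication are more involved):

```python
# Toy weighted attention score: each type of mention carries a
# different weight, with news counting for more than a tweet.
weights = {"news": 8, "blog": 5, "tweet": 1}     # invented weights
mentions = {"news": 2, "blog": 1, "tweet": 30}   # hypothetical counts

score = sum(weights[kind] * count for kind, count in mentions.items())
print(score)  # 2*8 + 1*5 + 30*1 = 51
```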
Look out for the Altmetric "donut" on your records in Symplectic: from your publications list, click on the Metrics tab.