Using bibliometrics responsibly

Loughborough University recognises the importance of using bibliometrics responsibly. To this end, Loughborough University's Statement on the Responsible Use of Metrics was approved by Senate on 16 November 2016. The full Senate paper is available to members of the University, and the main statement is reproduced below. The external environment around the use of metrics continues to develop, and whilst this is Loughborough University's first clear statement on the matter, it is unlikely to be its last.

1.      Preamble

Loughborough University is proud of its achievements in research to date and has ambitious plans for the future in line with the ‘Building Excellence’ strategy. The quality of our research clearly affects the academic, social, economic and environmental impact it has. Maximising the visibility of our research is equally important to delivering that impact, and bibliometric indicators are currently attracting much attention in both regards. As a university, we are keen to improve the quality and visibility of our research. While recognising their limitations, particularly in certain discipline areas, we also recognise that bibliometric indicators can be a helpful tool in monitoring progress against this goal. Furthermore, we recognise that external assessments of our research quality already use bibliometric indicators, and we might reasonably expect such use to increase in future. Relative to our peers, however, Loughborough does not perform as well on bibliometric indicators, even when they are field-weighted. In considering this, we have observed certain relationships: for example, publishing in journals characterised by high SNIP or SJR values, and publishing with international co-authors, correlate well with citation performance. This indicates how choices that are not directly related to output quality can have an important effect on output visibility, and we should seek all means possible to maximise the visibility of our research.

While seeking to establish an agreed set of indicators for a variety of uses, including review at the individual and institutional levels, we are also committed to using bibliometric indicators sensibly and responsibly. The Leiden Manifesto for Research Metrics (Hicks et al, 2015) outlines ten principles for responsible research evaluation and Loughborough University subscribes to these principles as outlined below.

2.      Responsible research evaluation: the ten principles of the Leiden Manifesto in a Loughborough context.

1) Quantitative evaluation should support qualitative, expert assessment. 

Loughborough University recognises the value of quantitative indicators (where available) to support qualitative, expert peer review. Indicators may be used in a variety of processes, including recruitment, probation, reward, promotion, development appraisal and performance review, but they will not supplant expert assessment of both research outputs and the context in which they sit. Similarly, indicators may be used for collective assessments at levels from research units to the institution as a whole.

2) Measure performance against the research missions of the institution, group or researcher. 

The “Raising Standards and Aspiration” theme of the University strategy drives our ambition to deliver research of the highest quality. At the same time, the visibility of our research is critical to maximising its impact on the communities it serves, in line with the “Growing capacity and influence” theme. To this end, indicators around the quality of the outlet (journal or conference), levels of collaboration and the citedness of outputs are helpful in monitoring progress against these strategy themes. Each School, with support from Research Committee, will set goals within an agreed framework that accommodates variation in missions and in the effectiveness of indicators.

3) Keep data collection and analytical processes open, transparent and simple. 

There is a balance to be struck between simple, transparent indicators that may disadvantage some groups and more complex indicators that normalise for differences but are harder for researchers to replicate. Research Committee will work to ensure that the indicators used support the ambitions of each School, as set out within Research Action Plans, and of the institution as a whole. To this end, and in consultation with the Pro Vice-Chancellor (Research), Schools will be able to select the indicators used to support evaluation of their publication performance at the individual and collective levels. A list of relevant indicators, with their advantages, disadvantages and potential uses, is provided. Indicators selected should be used consistently across all areas of research performance monitoring.

4) Allow those evaluated to verify data and analysis. 

The publication and citation tools used to collect and monitor research publication data at Loughborough University will continue to be made openly available.  Academics are therefore able to see the data relating to themselves, and to make corrections where necessary.  Staff managing publication systems will also endeavour to ensure that data are as accurate and robust as possible.

5) Account for variation by field in publication and citation practices. 

It is recognised that research practices vary widely across disciplines and that bibliometric indicators serve some disciplines better than others. For example, citation tools currently cover only journal and conference outputs, not monographs or other forms of output. International collaboration indicators will be less relevant in disciplines where academics tend to publish alone rather than in teams. In line with best practice, indicators will be normalised wherever appropriate and based on percentiles rather than averages where a single outlier can skew the numbers, as the sketch below illustrates. The availability or otherwise of bibliometric data will not drive our decision-making about research activities and priorities, either individually or collectively.
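
A minimal sketch of the outlier point, using invented citation counts and an assumed field threshold (neither drawn from any real Loughborough data): a single highly cited paper dominates the mean but barely moves the median or a top-decile share.

    import statistics

    # Invented citation counts for a small research unit; the last value
    # is a single highly cited outlier.
    citations = [3, 5, 6, 8, 9, 11, 12, 450]

    print(statistics.mean(citations))    # 63.0 - dominated by the outlier
    print(statistics.median(citations))  # 8.5  - barely affected by it

    # A percentile-style indicator asks instead what share of outputs sits
    # above a field-specific threshold (here assumed to be 40 citations,
    # standing in for the field's 90th-percentile count).
    FIELD_THRESHOLD = 40  # assumed value, for illustration only
    share_highly_cited = sum(c >= FIELD_THRESHOLD for c in citations) / len(citations)
    print(share_highly_cited)            # 0.125 - one output in eight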

6) Protect excellence in locally relevant research. 

It is recognised that most citation counting tools are inherently biased towards English-language publications.  It is important that academics producing work in languages other than English are not penalised for this.

7) Base assessment of individual researchers on a qualitative judgement of their portfolio.

Loughborough University acknowledges that indicators are affected by career stage, gender and discipline, and will seek to take these factors into account when interpreting metrics. It is also recognised that academics undertake a wide range of research communication activities, not all of which can be easily measured or benchmarked. When assessing the performance of individuals, consideration will be given to as wide a view of their expertise, experience, activities and influence as possible.

8) Avoid misplaced concreteness and false precision. 

Where possible, Loughborough University commits to using multiple indicators to provide a more robust and wide-ranging picture. We will also avoid false precision: journal metrics, for example, may be published to three decimal places to avoid ties, but given the limitations of citation counts it makes no sense to distinguish between entities on the basis of such small differences (see the sketch below).
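
A minimal sketch of the false-precision point, using invented scores and an assumed uncertainty (not a statement about any real measurement error): two values that differ only at the third decimal place should be treated as tied.

    # Invented field-weighted citation scores for two departments.
    score_a, score_b = 1.342, 1.339

    # Printed to three decimal places they look different...
    print(f"{score_a:.3f} vs {score_b:.3f}")   # 1.342 vs 1.339

    # ...but citation counts are noisy. With an assumed uncertainty of
    # +/-0.05 on each score (illustrative only), the gap is meaningless.
    UNCERTAINTY = 0.05
    meaningful = abs(score_a - score_b) > 2 * UNCERTAINTY
    print(meaningful)                          # False - treat them as tied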

9) Recognize the systemic effects of assessment and indicators. 

It is accepted that any measurement can, in itself, affect the system it is used to assess through the incentives it inevitably establishes. To minimise such effects, a suite of indicators will be used wherever practical.

10) Scrutinize indicators regularly and update them. 

As the research activity of the University and the external environment develop, the bibliometric indicators we use will be revisited and revised where appropriate. This will be the responsibility of the Pro Vice-Chancellor (Research).

FAQ

Q1: My probation advisor has said that only articles published in journals in the top 25% by SNIP would count towards meeting my probation target. Doesn't that run counter to our Responsible Metrics Policy?

A: Yes. Passing probation depends on your producing high-quality outputs as judged by peer review, not on publishing in high-visibility journals as indicated by SNIP or SJR. However, while it is not a probation requirement, there is still a general expectation that you will publish your work in outlets that meet our journal selection guidance on readership, rigour and reach, as outlined in Q4 below. Journal metrics such as SNIP and SJR can indicate rigour and reach, and Schools are at liberty to provide guidance on this.

Q2: My PDR reviewer has suggested I might want to target some journals with higher SNIP/SJR values - isn’t that an infringement of the Responsible Metrics policy?

A: No. Under our Responsible Metrics policy, the quality of a journal article will not be judged by the journal in which it is published. However, that is not to say that your articles would not achieve greater visibility and impact if placed in journals with higher SNIP and SJR values. Improving the visibility of Loughborough’s research outputs is important because it can maximise the impact, and thereby the citation, of your work. It is perfectly legitimate for your PDR reviewer to suggest target journals with higher SNIP/SJR values, in line with your School’s recommendations on this topic.

Q3: My Research Theme Lead has provided a recommended list of journals for us to target, but I’d rather publish somewhere else. Can I do so?

A: Yes. Loughborough University seeks to provide guidance around publication strategies that might help academics choose where to publish for maximum visibility and impact. However, it is only guidance. The ultimate decision about where to publish rests with the academic. There will be no consequences for academics who do not take this advice, as long as alternatives are selected for sound reasons.

Q4: How can I responsibly use journal metrics when choosing where to publish?

A: Loughborough's general advice on choosing a journal is given below. However, Schools and sub-disciplines may have built on this advice by providing their own expertly curated lists to help you make your decision.

When choosing a journal, you may wish to follow these three simple steps - in this order:

  1. Readership. Is the journal a place where your target audience will find, read and use or cite your work?
  2. Rigour.  Does the journal have high editorial standards, offer rigorous peer review and an international editorial board?
  3. Reach.  Is it a visible outlet, with international reach and a permissive open access policy or option?

Field-normalised journal citation metrics such as SJR and SNIP may offer a window onto #3 (visibility and reach). In some disciplines they may also offer a window onto #2 (rigour). They do not offer a window onto #1 (readership).
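
Where a School does use SNIP or SJR as a rough guide to reach, comparisons should be made within a field. A minimal sketch, using invented SNIP values for a single subject category and an assumed candidate journal (no real journals are represented): check whether the candidate falls in the top quartile of its peers.

    import statistics

    # Invented SNIP values for journals in one subject category. SNIP is
    # already source-normalised, but quartiles should still be computed
    # within a coherent set of peer journals.
    field_snips = [0.4, 0.6, 0.7, 0.9, 1.0, 1.1, 1.3, 1.6, 1.9, 2.4]
    candidate_snip = 1.4  # assumed value for the journal being considered

    q1, q2, q3 = statistics.quantiles(field_snips, n=4)
    print(q3)                    # 1.675 - the 75th-percentile cut point
    print(candidate_snip >= q3)  # False - not in the top quartile here

Even then, such a check is a screening aid rather than a verdict on quality; under principle 1, expert judgement still leads.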