Responsible use of research metrics

Loughborough University recognises the importance of using metrics responsibly. To this end, the University’s first statement on the responsible use of bibliometrics was approved by Senate on 16 November 2016. The original Senate paper is available to members of the university. In line with our commitment to keep this statement under review, a revised and expanded version, encompassing the use of all research metrics, was approved by Senate on 13 November 2019. The text of the revised statement is available below.

Preamble

Loughborough University is proud of its achievements in research to date and has ambitious plans for the future in line with the university’s CALIBRE (Collective Ambition at Loughborough for Building Research Excellence) Strategy. Seeking to enhance Loughborough’s research quality, impact and visibility, while celebrating research excellence wherever it is found, is core to the CALIBRE approach. As such, understanding and monitoring the University’s progress against these ambitions are key to delivering success. While recognising their limitations, particularly in certain discipline areas, we believe that research indicators can be a helpful tool in this monitoring activity. Furthermore, we recognise that external assessments of our research quality already use a range of research indicators, and we might reasonably expect such use to increase in future.

While seeking to establish an agreed set of indicators for a variety of uses, including review at the individual and institutional levels, we are also committed to using research indicators sensibly and responsibly. The Leiden Manifesto for Research Metrics (Hicks et al., 2015) outlines ten principles for the responsible use of bibliometrics in research evaluation, and Loughborough University’s first responsible metrics policy was based largely on these principles. This revised policy seeks to extend these principles to the use of all research metrics, as outlined below.

Statement on the responsible use of research metrics at Loughborough University

1) Quantitative evaluation should support qualitative, expert assessment.

Loughborough University recognises the value of quantitative indicators (where available) to support qualitative, expert peer review. Indicators may be used in a variety of individual-level processes, including recruitment, probation, reward, promotion, development appraisal and performance review, but they will not supplant expert peer assessment of research outputs, research applications and awards, PGR supervisions, or any other research activity. Similarly, whilst indicators may be used for collective assessments at levels from research units to the institution as a whole, expert peer assessment will guide their interpretation.

2) Measure performance against the research missions of the institution, group or researcher.

The “Raising Standards and Aspiration” theme of the University strategy drives our ambition to deliver research of the highest quality, impact and visibility. To this end, indicators around research applications and awards, the visibility and accessibility of outputs, collaboration levels and the citedness of outputs are helpful in monitoring progress against these strategy themes. Goals will be set by each School, with support from Research Committee, within an agreed framework that accommodates variation in missions and in the effectiveness of indicators. Individuals will be supported to set their own research plans in line with School aspirations through the academic PDR process, and progress against those plans will be monitored through PDR.

3) Keep data collection and analytical processes open, transparent and simple.

There is a balance to be struck between simple, transparent indicators, which may disadvantage some groups, and more complex indicators that normalise for differences but are harder for researchers to replicate. Research Committee will work to ensure that the indicators used support the ambitions of each School, as set out within Research Action Plans, and of the institution as a whole. To this end, and in consultation with the Pro Vice-Chancellor (Research), Schools will be able to select the indicators used to support evaluation of their performance at the individual and collective levels. Indicators selected should be used consistently across all areas of research performance monitoring. A list of relevant indicators, with their advantages, disadvantages and potential usage, is supplied.

Whilst the university aspires to keep data collection and analytical processes open, transparent and simple, there is a tension between openness and privacy in relation to the availability of data that might be used for individual performance monitoring. Individual-level research performance data that is not publicly available will be available only to that individual, their Dean, ADR and PDR reviewer.

4) Allow those evaluated to verify data and analysis.

The publication and citation tools used to collect and monitor research publication data at Loughborough University will continue to be made openly available. Research applications and awards data, impact case studies, PGR supervisions, and other evidence of research activity as recorded on university systems will be made available to those staff to whom they relate, plus their PDR Reviewer, via the annual PDR process. Deans and ADRs will have access to individual-level data for monitoring purposes. Academics are therefore able to see the data relating to themselves, and to request corrections where necessary.  Anyone wishing to see the data relating to them should email researchpolicy@lboro.ac.uk in the first instance.  Staff managing research activity recording systems will endeavour to ensure that data are as accurate and robust as possible.

5) Account for variation by field in research practices.

It is recognised that research practices vary widely between disciplines and that indicators, particularly bibliometric indicators, serve some disciplines better than others. For example, citation tools currently cover only journal and conference outputs, not monographs or other forms of output. International collaboration indicators will be less relevant to disciplines where academics tend to publish alone rather than in teams. Research applications and awards levels will also vary significantly by field. In line with best practice, indicators will be normalised wherever appropriate and based on percentiles rather than averages where a single outlier can skew the numbers. The availability or otherwise of data will not drive our decision-making about research activities and priorities, either individually or collectively. In particular, in recognition of the fact that most citation-counting tools are inherently biased towards English-language publications, the university will seek to ensure that academics producing work in languages other than English are not penalised for this.
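As a purely illustrative sketch (Python, using made-up citation counts and an assumed field threshold, none of which come from this policy), the example below shows why a percentile-style indicator is more robust than a simple average when a single highly cited output is present:

```python
# Illustrative only: made-up citation counts for ten outputs in one unit.
# A single highly cited paper dominates the mean but not the median or a
# percentile-style share indicator.
from statistics import mean, median

citations = [0, 1, 2, 2, 3, 3, 4, 5, 6, 250]  # hypothetical data

print(f"Mean citations:   {mean(citations):.1f}")    # 27.6, pulled up by one outlier
print(f"Median citations: {median(citations):.1f}")  # 3.0

# Percentile-style indicator: share of outputs at or above an assumed field
# benchmark (e.g. a top-10% citation threshold for the field). The threshold
# here is invented for illustration.
field_top10_threshold = 20
share_top10 = sum(c >= field_top10_threshold for c in citations) / len(citations)
print(f"Share of outputs above the assumed field top-10% threshold: {share_top10:.0%}")
```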

6) Base assessment of individual researchers on a qualitative judgement of their portfolio.

Loughborough University recognises that academics undertake a wide range of research activities, not all of which can be easily measured or benchmarked.  It is also aware that indicators are affected by career stage, gender and discipline and may not serve everybody equally.  To this end, when assessing the performance of individuals, consideration will be given to as wide a view of their expertise, experience, activities and influence as possible, through peer review, in line with the relevant research aspirations of the individual, school and/or university.

7) Avoid misplaced concreteness and false precision.

Where possible, Loughborough University commits to using multiple indicators to provide a more robust and wide-ranging picture of the research activities it values. We will avoid false precision: for example, metrics will not be published to three decimal places, and undue weight will not be placed on values without considering the sample size they represent or the use of rolling averages.
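As an illustration of this point (a hypothetical reporting helper, not a university tool), the sketch below rounds an indicator to a sensible precision and flags values built on small samples rather than quoting them to three decimal places:

```python
# Illustrative only: report a hypothetical indicator at sensible precision and
# flag small samples, rather than quoting spurious decimal places.
def report_indicator(name: str, value: float, n_outputs: int, min_sample: int = 20) -> str:
    """Format an indicator value for reporting (hypothetical convention)."""
    rounded = round(value, 1)  # one decimal place, not three
    note = "" if n_outputs >= min_sample else f" (small sample: n={n_outputs}, interpret with caution)"
    return f"{name}: {rounded}{note}"

# Hypothetical values for two units; names and figures are invented.
print(report_indicator("Unit A field-weighted citation impact", 1.2345, n_outputs=150))
print(report_indicator("Unit B field-weighted citation impact", 1.8765, n_outputs=6))
```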

8) Recognize the systemic effects of assessment and indicators.

It is accepted that any measurements can, in themselves, affect the system they are used to assess through the inevitable incentives they establish. To minimise such effects, we will regularly reassess our indicators in the light of our research ambitions, consider any unintended consequences of those indicators and update them to ensure they incentivise appropriate behaviours. To mitigate negative impacts, a suite of indicators will be used wherever practical.

9) Scrutinize indicators regularly and update them.

As the research activity of the University and the external environment develop, the research indicators used will be revisited and revised where appropriate. This will be the responsibility of the Pro Vice-Chancellor (Research).

10) Ensure those who generate and interpret research metrics do so in line with the university’s responsible metrics policy.

Training in the responsible use of metrics will be provided to all staff who generate and interpret research indicators to ensure understanding of, and adherence to, this policy.

FAQs

Q1: My probation advisor has said that only articles published in journals in the top 25% by SNIP would count towards meeting my probation target. Doesn’t that run counter to our Responsible Metrics Policy?

A: Yes. Passing probation depends on your producing high-quality outputs as judged by peer review, not on publishing in high-visibility journals as indicated by SNIP or SJR. However, while it is not a probation requirement, there is still a general expectation that you will publish your work in outlets that meet our journal selection guidance based on readership, rigour and reach, as outlined in Q4 below. Journal metrics such as SNIP and SJR can indicate rigour and reach, and Schools are at liberty to provide guidance on this.

Q2: My PDR reviewer has suggested I might want to target some journals with higher SNIP/SJR values - isn’t that an infringement of the Responsible Metrics policy?

A: No. Under our Responsible Metrics policy, the quality of a journal article will not be judged by the journal in which it is published. However, that is not to say that your journal articles wouldn’t achieve greater visibility and impact if they were placed in journals with higher SNIP/SJR values. Improving the visibility of Loughborough’s research outputs is important because it can maximise the impact, and thereby increase citations, of your work. It is perfectly legitimate for your PDR reviewer to suggest target journals with higher SNIP/SJR values in line with your School’s recommendations on this topic.

Q3: My Research Theme Lead has provided a recommended list of journals for us to target, but I’d rather publish somewhere else. Can I do so?

A:  Yes. Loughborough University seeks to provide guidance around publication strategies that might help academics choose where to publish for maximum visibility and impact.  However, it is only guidance.  The ultimate decision as to where an academic chooses to publish belongs to them.  There will be no consequences for academics who do not take this advice as long as alternatives are selected for sound reasons.

Q4: How can I responsibly use journal metrics when choosing where to publish?

A: Loughborough's general advice around choosing a journal is given below.  However, Schools and sub-disciplines may have built on this advice by providing their own expertly-curated lists that might help you make your decision.

When choosing a journal, you may wish to follow these three simple steps - in this order:

  1. Readership. Is the journal a place where your target audience will find, read and use/cite your work?
  2. Rigour.  Does the journal have high editorial standards, offer rigorous peer review and an international editorial board?
  3. Reach.  Is it a visible outlet, with international reach and a permissive open access policy or option?

Field-normalised journal citation metrics such as SJR and SNIP may offer a window onto #3 (visibility and reach). In some disciplines they may also offer a window onto #2 (rigour). They do not offer a window onto #1 (readership).