Responsible research assessment

Loughborough University recognises the importance of assessing research responsibly. To this end, Loughborough University's first statement on the responsible use of bibliometrics was approved by Senate on 16 November 2016; the original Senate paper is available to members of the university. In line with our commitment to keep this statement under review, a revised and expanded version, encompassing the use of all research metrics, was approved by Senate on 13 November 2019. The current version, approved by Senate on 16 June 2023, has been expanded to cover all forms of research assessment, both quantitative and qualitative. The text of the revised statement is available below.


Loughborough University is proud of its achievements in research to date and has ambitious plans for the future in line with the university’s Creating Better Futures Together Strategy. Seeking to enhance Loughborough’s research quality, impact, and visibility, while celebrating the many and varied dimensions of research and innovation activity, is core to Loughborough’s vision. As such, understanding and monitoring the University’s progress against these ambitions are key to delivering success. While recognising their limitations, particularly in certain discipline areas, we believe that research indicators, amongst other research assessment approaches, can be a helpful tool in this monitoring activity. Furthermore, we recognise that external assessments of our research quality already use a range of research assessment approaches, including indicators.

While seeking to establish an agreed set of indicators for a variety of uses, including review at the individual and institutional levels, we have always been committed to using research indicators sensibly and responsibly. The Leiden Manifesto for Research Metrics (Hicks et al., 2015) outlines ten principles for the responsible use of bibliometrics in research evaluation, and the first two iterations of Loughborough University’s responsible metrics policy were based largely on these principles. This revised policy expands its scope from research metrics to research assessment more generally, to encompass the broader expectations of the CoARA Agreement on Reforming Research Assessment and the revised Declaration on Research Assessment, to both of which we are signatory. It is based on the INORMS SCOPE Framework, a simple and memorable acronym that encompasses the five key elements of responsible research assessment.

Our CoARA Action Plan is also available to view.

Statement on responsible research assessment at Loughborough University

START with what you value

Loughborough University is a value-led organization and as such expects any research evaluation exercise to be performed in line with its values. Research assessments should:

Measure performance against the research and innovation missions of the institution, group, or researcher.

The University strategy aims to deliver “Ambitious Research and Innovation” and “Meaningful Partnerships”; to be “Diverse, Equitable and Inclusive” and “Internationally Engaged”. To this end, indicators around the diversity of our research populations, research applications and awards, the visibility and accessibility of outputs, and collaboration levels are helpful in monitoring progress against these strategy aims.  Working within an agreed framework that accommodates variation in missions and the effectiveness of indicators, goals will be set by each School with support from Research & Innovation Committee. Individuals will be supported to set their own research plans in line with school aspirations through the academic PDR process and progress against those plans will be monitored through PDR.

Recognise the diversity of contributions to, and careers in, research and innovation in accordance with the needs and nature of the activity

The University is a signatory to the Researcher Careers Concordat and the HR Excellence in Research Award, and as such is committed to celebrating a diverse range of contributions to, and careers in research and innovation. The academic promotions process has recently been updated to recognize a broader range of scholarly career pathways and within those pathways to celebrate a broader range of activities and outputs. Assessments of research and innovation activity will seek to give due consideration to the diverse range of research characteristics, qualities, activities and outcomes that we value.

CONTEXT Considerations

The University recognizes that appropriate assessment approaches will vary according to context. To this end any assessment will:

Consider the purpose and the subject of an assessment

There are six main reasons for assessing research: Analysis (assessing to ‘understand’), Advocacy (assessing to ‘show off’), Accountability (assessing to ‘monitor’), Acclaim (assessing to ‘benchmark’), Adaptation (assessing to ‘incentivise’) and Allocation (assessing to ‘reward’). Assessments may also be undertaken at the level of individual researchers, research groups, schools, or institutions. Both the purpose and the subject of an assessment will have a bearing on the appropriateness of the methods used to perform that assessment. For example, celebrating the number of open access outputs we have as an institution in order to advocate for open research practices carries much lower risk than holding research groups to account on their open access numbers, where there may be good reasons for differences in the data. Any assessment at the level of individual researchers that might lead to some form of allocation, such as workload, prizes or promotion, should be designed with particular care to ensure everyone has a fair opportunity to succeed.

Account for variation by field in research and innovation practices.

It is recognised that research practices vary widely between disciplines and that indicators, particularly bibliometric indicators, serve some disciplines better than others. For example, citation tools currently cover only journal and conference outputs, not monographs or other forms of output. International collaboration indicators will be less relevant to disciplines where academics tend to publish alone rather than in teams. Research applications and awards levels will also vary significantly by field. In line with best practice, indicators will be normalised wherever appropriate and based on percentiles rather than averages, where a single outlier can skew the numbers. The availability or otherwise of data will not drive our decision making about research activities and priorities, either individually or collectively. In recognition of the fact that most citation counting tools are inherently biased towards English-language publications, the university will seek to ensure that academics producing work in languages other than English are not penalised for this.
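The point about percentiles versus averages can be illustrated with a small sketch. The figures below are entirely hypothetical, chosen only to show how one outlier distorts a mean while leaving a percentile-based measure (here, the median) largely untouched:

```python
# Illustrative sketch only: hypothetical citation counts for two small
# groups of outputs. One highly cited outlier inflates the mean of
# group_a; the median gives a more robust picture of a typical output.
group_a = [2, 3, 4, 5, 250]    # one outlier at 250 citations
group_b = [10, 12, 14, 16, 18]

def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

print(mean(group_a), median(group_a))  # 52.8 4  (mean dominated by the outlier)
print(mean(group_b), median(group_b))  # 14.0 14 (mean and median agree)
```

By the mean alone, group_a looks far stronger than group_b, yet its typical output is cited far less; this is the distortion that percentile-based indicators are intended to avoid.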

OPTIONS for evaluating

Whilst metrics and indicators are often criticized for lacking the nuance of expert judgement, peer review can also be problematic and open to bias. At Loughborough we will seek to adopt the most appropriate method for the evaluation in hand and to combine qualitative and quantitative approaches to achieve the best possible assessment. To achieve this, we will:

Ensure quantitative evaluation is used in combination with qualitative, expert assessment.

Loughborough University recognises the value of quantitative indicators (where available) to support qualitative, expert peer review. Indicators may be used in a variety of individual-level processes including recruitment, probation, reward, promotion, development appraisal and performance review but indicators will not supplant expert peer assessment of research outputs, research applications and awards, PGR supervisions, or any other research activity.  Similarly, whilst indicators may be used for collective assessments at levels from research units to the institution as a whole, expert peer assessment will guide their interpretation.

Base assessment of individual researchers on a qualitative judgement of their portfolio.

Loughborough University recognises that academics undertake a wide range of research activities, not all of which can be easily measured or benchmarked.  It is also aware that indicators are affected by career stage, gender and discipline and may not serve everybody equally.  To this end, when assessing the performance of individuals, consideration will be given to as wide a view of their expertise, experience, activities and influence as possible, through peer review, in line with the relevant research aspirations of the individual, school and/or university.

Avoid the use of rankings of research organisations in research assessment

Loughborough University is a signatory to the More Than Our Rank initiative and recognizes that the rank of the research organisations in which researchers may have studied or worked says nothing about their quality, importance, or value as researchers. As such, Loughborough commits not to use rankings of research organisations in the assessment of researchers for recruitment, promotion, appraisal, or any other purpose.

Keep data collection and analytical processes open, transparent, and simple.

There is a balance to be struck between simple transparent indicators, that may disadvantage some groups, and more complex indicators that normalize for differences but are harder for researchers to replicate. Research & Innovation Committee will work to ensure that indicators used support the ambitions of each School, as set out within Principal Research and Impact Ambition (PRIMA) statements, and of the institution as a whole. To this end and in consultation with the Pro Vice-Chancellor (Research & Innovation), Schools will be able to select the indicators used to support evaluation of their performance at the individual and collective levels. Indicators selected should be used consistently across all areas of research performance monitoring.

Whilst the university aspires to keep data collection and analytics processes open, transparent and simple, there is a tension between openness and privacy in relation to the availability of data that might be used for individual performance monitoring. Individual level research performance data that is not publicly available will be available only to that individual, their Dean, ADR&I and PDR reviewer.*

*Please note that the Annual Output Review Framework states that Output Review scores and feedback will only be used in internal assessment processes such as promotion, probation, and PDR at the researcher's own discretion.

Avoid misplaced concreteness and false precision. 

Where possible, Loughborough University commits to using multiple indicators and assessment approaches to provide a more robust and wide-ranging picture of the research activities it values. Indicators will avoid false precision: for example, we will not publish metrics to three decimal places, and we will not place undue weight on values without considering the sample sizes they represent or the effect of rolling averages.
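A brief sketch of why decimal places can mislead. The figures are hypothetical: two shares of open access outputs that print identically to three decimal places, even though one rests on 3 outputs and the other on 1,000, and so carry very different uncertainty:

```python
# Illustrative sketch only: the same headline share can rest on very
# different sample sizes, so three-decimal precision can imply an
# accuracy the underlying data cannot support.
import math

samples = [(2, 3), (667, 1000)]  # (open access outputs, total outputs)

for oa, total in samples:
    share = oa / total
    # Approximate standard error of a proportion: sqrt(p * (1 - p) / n)
    se = math.sqrt(share * (1 - share) / total)
    print(f"{oa}/{total} = {share:.3f} (standard error roughly {se:.3f})")
```

Both lines report a share of 0.667, but the small sample's standard error is roughly 0.27 against roughly 0.015 for the large one, so the three decimal places say nothing alike in the two cases.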

PROBE assessments for unintended consequences

Loughborough University recognizes that research assessments can sometimes lead to unintended consequences or discriminatory effects. To this end, we commit to:

Recognize the systemic effects of assessment and indicators. 

It is accepted that any measurements can, in themselves, affect the system they are used to assess through the inevitable incentives they establish.  To minimize such effects, we will regularly reassess our indicators in the light of our research ambitions, consider any unintended consequences of those indicators and update them to ensure they incentivise appropriate behaviours. To mitigate negative impacts a suite of indicators will be used, wherever practical.

Run an Equality Impact Assessment on any new research and innovation evaluation approach

In line with our institutional commitment to EDI, an Equality Impact Assessment (EIA) will be conducted when a new form of research assessment is introduced or significantly re-designed. The outcomes of that EIA will inform the ongoing design, development and/or interpretation of that research assessment.

EVALUATE responsibly

To ensure our approach to research evaluation remains current, appropriate, and fair, Loughborough University commits to:

Evaluate with the evaluated

Wherever possible, research assessments will be designed and interpreted with the communities under evaluation. In particular, we will ensure that those evaluated will be able to verify any analysis and underlying data that relates to them. The publication and citation tools used to collect and monitor research publication data at Loughborough University will continue to be made openly available. Research applications and awards data, impact case studies, PGR supervisions, and other evidence of research activity as recorded on university systems will be made available to those staff to whom they relate, plus their PDR Reviewer, via the annual PDR process. Deans and ADR&Is will have access to individual-level data for monitoring purposes. Academics are therefore able to see the data relating to themselves, and to request corrections where necessary.  Anyone wishing to see the data relating to them should email in the first instance. 

Scrutinize indicators regularly and update them. 

As the research activity of the University and the external environment develop, the research indicators used will be revisited and revised where appropriate. This will be the responsibility of the Pro Vice-Chancellor (Research & Innovation).

Ensure those who generate and interpret research metrics do so in line with the university’s responsible metrics policy

Training in the responsible use of metrics will be provided to all staff who generate and interpret research indicators to ensure understanding of, and adherence to, this policy. Staff managing research activity recording systems will endeavour to ensure that data are as accurate and robust as possible.


Q1: My probation advisor has said that only articles published in journals in the top 25% by SNIP would count towards meeting my probation target. Doesn't that run counter to our Responsible Metrics Policy?

A: Yes. Passing probation depends on your producing high quality outputs as judged by peer review, not on publishing in high visibility journals as indicated by SNIP or SJR. However, while it is not a probation requirement, there is still a general expectation that you publish your work in outlets that meet our journal selection guidance, based on readership, rigour and reach, as outlined in Q4 below. Journal metrics such as SNIP and SJR can indicate rigour and reach, and Schools are at liberty to provide guidance on this.

Q2: My PDR reviewer has suggested I might want to target some journals with higher SNIP/SJR values - isn’t that an infringement of the Responsible Metrics policy?

A: No. Under our RM policy, the quality of a journal article will not be judged by the journal in which it is published. However, that is not to say that your journal articles wouldn’t achieve greater visibility and impact if placed in journals with higher SNIP and SJR values. Improving the visibility of Loughborough’s research outputs is important because it can maximise the impact of your work and thereby increase its citations. It is perfectly legitimate for your PDR reviewer to suggest target journals with higher SNIP/SJR values in line with your School’s recommendations on this topic.

Q3: My Research Theme Lead has provided a recommended list of journals for us to target, but I’d rather publish somewhere else. Can I do so?

A:  Yes. Loughborough University seeks to provide guidance around publication strategies that might help academics choose where to publish for maximum visibility and impact.  However, it is only guidance.  The ultimate decision as to where an academic chooses to publish belongs to them.  There will be no consequences for academics who do not take this advice as long as alternatives are selected for sound reasons.

Q4: How can I responsibly use journal metrics when choosing where to publish?

A: Loughborough's general advice around choosing a journal is given below. However, Schools and sub-disciplines may have built on this advice by providing their own expertly curated lists that might help you make your decision.

When choosing a journal, you may wish to follow these three simple steps - in this order:

  1. Readership. Is the journal a place where your target audience will find, read and use/cite your work?
  2. Rigour.  Does the journal have high editorial standards, offer rigorous peer review and an international editorial board?
  3. Reach.  Is it a visible outlet, with international reach and a permissive open access policy or option?

 Field-normalised journal citation metrics such as SJR and SNIP may offer a window onto #3 (visibility and reach).  In some disciplines they may also offer a window onto #2 (rigour).  They do not offer a window onto #1 (readership).