Scholarly Communications

Assessment

Using bibliometrics responsibly

Loughborough University recognises the importance of using bibliometrics responsibly. To this end, Loughborough University's Statement on the Responsible Use of Metrics was approved by Senate on 16 November 2016. The full Senate paper is available to members of the University, and the main statement is reproduced below. The external environment around the use of metrics continues to develop, and whilst this is Loughborough University's first clear statement on the matter, it is unlikely to be its last.

1.      Preamble

Loughborough University is proud of its achievements in research to date and has ambitious plans for the future in line with the ‘Building Excellence’ strategy. The quality of our research clearly affects its academic, social, economic and environmental impact. Maximising the visibility of our research is equally important to delivering that impact, and bibliometric indicators are currently attracting much attention in both regards. As a university, we are keen to improve the quality and visibility of our research. While recognising their limitations, particularly in certain discipline areas, we also recognise that bibliometric indicators can be a helpful tool in monitoring progress against this goal. Furthermore, we recognise that external assessments of our research quality already use bibliometric indicators, and we might reasonably expect such use to increase in future.

Relative to our peers, however, Loughborough does not perform as well on bibliometric indicators, even when they are field-weighted. In considering this, we have observed certain relationships: for example, publishing in journals characterised by high SNIP or SJR values, and publishing with international co-authors, both correlate well with citation performance. This indicates how choices that are not directly related to output quality can have an important effect on output visibility, and we should seek all means possible to maximise the visibility of our research.

While seeking to establish an agreed set of indicators for a variety of uses, including review at the individual and institutional levels, we are also committed to using bibliometric indicators sensibly and responsibly. The Leiden Manifesto for Research Metrics (Hicks et al., 2015) outlines ten principles for responsible research evaluation, and Loughborough University subscribes to these principles as outlined below.

2.      Responsible research evaluation: the ten principles of the Leiden Manifesto in a Loughborough context.

1) Quantitative evaluation should support qualitative, expert assessment. 

Loughborough University recognises the value of quantitative indicators (where available) to support qualitative, expert peer review. Indicators may be used in a variety of processes including recruitment, probation, reward, promotion, development appraisal and performance review but indicators will not supplant expert assessment of both research outputs and the context in which they sit.  Similarly, indicators may be used for collective assessments at levels from research units to the institution as a whole.

2) Measure performance against the research missions of the institution, group or researcher. 

The “Raising Standards and Aspiration” theme of the University strategy drives our ambition to deliver research of the highest quality. At the same time, the visibility of our research is critical to maximising its impact on the communities it serves, in line with the “Growing capacity and influence” theme. To this end, indicators around the quality of the outlet (journal or conference), collaboration levels and citedness of outputs are helpful in monitoring progress against these strategy themes.  Working within an agreed framework that accommodates variation in missions and the effectiveness of indicators, goals will be set by each School with support from Research Committee.

3) Keep data collection and analytical processes open, transparent and simple. 

There is a balance to be struck between simple, transparent indicators that may disadvantage some groups, and more complex indicators that normalize for differences but are harder for researchers to replicate. Research Committee will work to ensure that the indicators used support the ambitions of each School, as set out within Research Action Plans, and of the institution as a whole. To this end, and in consultation with the Pro Vice-Chancellor (Research), Schools will be able to select the indicators used to support evaluation of their publication performance at the individual and collective levels. A list of relevant indicators, with their advantages, disadvantages and potential uses, is provided. Indicators selected should be used consistently across all areas of research performance monitoring.
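To illustrate the trade-off described above, a typical "complex" indicator divides each output's citation count by the average for a reference set of outputs from the same field and year, so that disciplines with very different citation cultures can be compared. The sketch below is purely illustrative (the field names and citation counts are invented, not Loughborough data), and simplifies real field-weighted measures, which also account for publication type and use curated field classifications:

```python
# Sketch of a field-normalized citation indicator: each output's
# citation count is divided by the mean citation count of outputs
# from the same (field, year) reference set. Values above 1.0 mean
# above-average citation performance for that field and year.
# All data below are invented for illustration.

from collections import defaultdict

def field_normalized_scores(outputs):
    """outputs: list of dicts with 'field', 'year' and 'citations' keys."""
    # Accumulate citation sums and output counts per (field, year) set.
    totals = defaultdict(lambda: [0, 0])
    for o in outputs:
        key = (o["field"], o["year"])
        totals[key][0] += o["citations"]
        totals[key][1] += 1
    means = {k: s / n for k, (s, n) in totals.items()}
    # Normalize each output against its own reference set's mean.
    return [o["citations"] / means[(o["field"], o["year"])] for o in outputs]

outputs = [
    {"field": "Engineering", "year": 2015, "citations": 10},
    {"field": "Engineering", "year": 2015, "citations": 30},
    {"field": "History",     "year": 2015, "citations": 2},
    {"field": "History",     "year": 2015, "citations": 6},
]
print(field_normalized_scores(outputs))  # → [0.5, 1.5, 0.5, 1.5]
```

Note that the raw counts (2 citations in History versus 10 in Engineering) would be simple and easy to replicate but would disadvantage the low-citation field, whereas the normalized scores treat both fields even-handedly at the cost of depending on a reference set a researcher cannot easily reconstruct.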

4) Allow those evaluated to verify data and analysis. 

The publication and citation tools used to collect and monitor research publication data at Loughborough University will continue to be made openly available.  Academics are therefore able to see the data relating to themselves, and to make corrections where necessary.  Staff managing publication systems will also endeavour to ensure that data are as accurate and robust as possible.

5) Account for variation by field in publication and citation practices. 

It is recognised that research practices vary widely across disciplines and that bibliometric indicators serve some disciplines better than others. For example, citation tools currently cover only journal and conference outputs, not monographs or other forms of output. International collaboration indicators will be less relevant to disciplines where academics tend to publish alone rather than in teams. In line with best practice, indicators will be normalized wherever appropriate, and based on percentiles rather than averages where a single outlier can skew the numbers. The availability or otherwise of bibliometric data will not drive our decision-making about research activities and priorities, either individually or collectively.
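The preference for percentile-based indicators over averages can be seen in a small worked example. The citation counts below are invented: one extreme outlier inflates the mean citation count of a unit's outputs, while the median (the 50th percentile) still reflects typical performance:

```python
# Why percentiles rather than averages: a single very highly cited
# output skews the mean but leaves the median untouched.
# Citation counts are invented for illustration only.

from statistics import mean, median

unit_a = [3, 4, 5, 5, 6]    # consistently cited outputs
unit_b = [0, 1, 1, 2, 500]  # mostly uncited, with one extreme outlier

print(mean(unit_a), median(unit_a))  # → 4.6 5
print(mean(unit_b), median(unit_b))  # → 100.8 1
```

Judged on the mean, unit B appears far stronger; judged on the median, unit A does. The percentile view is the more faithful description of how the typical output in each unit performs.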

6) Protect excellence in locally relevant research. 

It is recognised that most citation counting tools are inherently biased towards English-language publications.  It is important that academics producing work in languages other than English are not penalised for this.

7) Base assessment of individual researchers on a qualitative judgement of their portfolio.

Loughborough University acknowledges how indicators are affected by career stage, gender and discipline and will seek to take these factors into account when interpreting metrics.  It is also recognised that academics undertake a wide range of research communication activities, not all of which can be easily measured or benchmarked.  When assessing the performance of individuals, consideration will be given to as wide a view of their expertise, experience, activities and influence as possible.

8) Avoid misplaced concreteness and false precision. 

Where possible, Loughborough University commits to using multiple indicators to provide a more robust and wide-ranging picture. Indicators will avoid false precision: for example, although some metrics are published to three decimal places to avoid ties, the limitations of citation counts mean it makes no sense to distinguish between entities on the basis of such small differences.
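One simple way to operationalise this principle is to treat score differences below a chosen resolution as noise rather than as a basis for ranking. The comparison below is a sketch only; the resolution value is an illustrative assumption, not an institutional threshold:

```python
# Sketch of avoiding false precision: differences in a metric that
# fall below a chosen resolution are treated as indistinguishable.
# The 0.1 resolution is an illustrative assumption, not policy.

def effectively_equal(a, b, resolution=0.1):
    # Differences smaller than the resolution are regarded as noise.
    return abs(a - b) < resolution

print(effectively_equal(1.234, 1.237))  # → True: do not rank on this gap
print(effectively_equal(1.2, 1.8))      # → False: a meaningful difference
```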

9) Recognize the systemic effects of assessment and indicators. 

It is accepted that any measurements can, in themselves, affect the system they are used to assess through the inevitable incentives they establish.  To minimize such effects, a suite of indicators will be used, wherever practical.

10) Scrutinize indicators regularly and update them. 

As the research activity of the University and the external environment develop, the bibliometric indicators we use will be revisited and revised where appropriate. This will be the responsibility of the Pro Vice-Chancellor (Research).