Evidencing the quality and visibility of your research

How to write Evidence-informed Output Narratives about your research for internal assessment processes at Loughborough University.

In line with our commitment to responsible research assessment, this guidance explains how researchers can evidence the quality and visibility of research outputs for internal assessment processes, such as academic promotion, to support fairer evaluation across disciplines.

A menu of evidence has been developed that researchers can now draw from when drafting narrative statements for assessment to create Evidence-informed Output Narratives (EONs). Schools have assigned weights to each type of evidence based on disciplinary standards. There is no expectation that all forms of evidence are called upon. Instead, researchers are invited to choose the most relevant evidence for each of their outputs. Guidance is also included on how to describe individual contributions for outputs with multiple contributors.

The weighted menu of evidence could, for example, be used when building your case for meeting the “Research Outputs” requirements for academic promotion, either when:

  • looking to demonstrate “a research profile that is clearly advancing in terms of quality and visibility of outputs” via “a range of evidence”, or
  • selecting outputs that “best showcase your research's quality” and discussing their visibility.

Examples of how this can be done are provided below, along with answers to common questions.

The menu is intended to be a living document to which other types of evidence may be added, removed or reweighted over time with agreement from the relevant disciplinary communities.

If you have feedback on this guidance, please share it with us.

Evidence of contribution

  • Narrative description: For outputs with multiple contributors, please describe your contribution to the output.
  • CRediT statement: It may be helpful to characterise your contribution using the Contributor Role Taxonomy (CRediT), although this taxonomy has its origins in the life sciences.

Evidence of quality/peer validation

  • Peer review scores: E.g., from an internal output review process.
  • Peer review comments: Please state the source and cite any relevant comments.
  • Book, art, or exhibition review comments: It may be pertinent to describe the credentials of the reviewer and the source and visibility of the review. See more guidance on finding book reviews.
  • Prizes and awards: It may be helpful to describe the frequency and/or value of the prize or award and the credentials of the awarding body.
  • Rigour of a book publisher’s peer review process: Please describe the peer review process from commissioning to completion, detailing the nature and number of the reviewers, editorial oversight, and other evidence of rigour. If the book is open access, you may wish to use PRISM (peer review information service for monographs) to identify the peer review process used.
  • Commissioned work: Describe the commissioning body and any other useful dimensions of the commission (rarity, value, prestige, etc.).

Evidence of reach/visibility

  • Evidence of influence or impact: Describe the communities influenced/impacted and the nature of the influence/impact.
  • Output made openly available: Please provide the location, e.g., URL or other Persistent Identifier (PID).
  • Underlying data made open and/or ‘FAIR’ (findable, accessible, interoperable, and reusable): Please describe and provide the location. For more information, please see the guidance on underlying data.
  • Exhibition/performance location(s): Please describe.
  • Exhibition/performance attendees: Please contextualise in terms of volume and duration.
  • Attention as measured by ‘alternative metrics’: You may wish to describe the sources of attention, e.g., policy citations, Wikipedia, or Twitter, and the geographic reach if known (e.g., location of Mendeley Readers). See more alternative metrics guidance.
  • Output-level citations: Please provide the source (e.g., Google Scholar, Scopus, Data Citation Index, Web of Science) and a field-weighted figure wherever possible (e.g., percentile in field). Learn more about finding output-level citation indicators.
  • Software sharing platform “forks”: For software made openly available on platforms such as GitHub, Bitbucket, etc. Learn more about software sharing platforms.
  • Journal-level citations: Please use a field-weighted indicator such as SNIP or SJR for the year of publication. See how to identify journal-level citation metrics.
  • Views and downloads: Please describe the source, e.g., Institutional Repository, Scopus, or publisher site.
  • Book sales: Please contextualise if possible.
  • Book publisher reach: E.g., global and/or disciplinary communities served.

Examples - how to use the evidence

The following passage illustrates how peer review comments and book reviews can be used to evidence that your research profile is advancing in terms of output quality.

Over the past six years, I have published three books, each showing advancement in my research. Internal peer reviews have highlighted the originality and depth of my work, with one stating, “The methodology is innovative and rigorous,” and another noting, “A strong redefinition of the field’s frameworks.” Book reviews have confirmed the quality of my work, with Book Review Quarterly calling my most recent book “a compelling new perspective” and “an essential resource”.

This example illustrates how making underlying data openly available can help evidence how your research profile is advancing in terms of the visibility of your outputs.

In my last four published articles, I have made all underlying data openly available in the University Repository. These datasets have collectively been viewed 450 times, downloaded 120 times, and cited 13 times directly.

The following example represents an Evidence-informed Output Narrative (EON) for a fictional journal article for use in an academic promotion application. It references the Contributor Role Taxonomy (CRediT) to outline the researcher’s contributions and references peer review scores and comments to evidence quality. For visibility, the narrative outlines open-access status, downloads, journal-level citations and output-level citations.

This study establishes a protocol for testing resilience in materials for off-world construction, focusing on lunar environments, which surpasses current international standards. Contributions: Conceptualisation, Methodology, Funding Acquisition, Supervision, Writing – Review & Editing. Quality: The work received an internal score of 9, indicating a strong likelihood of 3*, with minor potential for 4*. Reviewers praised its “methodological clarity” and “innovative applications”. Visibility: Published open access, with 61 citations (Google Scholar) and an FWCI of 7.98, it ranked in the top 10 ‘Most Downloaded’ articles in the Hypothetical Journal of Space Engineering (SJR of 1.278 in 2020).

This EON for a hypothetical media installation provides an example drawing on peer review scores and details about the commissioning body, exhibition locations and attendee numbers to evidence quality and visibility.

In 2021, I was commissioned by a prominent arts foundation to create Chasing Shadows, a public artwork for Lisbon’s Praça do Comércio as part of the Nicho initiative. Contribution: I conducted extensive research into Portuguese idioms around light, shadow and perception, which informed the installation’s concept. Quality: The commission was overseen by a committee of renowned artists, urban designers and cultural historians, and received a “solid 3*” peer review score in Loughborough’s internal output review process. Visibility: The artwork was displayed in a high-traffic location attracting over 350,000 daily visitors, and later acquired by the university for display on campus.

Common questions answered

What types of outputs can this guidance support?

This guidance is intended to apply to a wide range of research outputs, such as journal articles, book chapters, monographs, conference proceedings, software, datasets, digital products, artefacts, exhibitions and performances. It is particularly suited to practice-based, practice-led or ‘non-traditional’ outputs where commonly used publication metrics may not apply.

Can I use scores from an internal output review process?

Yes, you can choose to include scores from an internal review process, such as the Annual Output Review, to evidence the quality of an output if you wish. These scores are not automatically included, however, and you can opt not to include them or select other outputs in your statements if you prefer.

Can I use the name or impact factor of the journal my work is published in to evidence the quality of my paper?

No, the journal name or impact factor does not directly indicate the quality of an individual paper. Research outputs should instead be evaluated on their own merits. At Loughborough University, we prioritise responsible research assessment practices and value peer review or other forms of peer validation as more reliable indicators of research quality. Citations and journal-level metrics can, however, be used to evidence an output’s visibility or reach.