Getting it right
Transforming educational assessment with comparative judgement
The marking of student work is plagued by human error, which has caused three systemic problems that weaken public confidence in qualifications.
To reduce marking error, assessments often comprise short, piecemeal questions – fragmenting curricular knowledge and encouraging rote learning. Marking error also means that grades are unreliable, unfairly hindering the life chances of some students. Moreover, marking error means standards over time and across awarding bodies cannot be accurately evaluated or maintained.
Our research into comparative judgement (CJ) led to the development of an assessment technique that eliminates marking error and avoids these problems. The method is widely used across the globe to support marking and enhance learning.
Qualifications regulator Ofqual routinely applies our research
- Fairer examinations have benefitted 5.5 million candidates over five years.
No More Marking Ltd routinely applies our research
- Transformed practice in schools has improved the assessment of 579,400 students.
- The assessment tool has enhanced the professional development of more than 10,000 teachers.
Our research into marking error began almost a decade ago with a study investigating why GCSE mathematics was not fit for purpose.
We found that having to rapidly mark thousands of exam scripts resulted in question papers almost entirely comprising short-form questions that minimise marking error, but reduce validity.
Our comparative judgement (CJ) technique for assessing mathematical knowledge eliminates marking itself. Instead, subject experts decide which of two presented scripts is better in terms of a high-level criterion such as conceptual understanding.
Many such binary decisions are collected from several assessors and then fitted to the Bradley-Terry model to produce a score for each student.
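To illustrate the final step, the following is a minimal sketch of fitting the Bradley-Terry model to a set of binary comparison outcomes using the classical Zermelo/Ford iterative procedure. The script identifiers, comparison data, and iteration count are illustrative assumptions, not the actual CJ implementation.

```python
from collections import defaultdict

def bradley_terry(comparisons, n_iter=100):
    """Fit Bradley-Terry strengths from pairwise judgements.

    comparisons: list of (winner, loser) script IDs.
    Returns a dict mapping script ID -> strength score (scores sum to 1).
    """
    wins = defaultdict(int)          # W_i: total wins for script i
    pair_counts = defaultdict(int)   # n_ij: comparisons between i and j
    items = set()
    for winner, loser in comparisons:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1
        items.update((winner, loser))

    p = {i: 1.0 for i in items}      # uniform initial strengths
    for _ in range(n_iter):
        new_p = {}
        for i in items:
            # MM update: p_i = W_i / sum_j n_ij / (p_i + p_j)
            denom = sum(
                pair_counts[frozenset((i, j))] / (p[i] + p[j])
                for j in items
                if j != i and frozenset((i, j)) in pair_counts
            )
            new_p[i] = wins[i] / denom if denom > 0 else p[i]
        total = sum(new_p.values())
        p = {i: v / total for i, v in new_p.items()}  # normalise
    return p

# Hypothetical judgements: script A is preferred most often, then B, then C
scores = bradley_terry(
    [("A", "B"), ("A", "B"), ("A", "C"), ("B", "C"), ("B", "A"), ("C", "B")]
)
```

In practice, each student's score would be derived from many judgements made by several assessors, and production systems use more sophisticated estimation, but the principle is the same: binary "which is better" decisions are converted into a continuous scale.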
Further studies demonstrated the reliability of the CJ system as an assessment tool when applied to open-ended mathematics assessments, which better reveal a student's understanding. We also demonstrated the reliability of CJ assessment outcomes across age groups, topics, institutions and jurisdictions.
We then developed a CJ-based technique for investigating standards over time and across equivalent qualifications. This revealed that standards in A-level Mathematics have declined since the 1960s. This research received much national media coverage – and the British Educational Research Journal’s Editor’s Choice Award 2017.
To date, 2,227 schools in 27 countries have subscribed to No More Marking’s services
- National HE STEM Programme
- No More Marking Ltd
- Nuffield Foundation
- Royal Society