3 December 2018

Ofqual has conducted research into improving the marking of GCSEs, AS and A levels and has published five new reports, each looking at a different aspect of the marking process.

Ofqual’s summary of the reports can be seen below:

1. Online standardisation

Standardisation of markers can be conducted in different ways. We have looked at the processes involved in online standardisation in particular, and have identified some good practices that could be adopted more consistently to improve the experience and performance of examiners. These include examiners receiving personal feedback by phone after being approved to begin marking, and receiving confirmation that they are awarding marks on the same basis as intended (as well as awarding the right mark). It is also important for examiners to take personal responsibility for reviewing any feedback they receive.

2. Hard to mark responses

Previous research has identified that sources of disagreement between examiners can be categorised as: procedural error (mistakes or not following procedure), attentional error (concentration lapses), inferential uncertainty (uncertainty in drawing inferences from the students’ responses) and definitional uncertainty (uncertainty in the definition of what is to be assessed). The first 2 categories can be described as errors, while the last 2 are present in responses for which there can be more than one legitimate mark. Our latest research finds that the frequency of each category tends to vary by subject. For example, in biology, inferential uncertainty is more common, while in English language definitional uncertainty is more likely. We expect exam boards to reflect on these findings to see where they can improve their mark schemes.

3. Marking versus comparative judgement

Our examination system values the use of extended response questions in assessing important higher-level skills. But these responses are harder to mark than responses to shorter or more constrained question types, which can affect the validity of the rank order of candidate work. We are therefore considering rank ordering students’ work by means other than marking. This study looked at 2 alternatives – paired comparative judgement, and placing whole extended responses directly into a rank order – and compared these with ‘traditional’ marking using a mark scheme. The research finds that the 3 methods produce rank orders that are very similar, indicating that more research in this area could be worthwhile.
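To give a sense of how paired comparative judgement can produce a rank order without a mark scheme, the sketch below fits a simple Bradley-Terry model to a set of paired judgements. The model, the example data and all names in it are illustrative assumptions made for this summary; they are not the method or data used in the Ofqual study.

    # Minimal sketch: turning paired comparative judgements into a rank order
    # with a Bradley-Terry model (an illustrative assumption, not Ofqual's method).
    from collections import defaultdict

    def bradley_terry(judgements, n_iter=200):
        """judgements: list of (winner, loser) script pairs from judges.
        Returns estimated quality scores; higher means judged better."""
        scripts = {s for pair in judgements for s in pair}
        strength = {s: 1.0 for s in scripts}
        wins = defaultdict(int)          # times each script was preferred
        comparisons = defaultdict(int)   # times each ordered pair was compared
        for winner, loser in judgements:
            wins[winner] += 1
            comparisons[(winner, loser)] += 1
            comparisons[(loser, winner)] += 1
        for _ in range(n_iter):          # standard MM update for Bradley-Terry
            new = {}
            for i in scripts:
                denom = sum(n / (strength[i] + strength[j])
                            for (a, j), n in comparisons.items() if a == i)
                new[i] = wins[i] / denom if denom else strength[i]
            total = sum(new.values())
            strength = {s: v / total for s, v in new.items()}
        return strength

    # Hypothetical judgements: each pair records which script a judge preferred.
    judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]
    scores = bradley_terry(judgements)
    rank_order = sorted(scores, key=scores.get, reverse=True)
    print(rank_order)  # scripts from strongest to weakest; 'A' comes out on top here

In the study, a rank order produced this way could then be compared with the rank order produced by traditional marking to see how closely the methods agree.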

4. Marking consistency metrics – an update

Earlier work has focused on component level marking consistency and found that results in England are comparable to those seen internationally. This paper reports new qualification level marking consistency metrics, which are shown to be generally higher than the component level metrics from which they are derived. We also note that marking consistency remained stable in England between 2013 and 2017. However, this does not mean that improvements cannot be made. In response, the paper considers how minimum acceptable levels of marking consistency might be defined, which would help exam boards to channel additional resource and support. We note that these thresholds would need to take into account the subject and/or form of assessment and, importantly, would need to be understood and accepted by the public.
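As a concrete illustration of what a component level consistency metric can look like, the sketch below computes the proportion of seeded responses for which an examiner’s mark falls within an agreed tolerance of the senior examiner’s definitive mark. The metric, the tolerance and the figures are assumptions for illustration only; they are not the metrics or data reported in the paper.

    # Illustrative sketch of a simple component level consistency metric:
    # the share of seeded responses marked within tolerance of the definitive mark.
    # The metric, tolerance and figures below are assumptions, not Ofqual's data.

    def within_tolerance_rate(examiner_marks, definitive_marks, tolerance=1):
        """Proportion of responses where the examiner's mark is within
        `tolerance` marks of the senior examiner's definitive mark."""
        assert len(examiner_marks) == len(definitive_marks)
        agreed = sum(abs(e - d) <= tolerance
                     for e, d in zip(examiner_marks, definitive_marks))
        return agreed / len(examiner_marks)

    # Hypothetical seeded responses for one examiner on one component.
    examiner = [12, 15, 9, 20, 14]
    definitive = [12, 14, 11, 20, 15]
    print(f"Consistency: {within_tolerance_rate(examiner, definitive):.2f}")  # 0.80

A threshold of the kind the paper discusses would amount to a minimum acceptable value for a statistic of this sort, set with the subject and form of assessment in mind.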

5. Marking consistency studies

We measure the marking consistency of the 4 exam boards offering GCSEs, AS and A levels in England each year. We have previously said that publishing these metrics might compromise live marking monitoring. This new research provides an insight into marking consistency without that drawback. We found varying levels of marking consistency across subjects and between individual subject units. The results confirm our belief that marking is generally good across the system, although there is room for improvement in some specific areas. We want exam boards to reflect on these results and make appropriate changes to question design and mark schemes for future series.

Further information can be found on the gov.uk website:

Opportunities for improving quality of marking

CIEA’s response to the reports is below:

“Ofqual’s review of marking identifies five key areas of new research to help us develop our understanding of marking with the aim to drive quality higher in exam practice. The CIEA welcomes the work Ofqual is carrying out which reflects our mission to raise standards in educational assessment. The CIEA offers both teachers and examiners support in improving the robustness of assessment through our training, independent voice and wider membership across the assessment sector. We welcome members’ views and will continue to work with Ofqual and others to improve assessment practice.”