Evaluation of a new e-learning resource for calibrating OSCE examiners on the use of rating scales

Rosa Moreno Lopez*, Serena Sinclair

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Introduction
Rating scales have been described as better at assessing behaviours such as professionalism during Objective Structured Clinical Examinations (OSCEs). However, there is an increased need to train and calibrate staff in their use prior to student assessment.

Material and methods
An online e‐learning package was developed and made available to all examiners at the Institute of Dentistry at the University of Aberdeen. The package included videos of three OSCE stations (medical emergency, rubber dam placement and handling a complaint), each recorded in two different scenarios (an excellent and an unsatisfactory candidate). These videos were recorded to meet a pre‐defined marking score. The examiners were required to mark the six videos using pre‐set marking criteria (checklist and rating scales). The rating scales covered professionalism, general clinical ability and/or communication skills. For each video, examiners were given four possible options (unsatisfactory, borderline, satisfactory or excellent), and they were provided with a description of each domain. They were also required to complete a questionnaire to gather their views on the use of this e‐learning environment.

Results
Fifteen examiners completed the task. The total scores given were very similar to the expected scores for the medical emergency and complaint stations; however, this was not the case for the rubber dam station (P‐values .017 and .036). This could be attributed to some aspects of the placement of the rubber dam being unclear, as commented on in the examiners' questionnaires. There was consistency in the selection of marks on the rating scales (inter‐examiner correlation ranged between 0.916 and 0.979).
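As a rough illustration of the kind of agreement check behind an inter‐examiner correlation, the sketch below computes a pairwise Pearson correlation between two examiners' marks. This is a minimal sketch, not the authors' analysis, and the scores are hypothetical, not data from the study.

```python
# Illustrative sketch only: pairwise Pearson correlation between two
# examiners' marks for the six videos. All scores are hypothetical.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical total marks from two examiners (one value per video)
examiner_a = [18, 9, 16, 8, 17, 7]
examiner_b = [17, 10, 16, 7, 18, 8]

print(round(pearson(examiner_a, examiner_b), 3))
```

A value close to 1 indicates that the two examiners rank and score the candidates consistently; published studies typically use an intraclass correlation across all raters rather than a single pairwise coefficient.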

Conclusion
Further studies are required in the field of e‐learning training to calibrate examiners for practical assessment; however, this study provides preliminary evidence to support the use of videos as part of an online training package to calibrate OSCE examiners on the use of rating scales.
Original language: English
Pages (from-to): 276-281
Number of pages: 6
Journal: European Journal of Dental Education
Issue number: 2
Early online date: 27 Jan 2020
Publication status: Published - 1 May 2020

Bibliographical note

We would like to thank the medical team at the University of Aberdeen for helping with the recording of the videos and the development of the website for examiner training.


Keywords

  • examiner training
  • objective structured clinical examination
  • rating scales


