Abstract
Purpose
The OSCE is regarded as the gold standard of competence assessment in many healthcare programs; however, there are numerous internal and external sources of variation contributing to checklist marks. There is concern amongst organisers that candidates may be unfairly disadvantaged if they follow an ‘excellent’ preceding candidate. In this study, we assessed whether average checklist scores differed depending on whom a candidate followed, while accounting for different sources of variation.
Methods
We examined assessment data from final-year MBChB OSCEs at the University of Aberdeen and categorised candidates into three levels depending on the examiner-awarded global score of the preceding candidate at each station. We modelled the data using a linear mixed model incorporating fixed and random effects.
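The categorisation step described above can be sketched as follows. This is a minimal illustration, not the study's actual code: it assumes each station's candidates are processed in the order they were seen, that global scores use labels such as ‘excellent’ and ‘unsatisfactory’, and that the first candidate at a station (who follows nobody) falls into the ‘other’ group.

```python
# Categorise each candidate by the global score of the candidate who
# immediately preceded them at the same station. The score labels and
# the handling of the first candidate are illustrative assumptions.

def preceding_category(global_scores):
    """Given the examiner-awarded global scores for one station, in the
    order candidates were seen, return each candidate's category based
    on whom they followed: 'excellent', 'unsatisfactory', or 'other'.
    The first candidate follows nobody and is labelled 'other'."""
    categories = []
    previous = None
    for score in global_scores:
        if previous == "excellent":
            categories.append("excellent")
        elif previous == "unsatisfactory":
            categories.append("unsatisfactory")
        else:
            categories.append("other")
        previous = score
    return categories

station_scores = ["good", "excellent", "pass", "unsatisfactory", "good"]
print(preceding_category(station_scores))
# → ['other', 'other', 'excellent', 'other', 'unsatisfactory']
```

In the actual analysis, these per-station categories would then enter the linear mixed model as a fixed effect, with candidate, examiner and station as random effects.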
Findings
A total of 349 candidates sat the OSCEs. The predicted mean (95% CI) score was 21.6 (20.6, 22.6) for students following an ‘excellent’ candidate, 21.5 (20.5, 22.4) for those following ‘other’ candidates, and 22.2 (21.1, 23.3) for those following an ‘unsatisfactory’ student. After accounting for individual-, examiner- and station-level variability, students following an ‘excellent’ candidate did not have different mean scores from those who followed ‘other’ (p=0.829) or ‘unsatisfactory’ candidates (p=0.162); however, students who followed an ‘unsatisfactory’ student scored slightly higher on average than those who followed ‘other’ candidates (p=0.038).
Originality
There was weak evidence that variation in candidates’ checklist scores could be attributed to whom they followed, particularly for those following ‘unsatisfactory’ students; the difference in predicted mean scores may be of little practical relevance. Further multi-centre studies may be warranted to assure candidates and educators of the perceived fairness of the OSCE.
| Original language | English |
|---|---|
| Pages (from-to) | 891-903 |
| Number of pages | 13 |
| Journal | Journal of Applied Research in Higher Education |
| Volume | 16 |
| Issue number | 3 |
| Early online date | 22 Aug 2023 |
| DOIs | |
| Publication status | Published - 10 May 2024 |
| Event | HETL 2023, Aberdeen, United Kingdom. Duration: 12 Jun 2023 → 14 Jun 2023 |
Bibliographical note
Acknowledgements
The authors would like to acknowledge Wai-Lum Sung from the Department of Medical Illustration at the University of Aberdeen for the creation of Figure 1.
Data Availability Statement
The data is available upon reasonable request by contacting the corresponding author.
Funding
Not applicable
Keywords
- Assessment
- OSCE
- Medical
- Clinical
Title
Does following an “excellent” candidate in the Objective Structured Clinical Examination affect your checklist score?