Assessing clinical reasoning in the OSCE: pilot-testing a novel oral debrief exercise

Alexis Régent, Harish Thampy, Minal Singh

Research output: Contribution to journal › Article › peer-review


Introduction: Clinical reasoning (CR) is a complex skill enabling the transition from clinical novice to expert decision maker. The Objective Structured Clinical Examination (OSCE) is widely used to evaluate clinical competency, though there is limited literature exploring how this assessment is best used to assess CR skills. This proof-of-concept study explored the creation and pilot testing of a post-station CR assessment, named Oral Debrief (OD), in the context of undergraduate medical education.

Methods: A modified-Delphi technique was used to create a standardised, domain-based OD marking rubric encapsulating the key skills of CR, drawing on the existing literature and our existing placement-based CR tool. Sixteen OSCE examiners were recruited to score three simulated OD recordings scripted to portray differing levels of competency. Adopting a think-aloud approach, examiners vocalised their thought processes while using the rubric to assess each video. Thereafter, semi-structured interviews explored examiners' views on the OD approach. Recordings were transcribed, anonymised and analysed deductively and inductively for recurring themes. Additionally, inter-rater agreement of examiners' scoring was determined using the Fleiss kappa statistic, both within the group and in comparison to a reference examiner group.

Results: The rubric achieved fair to good inter-rater reliability across its constituent domains and overall global judgement scales. Think-aloud scoring revealed that participating examiners considered several factors when scoring students' CR abilities. These included the adoption of a confident, structured approach, discrimination between relevant and less-relevant information, and the ability to prioritise and justify decision making. Furthermore, students' CR skills were judged in light of potential risks to patient safety and against examiners' own illness scripts. Feedback from examiners indicated that while additional training in rubric usage would be beneficial, OD offered a positive approach to examining CR ability.

Conclusion: This pilot study has demonstrated promising results for the use of a novel post-station OD task to evaluate medical students' CR ability in the OSCE setting. Further work is now planned to evaluate how the OD approach can most effectively be implemented into routine assessment practice.
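The Fleiss kappa statistic mentioned in the Methods measures agreement among a fixed number of raters assigning items to categories, corrected for chance. The abstract does not describe the authors' computation, so the following is only an illustrative sketch of the standard formula (the function name and example data are hypothetical, not taken from the study):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for multi-rater agreement.

    ratings: one row per rated item; ratings[i][j] is the number of
    raters who placed item i in category j. Every row must sum to the
    same number of raters.
    """
    n_items = len(ratings)
    n_raters = sum(ratings[0])
    # Observed agreement: mean proportion of agreeing rater pairs per item.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ) / n_items
    # Chance agreement from the marginal category proportions.
    totals = [sum(col) for col in zip(*ratings)]
    p_e = sum((t / (n_items * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Two raters, three items, two categories: full agreement on every item.
print(fleiss_kappa([[2, 0], [2, 0], [0, 2]]))  # → 1.0
```

Values near 1 indicate agreement well above chance; 0 indicates chance-level agreement, which is the scale behind the "fair to good" reliability reported in the Results.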

Original language: English
Article number: 718
Journal: BMC Medical Education
Issue number: 1
Publication status: Published - 3 Oct 2023


  • Clinical reasoning assessment
  • Objective structured clinical examination
  • Oral debrief


