Optimizing physician skill development for medical students: The four-part assessment

Justin J.J. Watson, Phillip M. Kemp Bohan, Katrina Ramsey, John D. Yonge, Christopher R. Connelly, Richard J. Mullins, Jennifer M. Watters, Martin A. Schreiber, Laszlo Kiraly

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Medical student performance has been poorly correlated with residency performance and warrants further investigation. We propose a novel surgical assessment tool to determine correlations with clinical aptitude.

Methods: We conducted a retrospective review of medical student assessments from 2013 to 2015. Faculty rated student performance in four domains: 1) case presentation, 2) problem definition, 3) question response, and 4) use of literature; ratings were correlated with the final examination assessment. Interrater reliability of the Likert-scale ratings was evaluated.

Results: Sixty student presentations were scored (4.8 assessors per presentation). A student's case presentation, problem definition, and question response were correlated with performance (r = 0.49 to 0.61, p ≤ 0.003). Moderate correlations for question response and use of literature were demonstrated (r = 0.3 and 0.26, p < 0.05).

Conclusion: Our four-part assessment tool identified correlations with course and examination grades for medical students. As surgical education evolves, validated and reliable performance-testing measures are required.

Original language: English (US)
Pages (from-to): 906-909
Number of pages: 4
Journal: American Journal of Surgery
Volume: 213
Issue number: 5
DOIs
State: Published - May 2017

ASJC Scopus subject areas

  • Surgery
