Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors standardized direct observation assessment tool

Joseph LaMantia, Bryan Kane, Lalena Yarris, Anthony Tadros, Mary Frances Ward, Martin Lesser, Philip Shayne, Patrick Brunett, Chris Kyriakedes, Stephen Rinnert, Joseph Schmidt, David Wald, Meredith Akerman, Elayne Livote, David Soohoo, Jonathan Gong

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Objectives: Developed by the Council of Emergency Medicine Residency Directors (CORD), the standardized direct observation assessment tool (SDOT) is an evaluation instrument used to assess residents' clinical skills in the emergency department (ED). In a previous study examining the inter-rater agreement of the tool, faculty scored simulated resident-patient encounters. The objective of the present study was to evaluate the inter-rater agreement of the SDOT in real-time evaluations of residents in the ED.

Methods: This was a multicenter, prospective, observational study in which faculty raters were paired to simultaneously observe and independently evaluate a resident's clinical performance using the SDOT. Data collected from eight emergency medicine (EM) residency programs covered 99 unique resident-patient encounters and captured 26 individual behaviors tied to specific core competencies, global evaluation scores for each core competency, and an overall clinical competency score. Inter-rater agreement was assessed using percentage agreement analyses under three constructs: exact agreement, liberal agreement, and binary (pass/fail) agreement.

Results: Inter-rater agreement between faculty raters varied by the category of measure used. Exact agreement ranged from poor to good depending on the measure: good for the overall competency score, poor to good for the competency scores for each of the six core competencies, and fair to very good for the individual item scores. Liberal agreement and binary agreement were excellent for the overall competency score and the competency scores for each of the six core competencies, and very good to excellent for the individual item scores.

Conclusions: The SDOT demonstrated excellent inter-rater agreement when analyzed with liberal agreement and when dichotomized as a pass/fail measure, and fair to good agreement for most measures under exact agreement. The SDOT can be useful and reliable for evaluating residents' clinical skills in the ED, particularly as it relates to identifying marginal performance.
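
For readers less familiar with these agreement constructs, the following is a minimal sketch of how exact, liberal, and binary percent agreement can be computed for paired ratings. The abstract does not operationally define the constructs, so the one-point tolerance for liberal agreement, the pass threshold of 3, and the sample ratings below are illustrative assumptions, not the study's actual definitions or data.

```python
# Minimal sketch of the three percent-agreement constructs named in the
# abstract. Assumptions (not specified in the abstract): items are scored on
# an ordinal 1-5 scale, "liberal" agreement counts ratings within one point
# of each other, and "binary" agreement dichotomizes scores at a hypothetical
# pass threshold of 3. The paired ratings are illustrative, not study data.

def exact_agreement(a, b):
    """Fraction of encounters where both raters gave the identical score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def liberal_agreement(a, b, tolerance=1):
    """Fraction of encounters where the scores differ by at most `tolerance`."""
    return sum(abs(x - y) <= tolerance for x, y in zip(a, b)) / len(a)

def binary_agreement(a, b, pass_threshold=3):
    """Fraction of encounters where both raters reach the same pass/fail call."""
    return sum((x >= pass_threshold) == (y >= pass_threshold)
               for x, y in zip(a, b)) / len(a)

# Illustrative paired ratings from two faculty raters (1 = low, 5 = high).
rater1 = [4, 3, 5, 2, 4, 3]
rater2 = [4, 4, 5, 1, 3, 2]

print(f"exact:   {exact_agreement(rater1, rater2):.2f}")    # 0.33
print(f"liberal: {liberal_agreement(rater1, rater2):.2f}")  # 1.00
print(f"binary:  {binary_agreement(rater1, rater2):.2f}")   # 0.83
```

As in the study's findings, the looser constructs yield higher agreement on the same ratings: near-misses that fail exact agreement still count under liberal agreement, and only disagreements that cross the pass threshold reduce binary agreement.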

Original language: English (US)
Pages (from-to): S51-S57
Journal: Academic Emergency Medicine
Volume: 16
Issue number: SUPPL. 2
DOIs
State: Published - Dec 2009

Keywords

  • Evaluation
  • Inter-rater variation
  • Reliability
  • Training

ASJC Scopus subject areas

  • Emergency Medicine
