Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors standardized direct observation assessment tool

Joseph LaMantia, Bryan Kane, Lalena Yarris, Anthony Tadros, Mary Frances Ward, Martin Lesser, Philip Shayne, Patrick Brunett, Chris Kyriakedes, Stephen Rinnert, Joseph Schmidt, David Wald, Meredith Akerman, Elayne Livote, David Soohoo, Jonathan Gong

Research output: Contribution to journal › Article

  • 8 Citations

Abstract

Objectives: Developed by the Council of Emergency Medicine Residency Directors (CORD), the standardized direct observation assessment tool (SDOT) is an evaluation instrument used to assess residents' clinical skills in the emergency department (ED). In a previous study examining the inter-rater agreement of the tool, faculty scored simulated resident-patient encounters. The objective of the present study was to evaluate the inter-rater agreement of the SDOT in real-time evaluations of residents in the ED. Methods: This was a multi-center, prospective, observational study in which faculty raters were paired to simultaneously observe and independently evaluate a resident's clinical performance using the SDOT. Data collected from eight emergency medicine (EM) residency programs produced 99 unique resident-patient encounters and reported on 26 individual behaviors related to specific core competencies, global evaluation scores for each core competency, and an overall clinical competency score. Inter-rater agreement was assessed using percentage agreement analyses with three constructs: exact agreement, liberal agreement, and binary (pass/fail) agreement. Results: Inter-rater agreement between faculty raters varied according to category of measure used. Exact agreement ranged from poor to good, depending on the measure: the overall competency score (good), the competency score for each of the six core competencies (poor to good), and the individual item scores (fair to very good). Liberal agreement and binary agreement were excellent for the overall competency score and the competency score for each of the six core competencies and very good to excellent for the individual item scores. Conclusions: The SDOT demonstrated excellent inter-rater agreement when analyzed with liberal agreement and when dichotomized as a pass/fail measure and fair to good agreement for most measures with exact agreement. The SDOT can be useful and reliable when evaluating residents' clinical skills in the ED, particularly as it relates to marginal performance.
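The three agreement constructs described in the Methods reduce to simple percentage calculations over paired ratings. The sketch below illustrates the idea; the 1-5 ordinal scale, the "within one point" definition of liberal agreement, and the pass cutoff of 3 are illustrative assumptions, not the SDOT's actual anchors or thresholds, which this record does not specify.

```python
# Minimal sketch of percentage agreement under three constructs.
# ASSUMPTIONS: a 1-5 ordinal rating scale, liberal agreement defined as
# scores within one point of each other, and a pass cutoff of >= 3.

def percent_agreement(pairs, match):
    """Share of rater pairs whose two scores satisfy the given match rule."""
    return sum(match(a, b) for a, b in pairs) / len(pairs)

# Hypothetical paired scores: (rater 1, rater 2) for each observed encounter.
pairs = [(4, 4), (3, 4), (2, 4), (5, 5), (1, 2)]

PASS = 3  # assumed pass threshold for the binary construct

exact = percent_agreement(pairs, lambda a, b: a == b)             # identical scores
liberal = percent_agreement(pairs, lambda a, b: abs(a - b) <= 1)  # within one point
binary = percent_agreement(pairs, lambda a, b: (a >= PASS) == (b >= PASS))

print(f"exact {exact:.0%}, liberal {liberal:.0%}, pass/fail {binary:.0%}")
# -> exact 40%, liberal 80%, pass/fail 80%
```

As the example shows, liberal and binary agreement are by construction at least as high as exact agreement, which is consistent with the pattern of results reported above.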

Language: English (US)
Journal: Academic Emergency Medicine
Volume: 16
Issue number: Suppl. 2
DOI: https://doi.org/10.1111/j.1553-2712.2009.00593.x
ISSN: 1069-6563
Publisher: Wiley-Blackwell
State: Published - Dec 2009

Fingerprint

Emergency Medicine
Internship and Residency
Observation
Clinical Competence
Hospital Emergency Service
Observational Studies
Prospective Studies

Keywords

  • Evaluation
  • Inter-rater variation
  • Reliability
  • Training

ASJC Scopus subject areas

  • Emergency Medicine

Cite this

LaMantia, J., Kane, B., Yarris, L., Tadros, A., Ward, M. F., Lesser, M., Shayne, P., Brunett, P., Kyriakedes, C., Rinnert, S., Schmidt, J., Wald, D., Akerman, M., Livote, E., Soohoo, D., & Gong, J. (2009). Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors standardized direct observation assessment tool. Academic Emergency Medicine, 16(Suppl. 2). https://doi.org/10.1111/j.1553-2712.2009.00593.x
