TY - JOUR
T1 - Developing and Implementing a Multisource Feedback Tool to Assess Competencies of Emergency Medicine Residents in the United States
AU - LaMantia, Joseph
AU - Yarris, Lalena M.
AU - Sunga, Kharmene
AU - Weizberg, Moshe
AU - Hart, Danielle
AU - Farina, Gino
AU - Rodriguez, Elliot
AU - Lucas, Raymond
AU - Mahmooth, Zayan
AU - Snock, Alexandra
AU - Lockyer, Jocelyn
N1 - Publisher Copyright:
© 2017 by the Society for Academic Emergency Medicine
PY - 2017/7
Y1 - 2017/7
N2 - Objectives: Multisource feedback (MSF) has potential value in learner assessment, but has been neither broadly implemented nor studied in emergency medicine (EM). This study aimed to adapt existing MSF instruments for emergency department implementation, measure feasibility, and collect initial validity evidence to support score interpretation for learner assessment. Methods: Residents from eight U.S. EM residency programs completed a self-assessment and were assessed by eight physicians, eight nonphysician colleagues, and 25 patients using unique instruments. Instruments included a five-point rating scale to assess interpersonal and communication skills, professionalism, systems-based practice, practice-based learning and improvement, and patient care. MSF feasibility was measured by the percentage of residents who collected the target number of instruments. To develop internal structure validity evidence, Cronbach's alpha was calculated as a measure of internal consistency. Results: A total of 125 residents collected a mean of 7.0 physician assessments (n = 752), 6.7 nonphysician assessments (n = 775), and 17.8 patient assessments (n = 2,100), with response rates of 67.2%, 75.2%, and 77.5%, respectively. Cronbach's alpha values for physicians, nonphysicians, patients, and self were 0.97, 0.97, 0.96, and 0.96, respectively. Conclusions: This study demonstrated that MSF implementation is feasible, although challenging. The tool and its scale demonstrated excellent internal consistency. EM educators may find the adaptation process and tools applicable to their learners.
UR - http://www.scopus.com/inward/record.url?scp=85074002032&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85074002032&partnerID=8YFLogxK
U2 - 10.1002/aet2.10043
DO - 10.1002/aet2.10043
M3 - Article
AN - SCOPUS:85074002032
SN - 2472-5390
VL - 1
SP - 243
EP - 249
JO - AEM Education and Training
JF - AEM Education and Training
IS - 3
ER -