The 360-degree evaluation

Increased work with little return?

John A. Weigelt, Karen Brasel, Dawn Bragg, Deborah Simpson

Research output: Contribution to journal › Article

22 Citations (Scopus)

Abstract

Objective: To test a 360-degree resident evaluation tool on our trauma/critical care services to determine if multiple raters yielded equivalent information compared with traditional faculty evaluations. Design: Prospective evaluation. Participants: Residents, nurses, faculty, and staff at an academic medical center. Methods: The evaluation tool was developed based on extensive qualitative analysis of 13 major medical specialties' Residency Review Committee (RRC) criteria relative to the ACGME competencies and then revised with content specific to surgery. The evaluation contained 19 items divided into ACGME competency areas. Each item was scored on a 1 to 9 Likert scale: 1 = not meeting expectations and 9 = exceeding expectations. Residents on the trauma and surgical intensive care unit rotations evaluated themselves, and they were also evaluated by chief residents, surgical intensive care unit fellows, faculty, surgical intensive care unit nurses, trauma nurse clinicians, and nurse practitioners. Multiple analyses of variance were used to compare ratings by rater groups. Results: Ten residents were evaluated on the trauma service from April to August 2003. Between 74 and 106 evaluations were obtained per resident per competency area. Average scores across the competencies were remarkably similar, ranging from 6.18 for practice-based learning and systems-based practice to 6.54 for professionalism. Although there was variability within rater groups, ratings were not statistically different between groups for any ACGME competency. Conclusions: The 360-degree evaluation provided limited new information compared with traditional faculty ratings. Follow-up studies are required to confirm this finding with larger samples of residents and surgical specialties.
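
As an illustration of the statistical comparison described in the Methods, the short Python sketch below runs a one-way analysis of variance on Likert ratings grouped by rater type for a single competency area. The rater groups and scores are hypothetical placeholders, and SciPy's f_oneway stands in for the study's analysis; this is not the authors' code or data.

    # Illustrative sketch only: hypothetical ratings, not the study's data.
    # Question mirrored from the Methods: do mean 1-9 Likert ratings for one
    # ACGME competency area differ across rater groups?
    from scipy import stats

    ratings_by_group = {
        "faculty":              [6.5, 6.0, 7.0, 6.2, 6.8],
        "chief_residents":      [6.3, 6.1, 6.9, 6.4, 6.6],
        "sicu_nurses":          [6.2, 6.7, 6.1, 6.5, 6.3],
        "nurse_practitioners":  [6.4, 6.0, 6.6, 6.2, 6.7],
    }

    # One-way ANOVA across the rater groups.
    f_stat, p_value = stats.f_oneway(*ratings_by_group.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

    # A non-significant p-value for every competency area is the pattern the
    # study reports: the additional rater groups did not score residents
    # differently from faculty.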

Original language: English (US)
Pages (from-to): 616-626
Number of pages: 11
Journal: Current Surgery
Volume: 61
Issue number: 6
DOIs: https://doi.org/10.1016/j.cursur.2004.06.024
State: Published - Nov 2004
Externally published: Yes

Fingerprint

  • resident
  • Critical Care
  • trauma
  • evaluation
  • nurse
  • Intensive Care Units
  • rating
  • Wounds and Injuries
  • Nurses
  • Surgical Specialties
  • Nurse Clinicians
  • Nurse Practitioners
  • Group
  • Advisory Committees
  • Internship and Residency
  • surgery
  • Analysis of Variance
  • Medicine
  • Learning
  • staff

Keywords

  • 360-degree evaluation
  • Competency
  • Feedback
  • Resident education

ASJC Scopus subject areas

  • Surgery

Cite this

The 360-degree evaluation: Increased work with little return? / Weigelt, John A.; Brasel, Karen; Bragg, Dawn; Simpson, Deborah.

In: Current Surgery, Vol. 61, No. 6, 11.2004, p. 616-626.

Research output: Contribution to journal › Article

Weigelt, John A.; Brasel, Karen; Bragg, Dawn; Simpson, Deborah. / The 360-degree evaluation: Increased work with little return? In: Current Surgery. 2004; Vol. 61, No. 6. pp. 616-626.
@article{30ac22e4188a4880bb1435498fe0896a,
title = "The 360-degree evaluation: Increased work with little return?",
abstract = "Objective: To test a 360-degree resident evaluation tool on our trauma/critical care services to determine if multiple raters yielded equivalent information compared with traditional faculty evaluations. Design: Prospective evaluation. Participants: Residents, nurses, faculty, and staff at an academic medical center. Methods: The evaluation tool was developed based on extensive qualitative analysis of 13 major medical specialties' Residency Review Committee (RRC) criteria relative to the ACGME competencies and then revised with content specific to surgery. The evaluation contained 19 items divided into ACGME competency areas. Each item was scored on a 1 to 9 Likert scale: 1 = not meeting expectations and 9 = exceeding expectations. Residents on the trauma and surgical intensive care unit rotations evaluated themselves, and they were also evaluated by chief residents, surgical intensive care unit fellows, faculty, surgical intensive care unit nurses, trauma nurse clinicians, and nurse practitioners. Multiple analyses of variance were used to compare ratings by rater groups. Results: Ten residents were evaluated on the trauma service from April to August 2003. Between 74 and 106 evaluations were obtained per resident per competency area. Average scores across the competencies were remarkably similar, ranging from 6.18 for practice-based learning and systems-based practice to 6.54 for professionalism. Although there was variability within rater groups, ratings were not statistically different between groups for any ACGME competency. Conclusions: The 360-degree evaluation provide limited new information compared with traditional faculty ratings. Follow-up studies are required to confirm this finding with larger samples of residents and surgical specialties.",
keywords = "360-degree evaluation, Competency, Feedback, Resident education",
author = "Weigelt, {John A.} and Karen Brasel and Dawn Bragg and Deborah Simpson",
year = "2004",
month = "11",
doi = "10.1016/j.cursur.2004.06.024",
language = "English (US)",
volume = "61",
pages = "616--626",
journal = "Journal of Surgical Education",
issn = "1931-7204",
publisher = "Elsevier Inc.",
number = "6",

}

TY - JOUR

T1 - The 360-degree evaluation

T2 - Increased work with little return?

AU - Weigelt, John A.

AU - Brasel, Karen

AU - Bragg, Dawn

AU - Simpson, Deborah

PY - 2004/11

Y1 - 2004/11

N2 - Objective: To test a 360-degree resident evaluation tool on our trauma/critical care services to determine if multiple raters yielded equivalent information compared with traditional faculty evaluations. Design: Prospective evaluation. Participants: Residents, nurses, faculty, and staff at an academic medical center. Methods: The evaluation tool was developed based on extensive qualitative analysis of 13 major medical specialties' Residency Review Committee (RRC) criteria relative to the ACGME competencies and then revised with content specific to surgery. The evaluation contained 19 items divided into ACGME competency areas. Each item was scored on a 1 to 9 Likert scale: 1 = not meeting expectations and 9 = exceeding expectations. Residents on the trauma and surgical intensive care unit rotations evaluated themselves, and they were also evaluated by chief residents, surgical intensive care unit fellows, faculty, surgical intensive care unit nurses, trauma nurse clinicians, and nurse practitioners. Multiple analyses of variance were used to compare ratings by rater groups. Results: Ten residents were evaluated on the trauma service from April to August 2003. Between 74 and 106 evaluations were obtained per resident per competency area. Average scores across the competencies were remarkably similar, ranging from 6.18 for practice-based learning and systems-based practice to 6.54 for professionalism. Although there was variability within rater groups, ratings were not statistically different between groups for any ACGME competency. Conclusions: The 360-degree evaluation provided limited new information compared with traditional faculty ratings. Follow-up studies are required to confirm this finding with larger samples of residents and surgical specialties.

AB - Objective: To test a 360-degree resident evaluation tool on our trauma/critical care services to determine if multiple raters yielded equivalent information compared with traditional faculty evaluations. Design: Prospective evaluation. Participants: Residents, nurses, faculty, and staff at an academic medical center. Methods: The evaluation tool was developed based on extensive qualitative analysis of 13 major medical specialties' Residency Review Committee (RRC) criteria relative to the ACGME competencies and then revised with content specific to surgery. The evaluation contained 19 items divided into ACGME competency areas. Each item was scored on a 1 to 9 Likert scale: 1 = not meeting expectations and 9 = exceeding expectations. Residents on the trauma and surgical intensive care unit rotations evaluated themselves, and they were also evaluated by chief residents, surgical intensive care unit fellows, faculty, surgical intensive care unit nurses, trauma nurse clinicians, and nurse practitioners. Multiple analyses of variance were used to compare ratings by rater groups. Results: Ten residents were evaluated on the trauma service from April to August 2003. Between 74 and 106 evaluations were obtained per resident per competency area. Average scores across the competencies were remarkably similar, ranging from 6.18 for practice-based learning and systems-based practice to 6.54 for professionalism. Although there was variability within rater groups, ratings were not statistically different between groups for any ACGME competency. Conclusions: The 360-degree evaluation provided limited new information compared with traditional faculty ratings. Follow-up studies are required to confirm this finding with larger samples of residents and surgical specialties.

KW - 360-degree evaluation

KW - Competency

KW - Feedback

KW - Resident education

UR - http://www.scopus.com/inward/record.url?scp=10944250481&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=10944250481&partnerID=8YFLogxK

U2 - 10.1016/j.cursur.2004.06.024

DO - 10.1016/j.cursur.2004.06.024

M3 - Article

VL - 61

SP - 616

EP - 626

JO - Current Surgery

JF - Current Surgery

SN - 0149-7944

IS - 6

ER -