Background: Most existing residency evaluation tools were constructed to evaluate the Accreditation Council for Graduate Medical Education (ACGME) competencies.

Methods: Before the ACGME's six competency-based assessment requirements for resident performance were developed, we created a residency evaluation tool with 5 domains important to successful surgical resident performance. Reliability was determined after 6 months of use. Factor analysis assessed whether the evaluation tool was a construct-valid measure of the ACGME competencies.

Results: Three hundred forty-three evaluations for 36 surgical residents were tested. The original evaluation tool was highly reliable, with an overall reliability of 0.97. Factor analysis defined 4 new combinations of questions analogous to 4 of the ACGME competencies: professionalism (reliability 0.95), patient care (reliability 0.93), medical knowledge (reliability 0.92), and communication (reliability 0.92). The new competency clusters were moderately correlated with each other.

Conclusions: Our locally developed tool demonstrated high reliability and construct validity for 4 of the 6 ACGME competencies. The correlation between factors suggests overlap between competencies.
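The reliability coefficients reported above are most commonly Cronbach's alpha in evaluation-instrument studies, though the abstract does not name the statistic; the following is a minimal sketch, under that assumption, of how such a coefficient is computed from rating data (the data and function name are illustrative, not from the study):

```python
# Hypothetical illustration: Cronbach's alpha for a set of rating items.
# The abstract reports reliabilities (e.g., 0.97) without naming the
# coefficient; alpha is assumed here. Data below are toy values.

def cronbach_alpha(items):
    """items: one inner list per question, each holding that
    question's scores across all evaluations."""
    k = len(items)  # number of items (questions)

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Sum of the individual item variances
    item_vars = sum(var(it) for it in items)
    # Variance of the total score per evaluation
    n = len(items[0])
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# Toy data: 3 items rated across 5 evaluations (5-point scale)
scores = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [5, 5, 3, 4, 4],
]
print(round(cronbach_alpha(scores), 2))  # → 0.77
```

Values near 1.0, such as the 0.97 overall figure reported, indicate that the items consistently measure a common underlying construct.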
- Accreditation Council for Graduate Medical Education competencies
- Evaluation tools
- Residency evaluation
- Traditional rating form