Understanding Inter-rater Disagreement: A Mixed Methods Approach

Emily M. Campbell, Dean F. Sittig, Wendy W. Chapman, Brian L. Hazlehurst, Aaron Cohen

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In an experiment to investigate cognitive skill differences between clinicians and lay persons, eight individuals in each group were asked to determine if an explicit concept existed in an ambulatory encounter note (a simple task) or if the concept could be inferred from the same note (a complex task). Subjects answered questions, highlighted text used to answer each question, and commented on their reasoning for selecting specific text. Quantitative results were mixed for expert vs. non-expert task performance on simple vs. complex tasks. Qualitative analysis revealed that data ambiguity obscured quantifiable skill differences between groups. In addition, this analysis offered new insight into whether a concept identification task is simple or complex. We present this case study to demonstrate the value of mixed method approaches to task-based performance study design and evaluation. We discuss the results in terms of their implications for evaluating meaningful use of technologies.

Original language: English (US)
Pages (from-to): 81-85
Number of pages: 5
Journal: AMIA ... Annual Symposium proceedings / AMIA Symposium. AMIA Symposium
Volume: 2010
State: Published - 2010

Fingerprint

Task Performance and Analysis
Technology

ASJC Scopus subject areas

  • Medicine (all)

Cite this

Understanding Inter-rater Disagreement: A Mixed Methods Approach. / Campbell, Emily M.; Sittig, Dean F.; Chapman, Wendy W.; Hazlehurst, Brian L.; Cohen, Aaron.

In: AMIA ... Annual Symposium proceedings / AMIA Symposium. AMIA Symposium, Vol. 2010, 2010, p. 81-85.


@article{709575c4b465473bae7f4593023c0c01,
title = "Understanding Inter-rater Disagreement: A Mixed Methods Approach",
abstract = "In an experiment to investigate cognitive skill differences between clinicians and lay persons, eight individuals in each group were asked to determine if an explicit concept existed in an ambulatory encounter note (a simple task) or if the concept could be inferred from the same note (a complex task). Subjects answered questions, highlighted text used to answer each question, and commented on their reasoning for selecting specific text. Quantitative results were mixed for expert vs. non-expert task performance on simple vs. complex tasks. Qualitative analysis revealed that data ambiguity obscured quantifiable skill differences between groups. In addition, this analysis offered new insight into whether a concept identification task is simple or complex. We present this case study to demonstrate the value of mixed method approaches to task-based performance study design and evaluation. We discuss the results in terms of their implications for evaluating meaningful use of technologies.",
author = "Campbell, {Emily M.} and Sittig, {Dean F.} and Chapman, {Wendy W.} and Hazlehurst, {Brian L.} and Aaron Cohen",
year = "2010",
language = "English (US)",
volume = "2010",
pages = "81--85",
journal = "AMIA ... Annual Symposium proceedings / AMIA Symposium. AMIA Symposium",
issn = "1559-4076",
publisher = "American Medical Informatics Association",
}

TY  - JOUR
T1  - Understanding Inter-rater Disagreement
T2  - A Mixed Methods Approach
AU  - Campbell, Emily M.
AU  - Sittig, Dean F.
AU  - Chapman, Wendy W.
AU  - Hazlehurst, Brian L.
AU  - Cohen, Aaron
PY  - 2010
Y1  - 2010
N2  - In an experiment to investigate cognitive skill differences between clinicians and lay persons, eight individuals in each group were asked to determine if an explicit concept existed in an ambulatory encounter note (a simple task) or if the concept could be inferred from the same note (a complex task). Subjects answered questions, highlighted text used to answer each question, and commented on their reasoning for selecting specific text. Quantitative results were mixed for expert vs. non-expert task performance on simple vs. complex tasks. Qualitative analysis revealed that data ambiguity obscured quantifiable skill differences between groups. In addition, this analysis offered new insight into whether a concept identification task is simple or complex. We present this case study to demonstrate the value of mixed method approaches to task-based performance study design and evaluation. We discuss the results in terms of their implications for evaluating meaningful use of technologies.
AB  - In an experiment to investigate cognitive skill differences between clinicians and lay persons, eight individuals in each group were asked to determine if an explicit concept existed in an ambulatory encounter note (a simple task) or if the concept could be inferred from the same note (a complex task). Subjects answered questions, highlighted text used to answer each question, and commented on their reasoning for selecting specific text. Quantitative results were mixed for expert vs. non-expert task performance on simple vs. complex tasks. Qualitative analysis revealed that data ambiguity obscured quantifiable skill differences between groups. In addition, this analysis offered new insight into whether a concept identification task is simple or complex. We present this case study to demonstrate the value of mixed method approaches to task-based performance study design and evaluation. We discuss the results in terms of their implications for evaluating meaningful use of technologies.
UR  - http://www.scopus.com/inward/record.url?scp=84964950588&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84964950588&partnerID=8YFLogxK
M3  - Article
C2  - 21346945
AN  - SCOPUS:84964950588
VL  - 2010
SP  - 81
EP  - 85
JO  - AMIA ... Annual Symposium proceedings / AMIA Symposium. AMIA Symposium
JF  - AMIA ... Annual Symposium proceedings / AMIA Symposium. AMIA Symposium
SN  - 1559-4076
ER  - 