Representation of Ophthalmology Concepts by Electronic Systems. Intercoder Agreement among Physicians Using Controlled Terminologies

John C. Hwang, Alexander C. Yu, Daniel S. Casper, Justin Starren, James J. Cimino, Michael Chiang

Research output: Contribution to journal › Article

17 Citations (Scopus)

Abstract

Objective: To assess intercoder agreement for ophthalmology concepts by 3 physician coders using 5 controlled terminologies (International Classification of Diseases, Ninth Revision, Clinical Modification [ICD-9-CM]; Current Procedural Terminology, fourth edition; Logical Observation Identifiers, Names, and Codes [LOINC]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; and the Medical Entities Dictionary).

Design: Noncomparative case series.

Participants: Five complete ophthalmology case presentations selected from a publicly available journal.

Methods: Each case was parsed into discrete concepts. Three physician coders independently used electronic or paper browsers to assign a code for every concept in each terminology. A match score representing the adequacy of each assignment was given on a 3-point scale (0, no match; 1, partial match; 2, complete match). For every concept, the level of intercoder agreement was determined by 2 methods: (1) exact code matching, with complete agreement when all 3 coders assigned the same code, partial agreement when 2 coders assigned the same code, and no agreement when all coders assigned different codes; and (2) manual review of all assigned codes for semantic equivalence by an independent ophthalmologist, who classified intercoder agreement for each concept as complete, partial, or none. Intercoder agreement was then calculated in the same manner for the subset of concepts judged to have adequate coverage by each terminology, defined as receiving a match score of 2 from at least 2 of the 3 coders.

Main Outcome Measures: Intercoder agreement in each controlled terminology: complete, partial, or none.

Results: Cases were parsed into 242 unique concepts. When all concepts were analyzed by manual review, the proportion of complete intercoder agreement ranged from 12% (LOINC) to 44% (SNOMED-CT), and the difference in intercoder agreement between LOINC and all other terminologies was statistically significant (P
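The two decision rules in the Methods section are simple enough to state in code. The sketch below is illustrative only, not the study's implementation; the function names, example codes, and data shapes are invented. It classifies one concept's three assigned codes as complete, partial, or no agreement under the exact-code-matching method, and checks the adequate-coverage criterion (a match score of 2 from at least 2 of the 3 coders).

```python
# Illustrative sketch of the agreement rules described in the abstract.
# Not the authors' code; names and example values are hypothetical.
from collections import Counter

def classify_agreement(codes):
    """Exact-code-matching rule for one concept coded by 3 coders:
    all 3 codes identical -> complete; exactly 2 identical -> partial;
    all different -> none."""
    top_count = Counter(codes).most_common(1)[0][1]
    if top_count == 3:
        return "complete"
    if top_count == 2:
        return "partial"
    return "none"

def adequately_covered(match_scores):
    """A concept has adequate coverage in a terminology when at least
    2 of the 3 coders gave it a match score of 2 (complete match)."""
    return sum(1 for score in match_scores if score == 2) >= 2

# Hypothetical example: three coders assign codes to the same concept.
print(classify_agreement(["C001", "C001", "C007"]))  # partial
print(adequately_covered([2, 2, 1]))                 # True
```

The manual-review method in the abstract replaces the exact-match comparison with an ophthalmologist's judgment of semantic equivalence, so only the first function would change.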

Original language: English (US)
Pages (from-to): 511-519
Number of pages: 9
Journal: Ophthalmology
Volume: 113
Issue number: 4
DOI: 10.1016/j.ophtha.2006.01.017
State: Published - Apr 2006
Externally published: Yes

Fingerprint

  • Logical Observation Identifiers Names and Codes
  • Ophthalmology
  • Terminology
  • Physicians
  • Systematized Nomenclature of Medicine
  • Medical Dictionaries
  • Current Procedural Terminology
  • International Classification of Diseases
  • Semantics
  • Outcome Assessment (Health Care)

ASJC Scopus subject areas

  • Ophthalmology

Cite this

Representation of Ophthalmology Concepts by Electronic Systems. Intercoder Agreement among Physicians Using Controlled Terminologies. / Hwang, John C.; Yu, Alexander C.; Casper, Daniel S.; Starren, Justin; Cimino, James J.; Chiang, Michael.

In: Ophthalmology, Vol. 113, No. 4, 04.2006, p. 511-519.


@article{443542ead2ce447f8120132e1cb5a3fa,
  title = "Representation of Ophthalmology Concepts by Electronic Systems. Intercoder Agreement among Physicians Using Controlled Terminologies",
  author = "Hwang, {John C.} and Yu, {Alexander C.} and Casper, {Daniel S.} and Justin Starren and Cimino, {James J.} and Michael Chiang",
  year = "2006",
  month = apr,
  doi = "10.1016/j.ophtha.2006.01.017",
  language = "English (US)",
  volume = "113",
  pages = "511--519",
  journal = "Ophthalmology",
  issn = "0161-6420",
  publisher = "Elsevier Inc.",
  number = "4",
}

TY  - JOUR
T1  - Representation of Ophthalmology Concepts by Electronic Systems. Intercoder Agreement among Physicians Using Controlled Terminologies
AU  - Hwang, John C.
AU  - Yu, Alexander C.
AU  - Casper, Daniel S.
AU  - Starren, Justin
AU  - Cimino, James J.
AU  - Chiang, Michael
PY  - 2006/4
Y1  - 2006/4
UR  - http://www.scopus.com/inward/record.url?scp=33645348771&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=33645348771&partnerID=8YFLogxK
U2  - 10.1016/j.ophtha.2006.01.017
DO  - 10.1016/j.ophtha.2006.01.017
M3  - Article
C2  - 16488013
AN  - SCOPUS:33645348771
VL  - 113
SP  - 511
EP  - 519
JO  - Ophthalmology
JF  - Ophthalmology
SN  - 0161-6420
IS  - 4
ER  - 