Interobserver Agreement Among Uveitis Experts on Uveitic Diagnoses

The Standardization of Uveitis Nomenclature Experience

Standardization of Uveitis Nomenclature Working Group

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Purpose: To evaluate the interobserver agreement among uveitis experts on the diagnosis of specific uveitic diseases.

Design: Interobserver agreement analysis.

Methods: Five committees, each composed of 9 individuals and working in parallel, reviewed cases from a preliminary database of 25 uveitic diseases, collected by disease, and voted independently online on whether or not each case represented the disease in question. The agreement statistic, κ, was calculated for the 36 pairwise comparisons among the 9 members for each disease, and a mean κ was calculated for each disease. After the independent online voting, committee consensus conference calls, using nominal group techniques, reviewed all cases that had not achieved supermajority agreement (>75%) on the diagnosis in the online voting, in an attempt to reach supermajority agreement.

Results: A total of 5766 cases for the 25 diseases were evaluated. The overall mean κ for the entire project was 0.39, with disease-specific variation ranging from 0.23 to 0.79. After the formalized consensus conference calls addressing the cases that did not achieve supermajority agreement in the online voting, supermajority agreement was reached on approximately 99% of cases overall, with disease-specific variation ranging from 96% to 100%.

Conclusions: Agreement among uveitis experts on diagnosis is moderate at best but can be improved by discussion among them. These data suggest the need for validated and widely used classification criteria in the field of uveitis.
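In terms of the Methods above, the per-disease agreement statistic reduces to Cohen's κ = (p_o - p_e) / (1 - p_e), computed over the binary yes/no votes for every pair of the 9 committee members (C(9,2) = 36 pairs) and then averaged. The sketch below is illustrative only, not the Working Group's analysis code, and the vote data in it are invented; it simply shows how a per-disease mean pairwise κ could be computed under those assumptions.

# Minimal illustrative sketch (not the Working Group's code) of per-disease
# mean pairwise kappa: 9 raters vote yes/no on each case, Cohen's kappa is
# computed for each of the C(9,2) = 36 rater pairs, and the per-disease value
# is the mean over those 36 pairs. All vote data below are invented.
from itertools import combinations

def cohen_kappa(votes_a, votes_b):
    """Cohen's kappa for two raters' binary votes on the same cases."""
    n = len(votes_a)
    p_o = sum(a == b for a, b in zip(votes_a, votes_b)) / n   # observed agreement
    p_a, p_b = sum(votes_a) / n, sum(votes_b) / n              # each rater's "yes" rate
    p_e = p_a * p_b + (1 - p_a) * (1 - p_b)                    # chance-expected agreement
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

def mean_pairwise_kappa(votes_by_rater):
    """Mean kappa over all rater pairs (36 pairs for a 9-member committee)."""
    pairs = list(combinations(votes_by_rater, 2))
    return sum(cohen_kappa(a, b) for a, b in pairs) / len(pairs)

# Hypothetical committee: 9 raters, 6 cases, 1 = "is the disease", 0 = "is not".
votes = [
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 0, 0],
    [1, 0, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [1, 1, 0, 0, 0, 1],
    [1, 1, 1, 1, 0, 1],
    [1, 1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 0, 1],
]
print(f"mean pairwise kappa: {mean_pairwise_kappa(votes):.2f}")

Under the same voting scheme, a case would count as reaching supermajority agreement when more than 75% of the committee, presumably at least 7 of the 9 members, cast the same vote; only cases below that threshold went on to the consensus conference calls.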

Original language: English (US)
Pages (from-to): 19-24
Number of pages: 6
Journal: American Journal of Ophthalmology
Volume: 186
DOIs: 10.1016/j.ajo.2017.10.028
State: Published - Feb 1 2018

ASJC Scopus subject areas

  • Ophthalmology

Cite this

Interobserver Agreement Among Uveitis Experts on Uveitic Diagnoses: The Standardization of Uveitis Nomenclature Experience. / Standardization of Uveitis Nomenclature Working Group.

In: American Journal of Ophthalmology, Vol. 186, 01.02.2018, p. 19-24.

Research output: Contribution to journal › Article

@article{1ce35831d2c044c49d33efe3cb2aeacf,
title = "Interobserver Agreement Among Uveitis Experts on Uveitic Diagnoses: The Standardization of Uveitis Nomenclature Experience",
abstract = "Purpose To evaluate the interobserver agreement among uveitis experts on the diagnosis of the specific uveitic disease. Design Interobserver agreement analysis. Methods Five committees, each comprised of 9 individuals and working in parallel, reviewed cases from a preliminary database of 25 uveitic diseases, collected by disease, and voted independently online whether the case was the disease in question or not. The agreement statistic, κ, was calculated for the 36 pairwise comparisons for each disease, and a mean κ was calculated for each disease. After the independent online voting, committee consensus conference calls, using nominal group techniques, reviewed all cases not achieving supermajority agreement (>75{\%}) on the diagnosis in the online voting to attempt to arrive at a supermajority agreement. Results A total of 5766 cases for the 25 diseases were evaluated. The overall mean κ for the entire project was 0.39, with disease-specific variation ranging from 0.23 to 0.79. After the formalized consensus conference calls to address cases that did not achieve supermajority agreement in the online voting, supermajority agreement overall was reached on approximately 99{\%} of cases, with disease-specific variation ranging from 96{\%} to 100{\%}. Conclusions Agreement among uveitis experts on diagnosis is moderate at best but can be improved by discussion among them. These data suggest the need for validated and widely used classification criteria in the field of uveitis.",
author = "{Standardization of Uveitis Nomenclature Working Group} and Jabs, {Douglas A.} and Andrew Dick and Doucette, {John T.} and Amod Gupta and Susan Lightman and Peter McCluskey and Okada, {Annabelle A.} and Palestine, {Alan G.} and Rosenbaum, {James (Jim)} and Saleem, {Sophia M.} and Jennifer Thorne and Brett Trusko",
year = "2018",
month = "2",
day = "1",
doi = "10.1016/j.ajo.2017.10.028",
language = "English (US)",
volume = "186",
pages = "19--24",
journal = "American Journal of Ophthalmology",
issn = "0002-9394",
publisher = "Elsevier USA",

}
