Interobserver agreement in hepatitis C grading and staging and in the Banff grading schema for acute cellular rejection: The "Hepatitis C 3" multi-institutional trial experience

George J. Netto, David L. Watkins, James W. Williams, Thomas V. Colby, Giovanni DePetris, Francis E. Sharkey, Christopher L. Corless, David Lewin, Lydia Petrovic, Shobha Sharma, Gary Kanel, Neil Theise, A. Brian West, Alison Koehler, Nirag C. Jhala, Jay Lefkowitch, Julia Lezzoni, Linda W. Jennings, G. Weldon Tillery, Goran B. Klintmalm

Research output: Contribution to journal › Article › peer-review


Abstract

Context. - Establishing adequate interobserver agreement is crucial not only for standardization of patient care but also to ensure validity of findings in multi-institutional trials.

Objective. - To evaluate interobserver agreement in assessing chronic hepatitis C (HCV) and acute cellular rejection (ACR) among 17 hepatopathologists involved in the "Hepatitis C 3" trial.

Design. - The trial is a randomized multicenter (17 institutions) study involving 312 patients undergoing transplantation for HCV. Patients are randomized to 3 treatment arms. For final data analysis, all biopsy specimens are reviewed by a central pathologist (G.J.N.). Recurrence of HCV is evaluated according to the Batts and Ludwig schema. The 1997 Banff schema is used to evaluate ACR. To assess interobserver agreement, hematoxylin-eosin-stained sections from 11 liver biopsy specimens (6 HCV and 5 ACR) were sent by the central pathologist to 16 local pathologists from 13 institutions. Statistical analysis was performed on raw ACR/HCV data as well as data grouped according to clinically significant primary endpoint cutoffs.

Results. - Statistically significant agreement was found among all participating pathologists (P < .001). On κ analysis, the degree of agreement was rated "moderate" for HCV grade and stage and ACR global grading (κ = 0.30, 0.33, and 0.37, respectively). Interobserver agreement was weaker for rejection activity index scoring of ACR (κ = 0.15). A stronger degree of agreement was found when scores were grouped based on endpoint cutoffs (κ = 0.76, "almost perfect," for HCV and 0.62, "substantial," for ACR).

Conclusions. - An overall statistically significant interobserver agreement was found among 17 pathologists using the 1997 Banff schema and the Batts and Ludwig schema.
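For readers unfamiliar with how multi-rater κ values like those above are obtained, the sketch below shows one common formulation, Fleiss' κ; the abstract does not specify the exact κ computation used in the trial, and the biopsy scores in the example are invented for illustration only. It also shows why collapsing raw grades into clinically significant endpoint bins (here, a hypothetical fibrosis-stage cutoff) tends to raise κ, consistent with the pattern reported in the Results.

```python
# Minimal sketch (not the trial's actual analysis code): Fleiss' kappa for
# multiple raters scoring the same specimens, and the effect of collapsing
# raw scores into endpoint bins. All scores below are hypothetical.
from collections import Counter
from typing import Sequence


def fleiss_kappa(ratings: Sequence[Sequence[int]], categories: Sequence[int]) -> float:
    """ratings[i] holds every rater's category for specimen i (equal raters per specimen)."""
    n_specimens = len(ratings)
    n_raters = len(ratings[0])
    # n_ij: how many raters placed specimen i in category j
    counts = [Counter(r) for r in ratings]
    # Per-specimen observed agreement P_i
    p_i = [
        (sum(c[j] ** 2 for j in categories) - n_raters) / (n_raters * (n_raters - 1))
        for c in counts
    ]
    p_bar = sum(p_i) / n_specimens
    # Chance agreement from marginal category proportions
    p_j = [sum(c[j] for c in counts) / (n_specimens * n_raters) for j in categories]
    p_e = sum(p ** 2 for p in p_j)
    return (p_bar - p_e) / (1 - p_e)


# Hypothetical raw fibrosis stages (0-4) assigned by 5 raters to 4 biopsies.
raw_stages = [
    [1, 1, 2, 1, 2],
    [3, 3, 3, 4, 3],
    [0, 1, 0, 0, 1],
    [2, 2, 3, 2, 2],
]
print("kappa (raw stages):   ", round(fleiss_kappa(raw_stages, range(5)), 2))

# Collapse to a binary endpoint cutoff (e.g., stage >= 3 vs. < 3); coarser
# categories generally yield higher agreement.
binned = [[int(stage >= 3) for stage in row] for row in raw_stages]
print("kappa (endpoint bins):", round(fleiss_kappa(binned, range(2)), 2))
```

With these illustrative numbers, κ rises from roughly 0.34 on the raw 5-category stages to roughly 0.76 after binning, mirroring the qualitative shift from "moderate" to "almost perfect" agreement described in the abstract.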

Original language: English (US)
Pages (from-to): 1157-1162
Number of pages: 6
Journal: Archives of Pathology and Laboratory Medicine
Volume: 130
Issue number: 8
State: Published - Aug 2006

ASJC Scopus subject areas

  • Pathology and Forensic Medicine
  • Medical Laboratory Technology
