Distributed deep learning networks among institutions for medical imaging

Ken Chang, Niranjan Balachandar, Carson Lam, Darvin Yi, James Brown, Andrew Beers, Bruce Rosen, Daniel L. Rubin, Jayashree Kalpathy-Cramer

Research output: Contribution to journal › Article

13 Citations (Scopus)

Abstract

Objective: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). Results: We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of the cyclical weight transfer heuristic with a high frequency of weight transfer. Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.
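The cyclical weight transfer heuristic described above can be sketched with a toy example: only the model weights travel around a ring of institutions, while each institution's data stays local. This is a minimal illustrative sketch, not the paper's actual implementation — the toy linear model, the synthetic per-institution data, and all hyperparameters below are hypothetical.

```python
# Sketch of cyclical weight transfer: a shared model (w, b) is trained
# at each of 4 institutions in turn on that site's private data, then
# passed to the next site, cycling until training ends.
import random

random.seed(0)

def make_data(n, true_w=2.0, true_b=-1.0):
    """Synthetic 1-D regression data held privately by one institution."""
    xs = [random.uniform(-1, 1) for _ in range(n)]
    ys = [true_w * x + true_b + random.gauss(0, 0.05) for x in xs]
    return list(zip(xs, ys))

def train_epochs(w, b, data, epochs=1, lr=0.1):
    """Run a few SGD epochs on one institution's local data only."""
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# Four institutions, each holding data that is never shared.
institutions = [make_data(50) for _ in range(4)]

# Cyclical weight transfer: only (w, b) moves between sites.
w, b = 0.0, 0.0
for cycle in range(10):            # full passes around the ring
    for data in institutions:      # hand the weights to the next site
        w, b = train_epochs(w, b, data, epochs=1)

print(w, b)  # should approach the true parameters (2.0, -1.0)
```

The abstract's finding that more frequent weight transfer helps corresponds here to using fewer local epochs per hand-off (more cycles), so no single site's data distribution dominates between transfers.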

Original language: English (US)
Pages (from-to): 945-954
Number of pages: 10
Journal: Journal of the American Medical Informatics Association
Volume: 25
Issue number: 8
DOI: 10.1093/jamia/ocy017
State: Published - Jan 1 2018
Externally published: Yes

Keywords

  • Deep Learning
  • Distributed Learning
  • Medical Imaging
  • Neural Networks

ASJC Scopus subject areas

  • Health Informatics

Cite this

Chang, K., Balachandar, N., Lam, C., Yi, D., Brown, J., Beers, A., ... Kalpathy-Cramer, J. (2018). Distributed deep learning networks among institutions for medical imaging. Journal of the American Medical Informatics Association, 25(8), 945-954. https://doi.org/10.1093/jamia/ocy017

Distributed deep learning networks among institutions for medical imaging. / Chang, Ken; Balachandar, Niranjan; Lam, Carson; Yi, Darvin; Brown, James; Beers, Andrew; Rosen, Bruce; Rubin, Daniel L.; Kalpathy-Cramer, Jayashree.

In: Journal of the American Medical Informatics Association, Vol. 25, No. 8, 01.01.2018, p. 945-954.

Research output: Contribution to journal › Article

Chang, K, Balachandar, N, Lam, C, Yi, D, Brown, J, Beers, A, Rosen, B, Rubin, DL & Kalpathy-Cramer, J 2018, 'Distributed deep learning networks among institutions for medical imaging', Journal of the American Medical Informatics Association, vol. 25, no. 8, pp. 945-954. https://doi.org/10.1093/jamia/ocy017
Chang, Ken ; Balachandar, Niranjan ; Lam, Carson ; Yi, Darvin ; Brown, James ; Beers, Andrew ; Rosen, Bruce ; Rubin, Daniel L. ; Kalpathy-Cramer, Jayashree. / Distributed deep learning networks among institutions for medical imaging. In: Journal of the American Medical Informatics Association. 2018 ; Vol. 25, No. 8. pp. 945-954.
@article{6aeb777230014757a099b20e7e0ef737,
title = "Distributed deep learning networks among institutions for medical imaging",
abstract = "Objective: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). Results: We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of the cyclical weight transfer heuristic with a high frequency of weight transfer. Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.",
keywords = "Deep Learning, Distributed Learning, Medical Imaging, Neural Networks",
author = "Ken Chang and Niranjan Balachandar and Carson Lam and Darvin Yi and James Brown and Andrew Beers and Bruce Rosen and Rubin, {Daniel L.} and Jayashree Kalpathy-Cramer",
year = "2018",
month = "1",
day = "1",
doi = "10.1093/jamia/ocy017",
language = "English (US)",
volume = "25",
pages = "945--954",
journal = "Journal of the American Medical Informatics Association",
issn = "1067-5027",
publisher = "Oxford University Press",
number = "8",
}

TY - JOUR

T1 - Distributed deep learning networks among institutions for medical imaging

AU - Chang, Ken

AU - Balachandar, Niranjan

AU - Lam, Carson

AU - Yi, Darvin

AU - Brown, James

AU - Beers, Andrew

AU - Rosen, Bruce

AU - Rubin, Daniel L.

AU - Kalpathy-Cramer, Jayashree

PY - 2018/1/1

Y1 - 2018/1/1

N2 - Objective: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). Results: We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of the cyclical weight transfer heuristic with a high frequency of weight transfer. Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.

AB - Objective: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). Results: We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of the cyclical weight transfer heuristic with a high frequency of weight transfer. Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.

KW - Deep Learning

KW - Distributed Learning

KW - Medical Imaging

KW - Neural Networks

UR - http://www.scopus.com/inward/record.url?scp=85055100903&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85055100903&partnerID=8YFLogxK

U2 - 10.1093/jamia/ocy017

DO - 10.1093/jamia/ocy017

M3 - Article

VL - 25

SP - 945

EP - 954

JO - Journal of the American Medical Informatics Association

JF - Journal of the American Medical Informatics Association

SN - 1067-5027

IS - 8

ER -