TY - JOUR
T1 - Distributed deep learning networks among institutions for medical imaging
AU - Chang, Ken
AU - Balachandar, Niranjan
AU - Lam, Carson
AU - Yi, Darvin
AU - Brown, James
AU - Beers, Andrew
AU - Rosen, Bruce
AU - Rubin, Daniel L.
AU - Kalpathy-Cramer, Jayashree
N1 - Publisher Copyright:
© 2018 The Author(s).
PY - 2018/8/1
Y1 - 2018/8/1
N2 - Objective: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). Results: We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of the cyclical weight transfer heuristic with a high frequency of weight transfer. Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.
AB - Objective: Deep learning has become a promising approach for automated support for clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data often has limitations due to technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data. Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet). Results: We find that cyclical weight transfer resulted in a performance that was comparable to that of centrally hosted patient data. We also found that there is an improvement in the performance of the cyclical weight transfer heuristic with a high frequency of weight transfer. Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.
KW - Deep Learning
KW - Distributed Learning
KW - Medical Imaging
KW - Neural Networks
UR - http://www.scopus.com/inward/record.url?scp=85055100903&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85055100903&partnerID=8YFLogxK
U2 - 10.1093/jamia/ocy017
DO - 10.1093/jamia/ocy017
M3 - Article
C2 - 29617797
AN - SCOPUS:85055100903
SN - 1067-5027
VL - 25
SP - 945
EP - 954
JO - Journal of the American Medical Informatics Association
JF - Journal of the American Medical Informatics Association
IS - 8
ER -