TY - JOUR
T1 - Automated semantic analysis of changes in image sequences of neurons in culture
AU - Al-Kofahi, Omar
AU - Radke, Richard J.
AU - Roysam, Badrinath
AU - Banker, Gary
N1 - Funding Information:
Manuscript received May 17, 2005; revised October 22, 2005. This work was supported in part by the National Science Foundation (NSF) Center for Subsurface Sensing and Imaging Systems (CenSSIS) under the Engineering Research Centers Program (Award EEC-9986821), in part by the National Institutes of Health (NIH) under Grant NS177112, in part by the Nanobiotechnology Center (an STC program of the NSF under Agreement ECS-9876771), and in part by Rensselaer Polytechnic Institute. Asterisk indicates corresponding author.
PY - 2006/6
Y1 - 2006/6
N2 - Quantitative studies of dynamic behaviors of live neurons are currently limited by the slowness, subjectivity, and tedium of manual analysis of changes in time-lapse image sequences. Challenges to automation include the complexity of the changes of interest, the presence of obfuscating and uninteresting changes due to illumination variations and other imaging artifacts, and the sheer volume of recorded data. This paper describes a highly automated approach that not only detects the interesting changes selectively, but also generates quantitative analyses at multiple levels of detail. Detailed quantitative neuronal morphometry is generated for each frame. Frame-to-frame neuronal changes are measured and labeled as growth, shrinkage, merging, or splitting, as would be done by a human expert. Finally, events unfolding over longer durations, such as apoptosis and axonal specification, are automatically inferred from the short-term changes. The proposed method is based on a Bayesian model selection criterion that leverages a set of short-term neurite change models and takes into account additional evidence provided by an illumination-insensitive change mask. An automated neuron tracing algorithm is used to identify the objects of interest in each frame. A novel curve distance measure and weighted bipartite graph matching are used to compare and associate neurites in successive frames. A separate set of multi-image change models drives the identification of longer-term events. The method achieved frame-to-frame change labeling accuracies ranging from 85% to 100% when tested on eight representative recordings performed under varied imaging and culturing conditions, and successfully detected all higher-order events of interest. Two sequences were used for training the models and tuning their parameters; the learned parameter settings can be applied to hundreds of similar image sequences, provided imaging and culturing conditions are similar to those of the training set. The proposed approach is a substantial innovation over manual annotation and change analysis, accomplishing in minutes what would take an expert hours to complete.
AB - Quantitative studies of dynamic behaviors of live neurons are currently limited by the slowness, subjectivity, and tedium of manual analysis of changes in time-lapse image sequences. Challenges to automation include the complexity of the changes of interest, the presence of obfuscating and uninteresting changes due to illumination variations and other imaging artifacts, and the sheer volume of recorded data. This paper describes a highly automated approach that not only detects the interesting changes selectively, but also generates quantitative analyses at multiple levels of detail. Detailed quantitative neuronal morphometry is generated for each frame. Frame-to-frame neuronal changes are measured and labeled as growth, shrinkage, merging, or splitting, as would be done by a human expert. Finally, events unfolding over longer durations, such as apoptosis and axonal specification, are automatically inferred from the short-term changes. The proposed method is based on a Bayesian model selection criterion that leverages a set of short-term neurite change models and takes into account additional evidence provided by an illumination-insensitive change mask. An automated neuron tracing algorithm is used to identify the objects of interest in each frame. A novel curve distance measure and weighted bipartite graph matching are used to compare and associate neurites in successive frames. A separate set of multi-image change models drives the identification of longer-term events. The method achieved frame-to-frame change labeling accuracies ranging from 85% to 100% when tested on eight representative recordings performed under varied imaging and culturing conditions, and successfully detected all higher-order events of interest. Two sequences were used for training the models and tuning their parameters; the learned parameter settings can be applied to hundreds of similar image sequences, provided imaging and culturing conditions are similar to those of the training set. The proposed approach is a substantial innovation over manual annotation and change analysis, accomplishing in minutes what would take an expert hours to complete.
KW - Assay automation
KW - Change detection
KW - Change understanding
KW - Curve similarity
KW - Event analysis
KW - Morphological dynamics
KW - Statistical model selection
UR - http://www.scopus.com/inward/record.url?scp=33745005265&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33745005265&partnerID=8YFLogxK
U2 - 10.1109/TBME.2006.873565
DO - 10.1109/TBME.2006.873565
M3 - Article
C2 - 16761838
AN - SCOPUS:33745005265
SN - 0018-9294
VL - 53
SP - 1109
EP - 1123
JO - IEEE Transactions on Biomedical Engineering
JF - IEEE Transactions on Biomedical Engineering
IS - 6
M1 - 1634505
ER -