Automated semantic analysis of changes in image sequences of neurons in culture

Omar Al-Kofahi, Richard J. Radke, Badrinath Roysam, Gary Banker

Research output: Contribution to journal › Article

32 Citations (Scopus)

Abstract

Quantitative studies of dynamic behaviors of live neurons are currently limited by the slowness, subjectivity, and tedium of manual analysis of changes in time-lapse image sequences. Challenges to automation include the complexity of the changes of interest, the presence of obfuscating and uninteresting changes due to illumination variations and other imaging artifacts, and the sheer volume of recorded data. This paper describes a highly automated approach that not only detects the interesting changes selectively, but also generates quantitative analyses at multiple levels of detail. Detailed quantitative neuronal morphometry is generated for each frame. Frame-to-frame neuronal changes are measured and labeled as growth, shrinkage, merging, or splitting, as would be done by a human expert. Finally, events unfolding over longer durations, such as apoptosis and axonal specification, are automatically inferred from the short-term changes. The proposed method is based on a Bayesian model selection criterion that leverages a set of short-term neurite change models and takes into account additional evidence provided by an illumination-insensitive change mask. An automated neuron tracing algorithm is used to identify the objects of interest in each frame. A novel curve distance measure and weighted bipartite graph matching are used to compare and associate neurites in successive frames. A separate set of multi-image change models drives the identification of longer term events. The method achieved frame-to-frame change labeling accuracies ranging from 85% to 100% when tested on 8 representative recordings performed under varied imaging and culturing conditions, and successfully detected all higher order events of interest. Two sequences were used for training the models and tuning their parameters; the learned parameter settings can be applied to hundreds of similar image sequences, provided imaging and culturing conditions are similar to the training set. The proposed approach is a substantial innovation over manual annotation and change analysis, accomplishing in minutes what it would take an expert hours to complete.
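The frame-to-frame association step described in the abstract — comparing neurite traces with a curve distance measure and pairing them across successive frames via weighted bipartite graph matching — can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the symmetric mean closest-point distance stands in for the paper's novel curve measure, and the brute-force matcher stands in for a proper bipartite matching algorithm.

```python
from itertools import permutations
from math import hypot

def curve_distance(a, b):
    """Symmetric mean closest-point distance between two sampled
    polylines given as lists of (x, y) points. An illustrative
    stand-in for the paper's curve distance measure."""
    def one_way(p, q):
        return sum(min(hypot(x - u, y - v) for u, v in q)
                   for x, y in p) / len(p)
    return 0.5 * (one_way(a, b) + one_way(b, a))

def match_neurites(prev_frame, curr_frame):
    """Associate each neurite trace in the previous frame with one in
    the current frame by minimum-total-cost matching (assumes
    len(prev_frame) <= len(curr_frame)). Brute force over injective
    assignments is adequate for the few neurites a single neuron
    carries; a polynomial-time assignment algorithm would replace
    this at scale. Returns a list of (prev_index, curr_index) pairs."""
    best_pairs, best_cost = None, float("inf")
    for assignment in permutations(range(len(curr_frame)), len(prev_frame)):
        cost = sum(curve_distance(prev_frame[i], curr_frame[j])
                   for i, j in enumerate(assignment))
        if cost < best_cost:
            best_cost, best_pairs = cost, list(enumerate(assignment))
    return best_pairs
```

For realistic numbers of neurites, the brute-force loop would be replaced by a polynomial-time solver such as `scipy.optimize.linear_sum_assignment` applied to the pairwise cost matrix.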

Original language: English (US)
Article number: 1634505
Pages (from-to): 1109-1123
Number of pages: 15
Journal: IEEE Transactions on Biomedical Engineering
Volume: 53
Issue number: 6
DOIs: 10.1109/TBME.2006.873565
State: Published - Jun 2006

Keywords

  • Assay automation
  • Change detection
  • Change understanding
  • Curve similarity
  • Event analysis
  • Morphological dynamics
  • Statistical model selection

ASJC Scopus subject areas

  • Biomedical Engineering

Cite this

Automated semantic analysis of changes in image sequences of neurons in culture. / Al-Kofahi, Omar; Radke, Richard J.; Roysam, Badrinath; Banker, Gary.

In: IEEE Transactions on Biomedical Engineering, Vol. 53, No. 6, 1634505, 06.2006, p. 1109-1123.

Al-Kofahi, Omar ; Radke, Richard J. ; Roysam, Badrinath ; Banker, Gary. / Automated semantic analysis of changes in image sequences of neurons in culture. In: IEEE Transactions on Biomedical Engineering. 2006 ; Vol. 53, No. 6. pp. 1109-1123.
@article{a2daeb22dcd8421ab9278ea92e5716dc,
title = "Automated semantic analysis of changes in image sequences of neurons in culture",
abstract = "Quantitative studies of dynamic behaviors of live neurons are currently limited by the slowness, subjectivity, and tedium of manual analysis of changes in time-lapse image sequences. Challenges to automation include the complexity of the changes of interest, the presence of obfuscating and uninteresting changes due to illumination variations and other imaging artifacts, and the sheer volume of recorded data. This paper describes a highly automated approach that not only detects the interesting changes selectively, but also generates quantitative analyses at multiple levels of detail. Detailed quantitative neuronal morphometry is generated for each frame. Frame-to-frame neuronal changes are measured and labeled as growth, shrinkage, merging, or splitting, as would be done by a human expert. Finally, events unfolding over longer durations, such as apoptosis and axonal specification, are automatically inferred from the short-term changes. The proposed method is based on a Bayesian model selection criterion that leverages a set of short-term neurite change models and takes into account additional evidence provided by an illumination-insensitive change mask. An automated neuron tracing algorithm is used to identify the objects of interest in each frame. A novel curve distance measure and weighted bipartite graph matching are used to compare and associate neurites in successive frames. A separate set of multi-image change models drives the identification of longer term events. The method achieved frame-to-frame change labeling accuracies ranging from 85{\%} to 100{\%} when tested on 8 representative recordings performed under varied imaging and culturing conditions, and successfully detected all higher order events of interest. Two sequences were used for training the models and tuning their parameters; the learned parameter settings can be applied to hundreds of similar image sequences, provided imaging and culturing conditions are similar to the training set. The proposed approach is a substantial innovation over manual annotation and change analysis, accomplishing in minutes what it would take an expert hours to complete.",
keywords = "Assay automation, Change detection, Change understanding, Curve similarity, Event analysis, Morphological dynamics, Statistical model selection",
author = "Omar Al-Kofahi and Radke, {Richard J.} and Badrinath Roysam and Gary Banker",
year = "2006",
month = "6",
doi = "10.1109/TBME.2006.873565",
language = "English (US)",
volume = "53",
pages = "1109--1123",
journal = "IEEE Transactions on Biomedical Engineering",
issn = "0018-9294",
publisher = "IEEE Computer Society",
number = "6",

}

TY - JOUR

T1 - Automated semantic analysis of changes in image sequences of neurons in culture

AU - Al-Kofahi, Omar

AU - Radke, Richard J.

AU - Roysam, Badrinath

AU - Banker, Gary

PY - 2006/6

Y1 - 2006/6

AB - Quantitative studies of dynamic behaviors of live neurons are currently limited by the slowness, subjectivity, and tedium of manual analysis of changes in time-lapse image sequences. Challenges to automation include the complexity of the changes of interest, the presence of obfuscating and uninteresting changes due to illumination variations and other imaging artifacts, and the sheer volume of recorded data. This paper describes a highly automated approach that not only detects the interesting changes selectively, but also generates quantitative analyses at multiple levels of detail. Detailed quantitative neuronal morphometry is generated for each frame. Frame-to-frame neuronal changes are measured and labeled as growth, shrinkage, merging, or splitting, as would be done by a human expert. Finally, events unfolding over longer durations, such as apoptosis and axonal specification, are automatically inferred from the short-term changes. The proposed method is based on a Bayesian model selection criterion that leverages a set of short-term neurite change models and takes into account additional evidence provided by an illumination-insensitive change mask. An automated neuron tracing algorithm is used to identify the objects of interest in each frame. A novel curve distance measure and weighted bipartite graph matching are used to compare and associate neurites in successive frames. A separate set of multi-image change models drives the identification of longer term events. The method achieved frame-to-frame change labeling accuracies ranging from 85% to 100% when tested on 8 representative recordings performed under varied imaging and culturing conditions, and successfully detected all higher order events of interest. Two sequences were used for training the models and tuning their parameters; the learned parameter settings can be applied to hundreds of similar image sequences, provided imaging and culturing conditions are similar to the training set. The proposed approach is a substantial innovation over manual annotation and change analysis, accomplishing in minutes what it would take an expert hours to complete.

KW - Assay automation

KW - Change detection

KW - Change understanding

KW - Curve similarity

KW - Event analysis

KW - Morphological dynamics

KW - Statistical model selection

UR - http://www.scopus.com/inward/record.url?scp=33745005265&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33745005265&partnerID=8YFLogxK

U2 - 10.1109/TBME.2006.873565

DO - 10.1109/TBME.2006.873565

M3 - Article

C2 - 16761838

AN - SCOPUS:33745005265

VL - 53

SP - 1109

EP - 1123

JO - IEEE Transactions on Biomedical Engineering

JF - IEEE Transactions on Biomedical Engineering

SN - 0018-9294

IS - 6

M1 - 1634505

ER -